SQL Server : Insert blank line in query - sql

This may be so last year but I'm using SQL Server 2005
stmpdate intime
----------------------
2014-10-08 08:04:43
2014-10-09 07:57:13
2014-10-10 07:57:14
2014-10-16 07:59:56
2014-10-17 07:45:56
I have this table. It keeps the check-in time of the employee, but this employee didn't check in every day of the month. So what I want is something like this:
stmpdate intime
1 2014-10-01
2 2014-10-02
3 2014-10-03
4 2014-10-04
5 2014-10-05
6 2014-10-06
7 2014-10-07
8 2014-10-08 08:04:43
9 2014-10-09 07:57:13
10 2014-10-10 07:57:14
11 2014-10-11
12 2014-10-12
13 2014-10-13
14 2014-10-14
15 2014-10-15
16 2014-10-16 07:59:56
17 2014-10-17 07:45:56
18 2014-10-18
19 2014-10-19
20 2014-10-20
21 2014-10-21
22 2014-10-22
23 2014-10-23
24 2014-10-24
25 2014-10-25
26 2014-10-26
27 2014-10-27
28 2014-10-28
29 2014-10-29
30 2014-10-30
31 2014-10-31
I tried to make a temp table which contains every date in the month, and then left join it with the first table I mentioned, but it didn't seem to work.
declare @datetemp table (
stmpdate varchar(10)
);
insert into @datetemp
SELECT '2014-10-01'
UNION ALL
SELECT '2014-10-02'
UNION ALL
SELECT '2014-10-03'
....
and
SELECT dtt.stmpdate, intime
FROM @datetemp dtt left join v_dayTimesheet
on dtt.stmpdate = v_dayTimesheet.stmpdate
WHERE (emp_no = '001234567')
Here is the result of the query above:
stmpdate intime
2014-10-08 08:04:43
2014-10-09 07:57:13
2014-10-10 07:57:14
2014-10-16 07:59:56
2014-10-17 07:45:56
and here is the result of select * from @datetemp
2014-10-01
2014-10-02
2014-10-03
2014-10-04
2014-10-05
2014-10-06
2014-10-07
2014-10-08
2014-10-09
2014-10-10
2014-10-11
2014-10-12
2014-10-13
2014-10-14
2014-10-15
2014-10-16
2014-10-17
2014-10-18
2014-10-19
2014-10-20
2014-10-21
2014-10-22
2014-10-23
2014-10-24
2014-10-25
2014-10-26
2014-10-27
2014-10-28
2014-10-29
2014-10-30
2014-10-31

You're filtering for rows where emp_no has a value. If the employee didn't check in, that row has only date info and no employee number, so the filter throws it out. You have to allow for equal or null.
SELECT dtt.stmpdate, intime
FROM @datetemp dtt
left outer join v_dayTimesheet
on dtt.stmpdate = v_dayTimesheet.stmpdate
WHERE emp_no = '001234567' or emp_no is null
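The effect is easy to reproduce outside SQL Server. Below is a small Python/sqlite3 sketch (hypothetical miniature versions of the two tables) showing that a WHERE filter on the right-hand table collapses a LEFT JOIN into an inner join, while moving the predicate into the ON clause (an alternative to allowing NULL, as above) keeps the unmatched calendar days:

```python
import sqlite3

# Hypothetical miniature versions of the two tables, just to show the effect.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE datetemp (stmpdate TEXT);
CREATE TABLE v_dayTimesheet (stmpdate TEXT, intime TEXT, emp_no TEXT);
INSERT INTO datetemp VALUES ('2014-10-01'), ('2014-10-02'), ('2014-10-03');
INSERT INTO v_dayTimesheet VALUES ('2014-10-02', '08:00:00', '001234567');
""")

# Filtering emp_no in WHERE silently turns the LEFT JOIN into an inner join:
strict = con.execute("""
    SELECT d.stmpdate, t.intime
    FROM datetemp d LEFT JOIN v_dayTimesheet t ON d.stmpdate = t.stmpdate
    WHERE t.emp_no = '001234567'
    ORDER BY d.stmpdate
""").fetchall()

# Moving the filter into the ON clause keeps every calendar day:
relaxed = con.execute("""
    SELECT d.stmpdate, t.intime
    FROM datetemp d LEFT JOIN v_dayTimesheet t
      ON d.stmpdate = t.stmpdate AND t.emp_no = '001234567'
    ORDER BY d.stmpdate
""").fetchall()

print(strict)   # [('2014-10-02', '08:00:00')]
print(relaxed)  # [('2014-10-01', None), ('2014-10-02', '08:00:00'), ('2014-10-03', None)]
```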
also, for your dates... check this out: http://www.sqlservercurry.com/2010/03/generate-start-and-end-date-range-using.html
DECLARE
@StartDate datetime = '2010-01-01',
@EndDate datetime = '2010-03-01'
;WITH datetemp as
(
SELECT @StartDate as stmpdate
UNION ALL
SELECT DATEADD(day, 1, stmpdate)
FROM datetemp
WHERE DATEADD(day, 1, stmpdate) <= @EndDate
)
SELECT stmpdate
FROM datetemp;
You would then select from datetemp as a normal table. Beware, though: a common table expression can only be referenced in the single statement that immediately follows the WITH clause.
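As a sanity check, the same recursive-CTE pattern runs under SQLite from Python (the table and column names below are just illustrative). One SQL Server caveat worth knowing: the default recursion limit is 100, so for spans longer than 100 days append OPTION (MAXRECURSION 0), or a suitable limit, to the statement.

```python
import sqlite3

con = sqlite3.connect(":memory:")
rows = con.execute("""
WITH RECURSIVE datetemp(stmpdate) AS (
    SELECT '2014-10-01'
    UNION ALL
    SELECT date(stmpdate, '+1 day') FROM datetemp
    WHERE stmpdate < '2014-10-31'
)
SELECT stmpdate FROM datetemp
""").fetchall()

# One row per day of October 2014:
print(len(rows), rows[0][0], rows[-1][0])  # 31 2014-10-01 2014-10-31
```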
Just trust me on this one... run this query and see how your blank lines occur:
SELECT dtt.stmpdate, intime, emp_no
FROM @datetemp dtt
left outer join v_dayTimesheet
on dtt.stmpdate = v_dayTimesheet.stmpdate
WHERE emp_no = '001234567' or emp_no is null
all these lines will return with emp_no = 001234567
stmpdate intime
2014-10-08 08:04:43
2014-10-09 07:57:13
2014-10-10 07:57:14
2014-10-16 07:59:56
2014-10-17 07:45:56
and all your blank lines will have null as emp_no.

I got my answer!!
SELECT dtt.stmpdate, intime
FROM @datetemp dtt left join
(
SELECT stmpdate, intime
FROM v_dayTimesheet
WHERE (emp_no = '001234567')
) as vdayTimesheet
on dtt.stmpdate = vdayTimesheet.stmpdate
ORDER BY dtt.stmpdate
this is what I want, thanks everyone

SQL Query:
(SQL Fiddle example)
SELECT t2.dt,
isnull(t1.intime, '') intime
FROM
(
SELECT DATEADD(day,number,'2014-10-01') dt
FROM master..spt_values
WHERE Type = 'P'
AND DATEADD(day,number,'2014-10-01') >= '2014-10-01'
AND DATEADD(day,number,'2014-10-01') < '2014-11-01'
) t2
LEFT JOIN Table1 t1
ON t1.stmpdate = t2.dt
Result:
| DT | INTIME |
|--------------------------------|----------|
| October, 01 2014 00:00:00+0000 | |
| October, 02 2014 00:00:00+0000 | |
| October, 03 2014 00:00:00+0000 | |
| October, 04 2014 00:00:00+0000 | |
| October, 05 2014 00:00:00+0000 | |
| October, 06 2014 00:00:00+0000 | |
| October, 07 2014 00:00:00+0000 | |
| October, 08 2014 00:00:00+0000 | 08:04:43 |
| October, 09 2014 00:00:00+0000 | 07:57:13 |
| October, 10 2014 00:00:00+0000 | 07:57:14 |
| October, 11 2014 00:00:00+0000 | |
| October, 12 2014 00:00:00+0000 | |
| October, 13 2014 00:00:00+0000 | |
| October, 14 2014 00:00:00+0000 | |
| October, 15 2014 00:00:00+0000 | |
| October, 16 2014 00:00:00+0000 | 07:59:56 |
| October, 17 2014 00:00:00+0000 | 07:45:56 |
| October, 18 2014 00:00:00+0000 | |
| October, 19 2014 00:00:00+0000 | |
| October, 20 2014 00:00:00+0000 | |
| October, 21 2014 00:00:00+0000 | |
| October, 22 2014 00:00:00+0000 | |
| October, 23 2014 00:00:00+0000 | |
| October, 24 2014 00:00:00+0000 | |
| October, 25 2014 00:00:00+0000 | |
| October, 26 2014 00:00:00+0000 | |
| October, 27 2014 00:00:00+0000 | |
| October, 28 2014 00:00:00+0000 | |
| October, 29 2014 00:00:00+0000 | |
| October, 30 2014 00:00:00+0000 | |
| October, 31 2014 00:00:00+0000 | |

Related

How to get weekly data but starting from the first date of the month and do SUM calculation accordingly in BQ?

I have an issue pulling this kind of data. I need to pull weekly data with these specifications:
The data pull will be scheduled, hence it will involve multiple months
The very first week starts from the first date (the 1st of every month) -- green in the pic
The last week doesn't include dates from the next month -- red in the pic
The raw data and the desired output(s) will look more or less like this:
Is there any workaround to do this in BigQuery? Thanks (the data is attached below)
+-------------+-------+
| date | sales |
+-------------+-------+
| 1 Oct 2021 | 5 |
+-------------+-------+
| 2 Oct 2021 | 13 |
+-------------+-------+
| 3 Oct 2021 | 75 |
+-------------+-------+
| 4 Oct 2021 | 3 |
+-------------+-------+
| 5 Oct 2021 | 70 |
+-------------+-------+
| 6 Oct 2021 | 85 |
+-------------+-------+
| 7 Oct 2021 | 99 |
+-------------+-------+
| 8 Oct 2021 | 90 |
+-------------+-------+
| 9 Oct 2021 | 68 |
+-------------+-------+
| 10 Oct 2021 | 97 |
+-------------+-------+
| 11 Oct 2021 | 87 |
+-------------+-------+
| 12 Oct 2021 | 56 |
+-------------+-------+
| 13 Oct 2021 | 99 |
+-------------+-------+
| 14 Oct 2021 | 38 |
+-------------+-------+
| 15 Oct 2021 | 6 |
+-------------+-------+
| 16 Oct 2021 | 43 |
+-------------+-------+
| 17 Oct 2021 | 45 |
+-------------+-------+
| 18 Oct 2021 | 90 |
+-------------+-------+
| 19 Oct 2021 | 64 |
+-------------+-------+
| 20 Oct 2021 | 26 |
+-------------+-------+
| 21 Oct 2021 | 24 |
+-------------+-------+
| 22 Oct 2021 | 4 |
+-------------+-------+
| 23 Oct 2021 | 36 |
+-------------+-------+
| 24 Oct 2021 | 68 |
+-------------+-------+
| 25 Oct 2021 | 4 |
+-------------+-------+
| 26 Oct 2021 | 16 |
+-------------+-------+
| 27 Oct 2021 | 30 |
+-------------+-------+
| 28 Oct 2021 | 89 |
+-------------+-------+
| 29 Oct 2021 | 46 |
+-------------+-------+
| 30 Oct 2021 | 28 |
+-------------+-------+
| 31 Oct 2021 | 28 |
+-------------+-------+
| 1 Nov 2021 | 47 |
+-------------+-------+
| 2 Nov 2021 | 75 |
+-------------+-------+
| 3 Nov 2021 | 1 |
+-------------+-------+
| 4 Nov 2021 | 26 |
+-------------+-------+
| 5 Nov 2021 | 26 |
+-------------+-------+
| 6 Nov 2021 | 38 |
+-------------+-------+
| 7 Nov 2021 | 79 |
+-------------+-------+
| 8 Nov 2021 | 37 |
+-------------+-------+
| 9 Nov 2021 | 83 |
+-------------+-------+
| 10 Nov 2021 | 97 |
+-------------+-------+
| 11 Nov 2021 | 56 |
+-------------+-------+
| 12 Nov 2021 | 83 |
+-------------+-------+
| 13 Nov 2021 | 14 |
+-------------+-------+
| 14 Nov 2021 | 25 |
+-------------+-------+
| 15 Nov 2021 | 55 |
+-------------+-------+
| 16 Nov 2021 | 16 |
+-------------+-------+
| 17 Nov 2021 | 80 |
+-------------+-------+
| 18 Nov 2021 | 66 |
+-------------+-------+
| 19 Nov 2021 | 25 |
+-------------+-------+
| 20 Nov 2021 | 62 |
+-------------+-------+
| 21 Nov 2021 | 36 |
+-------------+-------+
| 22 Nov 2021 | 33 |
+-------------+-------+
| 23 Nov 2021 | 19 |
+-------------+-------+
| 24 Nov 2021 | 47 |
+-------------+-------+
| 25 Nov 2021 | 14 |
+-------------+-------+
| 26 Nov 2021 | 22 |
+-------------+-------+
| 27 Nov 2021 | 66 |
+-------------+-------+
| 28 Nov 2021 | 15 |
+-------------+-------+
| 29 Nov 2021 | 96 |
+-------------+-------+
| 30 Nov 2021 | 4 |
+-------------+-------+
Consider the approach below:
with temp as (
select parse_date('%d %B %Y', date) date, sales
from your_table
)
select format_date('%d %B %Y', weeks[ordinal(num)]) start_week, sum(sales) total_sales
from (
select sales, weeks, range_bucket(date, weeks) num
from temp, unnest([struct(generate_date_array(date_trunc(date, month), last_day(date, month), interval 7 day ) as weeks)])
)
group by start_week
If applied to the sample data (as is) in your question, the output is:
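The bucketing that RANGE_BUCKET performs here is just "distance from the 1st of the month, in 7-day steps", so a week never spills into the next month. A minimal Python sketch of that assignment (the helper name week_start is made up):

```python
from datetime import date, timedelta

def week_start(d: date) -> date:
    """Start of the 7-day bucket counted from the 1st of d's month,
    so buckets never cross a month boundary."""
    return d - timedelta(days=(d.day - 1) % 7)

# Buckets for October 2021 start on the 1st, 8th, 15th, 22nd and 29th:
print(week_start(date(2021, 10, 1)))   # 2021-10-01
print(week_start(date(2021, 10, 10)))  # 2021-10-08
print(week_start(date(2021, 10, 31)))  # 2021-10-29
print(week_start(date(2021, 11, 1)))   # 2021-11-01 (new month restarts)
```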

How to deduplicate table rows with the same date and keep the row with the most current date stamp?

A client (e-commerce store) doesn't have a very well-built database. For instance, there are many users with a lot of shopping orders (= different IDs) for exactly the same products on the same day. It is obvious that these seemingly multiple orders are in many cases just one unique order. At least that's what we have decided to assume to simplify the issue. (I am trying to do some basic data analytics.)
My table might look like this:
| Email | OrderID | Order_date | TotalAmount |
| ----------------- | --------- | ---------------- | ---------------- |
|customerA@gmail.com| 1 |Jan 01 2021 1:00PM| 2000 |
|customerA@gmail.com| 2 |Jan 01 2021 1:03PM| 2000 |
|customerA@gmail.com| 3 |Jan 01 2021 1:05PM| 2000 |
|customerA@gmail.com| 4 |Jan 01 2021 1:10PM| 2000 |
|customerA@gmail.com| 5 |Jan 01 2021 1:14PM| 2000 |
|customerA@gmail.com| 6 |Jan 03 2021 3:55PM| 3000 |
|customerA@gmail.com| 7 |Jan 03 2021 4:00PM| 3000 |
|customerA@gmail.com| 8 |Jan 03 2021 4:05PM| 3000 |
|customerB@gmail.com| 9 |Jan 04 2021 2:10PM| 1000 |
|customerB@gmail.com| 10 |Jan 04 2021 2:20PM| 1000 |
|customerB@gmail.com| 11 |Jan 04 2021 2:30PM| 1000 |
|customerB@gmail.com| 12 |Jan 06 2021 5:00PM| 5000 |
|customerC@gmail.com| 13 |Jan 09 2021 3:00PM| 4000 |
|customerC@gmail.com| 14 |Jan 09 2021 3:06PM| 4000 |
And my desired result would look like this:
| Email | OrderID | Order_date | TotalAmount |
| ----------------- | --------- | ---------------- | ---------------- |
|customerA@gmail.com| 5 |Jan 01 2021 1:14PM| 2000 |
|customerA@gmail.com| 8 |Jan 03 2021 4:05PM| 3000 |
|customerB@gmail.com| 11 |Jan 04 2021 2:30PM| 1000 |
|customerB@gmail.com| 12 |Jan 06 2021 5:00PM| 5000 |
|customerC@gmail.com| 14 |Jan 09 2021 3:06PM| 4000 |
I would guess this might be a common problem, but is there a simple solution to this?
Maybe there is, but I certainly don't seem to come up with one any time soon. I'd like to see even a complex solution, btw :-)
Thank you for any kind of help you can provide!
Do you mean this?
WITH
indata(Email,OrderID,Order_ts,TotalAmount) AS (
SELECT 'customerA@gmail.com', 1,TO_TIMESTAMP( 'Jan 01 2021 01:00PM','Mon DD YYYY HH12:MIAM'),2000
UNION ALL SELECT 'customerA@gmail.com', 2,TO_TIMESTAMP( 'Jan 01 2021 01:03PM','Mon DD YYYY HH12:MIAM'),2000
UNION ALL SELECT 'customerA@gmail.com', 3,TO_TIMESTAMP( 'Jan 01 2021 01:05PM','Mon DD YYYY HH12:MIAM'),2000
UNION ALL SELECT 'customerA@gmail.com', 4,TO_TIMESTAMP( 'Jan 01 2021 01:10PM','Mon DD YYYY HH12:MIAM'),2000
UNION ALL SELECT 'customerA@gmail.com', 5,TO_TIMESTAMP( 'Jan 01 2021 01:14PM','Mon DD YYYY HH12:MIAM'),2000
UNION ALL SELECT 'customerA@gmail.com', 6,TO_TIMESTAMP( 'Jan 03 2021 03:55PM','Mon DD YYYY HH12:MIAM'),3000
UNION ALL SELECT 'customerA@gmail.com', 7,TO_TIMESTAMP( 'Jan 03 2021 04:00PM','Mon DD YYYY HH12:MIAM'),3000
UNION ALL SELECT 'customerA@gmail.com', 8,TO_TIMESTAMP( 'Jan 03 2021 04:05PM','Mon DD YYYY HH12:MIAM'),3000
UNION ALL SELECT 'customerB@gmail.com', 9,TO_TIMESTAMP( 'Jan 04 2021 02:10PM','Mon DD YYYY HH12:MIAM'),1000
UNION ALL SELECT 'customerB@gmail.com',10,TO_TIMESTAMP( 'Jan 04 2021 02:20PM','Mon DD YYYY HH12:MIAM'),1000
UNION ALL SELECT 'customerB@gmail.com',11,TO_TIMESTAMP( 'Jan 04 2021 02:30PM','Mon DD YYYY HH12:MIAM'),1000
UNION ALL SELECT 'customerB@gmail.com',12,TO_TIMESTAMP( 'Jan 06 2021 05:00PM','Mon DD YYYY HH12:MIAM'),5000
UNION ALL SELECT 'customerC@gmail.com',13,TO_TIMESTAMP( 'Jan 09 2021 03:00PM','Mon DD YYYY HH12:MIAM'),4000
UNION ALL SELECT 'customerC@gmail.com',14,TO_TIMESTAMP( 'Jan 09 2021 03:06PM','Mon DD YYYY HH12:MIAM'),4000
)
,
-- Need a ROW_NUMBER() to identify the last row within each calendar day (order descending so it gets 1).
-- We can't filter by an OLAP function directly, so compute it in a fullselect and put the WHERE condition in the final SELECT.
with_rank AS (
SELECT
*
, ROW_NUMBER() OVER(PARTITION BY email, CAST(order_ts AS DATE) ORDER BY order_ts DESC) AS rank
FROM INDATA
)
SELECT
*
FROM with_rank
WHERE rank = 1;
-- out Email | OrderID | Order_ts | TotalAmount | rank
-- out ---------------------+---------+---------------------+-------------+------
-- out customerA@gmail.com | 5 | 2021-01-01 13:14:00 | 2000 | 1
-- out customerA@gmail.com | 8 | 2021-01-03 16:05:00 | 3000 | 1
-- out customerB@gmail.com | 11 | 2021-01-04 14:30:00 | 1000 | 1
-- out customerB@gmail.com | 12 | 2021-01-06 17:00:00 | 5000 | 1
-- out customerC@gmail.com | 14 | 2021-01-09 15:06:00 | 4000 | 1
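The key detail is that the partition key should be the full calendar date, not just the day number, or the 1st of January and the 1st of February would fall into the same group. A pure-Python sketch of the same keep-the-latest-row-per-(email, day) rule (sample rows abbreviated from the question):

```python
# Keep, for each (email, calendar day), the row with the latest timestamp.
# Pure-Python equivalent of the ROW_NUMBER() ... ORDER BY order_ts DESC filter.
rows = [
    ("customerA@gmail.com", 1, "2021-01-01 13:00", 2000),
    ("customerA@gmail.com", 5, "2021-01-01 13:14", 2000),
    ("customerA@gmail.com", 8, "2021-01-03 16:05", 3000),
    ("customerB@gmail.com", 11, "2021-01-04 14:30", 1000),
    ("customerB@gmail.com", 10, "2021-01-04 14:20", 1000),
]

latest = {}
for email, oid, ts, amt in rows:
    key = (email, ts[:10])          # partition by email + calendar date
    if key not in latest or ts > latest[key][2]:
        latest[key] = (email, oid, ts, amt)

result = sorted(latest.values(), key=lambda r: r[2])
print([r[1] for r in result])  # surviving order IDs: [5, 8, 11]
```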

How to create churn table from transactional data?

Currently my Transaction Table has each customer's transaction data for each month. Account_ID identifies the customer. Order_ID identifies the order the customer made. Reporting_week_start_date is the week, beginning on Monday, in which each transaction was made (Date_Purchased).
How do I create a new table that identifies the customer_status after each transaction has been made? Note that the new table runs from the Reporting_week_start_date until the current date even when no transactions have been made.
Customer_Status
- New : customers who made their first paid subscription
- Recurring : customers with continuous payment
- Churned : when customers' subscriptions had expired and there's no renewal within the next month/same month
- Reactivated : customers who had churned and then returned to re-subscribe
Transaction Table
Account_ID | Order_ID | Reporting_week_start_date| Date_Purchased | Data_Expired
001 | 1001 | 31 Dec 2018 | 01 Jan 2019 | 08 Jan 2019
001 | 1001 | 07 Jan 2019 | 08 Jan 2019 | 15 Jan 2019
001 | 1001 | 14 Jan 2019 | 15 Jan 2019 | 22 Jan 2019 #Transaction 1
001 | 1001 | 21 Jan 2019 | 22 Jan 2019 | 29 Jan 2019
001 | 1001 | 28 Jan 2019 | 29 Jan 2019 | 31 Jan 2019
001 | 1002 | 28 Jan 2019 | 01 Feb 2019 | 08 Feb 2019
001 | 1002 | 04 Feb 2019 | 08 Feb 2019 | 15 Feb 2019 #Transaction 2
001 | 1002 | 11 Feb 2019 | 15 Feb 2019 | 22 Feb 2019
001 | 1002 | 18 Feb 2019 | 22 Feb 2019 | 28 Feb 2019
001 | 1003 | 25 Feb 2019 | 01 Mar 2019 | 08 Mar 2019
001 | 1003 | 04 Mar 2019 | 08 Mar 2019 | 15 Mar 2019
001 | 1003 | 11 Mar 2019 | 15 Mar 2019 | 22 Mar 2019 #Transaction 3
001 | 1003 | 18 Mar 2019 | 22 Mar 2019 | 29 Mar 2019
001 | 1003 | 25 Mar 2019 | 29 Mar 2019 | 31 Mar 2019
001 | 1004 | 27 May 2019 | 01 Jun 2019 | 08 Jun 2019
001 | 1004 | 03 Jun 2019 | 08 Jun 2019 | 15 Jun 2019 #Transaction 4
001 | 1004 | 10 Jun 2019 | 15 Jun 2019 | 22 Jun 2019
001 | 1004 | 17 Jun 2019 | 22 Jun 2019 | 29 Jun 2019
001 | 1004 | 24 Jun 2019 | 29 Jun 2019 | 30 Jun 2019
Expected Output
Account_ID | Order_ID | Reporting_week_start_date| Customer_status
001 | 1001 | 31 Dec 2018 | New
001 | 1001 | 07 Jan 2019 | New #Transaction 1
001 | 1001 | 14 Jan 2019 | New
001 | 1001 | 21 Jan 2019 | New
001 | 1001 | 28 Jan 2019 | New
001 | 1002 | 28 Jan 2019 | Recurring
001 | 1002 | 04 Feb 2019 | Recurring #Transaction 2
001 | 1002 | 11 Feb 2019 | Recurring
001 | 1002 | 18 Feb 2019 | Recurring
001 | 1003 | 25 Feb 2019 | Churned
001 | 1003 | 04 Mar 2019 | Churned #Transaction 3
001 | 1003 | 11 Mar 2019 | Churned
001 | 1003 | 18 Mar 2019 | Churned
001 | 1003 | 25 Mar 2019 | Churned
001 | - | 1 Apr 2019 | Churned
001 | - | 08 Apr 2019 | Churned
001 | - | 15 Apr 2019 | Churned
001 | - | 22 Apr 2019 | Churned
001 | - | 29 Apr 2019 | Churned
001 | - | 06 May 2019 | Churned
001 | - | 13 May 2019 | Churned
001 | - | 20 May 2019 | Churned
001 | - | 27 May 2019 | Churned
001 | 1004 | 27 May 2019 | Reactivated
001 | 1004 | 03 Jun 2019 | Reactivated #Transaction 4
001 | 1004 | 10 Jun 2019 | Reactivated
001 | 1004 | 17 Jun 2019 | Reactivated
001 | 1004 | 24 Jun 2019 | Reactivated
...
...
...
current date
I think you just want window functions and case logic. Assuming the date you are referring to is Reporting_week_start_date, the logic looks something like this:
select t.*,
(case when Reporting_week_start_date = min(Reporting_week_start_date) over (partition by account_id)
then 'New'
when Reporting_week_start_date < dateadd(month, 1, lag(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date))
then 'Recurring'
when Reporting_week_start_date < dateadd(month, -1, lead(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date))
then 'Churned'
else 'Reactivated'
end) as status
from transactions t;
These are not exactly the rules you have specified. But they seem very reasonable interpretations of what you want to do.
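A rough sketch of those status rules in plain Python, under the simplifying assumption that "within a month" means a gap of at most 31 days (the classify helper is made up, and the churned weeks between transactions are not expanded here):

```python
from datetime import date

def classify(purchases):
    """Label each purchase date: the first is New; one within ~31 days of
    the previous is Recurring; one after a longer gap is Reactivated.
    (The churned stretch is the gap itself, between the two labels.)"""
    labels = []
    prev = None
    for d in purchases:
        if prev is None:
            labels.append("New")
        elif (d - prev).days <= 31:
            labels.append("Recurring")
        else:
            labels.append("Reactivated")
        prev = d
    return labels

dates = [date(2019, 1, 1), date(2019, 2, 1), date(2019, 3, 1), date(2019, 6, 1)]
print(classify(dates))  # ['New', 'Recurring', 'Recurring', 'Reactivated']
```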

Right join multiple key where multiple key is null

I need help from Captain Obvious, I suppose. I'm trying to insert data from a table into a temp table. OK, this is easy.
I need to insert the data we got today and the data we got 10 days ago. The WHERE clause can handle that, that's okay.
What's hard for me is to insert today's data only if it does not appear in the data from 10 days ago.
An exemple of the table I use ([datatable]) :
Date Purchase Line_Purchase
---------------------------------------------------------------------------
2017-04-29 0000002 01
2017-04-29 0000002 02
2017-04-29 0000003 01
2017-04-29 0000003 02
2017-04-29 0000003 03
2017-04-29 0000004 01
2017-04-29 0000005 01
2017-04-19 0000001 01
2017-04-19 0000001 02
2017-04-19 0000001 03
2017-04-19 0000002 01
2017-04-19 0000002 02
My desired table temptable:
Input_date Purchase Line_Purchase
-------------------------------------------------------------------------
2017-04-19 0000001 01
2017-04-19 0000001 02
2017-04-19 0000001 03
2017-04-19 0000002 01
2017-04-19 0000002 02
2017-04-29 0000003 01
2017-04-29 0000003 02
2017-04-29 0000003 03
2017-04-29 0000004 01
2017-04-29 0000005 01
Is there any query possible in SQL that can do that?
I tried this way
INSERT INTO #TEMPTABLE
(Input_date, Purchase, Line_Purchase)
SELECT
dt.Date
,dt.Purchase
,dt.Line_Purchase
FROM
datatable dt
WHERE
convert(date, dt.Date) = convert(date, GETDATE() - 10)

INSERT INTO #TEMPTABLE
(Input_date, Purchase, Line_Purchase)
SELECT
dt.Date
,dt.Purchase
,dt.Line_Purchase
FROM
datatable dt
RIGHT JOIN #TEMPTABLE temp
on dt.Purchase = temp.Purchase and dt.Line_Purchase = temp.Line_Purchase
WHERE
convert(date, dt.Date) = convert(date, GETDATE())
AND (temp.Purchase is null AND temp.Line_Purchase is null)
Thanks in advance
You can do this with not exists():
select date as Input_date, Purchase, Line_Purchase
into #temptable
from t
where date = '2017-04-19' --convert(date, getdate() - 10);
insert into #temptable (Input_date, Purchase, Line_Purchase)
select *
from t
where date = '2017-04-29'
and not exists (
select 1
from t as i
where i.purchase=t.purchase
and i.line_purchase=t.line_purchase
and i.date = '2017-04-19' --convert(date, getdate() - 10)
);
select *
from #temptable;
rextester demo: http://rextester.com/SAQSG21367
returns:
+------------+----------+---------------+
| Input_Date | Purchase | Line_Purchase |
+------------+----------+---------------+
| 2017-04-19 | 0000001 | 01 |
| 2017-04-19 | 0000001 | 02 |
| 2017-04-19 | 0000001 | 03 |
| 2017-04-19 | 0000002 | 01 |
| 2017-04-19 | 0000002 | 02 |
| 2017-04-29 | 0000003 | 01 |
| 2017-04-29 | 0000003 | 02 |
| 2017-04-29 | 0000003 | 03 |
| 2017-04-29 | 0000004 | 01 |
| 2017-04-29 | 0000005 | 01 |
+------------+----------+---------------+
Optionally, if you are doing both of these operations at the same time, you can do it in one query using a derived table/subquery or a common table expression with row_number():
;with cte as (
select date, Purchase, Line_Purchase
, rn = row_number() over (partition by Purchase,Line_Purchase order by date)
from t
--where date in ('2017-09-26','2017-09-16')
where date in (convert(date, getdate()), convert(date, getdate()-10))
)
select date as Input_date, Purchase, Line_Purchase
into #temptable
from cte
where rn = 1
select *
from #temptable;
rextester demo: http://rextester.com/QMF5992
returns:
+------------+----------+---------------+
| Input_date | Purchase | Line_Purchase |
+------------+----------+---------------+
| 2017-09-16 | 0000001 | 01 |
| 2017-09-16 | 0000001 | 02 |
| 2017-09-16 | 0000001 | 03 |
| 2017-09-16 | 0000002 | 01 |
| 2017-09-16 | 0000002 | 02 |
| 2017-09-26 | 0000003 | 01 |
| 2017-09-26 | 0000003 | 02 |
| 2017-09-26 | 0000003 | 03 |
| 2017-09-26 | 0000004 | 01 |
| 2017-09-26 | 0000005 | 01 |
+------------+----------+---------------+
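The row_number() trick keeps, for each (Purchase, Line_Purchase) pair, the row with the earliest date. A tiny pure-Python equivalent of that dedup (rows abbreviated from the sample data):

```python
# Keep the earliest-dated row for each (Purchase, Line_Purchase) key,
# mirroring row_number() over (partition by ... order by date) = 1.
rows = [
    ("2017-04-29", "0000002", "01"), ("2017-04-29", "0000002", "02"),
    ("2017-04-29", "0000003", "01"), ("2017-04-29", "0000004", "01"),
    ("2017-04-19", "0000001", "01"), ("2017-04-19", "0000002", "01"),
    ("2017-04-19", "0000002", "02"),
]

earliest = {}
for d, purchase, line in sorted(rows):           # sorting puts older dates first
    earliest.setdefault((purchase, line), (d, purchase, line))

result = sorted(earliest.values())
print(result)  # 0000002 lines keep the 04-19 date; 0000003/0000004 keep 04-29
```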

Data Mismatch after Left Join SQL

I have two tables, one that contains production data and the other has forecasted data. I am joining the two tables to compare the actual production data to forecasted data.
My sample tables are as follows:
**Prod Tbl**
Product Plant pmonth pyear quantity
B007 2 January 2014 45
B007 2 February 2014 270
B007 2 March 2014 270
B007 2 April 2014 45
B007 2 May 2014 90
B007 2 May 2014 90
B007 2 June 2014 90
B007 2 June 2014 90
B007 2 July 2014 135
B007 2 July 2014 45
B007 2 August 2014 135
B007 2 August 2014 135
B007 2 July 2015 90
B007 2 August 2014 135
B007 2 September 2014 135
B007 2 September 2015 135
B007 2 October 2015 90
B007 2 September 2014 135
B007 2 September 2014 90
B007 2 September 2014 90
B007 2 November 2014 254
B007 2 May 2016 90
B007 2 August 2016 135
B007 2 October 2016 87
**Forecast Tbl**
Product Plant Fmonth Fyear Fqty
B007 2 July 2017 100
B007 2 August 2017 100
B007 2 September 2017 100
B007 2 October 2017 100
B007 2 November 2017 100
B007 2 December 2017 100
Query Used to Join:
Select a.Product,
a.plant,
b.pmonth,
b.pyear,
coalesce(b.quantity,0) as quantity,
a.fmonth,
a.fyear,coalesce(a.fqty,0) as fqty
from
Frcast_Tbl as a
left join Prod_Tbl as b on (a.Product = b.Product
and a.Plant = b.plant
and b.pMonth = a.fMonth);
Result:
After Joining
Product Plant Pmonth Pyear Quantity Fmonth Fyear fqty
B007 2 July 2014 180 July 2017 100
B007 2 July 2015 90 July 2017 100
B007 2 August 2014 405 August 2017 100
B007 2 August 2016 315 August 2017 100
B007 2 September 2014 450 September 2017 100
B007 2 September 2015 135 September 2017 100
B007 2 October 2016 177 October 2017 100
B007 2 October 2015 90 October 2017 100
B007 2 November 2014 356 November 2017 100
B007 2 December 2016 90 December 2017 100
B007 2 January 2015 90 January 2018 100
B007 2 January 2016 90 January 2018 100
B007 2 January 2014 45 January 2018 100
B007 2 January 2017 90 January 2018 100
B007 2 February 2014 270 February 2018 99
B007 2 March 2014 270 March 2018 101
B007 2 March 2017 90 March 2018 101
B007 2 April 2014 45 April 2018 100
B007 2 May 2016 90 May 2018 100
B007 2 May 2014 180 May 2018 100
B007 2 May 2017 90 May 2018 100
Filtered for a particular year to better explain the problem
Producr plant pmonth pyear quantity fmonth fyear fqty
B007 2 August 2016 315 August 2017 100
B007 2 October 2016 177 October 2017 100
B007 2 December 2016 90 December 2017 100
Desired Table
Product Plant Pmonth Pyear Quantity fmonth fyear fqty
B007 2 January 2016 90 null null 0
B007 2 May 2016 90 null null 0
B007 2 June 2016 270 null null 0
B007 2 null null 0 July 2017 100
B007 2 August 2016 315 August 2017 100
B007 2 null null 0 September 2017 100
B007 2 October 2016 177 October 2017 100
B007 2 null null 0 November 2017 100
B007 2 December 2016 90 December 2017 100
What my query is doing is joining item, plant and month using a left join, but I want the resultant table to display all the months for both prod and frcast and, in cases where a month is not found, display null or zeros. Please help.
You could try this. The subquery after the FULL JOIN is there to extract only one year from the Products table.
I added a CASE expression for the ORDER BY too.
One year version
SELECT COALESCE(a.Product,b.Product) AS PRODUCT,
COALESCE(a.plant,b.plant) AS PLANT,
b.pmonth,
b.pyear,
coalesce(b.quantity,0) as quantity,
a.fmonth AS FMONTH,
a.fyear,
coalesce(a.fqty,0) as fqty
FROM FORECAST A
FULL JOIN (SELECT * FROM PROD WHERE pyear=2016) B on a.Product = b.Product
and a.Plant = b.plant
and A.fmonth = b.pMonth
ORDER BY CASE COALESCE(b.pmonth, a.fmonth)
WHEN 'January' THEN 1
WHEN 'February' THEN 2
WHEN 'March' THEN 3
WHEN 'April' THEN 4
WHEN 'May' THEN 5
WHEN 'June' THEN 6
WHEN 'July' THEN 7
WHEN 'August' THEN 8
WHEN 'September' THEN 9
WHEN 'October' THEN 10
WHEN 'November' THEN 11
WHEN 'December' THEN 12
END ;
Please note that your sample data (first table) is not complete.
Output:
+-------------+-------+---------+-------+----------+-----------+-------+------+
| PRODUCT | PLANT | pmonth | pyear | quantity | FMONTH | fyear | fqty |
+-------------+-------+---------+-------+----------+-----------+-------+------+
| B007 | 2 | January | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | May | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | June | 2016 | 270 | NULL | NULL | 0 |
| B007 | 2 | NULL | NULL | 0 | July | 2017 | 100 |
| B007 | 2 | August | 2016 | 135 | August | 2017 | 100 |
| B007 | 2 | NULL | NULL | 0 | September | 2017 | 100 |
| B007 | 2 | October | 2016 | 87 | October | 2017 | 100 |
| B007 | 2 | NULL | NULL | 0 | November | 2017 | 100 |
| B007 | 2 | NULL | NULL | 0 | December | 2017 | 100 |
+-------------+-------+---------+-------+----------+-----------+-------+------+
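FULL JOIN semantics here mean: keep a month if it appears on either side and fill the missing side with zeros. That can be sketched in plain Python (quantities taken from the one-year output above):

```python
# FULL JOIN semantics sketched with dicts: keep months present on either side.
prod = {"January": 90, "May": 90, "June": 270, "August": 135, "October": 87}
forecast = {"July": 100, "August": 100, "September": 100, "October": 100,
            "November": 100, "December": 100}

# Union of keys; COALESCE(qty, 0) becomes dict.get(month, 0).
merged = {m: (prod.get(m, 0), forecast.get(m, 0))
          for m in list(prod) + [m for m in forecast if m not in prod]}

print(merged["August"])   # matched on both sides -> (135, 100)
print(merged["January"])  # production only       -> (90, 0)
print(merged["July"])     # forecast only         -> (0, 100)
```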
Added: multi year version, with group by on PROD table
SELECT COALESCE(a.Product,b.Product) AS PRODUCT,
COALESCE(a.plant,b.plant) AS PLANT,
b.pmonth,
COALESCE(b.pyear,Y.pyear) AS pyear,
COALESCE(b.quantity,0) as quantity,
a.fmonth AS FMONTH,
a.fyear,
coalesce(a.fqty,0) as fqty
FROM FORECAST A
CROSS JOIN (SELECT DISTINCT pyear FROM PROD /* WHERE pyear IN (2015,2016)*/ ) Y
FULL JOIN (SELECT Product, Plant, pyear, pmonth, SUM(quantity) AS quantity
FROM PROD /*WHERE pyear IN (2015,2016)*/
GROUP BY Product, Plant, pyear, pmonth
) B on a.Product = b.Product
and a.Plant = b.plant
and A.fmonth = b.pMonth
AND Y.pyear= B.pyear
ORDER BY COALESCE(b.pyear,Y.pyear), CASE COALESCE(b.pmonth, a.fmonth)
WHEN 'January' THEN 1
WHEN 'February' THEN 2
WHEN 'March' THEN 3
WHEN 'April' THEN 4
WHEN 'May' THEN 5
WHEN 'June' THEN 6
WHEN 'July' THEN 7
WHEN 'August' THEN 8
WHEN 'September' THEN 9
WHEN 'October' THEN 10
WHEN 'November' THEN 11
WHEN 'December' THEN 12
END ;
Output:
+---------+-------+-----------+-------+----------+-----------+-------+------+
| PRODUCT | PLANT | pmonth | pyear | quantity | FMONTH | fyear | fqty |
+---------+-------+-----------+-------+----------+-----------+-------+------+
| B007 | 2 | January | 2014 | 45 | NULL | NULL | 0 |
| B007 | 2 | February | 2014 | 270 | NULL | NULL | 0 |
| B007 | 2 | March | 2014 | 270 | NULL | NULL | 0 |
| B007 | 2 | April | 2014 | 45 | NULL | NULL | 0 |
| B007 | 2 | May | 2014 | 180 | NULL | NULL | 0 |
| B007 | 2 | June | 2014 | 180 | NULL | NULL | 0 |
| B007 | 2 | July | 2014 | 180 | July | 2017 | 100 |
| B007 | 2 | August | 2014 | 405 | August | 2017 | 100 |
| B007 | 2 | September | 2014 | 450 | September | 2017 | 100 |
| B007 | 2 | NULL | 2014 | 0 | October | 2017 | 100 |
| B007 | 2 | November | 2014 | 254 | November | 2017 | 100 |
| B007 | 2 | NULL | 2014 | 0 | December | 2017 | 100 |
| B007 | 2 | July | 2015 | 90 | July | 2017 | 100 |
| B007 | 2 | NULL | 2015 | 0 | August | 2017 | 100 |
| B007 | 2 | September | 2015 | 135 | September | 2017 | 100 |
| B007 | 2 | October | 2015 | 90 | October | 2017 | 100 |
| B007 | 2 | NULL | 2015 | 0 | November | 2017 | 100 |
| B007 | 2 | NULL | 2015 | 0 | December | 2017 | 100 |
| B007 | 2 | January | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | May | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | June | 2016 | 270 | NULL | NULL | 0 |
| B007 | 2 | NULL | 2016 | 0 | July | 2017 | 100 |
| B007 | 2 | August | 2016 | 135 | August | 2017 | 100 |
| B007 | 2 | NULL | 2016 | 0 | September | 2017 | 100 |
| B007 | 2 | October | 2016 | 87 | October | 2017 | 100 |
| B007 | 2 | NULL | 2016 | 0 | November | 2017 | 100 |
| B007 | 2 | NULL | 2016 | 0 | December | 2017 | 100 |
+---------+-------+-----------+-------+----------+-----------+-------+------+
Use FULL OUTER JOIN when you are filtering records for a specific year:
Select a.Product,a.plant,b.pmonth,b.pyear,coalesce(b.quantity,0) as quantity,a.fmonth,a.fyear,coalesce(a.fqty,0) as fqty
from Frcast_Tbl as a
FULL OUTER JOIN Prod_Tbl as b on a.Product = b.Product and
a.Plant = b.plant and
b.pMonth = a.fMonth