Select between two dates, automatically generating consecutive monthly rows - sql

Given two dates, I want to automatically generate a row for each consecutive month between them, e.g.:
Select *
from somefunction('2013/5','2019/3');
Expected result:
Year | Month
-----+------
2013 | 5
2013 | 6
.. | ..
2013 | 12
.. | ..
.. | ..
2019 | 1
2019 | 2
2019 | 3

I have solved the problem; the solution is provided below.
declare @dStart datetime = '2013/05/01'
      , @dEnd   datetime = '2019/03/31';

SELECT year(dateadd(month, number, @dStart))  as [year]
     , month(dateadd(month, number, @dStart)) as [month]
FROM master..spt_values
WHERE type = 'P'   -- the 'P' rows supply the numbers 0 through 2047
  AND number <= DATEDIFF(month, @dStart, @dEnd)
ORDER BY number;
GO
year | month
---: | ----:
2013 | 5
2013 | 6
2013 | 7
2013 | 8
2013 | 9
2013 | 10
2013 | 11
2013 | 12
2014 | 1
2014 | 2
2014 | 3
2014 | 4
2014 | 5
2014 | 6
2014 | 7
2014 | 8
2014 | 9
2014 | 10
2014 | 11
2014 | 12
2015 | 1
2015 | 2
2015 | 3
2015 | 4
2015 | 5
2015 | 6
2015 | 7
2015 | 8
2015 | 9
2015 | 10
2015 | 11
2015 | 12
2016 | 1
2016 | 2
2016 | 3
2016 | 4
2016 | 5
2016 | 6
2016 | 7
2016 | 8
2016 | 9
2016 | 10
2016 | 11
2016 | 12
2017 | 1
2017 | 2
2017 | 3
2017 | 4
2017 | 5
2017 | 6
2017 | 7
2017 | 8
2017 | 9
2017 | 10
2017 | 11
2017 | 12
2018 | 1
2018 | 2
2018 | 3
2018 | 4
2018 | 5
2018 | 6
2018 | 7
2018 | 8
2018 | 9
2018 | 10
2018 | 11
2018 | 12
2019 | 1
2019 | 2
2019 | 3
db<>fiddle here
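Note that master..spt_values is undocumented and its type = 'P' rows only supply the numbers 0 through 2047. Here is a sketch of the same result built on a recursive CTE instead, with no dependency on that table (assuming SQL Server 2005 or later):
declare @dStart datetime = '2013/05/01'
      , @dEnd   datetime = '2019/03/31';

with months as (
    select @dStart as d
    union all
    select dateadd(month, 1, d)
    from months
    where dateadd(month, 1, d) <= @dEnd
)
select year(d) as [year], month(d) as [month]
from months
option (maxrecursion 0);  -- lift the default cap of 100 recursions for long ranges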

Related

How do I adapt the code that calculates the value for the four weeks of the month so it works for all months, with pl/sql?

I divided the month into four weeks and printed the amount for each week. How do I set this up with a loop for 12 months?
declare
  cursor c is
    select varis_tar, tutar
      from muhasebe.doviz_takip
     where trunc(varis_tar) BETWEEN TO_DATE('01/10/2021', 'DD/MM/YYYY')
                                AND TO_DATE('31/10/2021', 'DD/MM/YYYY')
     group by varis_tar, tutar;
  tutar1 number(13,2) := 0;
  tutar2 number(13,2) := 0;
  tutar3 number(13,2) := 0;
  tutar4 number(13,2) := 0;
begin
  for r in c loop
    if r.varis_tar between TO_DATE('01/10/2021', 'DD/MM/YYYY')
                       AND TO_DATE('07/10/2021', 'DD/MM/YYYY') then
      tutar1 := r.tutar + tutar1;
      --message(r.tutar);
    elsif r.varis_tar between TO_DATE('07/10/2021', 'DD/MM/YYYY')
                          AND TO_DATE('14/10/2021', 'DD/MM/YYYY') then
      tutar2 := r.tutar + tutar2;
      --message(r.tutar);
    elsif r.varis_tar between TO_DATE('14/10/2021', 'DD/MM/YYYY')
                          AND TO_DATE('21/10/2021', 'DD/MM/YYYY') then
      tutar3 := r.tutar + tutar3;
      --message(r.tutar);
    elsif r.varis_tar between TO_DATE('21/10/2021', 'DD/MM/YYYY')
                          AND TO_DATE('31/10/2021', 'DD/MM/YYYY') then
      tutar4 := r.tutar + tutar4;
      --message(r.tutar);
    end if;
  end loop;
end;
I tried to express the dates the same way for all months, but it gave the wrong results:
where trunc(varis_tar) BETWEEN TO_DATE('1', 'DD') AND TO_DATE('31', 'DD')

if r.varis_tar between TO_DATE('1', 'DD') AND TO_DATE('07', 'DD') then
elsif r.varis_tar between TO_DATE('7', 'DD') AND TO_DATE('14', 'DD') then
elsif r.varis_tar between TO_DATE('14', 'DD') AND TO_DATE('21', 'DD') then
elsif r.varis_tar between TO_DATE('21', 'DD') AND TO_DATE('31', 'DD') then
I don't know if I'm understanding it correctly, but try:
if extract(day from varis_tar) between 1 and 7
or, more complex:
l_week := to_char(varis_tar, 'W'); -- week-of-month number
if l_week = '1' then -- first week
elsif l_week = '2' then ... etc.
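Building on that idea, here is a minimal sketch (assuming the muhasebe.doviz_takip table from the question) that computes all four weekly totals in one plain SQL query, with no cursor loop:
select sum(case when extract(day from varis_tar) between 1  and 7  then tutar end) as tutar1
     , sum(case when extract(day from varis_tar) between 8  and 14 then tutar end) as tutar2
     , sum(case when extract(day from varis_tar) between 15 and 21 then tutar end) as tutar3
     , sum(case when extract(day from varis_tar) >= 22             then tutar end) as tutar4
  from muhasebe.doviz_takip
 where varis_tar >= date '2021-10-01'
   and varis_tar <  date '2021-11-01';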
Your code has several issues:
- date in Oracle is actually a datetime, so between will miss any time after midnight of the upper boundary.
- you count the midnight at each week's end twice: once in the current week and once in the next week (between includes both boundaries).
- you do not need any PL/SQL here, and especially not a cursor loop, which holds resources while the calculation happens outside the SQL context.
Use a datetime format model to compute the week, because it is easy to read and understand, then group by the corresponding components.
with a as (
  select
    date '2021-01-01' - 1 + level as dt
    , level as val
  from dual
  connect by level < 400
)
, b as (
  select
    dt
    , val
    /* Map days 29, 30 and 31 to 28 */
    , to_char(
        least(dt, trunc(dt, 'mm') + 27)
        , 'yyyymmw'
      ) as w
  from a
)
select
  substr(w, 1, 4) as y
  , substr(w, 5, 2) as m
  , substr(w, -1) as w
  , sum(val) as val
  , min(dt) as dt_from
  , max(dt) as dt_to
from b
group by w
Y | M | W | VAL | DT_FROM | DT_TO
:--- | :- | :- | ---: | :--------- | :---------
2021 | 01 | 1 | 28 | 2021-01-01 | 2021-01-07
2021 | 01 | 2 | 77 | 2021-01-08 | 2021-01-14
2021 | 01 | 3 | 126 | 2021-01-15 | 2021-01-21
2021 | 01 | 4 | 265 | 2021-01-22 | 2021-01-31
2021 | 02 | 1 | 245 | 2021-02-01 | 2021-02-07
2021 | 02 | 2 | 294 | 2021-02-08 | 2021-02-14
2021 | 02 | 3 | 343 | 2021-02-15 | 2021-02-21
2021 | 02 | 4 | 392 | 2021-02-22 | 2021-02-28
2021 | 03 | 1 | 441 | 2021-03-01 | 2021-03-07
2021 | 03 | 2 | 490 | 2021-03-08 | 2021-03-14
2021 | 03 | 3 | 539 | 2021-03-15 | 2021-03-21
2021 | 03 | 4 | 855 | 2021-03-22 | 2021-03-31
2021 | 04 | 1 | 658 | 2021-04-01 | 2021-04-07
2021 | 04 | 2 | 707 | 2021-04-08 | 2021-04-14
2021 | 04 | 3 | 756 | 2021-04-15 | 2021-04-21
2021 | 04 | 4 | 1044 | 2021-04-22 | 2021-04-30
2021 | 05 | 1 | 868 | 2021-05-01 | 2021-05-07
2021 | 05 | 2 | 917 | 2021-05-08 | 2021-05-14
2021 | 05 | 3 | 966 | 2021-05-15 | 2021-05-21
2021 | 05 | 4 | 1465 | 2021-05-22 | 2021-05-31
2021 | 06 | 1 | 1085 | 2021-06-01 | 2021-06-07
2021 | 06 | 2 | 1134 | 2021-06-08 | 2021-06-14
2021 | 06 | 3 | 1183 | 2021-06-15 | 2021-06-21
2021 | 06 | 4 | 1593 | 2021-06-22 | 2021-06-30
2021 | 07 | 1 | 1295 | 2021-07-01 | 2021-07-07
2021 | 07 | 2 | 1344 | 2021-07-08 | 2021-07-14
2021 | 07 | 3 | 1393 | 2021-07-15 | 2021-07-21
2021 | 07 | 4 | 2075 | 2021-07-22 | 2021-07-31
2021 | 08 | 1 | 1512 | 2021-08-01 | 2021-08-07
2021 | 08 | 2 | 1561 | 2021-08-08 | 2021-08-14
2021 | 08 | 3 | 1610 | 2021-08-15 | 2021-08-21
2021 | 08 | 4 | 2385 | 2021-08-22 | 2021-08-31
2021 | 09 | 1 | 1729 | 2021-09-01 | 2021-09-07
2021 | 09 | 2 | 1778 | 2021-09-08 | 2021-09-14
2021 | 09 | 3 | 1827 | 2021-09-15 | 2021-09-21
2021 | 09 | 4 | 2421 | 2021-09-22 | 2021-09-30
2021 | 10 | 1 | 1939 | 2021-10-01 | 2021-10-07
2021 | 10 | 2 | 1988 | 2021-10-08 | 2021-10-14
2021 | 10 | 3 | 2037 | 2021-10-15 | 2021-10-21
2021 | 10 | 4 | 2995 | 2021-10-22 | 2021-10-31
2021 | 11 | 1 | 2156 | 2021-11-01 | 2021-11-07
2021 | 11 | 2 | 2205 | 2021-11-08 | 2021-11-14
2021 | 11 | 3 | 2254 | 2021-11-15 | 2021-11-21
2021 | 11 | 4 | 2970 | 2021-11-22 | 2021-11-30
2021 | 12 | 1 | 2366 | 2021-12-01 | 2021-12-07
2021 | 12 | 2 | 2415 | 2021-12-08 | 2021-12-14
2021 | 12 | 3 | 2464 | 2021-12-15 | 2021-12-21
2021 | 12 | 4 | 3605 | 2021-12-22 | 2021-12-31
2022 | 01 | 1 | 2583 | 2022-01-01 | 2022-01-07
2022 | 01 | 2 | 2632 | 2022-01-08 | 2022-01-14
2022 | 01 | 3 | 2681 | 2022-01-15 | 2022-01-21
2022 | 01 | 4 | 3915 | 2022-01-22 | 2022-01-31
2022 | 02 | 1 | 1194 | 2022-02-01 | 2022-02-03
db<>fiddle here
Or the same in columns:
with a as (
  select
    date '2021-01-01' - 1 + level as dt
    , level as val
  from dual
  connect by level < 400
)
, b as (
  select
    val
    , to_char(dt, 'yyyymm') as m
    /* Map days 29, 30 and 31 to 28 */
    , to_char(
        least(dt, trunc(dt, 'mm') + 27)
        , 'w'
      ) as w
  from a
)
select
  substr(m, 1, 4) as y
  , substr(m, 5, 2) as m
  , tutar1
  , tutar2
  , tutar3
  , tutar4
from b
pivot(
  sum(val)
  for w in (
    1 as tutar1, 2 as tutar2
    , 3 as tutar3, 4 as tutar4
  )
)
Y | M | TUTAR1 | TUTAR2 | TUTAR3 | TUTAR4
:--- | :- | -----: | -----: | -----: | -----:
2021 | 01 | 28 | 77 | 126 | 265
2021 | 02 | 245 | 294 | 343 | 392
2021 | 03 | 441 | 490 | 539 | 855
2021 | 04 | 658 | 707 | 756 | 1044
2021 | 05 | 868 | 917 | 966 | 1465
2021 | 06 | 1085 | 1134 | 1183 | 1593
2021 | 07 | 1295 | 1344 | 1393 | 2075
2021 | 08 | 1512 | 1561 | 1610 | 2385
2021 | 09 | 1729 | 1778 | 1827 | 2421
2021 | 10 | 1939 | 1988 | 2037 | 2995
2021 | 11 | 2156 | 2205 | 2254 | 2970
2021 | 12 | 2366 | 2415 | 2464 | 3605
2022 | 01 | 2583 | 2632 | 2681 | 3915
2022 | 02 | 1194 | null | null | null
db<>fiddle here
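A note on why the mapping works: Oracle's 'W' format element numbers weeks within the month by day of month (days 1-7 are week 1, days 8-14 are week 2, and so on), so clamping days 29-31 down to 28 with least() is what folds the month's tail into the fourth bucket, matching the four-week split the question asks for.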

How to get weekly data starting from the first date of the month, and do the SUM calculation accordingly, in BQ?

I'm having trouble pulling this kind of data. I need to pull weekly data with these specifications:
The data pull will be scheduled, hence it will involve multiple months
The very first week starts from the first date (the 1st of every month) -- green in the pic
The last week doesn't include dates from the next month -- red in the pic
The raw data and the desired output will look more or less like this:
Is there any workaround to do this in BigQuery? Thanks (the data is attached below)
+-------------+-------+
| date | sales |
+-------------+-------+
| 1 Oct 2021 | 5 |
+-------------+-------+
| 2 Oct 2021 | 13 |
+-------------+-------+
| 3 Oct 2021 | 75 |
+-------------+-------+
| 4 Oct 2021 | 3 |
+-------------+-------+
| 5 Oct 2021 | 70 |
+-------------+-------+
| 6 Oct 2021 | 85 |
+-------------+-------+
| 7 Oct 2021 | 99 |
+-------------+-------+
| 8 Oct 2021 | 90 |
+-------------+-------+
| 9 Oct 2021 | 68 |
+-------------+-------+
| 10 Oct 2021 | 97 |
+-------------+-------+
| 11 Oct 2021 | 87 |
+-------------+-------+
| 12 Oct 2021 | 56 |
+-------------+-------+
| 13 Oct 2021 | 99 |
+-------------+-------+
| 14 Oct 2021 | 38 |
+-------------+-------+
| 15 Oct 2021 | 6 |
+-------------+-------+
| 16 Oct 2021 | 43 |
+-------------+-------+
| 17 Oct 2021 | 45 |
+-------------+-------+
| 18 Oct 2021 | 90 |
+-------------+-------+
| 19 Oct 2021 | 64 |
+-------------+-------+
| 20 Oct 2021 | 26 |
+-------------+-------+
| 21 Oct 2021 | 24 |
+-------------+-------+
| 22 Oct 2021 | 4 |
+-------------+-------+
| 23 Oct 2021 | 36 |
+-------------+-------+
| 24 Oct 2021 | 68 |
+-------------+-------+
| 25 Oct 2021 | 4 |
+-------------+-------+
| 26 Oct 2021 | 16 |
+-------------+-------+
| 27 Oct 2021 | 30 |
+-------------+-------+
| 28 Oct 2021 | 89 |
+-------------+-------+
| 29 Oct 2021 | 46 |
+-------------+-------+
| 30 Oct 2021 | 28 |
+-------------+-------+
| 31 Oct 2021 | 28 |
+-------------+-------+
| 1 Nov 2021 | 47 |
+-------------+-------+
| 2 Nov 2021 | 75 |
+-------------+-------+
| 3 Nov 2021 | 1 |
+-------------+-------+
| 4 Nov 2021 | 26 |
+-------------+-------+
| 5 Nov 2021 | 26 |
+-------------+-------+
| 6 Nov 2021 | 38 |
+-------------+-------+
| 7 Nov 2021 | 79 |
+-------------+-------+
| 8 Nov 2021 | 37 |
+-------------+-------+
| 9 Nov 2021 | 83 |
+-------------+-------+
| 10 Nov 2021 | 97 |
+-------------+-------+
| 11 Nov 2021 | 56 |
+-------------+-------+
| 12 Nov 2021 | 83 |
+-------------+-------+
| 13 Nov 2021 | 14 |
+-------------+-------+
| 14 Nov 2021 | 25 |
+-------------+-------+
| 15 Nov 2021 | 55 |
+-------------+-------+
| 16 Nov 2021 | 16 |
+-------------+-------+
| 17 Nov 2021 | 80 |
+-------------+-------+
| 18 Nov 2021 | 66 |
+-------------+-------+
| 19 Nov 2021 | 25 |
+-------------+-------+
| 20 Nov 2021 | 62 |
+-------------+-------+
| 21 Nov 2021 | 36 |
+-------------+-------+
| 22 Nov 2021 | 33 |
+-------------+-------+
| 23 Nov 2021 | 19 |
+-------------+-------+
| 24 Nov 2021 | 47 |
+-------------+-------+
| 25 Nov 2021 | 14 |
+-------------+-------+
| 26 Nov 2021 | 22 |
+-------------+-------+
| 27 Nov 2021 | 66 |
+-------------+-------+
| 28 Nov 2021 | 15 |
+-------------+-------+
| 29 Nov 2021 | 96 |
+-------------+-------+
| 30 Nov 2021 | 4 |
+-------------+-------+
Consider below approach
with temp as (
  select parse_date('%d %B %Y', date) date, sales
  from your_table
)
select format_date('%d %B %Y', weeks[ordinal(num)]) start_week, sum(sales) total_sales
from (
  select sales, weeks, range_bucket(date, weeks) num
  from temp, unnest([struct(
    generate_date_array(date_trunc(date, month), last_day(date, month), interval 7 day) as weeks
  )])
)
group by start_week
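For reference on how this works: for each row, generate_date_array builds the week-start boundaries anchored on the first day of that row's month, and range_bucket returns the index of the interval the date falls into. A standalone snippet (runnable on its own) showing the boundaries generated for October 2021:
select generate_date_array(date '2021-10-01', date '2021-10-31', interval 7 day) as weeks
-- ["2021-10-01", "2021-10-08", "2021-10-15", "2021-10-22", "2021-10-29"]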
If applied to the sample data (as is) in your question, the output is:

How to create churn table from transactional data?

Currently my Transaction Table has each customer's transaction data for each month. Account_ID identifies the customer. Order_ID identifies the order the customer made. Reporting_week_start_date is the week, beginning on Monday, in which each transaction was made (Date_Purchased).
How do I create a new table that identifies the customer_status after each transaction has been made? Note that the new table runs from the first Reporting_week_start_date until the current date, even for weeks in which no transaction was made.
Customer_Status
- New: customers who made their first paid subscription
- Recurring: customers with continuous payments
- Churned: customers whose subscription expired with no renewal within the next/same month
- Reactivated: customers who had churned and then returned to re-subscribe
Transaction Table
Account_ID | Order_ID | Reporting_week_start_date| Date_Purchased | Data_Expired
001 | 1001 | 31 Dec 2018 | 01 Jan 2019 | 08 Jan 2019
001 | 1001 | 07 Jan 2019 | 08 Jan 2019 | 15 Jan 2019
001 | 1001 | 14 Jan 2019 | 15 Jan 2019 | 22 Jan 2019 #Transaction 1
001 | 1001 | 21 Jan 2019 | 22 Jan 2019 | 29 Jan 2019
001 | 1001 | 28 Jan 2019 | 29 Jan 2019 | 31 Jan 2019
001 | 1002 | 28 Jan 2019 | 01 Feb 2019 | 08 Feb 2019
001 | 1002 | 04 Feb 2019 | 08 Feb 2019 | 15 Feb 2019 #Transaction 2
001 | 1002 | 11 Feb 2019 | 15 Feb 2019 | 22 Feb 2019
001 | 1002 | 18 Feb 2019 | 22 Feb 2019 | 28 Feb 2019
001 | 1003 | 25 Feb 2019 | 01 Mar 2019 | 08 Mar 2019
001 | 1003 | 04 Mar 2019 | 08 Mar 2019 | 15 Mar 2019
001 | 1003 | 11 Mar 2019 | 15 Mar 2019 | 22 Mar 2019 #Transaction 3
001 | 1003 | 18 Mar 2019 | 22 Mar 2019 | 29 Mar 2019
001 | 1003 | 25 Mar 2019 | 29 Mar 2019 | 31 Mar 2019
001 | 1004 | 27 May 2019 | 01 Jun 2019 | 08 Jun 2019
001 | 1004 | 03 Jun 2019 | 08 Jun 2019 | 15 Jun 2019 #Transaction 4
001 | 1004 | 10 Jun 2019 | 15 Jun 2019 | 22 Jun 2019
001 | 1004 | 17 Jun 2019 | 22 Jun 2019 | 29 Jun 2019
001 | 1004 | 24 Jun 2019 | 29 Jun 2019 | 30 Jun 2019
Expected Output
Account_ID | Order_ID | Reporting_week_start_date| Customer_status
001 | 1001 | 31 Dec 2018 | New
001 | 1001 | 07 Jan 2019 | New #Transaction 1
001 | 1001 | 14 Jan 2019 | New
001 | 1001 | 21 Jan 2019 | New
001 | 1001 | 28 Jan 2019 | New
001 | 1002 | 28 Jan 2019 | Recurring
001 | 1002 | 04 Feb 2019 | Recurring #Transaction 2
001 | 1002 | 11 Feb 2019 | Recurring
001 | 1002 | 18 Feb 2019 | Recurring
001 | 1003 | 25 Feb 2019 | Churned
001 | 1003 | 04 Mar 2019 | Churned #Transaction 3
001 | 1003 | 11 Mar 2019 | Churned
001 | 1003 | 18 Mar 2019 | Churned
001 | 1003 | 25 Mar 2019 | Churned
001 | - | 01 Apr 2019 | Churned
001 | - | 08 Apr 2019 | Churned
001 | - | 15 Apr 2019 | Churned
001 | - | 22 Apr 2019 | Churned
001 | - | 29 Apr 2019 | Churned
001 | - | 06 May 2019 | Churned
001 | - | 13 May 2019 | Churned
001 | - | 20 May 2019 | Churned
001 | - | 27 May 2019 | Churned
001 | 1004 | 27 May 2019 | Reactivated
001 | 1004 | 03 Jun 2019 | Reactivated #Transaction 4
001 | 1004 | 10 Jun 2019 | Reactivated
001 | 1004 | 17 Jun 2019 | Reactivated
001 | 1004 | 24 Jun 2019 | Reactivated
...
...
...
current date
I think you just want window functions and case logic. Assuming the date you are referring to is Reporting_week_start_date, then the logic looks something like this:
select t.*,
       (case when Reporting_week_start_date = min(Reporting_week_start_date) over (partition by account_id)
             then 'New'
             when Reporting_week_start_date < dateadd(lag(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date), interval 1 month)
             then 'Recurring'
             when Reporting_week_start_date < dateadd(lead(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date), interval -1 month)
             then 'Churned'
             else 'Reactivated'
        end) as status
from transactions t;
These are not exactly the rules you have specified. But they seem very reasonable interpretations of what you want to do.
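One caveat: the expected output also contains rows for weeks with no transactions at all (the April and May "Churned" rows), and the query above only labels rows that already exist. Those gap weeks have to be generated from a calendar first and the transactions joined onto it. A hedged sketch in standard SQL (recursive-CTE and interval syntax vary by dialect; table and column names are taken from the question):
with recursive weeks as (
  select min(Reporting_week_start_date) as wk from transactions
  union all
  select wk + interval '7' day from weeks
  where wk + interval '7' day <= current_date
)
select a.Account_ID, t.Order_ID, w.wk as Reporting_week_start_date
from (select distinct Account_ID from transactions) a
cross join weeks w
left join transactions t
  on t.Account_ID = a.Account_ID
 and t.Reporting_week_start_date = w.wk;
The status logic above can then be applied over this expanded row set, treating the generated no-transaction weeks as part of the churn window.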

Data Mismatch after Left Join SQL

I have two tables, one that contains production data and the other has forecasted data. I am joining the two tables to compare the actual production data to forecasted data.
My sample tables are as follows:
**Prod Tbl**
Product Plant pmonth pyear quantity
B007 2 January 2014 45
B007 2 February 2014 270
B007 2 March 2014 270
B007 2 April 2014 45
B007 2 May 2014 90
B007 2 May 2014 90
B007 2 June 2014 90
B007 2 June 2014 90
B007 2 July 2014 135
B007 2 July 2014 45
B007 2 August 2014 135
B007 2 August 2014 135
B007 2 July 2015 90
B007 2 August 2014 135
B007 2 September 2014 135
B007 2 September 2015 135
B007 2 October 2015 90
B007 2 September 2014 135
B007 2 September 2014 90
B007 2 September 2014 90
B007 2 November 2014 254
B007 2 May 2016 90
B007 2 August 2016 135
B007 2 October 2016 87
**Forecast Tbl**
Product Plant Fmonth Fyear Fqty
B007 2 July 2017 100
B007 2 August 2017 100
B007 2 September 2017 100
B007 2 October 2017 100
B007 2 November 2017 100
B007 2 December 2017 100
Query Used to Join:
Select a.Product,
       a.plant,
       b.pmonth,
       b.pyear,
       coalesce(b.quantity, 0) as quantity,
       a.fmonth,
       a.fyear,
       coalesce(a.fqty, 0) as fqty
from Frcast_Tbl as a
left join Prod_Tbl as b on (a.Product = b.Product
                        and a.Plant = b.plant
                        and b.pMonth = a.fMonth);
Result:
After Joining
Product Plant Pmonth Pyear Quantity Fmonth Fyear fqty
B007 2 July 2014 180 July 2017 100
B007 2 July 2015 90 July 2017 100
B007 2 August 2014 405 August 2017 100
B007 2 August 2016 315 August 2017 100
B007 2 September 2014 450 September 2017 100
B007 2 September 2015 135 September 2017 100
B007 2 October 2016 177 October 2017 100
B007 2 October 2015 90 October 2017 100
B007 2 November 2014 356 November 2017 100
B007 2 December 2016 90 December 2017 100
B007 2 January 2015 90 January 2018 100
B007 2 January 2016 90 January 2018 100
B007 2 January 2014 45 January 2018 100
B007 2 January 2017 90 January 2018 100
B007 2 February 2014 270 February 2018 99
B007 2 March 2014 270 March 2018 101
B007 2 March 2017 90 March 2018 101
B007 2 April 2014 45 April 2018 100
B007 2 May 2016 90 May 2018 100
B007 2 May 2014 180 May 2018 100
B007 2 May 2017 90 May 2018 100
Filtered for a particular year to better explain the problem
Product plant pmonth pyear quantity fmonth fyear fqty
B007 2 August 2016 315 August 2017 100
B007 2 October 2016 177 October 2017 100
B007 2 December 2016 90 December 2017 100
Desired Table
Product Plant Pmonth Pyear Quantity fmonth fyear fqty
B007 2 January 2016 90 null null 0
B007 2 May 2016 90 null null 0
B007 2 June 2016 270 null null 0
B007 2 null null 0 July 2017 100
B007 2 August 2016 315 August 2017 100
B007 2 null null 0 September 2017 100
B007 2 October 2016 177 October 2017 100
B007 2 null null 0 November 2017 100
B007 2 December 2016 90 December 2017 100
My query joins product, plant, and month using a left join, but I want the result to display all months from both the prod and forecast tables, showing null or zero wherever a month has no match. Please help.
You could try this. The subquery after the FULL JOIN extracts only one year from the Products table.
I also added a CASE expression for the ORDER BY.
One year version
SELECT COALESCE(a.Product, b.Product) AS PRODUCT,
       COALESCE(a.plant, b.plant) AS PLANT,
       b.pmonth,
       b.pyear,
       coalesce(b.quantity, 0) as quantity,
       a.fmonth AS FMONTH,
       a.fyear,
       coalesce(a.fqty, 0) as fqty
FROM FORECAST A
FULL JOIN (SELECT * FROM PROD WHERE pyear = 2016) B
       on a.Product = b.Product
      and a.Plant = b.plant
      and A.fmonth = b.pMonth
ORDER BY CASE COALESCE(b.pmonth, a.fmonth)
           WHEN 'January' THEN 1
           WHEN 'February' THEN 2
           WHEN 'March' THEN 3
           WHEN 'April' THEN 4
           WHEN 'May' THEN 5
           WHEN 'June' THEN 6
           WHEN 'July' THEN 7
           WHEN 'August' THEN 8
           WHEN 'September' THEN 9
           WHEN 'October' THEN 10
           WHEN 'November' THEN 11
           WHEN 'December' THEN 12
         END;
Please note that your sample data (the first table) is not complete.
Output:
+-------------+-------+---------+-------+----------+-----------+-------+------+
| PRODUCT | PLANT | pmonth | pyear | quantity | FMONTH | fyear | fqty |
+-------------+-------+---------+-------+----------+-----------+-------+------+
| B007 | 2 | January | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | May | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | June | 2016 | 270 | NULL | NULL | 0 |
| B007 | 2 | NULL | NULL | 0 | July | 2017 | 100 |
| B007 | 2 | August | 2016 | 135 | August | 2017 | 100 |
| B007 | 2 | NULL | NULL | 0 | September | 2017 | 100 |
| B007 | 2 | October | 2016 | 87 | October | 2017 | 100 |
| B007 | 2 | NULL | NULL | 0 | November | 2017 | 100 |
| B007 | 2 | NULL | NULL | 0 | December | 2017 | 100 |
+-------------+-------+---------+-------+----------+-----------+-------+------+
Added: a multi-year version, with a GROUP BY on the PROD table.
SELECT COALESCE(a.Product, b.Product) AS PRODUCT,
       COALESCE(a.plant, b.plant) AS PLANT,
       b.pmonth,
       COALESCE(b.pyear, Y.pyear) AS pyear,
       COALESCE(b.quantity, 0) as quantity,
       a.fmonth AS FMONTH,
       a.fyear,
       coalesce(a.fqty, 0) as fqty
FROM FORECAST A
CROSS JOIN (SELECT DISTINCT pyear FROM PROD /* WHERE pyear IN (2015,2016) */) Y
FULL JOIN (SELECT Product, Plant, pyear, pmonth, SUM(quantity) AS quantity
           FROM PROD /* WHERE pyear IN (2015,2016) */
           GROUP BY Product, Plant, pyear, pmonth
          ) B on a.Product = b.Product
             and a.Plant = b.plant
             and A.fmonth = b.pMonth
             AND Y.pyear = B.pyear
ORDER BY COALESCE(b.pyear, Y.pyear),
         CASE COALESCE(b.pmonth, a.fmonth)
           WHEN 'January' THEN 1
           WHEN 'February' THEN 2
           WHEN 'March' THEN 3
           WHEN 'April' THEN 4
           WHEN 'May' THEN 5
           WHEN 'June' THEN 6
           WHEN 'July' THEN 7
           WHEN 'August' THEN 8
           WHEN 'September' THEN 9
           WHEN 'October' THEN 10
           WHEN 'November' THEN 11
           WHEN 'December' THEN 12
         END;
Output:
+---------+-------+-----------+-------+----------+-----------+-------+------+
| PRODUCT | PLANT | pmonth | pyear | quantity | FMONTH | fyear | fqty |
+---------+-------+-----------+-------+----------+-----------+-------+------+
| B007 | 2 | January | 2014 | 45 | NULL | NULL | 0 |
| B007 | 2 | February | 2014 | 270 | NULL | NULL | 0 |
| B007 | 2 | March | 2014 | 270 | NULL | NULL | 0 |
| B007 | 2 | April | 2014 | 45 | NULL | NULL | 0 |
| B007 | 2 | May | 2014 | 180 | NULL | NULL | 0 |
| B007 | 2 | June | 2014 | 180 | NULL | NULL | 0 |
| B007 | 2 | July | 2014 | 180 | July | 2017 | 100 |
| B007 | 2 | August | 2014 | 405 | August | 2017 | 100 |
| B007 | 2 | September | 2014 | 450 | September | 2017 | 100 |
| B007 | 2 | NULL | 2014 | 0 | October | 2017 | 100 |
| B007 | 2 | November | 2014 | 254 | November | 2017 | 100 |
| B007 | 2 | NULL | 2014 | 0 | December | 2017 | 100 |
| B007 | 2 | July | 2015 | 90 | July | 2017 | 100 |
| B007 | 2 | NULL | 2015 | 0 | August | 2017 | 100 |
| B007 | 2 | September | 2015 | 135 | September | 2017 | 100 |
| B007 | 2 | October | 2015 | 90 | October | 2017 | 100 |
| B007 | 2 | NULL | 2015 | 0 | November | 2017 | 100 |
| B007 | 2 | NULL | 2015 | 0 | December | 2017 | 100 |
| B007 | 2 | January | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | May | 2016 | 90 | NULL | NULL | 0 |
| B007 | 2 | June | 2016 | 270 | NULL | NULL | 0 |
| B007 | 2 | NULL | 2016 | 0 | July | 2017 | 100 |
| B007 | 2 | August | 2016 | 135 | August | 2017 | 100 |
| B007 | 2 | NULL | 2016 | 0 | September | 2017 | 100 |
| B007 | 2 | October | 2016 | 87 | October | 2017 | 100 |
| B007 | 2 | NULL | 2016 | 0 | November | 2017 | 100 |
| B007 | 2 | NULL | 2016 | 0 | December | 2017 | 100 |
+---------+-------+-----------+-------+----------+-----------+-------+------+
Use a FULL OUTER JOIN when you are filtering records for a specific year:
Select a.Product, a.plant, b.pmonth, b.pyear, coalesce(b.quantity, 0) as quantity,
       a.fmonth, a.fyear, coalesce(a.fqty, 0) as fqty
from Frcast_Tbl as a
FULL OUTER JOIN Prod_Tbl as b on a.Product = b.Product and
                                 a.Plant = b.plant and
                                 b.pMonth = a.fMonth
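If the database does not support FULL OUTER JOIN (MySQL, for example), the same effect can be emulated by unioning a LEFT JOIN with the unmatched rows of the other table; a sketch using the question's table names:
select a.Product, a.plant, b.pmonth, b.pyear,
       coalesce(b.quantity, 0) as quantity,
       a.fmonth, a.fyear, coalesce(a.fqty, 0) as fqty
from Frcast_Tbl as a
left join Prod_Tbl as b on a.Product = b.Product
                       and a.Plant = b.plant
                       and b.pMonth = a.fMonth
union all
select b.Product, b.plant, b.pmonth, b.pyear,
       b.quantity,
       null, null, 0
from Prod_Tbl as b
where not exists (select 1
                  from Frcast_Tbl as a
                  where a.Product = b.Product
                    and a.Plant = b.plant
                    and a.fMonth = b.pMonth);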

Average daily peak hours in a week with SQL select

I'm trying to list the weekly average number of customers at different restaurants during their daily peak hours, for example:
Week | Day | Hour | Rest | Custom
20 | Mon | 08-12 | KFC | 15
20 | Mon | 12-16 | KFC | 10
20 | Mon | 16-20 | KFC | 8
20 | Tue | 08-12 | KFC | 20
20 | Tue | 12-16 | KFC | 11
20 | Tue | 16-20 | KFC | 9
20 | Mon | 08-12 | MCD | 13
20 | Mon | 12-16 | MCD | 14
20 | Mon | 16-20 | MCD | 19
20 | Tue | 08-12 | MCD | 31
20 | Tue | 12-16 | MCD | 20
20 | Tue | 16-20 | MCD | 22
20 | Mon | 08-12 | PHT | 15
20 | Mon | 12-16 | PHT | 12
20 | Mon | 16-20 | PHT | 11
20 | Tue | 08-12 | PHT | 08
20 | Tue | 12-16 | PHT | 07
20 | Tue | 16-20 | PHT | 14
The desired result should be:
WeeK | Rest | Custom
20 | KFC | 17.5
20 | MCD | 25
20 | PHT | 14.5
Is it possible to do it in one line of SQL?
This is really two steps: get the maximum number of customers per day per restaurant, then average that per week:
select week, rest, avg(maxc)
from (select Week, Day, Rest, max(Custom) as maxc
      from t
      group by Week, Day, Rest
     ) wdr
group by week, rest;
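As a quick check against the sample data: KFC's daily peaks in week 20 are 15 (Mon) and 20 (Tue), and their average is 17.5, matching the desired output; likewise MCD averages (19 + 31) / 2 = 25.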