Populating actual dates in a recursive function - SQL

I am trying to create a table which contains the fiscal day, month and year.
However, I want to add an actual date column to the given result as well.
My query:
WITH RECURSIVE TMP_FISCAL_DAY
(FISCAL_DAY, BEGIN_DATE, END_DATE, FISCAL_MONTH, FISCAL_QUARTER, FISCAL_YEAR) AS
(
SELECT CAST(1 AS INT), begin_date, end_date, FISCAL_MONTH, FISCAL_QUARTER, FISCAL_YEAR
FROM DB_NAME.ORIGINAL_FISCAL_TABLE
UNION ALL
SEL Fiscal_Day+1, begin_date, end_date, FISCAL_MONTH, FISCAL_QUARTER, FISCAL_YEAR
FROM TMP_FISCAL_DAY
WHERE BEGIN_DATE < END_DATE AND FISCAL_DAY < END_DATE - BEGIN_DATE
)
SEL * FROM TMP_FISCAL_DAY
Output
+------------+------------+------------+--------------+----------------+-------------+
| FISCAL_DAY | BEGIN_DATE | END_DATE | FISCAL_MONTH | FISCAL_QUARTER | FISCAL_YEAR |
+------------+------------+------------+--------------+----------------+-------------+
| 1 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 2 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 3 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 4 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 5 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 6 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 7 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 8 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 9 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 10 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 11 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 12 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 13 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 14 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 15 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 16 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 17 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 18 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 19 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 20 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 21 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 22 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 23 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 24 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 25 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 26 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 27 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 28 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 29 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 30 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 31 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 32 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 33 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
| 34 | 12/30/2017 | 02/02/2018 | 12 | 4 | 2,018 |
+------------+------------+------------+--------------+----------------+-------------+
Expected output
+------------+-------------+------------+----------+--------------+----------------+-------------+
| FISCAL_DAY | Actual Date | BEGIN_DATE | END_DATE | FISCAL_MONTH | FISCAL_QUARTER | FISCAL_YEAR |
+------------+-------------+------------+----------+--------------+----------------+-------------+
| 1 | 12/30/2017 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 2 | 12/31/2017 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 3 | 1/1/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 4 | 1/2/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 5 | 1/3/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 6 | 1/4/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 7 | 1/5/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 8 | 1/6/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 9 | 1/7/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 10 | 1/8/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 11 | 1/9/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 12 | 1/10/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 13 | 1/11/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 14 | 1/12/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 15 | 1/13/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 16 | 1/14/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 17 | 1/15/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 18 | 1/16/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 19 | 1/17/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 20 | 1/18/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 21 | 1/19/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 22 | 1/20/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 23 | 1/21/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 24 | 1/22/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 25 | 1/23/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 26 | 1/24/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 27 | 1/25/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 28 | 1/26/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 29 | 1/27/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 30 | 1/28/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 31 | 1/29/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 32 | 1/30/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 33 | 1/31/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
| 34 | 2/1/2018 | 12/30/2017 | 2/2/2018 | 12 | 4 | 2,018 |
+------------+-------------+------------+----------+--------------+----------------+-------------+
How do I add the date within the recursion so that the actual dates show up?
My Attempt (incorrect results)
WITH RECURSIVE TMP_FISCAL_DAY
(FISCAL_DAY, ACTUAL_DATE, BEGIN_DATE ,END_DATE ,FISCAL_MONTH,FISCAL_QUARTER,FISCAL_YEAR ) AS
(SELECT CAST(1 AS INT) ,cast(current_date as date), begin_date,end_DATE,FISCAL_MONTH,FISCAL_QUARTER,FISCAL_YEAR FROM DB_NAME.ORIGINAL_FISCAL_TABLE
UNION ALL
SEL Fiscal_Day+1,ACTUAL_DATE+FISCAL_DAY,begin_date,end_DATE,FISCAL_MONTH,FISCAL_QUARTER,FISCAL_YEAR
FROM TMP_FISCAL_DAY WHERE BEGIN_DATE<END_DATE AND FISCAL_DAY<END_DATE-BEGIN_DATE)
SEL * FROM TMP_FISCAL_DAY where CURRENT_DATE BETWEEN BEGIN_DATE AND END_DATE

Assuming there's one row per fiscal month in your ORIGINAL_FISCAL_TABLE, you should filter to the current month before the recursion and then use BEGIN_DATE instead of CURRENT_DATE:
WITH RECURSIVE TMP_FISCAL_DAY (FISCAL_DAY, ACTUAL_DATE, BEGIN_DATE, END_DATE, FISCAL_MONTH, FISCAL_QUARTER, FISCAL_YEAR)
AS
(
   SELECT
      Cast(1 AS INT)
     ,BEGIN_DATE
     ,BEGIN_DATE
     ,END_DATE
     ,FISCAL_MONTH
     ,FISCAL_QUARTER
     ,FISCAL_YEAR
   FROM DB_NAME.ORIGINAL_FISCAL_TABLE
   WHERE Current_Date BETWEEN BEGIN_DATE AND END_DATE

   UNION ALL

   SELECT
      Fiscal_Day+1
     ,ACTUAL_DATE+1
     ,BEGIN_DATE
     ,END_DATE
     ,FISCAL_MONTH
     ,FISCAL_QUARTER
     ,FISCAL_YEAR
   FROM TMP_FISCAL_DAY
   WHERE ACTUAL_DATE+1 < END_DATE
)
SELECT *
FROM TMP_FISCAL_DAY
As @RonBallard wrote, there's no need for recursion; you can use EXPAND ON instead:
SELECT
ACTUAL_DATE - BEGIN_DATE + 1 AS Fiscal_Day, dt.*
FROM
(
SELECT Begin(pd) AS ACTUAL_DATE, t.*
FROM ORIGINAL_FISCAL_TABLE AS t
WHERE Current_Date BETWEEN BEGIN_DATE AND END_DATE
EXPAND ON PERIOD(BEGIN_DATE, END_DATE) AS pd
) AS dt
But finally, there should be no need for any kind of calculation; every company should have a calendar table with pre-calculated data:
SELECT ...
FROM myCalendar
WHERE Current_Date BETWEEN FISCAL_MONTH_BEGIN_DATE AND FISCAL_MONTH_END_DATE
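For instance, if such a calendar table carries one row per calendar date, the fiscal day can be read from a pre-calculated column or derived from the fiscal month's begin date. A minimal sketch; calendar_date, FISCAL_MONTH, FISCAL_QUARTER and FISCAL_YEAR are assumed column names, and the date subtraction relies on Teradata returning a day count:
SELECT
 calendar_date AS ACTUAL_DATE
,calendar_date - FISCAL_MONTH_BEGIN_DATE + 1 AS FISCAL_DAY  -- assumed: or read a pre-calculated fiscal-day column
,FISCAL_MONTH_BEGIN_DATE AS BEGIN_DATE
,FISCAL_MONTH_END_DATE AS END_DATE
,FISCAL_MONTH
,FISCAL_QUARTER
,FISCAL_YEAR
FROM myCalendar
WHERE Current_Date BETWEEN FISCAL_MONTH_BEGIN_DATE AND FISCAL_MONTH_END_DATE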

Related

How do I edit the code that calculates the value for the four weeks of the month for all months with PL/SQL?

I divided the month into four weeks and printed the amount for each week. How do I set this up with a loop for 12 months?
declare
cursor c is
select varis_tar, tutar
from muhasebe.doviz_takip
where trunc(varis_tar) BETWEEN TO_DATE('01/10/2021', 'DD/MM/YYYY') AND
TO_DATE('31/10/2021', 'DD/MM/YYYY')
group by varis_tar,tutar;
tutar1 number(13,2):=0;
tutar2 number(13,2):=0;
tutar3 number(13,2):=0;
tutar4 number(13,2):=0;
begin
for r in c loop
if r.varis_tar between TO_DATE('01/10/2021', 'DD/MM/YYYY') AND
TO_DATE('07/10/2021', 'DD/MM/YYYY') then
tutar1:=(r.tutar)+tutar1;
--message(r.tutar);
elsif r.varis_tar between TO_DATE('07/10/2021', 'DD/MM/YYYY') AND
TO_DATE('14/10/2021', 'DD/MM/YYYY') then
tutar2:=(r.tutar)+tutar2;
--message(r.tutar);
elsif r.varis_tar between TO_DATE('14/10/2021', 'DD/MM/YYYY') AND
TO_DATE('21/10/2021', 'DD/MM/YYYY') then
tutar3:=(r.tutar)+tutar3;
--message(r.tutar);
elsif r.varis_tar between TO_DATE('21/10/2021', 'DD/MM/YYYY') AND
TO_DATE('31/10/2021', 'DD/MM/YYYY') then
tutar4:=(r.tutar)+tutar4;
--message(r.tutar);
end if;
end loop;
I tried to get the dates the same way for all the months, but it gave wrong results:
where trunc(varis_tar) BETWEEN TO_DATE('1', 'DD') AND
TO_DATE('31', 'DD')
if r.varis_tar between TO_DATE('1', 'DD') AND
TO_DATE('07', 'DD') then
elsif r.varis_tar between TO_DATE('7', 'DD') AND
TO_DATE('14', 'DD') then
elsif r.varis_tar between TO_DATE('14', 'DD') AND
TO_DATE('21', 'DD') then
elsif r.varis_tar between TO_DATE('21', 'DD') AND
TO_DATE('31', 'DD') then
I don't know if I'm understanding it correctly, but:
try if extract(day from varis_tar) between 1 and 7
or, a bit more complex:
l_week := to_char(varis_tar,'W'); --week number within the month
if l_week = 1 then --first week
elsif l_week = 2 then ... --and so on
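A sketch of how that would look dropped into the original block, keeping the month filter and the tutar1..tutar4 variables; to_char(varis_tar,'W') returns the week of the month (1-5), so days 29-31 fall into week 5 and are added to tutar4 here:
declare
  cursor c is
    select varis_tar, tutar
    from muhasebe.doviz_takip
    where trunc(varis_tar) between to_date('01/10/2021', 'DD/MM/YYYY')
                               and to_date('31/10/2021', 'DD/MM/YYYY');
  l_week pls_integer;
  tutar1 number(13,2) := 0;
  tutar2 number(13,2) := 0;
  tutar3 number(13,2) := 0;
  tutar4 number(13,2) := 0;
begin
  for r in c loop
    l_week := to_number(to_char(r.varis_tar, 'W'));  -- week of the month: 1..5
    if l_week = 1 then
      tutar1 := tutar1 + r.tutar;
    elsif l_week = 2 then
      tutar2 := tutar2 + r.tutar;
    elsif l_week = 3 then
      tutar3 := tutar3 + r.tutar;
    else
      tutar4 := tutar4 + r.tutar;  -- weeks 4 and 5, i.e. days 22-31
    end if;
  end loop;
end;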
Your code has several issues:
date in Oracle is actually a datetime, so between will not catch any time after midnight of the upper boundary.
you count the midnight of the week's end twice: once in the current week and once in the next week (between includes both boundaries).
you do not need any PL/SQL, and especially not a cursor loop, because it occupies resources for a calculation that can be done entirely in SQL.
Use a datetime format model to calculate the weeks, because it is easy to read and understand, then group by the corresponding components:
with a as (
select
date '2021-01-01' - 1 + level as dt
, level as val
from dual
connect by level < 400
)
, b as (
select
dt
, val
/*Map 29, 30 and 31 to 28*/
, to_char(
least(dt, trunc(dt, 'mm') + 27)
, 'yyyymmw'
) as w
from a
)
select
substr(w, 1, 4) as y
, substr(w, 5, 2) as m
, substr(w, -1) as w
, sum(val) as val
, min(dt) as dt_from
, max(dt) as dt_to
from b
group by
w
Y | M | W | VAL | DT_FROM | DT_TO
:--- | :- | :- | ---: | :--------- | :---------
2021 | 01 | 1 | 28 | 2021-01-01 | 2021-01-07
2021 | 01 | 2 | 77 | 2021-01-08 | 2021-01-14
2021 | 01 | 3 | 126 | 2021-01-15 | 2021-01-21
2021 | 01 | 4 | 265 | 2021-01-22 | 2021-01-31
2021 | 02 | 1 | 245 | 2021-02-01 | 2021-02-07
2021 | 02 | 2 | 294 | 2021-02-08 | 2021-02-14
2021 | 02 | 3 | 343 | 2021-02-15 | 2021-02-21
2021 | 02 | 4 | 392 | 2021-02-22 | 2021-02-28
2021 | 03 | 1 | 441 | 2021-03-01 | 2021-03-07
2021 | 03 | 2 | 490 | 2021-03-08 | 2021-03-14
2021 | 03 | 3 | 539 | 2021-03-15 | 2021-03-21
2021 | 03 | 4 | 855 | 2021-03-22 | 2021-03-31
2021 | 04 | 1 | 658 | 2021-04-01 | 2021-04-07
2021 | 04 | 2 | 707 | 2021-04-08 | 2021-04-14
2021 | 04 | 3 | 756 | 2021-04-15 | 2021-04-21
2021 | 04 | 4 | 1044 | 2021-04-22 | 2021-04-30
2021 | 05 | 1 | 868 | 2021-05-01 | 2021-05-07
2021 | 05 | 2 | 917 | 2021-05-08 | 2021-05-14
2021 | 05 | 3 | 966 | 2021-05-15 | 2021-05-21
2021 | 05 | 4 | 1465 | 2021-05-22 | 2021-05-31
2021 | 06 | 1 | 1085 | 2021-06-01 | 2021-06-07
2021 | 06 | 2 | 1134 | 2021-06-08 | 2021-06-14
2021 | 06 | 3 | 1183 | 2021-06-15 | 2021-06-21
2021 | 06 | 4 | 1593 | 2021-06-22 | 2021-06-30
2021 | 07 | 1 | 1295 | 2021-07-01 | 2021-07-07
2021 | 07 | 2 | 1344 | 2021-07-08 | 2021-07-14
2021 | 07 | 3 | 1393 | 2021-07-15 | 2021-07-21
2021 | 07 | 4 | 2075 | 2021-07-22 | 2021-07-31
2021 | 08 | 1 | 1512 | 2021-08-01 | 2021-08-07
2021 | 08 | 2 | 1561 | 2021-08-08 | 2021-08-14
2021 | 08 | 3 | 1610 | 2021-08-15 | 2021-08-21
2021 | 08 | 4 | 2385 | 2021-08-22 | 2021-08-31
2021 | 09 | 1 | 1729 | 2021-09-01 | 2021-09-07
2021 | 09 | 2 | 1778 | 2021-09-08 | 2021-09-14
2021 | 09 | 3 | 1827 | 2021-09-15 | 2021-09-21
2021 | 09 | 4 | 2421 | 2021-09-22 | 2021-09-30
2021 | 10 | 1 | 1939 | 2021-10-01 | 2021-10-07
2021 | 10 | 2 | 1988 | 2021-10-08 | 2021-10-14
2021 | 10 | 3 | 2037 | 2021-10-15 | 2021-10-21
2021 | 10 | 4 | 2995 | 2021-10-22 | 2021-10-31
2021 | 11 | 1 | 2156 | 2021-11-01 | 2021-11-07
2021 | 11 | 2 | 2205 | 2021-11-08 | 2021-11-14
2021 | 11 | 3 | 2254 | 2021-11-15 | 2021-11-21
2021 | 11 | 4 | 2970 | 2021-11-22 | 2021-11-30
2021 | 12 | 1 | 2366 | 2021-12-01 | 2021-12-07
2021 | 12 | 2 | 2415 | 2021-12-08 | 2021-12-14
2021 | 12 | 3 | 2464 | 2021-12-15 | 2021-12-21
2021 | 12 | 4 | 3605 | 2021-12-22 | 2021-12-31
2022 | 01 | 1 | 2583 | 2022-01-01 | 2022-01-07
2022 | 01 | 2 | 2632 | 2022-01-08 | 2022-01-14
2022 | 01 | 3 | 2681 | 2022-01-15 | 2022-01-21
2022 | 01 | 4 | 3915 | 2022-01-22 | 2022-01-31
2022 | 02 | 1 | 1194 | 2022-02-01 | 2022-02-03
db<>fiddle here
Or the same in columns:
with a as (
select
date '2021-01-01' - 1 + level as dt
, level as val
from dual
connect by level < 400
)
, b as (
select
val
/*Map 29, 30 and 31 to 28*/
, to_char(dt, 'yyyymm') as m
, to_char(
least(dt, trunc(dt, 'mm') + 27)
, 'w'
) as w
from a
)
select
substr(m, 1, 4) as y
, substr(m, 5, 2) as m
, tutar1
, tutar2
, tutar3
, tutar4
from b
pivot(
sum(val)
for w in (
1 as tutar1, 2 as tutar2
, 3 as tutar3, 4 as tutar4
)
)
Y | M | TUTAR1 | TUTAR2 | TUTAR3 | TUTAR4
:--- | :- | -----: | -----: | -----: | -----:
2021 | 01 | 28 | 77 | 126 | 265
2021 | 02 | 245 | 294 | 343 | 392
2021 | 03 | 441 | 490 | 539 | 855
2021 | 04 | 658 | 707 | 756 | 1044
2021 | 05 | 868 | 917 | 966 | 1465
2021 | 06 | 1085 | 1134 | 1183 | 1593
2021 | 07 | 1295 | 1344 | 1393 | 2075
2021 | 08 | 1512 | 1561 | 1610 | 2385
2021 | 09 | 1729 | 1778 | 1827 | 2421
2021 | 10 | 1939 | 1988 | 2037 | 2995
2021 | 11 | 2156 | 2205 | 2254 | 2970
2021 | 12 | 2366 | 2415 | 2464 | 3605
2022 | 01 | 2583 | 2632 | 2681 | 3915
2022 | 02 | 1194 | null | null | null
db<>fiddle here

Grouping, Summing and Ordering

I want to get a breakdown by Name, Year/Month and Total. How can I do that with what I've got so far?
My data looks like this:
| name | ArtifactID | Name | DateCollected | FileSizeInBytes | WorkspaceArtifactId | TimestampOfLatestRecord |
+---------+------------+---------------------------+-------------------------+-----------------+---------------------+-------------------------+
| Pony | 1265555 | LiteDataPublishedToReview | 2018-12-21 00:00:00.000 | 5474.00 | 2534710 | 2018-12-21 09:26:49.000 |
| Wheels | 1265566 | LiteDataPublishedToReview | 2019-02-26 00:00:00.000 | 50668.00 | 2634282 | 2019-02-26 17:38:39.000 |
| Wheels | 1265567 | LiteDataPublishedToReview | 2019-01-11 00:00:00.000 | 10921638320.00 | 2634282 | 2019-01-11 16:44:04.000 |
| Wheels | 1265568 | LiteDataPublishedToReview | 2019-01-15 00:00:00.000 | 110261521.00 | 2634282 | 2019-01-15 17:43:57.000 |
| Wheels | 1265569 | LiteDataProcessed | 2018-12-13 00:00:00.000 | 123187605031.00 | 2634282 | 2018-12-13 21:50:34.000 |
| Wheels | 1265570 | FullDataProcessed | 2018-12-13 00:00:00.000 | 6810556609.00 | 2634282 | 2018-12-13 21:50:34.000 |
| Wheels | 1265571 | LiteDataProcessed | 2018-12-15 00:00:00.000 | 0.00 | 2634282 | 2018-12-15 14:52:20.000 |
| Wheels | 1265572 | FullDataProcessed | 2018-12-15 00:00:00.000 | 13362690.00 | 2634282 | 2018-12-15 14:52:20.000 |
| Wheels | 1265573 | LiteDataProcessed | 2019-01-09 00:00:00.000 | 1480303616.00 | 2634282 | 2019-01-09 13:52:23.000 |
| Wheels | 1265574 | FullDataProcessed | 2019-01-09 00:00:00.000 | 0.00 | 2634282 | 2019-01-09 13:52:23.000 |
| Wheels | 1265575 | LiteDataProcessed | 2019-02-25 00:00:00.000 | 0.00 | 2634282 | 2019-02-25 10:49:41.000 |
| Wheels | 1265576 | FullDataProcessed | 2019-02-25 00:00:00.000 | 7633201.00 | 2634282 | 2019-02-25 10:49:41.000 |
| Levack | 1265577 | LiteDataProcessed | 2018-12-16 00:00:00.000 | 0.00 | 2636230 | 2018-12-16 10:13:36.000 |
| Levack | 1265578 | FullDataProcessed | 2018-12-16 00:00:00.000 | 59202559.00 | 2636230 | 2018-12-16 10:13:36.000 |
| Van | 1265579 | LiteDataPublishedToReview | 2019-01-11 00:00:00.000 | 2646602711.00 | 2636845 | 2019-01-11 09:50:49.000 |
| Van | 1265580 | LiteDataPublishedToReview | 2019-01-10 00:00:00.000 | 10081222022.00 | 2636845 | 2019-01-10 18:32:03.000 |
| Van | 1265581 | LiteDataPublishedToReview | 2019-01-15 00:00:00.000 | 3009227476.00 | 2636845 | 2019-01-15 10:49:38.000 |
| Van | 1265582 | LiteDataPublishedToReview | 2019-03-26 00:00:00.000 | 87220831.00 | 2636845 | 2019-03-26 10:34:10.000 |
| Van | 1265583 | LiteDataPublishedToReview | 2019-03-28 00:00:00.000 | 688708119.00 | 2636845 | 2019-03-28 14:11:38.000 |
| Van | 1265584 | LiteDataProcessed | 2018-12-18 00:00:00.000 | 5408886887.00 | 2636845 | 2018-12-18 11:29:03.000 |
| Van | 1265585 | FullDataProcessed | 2018-12-18 00:00:00.000 | 0.00 | 2636845 | 2018-12-18 11:29:03.000 |
| Van | 1265586 | LiteDataProcessed | 2018-12-19 00:00:00.000 | 12535359488.00 | 2636845 | 2018-12-19 17:25:10.000 |
| Van | 1265587 | FullDataProcessed | 2018-12-19 00:00:00.000 | 0.00 | 2636845 | 2018-12-19 17:25:10.000 |
| Van | 1265588 | LiteDataProcessed | 2018-12-21 00:00:00.000 | 52599693312.00 | 2636845 | 2018-12-21 09:09:18.000 |
| Van | 1265589 | FullDataProcessed | 2018-12-21 00:00:00.000 | 0.00 | 2636845 | 2018-12-21 09:09:18.000 |
| Van | 1265590 | LiteDataProcessed | 2019-03-25 00:00:00.000 | 3588613120.00 | 2636845 | 2019-03-25 16:41:17.000 |
| Van | 1265591 | FullDataProcessed | 2019-03-25 00:00:00.000 | 0.00 | 2636845 | 2019-03-25 16:41:17.000 |
| Holiday | 1265592 | LiteDataProcessed | 2018-12-28 00:00:00.000 | 0.00 | 2638126 | 2018-12-28 09:15:21.000 |
| Holiday | 1265593 | FullDataProcessed | 2018-12-28 00:00:00.000 | 9219122847.00 | 2638126 | 2018-12-28 09:15:21.000 |
| Holiday | 1265594 | LiteDataProcessed | 2019-01-31 00:00:00.000 | 0.00 | 2638126 | 2019-01-31 14:45:07.000 |
| Holiday | 1265595 | FullDataProcessed | 2019-01-31 00:00:00.000 | 61727744.00 | 2638126 | 2019-01-31 14:45:07.000 |
| Holiday | 1265596 | LiteDataProcessed | 2019-02-05 00:00:00.000 | 0.00 | 2638126 | 2019-02-05 15:23:27.000 |
| Holiday | 1265597 | FullDataProcessed | 2019-02-05 00:00:00.000 | 199454805.00 | 2638126 | 2019-02-05 15:23:27.000 |
| Holiday | 1265598 | LiteDataProcessed | 2019-02-07 00:00:00.000 | 0.00 | 2638126 | 2019-02-07 11:55:55.000 |
| Holiday | 1265599 | FullDataProcessed | 2019-02-07 00:00:00.000 | 17944713.00 | 2638126 | 2019-02-07 11:55:55.000 |
| Holiday | 1265600 | LiteDataProcessed | 2019-02-13 00:00:00.000 | 0.00 | 2638126 | 2019-02-13 15:48:56.000 |
| Holiday | 1265601 | FullDataProcessed | 2019-02-13 00:00:00.000 | 60421568.00 | 2638126 | 2019-02-13 15:48:56.000 |
| Crosbie | 1265604 | LiteDataProcessed | 2019-01-21 00:00:00.000 | 0.00 | 2644032 | 2019-01-21 15:43:43.000 |
| Crosbie | 1265605 | FullDataProcessed | 2019-01-21 00:00:00.000 | 131445.00 | 2644032 | 2019-01-21 15:43:43.000 |
| Stone | 1265606 | LiteDataPublishedToReview | 2019-02-12 00:00:00.000 | 1626943444.00 | 2647518 | 2019-02-12 17:45:25.000 |
| Stone | 1265607 | LiteDataPublishedToReview | 2019-03-05 00:00:00.000 | 2134872671.00 | 2647518 | 2019-03-05 13:00:31.000 |
| Stone | 1265608 | LiteDataProcessed | 2019-02-05 00:00:00.000 | 38828043264.00 | 2647518 | 2019-02-05 09:40:55.000 |
| Stone | 1265609 | FullDataProcessed | 2019-02-05 00:00:00.000 | 0.00 | 2647518 | 2019-02-05 09:40:55.000 |
| Frost | 1265610 | LiteDataPublishedToReview | 2019-03-18 00:00:00.000 | 776025640.00 | 2658542 | 2019-03-18 12:34:10.000 |
| Frost | 1265611 | LiteDataPublishedToReview | 2019-03-05 00:00:00.000 | 3325335118.00 | 2658542 | 2019-03-05 15:02:39.000 |
| Frost | 1265612 | LiteDataPublishedToReview | 2019-03-20 00:00:00.000 | 211927893.00 | 2658542 | 2019-03-20 17:25:30.000 |
| Frost | 1265613 | LiteDataPublishedToReview | 2019-03-06 00:00:00.000 | 466536488.00 | 2658542 | 2019-03-06 11:00:59.000 |
| Frost | 1265614 | LiteDataPublishedToReview | 2019-03-21 00:00:00.000 | 3863850553.00 | 2658542 | 2019-03-21 17:14:27.000 |
| Frost | 1265615 | LiteDataProcessed | 2019-02-28 00:00:00.000 | 94249740012.00 | 2658542 | 2019-02-28 14:13:23.000 |
| Frost | 1265616 | FullDataProcessed | 2019-02-28 00:00:00.000 | 0.00 | 2658542 | 2019-02-28 14:13:23.000 |
| Yellow | 1265617 | LiteDataPublishedToReview | 2019-03-27 00:00:00.000 | 4550540631.00 | 2659077 | 2019-03-27 16:09:41.000 |
| Yellow | 1265618 | LiteDataProcessed | 2019-03-07 00:00:00.000 | 0.00 | 2659077 | 2019-03-07 16:53:16.000 |
| Yellow | 1265619 | FullDataProcessed | 2019-03-07 00:00:00.000 | 96139872.00 | 2659077 | 2019-03-07 16:53:16.000 |
| Yellow | 1265620 | LiteDataProcessed | 2019-03-08 00:00:00.000 | 105357273318.00 | 2659077 | 2019-03-08 16:43:24.000 |
| Yellow | 1265621 | FullDataProcessed | 2019-03-08 00:00:00.000 | 0.00 | 2659077 | 2019-03-08 16:43:24.000 |
+---------+------------+---------------------------+-------------------------+-----------------+---------------------+-------------------------+
This is my attempt:
SELECT
CAST(YEAR(ps.DateCollected) AS VARCHAR(4)) + '-' + right('00' + CAST(MONTH(ps.DateCollected) AS VARCHAR(2)), 2),
ps.[Name],
c.name,
ceiling(SUM(ps.FileSizeInBytes)/1024/1024/1024.0) [Processed]
FROM EDDSDBO.RPCCProcessingStatistics ps
inner join edds.eddsdbo.[case] c on c.artifactid = ps.workspaceartifactid
where ps.DateCollected >= '2018-12-01'
GROUP BY ps.name, c.name, CAST(YEAR(ps.DateCollected) AS VARCHAR(4)) + '-' + right('00' + CAST(MONTH(ps.DateCollected) AS VARCHAR(2)), 2)
The logic should be this:
(1) Get all values after 2018-12-01 in bytes
(2) Total them
(3) Convert to GB
(4) Ceiling the result
When I run my code and add the results together for FullDataProcessed, I get 22. However, when I manually add up the results for FullDataProcessed, I get 15.40, which when ceiling'd is 16.
I would expect the FullDataProcessed total from my code to equal 16, not 22.
I would guess that one or more of your records has its workspaceartifactid specified more than once in the edds.eddsdbo.[case] table. Is the primary key on the case table more than just artifactid?
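A quick way to verify that guess is to look for artifactid values that occur more than once in the case table; a sketch using the same names as the query above:
-- Any artifactid returned here joins to more than one [case] row,
-- which multiplies the matching FileSizeInBytes rows and inflates the SUM.
SELECT c.artifactid, COUNT(*) AS case_rows
FROM edds.eddsdbo.[case] c
GROUP BY c.artifactid
HAVING COUNT(*) > 1;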

Count records for "empty" rows in multiple columns and joins

I have searched a lot through the site trying to find a solution to my problem. I have found similar problems, but I haven't managed to find a solution that works in my case.
I have a tickets table like this (it holds a lot more data than shown):
TICKET:
+---------+--------------+------------+------------+
| ticketid| report_date | impact | open |
+---------+--------------+------------+------------+
| 1 | 29/01/2019 | 1 | true |
| 2 | 29/01/2019 | 2 | true |
| 3 | 30/01/2019 | 4 | true |
| 4 | 27/01/2019 | 1 | true |
| 5 | 29/01/2019 | 1 | true |
| 6 | 30/01/2019 | 2 | true |
+---------+--------------+------------+------------+
There is another table that holds the possible values for the impact column in the table above:
IMPACT:
+---------+
| impact |
+---------+
| 1 |
| 2 |
| 3 |
| 4 |
+---------+
My objective is to extract a result set from the ticket table where I group by the impact, report_date and open flag and count the number of tickets in each group. Therefore, for the example above, I would like to extract the following result set.
+--------------+------------+------------+-----------+
| report_date | impact | open | tkt_count |
+--------------+------------+------------+-----------+
| 27/01/2019 | 1 | true | 1 |
| 27/01/2019 | 1 | false | 0 |
| 27/01/2019 | 2 | true | 0 |
| 27/01/2019 | 2 | false | 0 |
| 27/01/2019 | 3 | true | 0 |
| 27/01/2019 | 3 | false | 0 |
| 27/01/2019 | 4 | true | 0 |
| 27/01/2019 | 4 | false | 0 |
| 29/01/2019 | 1 | true | 2 |
| 29/01/2019 | 1 | false | 0 |
| 29/01/2019 | 2 | true | 1 |
| 29/01/2019 | 2 | false | 0 |
| 29/01/2019 | 3 | true | 0 |
| 29/01/2019 | 3 | false | 0 |
| 29/01/2019 | 4 | true | 0 |
| 29/01/2019 | 4 | false | 0 |
| 30/01/2019 | 1 | true | 0 |
| 30/01/2019 | 1 | false | 0 |
| 30/01/2019 | 2 | true | 1 |
| 30/01/2019 | 2 | false | 0 |
| 30/01/2019 | 3 | true | 0 |
| 30/01/2019 | 3 | false | 0 |
| 30/01/2019 | 4 | true | 1 |
| 30/01/2019 | 4 | false | 0 |
+--------------+------------+------------+-----------+
It seems simple enough, but the problem is with the "zero" rows.
For the example that I showed here, there are no tickets with impact 3, and no tickets with the open flag false, for the range of dates given. And I cannot come up with a query that will show me all the counts, even if there are no rows for some values.
Can anyone help me?
Thanks in advance.
To solve this type of problem, one way to proceed is to generate an intermediate result set that contains all records for which a value needs to be computed, and then LEFT JOIN it with the original data, using aggregation.
SELECT
dt.report_date,
i.impact,
op.[open],
COUNT(t.report_date) tkt_count
FROM
(SELECT DISTINCT report_date FROM ticket) dt
CROSS JOIN impact i
CROSS JOIN (SELECT 'true' [open] UNION ALL SELECT 'false') op
LEFT JOIN ticket t
ON t.report_date = dt.report_date
AND t.impact = i.impact
AND t.[open] = op.[open]
GROUP BY
dt.report_date,
i.impact,
op.[open]
This query generates the intermediate result set as follows:
report_date: all distinct dates in the original data (report_date)
impact: the contents of table impact
open: a fixed list containing true and false (it could also have been built from distinct values in the original data, but the value false was not present in your sample data)
You can change the above rules; the logic remains the same. For example, if there are gaps in report_date, another widely used option is to create a calendar table (see the sketch after the demo output below).
Demo on DB Fiddle:
report_date | impact | open | tkt_count
:------------------ | -----: | :---- | --------:
27/01/2019 00:00:00 | 1 | false | 0
27/01/2019 00:00:00 | 1 | true | 1
27/01/2019 00:00:00 | 2 | false | 0
27/01/2019 00:00:00 | 2 | true | 0
27/01/2019 00:00:00 | 3 | false | 0
27/01/2019 00:00:00 | 3 | true | 0
27/01/2019 00:00:00 | 4 | false | 0
27/01/2019 00:00:00 | 4 | true | 0
29/01/2019 00:00:00 | 1 | false | 0
29/01/2019 00:00:00 | 1 | true | 2
29/01/2019 00:00:00 | 2 | false | 0
29/01/2019 00:00:00 | 2 | true | 1
29/01/2019 00:00:00 | 3 | false | 0
29/01/2019 00:00:00 | 3 | true | 0
29/01/2019 00:00:00 | 4 | false | 0
29/01/2019 00:00:00 | 4 | true | 0
30/01/2019 00:00:00 | 1 | false | 0
30/01/2019 00:00:00 | 1 | true | 0
30/01/2019 00:00:00 | 2 | false | 0
30/01/2019 00:00:00 | 2 | true | 1
30/01/2019 00:00:00 | 3 | false | 0
30/01/2019 00:00:00 | 3 | true | 0
30/01/2019 00:00:00 | 4 | false | 0
30/01/2019 00:00:00 | 4 | true | 1
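If the report dates did have gaps, the derived calendar mentioned above could replace SELECT DISTINCT report_date FROM ticket. A minimal sketch, assuming SQL Server syntax (matching the bracketed identifiers above):
-- Every day between the first and last report_date, whether or not any ticket exists for it.
WITH bounds AS (
    SELECT MIN(report_date) AS d_from, MAX(report_date) AS d_to FROM ticket
), calendar AS (
    SELECT d_from AS report_date, d_to FROM bounds
    UNION ALL
    SELECT DATEADD(DAY, 1, report_date), d_to FROM calendar WHERE report_date < d_to
)
SELECT report_date FROM calendar
OPTION (MAXRECURSION 0);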
I queried against a start and end calendar table by day, cross joined all available impact/open combos, and finally brought in the ticket data, counting the non-null matches.
DECLARE @Impact TABLE(Impact INT)
INSERT @Impact VALUES(1),(2),(3),(4)
DECLARE @Tickets TABLE(report_date DATETIME, Impact INT, IsOpen BIT)
INSERT @Tickets VALUES
('01/29/2019',1,1),('01/29/2019',2,1),('01/30/2019',3,1),('01/27/2019',4,1),('01/29/2019',5,1),('01/30/2019',6,1)
DECLARE @StartDate DATETIME='01/01/2019'
DECLARE @EndDate DATETIME='02/01/2019'
;WITH AllDates AS
(
SELECT Date = @StartDate
UNION ALL
SELECT Date = DATEADD(DAY, 1, Date) FROM AllDates WHERE DATEADD(DAY, 1, Date) <= @EndDate
)
,AllImpacts AS
(
SELECT DISTINCT Impact, IsOpen = 1 FROM @Impact
UNION
SELECT DISTINCT Impact, IsOpen = 0 FROM @Impact
),
AllData AS
(
SELECT D.Date, A.Impact, A.IsOpen
FROM AllDates D
CROSS APPLY AllImpacts A
)
SELECT
A.Date, A.Impact, A.IsOpen,
GroupCount = COUNT(T.Impact)
FROM
AllData A
LEFT OUTER JOIN @Tickets T ON T.report_date = A.Date AND T.Impact = A.Impact AND T.IsOpen = A.IsOpen
GROUP BY
A.Date, A.Impact, A.IsOpen
ORDER BY
A.Date, A.Impact, A.IsOpen
OPTION (MAXRECURSION 0);
GO
Date | Impact | IsOpen | GroupCount
:------------------ | -----: | -----: | ---------:
01/01/2019 00:00:00 | 1 | 0 | 0
01/01/2019 00:00:00 | 1 | 1 | 0
01/01/2019 00:00:00 | 2 | 0 | 0
01/01/2019 00:00:00 | 2 | 1 | 0
01/01/2019 00:00:00 | 3 | 0 | 0
01/01/2019 00:00:00 | 3 | 1 | 0
01/01/2019 00:00:00 | 4 | 0 | 0
01/01/2019 00:00:00 | 4 | 1 | 0
02/01/2019 00:00:00 | 1 | 0 | 0
02/01/2019 00:00:00 | 1 | 1 | 0
02/01/2019 00:00:00 | 2 | 0 | 0
02/01/2019 00:00:00 | 2 | 1 | 0
02/01/2019 00:00:00 | 3 | 0 | 0
02/01/2019 00:00:00 | 3 | 1 | 0
02/01/2019 00:00:00 | 4 | 0 | 0
02/01/2019 00:00:00 | 4 | 1 | 0
03/01/2019 00:00:00 | 1 | 0 | 0
03/01/2019 00:00:00 | 1 | 1 | 0
03/01/2019 00:00:00 | 2 | 0 | 0
03/01/2019 00:00:00 | 2 | 1 | 0
03/01/2019 00:00:00 | 3 | 0 | 0
03/01/2019 00:00:00 | 3 | 1 | 0
03/01/2019 00:00:00 | 4 | 0 | 0
03/01/2019 00:00:00 | 4 | 1 | 0
04/01/2019 00:00:00 | 1 | 0 | 0
04/01/2019 00:00:00 | 1 | 1 | 0
04/01/2019 00:00:00 | 2 | 0 | 0
04/01/2019 00:00:00 | 2 | 1 | 0
04/01/2019 00:00:00 | 3 | 0 | 0
04/01/2019 00:00:00 | 3 | 1 | 0
04/01/2019 00:00:00 | 4 | 0 | 0
04/01/2019 00:00:00 | 4 | 1 | 0
05/01/2019 00:00:00 | 1 | 0 | 0
05/01/2019 00:00:00 | 1 | 1 | 0
05/01/2019 00:00:00 | 2 | 0 | 0
05/01/2019 00:00:00 | 2 | 1 | 0
05/01/2019 00:00:00 | 3 | 0 | 0
05/01/2019 00:00:00 | 3 | 1 | 0
05/01/2019 00:00:00 | 4 | 0 | 0
05/01/2019 00:00:00 | 4 | 1 | 0
06/01/2019 00:00:00 | 1 | 0 | 0
06/01/2019 00:00:00 | 1 | 1 | 0
06/01/2019 00:00:00 | 2 | 0 | 0
06/01/2019 00:00:00 | 2 | 1 | 0
06/01/2019 00:00:00 | 3 | 0 | 0
06/01/2019 00:00:00 | 3 | 1 | 0
06/01/2019 00:00:00 | 4 | 0 | 0
06/01/2019 00:00:00 | 4 | 1 | 0
07/01/2019 00:00:00 | 1 | 0 | 0
07/01/2019 00:00:00 | 1 | 1 | 0
07/01/2019 00:00:00 | 2 | 0 | 0
07/01/2019 00:00:00 | 2 | 1 | 0
07/01/2019 00:00:00 | 3 | 0 | 0
07/01/2019 00:00:00 | 3 | 1 | 0
07/01/2019 00:00:00 | 4 | 0 | 0
07/01/2019 00:00:00 | 4 | 1 | 0
08/01/2019 00:00:00 | 1 | 0 | 0
08/01/2019 00:00:00 | 1 | 1 | 0
08/01/2019 00:00:00 | 2 | 0 | 0
08/01/2019 00:00:00 | 2 | 1 | 0
08/01/2019 00:00:00 | 3 | 0 | 0
08/01/2019 00:00:00 | 3 | 1 | 0
08/01/2019 00:00:00 | 4 | 0 | 0
08/01/2019 00:00:00 | 4 | 1 | 0
09/01/2019 00:00:00 | 1 | 0 | 0
09/01/2019 00:00:00 | 1 | 1 | 0
09/01/2019 00:00:00 | 2 | 0 | 0
09/01/2019 00:00:00 | 2 | 1 | 0
09/01/2019 00:00:00 | 3 | 0 | 0
09/01/2019 00:00:00 | 3 | 1 | 0
09/01/2019 00:00:00 | 4 | 0 | 0
09/01/2019 00:00:00 | 4 | 1 | 0
10/01/2019 00:00:00 | 1 | 0 | 0
10/01/2019 00:00:00 | 1 | 1 | 0
10/01/2019 00:00:00 | 2 | 0 | 0
10/01/2019 00:00:00 | 2 | 1 | 0
10/01/2019 00:00:00 | 3 | 0 | 0
10/01/2019 00:00:00 | 3 | 1 | 0
10/01/2019 00:00:00 | 4 | 0 | 0
10/01/2019 00:00:00 | 4 | 1 | 0
11/01/2019 00:00:00 | 1 | 0 | 0
11/01/2019 00:00:00 | 1 | 1 | 0
11/01/2019 00:00:00 | 2 | 0 | 0
11/01/2019 00:00:00 | 2 | 1 | 0
11/01/2019 00:00:00 | 3 | 0 | 0
11/01/2019 00:00:00 | 3 | 1 | 0
11/01/2019 00:00:00 | 4 | 0 | 0
11/01/2019 00:00:00 | 4 | 1 | 0
12/01/2019 00:00:00 | 1 | 0 | 0
12/01/2019 00:00:00 | 1 | 1 | 0
12/01/2019 00:00:00 | 2 | 0 | 0
12/01/2019 00:00:00 | 2 | 1 | 0
12/01/2019 00:00:00 | 3 | 0 | 0
12/01/2019 00:00:00 | 3 | 1 | 0
12/01/2019 00:00:00 | 4 | 0 | 0
12/01/2019 00:00:00 | 4 | 1 | 0
13/01/2019 00:00:00 | 1 | 0 | 0
13/01/2019 00:00:00 | 1 | 1 | 0
13/01/2019 00:00:00 | 2 | 0 | 0
13/01/2019 00:00:00 | 2 | 1 | 0
13/01/2019 00:00:00 | 3 | 0 | 0
13/01/2019 00:00:00 | 3 | 1 | 0
13/01/2019 00:00:00 | 4 | 0 | 0
13/01/2019 00:00:00 | 4 | 1 | 0
14/01/2019 00:00:00 | 1 | 0 | 0
14/01/2019 00:00:00 | 1 | 1 | 0
14/01/2019 00:00:00 | 2 | 0 | 0
14/01/2019 00:00:00 | 2 | 1 | 0
14/01/2019 00:00:00 | 3 | 0 | 0
14/01/2019 00:00:00 | 3 | 1 | 0
14/01/2019 00:00:00 | 4 | 0 | 0
14/01/2019 00:00:00 | 4 | 1 | 0
15/01/2019 00:00:00 | 1 | 0 | 0
15/01/2019 00:00:00 | 1 | 1 | 0
15/01/2019 00:00:00 | 2 | 0 | 0
15/01/2019 00:00:00 | 2 | 1 | 0
15/01/2019 00:00:00 | 3 | 0 | 0
15/01/2019 00:00:00 | 3 | 1 | 0
15/01/2019 00:00:00 | 4 | 0 | 0
15/01/2019 00:00:00 | 4 | 1 | 0
16/01/2019 00:00:00 | 1 | 0 | 0
16/01/2019 00:00:00 | 1 | 1 | 0
16/01/2019 00:00:00 | 2 | 0 | 0
16/01/2019 00:00:00 | 2 | 1 | 0
16/01/2019 00:00:00 | 3 | 0 | 0
16/01/2019 00:00:00 | 3 | 1 | 0
16/01/2019 00:00:00 | 4 | 0 | 0
16/01/2019 00:00:00 | 4 | 1 | 0
17/01/2019 00:00:00 | 1 | 0 | 0
17/01/2019 00:00:00 | 1 | 1 | 0
17/01/2019 00:00:00 | 2 | 0 | 0
17/01/2019 00:00:00 | 2 | 1 | 0
17/01/2019 00:00:00 | 3 | 0 | 0
17/01/2019 00:00:00 | 3 | 1 | 0
17/01/2019 00:00:00 | 4 | 0 | 0
17/01/2019 00:00:00 | 4 | 1 | 0
18/01/2019 00:00:00 | 1 | 0 | 0
18/01/2019 00:00:00 | 1 | 1 | 0
18/01/2019 00:00:00 | 2 | 0 | 0
18/01/2019 00:00:00 | 2 | 1 | 0
18/01/2019 00:00:00 | 3 | 0 | 0
18/01/2019 00:00:00 | 3 | 1 | 0
18/01/2019 00:00:00 | 4 | 0 | 0
18/01/2019 00:00:00 | 4 | 1 | 0
19/01/2019 00:00:00 | 1 | 0 | 0
19/01/2019 00:00:00 | 1 | 1 | 0
19/01/2019 00:00:00 | 2 | 0 | 0
19/01/2019 00:00:00 | 2 | 1 | 0
19/01/2019 00:00:00 | 3 | 0 | 0
19/01/2019 00:00:00 | 3 | 1 | 0
19/01/2019 00:00:00 | 4 | 0 | 0
19/01/2019 00:00:00 | 4 | 1 | 0
20/01/2019 00:00:00 | 1 | 0 | 0
20/01/2019 00:00:00 | 1 | 1 | 0
20/01/2019 00:00:00 | 2 | 0 | 0
20/01/2019 00:00:00 | 2 | 1 | 0
20/01/2019 00:00:00 | 3 | 0 | 0
20/01/2019 00:00:00 | 3 | 1 | 0
20/01/2019 00:00:00 | 4 | 0 | 0
20/01/2019 00:00:00 | 4 | 1 | 0
21/01/2019 00:00:00 | 1 | 0 | 0
21/01/2019 00:00:00 | 1 | 1 | 0
21/01/2019 00:00:00 | 2 | 0 | 0
21/01/2019 00:00:00 | 2 | 1 | 0
21/01/2019 00:00:00 | 3 | 0 | 0
21/01/2019 00:00:00 | 3 | 1 | 0
21/01/2019 00:00:00 | 4 | 0 | 0
21/01/2019 00:00:00 | 4 | 1 | 0
22/01/2019 00:00:00 | 1 | 0 | 0
22/01/2019 00:00:00 | 1 | 1 | 0
22/01/2019 00:00:00 | 2 | 0 | 0
22/01/2019 00:00:00 | 2 | 1 | 0
22/01/2019 00:00:00 | 3 | 0 | 0
22/01/2019 00:00:00 | 3 | 1 | 0
22/01/2019 00:00:00 | 4 | 0 | 0
22/01/2019 00:00:00 | 4 | 1 | 0
23/01/2019 00:00:00 | 1 | 0 | 0
23/01/2019 00:00:00 | 1 | 1 | 0
23/01/2019 00:00:00 | 2 | 0 | 0
23/01/2019 00:00:00 | 2 | 1 | 0
23/01/2019 00:00:00 | 3 | 0 | 0
23/01/2019 00:00:00 | 3 | 1 | 0
23/01/2019 00:00:00 | 4 | 0 | 0
23/01/2019 00:00:00 | 4 | 1 | 0
24/01/2019 00:00:00 | 1 | 0 | 0
24/01/2019 00:00:00 | 1 | 1 | 0
24/01/2019 00:00:00 | 2 | 0 | 0
24/01/2019 00:00:00 | 2 | 1 | 0
24/01/2019 00:00:00 | 3 | 0 | 0
24/01/2019 00:00:00 | 3 | 1 | 0
24/01/2019 00:00:00 | 4 | 0 | 0
24/01/2019 00:00:00 | 4 | 1 | 0
25/01/2019 00:00:00 | 1 | 0 | 0
25/01/2019 00:00:00 | 1 | 1 | 0
25/01/2019 00:00:00 | 2 | 0 | 0
25/01/2019 00:00:00 | 2 | 1 | 0
25/01/2019 00:00:00 | 3 | 0 | 0
25/01/2019 00:00:00 | 3 | 1 | 0
25/01/2019 00:00:00 | 4 | 0 | 0
25/01/2019 00:00:00 | 4 | 1 | 0
26/01/2019 00:00:00 | 1 | 0 | 0
26/01/2019 00:00:00 | 1 | 1 | 0
26/01/2019 00:00:00 | 2 | 0 | 0
26/01/2019 00:00:00 | 2 | 1 | 0
26/01/2019 00:00:00 | 3 | 0 | 0
26/01/2019 00:00:00 | 3 | 1 | 0
26/01/2019 00:00:00 | 4 | 0 | 0
26/01/2019 00:00:00 | 4 | 1 | 0
27/01/2019 00:00:00 | 1 | 0 | 0
27/01/2019 00:00:00 | 1 | 1 | 0
27/01/2019 00:00:00 | 2 | 0 | 0
27/01/2019 00:00:00 | 2 | 1 | 0
27/01/2019 00:00:00 | 3 | 0 | 0
27/01/2019 00:00:00 | 3 | 1 | 0
27/01/2019 00:00:00 | 4 | 0 | 0
27/01/2019 00:00:00 | 4 | 1 | 1
28/01/2019 00:00:00 | 1 | 0 | 0
28/01/2019 00:00:00 | 1 | 1 | 0
28/01/2019 00:00:00 | 2 | 0 | 0
28/01/2019 00:00:00 | 2 | 1 | 0
28/01/2019 00:00:00 | 3 | 0 | 0
28/01/2019 00:00:00 | 3 | 1 | 0
28/01/2019 00:00:00 | 4 | 0 | 0
28/01/2019 00:00:00 | 4 | 1 | 0
29/01/2019 00:00:00 | 1 | 0 | 0
29/01/2019 00:00:00 | 1 | 1 | 1
29/01/2019 00:00:00 | 2 | 0 | 0
29/01/2019 00:00:00 | 2 | 1 | 1
29/01/2019 00:00:00 | 3 | 0 | 0
29/01/2019 00:00:00 | 3 | 1 | 0
29/01/2019 00:00:00 | 4 | 0 | 0
29/01/2019 00:00:00 | 4 | 1 | 0
30/01/2019 00:00:00 | 1 | 0 | 0
30/01/2019 00:00:00 | 1 | 1 | 0
30/01/2019 00:00:00 | 2 | 0 | 0
30/01/2019 00:00:00 | 2 | 1 | 0
30/01/2019 00:00:00 | 3 | 0 | 0
30/01/2019 00:00:00 | 3 | 1 | 1
30/01/2019 00:00:00 | 4 | 0 | 0
30/01/2019 00:00:00 | 4 | 1 | 0
31/01/2019 00:00:00 | 1 | 0 | 0
31/01/2019 00:00:00 | 1 | 1 | 0
31/01/2019 00:00:00 | 2 | 0 | 0
31/01/2019 00:00:00 | 2 | 1 | 0
31/01/2019 00:00:00 | 3 | 0 | 0
31/01/2019 00:00:00 | 3 | 1 | 0
31/01/2019 00:00:00 | 4 | 0 | 0
31/01/2019 00:00:00 | 4 | 1 | 0
01/02/2019 00:00:00 | 1 | 0 | 0
01/02/2019 00:00:00 | 1 | 1 | 0
01/02/2019 00:00:00 | 2 | 0 | 0
01/02/2019 00:00:00 | 2 | 1 | 0
01/02/2019 00:00:00 | 3 | 0 | 0
01/02/2019 00:00:00 | 3 | 1 | 0
01/02/2019 00:00:00 | 4 | 0 | 0
01/02/2019 00:00:00 | 4 | 1 | 0
db<>fiddle here

SQL group aggregate by date range columns in another table

I need a query to group an aggregate in one table by date ranges in another table.
Table 1
weeknumber | weekyear | weekstart | weekend
------------+----------+------------+------------
18 | 2016 | 2016-02-01 | 2016-02-08
19 | 2016 | 2016-02-08 | 2016-02-15
20 | 2016 | 2016-02-15 | 2016-02-22
21 | 2016 | 2016-02-22 | 2016-02-29
22 | 2016 | 2016-02-29 | 2016-03-07
23 | 2016 | 2016-03-07 | 2016-03-14
24 | 2016 | 2016-03-14 | 2016-03-21
25 | 2016 | 2016-03-21 | 2016-03-28
26 | 2016 | 2016-03-28 | 2016-04-04
27 | 2016 | 2016-04-04 | 2016-04-11
28 | 2016 | 2016-04-11 | 2016-04-18
29 | 2016 | 2016-04-18 | 2016-04-25
30 | 2016 | 2016-04-25 | 2016-05-02
31 | 2016 | 2016-05-02 | 2016-05-09
32 | 2016 | 2016-05-09 | 2016-05-16
33 | 2016 | 2016-05-16 | 2016-05-23
34 | 2016 | 2016-05-23 | 2016-05-30
35 | 2016 | 2016-05-30 | 2016-06-06
36 | 2016 | 2016-06-06 | 2016-06-13
37 | 2016 | 2016-06-13 | 2016-06-20
38 | 2016 | 2016-06-20 | 2016-06-27
39 | 2016 | 2016-06-27 | 2016-07-04
40 | 2016 | 2016-07-04 | 2016-07-11
41 | 2016 | 2016-07-11 | 2016-07-18
42 | 2016 | 2016-07-18 | 2016-07-25
43 | 2016 | 2016-07-25 | 2016-08-01
44 | 2016 | 2016-08-01 | 2016-08-08
45 | 2016 | 2016-08-08 | 2016-08-15
46 | 2016 | 2016-08-15 | 2016-08-22
47 | 2016 | 2016-08-22 | 2016-08-29
48 | 2016 | 2016-08-29 | 2016-09-05
49 | 2016 | 2016-09-05 | 2016-09-12
Table 2
accountid | rdate | fee1 | fee2 | fee3 | fee4
-----------+------------+------+------+------+------
481164 | 2015-12-22 | 8 | 1 | 5 | 1
481164 | 2002-12-22 | 1 | 0 | 0 | 0
481166 | 2015-12-22 | 1 | 0 | 0 | 0
481166 | 2016-10-20 | 14 | 0 | 0 | 0
481166 | 2016-10-02 | 5 | 0 | 0 | 0
481166 | 2016-01-06 | 18 | 4 | 0 | 5
482136 | 2016-07-04 | 18 | 0 | 0 | 0
481164 | 2016-07-04 | 2 | 3 | 4 | 5
481164 | 2016-06-28 | 34 | 0 | 0 | 0
481166 | 2016-07-20 | 50 | 0 | 0 | 69
481166 | 2016-07-13 | 16 | 0 | 0 | 5
481166 | 2016-09-15 | 8 | 0 | 0 | 2
481166 | 2016-10-03 | 8 | 0 | 0 | 0
I need to aggregate fee1+fee2+fee3+fee4 for the rdates in each date range (weekstart, weekend) in table 1, and then group by accountid. Something like this:
accountid | weekstart | weekend | SUM
-----------+------------+------------+------
481164 | 2016-02-01 | 2016-02-08 | 69
481166 | 2016-02-01 | 2016-02-08 | 44
481164 | 2016-02-08 | 2016-02-15 | 22
481166 | 2016-02-08 | 2016-02-15 | 12
select accountid, weekstart, weekend,
sum(fee1 + fee2 + fee3 + fee4) as total_fee
from table2
inner join table1 on table2.rdate >= table1.weekstart and table2.rdate < table1.weekend
group by accountid, weekstart, weekend;
Just one thing:
weeknumber | weekyear | weekstart | weekend
------------+----------+------------+------------
18 | 2016 | 2016-02-01 | 2016-02-08
19 | 2016 | 2016-02-08 | 2016-02-15
weekend for week 18 should be 2016-02-07, because 2016-02-08 is weekstart for week 19.
weeknumber | weekyear | weekstart | weekend
------------+----------+------------+------------
18 | 2016 | 2016-02-01 | 2016-02-07
19 | 2016 | 2016-02-08 | 2016-02-14
Check it here: http://rextester.com/NCBO56250
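With corrected, inclusive weekend values, the same aggregation would simply use BETWEEN for the upper bound:
select accountid, weekstart, weekend,
       sum(fee1 + fee2 + fee3 + fee4) as total_fee
from table2
inner join table1
    on table2.rdate between table1.weekstart and table1.weekend  -- inclusive upper bound
group by accountid, weekstart, weekend;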

SQL Paging - Search the OFFSET value to get a specific page

I have a problem with pagination, using MySQL, MariaDB and PostgreSQL. I am looking for a solution without vendor-specific functions like ROW_NUMBER().
I have a (simplified) table as shown. I want to retrieve a page of 10 rows containing a given id value.
SELECT id, costcentre_id, costcentreuser_id, createdate FROM devices
WHERE id < 62 ORDER BY createdate DESC;
+----+---------------+-------------------+---------------------+
| id | costcentre_id | costcentreuser_id | createdate |
+----+---------------+-------------------+---------------------+
| 61 | 18 | 31 | 2015-07-13 13:54:06 |+++++++
| 55 | 13 | 28 | 2015-07-13 13:54:05 |
| 53 | 16 | 27 | 2015-07-13 13:54:05 |
| 54 | 16 | 27 | 2015-07-13 13:54:05 |
| 56 | 13 | 28 | 2015-07-13 13:54:05 | Page 1
| 57 | 5 | 29 | 2015-07-13 13:54:05 |
| 58 | 5 | 29 | 2015-07-13 13:54:05 |
| 59 | 17 | 30 | 2015-07-13 13:54:05 |
| 60 | 17 | 30 | 2015-07-13 13:54:05 |
| 46 | 5 | 23 | 2015-07-13 13:54:04 |
| 45 | 5 | 23 | 2015-07-13 13:54:04 |+++++++
| 47 | 13 | 24 | 2015-07-13 13:54:04 |
| 48 | 13 | 24 | 2015-07-13 13:54:04 |
| 49 | 14 | 25 | 2015-07-13 13:54:04 |
| 50 | 14 | 25 | 2015-07-13 13:54:04 |
| 51 | 15 | 26 | 2015-07-13 13:54:04 | Page 2
| 52 | 15 | 26 | 2015-07-13 13:54:04 |
| 37 | 5 | 19 | 2015-07-13 13:54:03 |
| 38 | 5 | 19 | 2015-07-13 13:54:03 |
| 39 | 12 | 20 | 2015-07-13 13:54:03 |
| 40 | 12 | 20 | 2015-07-13 13:54:03 |+++++++
| 41 | 5 | 21 | 2015-07-13 13:54:03 |
| 42 | 5 | 21 | 2015-07-13 13:54:03 |
| 43 | 11 | 22 | 2015-07-13 13:54:03 |
| 44 | 11 | 22 | 2015-07-13 13:54:03 |
| 36 | 11 | 18 | 2015-07-13 13:54:02 | Page 3
| 35 | 11 | 18 | 2015-07-13 13:54:02 |
| 34 | 6 | 17 | 2015-07-13 13:54:02 |
| 33 | 6 | 17 | 2015-07-13 13:54:02 |
| 32 | 5 | 16 | 2015-07-13 13:54:02 |
| 31 | 5 | 16 | 2015-07-13 13:54:02 |+++++++
| 30 | 5 | 15 | 2015-07-13 13:54:02 |
| 29 | 5 | 15 | 2015-07-13 13:54:02 |
| 21 | 5 | 11 | 2015-07-13 13:54:01 |
| 22 | 5 | 11 | 2015-07-13 13:54:01 |
| 23 | 5 | 12 | 2015-07-13 13:54:01 | Page 4
| 24 | 5 | 12 | 2015-07-13 13:54:01 |
| 25 | 5 | 13 | 2015-07-13 13:54:01 |
| 26 | 5 | 13 | 2015-07-13 13:54:01 |
| 27 | 10 | 14 | 2015-07-13 13:54:01 |
| 28 | 10 | 14 | 2015-07-13 13:54:01 |+++++++
| 11 | 6 | 6 | 2015-07-13 13:54:00 |
| 12 | 6 | 6 | 2015-07-13 13:54:00 |
| 13 | 7 | 7 | 2015-07-13 13:54:00 |
| 14 | 7 | 7 | 2015-07-13 13:54:00 |
| 15 | 5 | 8 | 2015-07-13 13:54:00 |
| 16 | 5 | 8 | 2015-07-13 13:54:00 |
| 17 | 8 | 9 | 2015-07-13 13:54:00 |
| 18 | 8 | 9 | 2015-07-13 13:54:00 |
| 19 | 9 | 10 | 2015-07-13 13:54:00 |
| 20 | 9 | 10 | 2015-07-13 13:54:00 |
| 2 | 1 | 1 | 2015-07-13 13:53:59 |
| 3 | 2 | 2 | 2015-07-13 13:53:59 |
| 4 | 2 | 2 | 2015-07-13 13:53:59 |
| 5 | 3 | 3 | 2015-07-13 13:53:59 |
| 6 | 3 | 3 | 2015-07-13 13:53:59 |
| 7 | 4 | 4 | 2015-07-13 13:53:59 |
| 8 | 4 | 4 | 2015-07-13 13:53:59 |
| 9 | 5 | 5 | 2015-07-13 13:53:59 |
| 10 | 5 | 5 | 2015-07-13 13:53:59 |
| 1 | 1 | 1 | 2015-07-13 13:53:59 |
+----+---------------+-------------------+---------------------+
I want to get the page with id 35 (here page 3)
SELECT id, costcentre_id, costcentreuser_id, createdate FROM devices
WHERE id < 62 ORDER BY createdate DESC LIMIT 10 OFFSET 20;
+----+---------------+-------------------+---------------------+
| id | costcentre_id | costcentreuser_id | createdate |
+----+---------------+-------------------+---------------------+
| 37 | 5 | 19 | 2015-07-13 13:54:03 |
| 40 | 12 | 20 | 2015-07-13 13:54:03 |
| 41 | 5 | 21 | 2015-07-13 13:54:03 |
| 38 | 5 | 19 | 2015-07-13 13:54:03 |
| 42 | 5 | 21 | 2015-07-13 13:54:03 |
| 35 | 11 | 18 | 2015-07-13 13:54:02 |
| 36 | 11 | 18 | 2015-07-13 13:54:02 |
| 33 | 6 | 17 | 2015-07-13 13:54:02 |
| 29 | 5 | 15 | 2015-07-13 13:54:02 |
| 30 | 5 | 15 | 2015-07-13 13:54:02 |
+----+---------------+-------------------+---------------------+
But how do I calculate the OFFSET value automatically?
Thank you for any ideas!
You can use TOP to your advantage here. Some DBMSs support a variable or dynamic TOP condition; if not, this would need to be generated in your target language.
This is also not the most efficient way, but it only requires a deterministic sorting key.
--offset 10, page 2
SELECT *
FROM (
SELECT TOP 10 * --top offset number
FROM (
SELECT TOP 20 * --top offset * page number
FROM MyTable
ORDER BY id --sort ASCENDING
) T1
ORDER BY id DESC --sort by same key DESCENDING
) T2
ORDER BY id --reorder to original order, unless you want to order in client app
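Applied to the question's query (page size 10, target id 35 on page 3), the inner TOP becomes page size * page number = 30 and the outer TOP the page size. A sketch, assuming a DBMS that supports TOP as above, with id added as a deterministic tie-breaker because createdate alone has duplicates:
--page size 10, page 3 (rows 21-30)
SELECT *
FROM (
    SELECT TOP 10 *                          --TOP = page size
    FROM (
        SELECT TOP 30 id, costcentre_id, costcentreuser_id, createdate  --TOP = page size * page number
        FROM devices
        WHERE id < 62
        ORDER BY createdate DESC, id DESC    --desired order, made deterministic with id
    ) T1
    ORDER BY createdate ASC, id ASC          --reversed, keeps only the last 10 of the 30
) T2
ORDER BY createdate DESC, id DESC            --restore the desired order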