My difficulty is joining two tables that carry a GMV value, so that each status row picks up the GMV rows whose grass_date falls between the previous historical_date and the current one (see the expected output below).
table 1
| id | historical_date| status |
| -------- | -------------- | -------------- |
| 45615266 | 2021-06-02 | Pending |
| 45615266 | 2021-12-05 | Validated |
table 2
| id | grass_date | value |
| -------- | -------------- | -------------- |
| 45615266 | 2021-02-02 | 24.02 |
| 45615266 | 2021-03-17 | 15.48 |
| 45615266 | 2020-12-21 | 1993.85 |
| 45615266 | 2021-06-02 | 74.56 |
| 45615266 | 2021-07-14 | 74.48 |
| 45615266 | 2021-12-04 | 99.48 |
Expected
| id | historical_date | status | grass_date | value |
| -------- | -------------- | -------------- | -------------- | -------------- |
| 45615266 | 2021-06-02 | Pending | 2021-02-02 | 24.02 |
| 45615266 | 2021-06-02 | Pending | 2021-03-17 | 15.48 |
| 45615266 | 2021-06-02 | Pending | 2020-12-21 | 1993.85 |
| 45615266 | 2021-12-05 | Validated | 2021-06-02 | 74.56 |
| 45615266 | 2021-12-05 | Validated | 2021-07-14 | 74.48 |
| 45615266 | 2021-12-05 | Validated | 2021-12-04 | 99.48 |
I use Trino (PrestoSQL), where a subquery is limited to returning a single column.
You can use a NATURAL JOIN:
SELECT * FROM table1 NATURAL JOIN table2
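Note that a NATURAL JOIN matches on every column the two tables share (here just id), so on its own it pairs each status row with all six GMV rows rather than the ranges in the expected output. A minimal sketch of a range join for Trino, assuming the tables are named table1 and table2: use lag() to find each status row's previous historical_date, then join on that window.
WITH ranges AS (
    SELECT id, historical_date, status,
           lag(historical_date) OVER (PARTITION BY id ORDER BY historical_date)
               AS prev_historical_date
    FROM table1
)
SELECT r.id, r.historical_date, r.status, t2.grass_date, t2.value
FROM ranges r
JOIN table2 t2
  ON t2.id = r.id
 AND t2.grass_date < r.historical_date
 -- the earliest status has no predecessor, so it takes every earlier GMV row
 AND (r.prev_historical_date IS NULL OR t2.grass_date >= r.prev_historical_date)
ORDER BY r.historical_date, t2.grass_date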
I'd expect this to work to get me a list of calendar dates over the past 12 months excluding weekends, but it just gives me the entire list of dates. That's tolerable, I suppose, but I want to know why the query below is incorrect.
SELECT ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + rownum AS CalendarDate
FROM all_objects
WHERE ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + rownum <= sysdate
AND to_char(sysdate,'DY') NOT IN ('SAT','SUN')
Because you're doing this:
AND to_char(sysdate,'DY') NOT IN ('SAT','SUN')
And today isn't Saturday or Sunday. You need to look at the calculated CalendarDate value, but you can't refer to that alias at the same level of the query. You could try to recalculate it:
AND to_char(ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + rownum,'DY') NOT IN ('SAT','SUN')
but this will return no rows, at least when run at the moment. As it happens, March 1st 2020 was a Sunday, so the first candidate date is excluded; and because rownum is only incremented once a row passes the filter, the next row is assigned rownum 1 again, produces the same excluded date, and so on, so no row ever qualifies.
You can use an inline view to avoid both issues:
SELECT CalendarDate
FROM (
SELECT ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + rownum AS CalendarDate
FROM all_objects
WHERE ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + rownum <= sysdate
)
WHERE to_char(CalendarDate,'DY','NLS_DATE_LANGUAGE=ENGLISH') NOT IN ('SAT','SUN')
CALENDARDATE
02-MAR-20
03-MAR-20
04-MAR-20
05-MAR-20
06-MAR-20
09-MAR-20
10-MAR-20
...
db<>fiddle
I've chucked in a language modifier to stop it behaving differently for users with sessions not set to English.
Querying against all_objects isn't ideal though; it would be better to use a hierarchical query:
SELECT *
FROM (
SELECT ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + level AS CalendarDate
FROM dual
CONNECT BY level <= TRUNC(SYSDATE) - ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) + 1
)
WHERE to_char(CalendarDate,'DY','NLS_DATE_LANGUAGE=ENGLISH') NOT IN ('SAT','SUN')
ORDER BY CalendarDate
db<>fiddle
or a recursive CTE, if you're 11gR2+:
WITH rcte (CalendarDate) AS (
SELECT ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12)
FROM dual
UNION ALL
SELECT rcte.CalendarDate + interval '1' day
FROM rcte
WHERE rcte.CalendarDate < TRUNC(SYSDATE)
)
SELECT CalendarDate
FROM rcte
WHERE to_char(CalendarDate,'DY','NLS_DATE_LANGUAGE=ENGLISH') NOT IN ('SAT','SUN')
ORDER BY CalendarDate
db<>fiddle (as 18c to avoid a couple of issues with the patch level in the 11g version it uses).
You're checking whether today is Saturday or Sunday with to_char(sysdate,'DY'); you need to check CalendarDate, which isn't available at that level of the query. You can use a CTE to calculate the calendar, then remove the weekends with your condition, as below.
with cte (CalendarDate) as
(
  SELECT ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + rownum AS CalendarDate
  FROM all_objects
  WHERE ADD_MONTHS(TRUNC(SYSDATE,'MM'),-12) - 1 + rownum <= sysdate
)
select * from cte
where to_char(CalendarDate,'DY') not in ('SAT','SUN');
| CALENDARDATE |
| :----------- |
| 02-MAR-20 |
| 03-MAR-20 |
| 04-MAR-20 |
| 05-MAR-20 |
| 06-MAR-20 |
| 09-MAR-20 |
| 10-MAR-20 |
| 11-MAR-20 |
| 12-MAR-20 |
| 13-MAR-20 |
| 16-MAR-20 |
| 17-MAR-20 |
| 18-MAR-20 |
| 19-MAR-20 |
| 20-MAR-20 |
| 23-MAR-20 |
| 24-MAR-20 |
| 25-MAR-20 |
| 26-MAR-20 |
| 27-MAR-20 |
| 30-MAR-20 |
| 31-MAR-20 |
| 01-APR-20 |
| 02-APR-20 |
| 03-APR-20 |
| 06-APR-20 |
| 07-APR-20 |
| 08-APR-20 |
| 09-APR-20 |
| 10-APR-20 |
| 13-APR-20 |
| 14-APR-20 |
| 15-APR-20 |
| 16-APR-20 |
| 17-APR-20 |
| 20-APR-20 |
| 21-APR-20 |
| 22-APR-20 |
| 23-APR-20 |
| 24-APR-20 |
| 27-APR-20 |
| 28-APR-20 |
| 29-APR-20 |
| 30-APR-20 |
| 01-MAY-20 |
| 04-MAY-20 |
| 05-MAY-20 |
| 06-MAY-20 |
| 07-MAY-20 |
| 08-MAY-20 |
| 11-MAY-20 |
| 12-MAY-20 |
| 13-MAY-20 |
| 14-MAY-20 |
| 15-MAY-20 |
| 18-MAY-20 |
| 19-MAY-20 |
| 20-MAY-20 |
| 21-MAY-20 |
| 22-MAY-20 |
| 25-MAY-20 |
| 26-MAY-20 |
| 27-MAY-20 |
| 28-MAY-20 |
| 29-MAY-20 |
| 01-JUN-20 |
| 02-JUN-20 |
| 03-JUN-20 |
| 04-JUN-20 |
| 05-JUN-20 |
| 08-JUN-20 |
| 09-JUN-20 |
| 10-JUN-20 |
| 11-JUN-20 |
| 12-JUN-20 |
| 15-JUN-20 |
| 16-JUN-20 |
| 17-JUN-20 |
| 18-JUN-20 |
| 19-JUN-20 |
| 22-JUN-20 |
| 23-JUN-20 |
| 24-JUN-20 |
| 25-JUN-20 |
| 26-JUN-20 |
| 29-JUN-20 |
| 30-JUN-20 |
| 01-JUL-20 |
| 02-JUL-20 |
| 03-JUL-20 |
| 06-JUL-20 |
| 07-JUL-20 |
| 08-JUL-20 |
| 09-JUL-20 |
| 10-JUL-20 |
| 13-JUL-20 |
| 14-JUL-20 |
| 15-JUL-20 |
| 16-JUL-20 |
| 17-JUL-20 |
| 20-JUL-20 |
| 21-JUL-20 |
| 22-JUL-20 |
| 23-JUL-20 |
| 24-JUL-20 |
| 27-JUL-20 |
| 28-JUL-20 |
| 29-JUL-20 |
| 30-JUL-20 |
| 31-JUL-20 |
| 03-AUG-20 |
| 04-AUG-20 |
| 05-AUG-20 |
| 06-AUG-20 |
| 07-AUG-20 |
| 10-AUG-20 |
| 11-AUG-20 |
| 12-AUG-20 |
| 13-AUG-20 |
| 14-AUG-20 |
| 17-AUG-20 |
| 18-AUG-20 |
| 19-AUG-20 |
| 20-AUG-20 |
| 21-AUG-20 |
| 24-AUG-20 |
| 25-AUG-20 |
| 26-AUG-20 |
| 27-AUG-20 |
| 28-AUG-20 |
| 31-AUG-20 |
| 01-SEP-20 |
| 02-SEP-20 |
| 03-SEP-20 |
| 04-SEP-20 |
| 07-SEP-20 |
| 08-SEP-20 |
| 09-SEP-20 |
| 10-SEP-20 |
| 11-SEP-20 |
| 14-SEP-20 |
| 15-SEP-20 |
| 16-SEP-20 |
| 17-SEP-20 |
| 18-SEP-20 |
| 21-SEP-20 |
| 22-SEP-20 |
| 23-SEP-20 |
| 24-SEP-20 |
| 25-SEP-20 |
| 28-SEP-20 |
| 29-SEP-20 |
| 30-SEP-20 |
| 01-OCT-20 |
| 02-OCT-20 |
| 05-OCT-20 |
| 06-OCT-20 |
| 07-OCT-20 |
| 08-OCT-20 |
| 09-OCT-20 |
| 12-OCT-20 |
| 13-OCT-20 |
| 14-OCT-20 |
| 15-OCT-20 |
| 16-OCT-20 |
| 19-OCT-20 |
| 20-OCT-20 |
| 21-OCT-20 |
| 22-OCT-20 |
| 23-OCT-20 |
| 26-OCT-20 |
| 27-OCT-20 |
| 28-OCT-20 |
| 29-OCT-20 |
| 30-OCT-20 |
| 02-NOV-20 |
| 03-NOV-20 |
| 04-NOV-20 |
| 05-NOV-20 |
| 06-NOV-20 |
| 09-NOV-20 |
| 10-NOV-20 |
| 11-NOV-20 |
| 12-NOV-20 |
| 13-NOV-20 |
| 16-NOV-20 |
| 17-NOV-20 |
| 18-NOV-20 |
| 19-NOV-20 |
| 20-NOV-20 |
| 23-NOV-20 |
| 24-NOV-20 |
| 25-NOV-20 |
| 26-NOV-20 |
| 27-NOV-20 |
| 30-NOV-20 |
| 01-DEC-20 |
| 02-DEC-20 |
| 03-DEC-20 |
| 04-DEC-20 |
| 07-DEC-20 |
| 08-DEC-20 |
| 09-DEC-20 |
| 10-DEC-20 |
| 11-DEC-20 |
| 14-DEC-20 |
| 15-DEC-20 |
| 16-DEC-20 |
| 17-DEC-20 |
| 18-DEC-20 |
| 21-DEC-20 |
| 22-DEC-20 |
| 23-DEC-20 |
| 24-DEC-20 |
| 25-DEC-20 |
| 28-DEC-20 |
| 29-DEC-20 |
| 30-DEC-20 |
| 31-DEC-20 |
| 01-JAN-21 |
| 04-JAN-21 |
| 05-JAN-21 |
| 06-JAN-21 |
| 07-JAN-21 |
| 08-JAN-21 |
| 11-JAN-21 |
| 12-JAN-21 |
| 13-JAN-21 |
| 14-JAN-21 |
| 15-JAN-21 |
| 18-JAN-21 |
| 19-JAN-21 |
| 20-JAN-21 |
| 21-JAN-21 |
| 22-JAN-21 |
| 25-JAN-21 |
| 26-JAN-21 |
| 27-JAN-21 |
| 28-JAN-21 |
| 29-JAN-21 |
| 01-FEB-21 |
| 02-FEB-21 |
| 03-FEB-21 |
| 04-FEB-21 |
| 05-FEB-21 |
| 08-FEB-21 |
| 09-FEB-21 |
| 10-FEB-21 |
| 11-FEB-21 |
| 12-FEB-21 |
| 15-FEB-21 |
| 16-FEB-21 |
| 17-FEB-21 |
| 18-FEB-21 |
| 19-FEB-21 |
| 22-FEB-21 |
| 23-FEB-21 |
| 24-FEB-21 |
| 25-FEB-21 |
| 26-FEB-21 |
| 01-MAR-21 |
| 02-MAR-21 |
| 03-MAR-21 |
| 04-MAR-21 |
| 05-MAR-21 |
| 08-MAR-21 |
| 09-MAR-21 |
db<>fiddle here
Right now I have this query:
SELECT DISTINCT
stock_picking.id as delivery_order_id,
sale_order.id as sale_order_id,
sale_order.name as sale_order_name,
stock_picking.origin as stock_picking_origin,
stock_picking.name as stock_picking_name,
stock_picking.create_date as stock_picking_create_date,
sub.count_origin as sale_order_delivery_order_done_count
FROM
(
SELECT
origin,
COUNT(origin) as count_origin
FROM stock_picking
WHERE state = 'done'
GROUP BY origin
HAVING COUNT(origin) > 1
ORDER BY origin
) sub
JOIN sale_order ON sale_order.name = sub.origin
JOIN account_invoice ON account_invoice.origin = sale_order.name
JOIN stock_picking ON stock_picking.origin = sale_order.name
WHERE
account_invoice.create_date >= '04/17/20' AND
sale_order.create_date <= '04/01/20 07:00' AND
sale_order.create_date >= '03/01/20'
ORDER BY sale_order.name
;
It returns:
+-------------------+---------------+-----------------+----------------------+--------------------+----------------------------+--------------------------------------+
| delivery_order_id | sale_order_id | sale_order_name | stock_picking_origin | stock_picking_name | stock_picking_create_date | sale_order_delivery_order_done_count |
+-------------------+---------------+-----------------+----------------------+--------------------+----------------------------+--------------------------------------+
| 2053131 | 5840046 | 3258428 | 3258428 | WH/OUT/1804215 | 2020-03-01 07:10:32.144694 | 2 |
| 2071149 | 5840046 | 3258428 | 3258428 | WH/OUT/1819605 | 2020-03-03 18:00:25.208632 | 2 |
| 2154480 | 5840046 | 3258428 | 3258428 | WH/OUT/1894584 | 2020-03-11 08:39:33.514114 | 2 |
| 2053494 | 5840408 | 3258728 | 3258728 | WH/OUT/1804574 | 2020-03-01 07:41:26.728154 | 2 |
| 2105133 | 5840408 | 3258728 | 3258728 | WH/OUT/1849288 | 2020-03-07 13:59:10.049683 | 2 |
| 2192492 | 5840408 | 3258728 | 3258728 | WH/OUT/1929553 | 2020-03-13 09:10:26.18469 | 2 |
| 2061022 | 5861189 | 3279458 | 3279458 | WH/OUT/1811084 | 2020-03-02 14:37:35.803326 | 2 |
| 2170656 | 5861189 | 3279458 | 3279458 | WH/OUT/1909477 | 2020-03-12 08:57:15.434752 | 2 |
| 2072002 | 5885577 | 3294059 | 3294059 | WH/OUT/109633 | 2020-03-04 02:44:03.302924 | 2 |
| 2130430 | 5885577 | 3294059 | 3294059 | WH/OUT/114259 | 2020-03-10 03:13:58.33838 | 2 |
+-------------------+---------------+-----------------+----------------------+--------------------+----------------------------+--------------------------------------+
I want the sale_order_id column to be unique, keeping the row with the smallest delivery_order_id rather than aggregating the others.
I want to have a result like this:
+-------------------+---------------+-----------------+----------------------+--------------------+----------------------------+--------------------------------------+
| delivery_order_id | sale_order_id | sale_order_name | stock_picking_origin | stock_picking_name | stock_picking_create_date | sale_order_delivery_order_done_count |
+-------------------+---------------+-----------------+----------------------+--------------------+----------------------------+--------------------------------------+
| 2053131 | 5840046 | 3258428 | 3258428 | WH/OUT/1804215 | 2020-03-01 07:10:32.144694 | 2 |
| 2053494 | 5840408 | 3258728 | 3258728 | WH/OUT/1804574 | 2020-03-01 07:41:26.728154 | 2 |
| 2061022 | 5861189 | 3279458 | 3279458 | WH/OUT/1811084 | 2020-03-02 14:37:35.803326 | 2 |
| 2072002 | 5885577 | 3294059 | 3294059 | WH/OUT/109633 | 2020-03-04 02:44:03.302924 | 2 |
+-------------------+---------------+-----------------+----------------------+--------------------+----------------------------+--------------------------------------+
You can use distinct on. Your query is complicated, so I'll encapsulate it in a CTE:
with q as (
. . .
)
select distinct on (sale_order_id) q.*
from q
order by sale_order_id, delivery_order_id;
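Here distinct on (sale_order_id) keeps the first row for each sale_order_id according to the order by, so putting delivery_order_id next in the sort returns the row with the least delivery_order_id. A minimal sketch of the semantics, against a hypothetical table t(grp, ord, payload):
select distinct on (grp) grp, ord, payload
from t
order by grp, ord;  -- one row per grp: the one with the smallest ord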
I am having trouble with a SQL query. The query result should look like this:
+------------+------------+-----+------+-------+--+--+--+
| District | Tehsil | yes | no | Total | | | |
+------------+------------+-----+------+-------+--+--+--+
| ABBOTTABAD | ABBOTTABAD | 377 | 5927 | 6304 | | | |
| ABBOTTABAD | HAVELIAN | 112 | 2276 | 2388 | | | |
| ABBOTTABAD | Overall | 489 | 8203 | 8692 | | | |
| CHARSADDA | CHARSADDA | 289 | 3762 | 4051 | | | |
| CHARSADDA | SHABQADAR | 121 | 1376 | 1497 | | | |
| CHARSADDA | TANGI | 94 | 1703 | 1797 | | | |
| CHARSADDA | Overall | 504 | 6841 | 7345 | | | |
+------------+------------+-----+------+-------+--+--+--+
The Overall total should be shown at the end of every parent category, but right now it is showing like this:
+------------+------------+-----+------+-------+--+--+--+
| District | Tehsil | yes | no | Total | | | |
+------------+------------+-----+------+-------+--+--+--+
| ABBOTTABAD | ABBOTTABAD | 377 | 5927 | 6304 | | | |
| ABBOTTABAD | HAVELIAN | 112 | 2276 | 2388 | | | |
| ABBOTTABAD | Overall | 489 | 8203 | 8692 | | | |
| CHARSADDA | CHARSADDA | 289 | 3762 | 4051 | | | |
| CHARSADDA | Overall | 504 | 6841 | 7345 | | | |
| CHARSADDA | SHABQADAR | 121 | 1376 | 1497 | | | |
| CHARSADDA | TANGI | 94 | 1703 | 1797 | | | |
+------------+------------+-----+------+-------+--+--+--+
My query is sorting the second column alphabetically within the first column, even though the ORDER BY is applied only to my first column. This is my query:
select District as 'District', tName as 'tehsil', [1] as 'yes', [0] as 'no', ISNULL([1]+[0], 0) as 'Total' from
(
select d.Name as 'District',
case when grouping (t.Name)=1 then 'Overall' else t.Name end as tName,
BoundaryWallAvailable,
count(*) as total from School s
INNER JOIN SchoolIndicator i ON (i.refSchoolID=s.SchoolID)
INNER JOIN Tehsil t ON (t.TehsilID=s.refTehsilID)
INNER JOIN district d ON (d.DistrictID=t.refDistrictID)
group by
GROUPING sets((d.Name, BoundaryWallAvailable), (d.Name,t.Name, BoundaryWallAvailable))
) B
PIVOT
(
max(total) for BoundaryWallAvailable in ([1],[0])
) as Pvt
order by District
P.S.: BoundaryWallAvailable is a single column; through pivoting I am breaking it into the Yes and No columns.
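One way to push each district's Overall row to the end, sketched and untested against this schema, is to sort on a flag derived from tName before tName itself, replacing the final ORDER BY with:
order by District,
         case when tName = 'Overall' then 1 else 0 end,
         tName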
timesetup table (unnormalized); this table holds the time conditions used for filtering the result.
| time_id | period_from | period_to | session1_from | session1_to | session2_from | session2_to | session3_from | session3_to | session4_from | session4_to | session5_from | session5_to |
|---------|-------------|------------|---------------|-------------|---------------|-------------|---------------|-------------|---------------|-------------|---------------|-------------|
| 1 | 10/09/2015 | 11/09/2015 | 04:00:00 | 05:00:00 | 12:00:00 | 13:00:00 | 15:00:00 | 16:00:00 | 18:00:00 | 18:35:00 | 19:00:00 | 20:00:00 |
| 2 | 12/09/2015 | 13/09/2015 | 04:10:00 | 05:10:00 | 12:10:00 | 13:10:00 | 15:10:00 | 16:10:00 | 18:10:00 | 18:45:00 | 19:10:00 | 20:10:00 |
|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
Normalized Version :
dateperiode table :
| period_id | date_from | date_to |
|-----------|------------|------------|
| 1 | 10/09/2015 | 11/09/2015 |
| 2 | 12/09/2015 | 13/09/2015 |
|-------------------------------------|
sessionrange table :
| sessionrange_id | session | session_from | session_to |
|-----------------|-----------|--------------|-------------|
| 1 | 1 | 04:00:00 | 05:00:00 |
| 2 | 2 | 12:00:00 | 13:00:00 |
| 3 | 3 | 15:00:00 | 16:00:00 |
| 4 | 4 | 18:00:00 | 18:35:00 |
| 5 | 5 | 19:00:00 | 20:00:00 |
| 6 | 1 | 04:10:00 | 05:10:00 |
| 7 | 2 | 12:10:00 | 13:10:00 |
| 8 | 3 | 15:10:00 | 16:10:00 |
| 9 | 4 | 18:10:00 | 18:45:00 |
| 10 | 5 | 19:10:00 | 20:10:00 |
|-----------------|-----------|--------------|-------------|
timsetup table :
| period_id | sessionrange_id |
|-----------|-----------------|
| 1 | 1 |
| 1 | 2 |
| 1 | 3 |
| 1 | 4 |
| 1 | 5 |
| 2 | 6 |
| 2 | 7 |
| 2 | 8 |
| 2 | 9 |
| 2 | 10 |
|-----------|-----------------|
checktime table, this table is containing data from fingerprint scanner (tapping_time field)
|userid | tapping_time |
|--------|----------------------|
|234 | 10/09/2015 04:20:04 |
|234 | 10/09/2015 04:20:06 |
|234 | 10/09/2015 12:15:35 |
|234 | 10/09/2015 15:31:11 |
|234 | 10/09/2015 18:19:10 |
|234 | 10/09/2015 18:19:15 |
|234 | 10/09/2015 19:37:53 |
|234 | 11/09/2015 04:38:42 |
|234 | 11/09/2015 04:38:47 |
|234 | 11/09/2015 12:21:27 |
|234 | 11/09/2015 15:45:30 |
|234 | 11/09/2015 15:45:37 |
|234 | 11/09/2015 18:27:15 |
|234 | 11/09/2015 19:55:08 |
|234 | 11/09/2015 19:55:12 |
|234 | 12/09/2015 04:45:10 |
|234 | 12/09/2015 04:45:13 |
|234 | 12/09/2015 13:12:55 |
|234 | 12/09/2015 16:35:08 |
|234 | 12/09/2015 18:49:10 |
|234 | 12/09/2015 20:20:57 |
|234 | 13/09/2015 05:11:56 |
|234 | 13/09/2015 05:12:05 |
|234 | 13/09/2015 12:45:13 |
|234 | 13/09/2015 15:47:25 |
|234 | 13/09/2015 18:31:27 |
|234 | 13/09/2015 18:31:30 |
|234 | 13/09/2015 20:01:18 |
|-------------------------------|
I'm trying to create an Access query that groups the date/times into the periods in the timesetup table. tapping_time should yield only one result per session (even when the user taps more than once in that window), based on the session conditions in the timesetup table. Also, tap times outside the sessions (not between session_from and session_to) should never be counted or appear in the result table.
Desired Result:
| userid | date | tapping_on | session |
|--------|------------|------------|---------|
| 234 | 10/09/2015 | 04:20:04 | 1 |
| 234 | 10/09/2015 | 12:15:35 | 2 |
| 234 | 10/09/2015 | 15:31:11 | 3 |
| 234 | 10/09/2015 | 18:19:10 | 4 |
| 234 | 10/09/2015 | 19:37:53 | 5 |
| 234 | 11/09/2015 | 04:38:42 | 1 |
| 234 | 11/09/2015 | 12:21:27 | 2 |
| 234 | 11/09/2015 | 15:45:30 | 3 |
| 234 | 11/09/2015 | 18:27:15 | 4 |
| 234 | 11/09/2015 | 19:55:08 | 5 |
| 234 | 12/09/2015 | 04:45:10 | 1 |
| 234 | 12/09/2015 | 16:35:08 | 3 |
| 234 | 12/09/2015 | 18:49:10 | 4 |
| 234 | 13/09/2015 | 12:45:13 | 2 |
| 234 | 13/09/2015 | 15:47:25 | 3 |
| 234 | 13/09/2015 | 18:31:27 | 4 |
| 234 | 13/09/2015 | 20:01:18 | 5 |
|--------------------------------------------|
You might have gone a little bit overboard on the normalization: the relation between dateperiode and sessionrange looks like a one-to-many relationship to me, and thus doesn't require a junction table.
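For illustration, a sketch of that simplification, assuming you're free to alter the schema (run each statement as a separate Access query): move period_id onto sessionrange, copy the values across from timsetup, and drop the junction table.
ALTER TABLE sessionrange ADD COLUMN period_id LONG;
UPDATE sessionrange
INNER JOIN timsetup ON sessionrange.sessionrange_id = timsetup.sessionrange_id
SET sessionrange.period_id = timsetup.period_id;
DROP TABLE timsetup;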
Given the data above, I believe the following query should work, but unfortunately I don't have a decent test setup at the ready:
SELECT userid, tDatepart As [date], Min(t.tTimepart) As tapping_on, session
FROM
(
    SELECT session_from, session_to, date_from, date_to, session
    FROM (timsetup i
    INNER JOIN dateperiode d ON i.period_id = d.period_id)
    INNER JOIN sessionrange q ON i.sessionrange_id = q.sessionrange_id
) As s
INNER JOIN
(
    SELECT Int(tapping_time) As tDatepart,
           CDate(tapping_time - Int(tapping_time)) As tTimepart,
           userid, tapping_time
    FROM checktime
) As t
ON (t.tDatepart >= s.date_from AND t.tDatepart <= s.date_to
    AND t.tTimepart >= s.session_from AND t.tTimepart <= s.session_to)
GROUP BY userid, tDatepart, session
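Note that Access wants chained joins nested in parentheses, which is why the first join in the inner select above is wrapped; the general pattern, with hypothetical tables a, b and c, is:
SELECT *
FROM (a INNER JOIN b ON a.x = b.x)
INNER JOIN c ON b.y = c.y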
I have a dataset for which I have to conditionally count rows from table B that are between two dates in table A. I have to do this without the use of a correlated subquery in the SELECT clause, as this is not supported in Netezza - docs: https://www.ibm.com/support/knowledgecenter/en/SSULQD_7.0.3/com.ibm.nz.dbu.doc/c_dbuser_correlated_subqueries_ntz_sql.html.
Background on the tables: users can log in to a site (logins). When they log in, they can take actions, which are recorded in actions_taken. The desired output is a count of login rows that fall between each actions_taken row's action_date and lag_action_date.
Data and attempt found here: http://rextester.com/NLDH13254
Table: actions_taken (with added calculations - see RexTester.)
| user_id | action_type | action_date | lag_action_date | elapsed_days |
|---------|---------------|-------------|-----------------|--------------|
| 12345 | action_type_1 | 6/27/2017 | 3/3/2017 | 116 |
| 12345 | action_type_1 | 3/3/2017 | 2/28/2017 | 3 |
| 12345 | action_type_1 | 2/28/2017 | NULL | NULL |
| 12345 | action_type_2 | 3/6/2017 | 3/3/2017 | 3 |
| 12345 | action_type_2 | 3/3/2017 | 3/25/2016 | 343 |
| 12345 | action_type_2 | 3/25/2016 | NULL | NULL |
| 12345 | action_type_4 | 3/6/2017 | 3/3/2017 | 3 |
| 12345 | action_type_4 | 3/3/2017 | NULL | NULL |
| 99887 | action_type_1 | 4/1/2017 | 2/11/2017 | 49 |
| 99887 | action_type_1 | 2/11/2017 | 1/28/2017 | 14 |
| 99887 | action_type_1 | 1/28/2017 | NULL | NULL |
Table: logins
| user_id | login_date |
|---------|------------|
| 12345 | 6/27/2017 |
| 12345 | 6/26/2017 |
| 12345 | 3/7/2017 |
| 12345 | 3/6/2017 |
| 12345 | 3/3/2017 |
| 12345 | 3/2/2017 |
| 12345 | 3/1/2017 |
| 12345 | 2/28/2017 |
| 12345 | 2/27/2017 |
| 12345 | 2/25/2017 |
| 12345 | 3/25/2016 |
| 12345 | 3/23/2016 |
| 12345 | 3/20/2016 |
| 99887 | 6/27/2017 |
| 99887 | 6/26/2017 |
| 99887 | 6/24/2017 |
| 99887 | 4/2/2017 |
| 99887 | 4/1/2017 |
| 99887 | 3/30/2017 |
| 99887 | 3/8/2017 |
| 99887 | 3/6/2017 |
| 99887 | 3/3/2017 |
| 99887 | 3/2/2017 |
| 99887 | 2/28/2017 |
| 99887 | 2/11/2017 |
| 99887 | 1/28/2017 |
| 99887 | 1/26/2017 |
| 99887 | 5/28/2016 |
DESIRED OUTPUT: cnt_logins_between_action_dates field
| user_id | action_type | action_date | lag_action_date | elapsed_days | cnt_logins_between_action_dates |
|---------|---------------|-------------|-----------------|--------------|---------------------------------|
| 12345 | action_type_1 | 6/27/2017 | 3/3/2017 | 116 | 5 |
| 12345 | action_type_1 | 3/3/2017 | 2/28/2017 | 3 | 4 |
| 12345 | action_type_1 | 2/28/2017 | NULL | NULL | 1 |
| 12345 | action_type_2 | 3/6/2017 | 3/3/2017 | 3 | 2 |
| 12345 | action_type_2 | 3/3/2017 | 3/25/2016 | 343 | 7 |
| 12345 | action_type_2 | 3/25/2016 | NULL | NULL | 1 |
| 12345 | action_type_4 | 3/6/2017 | 3/3/2017 | 3 | 2 |
| 12345 | action_type_4 | 3/3/2017 | NULL | NULL | 1 |
| 99887 | action_type_1 | 4/1/2017 | 2/11/2017 | 49 | 8 |
| 99887 | action_type_1 | 2/11/2017 | 1/28/2017 | 14 | 2 |
| 99887 | action_type_1 | 1/28/2017 | NULL | NULL | 1 |
You don't need a correlated subquery. Get the previous date using lag, then join the logins table to count the logins between the dates.
with prev_dates as (
    select at.*,
           lag(action_date) over (partition by user_id, action_type
                                  order by action_date) as lag_action_date
    from actions_taken at
)
select at.user_id, at.action_type, at.action_date, at.lag_action_date,
       at.action_date - at.lag_action_date as elapsed_days,
       count(*) as cnt_logins_between_action_dates
from prev_dates at
join logins l
  on l.user_id = at.user_id
 and l.login_date <= at.action_date
 -- with no previous action, fall back to action_date itself,
 -- so those rows just count the same-day login
 and l.login_date >= coalesce(at.lag_action_date, at.action_date)
group by at.user_id, at.action_type, at.action_date, at.lag_action_date
order by 1, 2, 3