Consult the same table with filter value - sql

I would appreciate help with how I can get this result.
My original table:
Date | Indicator | Value
2020-01-01 | 1 | 3000.00
2020-01-02 | 1 | 2500.00
2020-01-03 | 1 | 1000.00
2020-01-01 | 2 | 12.50
2020-01-02 | 2 | 13.23
2020-01-03 | 2 | 14.24
2020-01-01 | 3 | 150.00
2020-01-02 | 3 | 300.00
2020-01-03 | 3 | 200.00
I need to propagate the value of indicator 1 to the other indicators, matching on the date.
Date | Indicator | Value | Result
2020-01-01 | 1 | 3000.00 | 3000.00
2020-01-02 | 1 | 2500.00 | 2500.00
2020-01-03 | 1 | 1000.00 | 1000.00
2020-01-01 | 2 | 12.50 | 3000.00
2020-01-02 | 2 | 13.23 | 2500.00
2020-01-03 | 2 | 14.24 | 1000.00
2020-01-01 | 3 | 150.00 | 3000.00
2020-01-02 | 3 | 300.00 | 2500.00
2020-01-03 | 3 | 200.00 | 1000.00

One method uses a window function:
select t.*,
       first_value(value) over (partition by date order by indicator) as result
from t;
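If window functions are not an option, a join back to the indicator 1 rows gives the same result. This is a sketch only, assuming the table is named t as above and has at most one indicator 1 row per date; the left join keeps dates that have no indicator 1 row (result is then NULL):
select t.*, t1.value as result
from t
left join t t1
    on t1.date = t.date
   and t1.indicator = 1;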

How to drop duplicate rows from postgresql sql table

date | window | points | actual_bool | previous_bool | creation_time | source
------------+---------+---------+---------------------+---------------------------------+----------------------------+--------
2021-02-11 | 110 | 0.6 | 0 | 0 | 2021-02-14 09:20:57.51966 | bldgh
2021-02-11 | 150 | 0.7 | 1 | 0 | 2021-02-14 09:20:57.51966 | fiata
2021-02-11 | 110 | 0.7 | 1 | 0 | 2021-02-14 09:20:57.51966 | nfiws
2021-02-11 | 150 | 0.7 | 1 | 0 | 2021-02-14 09:20:57.51966 | fiata
2021-02-11 | 110 | 0.6 | 0 | 0 | 2021-02-14 09:20:57.51966 | bldgh
2021-02-11 | 110 | 0.3 | 0 | 1 | 2021-02-14 09:22:22.969014 | asdg1
2021-02-11 | 110 | 0.6 | 0 | 0 | 2021-02-14 09:22:22.969014 | j
2021-02-11 | 110 | 0.3 | 0 | 1 | 2021-02-14 09:22:22.969014 | aba
2021-02-11 | 110 | 0.5 | 0 | 1 | 2021-02-14 09:22:22.969014 | fg
2021-02-11 | 110 | 0.6 | 1 | 0 | 2021-02-14 09:22:22.969014 | wdda
2021-02-11 | 110 | 0.7 | 1 | 1 | 2021-02-14 09:23:21.977685 | dda
2021-02-11 | 110 | 0.5 | 1 | 0 | 2021-02-14 09:23:21.977685 | dd
2021-02-11 | 110 | 0.6 | 1 | 1 | 2021-02-14 09:23:21.977685 | so
2021-02-11 | 110 | 0.5 | 1 | 1 | 2021-02-14 09:23:21.977685 | dar
2021-02-11 | 110 | 0.6 | 1 | 1 | 2021-02-14 09:23:21.977685 | firr
2021-02-11 | 110 | 0.8 | 1 | 1 | 2021-02-14 09:24:15.831411 | xim
2021-02-11 | 110 | 0.8 | 1 | 1 | 2021-02-14 09:24:15.831411 | cxyy
2021-02-11 | 110 | 0.3 | 0 | 1 | 2021-02-14 09:24:15.831411 | bisd
2021-02-11 | 110 | 0.1 | 0 | 1 | 2021-02-14 09:24:15.831411 | cope
2021-02-11 | 110 | 0.2 | 0 | 1 | 2021-02-14 09:24:15.831411 | sand
...
I have the following dataset in a postgresql table called testtable in testdb.
I have accidentally copied over the database and duplicated rows.
How can I delete the duplicates?
Row 1 and row 5 are copies in this frame and row 2 and row 4 are copies too.
I have never used SQL to drop duplicates before, so I have no idea where to start.
I tried
select creation_time, count(creation_time) from classification group by creation_time having count (creation_time)>1 order by source;
But all it did was show me how many duplicates I had for each timestamp, like this:
creation_time | count
----------------------------+-------
2021-02-14 09:20:57.51966 | 10
2021-02-14 09:22:22.969014 | 10
2021-02-14 09:23:21.977685 | 10
2021-02-14 09:24:15.831411 | 10
2021-02-14 09:24:27.733763 | 10
2021-02-14 09:24:38.41793 | 10
2021-02-14 09:27:04.432466 | 10
2021-02-14 09:27:21.62256 | 10
2021-02-14 09:27:22.677763 | 10
2021-02-14 09:27:37.996054 | 10
2021-02-14 09:28:09.275041 | 10
2021-02-14 09:28:22.649391 | 10
...
There should only be 5 unique records for each creation_time.
It doesn't show me the duplicates themselves, and even if it did, I would have no idea how to drop them.
That is a lot of rows to delete. I would suggest just recreating the table:
create table new_classification as
select distinct c.*
from classification c;
After you have validated the data, you can reload it if you really want:
truncate table classification;
insert into classification
select *
from new_classification;
This process should be much faster than deleting 90% of the rows.
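If you first want to see the duplicates themselves, a query along these lines groups on every column and reports how many copies each row has. This is a sketch; it assumes the columns shown above are the complete column list, and it quotes "window" because that is a reserved word in PostgreSQL:
select date, "window", points, actual_bool, previous_bool, creation_time, source,
       count(*) as copies
from classification
group by date, "window", points, actual_bool, previous_bool, creation_time, source
having count(*) > 1
order by creation_time, source;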

Grouping, Summing and Ordering

I want to get a breakdown by Name, Year/Month and Total. How can I do that with what I've got so far?
My data looks like this:
| name | ArtifactID | Name | DateCollected | FileSizeInBytes | WorkspaceArtifactId | TimestampOfLatestRecord |
+---------+------------+---------------------------+-------------------------+-----------------+---------------------+-------------------------+
| Pony | 1265555 | LiteDataPublishedToReview | 2018-12-21 00:00:00.000 | 5474.00 | 2534710 | 2018-12-21 09:26:49.000 |
| Wheels | 1265566 | LiteDataPublishedToReview | 2019-02-26 00:00:00.000 | 50668.00 | 2634282 | 2019-02-26 17:38:39.000 |
| Wheels | 1265567 | LiteDataPublishedToReview | 2019-01-11 00:00:00.000 | 10921638320.00 | 2634282 | 2019-01-11 16:44:04.000 |
| Wheels | 1265568 | LiteDataPublishedToReview | 2019-01-15 00:00:00.000 | 110261521.00 | 2634282 | 2019-01-15 17:43:57.000 |
| Wheels | 1265569 | LiteDataProcessed | 2018-12-13 00:00:00.000 | 123187605031.00 | 2634282 | 2018-12-13 21:50:34.000 |
| Wheels | 1265570 | FullDataProcessed | 2018-12-13 00:00:00.000 | 6810556609.00 | 2634282 | 2018-12-13 21:50:34.000 |
| Wheels | 1265571 | LiteDataProcessed | 2018-12-15 00:00:00.000 | 0.00 | 2634282 | 2018-12-15 14:52:20.000 |
| Wheels | 1265572 | FullDataProcessed | 2018-12-15 00:00:00.000 | 13362690.00 | 2634282 | 2018-12-15 14:52:20.000 |
| Wheels | 1265573 | LiteDataProcessed | 2019-01-09 00:00:00.000 | 1480303616.00 | 2634282 | 2019-01-09 13:52:23.000 |
| Wheels | 1265574 | FullDataProcessed | 2019-01-09 00:00:00.000 | 0.00 | 2634282 | 2019-01-09 13:52:23.000 |
| Wheels | 1265575 | LiteDataProcessed | 2019-02-25 00:00:00.000 | 0.00 | 2634282 | 2019-02-25 10:49:41.000 |
| Wheels | 1265576 | FullDataProcessed | 2019-02-25 00:00:00.000 | 7633201.00 | 2634282 | 2019-02-25 10:49:41.000 |
| Levack | 1265577 | LiteDataProcessed | 2018-12-16 00:00:00.000 | 0.00 | 2636230 | 2018-12-16 10:13:36.000 |
| Levack | 1265578 | FullDataProcessed | 2018-12-16 00:00:00.000 | 59202559.00 | 2636230 | 2018-12-16 10:13:36.000 |
| Van | 1265579 | LiteDataPublishedToReview | 2019-01-11 00:00:00.000 | 2646602711.00 | 2636845 | 2019-01-11 09:50:49.000 |
| Van | 1265580 | LiteDataPublishedToReview | 2019-01-10 00:00:00.000 | 10081222022.00 | 2636845 | 2019-01-10 18:32:03.000 |
| Van | 1265581 | LiteDataPublishedToReview | 2019-01-15 00:00:00.000 | 3009227476.00 | 2636845 | 2019-01-15 10:49:38.000 |
| Van | 1265582 | LiteDataPublishedToReview | 2019-03-26 00:00:00.000 | 87220831.00 | 2636845 | 2019-03-26 10:34:10.000 |
| Van | 1265583 | LiteDataPublishedToReview | 2019-03-28 00:00:00.000 | 688708119.00 | 2636845 | 2019-03-28 14:11:38.000 |
| Van | 1265584 | LiteDataProcessed | 2018-12-18 00:00:00.000 | 5408886887.00 | 2636845 | 2018-12-18 11:29:03.000 |
| Van | 1265585 | FullDataProcessed | 2018-12-18 00:00:00.000 | 0.00 | 2636845 | 2018-12-18 11:29:03.000 |
| Van | 1265586 | LiteDataProcessed | 2018-12-19 00:00:00.000 | 12535359488.00 | 2636845 | 2018-12-19 17:25:10.000 |
| Van | 1265587 | FullDataProcessed | 2018-12-19 00:00:00.000 | 0.00 | 2636845 | 2018-12-19 17:25:10.000 |
| Van | 1265588 | LiteDataProcessed | 2018-12-21 00:00:00.000 | 52599693312.00 | 2636845 | 2018-12-21 09:09:18.000 |
| Van | 1265589 | FullDataProcessed | 2018-12-21 00:00:00.000 | 0.00 | 2636845 | 2018-12-21 09:09:18.000 |
| Van | 1265590 | LiteDataProcessed | 2019-03-25 00:00:00.000 | 3588613120.00 | 2636845 | 2019-03-25 16:41:17.000 |
| Van | 1265591 | FullDataProcessed | 2019-03-25 00:00:00.000 | 0.00 | 2636845 | 2019-03-25 16:41:17.000 |
| Holiday | 1265592 | LiteDataProcessed | 2018-12-28 00:00:00.000 | 0.00 | 2638126 | 2018-12-28 09:15:21.000 |
| Holiday | 1265593 | FullDataProcessed | 2018-12-28 00:00:00.000 | 9219122847.00 | 2638126 | 2018-12-28 09:15:21.000 |
| Holiday | 1265594 | LiteDataProcessed | 2019-01-31 00:00:00.000 | 0.00 | 2638126 | 2019-01-31 14:45:07.000 |
| Holiday | 1265595 | FullDataProcessed | 2019-01-31 00:00:00.000 | 61727744.00 | 2638126 | 2019-01-31 14:45:07.000 |
| Holiday | 1265596 | LiteDataProcessed | 2019-02-05 00:00:00.000 | 0.00 | 2638126 | 2019-02-05 15:23:27.000 |
| Holiday | 1265597 | FullDataProcessed | 2019-02-05 00:00:00.000 | 199454805.00 | 2638126 | 2019-02-05 15:23:27.000 |
| Holiday | 1265598 | LiteDataProcessed | 2019-02-07 00:00:00.000 | 0.00 | 2638126 | 2019-02-07 11:55:55.000 |
| Holiday | 1265599 | FullDataProcessed | 2019-02-07 00:00:00.000 | 17944713.00 | 2638126 | 2019-02-07 11:55:55.000 |
| Holiday | 1265600 | LiteDataProcessed | 2019-02-13 00:00:00.000 | 0.00 | 2638126 | 2019-02-13 15:48:56.000 |
| Holiday | 1265601 | FullDataProcessed | 2019-02-13 00:00:00.000 | 60421568.00 | 2638126 | 2019-02-13 15:48:56.000 |
| Crosbie | 1265604 | LiteDataProcessed | 2019-01-21 00:00:00.000 | 0.00 | 2644032 | 2019-01-21 15:43:43.000 |
| Crosbie | 1265605 | FullDataProcessed | 2019-01-21 00:00:00.000 | 131445.00 | 2644032 | 2019-01-21 15:43:43.000 |
| Stone | 1265606 | LiteDataPublishedToReview | 2019-02-12 00:00:00.000 | 1626943444.00 | 2647518 | 2019-02-12 17:45:25.000 |
| Stone | 1265607 | LiteDataPublishedToReview | 2019-03-05 00:00:00.000 | 2134872671.00 | 2647518 | 2019-03-05 13:00:31.000 |
| Stone | 1265608 | LiteDataProcessed | 2019-02-05 00:00:00.000 | 38828043264.00 | 2647518 | 2019-02-05 09:40:55.000 |
| Stone | 1265609 | FullDataProcessed | 2019-02-05 00:00:00.000 | 0.00 | 2647518 | 2019-02-05 09:40:55.000 |
| Frost | 1265610 | LiteDataPublishedToReview | 2019-03-18 00:00:00.000 | 776025640.00 | 2658542 | 2019-03-18 12:34:10.000 |
| Frost | 1265611 | LiteDataPublishedToReview | 2019-03-05 00:00:00.000 | 3325335118.00 | 2658542 | 2019-03-05 15:02:39.000 |
| Frost | 1265612 | LiteDataPublishedToReview | 2019-03-20 00:00:00.000 | 211927893.00 | 2658542 | 2019-03-20 17:25:30.000 |
| Frost | 1265613 | LiteDataPublishedToReview | 2019-03-06 00:00:00.000 | 466536488.00 | 2658542 | 2019-03-06 11:00:59.000 |
| Frost | 1265614 | LiteDataPublishedToReview | 2019-03-21 00:00:00.000 | 3863850553.00 | 2658542 | 2019-03-21 17:14:27.000 |
| Frost | 1265615 | LiteDataProcessed | 2019-02-28 00:00:00.000 | 94249740012.00 | 2658542 | 2019-02-28 14:13:23.000 |
| Frost | 1265616 | FullDataProcessed | 2019-02-28 00:00:00.000 | 0.00 | 2658542 | 2019-02-28 14:13:23.000 |
| Yellow | 1265617 | LiteDataPublishedToReview | 2019-03-27 00:00:00.000 | 4550540631.00 | 2659077 | 2019-03-27 16:09:41.000 |
| Yellow | 1265618 | LiteDataProcessed | 2019-03-07 00:00:00.000 | 0.00 | 2659077 | 2019-03-07 16:53:16.000 |
| Yellow | 1265619 | FullDataProcessed | 2019-03-07 00:00:00.000 | 96139872.00 | 2659077 | 2019-03-07 16:53:16.000 |
| Yellow | 1265620 | LiteDataProcessed | 2019-03-08 00:00:00.000 | 105357273318.00 | 2659077 | 2019-03-08 16:43:24.000 |
| Yellow | 1265621 | FullDataProcessed | 2019-03-08 00:00:00.000 | 0.00 | 2659077 | 2019-03-08 16:43:24.000 |
+---------+------------+---------------------------+-------------------------+-----------------+---------------------+-------------------------+
This is my attempt:
SELECT
CAST(YEAR(ps.DateCollected) AS VARCHAR(4)) + '-' + right('00' + CAST(MONTH(ps.DateCollected) AS VARCHAR(2)), 2),
ps.[Name],
c.name,
ceiling(SUM(ps.FileSizeInBytes)/1024/1024/1024.0) [Processed]
FROM EDDSDBO.RPCCProcessingStatistics ps
inner join edds.eddsdbo.[case] c on c.artifactid = ps.workspaceartifactid
where ps.DateCollected >= '2018-12-01'
GROUP BY ps.name, c.name, CAST(YEAR(ps.DateCollected) AS VARCHAR(4)) + '-' + right('00' + CAST(MONTH(ps.DateCollected) AS VARCHAR(2)), 2)
The logic should be this:
(1) Get all values after 2018-12-01 in bytes
(2) Total them
(3) Convert to GB
(4) Ceiling the result
When I run my code and add the results together for FullDataProcessed I get 22. However, when I manually add up the results for FullDataProcessed, I get 15.40, which rounds up (with CEILING) to 16.
I would expect the FullDataProcessed from the results of my code to equal 16, not 22.
I would guess that one or more of your records has its workspaceartifactid specified more than once in the edds.eddsdbo.[case] table. Is the primary key on the case table more than just artifactid?
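One way to check that guess is to look for artifactid values that map to more than one row in the case table. A sketch, using the same table names as the query above:
SELECT c.artifactid, COUNT(*) AS case_rows
FROM edds.eddsdbo.[case] c
GROUP BY c.artifactid
HAVING COUNT(*) > 1;
If this returns any rows, the join fans out and the SUM double-counts those statistics, which would explain getting 22 instead of 16.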

Count records for "empty" rows in multiple columns and joins

I have searched the site a lot trying to find a solution to my problem, and I have found similar problems, but I haven't managed to find a solution that works in my case.
I have a tickets table like this (which holds a lot more data than shown here):
TICKET:
+---------+--------------+------------+------------+
| ticketid| report_date | impact | open |
+---------+--------------+------------+------------+
| 1 | 29/01/2019 | 1 | true |
| 2 | 29/01/2019 | 2 | true |
| 3 | 30/01/2019 | 4 | true |
| 4 | 27/01/2019 | 1 | true |
| 5 | 29/01/2019 | 1 | true |
| 6 | 30/01/2019 | 2 | true |
+---------+--------------+------------+------------+
There is another table that holds the possible values for the impact column in the table above:
IMPACT:
+---------+
| impact |
+---------+
| 1 |
| 2 |
| 3 |
| 4 |
+---------+
My objective is to extract a result set from the ticket table where I group by the impact, report_date and open flag and count the number of tickets in each group. Therefore, for the example above, I would like to extract the following result set.
+--------------+------------+------------+-----------+
| report_date | impact | open | tkt_count |
+--------------+------------+------------+-----------+
| 27/01/2019 | 1 | true | 1 |
| 27/01/2019 | 1 | false | 0 |
| 27/01/2019 | 2 | true | 0 |
| 27/01/2019 | 2 | false | 0 |
| 27/01/2019 | 3 | true | 0 |
| 27/01/2019 | 3 | false | 0 |
| 27/01/2019 | 4 | true | 0 |
| 27/01/2019 | 4 | false | 0 |
| 29/01/2019 | 1 | true | 2 |
| 29/01/2019 | 1 | false | 0 |
| 29/01/2019 | 2 | true | 1 |
| 29/01/2019 | 2 | false | 0 |
| 29/01/2019 | 3 | true | 0 |
| 29/01/2019 | 3 | false | 0 |
| 29/01/2019 | 4 | true | 0 |
| 29/01/2019 | 4 | false | 0 |
| 30/01/2019 | 1 | true | 0 |
| 30/01/2019 | 1 | false | 0 |
| 30/01/2019 | 2 | true | 1 |
| 30/01/2019 | 2 | false | 0 |
| 30/01/2019 | 3 | true | 0 |
| 30/01/2019 | 3 | false | 0 |
| 30/01/2019 | 4 | true | 1 |
| 30/01/2019 | 4 | false | 0 |
+--------------+------------+------------+-----------+
It seems simple enough, but the problem is with the "zero" rows.
For the example that I showed here, there are no tickets with impact 3 and no tickets with the open flag false for the range of dates given, and I cannot come up with a query that shows all the counts even when there are no rows for some values.
Can anyone help me?
Thanks in advance.
To solve this type of problem, one way to proceed is to generate an intermediate result set that contains all records for which a value needs to be computed, and then LEFT JOIN it to the original data and aggregate.
SELECT
    dt.report_date,
    i.impact,
    op.[open],
    COUNT(t.report_date) tkt_count
FROM
    (SELECT DISTINCT report_date FROM ticket) dt
    CROSS JOIN impact i
    CROSS JOIN (SELECT 'true' [open] UNION ALL SELECT 'false') op
    LEFT JOIN ticket t
        ON  t.report_date = dt.report_date
        AND t.impact = i.impact
        AND t.[open] = op.[open]
GROUP BY
    dt.report_date,
    i.impact,
    op.[open]
This query builds the intermediate result set as follows:
report_date: all distinct dates in the original data
impact: all values from the impact table
open: a fixed list containing true and false (it could also have been built from distinct values in the original data, but false does not appear in your sample data)
You can change these rules; the logic remains the same. For example, if there are gaps in report_date, another widely used option is to create a calendar table.
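To see what that scaffold looks like on its own, you can run just the cross-joined part without the LEFT JOIN and aggregation. A sketch, using the same names as the query above:
SELECT dt.report_date, i.impact, op.[open]
FROM (SELECT DISTINCT report_date FROM ticket) dt
CROSS JOIN impact i
CROSS JOIN (SELECT 'true' [open] UNION ALL SELECT 'false') op
ORDER BY dt.report_date, i.impact, op.[open]
For the sample data this returns 3 dates x 4 impacts x 2 flags = 24 rows, which is exactly the shape of the desired output before the counts are filled in.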
Demo on DB Fiddle:
report_date | impact | open | tkt_count
:------------------ | -----: | :---- | --------:
27/01/2019 00:00:00 | 1 | false | 0
27/01/2019 00:00:00 | 1 | true | 1
27/01/2019 00:00:00 | 2 | false | 0
27/01/2019 00:00:00 | 2 | true | 0
27/01/2019 00:00:00 | 3 | false | 0
27/01/2019 00:00:00 | 3 | true | 0
27/01/2019 00:00:00 | 4 | false | 0
27/01/2019 00:00:00 | 4 | true | 0
29/01/2019 00:00:00 | 1 | false | 0
29/01/2019 00:00:00 | 1 | true | 2
29/01/2019 00:00:00 | 2 | false | 0
29/01/2019 00:00:00 | 2 | true | 1
29/01/2019 00:00:00 | 3 | false | 0
29/01/2019 00:00:00 | 3 | true | 0
29/01/2019 00:00:00 | 4 | false | 0
29/01/2019 00:00:00 | 4 | true | 0
30/01/2019 00:00:00 | 1 | false | 0
30/01/2019 00:00:00 | 1 | true | 0
30/01/2019 00:00:00 | 2 | false | 0
30/01/2019 00:00:00 | 2 | true | 1
30/01/2019 00:00:00 | 3 | false | 0
30/01/2019 00:00:00 | 3 | true | 0
30/01/2019 00:00:00 | 4 | false | 0
30/01/2019 00:00:00 | 4 | true | 1
I queried against a start-and-end calendar table by day, cross joined all available impact/open combinations, and finally brought in the ticket data, counting the non-null matches.
DECLARE @Impact TABLE(Impact INT)
INSERT @Impact VALUES(1),(2),(3),(4)
DECLARE @Tickets TABLE(report_date DATETIME, Impact INT, IsOpen BIT)
INSERT @Tickets VALUES
('01/29/2019',1,1),('01/29/2019',2,1),('01/30/2019',3,1),('01/27/2019',4,1),('01/29/2019',5,1),('01/30/2019',6,1)
DECLARE @StartDate DATETIME = '01/01/2019'
DECLARE @EndDate DATETIME = '02/01/2019'
;WITH AllDates AS
(
    SELECT Date = @StartDate
    UNION ALL
    SELECT Date = DATEADD(DAY, 1, Date) FROM AllDates WHERE DATEADD(DAY, 1, Date) <= @EndDate
)
,AllImpacts AS
(
    SELECT DISTINCT Impact, IsOpen = 1 FROM @Impact
    UNION
    SELECT DISTINCT Impact, IsOpen = 0 FROM @Impact
),
AllData AS
(
    SELECT D.Date, A.Impact, A.IsOpen
    FROM AllDates D
    CROSS APPLY AllImpacts A
)
SELECT
    A.Date, A.Impact, A.IsOpen,
    GroupCount = COUNT(T.Impact)
FROM
    AllData A
    LEFT OUTER JOIN @Tickets T ON T.report_date = A.Date AND T.Impact = A.Impact AND T.IsOpen = A.IsOpen
GROUP BY
    A.Date, A.Impact, A.IsOpen
ORDER BY
    A.Date, A.Impact, A.IsOpen
OPTION (MAXRECURSION 0);
GO
Date | Impact | IsOpen | GroupCount
:------------------ | -----: | -----: | ---------:
01/01/2019 00:00:00 | 1 | 0 | 0
01/01/2019 00:00:00 | 1 | 1 | 0
01/01/2019 00:00:00 | 2 | 0 | 0
01/01/2019 00:00:00 | 2 | 1 | 0
01/01/2019 00:00:00 | 3 | 0 | 0
01/01/2019 00:00:00 | 3 | 1 | 0
01/01/2019 00:00:00 | 4 | 0 | 0
01/01/2019 00:00:00 | 4 | 1 | 0
02/01/2019 00:00:00 | 1 | 0 | 0
02/01/2019 00:00:00 | 1 | 1 | 0
02/01/2019 00:00:00 | 2 | 0 | 0
02/01/2019 00:00:00 | 2 | 1 | 0
02/01/2019 00:00:00 | 3 | 0 | 0
02/01/2019 00:00:00 | 3 | 1 | 0
02/01/2019 00:00:00 | 4 | 0 | 0
02/01/2019 00:00:00 | 4 | 1 | 0
03/01/2019 00:00:00 | 1 | 0 | 0
03/01/2019 00:00:00 | 1 | 1 | 0
03/01/2019 00:00:00 | 2 | 0 | 0
03/01/2019 00:00:00 | 2 | 1 | 0
03/01/2019 00:00:00 | 3 | 0 | 0
03/01/2019 00:00:00 | 3 | 1 | 0
03/01/2019 00:00:00 | 4 | 0 | 0
03/01/2019 00:00:00 | 4 | 1 | 0
04/01/2019 00:00:00 | 1 | 0 | 0
04/01/2019 00:00:00 | 1 | 1 | 0
04/01/2019 00:00:00 | 2 | 0 | 0
04/01/2019 00:00:00 | 2 | 1 | 0
04/01/2019 00:00:00 | 3 | 0 | 0
04/01/2019 00:00:00 | 3 | 1 | 0
04/01/2019 00:00:00 | 4 | 0 | 0
04/01/2019 00:00:00 | 4 | 1 | 0
05/01/2019 00:00:00 | 1 | 0 | 0
05/01/2019 00:00:00 | 1 | 1 | 0
05/01/2019 00:00:00 | 2 | 0 | 0
05/01/2019 00:00:00 | 2 | 1 | 0
05/01/2019 00:00:00 | 3 | 0 | 0
05/01/2019 00:00:00 | 3 | 1 | 0
05/01/2019 00:00:00 | 4 | 0 | 0
05/01/2019 00:00:00 | 4 | 1 | 0
06/01/2019 00:00:00 | 1 | 0 | 0
06/01/2019 00:00:00 | 1 | 1 | 0
06/01/2019 00:00:00 | 2 | 0 | 0
06/01/2019 00:00:00 | 2 | 1 | 0
06/01/2019 00:00:00 | 3 | 0 | 0
06/01/2019 00:00:00 | 3 | 1 | 0
06/01/2019 00:00:00 | 4 | 0 | 0
06/01/2019 00:00:00 | 4 | 1 | 0
07/01/2019 00:00:00 | 1 | 0 | 0
07/01/2019 00:00:00 | 1 | 1 | 0
07/01/2019 00:00:00 | 2 | 0 | 0
07/01/2019 00:00:00 | 2 | 1 | 0
07/01/2019 00:00:00 | 3 | 0 | 0
07/01/2019 00:00:00 | 3 | 1 | 0
07/01/2019 00:00:00 | 4 | 0 | 0
07/01/2019 00:00:00 | 4 | 1 | 0
08/01/2019 00:00:00 | 1 | 0 | 0
08/01/2019 00:00:00 | 1 | 1 | 0
08/01/2019 00:00:00 | 2 | 0 | 0
08/01/2019 00:00:00 | 2 | 1 | 0
08/01/2019 00:00:00 | 3 | 0 | 0
08/01/2019 00:00:00 | 3 | 1 | 0
08/01/2019 00:00:00 | 4 | 0 | 0
08/01/2019 00:00:00 | 4 | 1 | 0
09/01/2019 00:00:00 | 1 | 0 | 0
09/01/2019 00:00:00 | 1 | 1 | 0
09/01/2019 00:00:00 | 2 | 0 | 0
09/01/2019 00:00:00 | 2 | 1 | 0
09/01/2019 00:00:00 | 3 | 0 | 0
09/01/2019 00:00:00 | 3 | 1 | 0
09/01/2019 00:00:00 | 4 | 0 | 0
09/01/2019 00:00:00 | 4 | 1 | 0
10/01/2019 00:00:00 | 1 | 0 | 0
10/01/2019 00:00:00 | 1 | 1 | 0
10/01/2019 00:00:00 | 2 | 0 | 0
10/01/2019 00:00:00 | 2 | 1 | 0
10/01/2019 00:00:00 | 3 | 0 | 0
10/01/2019 00:00:00 | 3 | 1 | 0
10/01/2019 00:00:00 | 4 | 0 | 0
10/01/2019 00:00:00 | 4 | 1 | 0
11/01/2019 00:00:00 | 1 | 0 | 0
11/01/2019 00:00:00 | 1 | 1 | 0
11/01/2019 00:00:00 | 2 | 0 | 0
11/01/2019 00:00:00 | 2 | 1 | 0
11/01/2019 00:00:00 | 3 | 0 | 0
11/01/2019 00:00:00 | 3 | 1 | 0
11/01/2019 00:00:00 | 4 | 0 | 0
11/01/2019 00:00:00 | 4 | 1 | 0
12/01/2019 00:00:00 | 1 | 0 | 0
12/01/2019 00:00:00 | 1 | 1 | 0
12/01/2019 00:00:00 | 2 | 0 | 0
12/01/2019 00:00:00 | 2 | 1 | 0
12/01/2019 00:00:00 | 3 | 0 | 0
12/01/2019 00:00:00 | 3 | 1 | 0
12/01/2019 00:00:00 | 4 | 0 | 0
12/01/2019 00:00:00 | 4 | 1 | 0
13/01/2019 00:00:00 | 1 | 0 | 0
13/01/2019 00:00:00 | 1 | 1 | 0
13/01/2019 00:00:00 | 2 | 0 | 0
13/01/2019 00:00:00 | 2 | 1 | 0
13/01/2019 00:00:00 | 3 | 0 | 0
13/01/2019 00:00:00 | 3 | 1 | 0
13/01/2019 00:00:00 | 4 | 0 | 0
13/01/2019 00:00:00 | 4 | 1 | 0
14/01/2019 00:00:00 | 1 | 0 | 0
14/01/2019 00:00:00 | 1 | 1 | 0
14/01/2019 00:00:00 | 2 | 0 | 0
14/01/2019 00:00:00 | 2 | 1 | 0
14/01/2019 00:00:00 | 3 | 0 | 0
14/01/2019 00:00:00 | 3 | 1 | 0
14/01/2019 00:00:00 | 4 | 0 | 0
14/01/2019 00:00:00 | 4 | 1 | 0
15/01/2019 00:00:00 | 1 | 0 | 0
15/01/2019 00:00:00 | 1 | 1 | 0
15/01/2019 00:00:00 | 2 | 0 | 0
15/01/2019 00:00:00 | 2 | 1 | 0
15/01/2019 00:00:00 | 3 | 0 | 0
15/01/2019 00:00:00 | 3 | 1 | 0
15/01/2019 00:00:00 | 4 | 0 | 0
15/01/2019 00:00:00 | 4 | 1 | 0
16/01/2019 00:00:00 | 1 | 0 | 0
16/01/2019 00:00:00 | 1 | 1 | 0
16/01/2019 00:00:00 | 2 | 0 | 0
16/01/2019 00:00:00 | 2 | 1 | 0
16/01/2019 00:00:00 | 3 | 0 | 0
16/01/2019 00:00:00 | 3 | 1 | 0
16/01/2019 00:00:00 | 4 | 0 | 0
16/01/2019 00:00:00 | 4 | 1 | 0
17/01/2019 00:00:00 | 1 | 0 | 0
17/01/2019 00:00:00 | 1 | 1 | 0
17/01/2019 00:00:00 | 2 | 0 | 0
17/01/2019 00:00:00 | 2 | 1 | 0
17/01/2019 00:00:00 | 3 | 0 | 0
17/01/2019 00:00:00 | 3 | 1 | 0
17/01/2019 00:00:00 | 4 | 0 | 0
17/01/2019 00:00:00 | 4 | 1 | 0
18/01/2019 00:00:00 | 1 | 0 | 0
18/01/2019 00:00:00 | 1 | 1 | 0
18/01/2019 00:00:00 | 2 | 0 | 0
18/01/2019 00:00:00 | 2 | 1 | 0
18/01/2019 00:00:00 | 3 | 0 | 0
18/01/2019 00:00:00 | 3 | 1 | 0
18/01/2019 00:00:00 | 4 | 0 | 0
18/01/2019 00:00:00 | 4 | 1 | 0
19/01/2019 00:00:00 | 1 | 0 | 0
19/01/2019 00:00:00 | 1 | 1 | 0
19/01/2019 00:00:00 | 2 | 0 | 0
19/01/2019 00:00:00 | 2 | 1 | 0
19/01/2019 00:00:00 | 3 | 0 | 0
19/01/2019 00:00:00 | 3 | 1 | 0
19/01/2019 00:00:00 | 4 | 0 | 0
19/01/2019 00:00:00 | 4 | 1 | 0
20/01/2019 00:00:00 | 1 | 0 | 0
20/01/2019 00:00:00 | 1 | 1 | 0
20/01/2019 00:00:00 | 2 | 0 | 0
20/01/2019 00:00:00 | 2 | 1 | 0
20/01/2019 00:00:00 | 3 | 0 | 0
20/01/2019 00:00:00 | 3 | 1 | 0
20/01/2019 00:00:00 | 4 | 0 | 0
20/01/2019 00:00:00 | 4 | 1 | 0
21/01/2019 00:00:00 | 1 | 0 | 0
21/01/2019 00:00:00 | 1 | 1 | 0
21/01/2019 00:00:00 | 2 | 0 | 0
21/01/2019 00:00:00 | 2 | 1 | 0
21/01/2019 00:00:00 | 3 | 0 | 0
21/01/2019 00:00:00 | 3 | 1 | 0
21/01/2019 00:00:00 | 4 | 0 | 0
21/01/2019 00:00:00 | 4 | 1 | 0
22/01/2019 00:00:00 | 1 | 0 | 0
22/01/2019 00:00:00 | 1 | 1 | 0
22/01/2019 00:00:00 | 2 | 0 | 0
22/01/2019 00:00:00 | 2 | 1 | 0
22/01/2019 00:00:00 | 3 | 0 | 0
22/01/2019 00:00:00 | 3 | 1 | 0
22/01/2019 00:00:00 | 4 | 0 | 0
22/01/2019 00:00:00 | 4 | 1 | 0
23/01/2019 00:00:00 | 1 | 0 | 0
23/01/2019 00:00:00 | 1 | 1 | 0
23/01/2019 00:00:00 | 2 | 0 | 0
23/01/2019 00:00:00 | 2 | 1 | 0
23/01/2019 00:00:00 | 3 | 0 | 0
23/01/2019 00:00:00 | 3 | 1 | 0
23/01/2019 00:00:00 | 4 | 0 | 0
23/01/2019 00:00:00 | 4 | 1 | 0
24/01/2019 00:00:00 | 1 | 0 | 0
24/01/2019 00:00:00 | 1 | 1 | 0
24/01/2019 00:00:00 | 2 | 0 | 0
24/01/2019 00:00:00 | 2 | 1 | 0
24/01/2019 00:00:00 | 3 | 0 | 0
24/01/2019 00:00:00 | 3 | 1 | 0
24/01/2019 00:00:00 | 4 | 0 | 0
24/01/2019 00:00:00 | 4 | 1 | 0
25/01/2019 00:00:00 | 1 | 0 | 0
25/01/2019 00:00:00 | 1 | 1 | 0
25/01/2019 00:00:00 | 2 | 0 | 0
25/01/2019 00:00:00 | 2 | 1 | 0
25/01/2019 00:00:00 | 3 | 0 | 0
25/01/2019 00:00:00 | 3 | 1 | 0
25/01/2019 00:00:00 | 4 | 0 | 0
25/01/2019 00:00:00 | 4 | 1 | 0
26/01/2019 00:00:00 | 1 | 0 | 0
26/01/2019 00:00:00 | 1 | 1 | 0
26/01/2019 00:00:00 | 2 | 0 | 0
26/01/2019 00:00:00 | 2 | 1 | 0
26/01/2019 00:00:00 | 3 | 0 | 0
26/01/2019 00:00:00 | 3 | 1 | 0
26/01/2019 00:00:00 | 4 | 0 | 0
26/01/2019 00:00:00 | 4 | 1 | 0
27/01/2019 00:00:00 | 1 | 0 | 0
27/01/2019 00:00:00 | 1 | 1 | 0
27/01/2019 00:00:00 | 2 | 0 | 0
27/01/2019 00:00:00 | 2 | 1 | 0
27/01/2019 00:00:00 | 3 | 0 | 0
27/01/2019 00:00:00 | 3 | 1 | 0
27/01/2019 00:00:00 | 4 | 0 | 0
27/01/2019 00:00:00 | 4 | 1 | 1
28/01/2019 00:00:00 | 1 | 0 | 0
28/01/2019 00:00:00 | 1 | 1 | 0
28/01/2019 00:00:00 | 2 | 0 | 0
28/01/2019 00:00:00 | 2 | 1 | 0
28/01/2019 00:00:00 | 3 | 0 | 0
28/01/2019 00:00:00 | 3 | 1 | 0
28/01/2019 00:00:00 | 4 | 0 | 0
28/01/2019 00:00:00 | 4 | 1 | 0
29/01/2019 00:00:00 | 1 | 0 | 0
29/01/2019 00:00:00 | 1 | 1 | 1
29/01/2019 00:00:00 | 2 | 0 | 0
29/01/2019 00:00:00 | 2 | 1 | 1
29/01/2019 00:00:00 | 3 | 0 | 0
29/01/2019 00:00:00 | 3 | 1 | 0
29/01/2019 00:00:00 | 4 | 0 | 0
29/01/2019 00:00:00 | 4 | 1 | 0
30/01/2019 00:00:00 | 1 | 0 | 0
30/01/2019 00:00:00 | 1 | 1 | 0
30/01/2019 00:00:00 | 2 | 0 | 0
30/01/2019 00:00:00 | 2 | 1 | 0
30/01/2019 00:00:00 | 3 | 0 | 0
30/01/2019 00:00:00 | 3 | 1 | 1
30/01/2019 00:00:00 | 4 | 0 | 0
30/01/2019 00:00:00 | 4 | 1 | 0
31/01/2019 00:00:00 | 1 | 0 | 0
31/01/2019 00:00:00 | 1 | 1 | 0
31/01/2019 00:00:00 | 2 | 0 | 0
31/01/2019 00:00:00 | 2 | 1 | 0
31/01/2019 00:00:00 | 3 | 0 | 0
31/01/2019 00:00:00 | 3 | 1 | 0
31/01/2019 00:00:00 | 4 | 0 | 0
31/01/2019 00:00:00 | 4 | 1 | 0
01/02/2019 00:00:00 | 1 | 0 | 0
01/02/2019 00:00:00 | 1 | 1 | 0
01/02/2019 00:00:00 | 2 | 0 | 0
01/02/2019 00:00:00 | 2 | 1 | 0
01/02/2019 00:00:00 | 3 | 0 | 0
01/02/2019 00:00:00 | 3 | 1 | 0
01/02/2019 00:00:00 | 4 | 0 | 0
01/02/2019 00:00:00 | 4 | 1 | 0
db<>fiddle here

SQL query for setting column based on last seven entries

Problem
I am having trouble figuring out how to create a query that can tell whether any user entry is preceded by 7 days without any activity (secondsPlayed == 0) and, if so, indicate it with the value 1, otherwise 0.
This also means that if the user has fewer than 7 entries, the value will be 0 across all entries.
Input table:
+------------------------------+-------------------------+---------------+
| userid | estimationDate | secondsPlayed |
+------------------------------+-------------------------+---------------+
| a | 2016-07-14 00:00:00 UTC | 192.5 |
| a | 2016-07-15 00:00:00 UTC | 357.3 |
| a | 2016-07-16 00:00:00 UTC | 0 |
| a | 2016-07-17 00:00:00 UTC | 0 |
| a | 2016-07-18 00:00:00 UTC | 0 |
| a | 2016-07-19 00:00:00 UTC | 0 |
| a | 2016-07-20 00:00:00 UTC | 0 |
| a | 2016-07-21 00:00:00 UTC | 0 |
| a | 2016-07-22 00:00:00 UTC | 0 |
| a | 2016-07-23 00:00:00 UTC | 0 |
| a | 2016-07-24 00:00:00 UTC | 0 |
| ---------------------------- | ---------------------- | ---- |
| b | 2016-07-02 00:00:00 UTC | 31.2 |
| b | 2016-07-03 00:00:00 UTC | 42.1 |
| b | 2016-07-04 00:00:00 UTC | 41.9 |
| b | 2016-07-05 00:00:00 UTC | 43.2 |
| b | 2016-07-06 00:00:00 UTC | 91.5 |
| b | 2016-07-07 00:00:00 UTC | 0 |
| b | 2016-07-08 00:00:00 UTC | 0 |
| b | 2016-07-09 00:00:00 UTC | 239.1 |
| b | 2016-07-10 00:00:00 UTC | 0 |
+------------------------------+-------------------------+---------------+
The intended output table should look like this:
Output table:
+------------------------------+-------------------------+---------------+----------+
| userid | estimationDate | secondsPlayed | inactive |
+------------------------------+-------------------------+---------------+----------+
| a | 2016-07-14 00:00:00 UTC | 192.5 | 0 |
| a | 2016-07-15 00:00:00 UTC | 357.3 | 0 |
| a | 2016-07-16 00:00:00 UTC | 0 | 0 |
| a | 2016-07-17 00:00:00 UTC | 0 | 0 |
| a | 2016-07-18 00:00:00 UTC | 0 | 0 |
| a | 2016-07-19 00:00:00 UTC | 0 | 0 |
| a | 2016-07-20 00:00:00 UTC | 0 | 0 |
| a | 2016-07-21 00:00:00 UTC | 0 | 0 |
| a | 2016-07-22 00:00:00 UTC | 0 | 1 |
| a | 2016-07-23 00:00:00 UTC | 0 | 1 |
| a | 2016-07-24 00:00:00 UTC | 0 | 1 |
| ---------------------------- | ----------------------- | ----- | ----- |
| b | 2016-07-02 00:00:00 UTC | 31.2 | 0 |
| b | 2016-07-03 00:00:00 UTC | 42.1 | 0 |
| b | 2016-07-04 00:00:00 UTC | 41.9 | 0 |
| b | 2016-07-05 00:00:00 UTC | 43.2 | 0 |
| b | 2016-07-06 00:00:00 UTC | 91.5 | 0 |
| b | 2016-07-07 00:00:00 UTC | 0 | 0 |
| b | 2016-07-08 00:00:00 UTC | 0 | 0 |
| b | 2016-07-09 00:00:00 UTC | 239.1 | 0 |
| b | 2016-07-10 00:00:00 UTC | 0 | 0 |
+------------------------------+-------------------------+---------------+----------+
Thoughts
At first I was thinking about using the LAG function with an offset of 7, but this would obviously not account for any of the entries in between.
I was also thinking about creating a rolling window/average over a period of 7 days and evaluating whether it is above 0. However, this might be a bit above my skill level.
Does anyone have a good solution to this problem?
Assuming that you have data every day (which seems like a reasonable assumption), you can sum a window function:
select t.*,
       (case when sum(secondsplayed) over (partition by userid
                                           order by estimationdate
                                           rows between 6 preceding and current row
                                          ) = 0 and
                  row_number() over (partition by userid order by estimationdate) >= 7
             then 1
             else 0
        end) as inactive
from t;
In addition to no holes in the dates, this also assumes that secondsplayed is never negative. (Negative values can easily be incorporated into the logic, but that seems unnecessary.)
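For example, one way to make the check tolerant of negative (or otherwise odd) secondsplayed values is to count active days in the window instead of summing raw seconds. A sketch, under the same assumptions as above:
select t.*,
       (case when sum(case when secondsplayed > 0 then 1 else 0 end) over
                      (partition by userid
                       order by estimationdate
                       rows between 6 preceding and current row
                      ) = 0 and
                  row_number() over (partition by userid order by estimationdate) >= 7
             then 1
             else 0
        end) as inactive
from t;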
In my experience, these input tables usually do not contain inactivity entries and look more like this (only activity entries are present):
Input table:
+------------------------------+-------------------------+---------------+
| userid | estimationDate | secondsPlayed |
+------------------------------+-------------------------+---------------+
| a | 2016-07-14 00:00:00 UTC | 192.5 |
| a | 2016-07-15 00:00:00 UTC | 357.3 |
| ---------------------------- | ---------------------- | ---- |
| b | 2016-07-02 00:00:00 UTC | 31.2 |
| b | 2016-07-03 00:00:00 UTC | 42.1 |
| b | 2016-07-04 00:00:00 UTC | 41.9 |
| b | 2016-07-05 00:00:00 UTC | 43.2 |
| b | 2016-07-06 00:00:00 UTC | 91.5 |
| b | 2016-07-09 00:00:00 UTC | 239.1 |
+------------------------------+-------------------------+---------------+
So, the query below is for BigQuery Standard SQL, with input as above:
#standardSQL
WITH `project.dataset.table` AS (
SELECT 'a' userid, TIMESTAMP '2016-07-14 00:00:00 UTC' estimationDate, 192.5 secondsPlayed UNION ALL
SELECT 'a', '2016-07-15 00:00:00 UTC', 357.3 UNION ALL
SELECT 'b', '2016-07-02 00:00:00 UTC', 31.2 UNION ALL
SELECT 'b', '2016-07-03 00:00:00 UTC', 42.1 UNION ALL
SELECT 'b', '2016-07-04 00:00:00 UTC', 41.9 UNION ALL
SELECT 'b', '2016-07-05 00:00:00 UTC', 43.2 UNION ALL
SELECT 'b', '2016-07-06 00:00:00 UTC', 91.5 UNION ALL
SELECT 'b', '2016-07-09 00:00:00 UTC', 239.1
), time_frame AS (
SELECT day
FROM UNNEST(GENERATE_DATE_ARRAY('2016-07-02', '2016-07-24')) day
)
SELECT
users.userid,
day,
IFNULL(secondsPlayed, 0) secondsPlayed,
CAST(1 - SIGN(SUM(IFNULL(secondsPlayed, 0))
OVER(
PARTITION BY users.userid
ORDER BY UNIX_DATE(day)
RANGE BETWEEN 6 PRECEDING AND CURRENT ROW
)) AS INT64) AS inactive
FROM time_frame tf
CROSS JOIN (SELECT DISTINCT userid FROM `project.dataset.table`) users
LEFT JOIN `project.dataset.table` t
ON day = DATE(estimationDate) AND users.userid = t.userid
ORDER BY userid, day
with result
Row userid day secondsPlayed inactive
...
13 a 2016-07-14 192.5 0
14 a 2016-07-15 357.3 0
15 a 2016-07-16 0.0 0
16 a 2016-07-17 0.0 0
17 a 2016-07-18 0.0 0
18 a 2016-07-19 0.0 0
19 a 2016-07-20 0.0 0
20 a 2016-07-21 0.0 0
21 a 2016-07-22 0.0 1
22 a 2016-07-23 0.0 1
23 a 2016-07-24 0.0 1
24 b 2016-07-02 31.2 0
25 b 2016-07-03 42.1 0
26 b 2016-07-04 41.9 0
27 b 2016-07-05 43.2 0
28 b 2016-07-06 91.5 0
29 b 2016-07-07 0.0 0
30 b 2016-07-08 0.0 0
31 b 2016-07-09 239.1 0
32 b 2016-07-10 0.0 0
...

hql split time into intervals

I have a Hive table with some data and I would like to split it into 15-minute intervals and return the total call duration for every interval.
Hive table example:
ID Start End Total Duration
1 1502296261 1502325061 28800
My output should look like this:
ID Interval Duration
1 2017-08-09 18:30:00 839
1 2017-08-09 18:45:00 900
1 2017-08-09 19:00:00 900
...
1 2017-08-10 02:15:00 900
1 2017-08-10 02:30:00 61
What is the best way to do this efficiently?
Thanks.
This is the basic solution.
The displayed timestamp (Interval) depends on your system timezone.
-- demo input: one call, Start/End as unix timestamps (28800 s total)
with t as (select stack(1,1,1502296261,1502325061) as (`ID`,`Start`,`End`))
select  t.`ID` as `ID`
        -- start of the pos-th 15-minute bucket overlapped by the call
       ,from_unixtime((t.`Start` div (15*60) + pe.pos)*(15*60)) as `Interval`
        -- bucket end (the call's End for the last bucket) ...
       ,case when pe.pos = t.`End` div (15*60) - t.`Start` div (15*60)
             then t.`End`
             else (t.`Start` div (15*60) + pe.pos + 1)*(15*60)
        end
        -- ... minus bucket start (the call's Start for the first bucket)
       -case when pe.pos = 0
             then t.`Start`
             else (t.`Start` div (15*60) + pe.pos)*(15*60)
        end as `Duration`
from    t
        -- posexplode over a string of N spaces yields pos = 0..N, one row per 15-minute bucket
        lateral view posexplode(split(space(int(t.`End` div (15*60) - t.`Start` div (15*60))),' ')) pe
;
+----+---------------------+----------+
| id | interval | duration |
+----+---------------------+----------+
| 1 | 2017-08-09 09:30:00 | 839 |
| 1 | 2017-08-09 09:45:00 | 900 |
| 1 | 2017-08-09 10:00:00 | 900 |
| 1 | 2017-08-09 10:15:00 | 900 |
| 1 | 2017-08-09 10:30:00 | 900 |
| 1 | 2017-08-09 10:45:00 | 900 |
| 1 | 2017-08-09 11:00:00 | 900 |
| 1 | 2017-08-09 11:15:00 | 900 |
| 1 | 2017-08-09 11:30:00 | 900 |
| 1 | 2017-08-09 11:45:00 | 900 |
| 1 | 2017-08-09 12:00:00 | 900 |
| 1 | 2017-08-09 12:15:00 | 900 |
| 1 | 2017-08-09 12:30:00 | 900 |
| 1 | 2017-08-09 12:45:00 | 900 |
| 1 | 2017-08-09 13:00:00 | 900 |
| 1 | 2017-08-09 13:15:00 | 900 |
| 1 | 2017-08-09 13:30:00 | 900 |
| 1 | 2017-08-09 13:45:00 | 900 |
| 1 | 2017-08-09 14:00:00 | 900 |
| 1 | 2017-08-09 14:15:00 | 900 |
| 1 | 2017-08-09 14:30:00 | 900 |
| 1 | 2017-08-09 14:45:00 | 900 |
| 1 | 2017-08-09 15:00:00 | 900 |
| 1 | 2017-08-09 15:15:00 | 900 |
| 1 | 2017-08-09 15:30:00 | 900 |
| 1 | 2017-08-09 15:45:00 | 900 |
| 1 | 2017-08-09 16:00:00 | 900 |
| 1 | 2017-08-09 16:15:00 | 900 |
| 1 | 2017-08-09 16:30:00 | 900 |
| 1 | 2017-08-09 16:45:00 | 900 |
| 1 | 2017-08-09 17:00:00 | 900 |
| 1 | 2017-08-09 17:15:00 | 900 |
| 1 | 2017-08-09 17:30:00 | 61 |
+----+---------------------+----------+
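Since from_unixtime formats the timestamp in the cluster's local timezone, the Interval column can be pinned to a specific zone by converting explicitly. A sketch only, assuming the cluster default timezone is UTC; Europe/Paris is just an example zone that happens to match the 18:30 start shown in the question:
-- render the first 15-minute bucket boundary in a chosen timezone
select from_utc_timestamp(from_unixtime(1502296261 div (15*60) * (15*60)), 'Europe/Paris');
-- 2017-08-09 18:30:00 when the cluster default timezone is UTC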