Count blocks of data based on id and status code - SQL

I have a dataset that looks something like this:

id      status  datetime
123456  0       07/02/2023 12:43
123456  4       07/02/2023 12:49
123456  5       07/02/2023 12:58
123456  5       07/02/2023 13:48
123456  7       07/02/2023 14:29
123456  0       07/02/2023 14:50
123456  4       07/02/2023 14:50
123456  5       07/02/2023 14:51
123456  9       07/02/2023 15:27
567890  0       07/02/2023 11:44
567890  4       07/02/2023 12:23
567890  5       07/02/2023 12:29
567890  5       07/02/2023 13:26
567890  5       07/02/2023 13:28
567890  5       07/02/2023 13:28
567890  5       07/02/2023 13:29
567890  9       07/02/2023 13:55
For each id in the dataset there are a number of statuses that need to be identified as 'blocks' of activity, where each block starts with a status code of 0 (and the data is sorted by datetime).
What I'd like to do is add a column that identifies each block, so my data would look like this with that column added:
id      status  datetime          block
123456  0       07/02/2023 12:43  1
123456  4       07/02/2023 12:49  1
123456  5       07/02/2023 12:58  1
123456  5       07/02/2023 13:48  1
123456  7       07/02/2023 14:29  1
123456  0       07/02/2023 14:50  2
123456  4       07/02/2023 14:50  2
123456  5       07/02/2023 14:51  2
123456  9       07/02/2023 15:27  2
567890  0       07/02/2023 11:44  1
567890  4       07/02/2023 12:23  1
567890  5       07/02/2023 12:29  1
567890  5       07/02/2023 13:26  1
567890  5       07/02/2023 13:28  1
567890  5       07/02/2023 13:28  1
567890  5       07/02/2023 13:29  1
567890  9       07/02/2023 13:55  1
I've used window functions before, but I can't get my head around how to do this.

You can get the desired result using a CTE and the query below.
Sample Table:
CREATE TABLE SampleData (
    id INT,
    status INT,
    datetime DATETIME
);
Sample Data:
INSERT INTO SampleData (id, status, datetime)
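-- note: these literals are parsed according to the session's DATEFORMAT
-- setting; the question's data reads as DD/MM/YYYY, so run SET DATEFORMAT dmy
-- first (or use unambiguous 'YYYYMMDD' literals) if the dates come out shifted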
VALUES
(123456, 0, '07/02/2023 12:43'),
(123456, 4, '07/02/2023 12:49'),
(123456, 5, '07/02/2023 12:58'),
(123456, 5, '07/02/2023 13:48'),
(123456, 7, '07/02/2023 14:29'),
(123456, 0, '07/02/2023 14:50'),
(123456, 4, '07/02/2023 14:50'),
(123456, 5, '07/02/2023 14:51'),
(123456, 9, '07/02/2023 15:27'),
(567890, 0, '07/02/2023 11:44'),
(567890, 4, '07/02/2023 12:23'),
(567890, 5, '07/02/2023 12:29'),
(567890, 5, '07/02/2023 13:26'),
(567890, 5, '07/02/2023 13:28'),
(567890, 5, '07/02/2023 13:28'),
(567890, 5, '07/02/2023 13:29'),
(567890, 9, '07/02/2023 13:55');
Query:
WITH CteSampleData AS (
    SELECT *,
           SUM(CASE WHEN status = 0 THEN 1 ELSE 0 END)
               OVER (PARTITION BY id ORDER BY datetime) AS block
    FROM SampleData
)
SELECT id, status, datetime, block
FROM CteSampleData
ORDER BY id, datetime;
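The running SUM works because every status-0 row adds 1 to a per-id running count, stamping itself and all later rows with the new block number (the classic gaps-and-islands trick). One caveat: the sample data has two rows sharing 07/02/2023 14:50 for id 123456, and the default RANGE frame gives tied rows the same running total, which happens to be exactly what is wanted here. If you prefer to make the tie handling explicit, here is a sketch with a deterministic tiebreaker; the extra ORDER BY key and the ROWS frame are my additions, not part of the answer above:

WITH CteSampleData AS (
    SELECT *,
           SUM(CASE WHEN status = 0 THEN 1 ELSE 0 END)
               OVER (PARTITION BY id
                     -- among rows with the same datetime, count the 0 first
                     ORDER BY datetime, CASE WHEN status = 0 THEN 0 ELSE 1 END
                     ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS block
    FROM SampleData
)
SELECT id, status, datetime, block
FROM CteSampleData
ORDER BY id, datetime, block;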

Related

How do I return all data from both tables where dates do and do not match, using a LEFT JOIN? It only seems to return matching records

I have two tables, shown below. I want to return all data from both tables, but I also want NULLs from the first table (odwh_usage.osprey_agreement_revenue, the first table below) where month.start_date and osprey_agreement_revenue.trans_date do not match; however, I am getting no NULLs. As you can see, osprey_agreement_revenue.trans_date only goes up to 28/11/2020, while month.start_date goes up to 01/05/2021. I have joined on a range and would have thought all the month-table dates in the range would be returned whether they match or not, but none of the extra rows come back. Here is what I have tried:
SELECT *
FROM odwh_usage.osprey_agreement_revenue osprey_agreement_revenue
LEFT OUTER JOIN odwh_system.month month
    ON (month."start_date" >= osprey_agreement_revenue.trans_date AND backout_status = 'N')
WHERE osprey_outlet_subaccount_number = '188021000270'
ORDER BY trans_date
odwh_usage.osprey_agreement_revenue:

osprey_outlet_subaccount_number  agreement_id                                     bill_ref_no  product_id    trans_date  tracking_id  rated_amount  amount  discount  vat    backout_status
188021000270                     00000000000002638868_00000001580385418946_00118  79003199     EAD_DC_ETH_O  28/03/2020  2865140      62.84         62.84   0         12.57  N
188021000270                     00000000000002638868_00000001580385418946_00118  79003199     EAD_ETH_O     28/03/2020  2865141      238.86        238.86  0         47.77  N
188021000270                     00000000000002638868_00000001580385418946_00118  79485727     EAD_DC_ETH_O  28/04/2020  2865140      95.92         95.92   0         19.18  N
188021000270                     00000000000002638868_00000001580385418946_00118  79485727     EAD_ETH_O     28/04/2020  2865141      364.58        364.58  0         72.92  N
188021000270                     00000000000002638868_00000001580385418946_00118  80182008     EAD_DC_ETH_O  28/05/2020  2865140      95.92         95.92   0         19.18  N
188021000270                     00000000000002638868_00000001580385418946_00118  80182008     EAD_ETH_O     28/05/2020  2865141      364.58        364.58  0         72.92  N
188021000270                     00000000000002638868_00000001580385418946_00118  82190556     EAD_DC_ETH_O  28/06/2020  2865140      95.92         95.92   0         19.18  N
188021000270                     00000000000002638868_00000001580385418946_00118  82190556     EAD_ETH_O     28/06/2020  2865141      364.58        364.58  0         72.92  N
188021000270                     00000000000002638868_00000001580385418946_00118  83453611     EAD_DC_ETH_O  28/07/2020  2865140      95.92         95.92   0         19.18  N
188021000270                     00000000000002638868_00000001580385418946_00118  83453611     EAD_ETH_O     28/07/2020  2865141      364.58        364.58  0         72.92  N
188021000270                     00000000000002638868_00000001580385418946_00118  84544005     EAD_DC_ETH_O  28/08/2020  2865140      95.92         95.92   0         19.18  N
188021000270                     00000000000002638868_00000001580385418946_00118  84544005     EAD_ETH_O     28/08/2020  2865141      364.58        364.58  0         72.92  N
188021000270                     00000000000002638868_00000001580385418946_00118  85715214     EAD_DC_ETH_O  28/09/2020  2865140      95.92         95.92   0         19.18  N
188021000270                     00000000000002638868_00000001580385418946_00118  85715214     EAD_ETH_O     28/09/2020  2865141      364.58        364.58  0         72.92  N
188021000270                     00000000000002638868_00000001580385418946_00118  88189966     EAD_DC_ETH_O  28/10/2020  2865140      95.92         95.92   0         19.18  N
188021000270                     00000000000002638868_00000001580385418946_00118  88189966     EAD_ETH_O     28/10/2020  2865141      364.58        364.58  0         72.92  N
188021000270                     00000000000002638868_00000001580385418946_00118  91485706     EAD_DC_ETH_O  28/11/2020  2865140      86.63         86.63   0         17.33  N
188021000270                     00000000000002638868_00000001580385418946_00118  91485706     EAD_ETH_O     28/11/2020  2865141      329.29        329.29  0         65.86  N

odwh_system.month:

month       start_date  end_date    financial_year
01/01/2010  01/01/2010  31/01/2010  2009/10
01/02/2010  01/02/2010  28/02/2010  2009/10
01/03/2010  01/03/2010  31/03/2010  2009/10
01/04/2010  01/04/2010  30/04/2010  2009/10
01/05/2010  01/05/2010  31/05/2010  2009/10
01/06/2010  01/06/2010  30/06/2010  2009/10
01/07/2010  01/07/2010  31/07/2010  2010/11
01/08/2010  01/08/2010  31/08/2010  2010/11
01/09/2010  01/09/2010  30/09/2010  2010/11
01/10/2010  01/10/2010  31/10/2010  2010/11
01/11/2010  01/11/2010  30/11/2010  2010/11
01/12/2010  01/12/2010  31/12/2010  2010/11
01/01/2011  01/01/2011  31/01/2011  2010/11
01/02/2011  01/02/2011  28/02/2011  2010/11
01/03/2011  01/03/2011  31/03/2011  2010/11
01/04/2011  01/04/2011  30/04/2011  2010/11
01/05/2011  01/05/2011  31/05/2011  2010/11
01/06/2011  01/06/2011  30/06/2011  2010/11
01/07/2011  01/07/2011  31/07/2011  2011/12
01/08/2011  01/08/2011  31/08/2011  2011/12
01/09/2011  01/09/2011  30/09/2011  2011/12
01/10/2011  01/10/2011  31/10/2011  2011/12
01/11/2011  01/11/2011  30/11/2011  2011/12
01/12/2011  01/12/2011  31/12/2011  2011/12
01/01/2012  01/01/2012  31/01/2012  2011/12
01/02/2012  01/02/2012  29/02/2012  2011/12
01/03/2012  01/03/2012  31/03/2012  2011/12
01/04/2012  01/04/2012  30/04/2012  2011/12
01/05/2012  01/05/2012  31/05/2012  2011/12
01/06/2012  01/06/2012  30/06/2012  2011/12
01/07/2012  01/07/2012  31/07/2012  2012/13
01/08/2012  01/08/2012  31/08/2012  2012/13
01/09/2012  01/09/2012  30/09/2012  2012/13
01/10/2012  01/10/2012  31/10/2012  2012/13
01/11/2012  01/11/2012  30/11/2012  2012/13
01/12/2012  01/12/2012  31/12/2012  2012/13
01/01/2013  01/01/2013  31/01/2013  2012/13
01/02/2013  01/02/2013  28/02/2013  2012/13
01/03/2013  01/03/2013  31/03/2013  2012/13
01/04/2013  01/04/2013  30/04/2013  2012/13
01/05/2013  01/05/2013  31/05/2013  2012/13
01/06/2013  01/06/2013  30/06/2013  2012/13
01/07/2013  01/07/2013  31/07/2013  2013/14
01/08/2013  01/08/2013  31/08/2013  2013/14
01/09/2013  01/09/2013  30/09/2013  2013/14
01/10/2013  01/10/2013  31/10/2013  2013/14
01/11/2013  01/11/2013  30/11/2013  2013/14
01/12/2013  01/12/2013  31/12/2013  2013/14
01/01/2014  01/01/2014  31/01/2014  2013/14
01/02/2014  01/02/2014  28/02/2014  2013/14
01/03/2014  01/03/2014  31/03/2014  2013/14
01/04/2014  01/04/2014  30/04/2014  2013/14
01/05/2014  01/05/2014  31/05/2014  2013/14
01/06/2014  01/06/2014  30/06/2014  2013/14
01/07/2014  01/07/2014  31/07/2014  2014/15
01/08/2014  01/08/2014  31/08/2014  2014/15
01/09/2014  01/09/2014  30/09/2014  2014/15
01/10/2014  01/10/2014  31/10/2014  2014/15
01/11/2014  01/11/2014  30/11/2014  2014/15
01/12/2014  01/12/2014  31/12/2014  2014/15
01/01/2015  01/01/2015  31/01/2015  2014/15
01/02/2015  01/02/2015  28/02/2015  2014/15
01/03/2015  01/03/2015  31/03/2015  2014/15
01/04/2015  01/04/2015  30/04/2015  2014/15
01/05/2015  01/05/2015  31/05/2015  2014/15
01/06/2015  01/06/2015  30/06/2015  2014/15
01/07/2015  01/07/2015  31/07/2015  2015/16
01/08/2015  01/08/2015  31/08/2015  2015/16
01/09/2015  01/09/2015  30/09/2015  2015/16
01/10/2015  01/10/2015  31/10/2015  2015/16
01/11/2015  01/11/2015  30/11/2015  2015/16
01/12/2015  01/12/2015  31/12/2015  2015/16
01/01/2016  01/01/2016  31/01/2016  2015/16
01/02/2016  01/02/2016  29/02/2016  2015/16
01/03/2016  01/03/2016  31/03/2016  2015/16
01/04/2016  01/04/2016  30/04/2016  2015/16
01/05/2016  01/05/2016  31/05/2016  2015/16
01/06/2016  01/06/2016  30/06/2016  2015/16
01/07/2016  01/07/2016  31/07/2016  2016/17
01/08/2016  01/08/2016  31/08/2016  2016/17
01/09/2016  01/09/2016  30/09/2016  2016/17
01/10/2016  01/10/2016  31/10/2016  2016/17
01/11/2016  01/11/2016  30/11/2016  2016/17
01/12/2016  01/12/2016  31/12/2016  2016/17
01/01/2017  01/01/2017  31/01/2017  2016/17
01/02/2017  01/02/2017  28/02/2017  2016/17
01/03/2017  01/03/2017  31/03/2017  2016/17
01/04/2017  01/04/2017  30/04/2017  2016/17
01/05/2017  01/05/2017  31/05/2017  2016/17
01/06/2017  01/06/2017  30/06/2017  2016/17
01/07/2017  01/07/2017  31/07/2017  2017/18
01/08/2017  01/08/2017  31/08/2017  2017/18
01/09/2017  01/09/2017  30/09/2017  2017/18
01/10/2017  01/10/2017  31/10/2017  2017/18
01/11/2017  01/11/2017  30/11/2017  2017/18
01/12/2017  01/12/2017  31/12/2017  2017/18
01/01/2018  01/01/2018  31/01/2018  2017/18
01/02/2018  01/02/2018  28/02/2018  2017/18
01/03/2018  01/03/2018  31/03/2018  2017/18
01/04/2018  01/04/2018  30/04/2018  2017/18
01/05/2018  01/05/2018  31/05/2018  2017/18
01/06/2018  01/06/2018  30/06/2018  2017/18
01/07/2018  01/07/2018  31/07/2018  2018/19
01/08/2018  01/08/2018  31/08/2018  2018/19
01/09/2018  01/09/2018  30/09/2018  2018/19
01/10/2018  01/10/2018  31/10/2018  2018/19
01/11/2018  01/11/2018  30/11/2018  2018/19
01/12/2018  01/12/2018  31/12/2018  2018/19
01/01/2019  01/01/2019  31/01/2019  2018/19
01/02/2019  01/02/2019  28/02/2019  2018/19
01/03/2019  01/03/2019  31/03/2019  2018/19
01/04/2019  01/04/2019  30/04/2019  2018/19
01/05/2019  01/05/2019  31/05/2019  2018/19
01/06/2019  01/06/2019  30/06/2019  2018/19
01/07/2019  01/07/2019  31/07/2019  2019/20
01/08/2019  01/08/2019  31/08/2019  2019/20
01/09/2019  01/09/2019  30/09/2019  2019/20
01/10/2019  01/10/2019  31/10/2019  2019/20
01/11/2019  01/11/2019  30/11/2019  2019/20
01/12/2019  01/12/2019  31/12/2019  2019/20
01/01/2020  01/01/2020  31/01/2020  2019/20
01/02/2020  01/02/2020  29/02/2020  2019/20
01/03/2020  01/03/2020  31/03/2020  2019/20
01/04/2020  01/04/2020  30/04/2020  2019/20
01/05/2020  01/05/2020  31/05/2020  2019/20
01/06/2020  01/06/2020  30/06/2020  2019/20
01/07/2020  01/07/2020  31/07/2020  2020/21
01/08/2020  01/08/2020  31/08/2020  2020/21
01/09/2020  01/09/2020  30/09/2020  2020/21
01/10/2020  01/10/2020  31/10/2020  2020/21
01/11/2020  01/11/2020  30/11/2020  2020/21
01/12/2020  01/12/2020  31/12/2020  2020/21
01/01/2021  01/01/2021  31/01/2021  2020/21
01/02/2021  01/02/2021  28/02/2021  2020/21
01/03/2021  01/03/2021  31/03/2021  2020/21
01/04/2021  01/04/2021  30/04/2021  2020/21
01/05/2021  01/05/2021  31/05/2021  2020/21
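
A LEFT JOIN preserves every row from the table on its left-hand side, and in the query above osprey_agreement_revenue is the left table, so months without revenue can never appear. A minimal sketch of the usual fix, assuming the intent is one row per month with the revenue columns NULL where nothing matches: put month on the left, and move the revenue-side filters from the WHERE clause into the ON clause (otherwise they filter the NULL-extended rows back out). The start_date/end_date range condition is my guess at the intended bucketing, not tested against the real schema:

SELECT m.start_date, oar.*
FROM odwh_system.month AS m
LEFT OUTER JOIN odwh_usage.osprey_agreement_revenue AS oar
    ON  oar.trans_date >= m.start_date    -- assumed: bucket each transaction
    AND oar.trans_date <= m.end_date      -- into the month it falls in
    AND oar.backout_status = 'N'          -- revenue-side filters stay in ON,
    AND oar.osprey_outlet_subaccount_number = '188021000270'  -- not in WHERE
ORDER BY m.start_date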

How to get count incremental by date

I am trying to get a count of rows with incremental dates.
My table looks like this:
ID name status create_date
1 John AC 2016-01-01 00:00:26.513
2 Jane AC 2016-01-02 00:00:26.513
3 Kane AC 2016-01-02 00:00:26.513
4 Carl AC 2016-01-03 00:00:26.513
5 Dave AC 2016-01-04 00:00:26.513
6 Gina AC 2016-01-04 00:00:26.513
Now what I want to return from the SQL is something like this:
Date Count
2016-01-01 1
2016-01-02 3
2016-01-03 4
2016-01-04 6
You can make use of COUNT() OVER () without PARTITION BY, using ORDER BY instead. It will give you the cumulative sum. Use DISTINCT to filter out the duplicate values.
SELECT DISTINCT CAST(create_date AS DATE) [Date],
       COUNT(create_date) OVER (ORDER BY CAST(create_date AS DATE)) AS [COUNT]
FROM [YourTable]
Alternatively, group per day and take a running sum of the daily counts (the windowed SUM is what makes the count cumulative; a plain GROUP BY count would only give per-day totals):
SELECT create_date,
       SUM(COUNT(create_date)) OVER (ORDER BY create_date) AS [COUNT]
FROM (
    SELECT CAST(create_date AS DATE) AS create_date
    FROM [YourTable]
) T
GROUP BY create_date
Per your description, you need a continuous date list. Does that make sense?
This sample only generates one month of data.
CREATE TABLE #tt(ID INT, name VARCHAR(10), status VARCHAR(10), create_date DATETIME)
INSERT INTO #tt
SELECT 1,'John','AC','2016-01-01 00:00:26.513' UNION
SELECT 2,'Jane','AC','2016-01-02 00:00:26.513' UNION
SELECT 3,'Kane','AC','2016-01-02 00:00:26.513' UNION
SELECT 4,'Carl','AC','2016-01-03 00:00:26.513' UNION
SELECT 5,'Dave','AC','2016-01-04 00:00:26.513' UNION
SELECT 6,'Gina','AC','2016-01-04 00:00:26.513' UNION
SELECT 7,'Tina','AC','2016-01-08 00:00:26.513'
SELECT * FROM #tt
SELECT CONVERT(DATE, DATEADD(d, sv.number, n.FirstDate)) AS [Date], COUNT(n.num) AS [Count]
FROM master.dbo.spt_values AS sv
LEFT JOIN (
    SELECT MIN(t.create_date) OVER () AS FirstDate,
           DATEDIFF(d, MIN(t.create_date) OVER (), t.create_date) AS num
    FROM #tt AS t
) AS n ON n.num <= sv.number
WHERE sv.type = 'P' AND sv.number >= 0
  AND MONTH(DATEADD(d, sv.number, n.FirstDate)) = MONTH(n.FirstDate)
  -- the year check is needed too; without it every later January
  -- (2017, 2018, ...) reachable from the numbers table leaks into the output
  AND YEAR(DATEADD(d, sv.number, n.FirstDate)) = YEAR(n.FirstDate)
GROUP BY CONVERT(DATE, DATEADD(d, sv.number, n.FirstDate))
Date Count
---------- -----------
2016-01-01 1
2016-01-02 3
2016-01-03 4
2016-01-04 6
2016-01-05 6
2016-01-06 6
2016-01-07 6
2016-01-08 7
2016-01-09 7
2016-01-10 7
2016-01-11 7
2016-01-12 7
2016-01-13 7
2016-01-14 7
2016-01-15 7
2016-01-16 7
2016-01-17 7
2016-01-18 7
2016-01-19 7
2016-01-20 7
2016-01-21 7
2016-01-22 7
2016-01-23 7
2016-01-24 7
2016-01-25 7
2016-01-26 7
2016-01-27 7
2016-01-28 7
2016-01-29 7
2016-01-30 7
2016-01-31 7
SELECT r.date, COUNT(r.date) AS [count]
FROM (
    -- style 121 yields 'yyyy-mm-dd hh:mi:ss.mmm', so the first 10 characters
    -- are the date part; without a style, DATETIME converts as 'Jan  1 2016'
    SELECT id, name, SUBSTRING(CONVERT(NVARCHAR(50), create_date, 121), 1, 10) AS date
    FROM tblName
) r
GROUP BY r.date
In this code, in the subquery part, I select the first 10 characters of the date, which has been converted from datetime to nvarchar, so I get something like '2016-01-01'. (This is not strictly necessary, but I prefer it this way to make the code more readable.)
Then, with a simple GROUP BY, I get each date and its count.

SQL query, create groups by dates

This is my initial table (the dates are in DD/MM/YY format):
ID DAY TYPE_ID TYPE NUM START_DATE END_DATE
---- --------- ------- ---- ---- --------- ---------
4241 15/09/15 2 1 66 01/01/00 31/12/99
4241 16/09/15 2 1 66 01/01/00 31/12/99
4241 17/09/15 9 1 59 17/09/15 18/09/15
4241 18/09/15 9 1 59 17/09/15 18/09/15
4241 19/09/15 2 1 66 01/01/00 31/12/99
4241 20/09/15 2 1 66 01/01/00 31/12/99
4241 15/09/15 3 2 63 01/01/00 31/12/99
4241 16/09/15 8 2 159 16/09/15 17/09/15
4241 17/09/15 8 2 159 16/09/15 17/09/15
4241 18/09/15 3 2 63 01/01/00 31/12/99
4241 19/09/15 3 2 63 01/01/00 31/12/99
4241 20/09/15 3 2 63 01/01/00 31/12/99
2134 15/09/15 2 1 66 01/01/00 31/12/99
2134 16/09/15 2 1 66 01/01/00 31/12/99
2134 17/09/15 9 1 59 17/09/15 18/09/15
2134 18/09/15 9 1 59 17/09/15 18/09/15
2134 19/09/15 2 1 66 01/01/00 31/12/99
2134 20/09/15 2 1 66 01/01/00 31/12/99
2134 15/09/15 3 2 63 01/01/00 31/12/99
2134 16/09/15 8 2 159 16/09/15 17/09/15
2134 17/09/15 8 2 159 16/09/15 17/09/15
2134 18/09/15 3 2 63 01/01/00 31/12/99
2134 19/09/15 3 2 63 01/01/00 31/12/99
2134 20/09/15 3 2 63 01/01/00 31/12/99
And I have to create groups with an initial DAY and an end DAY for the same ID and TYPE.
I don't want to group by day; I need to create a group every time my TYPE_ID changes, based on the initial order (ID, TYPE, DAY ASC).
This is the result that I want to achieve:
ID DAY_INI DAY_END TYPE_ID TYPE NUM START_DATE END_DATE
---- --------- --------- ------- ---- ---- --------- ---------
4241 15/09/15 16/09/15 2 1 66 01/01/00 31/12/99
4241 17/09/15 18/09/15 9 1 59 17/09/15 18/09/15
4241 19/09/15 20/09/15 2 1 66 01/01/00 31/12/99
4241 15/09/15 15/09/15 3 2 63 01/01/00 31/12/99
4241 16/09/15 17/09/15 8 2 159 16/09/15 17/09/15
4241 18/09/15 20/09/15 3 2 63 01/01/00 31/12/99
2134 15/09/15 16/09/15 2 1 66 01/01/00 31/12/99
2134 17/09/15 18/09/15 9 1 59 17/09/15 18/09/15
2134 19/09/15 20/09/15 2 1 66 01/01/00 31/12/99
2134 15/09/15 15/09/15 3 2 63 01/01/00 31/12/99
2134 16/09/15 17/09/15 8 2 159 16/09/15 17/09/15
2134 18/09/15 20/09/15 3 2 63 01/01/00 31/12/99
Could you please give me any clue about how to do it? Thanks!
SQL Fiddle
Oracle 11g R2 Schema Setup:
CREATE TABLE TEST ( ID, DAY, TYPE_ID, TYPE, NUM, START_DATE, END_DATE ) AS
SELECT 4241, DATE '2015-09-15', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-16', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-17', 9, 1, 59, DATE '2015-09-17', DATE '2015-09-18' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-18', 9, 1, 59, DATE '2015-09-17', DATE '2015-09-18' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-19', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-20', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-15', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-16', 8, 2, 159, DATE '2015-09-16', DATE '2015-09-17' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-17', 8, 2, 159, DATE '2015-09-16', DATE '2015-09-17' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-18', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-19', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 4241, DATE '2015-09-20', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-15', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-16', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-17', 9, 1, 59, DATE '2015-09-17', DATE '2015-09-18' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-18', 9, 1, 59, DATE '2015-09-17', DATE '2015-09-18' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-19', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-20', 2, 1, 66, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-15', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-16', 8, 2, 159, DATE '2015-09-16', DATE '2015-09-17' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-17', 8, 2, 159, DATE '2015-09-16', DATE '2015-09-17' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-18', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-19', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
UNION ALL SELECT 2134, DATE '2015-09-20', 3, 2, 63, DATE '2000-01-01', DATE '1999-12-31' FROM DUAL
Query 1:
WITH group_changes AS (
SELECT t.*,
CASE TYPE_ID WHEN LAG( TYPE_ID ) OVER ( PARTITION BY ID, TYPE ORDER BY DAY ) THEN 0 ELSE 1 END AS HAS_CHANGED_GROUP
FROM TEST t
),
groups AS (
SELECT ID, DAY, TYPE_ID, TYPE, NUM, START_DATE, END_DATE,
SUM( HAS_CHANGED_GROUP ) OVER ( PARTITION BY ID, TYPE ORDER BY DAY ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW ) AS GRP
FROM group_changes
)
SELECT ID,
MIN( DAY ) AS DAY_INI,
MAX( DAY ) AS DAY_END,
MIN( TYPE_ID ) AS TYPE_ID,
TYPE,
MIN( NUM ) AS NUM,
MIN( START_DATE ) AS START_DATE,
MIN( END_DATE ) AS END_DATE
FROM groups
GROUP BY ID, TYPE, GRP
Results:
| ID | DAY_INI | DAY_END | TYPE_ID | TYPE | NUM | START_DATE | END_DATE |
|------|-----------------------------|-----------------------------|---------|------|-----|-----------------------------|-----------------------------|
| 4241 | September, 17 2015 00:00:00 | September, 18 2015 00:00:00 | 9 | 1 | 59 | September, 17 2015 00:00:00 | September, 18 2015 00:00:00 |
| 2134 | September, 15 2015 00:00:00 | September, 15 2015 00:00:00 | 3 | 2 | 63 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
| 2134 | September, 18 2015 00:00:00 | September, 20 2015 00:00:00 | 3 | 2 | 63 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
| 4241 | September, 15 2015 00:00:00 | September, 16 2015 00:00:00 | 2 | 1 | 66 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
| 4241 | September, 19 2015 00:00:00 | September, 20 2015 00:00:00 | 2 | 1 | 66 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
| 4241 | September, 15 2015 00:00:00 | September, 15 2015 00:00:00 | 3 | 2 | 63 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
| 4241 | September, 16 2015 00:00:00 | September, 17 2015 00:00:00 | 8 | 2 | 159 | September, 16 2015 00:00:00 | September, 17 2015 00:00:00 |
| 2134 | September, 17 2015 00:00:00 | September, 18 2015 00:00:00 | 9 | 1 | 59 | September, 17 2015 00:00:00 | September, 18 2015 00:00:00 |
| 2134 | September, 15 2015 00:00:00 | September, 16 2015 00:00:00 | 2 | 1 | 66 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
| 2134 | September, 19 2015 00:00:00 | September, 20 2015 00:00:00 | 2 | 1 | 66 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
| 2134 | September, 16 2015 00:00:00 | September, 17 2015 00:00:00 | 8 | 2 | 159 | September, 16 2015 00:00:00 | September, 17 2015 00:00:00 |
| 4241 | September, 18 2015 00:00:00 | September, 20 2015 00:00:00 | 3 | 2 | 63 | January, 01 2000 00:00:00 | December, 31 1999 00:00:00 |
Add an enumeration to the original data set (using ROW_NUMBER or ROWNUM), take the MIN(enumeration) for each group, and then sort the groups by that enumeration, as sketched below.
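A rough sketch of that idea layered onto Query 1 above (the RN column and the final ORDER BY are the only additions; everything else is unchanged):

WITH group_changes AS (
    SELECT t.*,
           ROW_NUMBER() OVER ( ORDER BY ID, TYPE, DAY ) AS RN,
           CASE TYPE_ID WHEN LAG( TYPE_ID ) OVER ( PARTITION BY ID, TYPE ORDER BY DAY )
                THEN 0 ELSE 1 END AS HAS_CHANGED_GROUP
    FROM TEST t
),
groups AS (
    SELECT g.*,
           SUM( HAS_CHANGED_GROUP ) OVER ( PARTITION BY ID, TYPE ORDER BY DAY
                ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW ) AS GRP
    FROM group_changes g
)
SELECT ID,
       MIN( DAY ) AS DAY_INI,
       MAX( DAY ) AS DAY_END,
       MIN( TYPE_ID ) AS TYPE_ID,
       TYPE,
       MIN( NUM ) AS NUM,
       MIN( START_DATE ) AS START_DATE,
       MIN( END_DATE ) AS END_DATE
FROM groups
GROUP BY ID, TYPE, GRP
ORDER BY MIN( RN )  -- emits the groups in the original (ID, TYPE, DAY) order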

Calculate average values for rows with different ids in MS Excel

The file contains information about products per day, and I need to calculate monthly average values for each product.
Source data looks like this:
A B C D
id date rating price
1 1 2014/01/01 2 20
2 1 2014/01/02 2 20
3 1 2014/01/03 2 20
4 1 2014/01/04 1 20
5 1 2014/01/05 1 20
6 1 2014/01/06 1 20
7 1 2014/01/07 1 20
8 3 2014/01/01 5 99
9 3 2014/01/02 5 99
10 3 2014/01/03 5 99
11 3 2014/01/04 5 99
12 3 2014/01/05 5 120
13 3 2014/01/06 5 120
14 3 2014/01/07 5 120
Need to get:
A B C D
id date rating price
1 1 1.42 20
2 3 5 108
How can I do that? I need some advanced formula or a VB script.
Update: I have data for a long period - about 2 years. I need to calculate average values for each product for each week, and after that for each month.
Source data example:
id date rating
4 2013-09-01 445
4 2013-09-02 446
4 2013-09-03 447
4 2013-09-04 448
4 2013-09-05 449
4 2013-09-06 450
4 2013-09-07 451
4 2013-09-08 452
4 2013-09-09 453
4 2013-09-10 454
4 2013-09-11 455
4 2013-09-12 456
4 2013-09-13 457
4 2013-09-14 458
4 2013-09-15 459
4 2013-09-16 460
4 2013-09-17 461
4 2013-09-18 462
4 2013-09-19 463
4 2013-09-20 464
4 2013-09-21 465
4 2013-09-22 466
4 2013-09-23 467
4 2013-09-24 468
4 2013-09-25 469
4 2013-09-26 470
4 2013-09-27 471
4 2013-09-28 472
4 2013-09-29 473
4 2013-09-30 474
4 2013-10-01 475
4 2013-10-02 476
4 2013-10-03 477
4 2013-10-04 478
4 2013-10-05 479
4 2013-10-06 480
4 2013-10-07 481
4 2013-10-08 482
4 2013-10-09 483
4 2013-10-10 484
4 2013-10-11 485
4 2013-10-12 486
4 2013-10-13 487
4 2013-10-14 488
4 2013-10-15 489
4 2013-10-16 490
4 2013-10-17 491
4 2013-10-18 492
4 2013-10-19 493
4 2013-10-20 494
4 2013-10-21 495
4 2013-10-22 496
4 2013-10-23 497
4 2013-10-24 498
4 2013-10-25 499
4 2013-10-26 500
4 2013-10-27 501
4 2013-10-28 502
4 2013-10-29 503
4 2013-10-30 504
4 2013-10-31 505
7 2013-09-01 1445
7 2013-09-02 1446
7 2013-09-03 1447
7 2013-09-04 1448
7 2013-09-05 1449
7 2013-09-06 1450
7 2013-09-07 1451
7 2013-09-08 1452
7 2013-09-09 1453
7 2013-09-10 1454
7 2013-09-11 1455
7 2013-09-12 1456
7 2013-09-13 1457
7 2013-09-14 1458
7 2013-09-15 1459
7 2013-09-16 1460
7 2013-09-17 1461
7 2013-09-18 1462
7 2013-09-19 1463
7 2013-09-20 1464
7 2013-09-21 1465
7 2013-09-22 1466
7 2013-09-23 1467
7 2013-09-24 1468
7 2013-09-25 1469
7 2013-09-26 1470
7 2013-09-27 1471
7 2013-09-28 1472
7 2013-09-29 1473
7 2013-09-30 1474
7 2013-10-01 1475
7 2013-10-02 1476
7 2013-10-03 1477
7 2013-10-04 1478
7 2013-10-05 1479
7 2013-10-06 1480
7 2013-10-07 1481
7 2013-10-08 1482
7 2013-10-09 1483
7 2013-10-10 1484
7 2013-10-11 1485
7 2013-10-12 1486
7 2013-10-13 1487
7 2013-10-14 1488
7 2013-10-15 1489
7 2013-10-16 1490
7 2013-10-17 1491
7 2013-10-18 1492
7 2013-10-19 1493
7 2013-10-20 1494
7 2013-10-21 1495
7 2013-10-22 1496
7 2013-10-23 1497
7 2013-10-24 1498
7 2013-10-25 1499
7 2013-10-26 1500
7 2013-10-27 1501
7 2013-10-28 1502
7 2013-10-29 1503
7 2013-10-30 1504
7 2013-10-31 1505
This is the job of a pivot table, and it takes about 30 seconds to do.
Update:
As per your update, put the date into the Report Filter and modify to suit.

Combining two rows of data in SQL

So my data looks like this:
+-----------+---------+-------------+-------+-------------+--+
| time | Outlets | Meal_Period | cover | day_of_week | |
+-----------+---------+-------------+-------+-------------+--+
| 10/1/2013 | 72 | 1 | 0 | Tuesday | |
| 10/1/2013 | 72 | 2 | 31 | Tuesday | |
| 10/1/2013 | 72 | 3 | 116 | Tuesday | |
| 10/1/2013 | 72 | 6 | 32 | Tuesday | |
| 10/1/2013 | 187 | 17 | 121 | Tuesday | |
| 10/1/2013 | 187 | 18 | 214 | Tuesday | |
| 10/1/2013 | 187 | 19 | 204 | Tuesday | |
| 10/1/2013 | 101 | 2 | 0 | Tuesday | |
| 10/1/2013 | 101 | 3 | 0 | Tuesday | |
| 10/1/2013 | 101 | 4 | 0 | Tuesday | |
| 10/1/2013 | 101 | 6 | 0 | Tuesday | |
| 10/1/2013 | 282 | 1 | 17 | Tuesday | |
| 10/1/2013 | 282 | 2 | 207 | Tuesday | |
| 10/1/2013 | 282 | 3 | 340 | Tuesday | |
| 10/1/2013 | 282 | 6 | 4 | Tuesday | |
| 10/1/2013 | 103 | 1 | 0 | Tuesday | |
+-----------+---------+-------------+-------+-------------+--+
The code is:
IF OBJECT_ID('tempdb.dbo.#time') IS NOT NULL
DROP TABLE #time
SELECT DATEADD(dd, 0, DATEDIFF(DD, 0, open_dttime)) AS 'time'
,profit_center_id AS 'Outlets'
,meal_period_id AS 'Meal_Period'
,sum(num_covers) AS 'Number_Covers'
INTO #time
FROM [STOF_Infogen].[dbo].[Order_Header]
WHERE CasinoID = 'csg'
AND profit_center_id IN (
'102'
,'100'
,'283'
,'101'
,'282'
,'187'
,'280'
,'103'
,'281'
,'72'
,'183'
)
AND (
open_dttime BETWEEN '2014-02-01 06:30'
AND '2014-03-01 06:30'
)
GROUP BY profit_center_id
,open_dttime
,meal_period_id
ORDER BY profit_center_id
,meal_period_id
IF OBJECT_ID('tempdb.dbo.#time2') IS NOT NULL
DROP TABLE #time2
SELECT [TIME]
,Outlets AS 'Outlets'
,meal_period AS 'Meal_Period'
,SUM(number_covers) AS 'cover'
,DATENAME(DW, [time]) AS 'day_of_week'
INTO #time2
FROM #time
GROUP BY [TIME]
,Outlets
,Meal_Period
ORDER BY [TIME] ASC
,Outlets
,Meal_Period
SELECT *
FROM #time2
I created temporary tables for my data, but I'm having two issues:
I would like to group the rows where the profit centres are 187 and 282 while still keeping the other rows.
For some reason I can't tweak the date stamp, and it excludes the last day of the month.
As always, any help is appreciated.
Making some test data:
DECLARE @MealInfo TABLE  -- table variables use @; the # prefix is for temp tables
(
    MealTime DATETIME,
    Outlets VARCHAR(10),
    Meal_Period INT,
    Cover INT
)
INSERT INTO @MealInfo
VALUES
('10/1/2013', '72', 1, 0),
('10/1/2013', '72', 2, 31),
('10/1/2013', '72', 3, 116),
('10/1/2013', '72', 6, 32),
('10/1/2013', '187', 17, 121),
('10/1/2013', '187', 18, 214),
('10/1/2013', '187', 19, 204),
('10/1/2013', '101', 2, 0),
('10/1/2013', '101', 3, 0),
('10/1/2013', '101', 4, 0),
('10/1/2013', '101', 6, 0),
('10/1/2013', '282', 1, 17),
('10/1/2013', '282', 2, 207),
('10/1/2013', '282', 3, 340),
('10/1/2013', '282', 6, 4),
('10/1/2013', '103', 1, 0);
Because you want to group 187 and 282 together, I use a case statement to lump them into one outlet and then we can group on the outlets to break out the sums:
SELECT
m.MealTime,
m.Outlets,
m.Meal_Period,
SUM(m.Cover) AS Number_Covers
FROM
(
SELECT mi.MealTime,
(CASE WHEN mi.Outlets IN ('187', '282') THEN '187+282' ELSE mi.Outlets END) Outlets,
mi.Meal_Period,
mi.Cover
FROM @MealInfo mi
) m
GROUP BY m.MealTime, m.Outlets, m.Meal_Period
Here is the output:
MealTime Outlets Meal_Period Number_Covers
2013-10-01 00:00:00.000 101 2 0
2013-10-01 00:00:00.000 101 3 0
2013-10-01 00:00:00.000 101 4 0
2013-10-01 00:00:00.000 101 6 0
2013-10-01 00:00:00.000 103 1 0
2013-10-01 00:00:00.000 187+282 1 17
2013-10-01 00:00:00.000 187+282 2 207
2013-10-01 00:00:00.000 187+282 3 340
2013-10-01 00:00:00.000 187+282 6 4
2013-10-01 00:00:00.000 187+282 17 121
2013-10-01 00:00:00.000 187+282 18 214
2013-10-01 00:00:00.000 187+282 19 204
2013-10-01 00:00:00.000 72 1 0
2013-10-01 00:00:00.000 72 2 31
2013-10-01 00:00:00.000 72 3 116
2013-10-01 00:00:00.000 72 6 32
If your data had overlapping meal periods for 187 and 282, the sum would combine both outlets' covers into one row.
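
On the question's second issue (the last day of the month going missing): BETWEEN is inclusive at both ends, so the usual suspect is the upper bound of the open_dttime range. A half-open range makes the boundary explicit; the sketch below keeps the question's 06:30 business-day cutoff and is only a guess at the intended window, so adjust the bounds to your own definition of the month:

SELECT *
FROM [STOF_Infogen].[dbo].[Order_Header]
WHERE open_dttime >= '2014-02-01 06:30'  -- lower bound included
  AND open_dttime <  '2014-03-01 06:30'  -- upper bound excluded; if the last
                                         -- day is still missing, its rows are
                                         -- probably stamped at or after this
                                         -- cutoff, so widen the bound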