Comparing data by time in SQL

I have data of this shape:
SELECT [Dt]
      ,x1
      ,[SaleCount]
      ,x2
      ,x3
      ,action
FROM table
The action column is a flag: 0 means no action, 1 means an action took place.
The essence of the problem is as follows:
a group is identified by x1 + x2 + x3, for example
1 + 2 + 2017
and an action on that group may influence other groups, such as group
2 + 3 + 2017
I need to restructure the data so that each group pairing falls into one of these combinations:
there is an action on group 1 + 2 + 2017 and no action on group 2 + 3 + 2017 (marker "no");
or there is an action on group 1 + 2 + 2017, and group 2 + 3 + 2017 also had an action BEFORE the action on group 1 + 2 + 2017 (marker "before")
(that is, the 1s in the action column for 2 + 3 + 2017 come before the 1s in the action column for group 1 + 2 + 2017);
or there is an action on group 1 + 2 + 2017, and group 2 + 3 + 2017 also had an action AFTER the action on group 1 + 2 + 2017 (marker "after")
(that is, the 1s in the action column for 2 + 3 + 2017 come after the 1s in the action column for group 1 + 2 + 2017).
So I need to find the combinations of actions that match these conditions.
Data sample:
set dateformat dmy;  -- the sample dates below are dd.MM.yyyy
Declare @t table
(Dt date,
x1 int,
x2 int,
x3 int,
Sale int,
action int,
typegroup varchar(20));
insert into @t values
('23.07.2018',1,2,2017,1,0,'basis'),
('24.07.2018',1,2,2017,2,0,'basis'),
('25.07.2018',1,2,2017,3,0,'basis'),
('26.07.2018',1,2,2017,4,0,'basis'),
('27.07.2018',1,2,2017,5,0,'basis'),
('28.07.2018',1,2,2017,6,0,'basis'),
('29.07.2018',1,2,2017,7,0,'basis'),
('30.07.2018',1,2,2017,8,0,'basis'),
('31.07.2018',1,2,2017,9,0,'basis'),
('01.08.2018',1,2,2017,10,0,'basis'),
('02.08.2018',1,2,2017,11,0,'basis'),
('03.08.2018',1,2,2017,12,1,'basis'),
('04.08.2018',1,2,2017,13,1,'basis'),
('05.08.2018',1,2,2017,14,1,'basis'),
('06.08.2018',1,2,2017,15,1,'basis'),
('07.08.2018',1,2,2017,16,1,'basis'),
('08.08.2018',1,2,2017,17,1,'basis'),
('09.08.2018',1,2,2017,18,1,'basis'),
('10.08.2018',1,2,2017,19,1,'basis'),
('11.08.2018',1,2,2017,20,1,'basis'),
('12.08.2018',1,2,2017,21,1,'basis'),
('13.08.2018',1,2,2017,22,1,'basis'),
('14.08.2018',1,2,2017,23,1,'basis'),
('15.08.2018',1,2,2017,24,1,'basis'),
('16.08.2018',1,2,2017,25,1,'basis'),
('17.08.2018',1,2,2017,26,1,'basis'),
('18.08.2018',1,2,2017,27,0,'basis'),
('19.08.2018',1,2,2017,28,0,'basis'),
('20.08.2018',1,2,2017,29,0,'basis'),
('21.08.2018',1,2,2017,30,0,'basis'),
('22.08.2018',1,2,2017,31,0,'basis'),
('23.08.2018',1,2,2017,32,0,'basis'),
('24.08.2018',1,2,2017,33,0,'basis'),
('25.08.2018',1,2,2017,34,0,'basis'),
('23.07.2018',2,3,2017,1,0,'no'),
('24.07.2018',2,3,2017,2,0,'no'),
('25.07.2018',2,3,2017,3,0,'no'),
('26.07.2018',2,3,2017,4,0,'no'),
('27.07.2018',2,3,2017,5,0,'no'),
('28.07.2018',2,3,2017,6,0,'no'),
('29.07.2018',2,3,2017,7,0,'no'),
('30.07.2018',2,3,2017,8,0,'no'),
('31.07.2018',2,3,2017,9,0,'no'),
('01.08.2018',2,3,2017,10,0,'no'),
('02.08.2018',2,3,2017,11,0,'no'),
('03.08.2018',2,3,2017,12,0,'no'),
('04.08.2018',2,3,2017,13,0,'no'),
('05.08.2018',2,3,2017,14,0,'no'),
('06.08.2018',2,3,2017,15,0,'no'),
('07.08.2018',2,3,2017,16,0,'no'),
('08.08.2018',2,3,2017,17,0,'no'),
('09.08.2018',2,3,2017,18,0,'no'),
('10.08.2018',2,3,2017,19,0,'no'),
('11.08.2018',2,3,2017,20,0,'no'),
('12.08.2018',2,3,2017,21,0,'no'),
('13.08.2018',2,3,2017,22,0,'no'),
('14.08.2018',2,3,2017,23,0,'no'),
('15.08.2018',2,3,2017,24,0,'no'),
('16.08.2018',2,3,2017,25,0,'no'),
('17.08.2018',2,3,2017,26,0,'no'),
('18.08.2018',2,3,2017,27,0,'no'),
('19.08.2018',2,3,2017,28,0,'no'),
('20.08.2018',2,3,2017,29,0,'no'),
('21.08.2018',2,3,2017,30,0,'no'),
('22.08.2018',2,3,2017,31,0,'no'),
('23.08.2018',2,3,2017,32,0,'no'),
('24.08.2018',2,3,2017,33,0,'no'),
('25.08.2018',2,3,2017,34,0,'no'),
('23.07.2018',3,4,2017,1,1,'before'),
('24.07.2018',3,4,2017,2,1,'before'),
('25.07.2018',3,4,2017,3,1,'before'),
('26.07.2018',3,4,2017,4,1,'before'),
('27.07.2018',3,4,2017,5,1,'before'),
('28.07.2018',3,4,2017,6,1,'before'),
('29.07.2018',3,4,2017,7,1,'before'),
('30.07.2018',3,4,2017,8,1,'before'),
('31.07.2018',3,4,2017,9,1,'before'),
('01.08.2018',3,4,2017,10,1,'before'),
('02.08.2018',3,4,2017,11,0,'before'),
('03.08.2018',3,4,2017,12,0,'before'),
('04.08.2018',3,4,2017,13,0,'before'),
('05.08.2018',3,4,2017,14,0,'before'),
('06.08.2018',3,4,2017,15,0,'before'),
('07.08.2018',3,4,2017,16,0,'before'),
('08.08.2018',3,4,2017,17,0,'before'),
('09.08.2018',3,4,2017,18,0,'before'),
('10.08.2018',3,4,2017,19,0,'before'),
('11.08.2018',3,4,2017,20,0,'before'),
('12.08.2018',3,4,2017,21,0,'before'),
('13.08.2018',3,4,2017,22,0,'before'),
('14.08.2018',3,4,2017,23,0,'before'),
('15.08.2018',3,4,2017,24,0,'before'),
('16.08.2018',3,4,2017,25,0,'before'),
('17.08.2018',3,4,2017,26,0,'before'),
('18.08.2018',3,4,2017,27,0,'before'),
('19.08.2018',3,4,2017,28,0,'before'),
('20.08.2018',3,4,2017,29,0,'before'),
('21.08.2018',3,4,2017,30,0,'before'),
('22.08.2018',3,4,2017,31,0,'before'),
('23.08.2018',3,4,2017,32,0,'before'),
('24.08.2018',3,4,2017,33,0,'before');
I compare by time: over the same period, I look for all groups that meet the above conditions.
In this example, relative to group 1 + 2 + 2017:
group 2 + 3 + 2017 had no action,
group 3 + 4 + 2017 had an action before the action for 1 + 2 + 2017 started,
and nothing else.
Now take the next group, for example 3 + 4 + 2017: look at the time when it had an action and how it affected the other groups over the same period under the specified conditions. That is, 3 + 4 + 2017 becomes the basis.
How can this be done?
Markers must be generated per group:
the basis is the group for which we are looking for comparisons,
and every group it is compared against is marked
"no", "before", or "after", depending on which combination of groups in time SQL found.
In other words, there can be very many such pairwise combinations of groups.
That is, in relation to one group, 1 + 2 + 2017 may be the basis, while relative to another, say 10 + 10 + 2017, there may be no action at all.

I found a good solution here: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/d891c693-2d38-4064-9784-4b21cd2fca11/comparing-data-by-time-in-sql?forum=transactsql
It works.
set dateformat dmy
go
Declare @t table
(Dt date,
x1 int,
x2 int,
x3 int,
sale int,
action int,
typegroup varchar(20));
insert into @t values
('23.07.2018',1,2,2017,1,0,''),
('24.07.2018',1,2,2017,2,0,''),
('25.07.2018',1,2,2017,3,0,''),
('26.07.2018',1,2,2017,4,0,''),
('27.07.2018',1,2,2017,5,0,''),
('28.07.2018',1,2,2017,6,0,''),
('29.07.2018',1,2,2017,7,0,''),
('30.07.2018',1,2,2017,8,0,''),
('31.07.2018',1,2,2017,9,0,''),
('01.08.2018',1,2,2017,10,0,''),
('02.08.2018',1,2,2017,11,0,''),
('03.08.2018',1,2,2017,12,1,''),
('04.08.2018',1,2,2017,13,1,''),
('05.08.2018',1,2,2017,14,1,''),
('06.08.2018',1,2,2017,15,1,''),
('07.08.2018',1,2,2017,16,1,''),
('08.08.2018',1,2,2017,17,1,''),
('09.08.2018',1,2,2017,18,1,''),
('10.08.2018',1,2,2017,19,1,''),
('11.08.2018',1,2,2017,20,1,''),
('12.08.2018',1,2,2017,21,1,''),
('13.08.2018',1,2,2017,22,1,''),
('14.08.2018',1,2,2017,23,1,''),
('15.08.2018',1,2,2017,24,1,''),
('16.08.2018',1,2,2017,25,1,''),
('17.08.2018',1,2,2017,26,1,''),
('18.08.2018',1,2,2017,27,0,''),
('19.08.2018',1,2,2017,28,0,''),
('20.08.2018',1,2,2017,29,0,''),
('21.08.2018',1,2,2017,30,0,''),
('22.08.2018',1,2,2017,31,0,''),
('23.08.2018',1,2,2017,32,0,''),
('24.08.2018',1,2,2017,33,0,''),
('25.08.2018',1,2,2017,34,0,''),
('23.07.2018',2,3,2017,1,0,''),
('24.07.2018',2,3,2017,2,0,''),
('25.07.2018',2,3,2017,3,0,''),
('26.07.2018',2,3,2017,4,0,''),
('27.07.2018',2,3,2017,5,0,''),
('28.07.2018',2,3,2017,6,0,''),
('29.07.2018',2,3,2017,7,0,''),
('30.07.2018',2,3,2017,8,0,''),
('31.07.2018',2,3,2017,9,0,''),
('01.08.2018',2,3,2017,10,0,''),
('02.08.2018',2,3,2017,11,0,''),
('03.08.2018',2,3,2017,12,0,''),
('04.08.2018',2,3,2017,13,0,''),
('05.08.2018',2,3,2017,14,0,''),
('06.08.2018',2,3,2017,15,0,''),
('07.08.2018',2,3,2017,16,0,''),
('08.08.2018',2,3,2017,17,0,''),
('09.08.2018',2,3,2017,18,0,''),
('10.08.2018',2,3,2017,19,0,''),
('11.08.2018',2,3,2017,20,0,''),
('12.08.2018',2,3,2017,21,0,''),
('13.08.2018',2,3,2017,22,0,''),
('14.08.2018',2,3,2017,23,0,''),
('15.08.2018',2,3,2017,24,0,''),
('16.08.2018',2,3,2017,25,0,''),
('17.08.2018',2,3,2017,26,0,''),
('18.08.2018',2,3,2017,27,0,''),
('19.08.2018',2,3,2017,28,0,''),
('20.08.2018',2,3,2017,29,0,''),
('21.08.2018',2,3,2017,30,0,''),
('22.08.2018',2,3,2017,31,0,''),
('23.08.2018',2,3,2017,32,0,''),
('24.08.2018',2,3,2017,33,0,''),
('25.08.2018',2,3,2017,34,0,''),
('23.07.2018',3,4,2017,1,1,''),
('24.07.2018',3,4,2017,2,1,''),
('25.07.2018',3,4,2017,3,1,''),
('26.07.2018',3,4,2017,4,1,''),
('27.07.2018',3,4,2017,5,1,''),
('28.07.2018',3,4,2017,6,1,''),
('29.07.2018',3,4,2017,7,1,''),
('30.07.2018',3,4,2017,8,1,''),
('31.07.2018',3,4,2017,9,1,''),
('01.08.2018',3,4,2017,10,1,''),
('02.08.2018',3,4,2017,11,0,''),
('03.08.2018',3,4,2017,12,0,''),
('04.08.2018',3,4,2017,13,0,''),
('05.08.2018',3,4,2017,14,0,''),
('06.08.2018',3,4,2017,15,0,''),
('07.08.2018',3,4,2017,16,0,''),
('08.08.2018',3,4,2017,17,0,''),
('09.08.2018',3,4,2017,18,0,''),
('10.08.2018',3,4,2017,19,0,''),
('11.08.2018',3,4,2017,20,0,''),
('12.08.2018',3,4,2017,21,0,''),
('13.08.2018',3,4,2017,22,0,''),
('14.08.2018',3,4,2017,23,0,''),
('15.08.2018',3,4,2017,24,0,''),
('16.08.2018',3,4,2017,25,0,''),
('17.08.2018',3,4,2017,26,0,''),
('18.08.2018',3,4,2017,27,1,''),
('19.08.2018',3,4,2017,28,1,''),
('20.08.2018',3,4,2017,29,1,''),
('21.08.2018',3,4,2017,30,1,''),
('22.08.2018',3,4,2017,31,1,''),
('23.08.2018',3,4,2017,32,1,''),
('24.08.2018',3,4,2017,33,1,'');
declare @x1 int,
        @x2 int,
        @x3 int, @mindt date, @maxdt date
-- pass any group values here
select @x1 = 1, @x2 = 2, @x3 = 2017

-- action window of the basis group
select @mindt = min(Dt), @maxdt = max(Dt)
from @t
where x1 = @x1
  and x2 = @x2
  and x3 = @x3
  and action = 1

-- classify every other group relative to the basis action window
update r
set typegroup = type
from (select *,
             case when x1 = @x1 and x2 = @x2 and x3 = @x3 then 'basis'
                  when action = 1 and max(Dt) over (partition by nxt) > coalesce(@maxdt, '99991231') then 'after'
                  when action = 1 and min(Dt) over (partition by nxt) < coalesce(@mindt, '19000101') then 'before'
             end as type
      from @t t
      outer apply
      (
          -- first date after Dt where the action flag flips: bounds the current run of 0s or 1s
          select min(Dt) as nxt
          from @t
          where x1 = t.x1
            and x2 = t.x2
            and x3 = t.x3
            and action <> t.action
            and Dt > t.Dt
      ) t1) r

select *
from @t
order by x1, x2, x3, sale
/*
Output
-----------------------------------------------------------
Dt x1 x2 x3 sale action typegroup
--------------------------------------------------------------------------
2018-07-23 1 2 2017 1 0 basis
2018-07-24 1 2 2017 2 0 basis
2018-07-25 1 2 2017 3 0 basis
2018-07-26 1 2 2017 4 0 basis
2018-07-27 1 2 2017 5 0 basis
2018-07-28 1 2 2017 6 0 basis
2018-07-29 1 2 2017 7 0 basis
2018-07-30 1 2 2017 8 0 basis
2018-07-31 1 2 2017 9 0 basis
2018-08-01 1 2 2017 10 0 basis
2018-08-02 1 2 2017 11 0 basis
2018-08-03 1 2 2017 12 1 basis
2018-08-04 1 2 2017 13 1 basis
2018-08-05 1 2 2017 14 1 basis
2018-08-06 1 2 2017 15 1 basis
2018-08-07 1 2 2017 16 1 basis
2018-08-08 1 2 2017 17 1 basis
2018-08-09 1 2 2017 18 1 basis
2018-08-10 1 2 2017 19 1 basis
2018-08-11 1 2 2017 20 1 basis
2018-08-12 1 2 2017 21 1 basis
2018-08-13 1 2 2017 22 1 basis
2018-08-14 1 2 2017 23 1 basis
2018-08-15 1 2 2017 24 1 basis
2018-08-16 1 2 2017 25 1 basis
2018-08-17 1 2 2017 26 1 basis
2018-08-18 1 2 2017 27 0 basis
2018-08-19 1 2 2017 28 0 basis
2018-08-20 1 2 2017 29 0 basis
2018-08-21 1 2 2017 30 0 basis
2018-08-22 1 2 2017 31 0 basis
2018-08-23 1 2 2017 32 0 basis
2018-08-24 1 2 2017 33 0 basis
2018-08-25 1 2 2017 34 0 basis
2018-07-23 2 3 2017 1 0 NULL
2018-07-24 2 3 2017 2 0 NULL
2018-07-25 2 3 2017 3 0 NULL
2018-07-26 2 3 2017 4 0 NULL
2018-07-27 2 3 2017 5 0 NULL
2018-07-28 2 3 2017 6 0 NULL
2018-07-29 2 3 2017 7 0 NULL
2018-07-30 2 3 2017 8 0 NULL
2018-07-31 2 3 2017 9 0 NULL
2018-08-01 2 3 2017 10 0 NULL
2018-08-02 2 3 2017 11 0 NULL
2018-08-03 2 3 2017 12 0 NULL
2018-08-04 2 3 2017 13 0 NULL
2018-08-05 2 3 2017 14 0 NULL
2018-08-06 2 3 2017 15 0 NULL
2018-08-07 2 3 2017 16 0 NULL
2018-08-08 2 3 2017 17 0 NULL
2018-08-09 2 3 2017 18 0 NULL
2018-08-10 2 3 2017 19 0 NULL
2018-08-11 2 3 2017 20 0 NULL
2018-08-12 2 3 2017 21 0 NULL
2018-08-13 2 3 2017 22 0 NULL
2018-08-14 2 3 2017 23 0 NULL
2018-08-15 2 3 2017 24 0 NULL
2018-08-16 2 3 2017 25 0 NULL
2018-08-17 2 3 2017 26 0 NULL
2018-08-18 2 3 2017 27 0 NULL
2018-08-19 2 3 2017 28 0 NULL
2018-08-20 2 3 2017 29 0 NULL
2018-08-21 2 3 2017 30 0 NULL
2018-08-22 2 3 2017 31 0 NULL
2018-08-23 2 3 2017 32 0 NULL
2018-08-24 2 3 2017 33 0 NULL
2018-08-25 2 3 2017 34 0 NULL
2018-07-23 3 4 2017 1 1 before
2018-07-24 3 4 2017 2 1 before
2018-07-25 3 4 2017 3 1 before
2018-07-26 3 4 2017 4 1 before
2018-07-27 3 4 2017 5 1 before
2018-07-28 3 4 2017 6 1 before
2018-07-29 3 4 2017 7 1 before
2018-07-30 3 4 2017 8 1 before
2018-07-31 3 4 2017 9 1 before
2018-08-01 3 4 2017 10 1 before
2018-08-02 3 4 2017 11 0 NULL
2018-08-03 3 4 2017 12 0 NULL
2018-08-04 3 4 2017 13 0 NULL
2018-08-05 3 4 2017 14 0 NULL
2018-08-06 3 4 2017 15 0 NULL
2018-08-07 3 4 2017 16 0 NULL
2018-08-08 3 4 2017 17 0 NULL
2018-08-09 3 4 2017 18 0 NULL
2018-08-10 3 4 2017 19 0 NULL
2018-08-11 3 4 2017 20 0 NULL
2018-08-12 3 4 2017 21 0 NULL
2018-08-13 3 4 2017 22 0 NULL
2018-08-14 3 4 2017 23 0 NULL
2018-08-15 3 4 2017 24 0 NULL
2018-08-16 3 4 2017 25 0 NULL
2018-08-17 3 4 2017 26 0 NULL
2018-08-18 3 4 2017 27 1 after
2018-08-19 3 4 2017 28 1 after
2018-08-20 3 4 2017 29 1 after
2018-08-21 3 4 2017 30 1 after
2018-08-22 3 4 2017 31 1 after
2018-08-23 3 4 2017 32 1 after
2018-08-24 3 4 2017 33 1 after
*/
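Beyond the linked answer: to take each group as the basis in turn (the "now 3 + 4 + 2017 becomes basis" step), the same classification can be run once per distinct group and the results collected. The sketch below is my own wrapper around the logic above, not part of the linked answer; it assumes the rows sit in a temp table #data with the same columns as @t, and it additionally marks groups that never had any action as 'no' via a window MAX, where the linked answer leaves them NULL.
-- Sketch: classify all groups against every possible basis group.
if object_id('tempdb..#result') is not null drop table #result;
create table #result
(basis_x1 int, basis_x2 int, basis_x3 int,
 Dt date, x1 int, x2 int, x3 int, sale int, action int, typegroup varchar(20));

declare @bx1 int, @bx2 int, @bx3 int, @mindt date, @maxdt date;

declare grp cursor local fast_forward for
    select distinct x1, x2, x3 from #data;
open grp;
fetch next from grp into @bx1, @bx2, @bx3;
while @@fetch_status = 0
begin
    -- action window of the current basis group (NULL if it has no action rows)
    select @mindt = min(Dt), @maxdt = max(Dt)
    from #data
    where x1 = @bx1 and x2 = @bx2 and x3 = @bx3 and action = 1;

    insert into #result
    select @bx1, @bx2, @bx3, t.Dt, t.x1, t.x2, t.x3, t.sale, t.action,
           case when t.x1 = @bx1 and t.x2 = @bx2 and t.x3 = @bx3 then 'basis'
                when t.action = 1 and max(t.Dt) over (partition by t1.nxt) > coalesce(@maxdt, '99991231') then 'after'
                when t.action = 1 and min(t.Dt) over (partition by t1.nxt) < coalesce(@mindt, '19000101') then 'before'
                -- assumption: a group whose action flag is never 1 is marked 'no'
                when max(t.action) over (partition by t.x1, t.x2, t.x3) = 0 then 'no'
           end
    from #data t
    outer apply (select min(Dt) as nxt
                 from #data
                 where x1 = t.x1 and x2 = t.x2 and x3 = t.x3
                   and action <> t.action and Dt > t.Dt) t1;

    fetch next from grp into @bx1, @bx2, @bx3;
end
close grp; deallocate grp;

select * from #result order by basis_x1, basis_x2, basis_x3, x1, x2, x3, sale;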

Related

need to add data to rows for each month of the year

I have a table with a transaction for each location under different billing codes. If there is a transaction under a billing code in one month of the year and that billing code is discontinued in a later month, then the final transaction table should still carry the discontinued billing code alongside the active ones for the remaining months of the year (YTD transactions).
The input table looks like this:
Year  LocationID  Month  invoiceID  code  amt1  amt ytd
2021  6394        1      101        F     1     1
2021  6394        1      101        G     10    10
2021  6394        2      102        F     2     3
2021  6394        3      103        F     3     6
2021  6394        4      104        F     4     10
2021  6394        5      105        F     5     15
2021  6394        6      106        F     2     17
2021  6394        6      106        G     1     11
2021  6394        7      107        F     2     19
2021  6394        8      108        F     3     22
2021  6394        9      109        F     1     23
2021  6394        10     1010       F     2     25
2021  6394        11     1011       F     1     26
2021  6394        12     1012       F     3     29
My expected output is:
Year  LocationID  Month  invoiceID  code  amt1  amt ytd
2021  6394        1      101        F     1     1
2021  6394        1      101        G     10    10
2021  6394        2      102        F     2     3
2021  6394        2      102        G     0     10
2021  6394        3      103        F     3     6
2021  6394        3      103        G     0     10
2021  6394        4      104        F     4     10
2021  6394        4      104        G     0     10
2021  6394        5      105        F     5     15
2021  6394        5      105        G     0     10
2021  6394        6      106        F     2     17
2021  6394        6      106        G     1     11
2021  6394        7      107        F     2     19
2021  6394        7      107        G     0     11
2021  6394        8      108        F     3     22
2021  6394        8      108        G     0     11
2021  6394        9      109        F     1     23
2021  6394        9      109        G     0     11
2021  6394        10     1010       F     2     25
2021  6394        10     1010       G     0     11
2021  6394        11     1011       F     1     26
2021  6394        11     1011       G     0     11
2021  6394        12     1012       F     3     29
2021  6394        12     1012       G     0     11
I have tried this:
WITH temp1 AS
(SELECT year, invoiceID, locationID, count(1) AS cnt
 FROM InputTable
 GROUP BY year, invoiceID, locationID HAVING count(1) = 1)
,
temp2 AS
(SELECT * FROM InputTable
 WHERE concat(year, '_', invoiceID, '_', locationID) IN
 (SELECT concat(b.year, '_', b.invoiceID, '_', b.locationID) FROM InputTable b
  GROUP BY b.year, b.invoiceID, b.locationID
  HAVING count(1) > 1
 )
)
SELECT DISTINCT
c.year,
CASE WHEN c.code = d.code THEN c.post_date_month ELSE c.post_date_month END AS post_date_month,
c.invoiceID,
CASE WHEN c.code = d.code THEN c.locationID ELSE d.locationID END AS locationID,
CASE WHEN c.code = d.code THEN c.code ELSE d.code END AS code,
CASE WHEN c.code = d.code THEN c.base_amount ELSE d.base_amount END AS base_amount,
CASE WHEN c.code = d.code THEN c.base_amount_ytd ELSE d.base_amount_ytd END AS base_amount_ytd
FROM (
SELECT DISTINCT b.*
FROM temp1 a
INNER JOIN InputTable b
ON a.year = b.year AND a.invoiceID = b.invoiceID AND a.locationID = b.locationID
) c
LEFT JOIN temp2 d
ON 1 = 1
;
First you need all the rows that have a COUNT(*) of 1; in the CTE you build the "new" rows and union them with the existing rows.
From your example I have built a CTE that fits your data, but you will need to adapt it to your needs:
WITH CTE AS (
SELECT
    Year, LocationID, Month, invoiceID,
    'G' AS code,
    0 AS amt1,
    -- each group has exactly one row (HAVING COUNT(*) = 1), so MAX() just returns that row's value
    CASE WHEN MAX([amt ytd]) > 10 THEN 11 ELSE 10 END AS [amt ytd]
FROM InputTable
GROUP BY Year, LocationID, Month, invoiceID
HAVING COUNT(*) = 1)
SELECT * FROM InputTable
UNION
SELECT * FROM CTE
ORDER BY Year, LocationID, Month, invoiceID
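Note that the hard-coded CASE only reproduces this particular sample (G's YTD is 10 until month 6 and 11 afterwards). A more general sketch, under the assumption that a missing code should carry forward its last reported [amt ytd] with amt1 = 0, builds a month-by-code grid and fills the gaps; table and column names follow the question:
-- Sketch: one row per (month, code); missing combinations get amt1 = 0
-- and the last known [amt ytd] for that code carried forward.
WITH months AS (
    SELECT DISTINCT Year, LocationID, Month, invoiceID FROM InputTable
),
codes AS (
    SELECT DISTINCT code FROM InputTable
),
grid AS (
    SELECT m.Year, m.LocationID, m.Month, m.invoiceID, c.code,
           COALESCE(i.amt1, 0) AS amt1,
           i.[amt ytd]
    FROM months m
    CROSS JOIN codes c
    LEFT JOIN InputTable i
      ON i.Year = m.Year AND i.LocationID = m.LocationID
     AND i.Month = m.Month AND i.code = c.code
)
SELECT Year, LocationID, Month, invoiceID, code, amt1,
       -- fill forward: every NULL row shares its grp with the last non-NULL row
       MAX([amt ytd]) OVER (PARTITION BY Year, LocationID, code, grp) AS [amt ytd]
FROM (
    SELECT *,
           -- running count of non-NULL ytd values marks "last known value" groups
           COUNT([amt ytd]) OVER (PARTITION BY Year, LocationID, code
                                  ORDER BY Month) AS grp
    FROM grid
) g
ORDER BY Month, code;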

sql split yearly record into 12 monthly records

I am trying to use a common table expression to split a yearly record into 12 monthly records. I have to do it for the next 20 years of records; that means 20 rows become 600 rows (20 * 12 = 600 records).
What is the best way to do it? Can anyone help with an efficient approach?
I am using a single table as shown below. Year 0 means the current year, so it should split into the remaining months only; year = 1 and onward should split into 12 (monthly) records.
id year value
1 0 3155174.87
1 1 30423037.3
1 2 35339631.25
expected result should look like this:
Id Year Month Value Calendar year
1 0 5 150 2022
1 0 6 150 2022
1 0 7 150 2022
1 0 8 150 2022
1 0 9 150 2022
1 0 10 150 2022
1 0 11 150 2022
1 0 12 150 2022
1 0 1 150 2023
1 0 2 150 2023
1 0 3 150 2023
1 0 4 150 2023
1 1 5 100 2023
1 1 6 100 2023
1 1 7 100 2023
1 1 8 100 2023
1 1 9 100 2023
1 1 10 100 2023
1 1 11 100 2023
1 1 12 100 2023
1 1 1 100 2024
1 1 2 100 2024
1 1 3 100 2024
1 1 4 100 2024
You can simply join onto a list of months, and then use a bit of arithmetic to split the Value
SELECT
t.Id,
t.Year,
v.Month,
Value = t.Value / CASE WHEN t.Year = 0 THEN 13 - MONTH(GETDATE()) ELSE 12 END
FROM YourTable t
JOIN (VALUES
(1),(2),(3),(4),(5),(6),(7),(8),(9),(10),(11),(12)
) v(Month) ON t.year > 0 OR v.Month >= MONTH(GETDATE());
db<>fiddle
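This query does not yet produce the Calendar year column shown in the expected result. Under the assumption that each Year value denotes a 12-month block starting at the current month (which is what the sample output suggests), a possible extension is sketched below; note the sample output also lists all 12 months for Year 0, which conflicts with "remaining months only", so the v.Month filter may need dropping depending on which behaviour is wanted:
-- Sketch: add the Calendar year column.
-- Assumption: months before the current month belong to the next calendar year.
SELECT
    t.Id,
    t.Year,
    v.Month,
    Value = t.Value / CASE WHEN t.Year = 0 THEN 13 - MONTH(GETDATE()) ELSE 12 END,
    [Calendar year] = YEAR(GETDATE()) + t.Year
                    + CASE WHEN v.Month < MONTH(GETDATE()) THEN 1 ELSE 0 END
FROM YourTable t
JOIN (VALUES
    (1),(2),(3),(4),(5),(6),(7),(8),(9),(10),(11),(12)
) v(Month) ON t.Year > 0 OR v.Month >= MONTH(GETDATE());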

Transposing multiple related columns

While transposing single columns is pretty straightforward, I need to transpose a large amount of data with 3 sets of 10+ related columns.
create table test
(month int,year int,po1 int,po2 int,ro1 int,ro2 int,mo1 int,mo2 int, mo3 int);
insert into test
values
(5,2013,100,20,10,1,3,4,5),(4,2014,200,30,20,2,4,5,6),(6,2015,200,80,30,3,5,6,7) ;
select * FROM test;
gives:
month  year  po1  po2  ro1  ro2  mo1  mo2  mo3
5      2013  100  20   10   1    3    4    5
4      2014  200  30   20   2    4    5    6
6      2015  200  80   30   3    5    6    7
Transposing using UNPIVOT
select
month, year,
PO, RO, MO
from ( SELECT * from test) src
unpivot
( PO for Description in (po1, po2))unpiv1
unpivot
(RO for Description1 in (ro1, ro2)) unpiv2
unpivot
(MO for Description2 in (mo1, mo2, mo3)) unpiv3
order by year
Gives me this:
month  year  PO   RO  MO
5      2013  100  10  3
5      2013  100  10  4
5      2013  100  10  5
5      2013  100  1   3
5      2013  100  1   4
5      2013  100  1   5
5      2013  20   10  3
5      2013  20   10  4
5      2013  20   10  5
5      2013  20   1   3
5      2013  20   1   4
5      2013  20   1   5
4      2014  200  20  4
4      2014  200  20  5
4      2014  200  20  6
4      2014  200  2   4
4      2014  200  2   5
4      2014  200  2   6
4      2014  30   20  4
4      2014  30   20  5
4      2014  30   20  6
4      2014  30   2   4
4      2014  30   2   5
4      2014  30   2   6
6      2015  200  30  5
6      2015  200  30  6
6      2015  200  30  7
6      2015  200  3   5
6      2015  200  3   6
6      2015  200  3   7
6      2015  80   30  5
6      2015  80   30  6
6      2015  80   30  7
6      2015  80   3   5
6      2015  80   3   6
6      2015  80   3   7
I would like to turn it into something like this. Is that possible?
month  year  PO   RO  MO
5      2013  100  10  3
5      2013  20   1   4
5      2013  0    0   5
4      2014  200  20  4
4      2014  30   2   5
4      2014  0    0   6
6      2015  200  30  5
6      2015  80   3   6
6      2015  0    0   7
Maybe use a query like the one below, which creates rows as per your design using CROSS APPLY:
select month, year, po, ro, mo
from test
cross apply (values (po1, ro1, mo1), (po2, ro2, mo2), (0, 0, mo3)) v(po, ro, mo)
see demo here
UNPIVOT acts similarly to a union. Use UNION ALL in your case:
SELECT month,
year,
po1 AS PO,
ro1 AS RO,
mo1 AS MO
FROM test
UNION ALL
SELECT month,
year,
po2,
ro2,
mo2
FROM test
UNION ALL
SELECT month,
year,
0,
0,
mo3
FROM test
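For what it's worth, the CROSS APPLY version scans test once, while the UNION ALL version scans it three times, so the APPLY form will usually be the cheaper choice on large tables.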

Pandas - creating new column based on data from other records

I have a pandas dataframe with the following columns:
Day, Month, Year, City, Temperature.
I would like a new column that holds the average (mean) temperature on the same date (day/month) across all previous years.
Can someone please assist?
Thanks :-)
Try:
import numpy as np
import pandas as pd

# build a sample frame: one row per day from 2000 through end of 2021
dti = pd.date_range('2000-1-1', '2021-12-1', freq='D')
temp = np.random.randint(10, 20, len(dti))
df = pd.DataFrame({'Day': dti.day, 'Month': dti.month, 'Year': dti.year,
                   'City': 'Nice', 'Temperature': temp})

# expanding mean of Temperature per (City, Month, Day), ordered by Year
out = df.set_index('Year').groupby(['City', 'Month', 'Day']) \
        .expanding()['Temperature'].mean().reset_index()
Output:
>>> out
Day Month Year City Temperature
0 1 1 2000 Nice 12.000000
1 1 1 2001 Nice 12.000000
2 1 1 2002 Nice 11.333333
3 1 1 2003 Nice 12.250000
4 1 1 2004 Nice 11.800000
... ... ... ... ... ...
8001 31 12 2016 Nice 15.647059
8002 31 12 2017 Nice 15.555556
8003 31 12 2018 Nice 15.631579
8004 31 12 2019 Nice 15.750000
8005 31 12 2020 Nice 15.666667
[8006 rows x 5 columns]
Focus on 1st January of the dataset:
>>> df[df['Day'].eq(1) & df['Month'].eq(1)]
Day Month Year City Temperature # Mean
0 1 1 2000 Nice 12 # 12
366 1 1 2001 Nice 12 # 12
731 1 1 2002 Nice 10 # 11.33
1096 1 1 2003 Nice 15 # 12.25
1461 1 1 2004 Nice 10 # 11.80
1827 1 1 2005 Nice 12 # and so on
2192 1 1 2006 Nice 17
2557 1 1 2007 Nice 16
2922 1 1 2008 Nice 19
3288 1 1 2009 Nice 12
3653 1 1 2010 Nice 10
4018 1 1 2011 Nice 16
4383 1 1 2012 Nice 13
4749 1 1 2013 Nice 15
5114 1 1 2014 Nice 14
5479 1 1 2015 Nice 13
5844 1 1 2016 Nice 15
6210 1 1 2017 Nice 13
6575 1 1 2018 Nice 15
6940 1 1 2019 Nice 18
7305 1 1 2020 Nice 11
7671 1 1 2021 Nice 14

Need YTD and MTD calculations in SQL

Date Amt ytd mtd
01-Jan-21 1 2 2
01-Jan-21 1 2 2
02-Jan-21 1 3 3
03-Jan-21 1 4 4
01-Feb-21 1 5 1
02-Feb-21 1 6 2
03-Feb-21 1 7 3
04-Feb-21 1 8 4
05-Feb-21 1 9 5
01-Mar-21 1 10 1
02-Mar-21 1 11 2
03-Mar-21 1 12 3
04-Mar-21 1 13 4
01-Apr-21 1 14 1
02-Apr-21 1 15 2
03-Apr-21 1 16 3
01-May-21 1 17 1
02-May-21 1 18 2
03-May-21 1 19 3
04-May-21 1 20 4
05-May-21 1 21 5
06-May-21 1 22 6
I have the first two columns (Date, Amt) and I need the YTD and MTD columns in MS SQL so that I can reproduce the above table.
It seems a rolling COUNT ... OVER was used to calculate the ytd and mtd in the Oracle source; note that the default RANGE frame treats tied dates as peers, which is why both 01-Jan-21 rows get ytd = 2.
(Personally, I would prefer RANK or DENSE_RANK.)
And since Oracle datestamps can be cast to a DATE as-is:
SELECT [Date], Amt
, ytd = COUNT(*) OVER (ORDER BY CAST([Date] AS DATE))
, mtd = COUNT(*) OVER (PARTITION BY EOMONTH(CAST([Date] AS DATE)) ORDER BY CAST([Date] AS DATE))
FROM your_table
ORDER BY CAST([Date] AS DATE)
Date       Amt  ytd  mtd
01-Jan-21  1    2    2
01-Jan-21  1    2    2
02-Jan-21  1    3    3
03-Jan-21  1    4    4
01-Feb-21  1    5    1
02-Feb-21  1    6    2
03-Feb-21  1    7    3
04-Feb-21  1    8    4
05-Feb-21  1    9    5
db<>fiddle here
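Since Amt happens to be 1 on every row of the sample, a rolling COUNT and a rolling SUM coincide; with real amounts you would presumably want SUM instead. A minimal variant against the same hypothetical your_table, additionally partitioning ytd by year so it resets each January (which the single-year sample did not need):
-- Rolling SUM variant: totals Amt rather than counting rows.
SELECT [Date], Amt
     , ytd = SUM(Amt) OVER (PARTITION BY YEAR(CAST([Date] AS DATE))
                            ORDER BY CAST([Date] AS DATE))
     , mtd = SUM(Amt) OVER (PARTITION BY EOMONTH(CAST([Date] AS DATE))
                            ORDER BY CAST([Date] AS DATE))
FROM your_table
ORDER BY CAST([Date] AS DATE);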