Spotfire - Sum() over - sum

I'm trying to sum the values in column INDICATOR for the last 30 days from DATE, by account.
My expression is: Sum([INDICATOR]) over (Intersect([id],LastPeriods(30,[DATE]))) but the results are not accurate.
Any help is appreciated.
Sample data below:
DATE 30DAYSBACK ID INDICATOR RUNNING30 EXPECTED
3/2/16 2/1/16 ABC 1 3 3
3/2/16 2/1/16 ABC 1 3 3
3/2/16 2/1/16 ABC 1 3 3
3/7/16 2/6/16 ABC 1 7 7
3/7/16 2/6/16 ABC 1 7 7
3/7/16 2/6/16 ABC 1 7 7
3/7/16 2/6/16 ABC 1 7 7
3/8/16 2/7/16 ABC 1 10 10
3/8/16 2/7/16 ABC 1 10 10
3/8/16 2/7/16 ABC 1 10 10
3/10/16 2/9/16 ABC 1 12 12
3/10/16 2/9/16 ABC 1 12 12
3/14/16 2/13/16 ABC 1 13 13
3/15/16 2/14/16 ABC 1 14 14
3/16/16 2/15/16 ABC 1 15 15
3/21/16 2/20/16 ABC 1 16 16
3/22/16 2/21/16 ABC 1 17 17
3/23/16 2/22/16 ABC 1 19 19
3/23/16 2/22/16 ABC 1 19 19
3/25/16 2/24/16 ABC 1 20 20
3/29/16 2/28/16 ABC 1 22 22
3/29/16 2/28/16 ABC 1 22 22
3/30/16 2/29/16 ABC 1 27 27
3/30/16 2/29/16 ABC 1 27 27
3/30/16 2/29/16 ABC 1 27 27
3/30/16 2/29/16 ABC 1 27 27
3/30/16 2/29/16 ABC 1 27 27
3/31/16 3/1/16 ABC 1 29 29
3/31/16 3/1/16 ABC 1 29 29
4/1/16 3/2/16 ABC 1 31 31
4/1/16 3/2/16 ABC 1 31 31
4/4/16 3/5/16 ABC 1 32 29
4/5/16 3/6/16 ABC 1 33 30
4/13/16 3/14/16 ABC 1 34 27
4/13/16 3/14/16 ABC 1 34 27
4/13/16 3/14/16 ABC 1 34 27
4/13/16 3/14/16 ABC 1 34 27
4/15/16 3/16/16 ABC 1 35 24
4/20/16 3/21/16 ABC 1 31 26
4/20/16 3/21/16 ABC 1 31 26
4/20/16 3/21/16 ABC 1 31 26
4/25/16 3/26/16 ABC 1 31 25
4/25/16 3/26/16 ABC 1 31 25
4/25/16 3/26/16 ABC 1 31 25
4/26/16 3/27/16 ABC 1 31 26
4/27/16 3/28/16 ABC 1 34 29
4/27/16 3/28/16 ABC 1 34 29
4/27/16 3/28/16 ABC 1 34 29
4/27/16 3/28/16 ABC 1 34 29
4/28/16 3/29/16 ABC 1 35 30

I wasn't able to determine a suitable solution within Spotfire. However, I was able to write some R code that takes the Date and Indicator columns, sums the indicator over the past 30 days, and returns the result as a column.
Here is the code:
# Parse the input date column and prepare the output vector
record.date <- as.Date(date.column, "%m/%d/%Y")
indicator.vector <- indicator.column
num.entry <- length(record.date)
running.thirty <- vector(length = num.entry)
count <- 0
for (i in 1:num.entry) {
  date.test <- record.date[i]
  # Sum every indicator whose date falls in [date.test - 30, date.test]
  for (j in 1:num.entry) {
    if (record.date[j] >= date.test - 30) {
      if (record.date[j] <= date.test) {
        count <- count + indicator.vector[j]
      }
    }
  }
  running.thirty[i] <- count
  count <- 0
}
output <- running.thirty
Use Tools >> Register Data Function:
(1) Insert the script.
(2) Create the two input parameters (date.column and indicator.column).
(3) Create the output parameter (output).
NOTE: I think there are some errors in your expected values near the end of your data set.
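For anyone wanting to sanity-check the window logic outside Spotfire, here is the same inclusive 30-day window expressed as a small Python sketch (the dates below are just the first rows of the sample data):

```python
from datetime import date, timedelta

def rolling_30_day_sum(dates, indicators):
    # For each row, sum every indicator whose date falls inside the
    # inclusive window [row_date - 30 days, row_date], mirroring the
    # double loop in the R script above.
    out = []
    for d in dates:
        lo = d - timedelta(days=30)
        out.append(sum(v for dj, v in zip(dates, indicators) if lo <= dj <= d))
    return out

# First ten rows of the sample data (ID "ABC", INDICATOR = 1 throughout)
dates = [date(2016, 3, 2)] * 3 + [date(2016, 3, 7)] * 4 + [date(2016, 3, 8)] * 3
indicators = [1] * len(dates)
print(rolling_30_day_sum(dates, indicators))  # [3, 3, 3, 7, 7, 7, 7, 10, 10, 10]
```

These values match the RUNNING30/EXPECTED columns for the corresponding rows, which supports the note above that the discrepancies appear only near the end of the data set.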

Related

Select data based on a few considerations

I want to select data based on the considerations below.
Now we have two tables -
TABLE 1 -
DLR_ID  CALL_ID  VEHICLE  PLAN_NO  CALL_STATUS
1       11       AA       5        Generated
2       12       AA       5        Generated
1       13       AA       10       Generated
2       14       AA       10       Not Generated
1       15       BB       5        Generated
1       16       BB       10       Generated
2       17       CC       5        Not Generated
3       18       CC       5        Generated
1       19       DD       5        Not Generated
4       20       DD       5        Not Generated
3       21       EE       5        Generated
2       22       FF       10       Generated
4       23       FF       10       Generated
5       24       GG       20       Generated
6       25       GG       20       Generated
TABLE 2 -
DLR_ID  CALL_ID  CALL_COUNT  CALL_RESULT_STATUS  CALL_DATE(DD/MM/YYYY)
1       11       1           Continue            16/03/2021
1       11       2           Give-up             20/03/2021
2       12       1           Completed           15/03/2021
1       13       1           Continue            01/04/2021
1       15       1           Completed           21/02/2021
1       16       1           Give-up             20/03/2021
3       18       1           Continue            21/05/2021
3       21       1           Give-up             24/04/2021
2       22       1           Completed           19/03/2021
4       23       1           Completed           03/05/2021
5       24       1           Continue            11/02/2021
5       24       2           Completed           11/05/2021
6       25       1           Continue            10/02/2021
6       25       2           Continue            21/02/2021
6       25       3           Continue            21/04/2021
OUTPUT -
DLR_ID  VEHICLE  PLAN_NO  CALL_STATUS    CALL_ID  CALL_DATE   CALL_RESULT_STATUS
1       AA       5        Generated      12       15/03/2021  Completed
2       AA       5        Generated      12       15/03/2021  Completed
1       AA       10       Generated      13       01/04/2021  Continue
2       AA       10       Not Generated  13       01/04/2021  Continue
1       BB       5        Generated      15       21/02/2021  Completed
1       BB       10       Generated      15       21/02/2021  Completed
2       CC       5        Not Generated  18       21/05/2021  Continue
3       CC       5        Generated      18       21/05/2021  Continue
1       DD       5        Not Generated
4       DD       5        Not Generated
3       EE       5        Generated      21       21/04/2021  Give-up
2       FF       10       Generated      23       03/05/2021  Completed
4       FF       10       Generated      23       03/05/2021  Completed
5       GG       20       Generated      24       11/05/2021  Completed
6       GG       20       Generated      24       11/05/2021  Completed
Kindly help me build the Oracle query to extract the data as shown in the OUTPUT table.
The code I was trying is:
SELECT t1.DLR_ID, t1.VEHICLE, t1.PLAN_NO, t1.CALL_STATUS,
       NVL(MAX(CASE WHEN t1.CALL_STATUS = 'Generated' AND t2.CALL_RESULT_STATUS = 'Completed' THEN t2.CALL_ID END),
           MAX(CASE WHEN t1.CALL_STATUS != 'Generated' AND t2.CALL_RESULT_STATUS != 'Completed' THEN t2.CALL_ID END)) AS CALL_ID
FROM Table1 t1
LEFT JOIN Table2 t2
  ON t1.DLR_ID = t2.DLR_ID
 AND t2.CALL_ID = t1.CALL_ID
GROUP BY t1.DLR_ID, t1.VEHICLE, t1.PLAN_NO, t1.CALL_STATUS
ORDER BY t1.VEHICLE, t1.PLAN_NO, t1.DLR_ID
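One building block the query will need regardless of the exact selection rule is "the latest result row per call" (the highest CALL_COUNT per CALL_ID), since calls 11, 24 and 25 have several result rows. A hedged Python sketch of just that step (the tuple layout is an assumption for illustration, not the real schema; in Oracle this corresponds to ROW_NUMBER() OVER (PARTITION BY CALL_ID ORDER BY CALL_COUNT DESC) = 1):

```python
def latest_result_per_call(rows):
    # rows: (DLR_ID, CALL_ID, CALL_COUNT, CALL_RESULT_STATUS, CALL_DATE)
    # Keep, for each CALL_ID, the row with the highest CALL_COUNT.
    best = {}
    for row in rows:
        call_id, count = row[1], row[2]
        if call_id not in best or count > best[call_id][2]:
            best[call_id] = row
    return sorted(best.values(), key=lambda r: r[1])

rows = [
    (1, 11, 1, 'Continue', '16/03/2021'),
    (1, 11, 2, 'Give-up', '20/03/2021'),
    (5, 24, 1, 'Continue', '11/02/2021'),
    (5, 24, 2, 'Completed', '11/05/2021'),
]
print(latest_result_per_call(rows))
# [(1, 11, 2, 'Give-up', '20/03/2021'), (5, 24, 2, 'Completed', '11/05/2021')]
```

How the resulting one-row-per-call set maps onto each VEHICLE/PLAN_NO combination is not fully determined by the OUTPUT table (e.g. call 15 appears for both BB plans), so that part of the rule would need clarifying before the join logic can be fixed.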

Sum every 7 rows from column sales, with ints representing n days away from installation of promotion material (before and after the installation)

Two stores, each with its sales data per day. Both get equipped with promotion material, but not on the same day. After the pr_day the promotion material stays in place, meaning there should be a sales boost from the day of the installation onward.
Installation Date:
Store A - 05/15/2019
Store B - 05/17/2019
To see whether the promotion was a success, we measure the sales before and after the pr-date by returning the number of sales (pieces sold, not revenue) next to an int indicating how far away it was from the pr-day (sum of sales from both stores):
pr_date| sales
-28 | 35
-27 | 40
-26 | 21
-25 | 36
-24 | 29
-23 | 36
-22 | 43
-21 | 31
-20 | 32
-19 | 21
-18 | 17
-17 | 34
-16 | 34
-15 | 37
-14 | 32
-13 | 29
-12 | 25
-11 | 45
-10 | 43
-9 | 26
-8 | 27
-7 | 33
-6 | 36
-5 | 17
-4 | 34
-3 | 33
-2 | 21
-1 | 28
1 | 16
2 | 6
3 | 16
4 | 29
5 | 32
6 | 30
7 | 30
8 | 30
9 | 17
10 | 12
11 | 35
12 | 30
13 | 15
14 | 28
15 | 14
16 | 16
17 | 13
18 | 27
19 | 22
20 | 34
21 | 33
22 | 22
23 | 13
24 | 35
25 | 28
26 | 19
27 | 17
28 | 29
You may have noticed that I already removed the installation day itself from the data.
The issue starts with the different installation dates of the pr material. If I group by week it combines sales from different distances to the installation; it just starts at whatever weekday I define:
Select DATEDIFF(wk, change_date, sales_date), sum(sales)
from tbl_sales
group by DATEDIFF(wk, change_date, sales_date)
result:
week | sales
-4 | 75
-3 | 228
-2 | 204
-1 | 235
0 | 149
1 | 173
2 | 151
3 | 167
4 | 141
The numbers are not from the right days, and there is one week too many. I guess this comes from SQL grouping the sales starting from Sunday; because the pr_dates differ, it generates more than just the 8 weeks (4 before, 4 after).
Trying to find a sustainable solution, I couldn't find the right fit and decided to post it here. I am very thankful for any thoughts from the community on this topic. I'm quite sure there is a smart solution for this problem, because it doesn't look like a rare request to me.
I tried it with OVER as well, but I don't see how to sum the 7 days together, as they are no longer calendar days but deltas to the pr-date.
Desired Result:
week | sales
-4 | 240
-3 | 206
-2 | 227
-1 | 202
1 | 159
2 | 167
3 | 159
4 | 163
Attached is my by-hand analysis of what the results should be.
Why do I need the weekly summary? The stores perform differently depending on the weekday; by summing 7 days together, I make sure we don't compare Mondays to Sundays and so on. Furthermore, the result will be presented in a line or bar chart, where the raw weekday variation would make it hard to see the trend in the sales numbers, whereas the weekly comparison absorbs this variation.
If anything is unclear, please feel free to let me know so I can provide further details.
Thank you very much
Additionally, the overview per installation date:
Shop A:
store A
delta date sales
-28 17.04.2019 20
-27 18.04.2019 20
-26 19.04.2019 13
-25 20.04.2019 25
-24 21.04.2019 16
-23 22.04.2019 20
-22 23.04.2019 26
-21 24.04.2019 15
-20 25.04.2019 20
-19 26.04.2019 13
-18 27.04.2019 13
-17 28.04.2019 20
-16 29.04.2019 21
-15 30.04.2019 20
-14 01.05.2019 17
-13 02.05.2019 13
-12 03.05.2019 9
-11 04.05.2019 34
-10 05.05.2019 28
-9 06.05.2019 19
-8 07.05.2019 14
-7 08.05.2019 23
-6 09.05.2019 18
-5 10.05.2019 9
-4 11.05.2019 22
-3 12.05.2019 17
-2 13.05.2019 14
-1 14.05.2019 19
0 15.05.2019 11
1 16.05.2019 0
2 17.05.2019 0
3 18.05.2019 1
4 19.05.2019 19
5 20.05.2019 18
6 21.05.2019 14
7 22.05.2019 11
8 23.05.2019 12
9 24.05.2019 8
10 25.05.2019 7
11 26.05.2019 19
12 27.05.2019 15
13 28.05.2019 15
14 29.05.2019 11
15 30.05.2019 5
16 31.05.2019 8
17 01.06.2019 10
18 02.06.2019 19
19 03.06.2019 14
20 04.06.2019 21
21 05.06.2019 22
22 06.06.2019 7
23 07.06.2019 6
24 08.06.2019 23
25 09.06.2019 17
26 10.06.2019 9
27 11.06.2019 8
28 12.06.2019 23
Shop B:
store B
delta date sales
-28 19.04.2019 15
-27 20.04.2019 20
-26 21.04.2019 8
-25 22.04.2019 11
-24 23.04.2019 13
-23 24.04.2019 16
-22 25.04.2019 17
-21 26.04.2019 16
-20 27.04.2019 12
-19 28.04.2019 8
-18 29.04.2019 4
-17 30.04.2019 14
-16 01.05.2019 13
-15 02.05.2019 17
-14 03.05.2019 15
-13 04.05.2019 16
-12 05.05.2019 16
-11 06.05.2019 11
-10 07.05.2019 15
-9 08.05.2019 7
-8 09.05.2019 13
-7 10.05.2019 10
-6 11.05.2019 18
-5 12.05.2019 8
-4 13.05.2019 12
-3 14.05.2019 16
-2 15.05.2019 7
-1 16.05.2019 9
0 17.05.2019 9
1 18.05.2019 16
2 19.05.2019 6
3 20.05.2019 15
4 21.05.2019 10
5 22.05.2019 14
6 23.05.2019 16
7 24.05.2019 19
8 25.05.2019 18
9 26.05.2019 9
10 27.05.2019 5
11 28.05.2019 16
12 29.05.2019 15
13 30.05.2019 17
14 31.05.2019 9
15 01.06.2019 8
16 02.06.2019 3
17 03.06.2019 8
18 04.06.2019 8
19 05.06.2019 13
20 06.06.2019 11
21 07.06.2019 15
22 08.06.2019 7
23 09.06.2019 12
24 10.06.2019 11
25 11.06.2019 10
26 12.06.2019 9
27 13.06.2019 6
28 14.06.2019 9
Try
select wk, sum(sales)
from (
    select
        isnull(sa.sales, 0) + isnull(sb.sales, 0) sales
        , isnull(sa.delta, sb.delta) delta
        , case when isnull(sa.delta, sb.delta) = 0 then 0
               else case when isnull(sa.delta, sb.delta) > 0 then (isnull(sa.delta, sb.delta) - 1) / 7 + 1
                         else (isnull(sa.delta, sb.delta) + 1) / 7 - 1
                    end
          end wk
    from shopA sa
    full join shopB sb on sa.delta = sb.delta
) t
group by wk;
sql fiddle
A more readable version (it doesn't run faster): using CROSS APPLY this way allows you to introduce intermediate variables for cleaner code.
select wk, sum(sales)
from (
    select
        isnull(sa.sales, 0) + isnull(sb.sales, 0) sales
        , dlt delta
        , case when dlt = 0 then 0
               else case when dlt > 0 then (dlt - 1) / 7 + 1
                         else (dlt + 1) / 7 - 1
                    end
          end wk
    from shopA sa
    full join shopB sb on sa.delta = sb.delta
    cross apply (
        select dlt = isnull(sa.delta, sb.delta)
    ) tmp
) t
group by wk;
Finally, if you already have a query which produces a dataset with the (pr_date, sales) columns:
select wk, sum(sales)
from (
    select sales
        , case when pr_date = 0 then 0
               else case when pr_date > 0 then (pr_date - 1) / 7 + 1
                         else (pr_date + 1) / 7 - 1
                    end
          end wk
    from (
        -- ... your query here ...
    ) pr_date_sales
) t
group by wk;
I think you just need to take the day difference and use arithmetic. Using datediff() with week counts week-boundaries -- which is not what you want. That is, it normalizes the weeks to calendar weeks.
You want to leave out the day of the promotion, which makes this a wee bit more complicated.
I think this is the logic:
Select v.week_diff, sum(sales)
from tbl_sales s cross apply
     (values (case when change_date < sales_date
                   then (datediff(day, change_date, sales_date) + 1) / 7
                   else (datediff(day, change_date, sales_date) - 1) / 7
              end)
     ) v(week_diff)
where change_date <> sales_date
group by v.week_diff;
There might be an off-by-one problem, depending on what you really want to do when the dates are the same.
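The delta-to-week arithmetic used in the answers above can be checked in isolation. Note that T-SQL integer division truncates toward zero while Python's // floors, so this sketch rewrites the negative branch; the bucket values themselves match (delta - 1) / 7 + 1 and (delta + 1) / 7 - 1 from the SQL:

```python
def week_bucket(delta):
    # Map a day offset from the pr-date (day 0 excluded) to a week bucket:
    # days 1..7 -> week 1, days 8..14 -> week 2, days -1..-7 -> week -1, etc.
    # Mirrors the T-SQL expressions under truncating integer division.
    if delta > 0:
        return (delta - 1) // 7 + 1
    return -((-delta - 1) // 7 + 1)

print([week_bucket(d) for d in (1, 7, 8, -1, -7, -8, -28, 28)])
# [1, 1, 2, -1, -1, -2, -4, 4]
```

With this mapping each bucket covers exactly seven deltas, which is what produces the 4-before/4-after weeks in the desired result instead of the nine calendar-boundary weeks DATEDIFF(wk, ...) returns.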

Comparing Data by time in Sql

I have the data
SELECT [Dt]
,x1
,[SaleCount]
,x2
,x3
,action
FROM table
action is a flag (0 = no action, 1 = action).
The essence of the matter is as follows:
for example, the group
x1 + x2 + x3
1   2   2017
may have an action, and there are several options for how this group influences another group,
such as the group
x1 + x2 + x3
2   3   2017
I need to restructure the data so that certain combinations are marked:
there is an action on group 1 + 2 + 2017 and no action on group 2 + 3 + 2017 (marker "no");
or there is an action on group 1 + 2 + 2017, but group 2 + 3 + 2017 also had an action BEFORE the action for group 1 + 2 + 2017 (marker "before"),
that is, the 1s in the action column for 2 + 3 + 2017 come before the 1s in the action column for group 1 + 2 + 2017;
or there is an action on group 1 + 2 + 2017, but group 2 + 3 + 2017 had an action AFTER the action for group 1 + 2 + 2017 (marker "after"),
that is, the 1s in the action column for 2 + 3 + 2017 come after the 1s in the action column for group 1 + 2 + 2017.
So I need to pick out the combinations of actions corresponding to these conditions.
data sample
set dateformat dmy
Declare @t table
(Dt date,
x1 int,
x2 int,
x3 int,
Sale int,
action int,
typegroup varchar(20));
insert into @t values
('23.07.2018',1,2,2017,1,0,'basis'),
('24.07.2018',1,2,2017,2,0,'basis'),
('25.07.2018',1,2,2017,3,0,'basis'),
('26.07.2018',1,2,2017,4,0,'basis'),
('27.07.2018',1,2,2017,5,0,'basis'),
('28.07.2018',1,2,2017,6,0,'basis'),
('29.07.2018',1,2,2017,7,0,'basis'),
('30.07.2018',1,2,2017,8,0,'basis'),
('31.07.2018',1,2,2017,9,0,'basis'),
('01.08.2018',1,2,2017,10,0,'basis'),
('02.08.2018',1,2,2017,11,0,'basis'),
('03.08.2018',1,2,2017,12,1,'basis'),
('04.08.2018',1,2,2017,13,1,'basis'),
('05.08.2018',1,2,2017,14,1,'basis'),
('06.08.2018',1,2,2017,15,1,'basis'),
('07.08.2018',1,2,2017,16,1,'basis'),
('08.08.2018',1,2,2017,17,1,'basis'),
('09.08.2018',1,2,2017,18,1,'basis'),
('10.08.2018',1,2,2017,19,1,'basis'),
('11.08.2018',1,2,2017,20,1,'basis'),
('12.08.2018',1,2,2017,21,1,'basis'),
('13.08.2018',1,2,2017,22,1,'basis'),
('14.08.2018',1,2,2017,23,1,'basis'),
('15.08.2018',1,2,2017,24,1,'basis'),
('16.08.2018',1,2,2017,25,1,'basis'),
('17.08.2018',1,2,2017,26,1,'basis'),
('18.08.2018',1,2,2017,27,0,'basis'),
('19.08.2018',1,2,2017,28,0,'basis'),
('20.08.2018',1,2,2017,29,0,'basis'),
('21.08.2018',1,2,2017,30,0,'basis'),
('22.08.2018',1,2,2017,31,0,'basis'),
('23.08.2018',1,2,2017,32,0,'basis'),
('24.08.2018',1,2,2017,33,0,'basis'),
('25.08.2018',1,2,2017,34,0,'basis'),
('23.07.2018',2,3,2017,1,0,'no'),
('24.07.2018',2,3,2017,2,0,'no'),
('25.07.2018',2,3,2017,3,0,'no'),
('26.07.2018',2,3,2017,4,0,'no'),
('27.07.2018',2,3,2017,5,0,'no'),
('28.07.2018',2,3,2017,6,0,'no'),
('29.07.2018',2,3,2017,7,0,'no'),
('30.07.2018',2,3,2017,8,0,'no'),
('31.07.2018',2,3,2017,9,0,'no'),
('01.08.2018',2,3,2017,10,0,'no'),
('02.08.2018',2,3,2017,11,0,'no'),
('03.08.2018',2,3,2017,12,0,'no'),
('04.08.2018',2,3,2017,13,0,'no'),
('05.08.2018',2,3,2017,14,0,'no'),
('06.08.2018',2,3,2017,15,0,'no'),
('07.08.2018',2,3,2017,16,0,'no'),
('08.08.2018',2,3,2017,17,0,'no'),
('09.08.2018',2,3,2017,18,0,'no'),
('10.08.2018',2,3,2017,19,0,'no'),
('11.08.2018',2,3,2017,20,0,'no'),
('12.08.2018',2,3,2017,21,0,'no'),
('13.08.2018',2,3,2017,22,0,'no'),
('14.08.2018',2,3,2017,23,0,'no'),
('15.08.2018',2,3,2017,24,0,'no'),
('16.08.2018',2,3,2017,25,0,'no'),
('17.08.2018',2,3,2017,26,0,'no'),
('18.08.2018',2,3,2017,27,0,'no'),
('19.08.2018',2,3,2017,28,0,'no'),
('20.08.2018',2,3,2017,29,0,'no'),
('21.08.2018',2,3,2017,30,0,'no'),
('22.08.2018',2,3,2017,31,0,'no'),
('23.08.2018',2,3,2017,32,0,'no'),
('24.08.2018',2,3,2017,33,0,'no'),
('25.08.2018',2,3,2017,34,0,'no'),
('23.07.2018',3,4,2017,1,1,'before'),
('24.07.2018',3,4,2017,2,1,'before'),
('25.07.2018',3,4,2017,3,1,'before'),
('26.07.2018',3,4,2017,4,1,'before'),
('27.07.2018',3,4,2017,5,1,'before'),
('28.07.2018',3,4,2017,6,1,'before'),
('29.07.2018',3,4,2017,7,1,'before'),
('30.07.2018',3,4,2017,8,1,'before'),
('31.07.2018',3,4,2017,9,1,'before'),
('01.08.2018',3,4,2017,10,1,'before'),
('02.08.2018',3,4,2017,11,0,'before'),
('03.08.2018',3,4,2017,12,0,'before'),
('04.08.2018',3,4,2017,13,0,'before'),
('05.08.2018',3,4,2017,14,0,'before'),
('06.08.2018',3,4,2017,15,0,'before'),
('07.08.2018',3,4,2017,16,0,'before'),
('08.08.2018',3,4,2017,17,0,'before'),
('09.08.2018',3,4,2017,18,0,'before'),
('10.08.2018',3,4,2017,19,0,'before'),
('11.08.2018',3,4,2017,20,0,'before'),
('12.08.2018',3,4,2017,21,0,'before'),
('13.08.2018',3,4,2017,22,0,'before'),
('14.08.2018',3,4,2017,23,0,'before'),
('15.08.2018',3,4,2017,24,0,'before'),
('16.08.2018',3,4,2017,25,0,'before'),
('17.08.2018',3,4,2017,26,0,'before'),
('18.08.2018',3,4,2017,27,0,'before'),
('19.08.2018',3,4,2017,28,0,'before'),
('20.08.2018',3,4,2017,29,0,'before'),
('21.08.2018',3,4,2017,30,0,'before'),
('22.08.2018',3,4,2017,31,0,'before'),
('23.08.2018',3,4,2017,32,0,'before'),
('24.08.2018',3,4,2017,33,0,'before');
I compare by time, i.e. over the same time window I look for all groups that meet the above conditions.
In this example, for group 1 + 2 + 2017:
group 2 + 3 + 2017 had no action,
and group 3 + 4 + 2017 had an action before the action for 1 + 2 + 2017 started,
and nothing more.
NOW let's take the next group, for example 3 + 4 + 2017, look at the time when it had an action, and see how it relates to the other groups over the same time under the specified conditions. I.e. 3 + 4 + 2017 becomes the basis.
How to do it?
For each group, markers must be generated.
The basis is the group for which we are looking for comparisons,
and everything it is compared with is marked
"no", "before", or "after", depending on which combination of groups in time SQL finds.
In other words, there can be very many such recombinations of groups with each other.
I.e. relative to one group, 1 + 2 + 2017 may be the basis, while relative to another, for example 10 + 10 + 2017, there may be no action relationship at all.
Here I found a good solution: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/d891c693-2d38-4064-9784-4b21cd2fca11/comparing-data-by-time-in-sql?forum=transactsql
It works.
set dateformat dmy
go
Declare @t table
(Dt date,
x1 int,
x2 int,
x3 int,
sale int,
action int,
typegroup varchar(20));
insert into @t values
('23.07.2018',1,2,2017,1,0,''),
('24.07.2018',1,2,2017,2,0,''),
('25.07.2018',1,2,2017,3,0,''),
('26.07.2018',1,2,2017,4,0,''),
('27.07.2018',1,2,2017,5,0,''),
('28.07.2018',1,2,2017,6,0,''),
('29.07.2018',1,2,2017,7,0,''),
('30.07.2018',1,2,2017,8,0,''),
('31.07.2018',1,2,2017,9,0,''),
('01.08.2018',1,2,2017,10,0,''),
('02.08.2018',1,2,2017,11,0,''),
('03.08.2018',1,2,2017,12,1,''),
('04.08.2018',1,2,2017,13,1,''),
('05.08.2018',1,2,2017,14,1,''),
('06.08.2018',1,2,2017,15,1,''),
('07.08.2018',1,2,2017,16,1,''),
('08.08.2018',1,2,2017,17,1,''),
('09.08.2018',1,2,2017,18,1,''),
('10.08.2018',1,2,2017,19,1,''),
('11.08.2018',1,2,2017,20,1,''),
('12.08.2018',1,2,2017,21,1,''),
('13.08.2018',1,2,2017,22,1,''),
('14.08.2018',1,2,2017,23,1,''),
('15.08.2018',1,2,2017,24,1,''),
('16.08.2018',1,2,2017,25,1,''),
('17.08.2018',1,2,2017,26,1,''),
('18.08.2018',1,2,2017,27,0,''),
('19.08.2018',1,2,2017,28,0,''),
('20.08.2018',1,2,2017,29,0,''),
('21.08.2018',1,2,2017,30,0,''),
('22.08.2018',1,2,2017,31,0,''),
('23.08.2018',1,2,2017,32,0,''),
('24.08.2018',1,2,2017,33,0,''),
('25.08.2018',1,2,2017,34,0,''),
('23.07.2018',2,3,2017,1,0,''),
('24.07.2018',2,3,2017,2,0,''),
('25.07.2018',2,3,2017,3,0,''),
('26.07.2018',2,3,2017,4,0,''),
('27.07.2018',2,3,2017,5,0,''),
('28.07.2018',2,3,2017,6,0,''),
('29.07.2018',2,3,2017,7,0,''),
('30.07.2018',2,3,2017,8,0,''),
('31.07.2018',2,3,2017,9,0,''),
('01.08.2018',2,3,2017,10,0,''),
('02.08.2018',2,3,2017,11,0,''),
('03.08.2018',2,3,2017,12,0,''),
('04.08.2018',2,3,2017,13,0,''),
('05.08.2018',2,3,2017,14,0,''),
('06.08.2018',2,3,2017,15,0,''),
('07.08.2018',2,3,2017,16,0,''),
('08.08.2018',2,3,2017,17,0,''),
('09.08.2018',2,3,2017,18,0,''),
('10.08.2018',2,3,2017,19,0,''),
('11.08.2018',2,3,2017,20,0,''),
('12.08.2018',2,3,2017,21,0,''),
('13.08.2018',2,3,2017,22,0,''),
('14.08.2018',2,3,2017,23,0,''),
('15.08.2018',2,3,2017,24,0,''),
('16.08.2018',2,3,2017,25,0,''),
('17.08.2018',2,3,2017,26,0,''),
('18.08.2018',2,3,2017,27,0,''),
('19.08.2018',2,3,2017,28,0,''),
('20.08.2018',2,3,2017,29,0,''),
('21.08.2018',2,3,2017,30,0,''),
('22.08.2018',2,3,2017,31,0,''),
('23.08.2018',2,3,2017,32,0,''),
('24.08.2018',2,3,2017,33,0,''),
('25.08.2018',2,3,2017,34,0,''),
('23.07.2018',3,4,2017,1,1,''),
('24.07.2018',3,4,2017,2,1,''),
('25.07.2018',3,4,2017,3,1,''),
('26.07.2018',3,4,2017,4,1,''),
('27.07.2018',3,4,2017,5,1,''),
('28.07.2018',3,4,2017,6,1,''),
('29.07.2018',3,4,2017,7,1,''),
('30.07.2018',3,4,2017,8,1,''),
('31.07.2018',3,4,2017,9,1,''),
('01.08.2018',3,4,2017,10,1,''),
('02.08.2018',3,4,2017,11,0,''),
('03.08.2018',3,4,2017,12,0,''),
('04.08.2018',3,4,2017,13,0,''),
('05.08.2018',3,4,2017,14,0,''),
('06.08.2018',3,4,2017,15,0,''),
('07.08.2018',3,4,2017,16,0,''),
('08.08.2018',3,4,2017,17,0,''),
('09.08.2018',3,4,2017,18,0,''),
('10.08.2018',3,4,2017,19,0,''),
('11.08.2018',3,4,2017,20,0,''),
('12.08.2018',3,4,2017,21,0,''),
('13.08.2018',3,4,2017,22,0,''),
('14.08.2018',3,4,2017,23,0,''),
('15.08.2018',3,4,2017,24,0,''),
('16.08.2018',3,4,2017,25,0,''),
('17.08.2018',3,4,2017,26,0,''),
('18.08.2018',3,4,2017,27,1,''),
('19.08.2018',3,4,2017,28,1,''),
('20.08.2018',3,4,2017,29,1,''),
('21.08.2018',3,4,2017,30,1,''),
('22.08.2018',3,4,2017,31,1,''),
('23.08.2018',3,4,2017,32,1,''),
('24.08.2018',3,4,2017,33,1,'');
declare @x1 int,
@x2 int,
@x3 int, @mindt date, @maxdt date
--pass any group values here
select @x1 = 1, @x2 = 2, @x3 = 2017
Select @mindt = min(Dt), @maxdt = max(Dt)
from @t
where x1 = @x1
and x2 = @x2
and x3 = @x3
and action = 1
update r
set typegroup = type
from (select *,
case when x1 = @x1 and x2 = @x2 and x3 = @x3 then 'basis'
when action = 1 and max(Dt) over (partition by nxt) > coalesce(@maxdt,'99991231') then 'after'
when action = 1 and min(Dt) over (partition by nxt) < coalesce(@mindt,'19000101') then 'before'
end as type
from @t t
outer apply
(
Select min(Dt) as nxt
from @t
where x1 = t.x1
and x2 = t.x2
and x3 = t.x3
and action <> t.action
and Dt > t.Dt
)t1)r
select *
from @t
order by x1,x2,x3,sale
/*
Output
-----------------------------------------------------------
Dt x1 x2 x3 sale action typegroup
--------------------------------------------------------------------------
2018-07-23 1 2 2017 1 0 basis
2018-07-24 1 2 2017 2 0 basis
2018-07-25 1 2 2017 3 0 basis
2018-07-26 1 2 2017 4 0 basis
2018-07-27 1 2 2017 5 0 basis
2018-07-28 1 2 2017 6 0 basis
2018-07-29 1 2 2017 7 0 basis
2018-07-30 1 2 2017 8 0 basis
2018-07-31 1 2 2017 9 0 basis
2018-08-01 1 2 2017 10 0 basis
2018-08-02 1 2 2017 11 0 basis
2018-08-03 1 2 2017 12 1 basis
2018-08-04 1 2 2017 13 1 basis
2018-08-05 1 2 2017 14 1 basis
2018-08-06 1 2 2017 15 1 basis
2018-08-07 1 2 2017 16 1 basis
2018-08-08 1 2 2017 17 1 basis
2018-08-09 1 2 2017 18 1 basis
2018-08-10 1 2 2017 19 1 basis
2018-08-11 1 2 2017 20 1 basis
2018-08-12 1 2 2017 21 1 basis
2018-08-13 1 2 2017 22 1 basis
2018-08-14 1 2 2017 23 1 basis
2018-08-15 1 2 2017 24 1 basis
2018-08-16 1 2 2017 25 1 basis
2018-08-17 1 2 2017 26 1 basis
2018-08-18 1 2 2017 27 0 basis
2018-08-19 1 2 2017 28 0 basis
2018-08-20 1 2 2017 29 0 basis
2018-08-21 1 2 2017 30 0 basis
2018-08-22 1 2 2017 31 0 basis
2018-08-23 1 2 2017 32 0 basis
2018-08-24 1 2 2017 33 0 basis
2018-08-25 1 2 2017 34 0 basis
2018-07-23 2 3 2017 1 0 NULL
2018-07-24 2 3 2017 2 0 NULL
2018-07-25 2 3 2017 3 0 NULL
2018-07-26 2 3 2017 4 0 NULL
2018-07-27 2 3 2017 5 0 NULL
2018-07-28 2 3 2017 6 0 NULL
2018-07-29 2 3 2017 7 0 NULL
2018-07-30 2 3 2017 8 0 NULL
2018-07-31 2 3 2017 9 0 NULL
2018-08-01 2 3 2017 10 0 NULL
2018-08-02 2 3 2017 11 0 NULL
2018-08-03 2 3 2017 12 0 NULL
2018-08-04 2 3 2017 13 0 NULL
2018-08-05 2 3 2017 14 0 NULL
2018-08-06 2 3 2017 15 0 NULL
2018-08-07 2 3 2017 16 0 NULL
2018-08-08 2 3 2017 17 0 NULL
2018-08-09 2 3 2017 18 0 NULL
2018-08-10 2 3 2017 19 0 NULL
2018-08-11 2 3 2017 20 0 NULL
2018-08-12 2 3 2017 21 0 NULL
2018-08-13 2 3 2017 22 0 NULL
2018-08-14 2 3 2017 23 0 NULL
2018-08-15 2 3 2017 24 0 NULL
2018-08-16 2 3 2017 25 0 NULL
2018-08-17 2 3 2017 26 0 NULL
2018-08-18 2 3 2017 27 0 NULL
2018-08-19 2 3 2017 28 0 NULL
2018-08-20 2 3 2017 29 0 NULL
2018-08-21 2 3 2017 30 0 NULL
2018-08-22 2 3 2017 31 0 NULL
2018-08-23 2 3 2017 32 0 NULL
2018-08-24 2 3 2017 33 0 NULL
2018-08-25 2 3 2017 34 0 NULL
2018-07-23 3 4 2017 1 1 before
2018-07-24 3 4 2017 2 1 before
2018-07-25 3 4 2017 3 1 before
2018-07-26 3 4 2017 4 1 before
2018-07-27 3 4 2017 5 1 before
2018-07-28 3 4 2017 6 1 before
2018-07-29 3 4 2017 7 1 before
2018-07-30 3 4 2017 8 1 before
2018-07-31 3 4 2017 9 1 before
2018-08-01 3 4 2017 10 1 before
2018-08-02 3 4 2017 11 0 NULL
2018-08-03 3 4 2017 12 0 NULL
2018-08-04 3 4 2017 13 0 NULL
2018-08-05 3 4 2017 14 0 NULL
2018-08-06 3 4 2017 15 0 NULL
2018-08-07 3 4 2017 16 0 NULL
2018-08-08 3 4 2017 17 0 NULL
2018-08-09 3 4 2017 18 0 NULL
2018-08-10 3 4 2017 19 0 NULL
2018-08-11 3 4 2017 20 0 NULL
2018-08-12 3 4 2017 21 0 NULL
2018-08-13 3 4 2017 22 0 NULL
2018-08-14 3 4 2017 23 0 NULL
2018-08-15 3 4 2017 24 0 NULL
2018-08-16 3 4 2017 25 0 NULL
2018-08-17 3 4 2017 26 0 NULL
2018-08-18 3 4 2017 27 1 after
2018-08-19 3 4 2017 28 1 after
2018-08-20 3 4 2017 29 1 after
2018-08-21 3 4 2017 30 1 after
2018-08-22 3 4 2017 31 1 after
2018-08-23 3 4 2017 32 1 after
2018-08-24 3 4 2017 33 1 after
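Stripped of the SQL mechanics, the marker logic amounts to comparing each group's action dates against the basis group's action window. A simplified per-group Python sketch (the dict-of-date-lists input and the 'during' label are assumptions for illustration; the real solution above labels each run of actions separately, which is why group 3 + 4 + 2017 carries both 'before' and 'after'):

```python
def label_groups(action_dates, basis_key):
    # action_dates: {group_key: list of ISO date strings where action = 1}
    # Label each group relative to the basis group's action window:
    # 'before' if it acted before the window opened, 'after' if it acted
    # after the window closed, 'no' if it never acted at all.
    basis = action_dates[basis_key]
    lo, hi = min(basis), max(basis)
    labels = {}
    for key, dates in action_dates.items():
        if key == basis_key:
            labels[key] = 'basis'
        elif not dates:
            labels[key] = 'no'
        elif min(dates) < lo:
            labels[key] = 'before'
        elif max(dates) > hi:
            labels[key] = 'after'
        else:
            labels[key] = 'during'  # acted entirely inside the basis window
    return labels

groups = {
    (1, 2, 2017): ['2018-08-03', '2018-08-17'],
    (2, 3, 2017): [],
    (3, 4, 2017): ['2018-07-23', '2018-08-01'],
}
print(label_groups(groups, (1, 2, 2017)))
# {(1, 2, 2017): 'basis', (2, 3, 2017): 'no', (3, 4, 2017): 'before'}
```

ISO-formatted date strings compare correctly as plain strings, which keeps the sketch free of date parsing.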

Moving sum over date range

I have this table with a wide range of dates and a corresponding value for each of those dates; an example is shown below.
Date Value
6/01/2013 8
6/02/2013 4
6/03/2013 1
6/04/2013 7
6/05/2013 1
6/06/2013 1
6/07/2013 3
6/08/2013 8
6/09/2013 4
6/10/2013 2
6/11/2013 10
6/12/2013 4
6/13/2013 7
6/14/2013 3
6/15/2013 2
6/16/2013 1
6/17/2013 7
6/18/2013 5
6/19/2013 1
6/20/2013 4
What I am trying to do is create a query that adds a new column displaying the sum of the Value column over a specified date range. In the example below, the Sum column contains the sum for its corresponding date going back one full week; so the Sum for 6/09/2013 is the sum of the values from 6/03/2013 to 6/09/2013.
Date Sum
6/01/2013 8
6/02/2013 12
6/03/2013 13
6/04/2013 20
6/05/2013 21
6/06/2013 22
6/07/2013 25
6/08/2013 25
6/09/2013 25
6/10/2013 26
6/11/2013 29
6/12/2013 32
6/13/2013 38
6/14/2013 38
6/15/2013 32
6/16/2013 29
6/17/2013 34
6/18/2013 29
6/19/2013 26
6/20/2013 23
I've tried using the LIMIT clause but could not get it to work; any help would be greatly appreciated.
zoo has a function rollapply which can do what you need:
z <- zoo(x$Value, order.by=x$Date)
rollapply(z, width = 7, FUN = sum, partial = TRUE, align = "right")
## 2013-06-01 8
## 2013-06-02 12
## 2013-06-03 13
## 2013-06-04 20
## 2013-06-05 21
## 2013-06-06 22
## 2013-06-07 25
## 2013-06-08 25
## 2013-06-09 25
## 2013-06-10 26
## 2013-06-11 29
## 2013-06-12 32
## 2013-06-13 38
## 2013-06-14 38
## 2013-06-15 32
## 2013-06-16 29
## 2013-06-17 34
## 2013-06-18 29
## 2013-06-19 26
## 2013-06-20 23
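The rollapply call above (width = 7, sum, partial = TRUE, align = "right") can be cross-checked with a few lines of plain Python over the question's Value column:

```python
def rolling_sum(values, width=7):
    # Right-aligned rolling sum with partial windows at the start,
    # matching rollapply(z, width = 7, sum, partial = TRUE, align = "right")
    # when the dates are consecutive.
    return [sum(values[max(0, i - width + 1):i + 1]) for i in range(len(values))]

vals = [8, 4, 1, 7, 1, 1, 3, 8, 4, 2, 10, 4, 7, 3, 2, 1, 7, 5, 1, 4]
print(rolling_sum(vals)[:7])   # [8, 12, 13, 20, 21, 22, 25]
print(rolling_sum(vals)[-1])   # 23
```

Note this index-based sketch assumes one row per calendar day, as in the sample; zoo's version windows by the date index itself, so it also handles gaps.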
Using data.table
require(data.table)
# Build some sample data
data <- data.table(Date = 1:20, Value = rpois(20, 10))
# Build reference table
Ref <- data[, list(Compare_Value = list(I(Value)), Compare_Date = list(I(Date)))]
# Use lapply to sum Value over the trailing window for each date
# (note: d <= 0 & d >= -7 keeps 8 days; use d > -7 for a strict 7-day week)
data[, Roll.Val := lapply(Date, function(x) {
    d <- as.numeric(Ref$Compare_Date[[1]] - x)
    sum((d <= 0 & d >= -7) * Ref$Compare_Value[[1]])
})]
head(data, 10)
Date Value Roll.Val
1: 1 14 14
2: 2 7 21
3: 3 9 30
4: 4 5 35
5: 5 10 45
6: 6 10 55
7: 7 15 70
8: 8 14 84
9: 9 8 78
10: 10 12 83
Here is another solution if anyone is interested:
library("devtools")
install_github("boRingTrees","mgahan")
require(boRingTrees)
rollingByCalcs(data,dates="Date",target="Value",stat=sum,lower=0,upper=7)
Here is one way of doing it
> input <- read.table(text = "Date Value
+ 6/01/2013 8
+ 6/02/2013 4
+ 6/03/2013 1
+ 6/04/2013 7
+ 6/05/2013 1
+ 6/06/2013 1
+ 6/07/2013 3
+ 6/08/2013 8
+ 6/09/2013 4
+ 6/10/2013 2
+ 6/11/2013 10
+ 6/12/2013 4
+ 6/13/2013 7
+ 6/14/2013 3
+ 6/15/2013 2
+ 6/16/2013 1
+ 6/17/2013 7
+ 6/18/2013 5
+ 6/19/2013 1
+ 6/20/2013 4 ", as.is = TRUE, header = TRUE)
> input$Date <- as.Date(input$Date, format = "%m/%d/%Y") # convert Date
>
> # create a sequence that goes a week back from the current data
> x <- data.frame(Date = seq(min(input$Date) - 6, max(input$Date), by = '1 day'))
>
> # merge
> merged <- merge(input, x, all = TRUE)
>
> # replace NAs with zero
> merged$Value[is.na(merged$Value)] <- 0L
>
> # use 'filter' for the running sum and delete first 6
> input$Sum <- filter(merged$Value, rep(1, 7), sides = 1)[-(1:6)]
> input
Date Value Sum
1 2013-06-01 8 8
2 2013-06-02 4 12
3 2013-06-03 1 13
4 2013-06-04 7 20
5 2013-06-05 1 21
6 2013-06-06 1 22
7 2013-06-07 3 25
8 2013-06-08 8 25
9 2013-06-09 4 25
10 2013-06-10 2 26
11 2013-06-11 10 29
12 2013-06-12 4 32
13 2013-06-13 7 38
14 2013-06-14 3 38
15 2013-06-15 2 32
16 2013-06-16 1 29
17 2013-06-17 7 34
18 2013-06-18 5 29
19 2013-06-19 1 26
20 2013-06-20 4 23
>

How to verify whether records exist for the last x calendar days in SQL without using the BETWEEN keyword

I want to verify whether my table has records for the last 6 consecutive days in SQL.
SNO FLIGHT_DATE LANDINGS
45 9/1/2013 1
31 10/1/2013 1
32 11/1/2013 1
30 11/24/2013 1
27 11/25/2013 1
28 11/26/2013 1
29 11/26/2013 1
33 11/26/2013 1
26 11/30/2013 1
25 12/1/2013 1
34 12/1/2013 1
24 12/2/2013 1
35 12/3/2013 1
36 12/3/2013 1
44 12/4/2013 1
46 12/6/2013 1
47 12/6/2013 1
Is this what you want?
SELECT *
FROM Table1
WHERE FLIGHT_DATE > dateadd(day, -6, datediff(day, 0, getdate()))
  AND FLIGHT_DATE < GETDATE();
SQL FIDDLE
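The query above only fetches the rows in the window; actually verifying that all 6 days are present is a set-coverage check. A minimal Python sketch of that check (the helper name and the fixed "today" are assumptions for reproducibility):

```python
from datetime import date, timedelta

def covers_last_n_days(flight_dates, n, today):
    # True only when every calendar day in the n-day window ending
    # yesterday (today - 1 back to today - n) has at least one record.
    needed = {today - timedelta(days=i) for i in range(1, n + 1)}
    return needed <= set(flight_dates)

# Against the sample: December records exist for 12/1..12/4 and 12/6,
# so as of 12/7 the missing 12/5 breaks the 6-day run.
have = {date(2013, 12, d) for d in (1, 2, 3, 4, 6)}
print(covers_last_n_days(have, 6, date(2013, 12, 7)))  # False
print(covers_last_n_days(have, 1, date(2013, 12, 7)))  # True
```

In SQL the equivalent idea is usually COUNT(DISTINCT date) over the window compared against n, which likewise avoids BETWEEN.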