Update fields in table based on previous field records - sql-server-2012

I have a table of pay period records, with WeekStart and WeekEnd fields populated for every fiscal year. In my application, a user should be able to update the first WeekEnd date for a given fiscal year, and that should in turn update each subsequent record's WeekStart date by adding 1 day to the previous record's WeekEnd date, and the same record's WeekEnd date by adding 13 days to the new WeekStart date.
This is part of a Stored Procedure written in SQL Server 2016.
UPDATE [dbo].[staffing_BiweeklyPPCopy]
SET
    [WeekStart] = DATEADD(DD, 1, LAG([WeekEnd], 1) OVER (ORDER BY [ID])),
    [WeekEnd]   = DATEADD(DD, 14, LAG([WeekEnd], 1) OVER (ORDER BY [ID]))
WHERE
    [FiscalYear] = @fiscalyear
Original Table contents shown below...
ID WeekStart WeekEnd
163 2018-10-01 2018-10-13
164 2018-10-14 2018-10-27
165 2018-10-28 2018-11-10
166 2018-11-11 2018-11-24
167 2018-11-25 2018-12-08
168 2018-12-09 2018-12-22
169 2018-12-23 2019-01-05
170 2019-01-06 2019-01-19
171 2019-01-20 2019-02-02
172 2019-02-03 2019-02-16
173 2019-02-17 2019-03-02
174 2019-03-03 2019-03-16
175 2019-03-17 2019-03-30
176 2019-03-31 2019-04-13
177 2019-04-14 2019-04-27
178 2019-04-28 2019-05-11
179 2019-05-12 2019-05-25
180 2019-05-26 2019-06-08
181 2019-06-09 2019-06-22
182 2019-06-23 2019-07-06
183 2019-07-07 2019-07-20
184 2019-07-21 2019-08-03
185 2019-08-04 2019-08-17
186 2019-08-18 2019-08-31
187 2019-09-01 2019-09-14
188 2019-09-15 2019-09-28
189 2019-09-29 2019-09-30
For example, if a user updates the WeekEnd date for record ID 163 to '2018-10-14', the table should update as follows:
ID WeekStart WeekEnd
163 2018-10-01 2018-10-14
164 2018-10-15 2018-10-28
165 2018-10-29 2018-11-11
166 2018-11-12 2018-11-25
167 2018-11-26 2018-12-09
.
.
.
189 2019-09-30 2019-09-30
Thank you in advance.
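One possible approach, sketched below under a few assumptions (the first WeekEnd of the fiscal year already holds the user's new value, every later period stays exactly 14 days long, and ordering by [ID] matches the period order): window functions such as LAG cannot be used directly in an UPDATE's SET clause, but they can be computed in a CTE and the base table updated through that CTE. Note that a single-pass UPDATE with LAG over the pre-update values would only shift the row immediately after the edited one, which is why the arithmetic below is driven entirely by the first WeekEnd.

-- Minimal sketch, not a drop-in solution. Assumes the first WeekEnd already
-- holds the user's new value and all later periods are exactly 14 days.
-- The shortened final period (capped at the fiscal year end) would still
-- need a separate adjustment.
;WITH p AS (
    SELECT [ID], [WeekStart], [WeekEnd],
           ROW_NUMBER()           OVER (ORDER BY [ID]) AS rn,
           FIRST_VALUE([WeekEnd]) OVER (ORDER BY [ID]) AS FirstWeekEnd
    FROM [dbo].[staffing_BiweeklyPPCopy]
    WHERE [FiscalYear] = @fiscalyear
)
UPDATE p
SET [WeekStart] = DATEADD(DAY, 14 * (rn - 2) + 1, FirstWeekEnd),
    [WeekEnd]   = DATEADD(DAY, 14 * (rn - 1),     FirstWeekEnd)
WHERE rn > 1;

For the sample above this yields 164 = 2018-10-15 to 2018-10-28, 165 = 2018-10-29 to 2018-11-11, and so on.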

Related

Find Maximum Value in Column Pandas

I have a data frame like this (machine vibration data):
   datetime                 tagid  value  quality
0  2021-03-01 13:43:41.440  B42      345      192
1  2021-03-01 13:43:41.440  B43      958      192
2  2021-03-01 13:43:41.440  B44      993      192
3  2021-03-01 13:43:41.440  B45     1224      192
4  2021-03-01 13:43:43.527  B188    6665      192
5  2021-03-01 13:43:43.527  B189    7162      192
6  2021-03-01 13:43:43.527  B190    7193      192
7  2021-03-01 13:43:43.747  C29     2975      192
8  2021-03-01 13:43:43.747  C30     4445      192
9  2021-03-01 13:43:43.747  C31     4015      192
I want to convert this to the hourly maximum value for each tag id.
Sample output:
datetime          tagid  value  quality
01-03-2021 13:00  C91     3982      192
01-03-2021 14:00  C91     3972      192
01-03-2021 13:00  C92     9000      192
01-03-2021 14:00  C92     9972      192
01-03-2021 13:00  B42      396      192
01-03-2021 14:00  B42      370      192
01-03-2021 15:00  B42      370      192
I tried with Grouper, but couldn't get the output I wanted.
Use Grouper with aggregate max:
df = df.groupby([pd.Grouper(freq='H', key='datetime'), 'tagid']).max().reset_index()

MSSQL MAX returns all results?

I have tried the following query to return the highest P.MaxValue for each ME.Name from the last day, between 06:00 and 18:00:
SELECT MAX(P.MaxValue) AS Value, P.DateTime, ME.Name AS ID
FROM vManagedEntity AS ME
INNER JOIN Perf.vPerfHourly AS P ON ME.ManagedEntityRowId = P.ManagedEntityRowId
INNER JOIN vPerformanceRuleInstance AS PRI ON P.PerformanceRuleInstanceRowId = PRI.PerformanceRuleInstanceRowId
INNER JOIN vPerformanceRule AS PR ON PRI.RuleRowId = PR.RuleRowId
WHERE (ME.ManagedEntityTypeRowId = 2546)
  AND (PR.ObjectName = 'VMGuest-cpu')
  AND (PR.CounterName LIKE 'cpuUsageMHz')
  AND (CAST(P.DateTime AS time) >= '06:00:00' AND CAST(P.DateTime AS time) <= '18:00:00')
  AND (P.DateTime > DATEADD(day, -1, GETUTCDATE()))
GROUP BY ME.Name, P.DateTime
ORDER BY ID
but it seems to return every MaxValue for each ID instead of just the highest, like:
Value DateTime ID
55 2018-02-19 12:00:00.000 bob:vm-100736
51 2018-02-19 13:00:00.000 bob:vm-100736
53 2018-02-19 14:00:00.000 bob:vm-100736
52 2018-02-19 15:00:00.000 bob:vm-100736
52 2018-02-19 16:00:00.000 bob:vm-100736
51 2018-02-19 17:00:00.000 bob:vm-100736
54 2018-02-19 18:00:00.000 bob:vm-100736
51 2018-02-20 06:00:00.000 bob:vm-100736
51 2018-02-20 07:00:00.000 bob:vm-100736
53 2018-02-20 08:00:00.000 bob:vm-100736
52 2018-02-20 09:00:00.000 bob:vm-100736
78 2018-02-19 12:00:00.000 bob:vm-101
82 2018-02-19 13:00:00.000 bob:vm-101
79 2018-02-19 14:00:00.000 bob:vm-101
78 2018-02-19 15:00:00.000 bob:vm-101
79 2018-02-19 16:00:00.000 bob:vm-101
77 2018-02-19 17:00:00.000 bob:vm-101
82 2018-02-19 18:00:00.000 bob:vm-101
82 2018-02-20 06:00:00.000 bob:vm-101
79 2018-02-20 07:00:00.000 bob:vm-101
81 2018-02-20 08:00:00.000 bob:vm-101
82 2018-02-20 09:00:00.000 bob:vm-101
155 2018-02-19 12:00:00.000 bob:vm-104432
There is one value per hour for each ID, hence twelve results for each ID.
Does MAX not work the way I want?
Thanks
The expected output would look like this:
Value DateTime ID
55 2018-02-19 12:00:00.000 bob:vm-100736
82 2018-02-19 13:00:00.000 bob:vm-101
etc
If you group by datetime and id, you'll get every datetime for every id; it's that simple.
If you don't need the exact time, you can group by date only:
SELECT MAX(P.MaxValue) AS Value, cast(P.DateTime as date) as dat, ME.Name AS ID
...
group by ME.Name, cast(P.DateTime as date)
Or, if you do need the exact time, you can use a NOT EXISTS clause instead of GROUP BY.
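A minimal sketch of that NOT EXISTS variant, keeping the answer's abbreviated style and assuming the same joins and WHERE filters as the original query (elided as ...):

SELECT P.MaxValue AS Value, P.DateTime, ME.Name AS ID
FROM vManagedEntity AS ME
INNER JOIN Perf.vPerfHourly AS P ON ME.ManagedEntityRowId = P.ManagedEntityRowId
WHERE ...   -- same rule/counter/time filters as above
  AND NOT EXISTS (SELECT 1
                  FROM Perf.vPerfHourly AS P2
                  WHERE P2.ManagedEntityRowId = P.ManagedEntityRowId
                    AND P2.MaxValue > P.MaxValue
                    -- P2 restricted by the same date/time filters as P
                 )
ORDER BY ID

This returns one row per ID carrying its highest value and that value's DateTime (ties would return more than one row).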

Getting a cumulative number of records within a day

I am trying to get a cumulative sum of records within a time period of a day. Below is a current sample of my data.
DT No_of_records
2017-05-01 00:00:00.000 241
2017-05-01 04:00:00.000 601
2017-05-01 08:00:00.000 207
2017-05-01 12:00:00.000 468
2017-05-01 16:00:00.000 110
2017-05-01 20:00:00.000 450
2017-05-02 00:00:00.000 151
2017-05-02 04:00:00.000 621
2017-05-02 08:00:00.000 179
2017-05-02 12:00:00.000 163
2017-05-02 16:00:00.000 579
2017-05-02 20:00:00.000 299
I am trying to sum up the number of records until the day changes in another column. My desired output is below.
DT No_of_records cumulative
2017-05-01 00:00:00.000 241 241
2017-05-01 04:00:00.000 601 842
2017-05-01 08:00:00.000 207 1049
2017-05-01 12:00:00.000 468 1517
2017-05-01 16:00:00.000 110 1627
2017-05-01 20:00:00.000 450 2077
2017-05-02 00:00:00.000 151 151
2017-05-02 04:00:00.000 621 772
2017-05-02 08:00:00.000 179 951
2017-05-02 12:00:00.000 163 1114
2017-05-02 16:00:00.000 579 1693
2017-05-02 20:00:00.000 299 1992
Do any of you have ideas on how to get the cumulative column?
If you're on SQL Server 2012+, you can use the window function SUM() OVER:
Select *,
       cumulative = sum(No_of_records) over (Partition by cast(DT as date) Order by DT)
From YourTable
You can do this with a windowed SUM():
Select DT, No_of_records,
Sum(No_of_records) Over (Partition By Convert(Date, DT) Order By DT) As cumulative
From YourTable
For older versions, use CROSS APPLY or a correlated sub-query:
SELECT DT,
       No_of_records,
       cs.cumulative
FROM YourTable a
CROSS APPLY (SELECT Sum(No_of_records)
             FROM YourTable b
             WHERE Cast(a.DT AS DATE) = Cast(b.DT AS DATE)
               AND a.DT >= b.DT) cs (cumulative)
Rextester Demo

Find frequency of data in sql server 2014 by date and time

So here is my question.
I have a table FacebookInfo with a column name.
Another table, FacebookPost, has a column created_time and a foreign key facebookinfoid mapped to the FacebookInfo Id column.
So basically FacebookInfo holds the records of Facebook pages and FacebookPost holds the posts of those pages.
What I want to find out is how frequently the pages post on Facebook: the average number of posts per day, the difference in hours between those posts, the average time of the first post of the day, and the average time of the last post of the day (one possible query is sketched after the sample data below).
Thanks for the help.
Here is some sample data
FacebookInfo
id name
3 Qatar Airways
4 KLM Royal Dutch Airlines
5 LATAM Airlines
6 Southwest Airlines
FacebookPost
id facebookinfoid created_time
777 3 2016-12-06 12:54:31.000
778 3 2016-12-05 09:54:09.000
779 3 2016-12-02 12:40:46.000
780 3 2016-12-01 13:00:00.000
781 3 2016-11-30 11:29:53.000
782 3 2016-11-30 09:00:00.000
783 3 2016-11-29 10:09:45.000
784 3 2016-11-28 14:00:00.000
785 3 2016-11-27 11:21:11.000
786 3 2016-11-26 12:00:01.000
787 3 2016-11-25 11:58:55.000
788 3 2016-11-24 10:28:19.000
789 3 2016-11-23 16:20:29.000
790 3 2016-11-23 11:19:42.000
791 3 2016-11-21 12:03:07.000
792 3 2016-11-18 13:36:41.000
793 3 2016-11-17 11:08:41.000
794 3 2016-11-16 12:01:00.000
795 3 2016-11-15 13:39:06.000
796 3 2016-11-11 15:11:56.000
1454 4 2016-12-06 15:00:22.000
1455 4 2016-12-05 14:59:04.000
1456 4 2016-12-05 09:00:07.000
1457 4 2016-12-04 15:00:07.000
1458 4 2016-12-03 10:00:08.000
1459 4 2016-12-02 15:00:15.000
1460 4 2016-12-01 14:00:00.000
1461 4 2016-11-30 13:30:24.000
1462 4 2016-11-29 15:00:07.000
1463 4 2016-11-28 15:00:19.000
1464 4 2016-11-28 09:00:09.000
1465 4 2016-11-26 10:00:06.000
1466 4 2016-11-24 15:00:04.000
1467 4 2016-11-23 09:00:09.000
1468 4 2016-11-22 15:01:04.000
1469 4 2016-11-21 15:00:07.000
1470 4 2016-11-21 05:00:10.000
1471 4 2016-11-19 10:00:07.000
1472 4 2016-11-18 09:00:10.000
1473 4 2016-11-17 15:00:01.000
2454 5 2016-12-05 16:00:01.000
2455 5 2016-12-02 16:02:37.000
2456 5 2016-12-01 16:00:09.000
2457 5 2016-11-30 16:00:48.000
2458 5 2016-11-29 16:01:34.000
2459 5 2016-11-28 16:00:00.000
2460 5 2016-11-25 16:00:01.000
2461 5 2016-11-23 16:00:00.000
2462 5 2016-11-22 16:00:00.000
2463 5 2016-11-21 16:00:00.000
2464 5 2016-11-19 16:00:03.000
2465 5 2016-11-18 16:00:00.000
2466 5 2016-11-17 16:00:01.000
2467 5 2016-11-16 16:00:03.000
2468 5 2016-11-15 16:00:01.000
2469 5 2016-11-12 16:00:00.000
2470 5 2016-11-11 16:00:00.000
2471 5 2016-11-10 16:00:01.000
2472 5 2016-11-09 16:00:00.000
2473 5 2016-11-08 16:00:02.000
3059 6 2016-12-06 15:14:30.000
3060 6 2016-12-04 21:38:33.000
3061 6 2016-12-03 22:27:40.000
3062 6 2016-12-02 21:29:42.000
3063 6 2016-12-01 23:00:04.000
3064 6 2016-11-30 22:00:02.000
3065 6 2016-11-30 20:28:17.000
3066 6 2016-11-29 17:57:02.000
3067 6 2016-11-28 20:49:59.000
3068 6 2016-11-26 17:10:55.000
3069 6 2016-11-26 12:50:45.000
3070 6 2016-11-25 21:16:31.000
3071 6 2016-11-25 01:27:09.000
3072 6 2016-11-24 15:50:16.000
3073 6 2016-11-23 22:00:01.000
3074 6 2016-11-23 15:10:32.000
3075 6 2016-11-22 21:42:42.000
3076 6 2016-11-22 16:29:28.000
3077 6 2016-11-22 03:03:21.000
3078 6 2016-11-22 01:45:41.000
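A hedged sketch of one way to start, assuming the table and column names above (SQL Server 2014 syntax). It covers the posts-per-day average and the first/last post times of the day, averaged per page; times are expressed as minutes after midnight so they can be averaged:

-- Sketch only: per-day aggregates in a CTE, then per-page averages.
;WITH daily AS (
    SELECT fp.facebookinfoid,
           CAST(fp.created_time AS date) AS post_day,
           COUNT(*)                      AS posts_per_day,
           MIN(fp.created_time)          AS first_post,
           MAX(fp.created_time)          AS last_post
    FROM FacebookPost fp
    GROUP BY fp.facebookinfoid, CAST(fp.created_time AS date)
)
SELECT fi.name,
       AVG(CAST(d.posts_per_day AS float))                               AS avg_posts_per_day,
       AVG(DATEDIFF(MINUTE, CAST(d.post_day AS datetime), d.first_post)) AS avg_first_post_minutes,
       AVG(DATEDIFF(MINUTE, CAST(d.post_day AS datetime), d.last_post))  AS avg_last_post_minutes
FROM daily d
INNER JOIN FacebookInfo fi ON fi.id = d.facebookinfoid
GROUP BY fi.name;

The average gap in hours between consecutive posts could be computed along similar lines with LAG(created_time) OVER (PARTITION BY facebookinfoid ORDER BY created_time) and DATEDIFF.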

How do you time weight average data between two dates for any desired interval in Microsoft SQL Server 2014 Express?

I am trying to write a SQL query that will time-weight average data over any desired interval, meaning I only have to change a few parameters inside the query so the final output is in days, hours, and/or minutes between my two desired dates.
I also need it to fill in data for missing intervals, as in the example below.
Actual DataBase Data
Time_Stamp Time_Stamp_ms BPS_FIT0161
2014-07-26 22:32:36 74 164
2014-07-26 22:32:37 71 164
2014-07-26 22:32:38 71 164
2014-07-26 22:32:39 70 162
2014-07-26 22:32:40 71 162
2014-07-26 22:32:41 67 162
2014-07-26 22:32:42 64 165
2014-07-26 22:32:43 63 164
2014-07-26 22:32:44 62 164
2014-07-26 22:32:45 63 163
2014-07-26 22:32:46 59 163
2014-07-26 22:32:47 56 165
2014-07-26 22:32:48 55 167
2014-07-26 22:32:49 54 167
2014-07-26 22:32:50 54 168
2014-07-26 22:32:51 51 168
2014-07-26 22:32:52 47 171
2014-07-26 22:32:53 46 173
2014-07-26 22:32:54 111 177
2014-07-26 22:32:55 42 178
2014-07-26 22:38:56 99 178
2014-07-26 23:24:57 426 178
2014-07-27 00:21:58 854 178
2014-07-27 01:53:09 229 178
2014-07-27 03:30:11 419 178
2014-07-27 05:25:14 56 178
2014-07-27 07:32:16 881 178
2014-07-27 09:48:20 48 178
2014-07-27 12:55:24 286 178
2014-07-27 16:13:28 562 178
2014-07-27 20:10:33 803 178
2014-07-28 00:56:40 26 178
2014-07-28 06:38:47 753 178
2014-07-28 08:38:47 753 178
2014-07-28 09:24:37 219 248
2014-07-28 09:24:38 218 248
2014-07-28 09:24:39 214 226
2014-07-28 09:24:40 212 226
2014-07-28 09:24:41 212 226
2014-07-28 09:24:42 208 224
2014-07-28 09:24:43 207 222
2014-07-28 09:24:44 206 222
2014-07-28 09:24:45 206 222
2014-07-28 10:11:45 604 202
The hourly SQL time-weighted average (TWA) query output should look something like this:
Date_Time BPS_FIT0161_TWA
2014-07-26 22:00:00 177.4342105
2014-07-26 23:00:00 178
2014-07-27 00:00:00 178
2014-07-27 01:00:00 178
2014-07-27 02:00:00 178
2014-07-27 03:00:00 178
2014-07-27 04:00:00 178
2014-07-27 05:00:00 178
2014-07-27 06:00:00 178
2014-07-27 07:00:00 178
2014-07-27 08:00:00 178
2014-07-27 09:00:00 178
2014-07-27 10:00:00 178
2014-07-27 11:00:00 178
2014-07-27 12:00:00 178
2014-07-27 13:00:00 178
2014-07-27 14:00:00 178
2014-07-27 15:00:00 178
2014-07-27 16:00:00 178
2014-07-27 17:00:00 178
2014-07-27 18:00:00 178
2014-07-27 19:00:00 178
2014-07-27 20:00:00 178
2014-07-27 21:00:00 178
2014-07-27 22:00:00 178
2014-07-27 23:00:00 178
2014-07-28 00:00:00 178
2014-07-28 01:00:00 178
2014-07-28 02:00:00 178
2014-07-28 03:00:00 178
2014-07-28 04:00:00 178
2014-07-28 05:00:00 178
2014-07-28 06:00:00 178
2014-07-28 07:00:00 178
2014-07-28 08:00:00 178
2014-07-28 09:00:00 179.5349202
2014-07-28 10:00:00 202.0857852
Thank you so much for your help!
http://sqlfiddle.com/#!6/a8db7/1
End of my question
Beginning of solutions
Using VKP's Solution
SELECT dateadd(hour, datediff(hour, 0, Time_Stamp), 0) as Date_Time,
       sum(time_stamp_ms * bps_fit0161) / sum(time_stamp_ms) as weighted_avg
FROM BPS
WHERE Time_Stamp BETWEEN CONVERT(DATETIME, '2014-07-26 00:00:00', 102)
                     AND CONVERT(DATETIME, '2014-07-30 00:00:00', 102)
group by dateadd(hour, datediff(hour, 0, Time_Stamp), 0),
         dateadd(day, datediff(day, 0, Time_Stamp), 0)
order by dateadd(hour, datediff(hour, 0, Time_Stamp), 0),
         dateadd(day, datediff(day, 0, Time_Stamp), 0)
My Query's Result
Date_Time weighted_avg
2014-07-26 00:00:00.000 180
2014-07-26 01:00:00.000 113
2014-07-26 02:00:00.000 147
2014-07-26 03:00:00.000 221
2014-07-26 04:00:00.000 252
2014-07-26 05:00:00.000 379
2014-07-26 06:00:00.000 370
2014-07-26 07:00:00.000 253
2014-07-26 08:00:00.000 125
2014-07-26 09:00:00.000 119
2014-07-26 10:00:00.000 125
2014-07-26 11:00:00.000 117
2014-07-26 12:00:00.000 160
2014-07-26 13:00:00.000 123
2014-07-26 14:00:00.000 86
2014-07-26 15:00:00.000 81
2014-07-26 16:00:00.000 100
2014-07-26 17:00:00.000 108
2014-07-26 18:00:00.000 175
2014-07-26 19:00:00.000 238
2014-07-26 20:00:00.000 211
2014-07-26 21:00:00.000 231
2014-07-26 22:00:00.000 173
2014-07-26 23:00:00.000 178
2014-07-27 00:00:00.000 178
2014-07-27 01:00:00.000 178 <----- Start of missing Data!
2014-07-27 03:00:00.000 178
2014-07-27 05:00:00.000 178
2014-07-27 07:00:00.000 178
2014-07-27 09:00:00.000 178
2014-07-27 12:00:00.000 178
2014-07-27 16:00:00.000 178
2014-07-27 20:00:00.000 178
2014-07-28 00:00:00.000 178
2014-07-28 06:00:00.000 178
2014-07-28 09:00:00.000 160
2014-07-28 10:00:00.000 134
2014-07-28 11:00:00.000 113
2014-07-28 12:00:00.000 136
2014-07-28 13:00:00.000 131
2014-07-28 14:00:00.000 84
2014-07-28 15:00:00.000 102
As you can see, I am missing hours due to our database being offline or wireless nodes on my network losing communication. What would you change in the above query to auto-fill missing hours with the previous hour's data? The same goes for missing days.
http://sqlfiddle.com/#!6/a8db7/31/0
declare @s datetime
declare @e datetime
set @s = '2014-07-26 00:00:00'
set @e = '2014-07-30 00:00:00'

-- Build a calendar table t with one row per hour between @s and @e
;with x(n) as
(
    SELECT TOP (DATEDIFF(HOUR, @s, @e) + 1)
           rn = ROW_NUMBER() OVER (ORDER BY [object_id])
    FROM sys.all_columns ORDER BY [object_id]
)
select DATEADD(HOUR, n - 1, @s) as dt into t from x

-- Right-join the data onto the hour calendar so empty hours appear with NULLs,
-- then fall back to the previous hour's value when an hour has no data
;with y as (
    SELECT row_number() over (order by t.dt) as rn,
           t.dt,
           sum(time_stamp_ms * bps_fit0161) / sum(time_stamp_ms) as weighted_avg
    FROM BPS
    right join t on t.dt = dateadd(hour, datediff(hour, 0, Time_Stamp), 0)
    group by t.dt, dateadd(hour, datediff(hour, 0, Time_Stamp), 0)
)
select y.dt,
       case when y.weighted_avg is null then prev_y.weighted_avg
            else y.weighted_avg end as weighted_avg
from y
left join y prev_y on prev_y.rn = y.rn - 1   -- prev_y is the immediately preceding hour
Try this. It groups by the start of each hour through the end of that hour.
Edited: to include all the hours between the specified times. This may get you closer to what you are looking for.
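One hedged follow-up: the left join to rn - 1 only fills a single missing hour, so a run of several consecutive empty hours would still come back NULL. If that matters, an OUTER APPLY to the most recent earlier hour that has data (replacing the final SELECT over y in the statement above) would carry the last known value across gaps of any length:

-- Sketch: replaces the final SELECT over the y CTE above
select y.dt,
       coalesce(y.weighted_avg, prev.weighted_avg) as weighted_avg
from y
outer apply (select top (1) y2.weighted_avg
             from y y2
             where y2.rn < y.rn
               and y2.weighted_avg is not null
             order by y2.rn desc) as prev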