I need to get the average time between phone calls per agent at one of our call centers. These are the queries I have thus far:
--This query gets all the agentIDs and their callstarts and callends, and as far as I know, attaches an incrementing row-number to them.
drop table #thuranin
SELECT AgentID, CallStart, CallEnd, row_number() over (order by (select NULL)) AS rowInt
INTO #thuranin
FROM Main.CallRecord
WHERE DialerPoolManagementID is null and incomingcall = 0
ORDER BY AgentID
--This query attempts to get the time between each call
drop table #ploto
SELECT nin.AgentID, (CAST(ISNULL(th.CallStart, 0) - nin.CallEnd AS float)) AS AverageTimeBetween
INTO #ploto
FROM #thuranin nin
LEFT JOIN #thuranin AS th ON th.rowInt = (SELECT MIN(rowInt) FROM #thuranin WHERE rowInt > #thuranin.rowInt AND AgentID=th.AgentID)
--This query should average the times by agent
SELECT agentID, AVG(ABS(AverageTimeBetween)) as avGTimebwn
FROM #ploto
WHERE ABS(AverageTimeBetween) > 20000
GROUP BY AgentID
However, this fails (hence why I'm here). It returns values in the -40000s, and I'm not entirely sure why. I need to get the average amount of time between a call end and the next call start, per agent.
I know that the gap from the last call of one day to the first call of the next could be inflating that number, but I'm unsure how to deal with that either.
Here's a sample hundred rows from the #thuranin temporary table, if it helps:
AgentID CallStart CallEnd rowInt
NULL 2013-05-29 13:48:39.000 2013-05-29 13:57:20.000 139541
191 2013-05-29 13:50:16.000 2013-05-29 13:50:43.000 139581
NULL 2013-05-29 13:52:04.000 2013-05-29 13:52:46.000 139621
115 2013-05-29 13:53:20.000 2013-05-29 13:53:21.000 139661
190 2013-05-29 13:56:27.000 2013-05-29 13:57:59.000 139701
NULL 2013-05-29 13:58:46.000 2013-05-29 13:59:44.000 139741
171 2013-05-03 18:37:07.000 2013-05-03 18:37:14.000 139781
NULL 2013-05-03 18:39:49.000 2013-05-03 18:41:52.000 139821
107 2013-05-03 18:42:32.000 2013-05-03 18:42:38.000 139861
184 2013-05-03 18:45:38.000 2013-05-03 18:46:08.000 139901
NULL 2013-05-03 18:47:07.000 2013-05-03 18:47:57.000 139941
31 2013-06-14 15:22:02.000 2013-06-14 15:22:44.000 139981
31 2013-06-14 15:24:47.000 2013-06-14 15:25:16.000 140021
31 2013-06-14 15:29:10.000 2013-06-14 15:29:11.000 140061
31 2013-06-14 15:33:57.000 2013-06-14 15:34:06.000 140101
31 2013-06-14 15:41:32.000 2013-06-14 15:42:18.000 140141
172 2013-04-24 21:48:47.000 2013-04-24 21:51:45.000 140181
169 2013-04-24 21:50:42.000 2013-04-24 21:50:53.000 140221
65 2013-04-24 21:52:47.000 2013-04-24 21:52:54.000 140261
169 2013-04-24 21:57:49.000 2013-04-24 21:57:57.000 140301
NULL 2013-04-24 22:04:59.000 2013-04-24 22:06:11.000 140341
31 2013-06-20 14:37:45.000 2013-06-20 14:38:29.000 140381
31 2013-06-20 14:40:27.000 2013-06-20 14:41:09.000 140421
31 2013-06-20 14:44:05.000 2013-06-20 14:44:39.000 140461
31 2013-06-20 14:50:53.000 2013-06-20 14:51:17.000 140501
31 2013-06-20 14:58:52.000 2013-06-20 14:59:24.000 140541
31 2013-07-10 19:54:21.000 2013-07-10 19:54:31.000 140581
31 2013-07-10 20:01:24.000 2013-07-10 20:01:51.000 140621
31 2013-07-10 20:06:23.000 2013-07-10 20:07:14.000 140661
31 2013-07-10 20:09:46.000 2013-07-10 20:09:56.000 140701
31 2013-07-10 20:12:10.000 2013-07-10 20:12:49.000 140741
31 2013-07-10 20:14:45.000 2013-07-10 20:14:59.000 140781
175 2013-07-01 22:35:54.000 2013-07-01 22:36:14.000 140821
191 2013-07-01 22:42:29.000 2013-07-01 22:43:43.000 140861
175 2013-07-01 22:49:42.000 2013-07-01 22:49:57.000 140901
107 2013-07-01 22:59:39.000 2013-07-01 23:00:48.000 140941
191 2013-07-01 23:09:52.000 2013-07-01 23:10:52.000 140981
NULL 2013-04-02 15:47:14.000 2013-04-02 15:48:06.000 141021
NULL 2013-04-02 15:48:48.000 2013-04-02 15:49:07.000 141061
NULL 2013-04-02 15:50:03.000 2013-04-02 15:50:53.000 141101
196 2013-04-02 15:52:05.000 2013-04-02 15:52:52.000 141141
NULL 2013-04-02 15:53:03.000 2013-04-02 15:53:06.000 141181
NULL 2013-05-08 16:17:54.000 2013-05-08 16:18:10.000 141221
140 2013-05-08 16:19:53.000 2013-05-08 16:20:05.000 141261
188 2013-05-08 16:21:34.000 2013-05-08 16:38:04.000 141301
NULL 2013-05-08 16:23:22.000 2013-05-08 16:25:02.000 141341
NULL 2013-05-08 16:25:16.000 2013-05-08 16:27:02.000 141381
31 2013-07-01 23:13:21.000 2013-07-01 23:14:24.000 141421
31 2013-07-01 23:24:23.000 2013-07-01 23:25:23.000 141461
31 2013-07-01 23:40:14.000 2013-07-01 23:40:50.000 141501
31 2013-07-01 23:44:35.000 2013-07-01 23:45:18.000 141541
31 2013-07-01 23:51:58.000 2013-07-01 23:54:33.000 141581
31 2013-07-02 13:03:17.000 2013-07-02 13:04:21.000 141621
158 2013-07-02 13:10:14.000 2013-07-02 13:11:09.000 141661
189 2013-07-02 13:13:48.000 2013-07-02 13:13:55.000 141701
202 2013-07-02 13:16:42.000 2013-07-02 13:16:42.000 141741
107 2013-07-02 13:19:31.000 2013-07-02 13:19:48.000 141781
31 2013-07-02 13:22:31.000 2013-07-02 13:24:44.000 141821
NULL 2013-03-21 18:59:22.000 2013-03-21 19:00:20.000 141861
NULL 2013-03-21 19:01:20.000 2013-03-21 19:01:30.000 141901
112 2013-03-21 19:03:29.000 2013-03-21 19:04:02.000 141941
159 2013-03-21 19:05:27.000 2013-03-21 19:06:31.000 141981
169 2013-03-21 19:07:25.000 2013-03-21 19:08:32.000 142021
NULL 2013-03-15 14:03:40.000 2013-03-15 14:04:14.000 142061
NULL 2013-03-15 14:04:41.000 2013-03-15 14:05:01.000 142101
NULL 2013-03-15 14:06:08.000 2013-03-15 14:07:10.000 142141
NULL 2013-03-15 14:07:47.000 2013-03-15 14:08:48.000 142181
65 2013-03-15 14:09:02.000 2013-03-15 14:09:17.000 142221
183 2013-05-21 17:25:14.000 2013-05-21 17:26:29.000 142261
NULL 2013-05-21 17:27:59.000 2013-05-21 17:28:35.000 142301
NULL 2013-05-21 17:31:42.000 2013-05-21 17:32:47.000 142341
166 2013-05-21 17:35:05.000 2013-05-21 17:36:01.000 142381
182 2013-05-21 17:37:38.000 2013-05-21 17:37:48.000 142421
166 2013-05-21 17:39:46.000 2013-05-21 17:40:21.000 142461
166 2013-04-24 22:13:50.000 2013-04-24 22:14:46.000 142501
65 2013-04-24 22:22:18.000 2013-04-24 22:22:21.000 142541
182 2013-04-24 22:25:54.000 2013-04-24 22:26:01.000 142581
116 2013-04-24 22:31:14.000 2013-04-24 22:31:23.000 142621
182 2013-04-24 22:35:55.000 2013-04-24 22:36:10.000 142661
31 2013-06-20 15:12:42.000 2013-06-20 15:13:39.000 142701
31 2013-06-20 15:20:08.000 2013-06-20 15:20:28.000 142741
31 2013-06-20 15:23:29.000 2013-06-20 15:23:45.000 142781
31 2013-06-20 15:26:39.000 2013-06-20 15:27:06.000 142821
31 2013-06-20 15:28:57.000 2013-06-20 15:29:44.000 142861
NULL 2013-04-24 22:37:50.000 2013-04-24 22:38:37.000 142901
NULL 2013-04-24 22:40:07.000 2013-04-24 22:41:41.000 142941
116 2013-04-24 22:45:09.000 2013-04-24 22:45:24.000 142981
187 2013-04-24 22:48:15.000 2013-04-24 22:48:24.000 143021
NULL 2013-04-24 22:54:57.000 2013-04-24 22:55:33.000 143061
NULL 2013-05-01 21:36:20.000 2013-05-01 21:37:44.000 143101
NULL 2013-05-01 21:39:56.000 2013-05-01 21:40:11.000 143141
NULL 2013-05-01 21:43:57.000 2013-05-01 21:46:34.000 143181
NULL 2013-05-01 21:49:29.000 2013-05-01 21:49:43.000 143221
NULL 2013-05-01 21:56:55.000 2013-05-01 21:57:26.000 143261
NULL 2013-05-01 22:03:51.000 2013-05-01 22:04:34.000 143301
85 2013-07-10 20:16:50.000 2013-07-10 20:16:52.000 143341
31 2013-07-10 20:19:46.000 2013-07-10 20:20:00.000 143381
31 2013-07-10 20:24:22.000 2013-07-10 20:25:03.000 143421
31 2013-07-10 20:26:23.000 2013-07-10 20:27:32.000 143461
31 2013-07-10 20:28:03.000 2013-07-10 20:28:51.000 143501
I really tried to understand your #ploto generation but couldn't. For example, I didn't understand the join, what you are selecting, or why you would want to get 0 when it is NULL (which means 1900-01-01). Also, subtracting datetimes and casting to float is a hard-to-follow way of getting time as a day fraction.
From your description, this is what I inferred:
WITH ploto AS
(
    -- Pair each call with the same agent's next call start.
    SELECT AgentId,
           CallEnd,
           LEAD(CallStart) OVER (PARTITION BY AgentId ORDER BY CallStart) AS nextCall
    FROM #thuranin
)
SELECT AgentId,
       AVG(DATEDIFF(SECOND, CallEnd, nextCall)) AS average
FROM ploto
WHERE nextCall > CallEnd                        -- ignore overlapping calls
  AND DATEDIFF(HOUR, CallEnd, nextCall) < 6     -- ignore gaps that span the end of a shift
GROUP BY AgentId;
With your sample data set, the output (in seconds) is:
AgentId Average
NULL 287
31 326
65 1764
116 826
166 225
169 416
175 808
182 594
191 1569
EDIT: Some explanation. You didn't specify a version, so I assumed at least MS SQL 2012. There are CallStarts (and CallEnds) that overlap with the previous call (AgentId NULL); I removed them from the check. I also arbitrarily assumed that if there are more than 6 hours between a CallEnd and the next CallStart, the employee's work day has ended and they have gone home, so that gap shouldn't count.
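If you're on a server older than SQL Server 2012 and don't have LEAD, a self-join on ROW_NUMBER gets the same pairing. A rough sketch against the same #thuranin table:
WITH ordered AS
(
    SELECT AgentID, CallStart, CallEnd,
           ROW_NUMBER() OVER (PARTITION BY AgentID ORDER BY CallStart) AS rn
    FROM #thuranin
)
SELECT cur.AgentID,
       AVG(DATEDIFF(SECOND, cur.CallEnd, nxt.CallStart)) AS average
FROM ordered cur
JOIN ordered nxt
  ON nxt.AgentID = cur.AgentID   -- NULL AgentIDs won't match here; handle them separately if needed
 AND nxt.rn = cur.rn + 1
WHERE nxt.CallStart > cur.CallEnd
  AND DATEDIFF(HOUR, cur.CallEnd, nxt.CallStart) < 6
GROUP BY cur.AgentID;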
I'm trying to do a weekly forecast in FBProphet for just 5 weeks ahead. The make_future_dataframe method doesn't seem to be working right: it makes the correct one-week intervals except for the one between Jul 3 and Jul 5; every other interval is correct at 7 days. Code and output below:
INPUT DATAFRAME
ds y
548 2010-01-01 3117
547 2010-01-08 2850
546 2010-01-15 2607
545 2010-01-22 2521
544 2010-01-29 2406
... ... ...
4 2020-06-05 2807
3 2020-06-12 2892
2 2020-06-19 3012
1 2020-06-26 3077
0 2020-07-03 3133
CODE
future = m.make_future_dataframe(periods=5, freq='W')
future.tail(9)
OUTPUT
ds
545 2020-06-12
546 2020-06-19
547 2020-06-26
548 2020-07-03
549 2020-07-05
550 2020-07-12
551 2020-07-19
552 2020-07-26
553 2020-08-02
All you need to do is create a DataFrame with the dates you need for the predict method; using the make_future_dataframe method is not necessary.
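For what it's worth, the Jul 3 to Jul 5 jump is most likely pandas' weekly anchor: freq='W' is an alias for 'W-SUN', so the generated dates snap to Sundays (2020-07-05 was a Sunday) regardless of which weekday the history ends on. A minimal sketch of building the frame by hand, assuming m is the fitted Prophet model from the question:
import pandas as pd

# Build 5 future weekly dates on the same Friday grid as the history,
# instead of letting make_future_dataframe snap to its default W-SUN anchor.
last_date = pd.Timestamp('2020-07-03')   # last ds in the history
future = pd.DataFrame({
    'ds': pd.date_range(start=last_date + pd.Timedelta(weeks=1),
                        periods=5, freq='7D')
})
forecast = m.predict(future)             # m: the fitted Prophet model
Passing freq='W-FRI' to make_future_dataframe should also keep the Friday anchor, if you'd rather stick with that method.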
I am using an old SQL Server 2000.
Here is some sample data:
ROOMDATE rate bus_id quantity
2018-09-21 00:00:00.000 129 346686 2
2018-09-21 00:00:00.000 162 354247 36
2018-09-21 00:00:00.000 159 382897 150
2018-09-21 00:00:00.000 120 556111 25
2018-09-22 00:00:00.000 129 346686 8
2018-09-22 00:00:00.000 162 354247 86
2018-09-22 00:00:00.000 159 382897 150
2018-09-22 00:00:00.000 120 556111 25
2018-09-23 00:00:00.000 129 346686 23
2018-09-23 00:00:00.000 162 354247 146
2018-09-23 00:00:00.000 159 382897 9
2018-09-23 00:00:00.000 94 570135 23
Essentially, what I want is the MAX quantity for each day along with its corresponding rate and bus_id.
For example, I would want the following rows from my sample data above:
ROOMDATE rate bus_id quantity
2018-09-21 00:00:00.000 159 382897 150
2018-09-22 00:00:00.000 159 382897 150
2018-09-23 00:00:00.000 162 354247 146
From what I have read, SQL Server 2000 does not support ROW_NUMBER, but we can phrase your query using a subquery that finds the max quantity for each day:
SELECT t1.*
FROM yourTable t1
INNER JOIN
(
    -- One row per calendar day, carrying that day's maximum quantity.
    SELECT
        CONVERT(char(10), ROOMDATE, 120) AS ROOMDATE,
        MAX(quantity) AS max_quantity
    FROM yourTable
    GROUP BY CONVERT(char(10), ROOMDATE, 120)
) t2
    ON CONVERT(char(10), t1.ROOMDATE, 120) = t2.ROOMDATE
   AND t1.quantity = t2.max_quantity
ORDER BY
    t1.ROOMDATE;
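Note that if two rows tie for a day's max quantity, this returns both of them; add a tie-breaker (e.g. on bus_id) if you need exactly one row per day.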
I have a table that contains Id, Date and a float value as below:
ID startDt Days
1328 2015-04-01 00:00:00.000 15
2444 2015-04-03 00:00:00.000 5.7
1658 2015-05-08 00:00:00.000 6
1329 2015-05-12 00:00:00.000 28.5
1849 2015-06-23 00:00:00.000 28.5
1581 2015-06-30 00:00:00.000 25.5
3535 2015-07-03 00:00:00.000 3
3536 2015-08-13 00:00:00.000 13.5
2166 2015-09-22 00:00:00.000 28.5
3542 2015-11-05 00:00:00.000 13.5
3543 2015-12-18 00:00:00.000 6
2445 2015-12-25 00:00:00.000 5.7
4096 2015-12-31 00:00:00.000 7.5
2446 2016-01-01 00:00:00.000 5.7
4287 2016-02-11 00:00:00.000 13.5
4288 2016-02-18 00:00:00.000 13.5
4492 2016-03-02 00:00:00.000 19.7
2447 2016-03-25 00:00:00.000 5.7
I am using a stored procedure which adds up the Days then subtracts it from a fixed value stored in a variable.
The total in the table is 245 and the variable is set to 245, so I should get a value of 0 when subtracting the two. However, I am getting a value of 5.6843418860808E-14 instead. I can't figure out why this is the case; I have even gone and re-entered each number in the table, but I still get the same result.
This is the SQL statement I am using to calculate the result:
DECLARE @AL_Taken AS float
DECLARE @AL_Remaining AS float
DECLARE @EntitledLeave AS float

SET @EntitledLeave = 245
SET @AL_Taken = (SELECT SUM(Days) FROM tblALMain)
SET @AL_Remaining = @EntitledLeave - @AL_Taken

SELECT @EntitledLeave, @AL_Taken, @AL_Remaining
The select returns the following:
245, 245, 5.6843418860808E-14
Can anyone suggest why I am getting this number when I should be getting 0?
Thanks for the help
Rob
I changed the data type to Decimal as Tab Allenman suggested and this resolved my issue. I still don't understand why I didn't get zero when using float, as all the values added up to 245 exactly (I even re-entered the values manually) and 245 - 245 should have given me 0.
Thanks again for all the comments and explanations.
Rob
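For the record, the residue is ordinary binary floating-point rounding: values like 5.7 and 19.7 have no exact binary representation, so each float addition carries a tiny error that survives the final subtraction, while decimal does exact base-10 arithmetic. A quick self-contained sketch (the values are illustrative, not from the original table):
DECLARE @f float, @d decimal(10, 2), @i int
SET @f = 0
SET @d = 0
SET @i = 0

WHILE @i < 10
BEGIN
    SET @f = @f + 5.7   -- 5.7 is only approximated in binary
    SET @d = @d + 5.7   -- exact base-10 arithmetic
    SET @i = @i + 1
END

SELECT @f - 57 AS float_residue,   -- typically a tiny non-zero value
       @d - 57 AS decimal_residue  -- exactly 0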
I would like to clean up some data returned from a query. This query:
select seriesId,
startDate,
reportingCountyId,
countyId,
countyName,
pocId,
pocValue
from someTable
where seriesId = 147
and pocid = 2
and countyId in (2033,2040)
order by startDate
usually returns 2 county matches for all years:
seriesId startDate reportingCountyId countyId countyName pocId pocValue
147 2004-01-01 00:00:00.000 6910 2040 CountyOne 2 828
147 2005-01-01 00:00:00.000 2998 2033 CountyTwo 2 4514
147 2005-01-01 00:00:00.000 3000 2040 CountyOne 2 2446
147 2006-01-01 00:00:00.000 3018 2033 CountyTwo 2 5675
147 2006-01-01 00:00:00.000 4754 2040 CountyOne 2 2265
147 2007-01-01 00:00:00.000 3894 2033 CountyTwo 2 6250
147 2007-01-01 00:00:00.000 3895 2040 CountyOne 2 2127
147 2008-01-01 00:00:00.000 4842 2033 CountyTwo 2 5696
147 2008-01-01 00:00:00.000 4846 2040 CountyOne 2 2013
147 2009-01-01 00:00:00.000 6786 2033 CountyTwo 2 2578
147 2009-01-01 00:00:00.000 6817 2040 CountyTwo 2 1933
147 2010-01-01 00:00:00.000 6871 2040 CountyOne 2 1799
147 2010-01-01 00:00:00.000 6872 2033 CountyTwo 2 4223
147 2011-01-01 00:00:00.000 8314 2033 CountyTwo 2 3596
147 2011-01-01 00:00:00.000 8315 2040 CountyOne 2 1559
But note that the first entry has only CountyOne for 2004. I would like to return a fake row for CountyTwo, for a graph I am doing. It would be sufficient to fill it like the CountyOne row, only with pocValue = 0.
Thanks!
Try this (if you need a blank row for that countyId):
; WITH CTE AS
(
    SELECT 2033 AS CountyId UNION ALL SELECT 2040
),
CTE2 AS
(
    SELECT seriesId, startDate, reportingCountyId,
           countyId, countyName, pocId, pocValue
    FROM someTable
    WHERE seriesId = 147 AND pocid = 2 AND countyId IN (2033, 2040)
)
SELECT x1.CountyId, x2.*, ISNULL(x2.pocValue, 0) AS NewpocValue
FROM CTE x1
LEFT OUTER JOIN CTE2 x2 ON x1.CountyId = x2.countyId
ORDER BY x2.startDate
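Note that the query above only guarantees at least one row per county overall; to backfill the missing county for each year (e.g. CountyTwo in 2004), cross join the counties against the distinct startDates and left-join the real rows. A sketch, reusing the question's someTable and filters:
WITH counties AS
(
    SELECT 2033 AS countyId UNION ALL SELECT 2040
),
dates AS
(
    SELECT DISTINCT startDate
    FROM someTable
    WHERE seriesId = 147 AND pocid = 2 AND countyId IN (2033, 2040)
)
SELECT d.startDate,
       c.countyId,
       ISNULL(t.pocValue, 0) AS pocValue   -- missing county/year pairs become 0
FROM dates d
CROSS JOIN counties c
LEFT JOIN someTable t
  ON t.startDate = d.startDate
 AND t.countyId = c.countyId
 AND t.seriesId = 147
 AND t.pocid = 2
ORDER BY d.startDate, c.countyId;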
I am faced with getting the latest (newest) record for a part, and my only way is to do it using a "max date" approach.
Here is a basic schema (and sample data):
ldDate ldTime ldPart id
2010-10-26 00:00:00.000 52867 90-R6600-4100 186
2010-11-01 00:00:00.000 24634 90-R6600-4100 187
2010-11-24 00:00:00.000 58785 90-R6600-4100 194
2010-11-24 00:00:00.000 58771 90-R6600-4100 195
2010-11-17 00:00:00.000 29588 90-R6600-4100 201
2010-11-08 00:00:00.000 29196 90-R6600-4100 282
2010-11-08 00:00:00.000 29640 90-R6600-4100 290
2010-10-19 00:00:00.000 58695 90-R6600-4100 350
2010-09-22 00:00:00.000 32742 BH4354-F0 338
2010-09-22 00:00:00.000 32504 BH4354-F0 340
2010-11-17 00:00:00.000 31157 BH4354-F0 206
2010-11-08 00:00:00.000 27601 BH4354-F0 218
2010-11-08 00:00:00.000 27865 BH4354-F0 21
2010-09-22 00:00:00.000 23264 BR6950-F0 70
2010-09-22 00:00:00.000 23270 BR6950-F0 77
2010-09-24 00:00:00.000 27781 BR6950-F0 97
2010-11-24 00:00:00.000 57735 BR6950-F0 196
What I have above is an ldDate with no time component, and an ldTime that is not a proper HH:MM:SS representation; it is seconds since midnight. So the first line would be:
2010-10-26 00:00:00.000 52867
or, converting the time:
52,867 seconds = 14.6852778 hours = 14:41:07
~2010-10-26 14:41:07
Is there a clean way I can combine both columns in my SQL query? I want to do a simple max(ldCombinedDate) and group by ldPart (part number).
The id is almost useless, so ignore it; it only records the order in which rows were inserted, nothing else. It could maybe be used for a nested query? NOTE: order of entry does not mean latest date...
Thanks...
The expression
DATEADD(second, ldTime, ldDate)
will return the combined date. So, I guess you want something like this:
SELECT ldPart, MAX(DATEADD(second, ldTime, ldDate)) AS MaxCombinedDate
FROM yourtable
GROUP BY ldPart
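Since you said you want the latest record (not just its date), one way is to join that aggregate back to the detail rows. A sketch, assuming the same table:
SELECT t.*
FROM yourtable t
INNER JOIN
(
    -- Latest combined date per part, as above.
    SELECT ldPart, MAX(DATEADD(second, ldTime, ldDate)) AS MaxCombinedDate
    FROM yourtable
    GROUP BY ldPart
) latest
  ON latest.ldPart = t.ldPart
 AND DATEADD(second, t.ldTime, t.ldDate) = latest.MaxCombinedDate;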