I am trying to run through a table and get the sum of the mins column, but it always tells me that mins is not a valid column name.
SELECT
    acs.cid,
    DATEDIFF(n, acs.StartTime, acs.EndTime) AS prepromomins,
    CASE
        WHEN p.multiplier IS NULL THEN DATEDIFF(n, acs.StartTime, acs.EndTime)
        ELSE DATEDIFF(n, acs.StartTime, acs.EndTime) * p.multiplier
    END AS mins
FROM
    activecards AS acs
    LEFT JOIN Promotions AS p
        ON acs.StartTime > p.StartTime AND acs.EndTime < p.EndTime
The problem is that mins is a column alias defined in the same SELECT, and you can't reference such an alias in an aggregate (or in a WHERE or GROUP BY) at the same level. Either repeat the expression inside the SUM, as below, or wrap the query in a derived table. Try this one:
SELECT SUM(DATEDIFF(n, acs.StartTime, acs.EndTime) * ISNULL(p.multiplier,1)) AS TotalMins
FROM activecards as acs
LEFT JOIN Promotions as p
ON acs.StartTime > p.StartTime and acs.EndTime < p.EndTime
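If you would rather keep the per-row column list from the original query, another option (a sketch of the same idea) is to wrap that query in a derived table so the mins alias exists by the time the outer SUM runs:

SELECT SUM(x.mins) AS TotalMins
FROM (
    SELECT
        acs.cid,
        CASE
            WHEN p.multiplier IS NULL THEN DATEDIFF(n, acs.StartTime, acs.EndTime)
            ELSE DATEDIFF(n, acs.StartTime, acs.EndTime) * p.multiplier
        END AS mins
    FROM activecards AS acs
    LEFT JOIN Promotions AS p
        ON acs.StartTime > p.StartTime AND acs.EndTime < p.EndTime
) AS x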
I have the following query:
SELECT TOP 10
    total_worker_time / execution_count AS Avg_CPU_Time,
    execution_count,
    total_elapsed_time / execution_count AS AVG_Run_Time,
    (SELECT SUBSTRING(text, statement_start_offset / 2,
            (CASE
                 WHEN statement_end_offset = -1 THEN LEN(CONVERT(nvarchar(max), text)) * 2
                 ELSE statement_end_offset
             END - statement_start_offset) / 2)
     FROM sys.dm_exec_sql_text(sql_handle)
    ) AS query_text
FROM sys.dm_exec_query_stats
ORDER BY AVG_Run_Time DESC
Can someone help me modify it so that it only covers the last two weeks of runs?
https://gyazo.com/4f79c9670f3281a8ac5342cdaee28951
Example of the table results:
Avg_CPU_Time execution_count AVG_Run_Time query_text
16182805 7 479013574 select x.database_id as iDatabaseID ,#sDatabase as sDatabase ,x.[object_id] as iTableID ,x.index_id as iIndexID ,cast(sum(case when x.alloc_unit_type_desc = 'LOB_DATA' then 0 else x.page_count end) / 128. as decimal(9,3)) as nInRowMB ,cast(sum(case when x.alloc_unit_type_desc = 'LOB_DATA' then x.page_count else 0 end) / 128. as decimal(9,3)) as nLobMB ,cast(sum(x.page_count) / 128. as decimal(9,3)) as nTotalMB ,cast(max(x.avg_fragmentation_in_percent) as decimal(5,2)) as nLogicalFragmentation into #index from #table as t cross apply Metadata.dbo.udm_db_index_physical_stats_tvf(t.iDatabaseID, t.iTableID, null, null, 'sampled') as x group by database_id ,[object_id] ,index_
26520575 2 50647402 select a.AccountID , a.EndDate , ax.[357A020C-EF75-42FD-92B0-3AA2F1F0B4D8] , ax.[585EEDB4-68F6-4C62-ACDA-6FEE57144DB1] into ZTemp_OngoingCanxProcess_AccsToUpdate from Account a join AccountEx ax on ax.AccountID = a.AccountID join AccountTemplate at on at.AccountTemplateID = a.AccountTemplateID where isnull(at.isongoing,0) = 0 and (ax.[357A020C-EF75-42FD-92B0-3AA2F1F0B4D8] is null and ax.[585EEDB4-68F6-4C62-ACDA-6FEE57144DB1] is null) and (a.EndDate is not null and a.EndDate < cast(getdate() as date
sys.dm_exec_query_stats is a SQL Server specific dynamic management view, and the lifetime of its rows is tied to the plan cache itself. As soon as a plan is dropped from the cache, the results you found are dropped with it. You can, however, store the results in a separate table for two weeks, but you will still have to schedule your query to run at regular intervals and insert the new rows with a timestamp so you can tell when each snapshot was captured.
I would instead recommend Query Store if your SQL Server version is 2016 or newer.
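If you do store the results yourself, here is a minimal sketch of that snapshot approach; the table name, columns, and schedule are only placeholders, not something from your environment:

-- Hypothetical snapshot table; adjust names and types as needed.
CREATE TABLE dbo.QueryStatsSnapshot (
    CapturedAt      datetime2     NOT NULL DEFAULT SYSDATETIME(),
    Avg_CPU_Time    bigint        NULL,
    execution_count bigint        NULL,
    AVG_Run_Time    bigint        NULL,
    query_text      nvarchar(max) NULL
);

-- Run this on a schedule, e.g. a SQL Server Agent job every hour:
INSERT INTO dbo.QueryStatsSnapshot (Avg_CPU_Time, execution_count, AVG_Run_Time, query_text)
SELECT TOP 10
    total_worker_time / execution_count,
    execution_count,
    total_elapsed_time / execution_count,
    (SELECT SUBSTRING(text, statement_start_offset / 2,
            (CASE
                 WHEN statement_end_offset = -1 THEN LEN(CONVERT(nvarchar(max), text)) * 2
                 ELSE statement_end_offset
             END - statement_start_offset) / 2)
     FROM sys.dm_exec_sql_text(sql_handle))
FROM sys.dm_exec_query_stats
ORDER BY total_elapsed_time / execution_count DESC;

-- Reporting then just filters on the capture timestamp:
SELECT *
FROM dbo.QueryStatsSnapshot
WHERE CapturedAt >= DATEADD(DAY, -14, SYSDATETIME());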
I have a query where I need to show calls for each of the 24 hours of the day.
But I am only getting the hours that actually have calls.
My requirement is to get all 24 hours, with 0 for the hours that have no calls.
Please suggest a solution.
Below is my code.
SELECT @TrendStartDate
    ,isd.Name
    ,isd.Call_ID
    ,isd.callType
    ,DATEPART(HOUR, isd.ArrivalTime)
FROM [PHONE_CALLS] AS isd WITH (NOLOCK)
WHERE CallType = 'Incoming'
    AND Name NOT IN ('DefaultQueue')
    AND CAST(ArrivalTime AS DATE) BETWEEN @TrendStartDate AND @TrendEndDate
The basic idea is that you use a table containing numbers from 0 to 23, and left join that to your data table:
WITH CTE AS
(
    SELECT TOP 24 ROW_NUMBER() OVER (ORDER BY @@SPID) - 1 AS TheHour
    FROM sys.objects
)
SELECT @TrendStartDate
    ,isd.Name
    ,isd.Call_ID
    ,isd.callType
    ,TheHour
FROM CTE
LEFT JOIN [PHONE_CALLS] AS isd WITH (NOLOCK)
    ON DATEPART(HOUR, isd.ArrivalTime) = TheHour
    AND CallType = 'Incoming'
    AND Name NOT IN ('DefaultQueue')
    AND CAST(ArrivalTime AS DATE) BETWEEN @TrendStartDate AND @TrendEndDate
If you have a tally table, you should use that. If not, the CTE will provide you with the numbers from 0 to 23.
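If you want the result aggregated to a count per hour rather than the raw joined rows, the same join can be grouped; here is a sketch that keeps the variables and table from the question:

WITH CTE AS
(
    SELECT TOP 24 ROW_NUMBER() OVER (ORDER BY @@SPID) - 1 AS TheHour
    FROM sys.objects
)
SELECT @TrendStartDate AS TrendDate
    ,TheHour
    ,COUNT(isd.Call_ID) AS Calls
FROM CTE
LEFT JOIN [PHONE_CALLS] AS isd WITH (NOLOCK)
    ON DATEPART(HOUR, isd.ArrivalTime) = TheHour
    AND isd.CallType = 'Incoming'
    AND isd.Name NOT IN ('DefaultQueue')
    AND CAST(isd.ArrivalTime AS DATE) BETWEEN @TrendStartDate AND @TrendEndDate
GROUP BY TheHour
ORDER BY TheHour;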
If you have a numbers table you can use a query like the following:
SELECT d.Date,
h.Hour,
Calls = COUNT(pc.Call_ID)
FROM ( SELECT [Hour] = Number
FROM dbo.Numbers
WHERE Number >= 0
AND Number < 24
) AS h
CROSS JOIN
( SELECT Date = DATEADD(DAY, Number, @TrendStartDate)
FROM dbo.Numbers
WHERE Number <= DATEDIFF(DAY, @TrendStartDate, @TrendEndDate)
) AS d
LEFT JOIN [PHONE_CALLS] AS pc
ON pc.CallType = 'Incoming'
AND pc.Name NOT IN ('DefaultQueue')
AND CAST(pc.ArrivalTime AS DATE) = d.Date
AND DATEPART(HOUR, pc.ArrivalTime) = h.Hour
GROUP BY d.Date, h.Hour
ORDER BY d.Date, h.Hour;
The key is to get all the hours you need:
SELECT [Hour] = Number
FROM dbo.Numbers
WHERE Number >= 0
AND Number < 24
And all the days that you need in your range:
SELECT Date = DATEADD(DAY, Number, @TrendStartDate)
FROM dbo.Numbers
WHERE Number <= DATEDIFF(DAY, @TrendStartDate, @TrendEndDate)
Then cross join the two, so that you are guaranteed to have all 24 hours for each day you want. Finally, you can left join to your call table to get the count of calls.
Example on DB<>Fiddle
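If you don't already have a dbo.Numbers table, one possible way to create a small one (the row count and index name are just suggestions):

-- One-off setup: a 0-based numbers table with 10,000 rows.
SELECT TOP (10000)
    Number = ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1
INTO dbo.Numbers
FROM sys.all_objects AS a
CROSS JOIN sys.all_objects AS b;

CREATE UNIQUE CLUSTERED INDEX IX_Numbers_Number ON dbo.Numbers (Number);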
You can use SQL Server recursion with a CTE to generate the hours between 0 and 23 and then do a left outer join with the call table.
You can also use any other method mentioned in this link to generate the numbers from 0 to 23.
Link to SQLFiddle
SET DATEFORMAT ymd;

DECLARE @calls AS TABLE (date date, hour int, calls int);

INSERT INTO @calls VALUES
    ('2020-01-02', 0, 66), ('2020-01-02', 1, 888),
    ('2020-01-02', 2, 5),  ('2020-01-02', 3, 8),
    ('2020-01-02', 4, 9),  ('2020-01-02', 5, 55),  ('2020-01-02', 6, 44),  ('2020-01-02', 7, 87),  ('2020-01-02', 8, 90),
    ('2020-01-02', 9, 34), ('2020-01-02', 10, 22), ('2020-01-02', 11, 65), ('2020-01-02', 12, 54), ('2020-01-02', 13, 78),
    ('2020-01-02', 23, 99);

WITH cte AS
(
    SELECT 0 AS n, date FROM @calls
    UNION ALL
    SELECT 1 + n, date FROM cte WHERE 1 + n < 24
)
SELECT DISTINCT cte.date, cte.n AS [Hour], ISNULL(ca.calls, 0) AS calls
FROM cte
LEFT OUTER JOIN @calls AS ca ON cte.n = ca.hour AND cte.date = ca.date;
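If you prefer a recursive CTE whose anchor does not depend on the data table, a variant of the same idea (reusing the @calls table variable and the single sample date from above) could look like this:

WITH hours AS
(
    SELECT 0 AS n
    UNION ALL
    SELECT n + 1 FROM hours WHERE n + 1 < 24
)
SELECT h.n AS [Hour], ISNULL(ca.calls, 0) AS calls
FROM hours AS h
LEFT OUTER JOIN @calls AS ca
    ON ca.hour = h.n
    AND ca.date = '2020-01-02' -- sample date from the data above
ORDER BY h.n;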
I have a SQL Server view that shows an overview of account statements. First we calculate the latest closing balance of each user account, so we know what the latest balance on that account was; this is the LATEST_CB_DATES part.
Then we calculate the next business days, meaning the next two days on which we expect to receive a balance in the database. This happens in NEXT_B_DAYS.
Finally we calculate whether the account is still expecting a closing balance, has received one, or received one too late. Note that we use a reception window end time for this.
IF EXISTS (SELECT TABLE_NAME FROM INFORMATION_SCHEMA.VIEWS
WHERE TABLE_NAME = 'VIEW_AS_AS_ACCT_STAT')
DROP VIEW VIEW_AS_AS_ACCT_STAT
GO
CREATE VIEW VIEW_AS_AS_ACCT_STAT AS
WITH LATEST_CB_DATES AS (
SELECT * FROM (
SELECT ROW_NUMBER() OVER (PARTITION BY SD_ACCT.ID ORDER BY AS_ACCT_STAT.CBAL_BAL_DATE DESC) AS RN,
       SD_ACCT.ID,
       SD_ACCT.ACCT_NBR,
       AS_ACCT_STAT.CBAL_BAL_DATE AS BAL_DATE,
       SD_ACCT.CODE,
       SD_ACCT.CCY,
       SD_ACCT_GRP.ID AS GRP_ID,
       SD_ACCT_GRP.CODE AS ACCT_GRP_CODE,
       SD_ACCT.DATA_OWNER_ID,
       AS_ACCT_STAT.STATIC_DATA_BNK AS BANK_CODE,
       AS_ACCT_STAT.STATIC_DATA_HLD AS HOLDER_CODE
FROM SD_ACCT
LEFT JOIN AS_ACCT on SD_ACCT.ID = AS_ACCT.STATIC_DATA_ACCT_ID
LEFT JOIN AS_ACCT_STAT on AS_ACCT.ID = AS_ACCT_STAT.ACCT_ID
JOIN SD_ACCT_GRP_MEMBER ON SD_ACCT.ID = SD_ACCT_GRP_MEMBER.ACCT_ID
JOIN SD_ACCT_GRP on SD_ACCT_GRP_MEMBER.GRP_ID = SD_ACCT_GRP.ID
JOIN SD_ACCT_GRP_ROLE on SD_ACCT_GRP_ROLE.ID = SD_ACCT_GRP.ROLE_ID
WHERE SD_ACCT_GRP_ROLE.CODE = 'AccountStatementsToReceive' AND (AS_ACCT_STAT.VALID = 1 OR AS_ACCT_STAT.VALID IS NULL)
) LST_STMT
WHERE RN = 1
),
NEXT_B_DAYS AS (
SELECT VIEW_BUSINESS_DATES.CAL_ID, VIEW_BUSINESS_DATES.BUSINESS_DATE,
LEAD(VIEW_BUSINESS_DATES.BUSINESS_DATE, 1) OVER (PARTITION BY VIEW_BUSINESS_DATES.CAL_CODE ORDER BY VIEW_BUSINESS_DATES.BUSINESS_DATE) AS NEXT_BUSINESS_DATE,
LEAD(VIEW_BUSINESS_DATES.BUSINESS_DATE, 2) OVER (PARTITION BY VIEW_BUSINESS_DATES.CAL_CODE ORDER BY VIEW_BUSINESS_DATES.BUSINESS_DATE) AS SECOND_BUSINESS_DATE
FROM VIEW_BUSINESS_DATES
)
SELECT LATEST_CB_DATES.ID AS ACCT_ID,
LATEST_CB_DATES.CODE AS ACCT_CODE,
LATEST_CB_DATES.ACCT_NBR,
LATEST_CB_DATES.CCY AS ACCT_CCY,
LATEST_CB_DATES.BAL_DATE AS LATEST_CLOSING_BAL_DATE,
LATEST_CB_DATES.DATA_OWNER_ID,
LATEST_CB_DATES.BANK_CODE,
LATEST_CB_DATES.HOLDER_CODE,
LATEST_CB_DATES.ACCT_GRP_CODE,
CASE
WHEN LATEST_CB_DATES.BAL_DATE IS NULL THEN 'Expecting'
WHEN NEXT_B_DAYS.NEXT_BUSINESS_DATE IS NULL OR NEXT_B_DAYS.SECOND_BUSINESS_DATE IS NULL THEN 'Late'
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NOT NULL AND GETDATE() >= TODATETIMEOFFSET(CAST(NEXT_B_DAYS.SECOND_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) AS DATETIME), SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET) THEN 'Late'
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NULL AND GETDATE() >= TODATETIMEOFFSET(CAST(NEXT_B_DAYS.SECOND_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) AS DATETIME), SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET) AND CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) >= CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) THEN 'Expecting'
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NULL AND GETDATE() >= TODATETIMEOFFSET(CAST(NEXT_B_DAYS.NEXT_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) AS DATETIME), SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET) AND CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) < CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) THEN 'Expecting' -- overnight
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NULL AND CAST (GETDATE() AS DATE) > NEXT_B_DAYS.SECOND_BUSINESS_DATE THEN 'Expecting'
ELSE 'Received'
END AS STAT,
CASE
WHEN LATEST_CB_DATES.BAL_DATE IS NULL THEN NULL
WHEN NEXT_B_DAYS.NEXT_BUSINESS_DATE IS NULL OR NEXT_B_DAYS.SECOND_BUSINESS_DATE IS NULL THEN NULL
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NOT NULL THEN CAST(NEXT_B_DAYS.SECOND_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) AS DATETIME)
ELSE NULL
END AS DEADLINE,
SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET AS TIME_ZONE
FROM AS_AS_RECEPTION_CONF
JOIN LATEST_CB_DATES ON AS_AS_RECEPTION_CONF.ACCT_GRP_ID = LATEST_CB_DATES.GRP_ID
JOIN SEC_TIMEZONE ON SEC_TIMEZONE.ID = AS_AS_RECEPTION_CONF.TIME_ZONE_ID
LEFT JOIN NEXT_B_DAYS ON AS_AS_RECEPTION_CONF.CALENDAR_ID = NEXT_B_DAYS.CAL_ID AND LATEST_CB_DATES.BAL_DATE = NEXT_B_DAYS.BUSINESS_DATE
GO
SELECT * FROM VIEW_AS_AS_ACCT_STAT
What is the issue? Nothing, this works fine, but it's slow. We created a graphical report to display the data for our customers, and it takes 1 minute 30 seconds to run this SQL with 5,000 accounts, which is too slow.
I guess the reason is the last join, but I didn't manage to refactor it well:
LEFT JOIN NEXT_B_DAYS ON AS_AS_RECEPTION_CONF.CALENDAR_ID = NEXT_B_DAYS.CAL_ID
                     AND LATEST_CB_DATES.BAL_DATE = NEXT_B_DAYS.BUSINESS_DATE
The execution plan of my SQL can be found here.
How can I refactor this to make my view still work but much more performant?
The DBMS is InterSystems Caché!
Here is my full query:
SELECT m.Name AS MessageType, COUNT(l.name) AS MessageCount, CAST(AVG(ResponseTime) AS DECIMAL(5, 2)) AS AvgResponseTime
FROM
(SELECT DISTINCT(name) FROM ENSLIB_HL7.Message) m LEFT JOIN
(
SELECT CAST(li.SessionId AS Bigint) AS session_id, li.name, MIN(li.TimeCreated) AS SessionStart, MAX(lo.TimeCreated) AS SessionEnd, CAST(DATEDIFF(s, MIN(li.TimeCreated), MAX(lo.TimeCreated)) AS DECIMAL(5, 2)) AS ResponseTime
FROM (SELECT h1.SessionId, h1.TimeCreated, $PIECE(RawContent, '|', 4), m1.name FROM ens.messageheader h1, ENSLIB_HL7.Message m1 WHERE h1.MessageBodyId = m1.id AND h1.TimeCreated > DATEADD(mi, -30, GETUTCDATE())) li
JOIN (SELECT h2.SessionId, h2.TimeCreated FROM ens.messageheader h2, ENSLIB_HL7.Message m2 WHERE h2.MessageBodyId = m2.id AND h2.TimeCreated > DATEADD(mi, -30, GETUTCDATE())) lo
ON li.SessionId = lo.SessionId
GROUP BY li.SessionId
) l on m.name = l.name
GROUP BY l.Name
This gives me 4 results:
VXU_V04 0 (null)
ADT_A03 3 0.01
ADT_A04 3 0.01
ADT_A08 143 0.01
Given that there is one result with 0 records, it seems like it is working. However, if I run SELECT DISTINCT(name) FROM ENSLIB_HL7.Message I get 10 results:
VXU_V04
ADT_A08
ACK_A08
ADT_A03
ACK_A03
ADT_A04
ACK_A04
ACK_V04
ADT_A01
ACK_A01
Why don't I get ten rows with my full query?
Change the GROUP BY to:
GROUP BY m.Name
You are aggregating by the column in the second table, so you only get one row for all the NULL values.
Most databases would reject this syntax, but apparently InterSystems Caché allows it.
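To see why the unmatched rows collapse, here is a small, self-contained illustration in generic SQL; the tables and data are made up purely for this example:

CREATE TABLE types  (name VARCHAR(10));
CREATE TABLE events (name VARCHAR(10));
INSERT INTO types  VALUES ('A'), ('B'), ('C');
INSERT INTO events VALUES ('B');

-- GROUP BY on the right-hand (nullable) column: 'A' and 'C' collapse
-- into a single NULL group, so only two rows come back.
SELECT e.name, COUNT(e.name) AS cnt
FROM types AS t
LEFT JOIN events AS e ON e.name = t.name
GROUP BY e.name;

-- GROUP BY on the left-hand column: one row per type, with cnt = 0
-- where nothing matched.
SELECT t.name, COUNT(e.name) AS cnt
FROM types AS t
LEFT JOIN events AS e ON e.name = t.name
GROUP BY t.name;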
I have this query:
SELECT `s`.`time` , SUM( s.love ) AS total_love, SUM( s.sad ) AS total_sad, SUM( s.angry ) AS total_angry, SUM( s.happy ) AS total_happy
FROM (`employee_workshift` AS e)
JOIN `workshift` AS w ON `e`.`workshift_uuid` = `w`.`uuid`
JOIN `shift_summary` AS s ON `w`.`uuid` = `s`.`workshift_uuid`
WHERE `s`.`location_uuid` = '81956feb-3fd7-0e84-e9fe-b640434dfad0'
AND `e`.`employee_uuid` = '3866a979-bc5e-56cb-cede-863afc47b8b5'
AND `s`.`workshift_uuid` = '8c9dbd85-18a3-6ca9-e3f3-06eb602b6f38'
AND `s`.`time` >= CAST( '18:00:00' AS TIME )
AND `s`.`time` <= CAST( '00:00:00' AS TIME )
AND `s`.`date` LIKE '%2014-03%'
My problem is that it returns NULL, but when I change the end time to '23:59:59' it returns the right data. My idea was to pull the hour from both 'start_time' and 'end_time' and then loop to collect every hour between them:
$time_start = 15;
$time_end = 03;
So it should produce: 15,16,17,18,19,20,21,22,23,00,01,02,03
Then I'll compare them all. But that would take a lot more lines and effort than simply using BETWEEN. Or should I just use in_array? Have you encountered this? I hope someone can help. Thanks.
19:00 is certainly bigger than 00:00, so your approach cannot work as written.
Try using the full timestamp (including the date) to get all the data you need.
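For example, something along these lines (a sketch only, assuming MySQL, which the backticks suggest, and its TIMESTAMP(date, time) function to combine the two columns; the date literals are placeholders for a single evening-to-midnight window):

SELECT SUM(s.love)  AS total_love,
       SUM(s.sad)   AS total_sad,
       SUM(s.angry) AS total_angry,
       SUM(s.happy) AS total_happy
FROM employee_workshift AS e
JOIN workshift AS w     ON e.workshift_uuid = w.uuid
JOIN shift_summary AS s ON w.uuid = s.workshift_uuid
WHERE s.location_uuid  = '81956feb-3fd7-0e84-e9fe-b640434dfad0'
  AND e.employee_uuid  = '3866a979-bc5e-56cb-cede-863afc47b8b5'
  AND s.workshift_uuid = '8c9dbd85-18a3-6ca9-e3f3-06eb602b6f38'
  -- Comparing full datetimes keeps a window that starts at 18:00 and ends
  -- at midnight of the next day as one correctly ordered range:
  AND TIMESTAMP(s.date, s.time) >= TIMESTAMP('2014-03-01', '18:00:00')
  AND TIMESTAMP(s.date, s.time) <  TIMESTAMP('2014-03-02', '00:00:00');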
Try this query. I don't know your data structure, so check the INNER JOIN between the s and s1 tables. The join must be one row to one row, with the only difference being the date: the date of the s1 rows must be one day earlier than that of the s table rows.
SELECT s.time , SUM( s.love ) AS total_love, SUM( s.sad ) AS total_sad, SUM( s.angry ) AS total_angry, SUM( s.happy ) AS total_happy
FROM (employee_workshift AS e)
JOIN workshift AS w ON e.workshift_uuid = w.uuid
JOIN shift_summary AS s ON w.uuid = s.workshift_uuid
JOIN shift_summary AS s1 ON (w.uuid = s.workshift_uuid AND CAST(s.date as DATE)=CAST(s1.date as DATE)+1)
WHERE s.location_uuid = '81956feb-3fd7-0e84-e9fe-b640434dfad0'
AND e.employee_uuid = '3866a979-bc5e-56cb-cede-863afc47b8b5'
AND s.workshift_uuid = '8c9dbd85-18a3-6ca9-e3f3-06eb602b6f38'
AND s1.time >= CAST( '18:00:00' AS TIME )
AND s.time <= CAST( '00:00:00' AS TIME )
AND s.date LIKE '%2014-03%'