I wrote this code to find the down nodes and calculate the up and down hours. The code works, but is there another way to write it, or a way to optimize it? What is the best way to calculate the duration of the downtime? And is there an interactive way for the user to input the date and time interval?
SELECT q1.nodeid, q1.VendorIcon, q1.Caption, q1.IP_Address,
q1.OutageDurationInMinutes,
q2.TimeUp
FROM
(SELECT
Nodes.NodeID AS NodeID, ltrim(rtrim(Nodes.Caption)) Caption, Nodes.VendorIcon,Nodes.IP_Address,
sum(DATEDIFF(minute, StartTime.EventTime, EndTime.EventTime)) as OutageDurationInMinutes
FROM Events StartTime
Left join Events EndTime On
EndTime.EventType = '5' and
EndTime.NetObjectType = 'N' and
EndTime.NetworkNode = StartTime.NetworkNode and
EndTime.EventTime =
(
Select
min(EventTime)
from Events
where
EventTime>StartTime.EventTime and
EventType = '5' and
NetObjectType = 'N' and
NetworkNode = StartTime.NetworkNode
)
INNER JOIN Nodes ON
StartTime.NetworkNode = Nodes.NodeID
WHERE
Nodes.Department = '4' AND
StartTime.EventType = 1 AND
StartTime.NetObjectType = 'N' AND
StartTime.eventtime between dateadd(M, -1, getdate()) and getdate()
Group by
Nodes.NodeID,Nodes.Caption, Nodes.VendorIcon,Nodes.IP_Address, Nodes.LastBoot
) q1
INNER JOIN
(SELECT
Nodes.NodeID AS NodeID
,ltrim(rtrim(Caption)) Caption
,VendorIcon
,Ip_Address
,DateDiff(hour,Nodes.LastBoot,GetDate()) AS HoursUp
,CONVERT(VARCHAR(40), DATEDIFF(minute, Nodes.LastBoot, GETDATE())/(24*60))
+ ' days, '
+ CONVERT(VARCHAR(40), DATEDIFF(minute, Nodes.LastBoot, GETDATE())%(24*60)/60)
+ ' hours, and '
+ CONVERT(VARCHAR(40), DATEDIFF(minute, Nodes.LastBoot, GETDATE())%60)
+ ' minutes.' AS TimeUp
FROM [Nodes]
Where
LastBoot between dateadd(day, -30, getdate()) and getdate()) q2 on q1.NodeID=q2.NodeID
Order by Caption
Is there another way to write this, or a way to optimize it?
I would recommend having a look at the query execution plan for your query.
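For a quick look (a small illustration, not part of the original answer): SSMS's "Include Actual Execution Plan" (Ctrl+M) shows the plan graphically, and these SET options print per-table I/O and timing so you can see which join is the expensive one:
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
-- run the outage query here, then check the Messages tab for logical reads per table
SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;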
Is there an interactive way for the user to input the date and time interval?
You could just determine the values before you run the query and use these parameters in your query (I'm not sure what's calling your query though, is it a stored procedure?)
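A hedged sketch of that idea: declare the interval once (or accept it as stored-procedure parameters) and reference the variables instead of hard-coding the GETDATE() arithmetic. The variable names here are made up for illustration.
DECLARE @StartDate DATETIME, @EndDate DATETIME;
SET @StartDate = DATEADD(MONTH, -1, GETDATE());  -- or whatever the user supplies
SET @EndDate   = GETDATE();
-- then in the query:
--   StartTime.EventTime BETWEEN @StartDate AND @EndDate
--   Nodes.LastBoot      BETWEEN @StartDate AND @EndDate
Wrapped in a stored procedure, @StartDate and @EndDate simply become the parameters the caller passes in.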
I have a SQL Server view that shows an overview of account statements. First we calculate the latest closing balances of the user accounts, so we know the most recent balance for each account; this is the LATEST_CB_DATES part.
Then we calculate the next business days, i.e. the next two days on which we expect to receive a balance in the database. This happens in NEXT_B_DAYS.
Finally we determine whether the account is expecting a closing balance, has received one, or received one too late. Note that we use a reception window end for this.
IF EXISTS (SELECT TABLE_NAME FROM INFORMATION_SCHEMA.VIEWS
WHERE TABLE_NAME = 'VIEW_AS_AS_ACCT_STAT')
DROP VIEW VIEW_AS_AS_ACCT_STAT
GO
CREATE VIEW VIEW_AS_AS_ACCT_STAT AS
WITH LATEST_CB_DATES AS (
SELECT * FROM (
SELECT row_number() over (partition by SD_ACCT.ID order by (AS_ACCT_STAT.CBAL_BAL_DATE) DESC) RN,
       SD_ACCT.ID, SD_ACCT.ACCT_NBR, AS_ACCT_STAT.CBAL_BAL_DATE AS BAL_DATE, SD_ACCT.CODE, SD_ACCT.CCY,
       SD_ACCT_GRP.ID AS GRP_ID, SD_ACCT_GRP.CODE AS ACCT_GRP_CODE, SD_ACCT.DATA_OWNER_ID,
       AS_ACCT_STAT.STATIC_DATA_BNK AS BANK_CODE, AS_ACCT_STAT.STATIC_DATA_HLD AS HOLDER_CODE
FROM SD_ACCT
LEFT JOIN AS_ACCT on SD_ACCT.ID = AS_ACCT.STATIC_DATA_ACCT_ID
LEFT JOIN AS_ACCT_STAT on AS_ACCT.ID = AS_ACCT_STAT.ACCT_ID
JOIN SD_ACCT_GRP_MEMBER ON SD_ACCT.ID = SD_ACCT_GRP_MEMBER.ACCT_ID
JOIN SD_ACCT_GRP on SD_ACCT_GRP_MEMBER.GRP_ID = SD_ACCT_GRP.ID
JOIN SD_ACCT_GRP_ROLE on SD_ACCT_GRP_ROLE.ID = SD_ACCT_GRP.ROLE_ID
WHERE SD_ACCT_GRP_ROLE.CODE = 'AccountStatementsToReceive' AND (AS_ACCT_STAT.VALID = 1 OR AS_ACCT_STAT.VALID IS NULL)
) LST_STMT
WHERE RN = 1
),
NEXT_B_DAYS AS (
SELECT VIEW_BUSINESS_DATES.CAL_ID, VIEW_BUSINESS_DATES.BUSINESS_DATE,
LEAD(VIEW_BUSINESS_DATES.BUSINESS_DATE, 1) OVER (PARTITION BY VIEW_BUSINESS_DATES.CAL_CODE ORDER BY VIEW_BUSINESS_DATES.BUSINESS_DATE) AS NEXT_BUSINESS_DATE,
LEAD(VIEW_BUSINESS_DATES.BUSINESS_DATE, 2) OVER (PARTITION BY VIEW_BUSINESS_DATES.CAL_CODE ORDER BY VIEW_BUSINESS_DATES.BUSINESS_DATE) AS SECOND_BUSINESS_DATE
FROM VIEW_BUSINESS_DATES
)
SELECT LATEST_CB_DATES.ID AS ACCT_ID,
LATEST_CB_DATES.CODE AS ACCT_CODE,
LATEST_CB_DATES.ACCT_NBR,
LATEST_CB_DATES.CCY AS ACCT_CCY,
LATEST_CB_DATES.BAL_DATE AS LATEST_CLOSING_BAL_DATE,
LATEST_CB_DATES.DATA_OWNER_ID,
LATEST_CB_DATES.BANK_CODE,
LATEST_CB_DATES.HOLDER_CODE,
LATEST_CB_DATES.ACCT_GRP_CODE,
CASE
WHEN LATEST_CB_DATES.BAL_DATE IS NULL THEN 'Expecting'
WHEN NEXT_B_DAYS.NEXT_BUSINESS_DATE IS NULL OR NEXT_B_DAYS.SECOND_BUSINESS_DATE IS NULL THEN 'Late'
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NOT NULL AND GETDATE() >= TODATETIMEOFFSET(CAST(NEXT_B_DAYS.SECOND_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) AS DATETIME), SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET) THEN 'Late'
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NULL AND GETDATE() >= TODATETIMEOFFSET(CAST(NEXT_B_DAYS.SECOND_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) AS DATETIME), SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET) AND CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) >= CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) THEN 'Expecting'
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NULL AND GETDATE() >= TODATETIMEOFFSET(CAST(NEXT_B_DAYS.NEXT_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) AS DATETIME), SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET) AND CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) < CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_START AS TIME) THEN 'Expecting' -- overnight
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NULL AND CAST (GETDATE() AS DATE) > NEXT_B_DAYS.SECOND_BUSINESS_DATE THEN 'Expecting'
ELSE 'Received'
END AS STAT,
CASE
WHEN LATEST_CB_DATES.BAL_DATE IS NULL THEN NULL
WHEN NEXT_B_DAYS.NEXT_BUSINESS_DATE IS NULL OR NEXT_B_DAYS.SECOND_BUSINESS_DATE IS NULL THEN NULL
WHEN AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END IS NOT NULL THEN CAST(NEXT_B_DAYS.SECOND_BUSINESS_DATE AS DATETIME) + CAST(CAST(AS_AS_RECEPTION_CONF.RECEPTION_WINDOW_END AS TIME) AS DATETIME)
ELSE NULL
END AS DEADLINE,
SEC_TIMEZONE.UTC_TIME_TOTAL_OFFSET AS TIME_ZONE
FROM AS_AS_RECEPTION_CONF
JOIN LATEST_CB_DATES ON AS_AS_RECEPTION_CONF.ACCT_GRP_ID = LATEST_CB_DATES.GRP_ID
JOIN SEC_TIMEZONE ON SEC_TIMEZONE.ID = AS_AS_RECEPTION_CONF.TIME_ZONE_ID
LEFT JOIN NEXT_B_DAYS ON AS_AS_RECEPTION_CONF.CALENDAR_ID = NEXT_B_DAYS.CAL_ID AND LATEST_CB_DATES.BAL_DATE = NEXT_B_DAYS.BUSINESS_DATE
GO
SELECT * FROM VIEW_AS_AS_ACCT_STAT
What is the issue? Nothing, the view works fine, but it's slow. We created a graphical report to display the data for our customers, and it takes 1 minute 30 seconds to run this SQL with 5,000 accounts, which is too slow.
I suspect the reason is the last join, but I didn't manage to refactor it well:
LEFT JOIN NEXT_B_DAYS ON AS_AS_RECEPTION_CONF.CALENDAR_ID =
NEXT_B_DAYS.CAL_ID AND LATEST_CB_DATES.BAL_DATE =
NEXT_B_DAYS.BUSINESS_DATE
The execution plan of my SQL can be found here.
How can I refactor this to make my view still work but much more performant?
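One hedged suggestion, not a verified fix: the ROW_NUMBER() in LATEST_CB_DATES and the LEAD() scan behind NEXT_B_DAYS both need supporting indexes, otherwise they sort their whole tables every time the view is queried. The index names below are invented, and the second statement assumes VIEW_BUSINESS_DATES is backed by a plain table (called BUSINESS_DATES here) that you are allowed to index.
CREATE INDEX IX_AS_ACCT_STAT_ACCT_BALDATE
    ON AS_ACCT_STAT (ACCT_ID, CBAL_BAL_DATE DESC)
    INCLUDE (VALID, STATIC_DATA_BNK, STATIC_DATA_HLD);
CREATE INDEX IX_BUSINESS_DATES_CAL_DATE
    ON BUSINESS_DATES (CAL_ID, BUSINESS_DATE);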
I created a SQL query to summarize some data. It is slow, so I thought I'd ask for some help.
The table is a log table with these columns:
loc, tag, entrytime, exittime, visits, entrywt, exitwt
My test log has 700,000 records in it. The entrytime and exittime columns hold epoch values.
I know my query is inefficient as it rips through the table 4 times.
select
loc, edate, tag,
(select COUNT(*) from mylog as ml
where mvlog.loc = ml.loc
and mvlog.edate = CONVERT(date, DATEADD(ss, ml.entrytime, '19700101'))
and mvlog.tag = ml.tag) as visits,
(select SUM(entrywt - exitwt) from mylog as ml2
where mvlog.loc = ml2.loc
and mvlog.edate = CONVERT(date, DATEADD(ss, ml2.entrytime, '19700101'))
and mvlog.tag = ml2.tag) as consumed,
(select SUM(exittime - entrytime) from mylog as ml3
where mvlog.loc = ml3.loc
and mvlog.edate = CONVERT(date, DATEADD(ss, ml3.entrytime, '19700101'))
and mvlog.tag = ml3.tag) as occupancy
from
eventlogV as mvlog with (INDEX(pt_index))
Index pt_index is made up of columns tag and loc.
When I run this query, it completes in roughly 30 seconds. Since my query is inefficient, I am sure it can be better.
Any ideas appreciated.
Seems like you can just LEFT JOIN mylog to eventlogV once and get the same results.
SELECT mvlog.loc,
mvlog.edate,
mvlog.tag,
COUNT(ml.loc) AS visits,
SUM(entrywt - exitwt) AS consumed,
SUM(exittime - entrytime) AS occupancy
FROM eventlogV AS mvlog
LEFT OUTER JOIN mylog ml ON mvlog.loc = ml.loc
AND mvlog.edate = CONVERT(DATE,DATEADD(ss,ml.entrytime,'19700101'))
AND mvlog.tag = ml.tag
GROUP BY mvlog.loc,
mvlog.edate,
mvlog.tag
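A hedged addition to the answer above: both the original correlated version and this LEFT JOIN still convert entrytime to a date for every row, which prevents index seeks on that expression. If mylog is a permanent table, you could materialize the date once as a persisted computed column and index it; the column and index names below are invented, and CONVERT is given an explicit style so the expression stays deterministic.
ALTER TABLE mylog
    ADD entry_date AS CONVERT(date, DATEADD(ss, entrytime, CONVERT(datetime, '19700101', 112))) PERSISTED;
CREATE INDEX IX_mylog_loc_tag_entrydate
    ON mylog (loc, tag, entry_date)
    INCLUDE (entrywt, exitwt, entrytime, exittime);
The join condition can then compare mvlog.edate = ml.entry_date directly.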
I have this query:
SELECT `s`.`time` , SUM( s.love ) AS total_love, SUM( s.sad ) AS total_sad, SUM( s.angry ) AS total_angry, SUM( s.happy ) AS total_happy
FROM (`employee_workshift` AS e)
JOIN `workshift` AS w ON `e`.`workshift_uuid` = `w`.`uuid`
JOIN `shift_summary` AS s ON `w`.`uuid` = `s`.`workshift_uuid`
WHERE `s`.`location_uuid` = '81956feb-3fd7-0e84-e9fe-b640434dfad0'
AND `e`.`employee_uuid` = '3866a979-bc5e-56cb-cede-863afc47b8b5'
AND `s`.`workshift_uuid` = '8c9dbd85-18a3-6ca9-e3f3-06eb602b6f38'
AND `s`.`time` >= CAST( '18:00:00' AS TIME )
AND `s`.`time` <= CAST( '00:00:00' AS TIME )
AND `s`.`date` LIKE '%2014-03%'
My problem is that it returns NULL, but when I change the end time to "23:59:59" it returns the right data. My idea is to pull the hour from both 'start_time' and 'end_time' and then loop to get every hour between them:
$time_start = 15;
$time_end = 03;
So it should produce: 15,16,17,18,19,20,21,22,23,00,01,02,03
Then I'll compare them all. But that would take many more lines and much more effort than simply using BETWEEN. Or should I just use in_array? Have you encountered this? I hope someone can help. Thanks.
19:00 is certainly bigger than 00:00, so your approach cannot work.
Try using the full timestamp (including the date) to get the data you need.
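A hedged sketch of that suggestion (column names follow the question; the literal dates are placeholders for whatever range the user picks): build a full datetime from the stored date and time columns, so a window that starts at 18:00 and ends at midnight the next day compares correctly.
AND TIMESTAMP(`s`.`date`, `s`.`time`) >= '2014-03-05 18:00:00'
AND TIMESTAMP(`s`.`date`, `s`.`time`) <  '2014-03-06 00:00:00'
These two lines would replace the two time comparisons and the LIKE on s.date.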
Try this query. I don't know your data structure, so check the INNER JOIN between the s and s1 tables: it must match one row to one row, differing only in date, with the s1 row's date one day earlier than the s row's date.
SELECT s.time , SUM( s.love ) AS total_love, SUM( s.sad ) AS total_sad, SUM( s.angry ) AS total_angry, SUM( s.happy ) AS total_happy
FROM (employee_workshift AS e)
JOIN workshift AS w ON e.workshift_uuid = w.uuid
JOIN shift_summary AS s ON w.uuid = s.workshift_uuid
JOIN shift_summary AS s1 ON (w.uuid = s1.workshift_uuid AND CAST(s.date AS DATE) = CAST(s1.date AS DATE) + INTERVAL 1 DAY)
WHERE s.location_uuid = '81956feb-3fd7-0e84-e9fe-b640434dfad0'
AND e.employee_uuid = '3866a979-bc5e-56cb-cede-863afc47b8b5'
AND s.workshift_uuid = '8c9dbd85-18a3-6ca9-e3f3-06eb602b6f38'
AND s1.time >= CAST( '18:00:00' AS TIME )
AND s.time <= CAST( '00:00:00' AS TIME )
AND s.date LIKE '%2014-03%'
I have a monster query that I'm running against a SQL SERVER 2005 database that is acting very strange. I have two conditions in the WHERE clause of the outermost select, comparing a field to a constant date. When the constant dates are either identical (down to the second) or their date parts are not equal, the query runs in under 2 seconds. When the date parts are the same but the time parts are different, the query takes around 7 minutes to complete. Specifically, having a WHERE clause of
WHERE
d.date >= '2011-11-07 00:00:00' AND
d.date <= '2011-11-08 11:59:59'
works well and as expected. Changing the WHERE clause to
WHERE
d.date >= '2011-11-07 00:00:00' AND
d.date <= '2011-11-07 11:59:59'
causes the query to take many minutes.
I also noticed that when I turned off the index on the Agent_Hours table, the bad case (with the date parts the same) drops to 25 seconds, still far longer than when the dates differ, but not by as much.
Below is the full query for reference (the WHERE clause in question is at the very end):
SELECT
s.transaction_id AS 'transaction',
s.created_on AS transaction_date,
s.first_name + ' ' + s.Last_Name AS customer_name,
a.name AS agent_name,
a.phantom AS phantom,
a.team AS agent_team,
a.id AS agent_number,
h.hours,
h2.hours_today,
d.*
FROM
(SELECT
agents.first_name + ' ' + agents.last_name AS name,
agents.id AS id,
agents.phantom AS phantom,
transient.value AS team,
transient.start_date AS team_start_date,
transient.end_date AS team_end_date
FROM
Agents.dbo.Agent_Static AS agents
JOIN
Agents.dbo.Agent_Transient AS transient
ON transient.agent = agents.id
WHERE
transient.field = 'team') AS a
LEFT JOIN Agents.dbo.Agent_Daily AS d
ON d.agent = a.id
LEFT JOIN (SELECT
agent_hours.agent AS agent,
dates.date AS date,
CAST(COUNT(*) AS FLOAT) / 4 AS hours
FROM
Agents.dbo.Agent_Hours AS agent_hours
JOIN
(SELECT
DISTINCT CONVERT(
VARCHAR(10),
hour_worked,
101)
AS date
FROM
Agents.dbo.Agent_Hours) AS dates
ON dates.date = CONVERT(
VARCHAR(10),
agent_hours.hour_worked,
101)
WHERE
(status = 'Phone' OR
status = 'Meeting')
GROUP BY
agent_hours.agent,
dates.date) AS h
ON h.agent = a.id AND
h.date = d.date
LEFT JOIN (SELECT
agent_hours.agent AS agent,
dates.date AS date,
CAST(COUNT(*) AS FLOAT) / 4 AS hours_today
FROM
Agents.dbo.Agent_Hours AS agent_hours
JOIN
(SELECT
DISTINCT CONVERT(
VARCHAR(10),
hour_worked,
101)
AS date
FROM
Agents.dbo.Agent_Hours) AS dates
ON dates.date = CONVERT(
VARCHAR(10),
agent_hours.hour_worked,
101)
WHERE
(status = 'Phone' OR
status = 'Meeting') AND
CONVERT(
VARCHAR(10),
CAST('11/09/2011 13:01' AS DATETIME),
101) = CONVERT(
VARCHAR(10),
agent_hours.hour_worked,
101) AND
CONVERT(
VARCHAR(10),
CAST('11/09/2011 13:01' AS DATETIME),
114) > CONVERT(
VARCHAR(10),
agent_hours.hour_worked,
114)
GROUP BY
agent_hours.agent,
dates.date) AS h2
ON h2.agent = a.id AND
h2.date = d.date
LEFT JOIN sale_transactions AS s
ON a.id = s.agent_hermes_id AND
s.created_on >= a.team_start_date AND
s.created_on <= a.team_end_date AND
CONVERT(
VARCHAR(10),
d.date,
101) = CONVERT(
VARCHAR(10),
s.created_on,
101)
LEFT JOIN sold_phrases AS p
ON s.Transaction_ID = p.transaction_id
WHERE
d.date >= '2011-11-07 00:00:00' AND
d.date <= '2011-11-07 11:59:59'
As a general rule, always post your exact table definition, including all indexes, when asking about performance problems in SQL.
I cannot see any difference between the two cases, but considering your explanation, this is what likely happens: the cardinality estimates for the date range may trigger the index tipping point and you get wildly different execution plans. Such issues are best addressed by using plan guides, see Optimizing Queries in Deployed Applications by Using Plan Guides. You should be able to confirm if the problem is indeed the plan, see Displaying Graphical Execution Plans (SQL Server Management Studio).
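A small illustration of that confirmation step (not taken from the linked articles): capture the plan for each variant of the date filter and compare the estimated versus actual row counts on the Agent_Daily and Agent_Hours operators.
SET STATISTICS XML ON;  -- or use "Include Actual Execution Plan" (Ctrl+M) in SSMS
-- run the query once with d.date <= '2011-11-08 11:59:59'
-- and once with d.date <= '2011-11-07 11:59:59'
SET STATISTICS XML OFF;
If the two runs show very different join strategies (for example a nested-loops plan driven by a tiny row estimate), the plan really is the problem.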
This is maybe a micro-optimization, but have you considered changing the way you get the date part of a datetime to DATEADD(dd, 0, DATEDIFF(dd, 0, datetime_value))? It's usually faster than the CONVERT function.
SELECT
s.transaction_id AS 'transaction',
s.created_on AS transaction_date,
s.first_name + ' ' + s.Last_Name AS customer_name,
a.name AS agent_name,
a.phantom AS phantom,
a.team AS agent_team,
a.id AS agent_number,
h.hours,
h2.hours_today,
d.*
FROM (SELECT
agents.first_name + ' ' + agents.last_name AS name,
agents.id AS id,
agents.phantom AS phantom,
transient.value AS team,
transient.start_date AS team_start_date,
transient.end_date AS team_end_date
FROM
Agents.dbo.Agent_Static AS agents
JOIN
Agents.dbo.Agent_Transient AS transient
ON transient.agent = agents.id
WHERE
transient.field = 'team'
) AS a
LEFT JOIN Agents.dbo.Agent_Daily AS d ON d.agent = a.id
LEFT JOIN (
SELECT
agent_hours.agent AS agent,
dates.date AS date,
COUNT(*) / 4.0 AS hours
FROM Agents.dbo.Agent_Hours AS agent_hours
JOIN (
SELECT DATEADD(dd, 0, DATEDIFF(dd, 0, hour_worked)) as date
FROM Agents.dbo.Agent_Hours GROUP BY DATEADD(dd, 0, DATEDIFF(dd, 0, hour_worked))
) AS dates ON dates.date = DATEADD(dd, 0, DATEDIFF(dd, 0, agent_hours.hour_worked))
WHERE (status = 'Phone' OR status = 'Meeting')
GROUP BY agent_hours.agent, dates.date
) AS h ON h.agent = a.id AND h.date = d.date
LEFT JOIN (
SELECT
agent_hours.agent AS agent,
dates.date AS date,
COUNT(*) / 4.0 AS hours_today
FROM Agents.dbo.Agent_Hours AS agent_hours
JOIN (
SELECT DATEADD(dd, 0, DATEDIFF(dd, 0, hour_worked)) as date
FROM Agents.dbo.Agent_Hours GROUP BY DATEADD(dd, 0, DATEDIFF(dd, 0, hour_worked))
) AS dates ON dates.date = DATEADD(dd, 0, DATEDIFF(dd, 0, agent_hours.hour_worked))
WHERE
(status = 'Phone' OR status = 'Meeting') AND
agent_hours.hour_worked >=
DATEADD(dd, 0, DATEDIFF(dd, 0, CAST('11/09/2011 13:01' AS DATETIME)))
AND
agent_hours.hour_worked <
CAST('11/09/2011 13:01' AS DATETIME)
GROUP BY agent_hours.agent, dates.date
) AS h2 ON h2.agent = a.id AND h2.date = d.date
LEFT JOIN sale_transactions AS s
ON a.id = s.agent_hermes_id AND
s.created_on >= a.team_start_date AND
s.created_on <= a.team_end_date AND
DATEADD(dd, 0, DATEDIFF(dd, 0, d.date))
=
DATEADD(dd, 0, DATEDIFF(dd, 0, s.created_on))
LEFT JOIN sold_phrases AS p
ON s.Transaction_ID = p.transaction_id
WHERE
d.date >= '2011-11-07 00:00:00' AND
d.date <= '2011-11-07 11:59:59'
More important (as Remus Rusanu already wrote) are the indexes. Execute both queries, check which indexes the faster query uses, and force SQL Server to always use them. You can do that with WITH (INDEX(index_name)).
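A hedged illustration of that hint, using the join from the question; the index name is hypothetical, so substitute whichever index the fast plan actually uses:
LEFT JOIN Agents.dbo.Agent_Daily AS d WITH (INDEX (IX_Agent_Daily_agent_date))
    ON d.agent = a.id
Bear in mind a forced index is a workaround: if the data distribution changes, the hint can pin you to a bad plan.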
I have inherited a stored procedure and am having problems with it: it takes a very long time to run (around 3 minutes). I have played around with it, and without the WHERE clause it takes only 12 seconds. None of the tables it references have much data in them. Can anybody see any reason why adding the main WHERE clause below makes it take so much longer?
ALTER Procedure [dbo].[MissingReadingsReport] @SiteID INT,
@FormID INT,
@StartDate Varchar(8),
@EndDate Varchar(8)
As
If @EndDate > GetDate()
Set @EndDate = Convert(Varchar(8), GetDate(), 112)
Select Dt.FormID,
DT.FormDAte,
DT.Frequency,
Dt.DayOfWeek,
DT.NumberOfRecords,
Dt.FormName,
dt.OrgDesc,
Dt.CDesc
FROM (Select MeterForms.FormID,
MeterForms.FormName,
MeterForms.SiteID,
MeterForms.Frequency,
DateTable.FormDate,
tblOrganisation.OrgDesc,
CDesc = ( COMPANY.OrgDesc ),
DayOfWeek = CASE Frequency
WHEN 'Day' THEN DatePart(dw, DateTable.FormDate)
WHEN 'WEEK' THEN
DatePart(dw, MeterForms.FormDate)
END,
NumberOfRecords = CASE Frequency
WHEN 'Day' THEN (Select TOP 1 RecordID
FROM MeterReadings
Where
MeterReadings.FormDate =
DateTable.FormDate
And MeterReadings.FormID =
MeterForms.FormID
Order By RecordID DESC)
WHEN 'WEEK' THEN (Select TOP 1 ( FormDate )
FROM MeterReadings
Where
MeterReadings.FormDate >=
DateAdd(d
, -4,
DateTable.FormDate)
And MeterReadings.FormDate
<=
DateAdd(d, 3,
DateTable.FormDate)
AND MeterReadings.FormID =
MeterForms.FormID)
END
FROM MeterForms
INNER JOIN DateTable
ON MeterForms.FormDate <= DateTable.FormDate
INNER JOIN tblOrganisation
ON MeterForms.SiteID = tblOrganisation.pkOrgId
INNER JOIN tblOrganisation COMPANY
ON tblOrganisation.fkOrgID = COMPANY.pkOrgID
/*this is what makes the query run slowly*/
Where DateTable.FormDate >= @StartDate
AND DateTable.FormDate <= @EndDate
AND MeterForms.SiteID = ISNULL(@SiteID, MeterForms.SiteID)
AND MeterForms.FormID = IsNull(@FormID, MeterForms.FormID)
AND MeterForms.FormID > 0)DT
Where ( Frequency = 'Day'
And dt.NumberofRecords IS NULL )
OR ( ( Frequency = 'Week'
AND DayOfWeek = DATEPART (dw, Dt.FormDate) )
AND ( FormDate <> NumberOfRecords
OR dt.NumberofRecords IS NULL ) )
Order By FormID
Based on what you've already mentioned, it looks like the tables are properly indexed for the columns in the join conditions but not for the columns in the WHERE clause.
If you're not willing to change the query, it may be worth looking into indexes on the WHERE-clause columns, especially the ones involved in the NULL checks.
Try replacing your select with this:
FROM
(select siteid, formid, formdate from meterforms
where siteid = isnull(@SiteID, siteid) and
meterforms.formid = isnull(@FormID, formid) and formid > 0
) MeterForms
INNER JOIN
(select formdate from datetable where formdate >= @StartDate and formdate <= @EndDate) DateTable
ON MeterForms.FormDate <= DateTable.FormDate
INNER JOIN tblOrganisation
ON MeterForms.SiteID = tblOrganisation.pkOrgId
INNER JOIN tblOrganisation COMPANY
ON tblOrganisation.fkOrgID = COMPANY.pkOrgID
/*this is what makes the query run slowly*/
)DT
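And a hedged sketch of the index suggestion at the top of this answer; the index names are invented and the exact key columns should be confirmed against the execution plan:
CREATE INDEX IX_DateTable_FormDate ON DateTable (FormDate);
CREATE INDEX IX_MeterForms_SiteID_FormID ON MeterForms (SiteID, FormID) INCLUDE (FormDate, FormName, Frequency);
CREATE INDEX IX_MeterReadings_FormID_FormDate ON MeterReadings (FormID, FormDate) INCLUDE (RecordID);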
I would be willing to bet that if you moved the MeterForms where clauses up into the FROM statement:
FROM (select [columns] from MeterForms WHERE SiteID= ISNULL [etc] ) MF
INNER JOIN [etc]
it would be faster, as the filtering would occur before the join. Also, the INNER JOIN on DateTable uses a <=, so the date range in your where clause may be returning more rows than you'd like; try moving that range up into a subselect as well.
Have you run an execution plan on this yet to see where the bottleneck is?
Random suggestion, coming from an Oracle background:
What happens if you rewrite the following:
AND MeterForms.SiteID = ISNULL(@SiteID, MeterForms.SiteID)
AND MeterForms.FormID = IsNull(@FormID, MeterForms.FormID)
...to
AND (@SiteID is null or MeterForms.SiteID = @SiteID)
AND (@FormID is null or MeterForms.FormID = @FormID)
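As a hedged follow-up, not part of the original suggestion: with either the ISNULL form or the OR rewrite this is a catch-all query, so a plan compiled for one parameter combination can be poor for another. On recent versions of SQL Server, adding OPTION (RECOMPILE) to the statement lets the optimizer use the actual parameter values on every execution; it goes at the very end of the procedure's SELECT:
Order By FormID
OPTION (RECOMPILE);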