SQL Query in CRM Report

A "Case" in CRM has a field called "Status" with four options.
I'm trying to build a report in CRM that fills a table with every week of the year (each row is a different week) and then counts the number of cases that have each Status option (the columns would be the Status options).
The table would look like this:
        Status 1   Status 2   Status 3
Week 1      3         55          4
Week 2      5         23          5
Week 3     14         11         33
So far I have the following:
SELECT
SUM(case WHEN status = 1 then 1 else 0 end) Status1,
SUM(case WHEN status = 2 then 1 else 0 end) Status2,
SUM(case WHEN status = 3 then 1 else 0 end) Status3,
SUM(case WHEN status = 4 then 1 else 0 end) Status4,
SUM(case WHEN status = 5 then 1 else 0 end) Status5
FROM [DB].[dbo].[Contact]
Which gives me the following:
Status 1   Status 2   Status 3
    2         43         53
Now I need to somehow split this into 52 rows for the past year and filter these results by date (columns in the Contact table). I'm a bit new to SQL queries and CRM - any help here would be much appreciated.
Here is a SQLFiddle with my progress and sample data: http://sqlfiddle.com/#!2/85b19/1

Sounds like you want to group by a range. The trick is to create a new field that represents each range (for you, one per week) and group by that.
Since it also seems like you want to cover an open-ended range of dates, marc_s has a good summary of how to do the group-by trick with dates in a generic way: SQL group by frequency within a date range
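For example, building directly on your query, a minimal sketch of that idea might look like the following (assuming the Contact table has a CreatedOn datetime column to bucket on; substitute whichever date column you actually filter by):
-- Group the existing conditional sums by a computed week bucket
-- (DATEPART(week, ...) numbers the weeks within each calendar year).
SELECT
    DATEPART(year, CreatedOn) AS CaseYear,
    DATEPART(week, CreatedOn) AS CaseWeek,
    SUM(CASE WHEN status = 1 THEN 1 ELSE 0 END) AS Status1,
    SUM(CASE WHEN status = 2 THEN 1 ELSE 0 END) AS Status2,
    SUM(CASE WHEN status = 3 THEN 1 ELSE 0 END) AS Status3,
    SUM(CASE WHEN status = 4 THEN 1 ELSE 0 END) AS Status4
FROM [DB].[dbo].[Contact]
WHERE CreatedOn >= DATEADD(year, -1, GETUTCDATE())
GROUP BY DATEPART(year, CreatedOn), DATEPART(week, CreatedOn)
ORDER BY CaseYear, CaseWeek;
Note this only returns weeks that actually contain cases; the calendar-table approach below also covers the empty weeks.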

So, let's break this down:
You want to make a report that shows, for each contact, a week-by-week breakdown of the number of cases registered to that contact, divided into three columns, one per StateCode.
If that's the case, then you need roughly 52 date records for each contact. For calendar-like requests it's always good to have a separate calendar table you can query against. Dan Guzman has a blog entry with a script that creates a useful calendar table, which I'll use in the queries below.
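For reference, the queries below only rely on a handful of columns from that calendar table. A minimal sketch of the shape they assume (Dan Guzman's script creates a much richer table; any calendar table exposing at least these columns would work):
-- Hypothetical minimal calendar table used by the queries below.
CREATE TABLE master.dbo.Calendar (
    CalendarDate    DATE NOT NULL PRIMARY KEY,  -- one row per day
    CalendarYear    INT  NOT NULL,
    FirstDateOfWeek DATE NOT NULL,              -- first day of that day's week
    LastDateOfWeek  DATE NOT NULL
);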
WITH WeekNumbers AS
(
SELECT
FirstDateOfWeek,
-- order by first date of week, grouping calendar year to produce week numbers
WeekNumber = row_number() OVER (PARTITION BY CalendarYear ORDER BY FirstDateOfWeek)
FROM
master.dbo.Calendar -- created from script
GROUP BY
FirstDateOfWeek,
CalendarYear
), Calendar AS
(
SELECT
WeekNumber =
(
SELECT
WeekNumber
FROM
WeekNumbers WN
WHERE
C.FirstDateOfWeek = WN.FirstDateOfWeek
),
*
FROM
master.dbo.Calendar C
WHERE
CalendarDate BETWEEN '1/1/2012' AND getutcdate()
)
SELECT
C.FullName,
----include the below if the data is necessary
--Cl.WeekNumber,
--Cl.CalendarYear,
--Cl.FirstDateOfWeek,
--Cl.LastDateOfWeek,
'Week: ' + CAST(Cl.WeekNumber AS VARCHAR(20))
+ ', Year: ' + CAST(Cl.CalendarYear AS VARCHAR(20)) WeekNumber
FROM
CRM.dbo.Contact C
-- use a cartesian join to produce a table list
CROSS JOIN
(
SELECT
DISTINCT WeekNumber,
CalendarYear,
FirstDateOfWeek,
LastDateOfWeek
FROM
Calendar
) Cl
ORDER BY
C.FullName,
Cl.WeekNumber
This is different from the solution Ben linked to because Marc's query only returns weeks where there is a matching value, whereas you may or may not want to see even the weeks where there is no activity.
Once you have your core tables of contacts split out week by week as in the above (or altered for your specific time period), you can simply add a subquery for each StateCode to see the breakdown in columns as in the final query below.
WITH WeekNumbers AS
(
SELECT
FirstDateOfWeek,
WeekNumber = row_number() OVER (PARTITION BY CalendarYear ORDER BY FirstDateOfWeek)
FROM
master.dbo.Calendar
GROUP BY
FirstDateOfWeek,
CalendarYear
), Calendar AS
(
SELECT
WeekNumber =
(
SELECT
WeekNumber
FROM
WeekNumbers WN
WHERE
C.FirstDateOfWeek = WN.FirstDateOfWeek
),
*
FROM
master.dbo.Calendar C
WHERE
CalendarDate BETWEEN '1/1/2012' AND getutcdate()
)
SELECT
C.FullName,
--Cl.WeekNumber,
--Cl.CalendarYear,
--Cl.FirstDateOfWeek,
--Cl.LastDateOfWeek,
'Week: ' + CAST(Cl.WeekNumber AS VARCHAR(20)) +', Year: ' + CAST(Cl.CalendarYear AS VARCHAR(20)) WeekNumber,
(
SELECT
count(*)
FROM
CRM.dbo.Incident I
INNER JOIN CRM.dbo.StringMap SM ON
I.StateCode = SM.AttributeValue
INNER JOIN
(
SELECT
DISTINCT ME.Name,
ME.ObjectTypeCode
FROM
CRM.MetadataSchema.Entity ME
) E ON
SM.ObjectTypeCode = E.ObjectTypeCode
WHERE
I.ModifiedOn >= Cl.FirstDateOfWeek
AND I.ModifiedOn < dateadd(day, 1, Cl.LastDateOfWeek)
AND E.Name = 'incident'
AND SM.AttributeName = 'statecode'
AND SM.LangId = 1033
AND I.CustomerId = C.ContactId
AND SM.Value = 'Active'
) ActiveCases,
(
SELECT
count(*)
FROM
CRM.dbo.Incident I
INNER JOIN CRM.dbo.StringMap SM ON
I.StateCode = SM.AttributeValue
INNER JOIN
(
SELECT
DISTINCT ME.Name,
ME.ObjectTypeCode
FROM
CRM.MetadataSchema.Entity ME
) E ON
SM.ObjectTypeCode = E.ObjectTypeCode
WHERE
I.ModifiedOn >= Cl.FirstDateOfWeek
AND I.ModifiedOn < dateadd(day, 1, Cl.LastDateOfWeek)
AND E.Name = 'incident'
AND SM.AttributeName = 'statecode'
AND SM.LangId = 1033
AND I.CustomerId = C.ContactId
AND SM.Value = 'Resolved'
) ResolvedCases,
(
SELECT
count(*)
FROM
CRM.dbo.Incident I
INNER JOIN CRM.dbo.StringMap SM ON
I.StateCode = SM.AttributeValue
INNER JOIN
(
SELECT
DISTINCT ME.Name,
ME.ObjectTypeCode
FROM
CRM.MetadataSchema.Entity ME
) E ON
SM.ObjectTypeCode = E.ObjectTypeCode
WHERE
I.ModifiedOn >= Cl.FirstDateOfWeek
AND I.ModifiedOn < dateadd(day, 1, Cl.LastDateOfWeek)
AND E.Name = 'incident'
AND SM.AttributeName = 'statecode'
AND SM.LangId = 1033
AND I.CustomerId = C.ContactId
AND SM.Value = 'Canceled'
) CancelledCases
FROM
CRM.dbo.Contact C
CROSS JOIN
(
SELECT
DISTINCT WeekNumber,
CalendarYear,
FirstDateOfWeek,
LastDateOfWeek
FROM
Calendar
) Cl
ORDER BY
C.FullName,
Cl.WeekNumber

SQL Case When Slowing Down Query

What I'm looking to do is quantify the total value of purchases, and the number of months in which a purchase was made, within three different timeframes by account. I only want to look at accounts that made a purchase between 1-1-2020 and 4-1-2021.
I'm wondering if there is a more streamlined way to pull in the fields I'm creating with CASE WHEN below (maybe through a series of queries to create the calculations and then left joining?). This query is taking extremely long to run, so I'd like to improve this code where I can. All of my code and desired output is listed below. Thank you!
Creating a temporary table to pull account numbers:
DROP TABLE IF EXISTS #accounts
SELECT DISTINCT s.account_no, c.code, c.code_desc
INTO #accounts
FROM sales AS s
LEFT JOIN customer AS c ON s.account_no = c.account_no
WHERE s.tran_date BETWEEN '2020-01-01' AND '2021-04-01'
GROUP BY s.account_no, c.code, c.code_desc;
Confirming row counts:
SELECT COUNT(*)
FROM #accounts;
Creating Sales and Sales period count columns for three timeframes:
SELECT
s.account_no, c.code, c.code_desc,
SUM(CASE
WHEN s.tran_date BETWEEN '2020-01-01' AND '2021-04-01'
THEN VALUE_USD
END) AS Total_Spend_Pre,
SUM(CASE
WHEN s.tran_date BETWEEN '2021-04-01' AND '2022-03-31'
THEN VALUE_USD
END) Total_Spend_During,
SUM(CASE
WHEN s.tran_date > '2022-04-01'
THEN VALUE_USD
END) Total_Spend_Post,
COUNT(DISTINCT CASE WHEN s.tran_date BETWEEN '2020-01-01' AND '2021-04-01' THEN CONCAT(s.bk_month, s.bk_year) END) Pre_Periods,
COUNT(DISTINCT CASE WHEN s.tran_date BETWEEN '2021-04-01' AND '2022-03-31' THEN CONCAT(s.bk_month, s.bk_year) END) During_Periods,
COUNT(DISTINCT CASE WHEN s.tran_date > '2022-04-01' THEN CONCAT(s.bk_month, s.bk_year) END) Post_Periods
FROM
sales AS s
LEFT JOIN
customer AS c ON s.account_no = c.account_no
WHERE
c.account_no IN (SELECT DISTINCT account_no
FROM #accounts)
GROUP BY
s.account_no, c.code, c.code_desc;
Desired output:
account_no  code  code_desc  Total_Spend_Pre  Total_Spend_During  Total_Spend_Post  Pre_Periods  During_Periods  Post_Periods
25          1234  OTHER      1000             2005                500               2            14              5
11          5678  PC         500              100                 2220              5            11              2
You may use your date ranges to join with your dataset and 'Tag' each row as below; this will produce three rows per group. If you need them in a single row, apply a PIVOT over it.
;With DateRanges AS (
SELECT CAST('2020-01-01' AS DATE) StartDate, CAST('2021-04-01' AS DATE) EndDate, 'Pre' Tag UNION
SELECT '2021-04-01', '2022-03-31', 'During' UNION
SELECT '2022-04-01', Null, 'Post'
)
SELECT s.account_no, c.code, c.code_desc, d.Tag,
SUM(VALUE_USD) AS Total_Spend,
COUNT(DISTINCT CONCAT(s.bk_month, s.bk_year)) RecordCount
FROM sales as s
LEFT JOIN customer as c
INNER JOIN DateRanges D ON s.tran_date BETWEEN D.StartDate AND ISNULL(D.EndDate,s.tran_date)
ON s.account_no = c.account_no
WHERE c.account_no IN (SELECT DISTINCT account_no FROM #accounts)
GROUP BY s.account_no, c.code, c.code_desc;
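If you do want one row per account rather than three tagged rows, one option (a sketch reusing the names above, and assuming the #accounts temp table from the question still exists) is to fold the Tag back in with conditional aggregation rather than PIVOT, since PIVOT only handles one aggregate at a time:
;WITH DateRanges AS (
    SELECT CAST('2020-01-01' AS DATE) StartDate, CAST('2021-04-01' AS DATE) EndDate, 'Pre' Tag UNION ALL
    SELECT '2021-04-01', '2022-03-31', 'During' UNION ALL
    SELECT '2022-04-01', NULL, 'Post'
)
SELECT s.account_no, c.code, c.code_desc,
    -- spend per timeframe, driven by the Tag instead of repeated date predicates
    SUM(CASE WHEN d.Tag = 'Pre'    THEN VALUE_USD END) AS Total_Spend_Pre,
    SUM(CASE WHEN d.Tag = 'During' THEN VALUE_USD END) AS Total_Spend_During,
    SUM(CASE WHEN d.Tag = 'Post'   THEN VALUE_USD END) AS Total_Spend_Post,
    -- distinct months with activity per timeframe
    COUNT(DISTINCT CASE WHEN d.Tag = 'Pre'    THEN CONCAT(s.bk_month, s.bk_year) END) AS Pre_Periods,
    COUNT(DISTINCT CASE WHEN d.Tag = 'During' THEN CONCAT(s.bk_month, s.bk_year) END) AS During_Periods,
    COUNT(DISTINCT CASE WHEN d.Tag = 'Post'   THEN CONCAT(s.bk_month, s.bk_year) END) AS Post_Periods
FROM sales AS s
INNER JOIN DateRanges d ON s.tran_date BETWEEN d.StartDate AND ISNULL(d.EndDate, s.tran_date)
LEFT JOIN customer AS c ON s.account_no = c.account_no
WHERE c.account_no IN (SELECT account_no FROM #accounts)
GROUP BY s.account_no, c.code, c.code_desc;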
with [cte_accountActivityPeriods] as (
select [PeriodOrdinal] = 1, [PeriodName] = 'Total Spend Pre', [PeriodStart] = convert(date,'2020-01-01',23) , [PeriodFinish] = convert(date,'2021-03-31',23) union
select [PeriodOrdinal] = 2, [PeriodName] = 'Total Spend During', [PeriodStart] = convert(date,'2021-04-01',23) , [PeriodFinish] = convert(date,'2022-03-31',23) union
select [PeriodOrdinal] = 3, [PeriodName] = 'Total Spend Post', [PeriodStart] = convert(date,'2022-04-01',23) , [PeriodFinish] = convert(date,'9999-12-31',23)
)
, [cte_allsalesForActivityPeriod] as (
SELECT s.account_no, bk_month, bk_year, [PeriodOrdinal], s.tran_date, s.value_usd
FROM sales as s
inner join [cte_accountActivityPeriods]
on s.[tran_date] between [cte_accountActivityPeriods].[PeriodStart] and [cte_accountActivityPeriods].[PeriodFinish]
)
, [cte_uniqueAccounts] as ( /*Unique and qualifying Accounts*/
select distinct accs.account_no from [cte_allsalesForActivityPeriod]
inner join #accounts accs on accs.[account_no] = [cte_allsalesForActivityPeriod].[account_no]
)
, [cte_AllSalesAggregatedByPeriod] as (
select account_no, [PeriodOrdinal], bk_month, bk_year, [PeriodTotalSpend] = sum([value_usd])
from [cte_allsalesForActivityPeriod]
group by account_no, [PeriodOrdinal], bk_month, bk_year
)
, [cte_PeriodAnalysis] as (
select account_no, [PeriodOrdinal], [ActivePeriods] = count(distinct concat(bk_month, bk_year))
from [cte_AllSalesAggregatedByPeriod]
group by account_no, [PeriodOrdinal]
)
, [cte_pivot_clumsily] as (
/* Aggregations already done - so simple pivot */
select [cte_uniqueAccounts].[account_no]
, [Total_Spend_Pre] = case when [SaleVal].[PeriodOrdinal] in (1) then [SaleVal].[PeriodTotalSpend] else 0 end
, [Total_Spend_During] = case when [SaleVal].[PeriodOrdinal] in (2) then [SaleVal].[PeriodTotalSpend] else 0 end
, [Total_Spend_Post] = case when [SaleVal].[PeriodOrdinal] in (3) then [SaleVal].[PeriodTotalSpend] else 0 end
, [Pre_Periods] = case when [SalePrd].[PeriodOrdinal] in (1) then [SalePrd].[ActivePeriods] else 0 end
, [During_Periods] = case when [SalePrd].[PeriodOrdinal] in (2) then [SalePrd].[ActivePeriods] else 0 end
, [Post_Periods] = case when [SalePrd].[PeriodOrdinal] in (3) then [SalePrd].[ActivePeriods] else 0 end
from [cte_uniqueAccounts]
left join [cte_AllSalesAggregatedByPeriod] [SaleVal] on [SaleVal].[account_no] = [cte_uniqueAccounts].[account_no]
left join [cte_PeriodAnalysis] [SalePrd] on [SalePrd].[account_no] = [cte_uniqueAccounts].[account_no]
)
select c.code, c.code_desc, [cte_pivot_clumsily].*
from [cte_pivot_clumsily]
LEFT JOIN customer as c
ON [cte_pivot_clumsily].account_no = c.account_no

SELECT list expression references column integration_start_date which is neither grouped nor aggregated at

I'm facing an issue with the following query. It gives me this error: [SELECT list expression references column integration_start_date which is neither grouped nor aggregated at [34:63]]. In particular, it points to the first 'when' in the result table, and I don't know how to fix it. This is on BigQuery, if that helps. As far as I can tell everything is written correctly, but I could be wrong. Seeking help.
with plan_data as (
select format_date("%Y-%m-%d",last_day(date(a.basis_date))) as invoice_date,
a.sponsor_id as sponsor_id,
b.company_name as sponsor_name,
REPLACE(SUBSTR(d.meta,STRPOS(d.meta,'merchant_id')+12,13),'"','') as merchant_id,
a.state as plan_state,
date(c.start_date) as plan_start_date,
a.employee_id as square_employee_id,
date(
(select min(date)
from glproductionview.stats_sponsors
where sponsor_id = a.sponsor_id and sponsor_payroll_provider_identifier = 'square' and date >= c.start_date) )
as integration_start_date,
count(distinct a.employee_id) as eligible_pts_count, --pts that are in active plan and have payroll activities (payroll deductions) in the reporting month
from glproductionview.payroll_activities as a
left join glproductionview.sponsors as b
on a.sponsor_id = b.id
left join glproductionview.dc_plans as c
on a.plan_id = c.id
left join glproductionview.payroll_connections as d
on a.sponsor_id = d.sponsor_id and d.provider_identifier = 'rocket' and a.company_id = d.payroll_id
where a.payroll_provider_identifier = 'rocket'
and format_date("%Y-%m",date(a.basis_date)) = '2021-07'
and a.amount_cents > 0
group by 1,2,3,4,5,6,7,8
order by 2 asc
)
select invoice_date,
sponsor_id,
sponsor_name,
eligible_pts_count,
case
when eligible_pts_count <= 5 and date_diff(current_date(),integration_start_date, month) <= 12 then 20
when eligible_pts_count <= 5 and date_diff(current_date(),integration_start_date, month) > 12 then 15
when eligible_pts_count > 5 and date_diff(current_date(),integration_start_date, month) <= 12 then count(distinct square_employee_id)*4
when eligible_pts_count > 5 and date_diff(current_date(),integration_start_date, month) > 12 then count(distinct square_employee_id)*3
else 0
end as fees
from plan_data
group by 1,2,3,4;
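For context: in BigQuery, every column referenced in the outer SELECT (including inside a CASE) must either appear in the GROUP BY or be wrapped in an aggregate, and integration_start_date is neither. One possible sketch of a fix, assuming integration_start_date is constant within each group so that wrapping it in MIN() does not change its value, is:
-- same plan_data CTE as above, with integration_start_date aggregated in the outer query
select invoice_date,
       sponsor_id,
       sponsor_name,
       eligible_pts_count,
       case
         when eligible_pts_count <= 5 and date_diff(current_date(), min(integration_start_date), month) <= 12 then 20
         when eligible_pts_count <= 5 and date_diff(current_date(), min(integration_start_date), month) > 12 then 15
         when eligible_pts_count > 5 and date_diff(current_date(), min(integration_start_date), month) <= 12 then count(distinct square_employee_id)*4
         when eligible_pts_count > 5 and date_diff(current_date(), min(integration_start_date), month) > 12 then count(distinct square_employee_id)*3
         else 0
       end as fees
from plan_data
group by 1,2,3,4;
Adding integration_start_date to the outer GROUP BY instead would also silence the error, at the cost of an extra grouping column.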

Fill in blank dates for rolling average - CTE in Snowflake

I have two tables – activity and purchase
Activity table:
user_id date videos_watched
1 2020-01-02 3
1 2020-01-04 5
1 2020-01-07 5
Purchase table:
user_id purchase_date
1 2020-01-01
2 2020-02-02
What I would like to do is get a 30-day rolling average, since purchase, of how many videos have been watched.
The base query is like this:
SELECT
DATEDIFF(DAY, p.purchase_date, a.date) AS day_since_purchase,
AVG(A.VIDEOS_VIEWED)
FROM PURCHASE P
LEFT OUTER JOIN ACTIVITY A ON P.USER_ID = A.USER_ID AND
A.DATE >= P.PURCHASE_DATE AND A.DATE <= DATEADD(DAY, 30, P.PURCHASE_DATE)
GROUP BY 1;
However, the Activity table only has records for each day a video has been logged. I would like to fill in the blanks for days a video has not been viewed.
I have started to look into using a CTE like this:
WITH cte AS (
SELECT date('2020-01-01') as fdate
UNION ALL
SELECT CAST(DATEADD(day,1,fdate) as date)
FROM cte
WHERE fdate < date('2020-04-01')
) select * from cte
cross join purchases p
left outer join activity a
on p.user id = a.user_id
and a.fdate = p.purchase_date
and a.date >= p.purchase_date and a.date <= dateadd(day, 30, p.purchase_date)
The end goal is to have something like this:
days_since_purchase videos_watched
1 3
2 0 --CTE coalesce inserted value
3 0
4 5
Been trying for the last couple of hours to get it right, but still can't really get the hang of it.
If you want to fill in the gaps in the result set, then I think you should be generating integers rather than dates:
WITH cte AS (
SELECT 1 as day_since_purchase
UNION ALL
SELECT 1 + day_since_purchase
FROM cte
WHERE day_since_purchase < 4
)
SELECT cte.day_since_purchase, COALESCE(avg_videos_viewed, 0)
FROM cte LEFT JOIN
(SELECT DATEDIFF(DAY, p.purchase_date, a.date) AS day_since_purchase,
AVG(A.VIDEOS_VIEWED) as avg_videos_viewed
FROM purchases p JOIN
activity a
ON p.user_id = a.user_id AND
a.fdate = p.purchase_date AND
a.date >= p.purchase_date AND
a.date <= dateadd(day, 30, p.purchase_date)
GROUP BY 1
) pa
ON pa.day_since_purchase = cte.day_since_purchase;
You can use a recursive query to generate the 30 days following each purchase, then bring the activity table:
with cte as (
select
purchase_date,
user_id,
0 days_since_purchase,
purchase_date dt
from purchases
union all
select
purchase_date,
user_id,
days_since_purchase + 1,
dateadd(day, days_since_purchase + 1, purchase_date)
from cte
where days_since_purchase < 30
)
select
c.days_since_purchase,
avg(coalesce(a.videos_watched, 0)) avg_videos_watched
from cte c
left join activity a
on a.user_id = c.user_id
and a.fdate = c.purchase_date
and a.date = c.dt
group by c.days_since_purchase
Your question is unclear on whether you have a column in the activity table that stores the purchase date each row relates to. Your query has a column fdate that does not appear in your sample data. I used that column in the query (without such a column, you might end up counting the same activity against different purchases).
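As a side note, if the recursive CTEs feel awkward, Snowflake can also build the day spine without recursion using its row generator. A minimal sketch, assuming the table and column names from the sample data (purchase.user_id, purchase.purchase_date, activity.user_id, activity.date, activity.videos_watched):
with day_spine as (
    -- 0..30 days since purchase, generated without recursion
    select row_number() over (order by seq4()) - 1 as days_since_purchase
    from table(generator(rowcount => 31))
)
select
    d.days_since_purchase,
    avg(coalesce(a.videos_watched, 0)) as avg_videos_watched
from purchase p
cross join day_spine d
left join activity a
    on a.user_id = p.user_id
   and a.date = dateadd(day, d.days_since_purchase, p.purchase_date)
group by d.days_since_purchase
order by d.days_since_purchase;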

SQL Server - Aggregate the number of errors per user per day

The application that I maintain stores user errors in a SQL Server table. When an error occurs it jots down the username of the person that caused it, the time, the error message, and some other housekeeping stuff.
I'm trying to build out a report where we could see the "top 3" errors each day for the past year - the three users with the most errors on each day, the three most common types of errors on each day, etc.
My goal is something like this:
DATE USER1 ERR_COUNT1 USER2 ERR_COUNT2 USER3 ERR_COUNT3
1/1/18 BOB 70 BILL 50 JOE 30
1/2/18 JILL 55 JOY 30 BOB 20
...
I've got a rough loop set up to pull this data from our error logs, but when I run it I get the error There is already an object named '#TempErrorLog' in the database. Loop code below:
DECLARE @StartDate AS DATE,
@EndDate AS DATE
SET @StartDate = '2018.1.1'
WHILE @StartDate <= CONVERT(DATE, GETDATE())
BEGIN
SET @EndDate = DATEADD(DAY, 1, @StartDate)
SELECT @StartDate AS date_err,
( u.name_frst+' '+u.name_lst ) AS user_err,
COUNT(e.id_err) AS count_err
INTO dbo.#TempErrLog
FROM err_log AS e
LEFT JOIN users AS u ON e.id_user = u.id_user
WHERE e.dttm_err >= @StartDate AND
e.dttm_err < @EndDate AND
e.id_user <> 'system'
GROUP BY ( u.name_frst+' '+u.name_lst )
ORDER BY count_err DESC;
SET @StartDate = DATEADD(DAY, 1, @StartDate)
CONTINUE
END
SELECT * FROM #TempErrLog
My guess is that it is trying to create a new temporary table each time the loop iterates. Is there a better approach I should be using here?
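Your guess is right: SELECT ... INTO creates #TempErrLog on its first pass, so the second iteration fails because the table already exists. The immediate fix (a sketch; column types here are assumptions) is to create the temp table once up front and INSERT into it inside the loop, though the set-based query below avoids the loop entirely:
-- create the temp table once, before the WHILE loop (types are assumed)
IF OBJECT_ID('tempdb..#TempErrLog') IS NOT NULL DROP TABLE #TempErrLog;
CREATE TABLE #TempErrLog (date_err DATE, user_err NVARCHAR(200), count_err INT);

-- then, inside the loop, replace the SELECT ... INTO with a plain INSERT:
INSERT INTO #TempErrLog (date_err, user_err, count_err)
SELECT @StartDate,
       (u.name_frst + ' ' + u.name_lst),
       COUNT(e.id_err)
FROM err_log AS e
LEFT JOIN users AS u ON e.id_user = u.id_user
WHERE e.dttm_err >= @StartDate
  AND e.dttm_err < @EndDate
  AND e.id_user <> 'system'
GROUP BY (u.name_frst + ' ' + u.name_lst);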
You can pivot this using conditional aggregation and row_number(). For the results in your question:
with ue as (
select e.*, (u.name_frst + ' ' + u.name_lst ) as user_name,
cast(e.dttm_err as date) as err_date
from err_log e join
users u
on e.id_user = u.id_user
)
select u.err_date,
max(case when u.seqnum = 1 then u.user_name end) as user_1,
max(case when u.seqnum = 1 then u.cnt end) as cnt_1,
max(case when u.seqnum = 2 then u.user_name end) as user_2,
max(case when u.seqnum = 2 then u.cnt end) as cnt_2,
max(case when u.seqnum = 3 then u.user_name end) as user_3,
max(case when u.seqnum = 3 then u.cnt end) as cnt_3
from (select err_date, user_name, count(*) as cnt,
row_number() over (partition by err_date order by count(*) desc) as seqnum
from ue
group by err_date, user_name
) u
group by u.err_date

Aggregating a sub query within query

I am currently working on aggregating the sum qty of "OUT" and "OUT+IN".
Current query is the following:
Select
a.Date
,a.DepartmentID
from
(Select
dris.Date
,dris.RentalItemKey
,dris.WarehouseKey
,ISNULL((Select TOP 1 dris.Date where OutQty=1 order by Date DESC),(Select ri.ReceiveDate from RentalItem ri where ri.RentalItemKey=dris.RentalItemKey)) as LastOutDate
,(Select d.DepartmentKey from Department d where d.Department=i.Department)as DepartmentID
, (CASE WHEN OutQty=1 OR (RepairQty=1 AND RentedQty=1) THEN 'IN' ELSE 'OUT' END) as Status
from DailyRentalItemStatus dris
inner join Inventory i on i.InventoryKey=dris.InventoryKey
where dris.Date='2014-08-02'
and i.ICode='3223700'
and i.Classification IN ('ITEM', 'ACCESSORY')
and i.AvailFor='RENT'
and i.AvailFrom='WAREHOUSE'
and dris.Warehouse='TORONTO')a
and I would like the result to be the following:
Date WarehouseID DepartmentID ICode Owned NotRedundant Out
2014-08-02 001T A00G 3223700 30 30 19
Where Owned is the count of items with status "OUT+IN", Out is the count with status "OUT", and NotRedundant is where the last out date is within the last 2 years of the date.
Help would be greatly appreciated.
I think this is close to what you're looking for. Your Not Redundant description is hard to understand (which dates are you comparing?), but the same trick used for OUT can be applied to it.
My query also assumes that you always have a department connecting to the inventory table and that there's always a rentalitem.receivedate.
;WITH LastOut as
(Select Max(Date) as LastOutDate, RentalItemKey
from DailyRentalItemStatus
WHERE OutQty=1
GROUP BY RentalItemKey
)
Select
dris.Date
,dris.WarehouseKey as WarehouseID
,d.DepartmentKey as DepartmentID
, i.Icode
--,ISNULL((Select TOP 1 dris.Date where OutQty=1 order by Date DESC),(Select ri.ReceiveDate from RentalItem ri where ri.RentalItemKey=dris.RentalItemKey)) as LastOutDate
, Count(1) as Owned
, Sum(CASE WHEN NOT (OutQty=1 OR (RepairQty=1 AND RentedQty=1)) THEN 1 ELSE 0 END) as OUT
, Sum(CASE WHEN DateAdd(yy, 2, ISNULL(lastout.lastoutdate, ri.ReceiveDate)) >= dris.[date] then 1 else 0 end) as NonRedundant
from DailyRentalItemStatus dris
inner join Inventory i on i.InventoryKey=dris.InventoryKey
INNER JOIN Department d ON d.Department=i.Department
INNER JOIN RentalItem ri ON ri.RentalItemKey=dris.RentalItemKey
LEFT OUTER JOIN LastOUT ON LastOut.rentalItemKey=dris.RentalItemKey
where dris.Date='2014-08-02'
and i.ICode='3223700'
and i.Classification IN ('ITEM', 'ACCESSORY')
and i.AvailFor='RENT'
and i.AvailFrom='WAREHOUSE'
and dris.Warehouse='TORONTO'
Group BY dris.Date, d.DepartmentKey, Dris.WarehouseKey , i.icode