SQL Server OVER() clause not behaving as expected

I'm using the OVER() clause to get the running total of payments based on account number and the date each payment was made. I'm then subtracting that running total from the current balance to determine the balance after each transaction was made.
SELECT
no_,
amount,
SUM(amount) OVER(PARTITION BY no_ ORDER BY effectiveDate DESC) AS RunningTotal,
balance - (SUM(amount) OVER(PARTITION BY no_ ORDER BY effectiveDate DESC)) + amount AS CalculatedBalance,
balance
FROM
c
WHERE
status != 'closed'
ORDER BY
no_
It works fine for positive numbers, but when the amount field is a negative number, I get unexpected results.
This happens for all negative amounts; I've checked all my positive numbers and they are correct. I looked online and I can't find any reason why OVER() would misbehave with negative numbers.

Why are you doing the calculation in reverse? I would expect logic more like this:
SELECT no_, amount,
SUM(amount) OVER (PARTITION BY no_ ORDER BY effectiveDate ASC) AS RunningTotal,
(balance + amount +
SUM(amount) OVER (PARTITION BY no_ ORDER BY effectiveDate ASC)
) AS CalculatedBalance,
balance
FROM c
WHERE status <> 'closed'
ORDER BY no_;
In addition, because balance is negative, you want to add the amounts rather than subtract them.
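One more thing worth checking, since several payments can share the same effectiveDate: without an explicit frame, SUM() OVER (... ORDER BY ...) uses the default RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW, which treats all rows with the same effectiveDate as peers and gives them the same running total. A minimal sketch, assuming the same columns as in the question, that pins the frame to ROWS so each row gets its own cumulative value:
SELECT
    no_,
    amount,
    effectiveDate,
    SUM(amount) OVER (
        PARTITION BY no_
        ORDER BY effectiveDate ASC
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW  -- one row at a time, not all date peers
    ) AS RunningTotal
FROM c
WHERE status <> 'closed'
ORDER BY no_, effectiveDate;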

I think you should be made aware of the ROWS BETWEEN ... PRECEDING window frame clause.
In the window frame clause, you indicate the window frame unit (ROWS or RANGE) and the window frame extent (the delimiters). With the ROWS window frame unit, you can indicate the delimiters as one of three options:
- UNBOUNDED PRECEDING or FOLLOWING, meaning the beginning or end of the partition, respectively
- CURRENT ROW, obviously representing the current row
- n ROWS PRECEDING or FOLLOWING, meaning n rows before or after the current row, respectively
Example with syntax:
-- Temp query to group the days found in the dataset (1..7..infinity)
, DENSE_RANK()
    OVER(PARTITION BY tbl.Account ORDER BY tbl.Date DESC)
    AS [calcDayRankdenseGroup]
FINAL RESULTS:
select
cte.*
, SUM(cte.Amount)
OVER(PARTITION BY cte.Account ORDER BY cte.calcDayRankdenseGroup ASC
ROWS BETWEEN 7 PRECEDING and CURRENT ROW)
AS [Amount_RunningTotalDaysLast7]
from cteDateRankAdded cte
order by Account, DATE DESC
Data returned from my testing data (seen below):
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| Account | Amount | DATE | calcDayRankGroup_DoNotUse | calcDayRankdenseGroup | calcDayCountInGroup | Amount_RunningTotalDaysLast7 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 30.00 | 2018-09-16 | 1 | 1 | 1 | 30.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 69.00 | 2018-09-16 | 1 | 1 | 2 | 99.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 6.00 | 2018-09-13 | 3 | 2 | 1 | 105.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 57.00 | 2018-09-12 | 4 | 3 | 1 | 162.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 13.00 | 2018-09-12 | 4 | 3 | 2 | 175.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 98.00 | 2018-09-12 | 4 | 3 | 3 | 273.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 47.00 | 2018-09-03 | 7 | 4 | 1 | 320.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 90.00 | 2018-09-02 | 8 | 5 | 1 | 410.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 90.00 | 2018-09-02 | 8 | 5 | 2 | 470.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 32.00 | 2018-08-29 | 10 | 6 | 1 | 433.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 50.00 | 2018-08-24 | 11 | 7 | 1 | 477.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 48.00 | 2018-08-24 | 11 | 7 | 2 | 468.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 100.00 | 2018-08-23 | 13 | 8 | 1 | 555.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 63.00 | 2018-08-20 | 14 | 9 | 1 | 520.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 49.00 | 2018-08-19 | 15 | 10 | 1 | 522.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 38.00 | 2018-09-16 | 1 | 1 | 1 | 38.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 3.00 | 2018-09-16 | 1 | 1 | 2 | 41.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 83.00 | 2018-09-13 | 3 | 2 | 1 | 124.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 28.00 | 2018-09-12 | 4 | 3 | 1 | 152.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 78.00 | 2018-09-12 | 4 | 3 | 2 | 230.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 32.00 | 2018-09-12 | 4 | 3 | 3 | 262.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 29.00 | 2018-09-03 | 7 | 4 | 1 | 291.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 64.00 | 2018-09-02 | 8 | 5 | 1 | 355.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 81.00 | 2018-09-02 | 8 | 5 | 2 | 398.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 60.00 | 2018-08-29 | 10 | 6 | 1 | 455.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 82.00 | 2018-08-24 | 11 | 7 | 1 | 454.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 66.00 | 2018-08-24 | 11 | 7 | 2 | 492.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 9.00 | 2018-08-23 | 13 | 8 | 1 | 423.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 45.00 | 2018-08-20 | 14 | 9 | 1 | 436.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 2 | 43.00 | 2018-08-19 | 15 | 10 | 1 | 450.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
Here is the sample data I created to represent your scenario, with today being 9/18/2018.
+-----+---------+------------+--------+------------------+------------------+
| Seq | Account | DATE | Amount | dataDaysAgoDelta | dataDaysAgoGroup |
+-----+---------+------------+--------+------------------+------------------+
| 1 | 1 | 2018-08-19 | 49.00 | 30 | 10 |
+-----+---------+------------+--------+------------------+------------------+
| 2 | 1 | 2018-08-20 | 63.00 | 29 | 9 |
+-----+---------+------------+--------+------------------+------------------+
| 3 | 1 | 2018-08-23 | 100.00 | 26 | 8 |
+-----+---------+------------+--------+------------------+------------------+
| 4 | 1 | 2018-08-24 | 50.00 | 25 | 7 |
+-----+---------+------------+--------+------------------+------------------+
| 5 | 1 | 2018-08-24 | 48.00 | 25 | 7 |
+-----+---------+------------+--------+------------------+------------------+
| 6 | 1 | 2018-08-29 | 32.00 | 20 | 6 |
+-----+---------+------------+--------+------------------+------------------+
| 7 | 1 | 2018-09-02 | 90.00 | 16 | 5 |
+-----+---------+------------+--------+------------------+------------------+
| 8 | 1 | 2018-09-02 | 90.00 | 16 | 5 |
+-----+---------+------------+--------+------------------+------------------+
| 9 | 1 | 2018-09-03 | 47.00 | 15 | 4 |
+-----+---------+------------+--------+------------------+------------------+
| 10 | 1 | 2018-09-12 | 57.00 | 6 | 3 |
+-----+---------+------------+--------+------------------+------------------+
| 11 | 1 | 2018-09-12 | 13.00 | 6 | 3 |
+-----+---------+------------+--------+------------------+------------------+
| 12 | 1 | 2018-09-12 | 98.00 | 6 | 3 |
+-----+---------+------------+--------+------------------+------------------+
| 13 | 1 | 2018-09-13 | 6.00 | 5 | 2 |
+-----+---------+------------+--------+------------------+------------------+
| 14 | 1 | 2018-09-16 | 30.00 | 2 | 1 |
+-----+---------+------------+--------+------------------+------------------+
| 15 | 1 | 2018-09-16 | 69.00 | 2 | 1 |
+-----+---------+------------+--------+------------------+------------------+
| 16 | 2 | 2018-08-19 | 43.00 | 30 | 10 |
+-----+---------+------------+--------+------------------+------------------+
| 17 | 2 | 2018-08-20 | 45.00 | 29 | 9 |
+-----+---------+------------+--------+------------------+------------------+
| 18 | 2 | 2018-08-23 | 9.00 | 26 | 8 |
+-----+---------+------------+--------+------------------+------------------+
| 19 | 2 | 2018-08-24 | 82.00 | 25 | 7 |
+-----+---------+------------+--------+------------------+------------------+
| 20 | 2 | 2018-08-24 | 66.00 | 25 | 7 |
+-----+---------+------------+--------+------------------+------------------+
| 21 | 2 | 2018-08-29 | 60.00 | 20 | 6 |
+-----+---------+------------+--------+------------------+------------------+
| 22 | 2 | 2018-09-02 | 64.00 | 16 | 5 |
+-----+---------+------------+--------+------------------+------------------+
| 23 | 2 | 2018-09-02 | 81.00 | 16 | 5 |
+-----+---------+------------+--------+------------------+------------------+
| 24 | 2 | 2018-09-03 | 29.00 | 15 | 4 |
+-----+---------+------------+--------+------------------+------------------+
| 25 | 2 | 2018-09-12 | 28.00 | 6 | 3 |
+-----+---------+------------+--------+------------------+------------------+
| 26 | 2 | 2018-09-12 | 78.00 | 6 | 3 |
+-----+---------+------------+--------+------------------+------------------+
| 27 | 2 | 2018-09-12 | 32.00 | 6 | 3 |
+-----+---------+------------+--------+------------------+------------------+
| 28 | 2 | 2018-09-13 | 83.00 | 5 | 2 |
+-----+---------+------------+--------+------------------+------------------+
| 29 | 2 | 2018-09-16 | 38.00 | 2 | 1 |
+-----+---------+------------+--------+------------------+------------------+
| 30 | 2 | 2018-09-16 | 3.00 | 2 | 1 |
+-----+---------+------------+--------+------------------+------------------+
COMPLETE QUERY:
DECLARE @tblDaysLast7Found as table
(Seq int, Account int, DATE date, Amount numeric(10,2), dataDaysAgoDelta int, dataDaysAgoGroup int)
INSERT INTO @tblDaysLast7Found
(Seq, Account, DATE, Amount, dataDaysAgoDelta, dataDaysAgoGroup)
VALUES
(1, 1, '8/19/2018', 49, 30, 10)
,(2, 1, '8/20/2018', 63, 29, 9)
,(3, 1, '8/23/2018', 100, 26, 8)
,(4, 1, '8/24/2018', 50, 25, 7)
,(5, 1, '8/24/2018', 48, 25, 7)
,(6, 1, '8/29/2018', 32, 20, 6)
,(7, 1, '9/2/2018', 90, 16, 5)
,(8, 1, '9/2/2018', 90, 16, 5)
,(9, 1, '9/3/2018', 47, 15, 4)
,(10, 1, '9/12/2018', 57, 6, 3)
,(11, 1, '9/12/2018', 13, 6, 3)
,(12, 1, '9/12/2018', 98, 6, 3)
,(13, 1, '9/13/2018', 6, 5, 2)
,(14, 1, '9/16/2018', 30, 2, 1)
,(15, 1, '9/16/2018', 69, 2, 1)
,(16, 2, '8/19/2018', 43, 30, 10)
,(17, 2, '8/20/2018', 45, 29, 9)
,(18, 2, '8/23/2018', 9, 26, 8)
,(19, 2, '8/24/2018', 82, 25, 7)
,(20, 2, '8/24/2018', 66, 25, 7)
,(21, 2, '8/29/2018', 60, 20, 6)
,(22, 2, '9/2/2018', 64, 16, 5)
,(23, 2, '9/2/2018', 81, 16, 5)
,(24, 2, '9/3/2018', 29, 15, 4)
,(25, 2, '9/12/2018', 28, 6, 3)
,(26, 2, '9/12/2018', 78, 6, 3)
,(27, 2, '9/12/2018', 32, 6, 3)
,(28, 2, '9/13/2018', 83, 5, 2)
,(29, 2, '9/16/2018', 38, 2, 1)
,(30, 2, '9/16/2018', 3, 2, 1)
--select * from @tblDaysLast7Found
;WITH cteDateRankAdded AS
(
select -- *
tbl.Account
, tbl.Amount
, tbl.DATE
, RANK()
OVER(PARTITION BY tbl.Account ORDER BY tbl.Date DESC)
AS [calcDayRankGroup_DoNotUse]
, DENSE_RANK()
OVER(PARTITION BY tbl.Account ORDER BY tbl.Date DESC)
AS [calcDayRankdenseGroup]
, ROW_NUMBER()
OVER(PARTITION BY tbl.Account, tbl.Date ORDER BY tbl.Date DESC)
AS [calcDayCountInGroup]
from
@tblDaysLast7Found tbl
)
select
cte.*
, SUM(cte.Amount)
OVER(PARTITION BY cte.Account ORDER BY cte.calcDayRankdenseGroup ASC
ROWS BETWEEN 7 PRECEDING and CURRENT ROW)
AS [Amount_RunningTotalDaysLast7]
from cteDateRankAdded cte
order by Account, DATE DESC

Related

SQL value of one column based on max values in other selected rows

I am using the Products table from the Northwind sample database and I would like to find the top CategoryID for each SupplierID...
+-----------+----------------------------------+------------+------------+
| ProductID | ProductName | SupplierID | CategoryID |
+-----------+----------------------------------+------------+------------+
| 1 | Chai | 1 | 1 |
+-----------+----------------------------------+------------+------------+
| 2 | Chang | 1 | 1 |
+-----------+----------------------------------+------------+------------+
| 3 | Aniseed Syrup | 1 | 2 |
+-----------+----------------------------------+------------+------------+
| 4 | Chef Anton's Cajun Seasoning | 2 | 2 |
+-----------+----------------------------------+------------+------------+
| 5 | Chef Anton's Gumbo Mix | 2 | 2 |
+-----------+----------------------------------+------------+------------+
| 6 | Grandma's Boysenberry Spread | 3 | 2 |
+-----------+----------------------------------+------------+------------+
| 7 | Uncle Bob's Organic Dried Pears | 3 | 7 |
+-----------+----------------------------------+------------+------------+
| 8 | Northwoods Cranberry Sauce | 3 | 2 |
+-----------+----------------------------------+------------+------------+
| 9 | Mishi Kobe Niku | 4 | 6 |
+-----------+----------------------------------+------------+------------+
| 10 | Ikura | 4 | 8 |
+-----------+----------------------------------+------------+------------+
| 11 | Queso Cabrales | 5 | 4 |
+-----------+----------------------------------+------------+------------+
| 12 | Queso Manchego La Pastora | 5 | 4 |
+-----------+----------------------------------+------------+------------+
| 13 | Konbu | 6 | 8 |
+-----------+----------------------------------+------------+------------+
| 14 | Tofu | 6 | 7 |
+-----------+----------------------------------+------------+------------+
| 15 | Genen Shouyu | 6 | 2 |
+-----------+----------------------------------+------------+------------+
| 16 | Pavlova | 7 | 3 |
+-----------+----------------------------------+------------+------------+
| 17 | Alice Mutton | 7 | 6 |
+-----------+----------------------------------+------------+------------+
| 18 | Carnarvon Tigers | 7 | 8 |
+-----------+----------------------------------+------------+------------+
| 19 | Teatime Chocolate Biscuits | 8 | 3 |
+-----------+----------------------------------+------------+------------+
| 20 | Sir Rodney's Marmalade | 8 | 3 |
+-----------+----------------------------------+------------+------------+
| 21 | Sir Rodney's Scones | 8 | 3 |
+-----------+----------------------------------+------------+------------+
| 22 | Gustaf's Knäckebröd | 9 | 5 |
+-----------+----------------------------------+------------+------------+
| 23 | Tunnbröd | 9 | 5 |
+-----------+----------------------------------+------------+------------+
| 24 | Guaraná Fantástica | 10 | 1 |
+-----------+----------------------------------+------------+------------+
| 25 | NuNuCa Nuß-Nougat-Creme | 11 | 3 |
+-----------+----------------------------------+------------+------------+
| 26 | Gumbär Gummibärchen | 11 | 3 |
+-----------+----------------------------------+------------+------------+
| 27 | Schoggi Schokolade | 11 | 3 |
+-----------+----------------------------------+------------+------------+
| 28 | Rössle Sauerkraut | 12 | 7 |
+-----------+----------------------------------+------------+------------+
| 29 | Thüringer Rostbratwurst | 12 | 6 |
+-----------+----------------------------------+------------+------------+
| 30 | Nord-Ost Matjeshering | 13 | 8 |
+-----------+----------------------------------+------------+------------+
| 31 | Gorgonzola Telino | 14 | 4 |
+-----------+----------------------------------+------------+------------+
| 32 | Mascarpone Fabioli | 14 | 4 |
+-----------+----------------------------------+------------+------------+
| 33 | Geitost | 15 | 4 |
+-----------+----------------------------------+------------+------------+
| 34 | Sasquatch Ale | 16 | 1 |
+-----------+----------------------------------+------------+------------+
| 35 | Steeleye Stout | 16 | 1 |
+-----------+----------------------------------+------------+------------+
| 36 | Inlagd Sill | 17 | 8 |
+-----------+----------------------------------+------------+------------+
| 37 | Gravad lax | 17 | 8 |
+-----------+----------------------------------+------------+------------+
| 38 | Côte de Blaye | 18 | 1 |
+-----------+----------------------------------+------------+------------+
| 39 | Chartreuse verte | 18 | 1 |
+-----------+----------------------------------+------------+------------+
| 40 | Boston Crab Meat | 19 | 8 |
+-----------+----------------------------------+------------+------------+
| 41 | Jack's New England Clam Chowder | 19 | 8 |
+-----------+----------------------------------+------------+------------+
| 42 | Singaporean Hokkien Fried Mee | 20 | 5 |
+-----------+----------------------------------+------------+------------+
| 43 | Ipoh Coffee | 20 | 1 |
+-----------+----------------------------------+------------+------------+
| 44 | Gula Malacca | 20 | 2 |
+-----------+----------------------------------+------------+------------+
| 45 | Rogede sild | 21 | 8 |
+-----------+----------------------------------+------------+------------+
| 46 | Spegesild | 21 | 8 |
+-----------+----------------------------------+------------+------------+
| 47 | Zaanse koeken | 22 | 3 |
+-----------+----------------------------------+------------+------------+
| 48 | Chocolade | 22 | 3 |
+-----------+----------------------------------+------------+------------+
| 49 | Maxilaku | 23 | 3 |
+-----------+----------------------------------+------------+------------+
| 50 | Valkoinen suklaa | 23 | 3 |
+-----------+----------------------------------+------------+------------+
| 51 | Manjimup Dried Apples | 24 | 7 |
+-----------+----------------------------------+------------+------------+
| 52 | Filo Mix | 24 | 5 |
+-----------+----------------------------------+------------+------------+
| 53 | Perth Pasties | 24 | 6 |
+-----------+----------------------------------+------------+------------+
| 54 | Tourtière | 25 | 6 |
+-----------+----------------------------------+------------+------------+
| 55 | Pâté chinois | 25 | 6 |
+-----------+----------------------------------+------------+------------+
| 56 | Gnocchi di nonna Alice | 26 | 5 |
+-----------+----------------------------------+------------+------------+
| 57 | Ravioli Angelo | 26 | 5 |
+-----------+----------------------------------+------------+------------+
| 58 | Escargots de Bourgogne | 27 | 8 |
+-----------+----------------------------------+------------+------------+
| 59 | Raclette Courdavault | 28 | 4 |
+-----------+----------------------------------+------------+------------+
| 60 | Camembert Pierrot | 28 | 4 |
+-----------+----------------------------------+------------+------------+
| 61 | Sirop d'érable | 29 | 2 |
+-----------+----------------------------------+------------+------------+
| 62 | Tarte au sucre | 29 | 3 |
+-----------+----------------------------------+------------+------------+
| 63 | Vegie-spread | 7 | 2 |
+-----------+----------------------------------+------------+------------+
| 64 | Wimmers gute Semmelknödel | 12 | 5 |
+-----------+----------------------------------+------------+------------+
| 65 | Louisiana Fiery Hot Pepper Sauce | 2 | 2 |
+-----------+----------------------------------+------------+------------+
| 66 | Louisiana Hot Spiced Okra | 2 | 2 |
+-----------+----------------------------------+------------+------------+
| 67 | Laughing Lumberjack Lager | 16 | 1 |
+-----------+----------------------------------+------------+------------+
| 68 | Scottish Longbreads | 8 | 3 |
+-----------+----------------------------------+------------+------------+
| 69 | Gudbrandsdalsost | 15 | 4 |
+-----------+----------------------------------+------------+------------+
| 70 | Outback Lager | 7 | 1 |
+-----------+----------------------------------+------------+------------+
| 71 | Flotemysost | 15 | 4 |
+-----------+----------------------------------+------------+------------+
| 72 | Mozzarella di Giovanni | 14 | 4 |
+-----------+----------------------------------+------------+------------+
| 73 | Röd Kaviar | 17 | 8 |
+-----------+----------------------------------+------------+------------+
| 74 | Longlife Tofu | 4 | 7 |
+-----------+----------------------------------+------------+------------+
| 75 | Rhönbräu Klosterbier | 12 | 1 |
+-----------+----------------------------------+------------+------------+
| 76 | Lakkalikööri | 23 | 1 |
+-----------+----------------------------------+------------+------------+
| 77 | Original Frankfurter grüne Soße | 12 | 2 |
+-----------+----------------------------------+------------+------------+
Using the query
SELECT SupplierID, CategoryID, COUNT(CategoryID) AS Total FROM [dbo].[Products] GROUP BY CategoryID, SupplierID
I get the table
+------------+------------+-------+
| SupplierID | CategoryID | Total |
+------------+------------+-------+
| 1 | 1 | 2 |
+------------+------------+-------+
| 1 | 2 | 1 |
+------------+------------+-------+
| 2 | 2 | 4 |
+------------+------------+-------+
| 3 | 2 | 2 |
+------------+------------+-------+
| 3 | 7 | 1 |
+------------+------------+-------+
| 4 | 6 | 1 |
+------------+------------+-------+
| 4 | 7 | 1 |
+------------+------------+-------+
| 4 | 8 | 1 |
+------------+------------+-------+
| 5 | 4 | 2 |
+------------+------------+-------+
| 6 | 2 | 1 |
+------------+------------+-------+
| 6 | 7 | 1 |
+------------+------------+-------+
| 6 | 8 | 1 |
+------------+------------+-------+
| 7 | 1 | 1 |
+------------+------------+-------+
| 7 | 2 | 1 |
+------------+------------+-------+
| 7 | 3 | 1 |
+------------+------------+-------+
| 7 | 6 | 1 |
+------------+------------+-------+
| 7 | 8 | 1 |
+------------+------------+-------+
| 8 | 3 | 4 |
+------------+------------+-------+
| 9 | 5 | 2 |
+------------+------------+-------+
| 10 | 1 | 1 |
+------------+------------+-------+
| 11 | 3 | 3 |
+------------+------------+-------+
| 12 | 1 | 1 |
+------------+------------+-------+
| 12 | 2 | 1 |
+------------+------------+-------+
| 12 | 5 | 1 |
+------------+------------+-------+
| 12 | 6 | 1 |
+------------+------------+-------+
| 12 | 7 | 1 |
+------------+------------+-------+
| 13 | 8 | 1 |
+------------+------------+-------+
| 14 | 4 | 3 |
+------------+------------+-------+
| 15 | 4 | 3 |
+------------+------------+-------+
| 16 | 1 | 3 |
+------------+------------+-------+
| 17 | 8 | 3 |
+------------+------------+-------+
| 18 | 1 | 2 |
+------------+------------+-------+
| 19 | 8 | 2 |
+------------+------------+-------+
| 20 | 1 | 1 |
+------------+------------+-------+
| 20 | 2 | 1 |
+------------+------------+-------+
| 20 | 5 | 1 |
+------------+------------+-------+
| 21 | 8 | 2 |
+------------+------------+-------+
| 22 | 3 | 2 |
+------------+------------+-------+
| 23 | 1 | 1 |
+------------+------------+-------+
| 23 | 3 | 2 |
+------------+------------+-------+
| 24 | 5 | 1 |
+------------+------------+-------+
| 24 | 6 | 1 |
+------------+------------+-------+
| 24 | 7 | 1 |
+------------+------------+-------+
| 25 | 6 | 2 |
+------------+------------+-------+
| 26 | 5 | 2 |
+------------+------------+-------+
| 27 | 8 | 1 |
+------------+------------+-------+
| 28 | 4 | 2 |
+------------+------------+-------+
| 29 | 2 | 1 |
+------------+------------+-------+
| 29 | 3 | 1 |
+------------+------------+-------+
As you can see, supplier 1 makes 2 category 1 products and 1 category 2 product. Therefore the first line in the result should read
+------------+------------+-------+
| SupplierID | CategoryID | Total |
+------------+------------+-------+
| 1 | 1 | 2 |
+------------+------------+-------+
Next should be supplierId #2 which makes a total of 4 category 2 products. The final table should look like this...
+------------+------------+-------+
| SupplierID | CategoryID | Total |
+------------+------------+-------+
| 1 | 1 | 2 |
+------------+------------+-------+
| 2 | 2 | 4 |
+------------+------------+-------+
| 3 | 2 | 2 |
+------------+------------+-------+
| 4 | 6 | 1 |
+------------+------------+-------+
| 5 | 4 | 2 |
+------------+------------+-------+
| 6 | 2 | 1 |
+------------+------------+-------+
| 7 | 1 | 1 |
+------------+------------+-------+
| 8 | 3 | 4 |
+------------+------------+-------+
| 9 | 5 | 2 |
+------------+------------+-------+
| 11 | 3 | 3 |
+------------+------------+-------+
| 12 | 1 | 1 |
+------------+------------+-------+
| 13 | 8 | 1 |
+------------+------------+-------+
| 14 | 4 | 3 |
+------------+------------+-------+
| 15 | 4 | 3 |
+------------+------------+-------+
| 16 | 1 | 3 |
+------------+------------+-------+
| 17 | 8 | 3 |
+------------+------------+-------+
| 18 | 1 | 2 |
+------------+------------+-------+
| 19 | 8 | 2 |
+------------+------------+-------+
| 20 | 1 | 1 |
+------------+------------+-------+
| 21 | 8 | 2 |
+------------+------------+-------+
| 22 | 3 | 2 |
+------------+------------+-------+
| 23 | 3 | 2 |
+------------+------------+-------+
| 24 | 5 | 1 |
+------------+------------+-------+
| 25 | 6 | 2 |
+------------+------------+-------+
| 26 | 5 | 2 |
+------------+------------+-------+
| 27 | 8 | 1 |
+------------+------------+-------+
| 28 | 4 | 2 |
+------------+------------+-------+
| 29 | 2 | 1 |
+------------+------------+-------+
| 29 | 3 | 1 |
+------------+------------+-------+
I know a lot of suppliers only make one item for a given category and this isn't a great example but just trying to learn here.
Thanks
I think you can make use of ROW_NUMBER() partitioned by supplier, together with the aggregate count, for ranking, and then select only the row with the most products for a given supplier. I just took part of your sample data and did it this way.
with cte as (
select 1 as ProductID, 'Chai' as ProductName, 1 as SupplierID, 1 as CategoryID union all
select 2 as ProductID, 'Chang' as ProductName, 1 as SupplierID, 1 as CategoryID union all
select 3 as ProductID, 'Aniseed Syrup' as ProductName, 1 as SupplierID, 2 as CategoryID union all
select 4 as ProductID, 'Chef Anton''s Cajun Seasoning' as ProductName, 2 as SupplierID, 2 as CategoryID union all
select 5 as ProductID, 'Chef Anton''s Gumbo Mix' as ProductName, 2 as SupplierID, 2 as CategoryID union all
select 6 as ProductID, 'Grandma''s Boysenberry Spread' as ProductName, 3 as SupplierID, 2 as CategoryID union all
select 7 as ProductID, 'Uncle Bob''s Organic Dried Pears' as ProductName, 3 as SupplierID, 7 as CategoryID union all
select 8 as ProductID, 'Northwoods Cranberry Sauce' as ProductName, 3 as SupplierID, 2 as CategoryID )
select t.SupplierID, t.CategoryID, t.total from (
select supplierID, CategoryID , ROW_NUMBER() over (partition by supplierID order by count(1) desc) rownum, count(1) total from cte
group by supplierID, CategoryID ) t
where t.rownum = 1
Output:
SupplierID CategoryID total
1 1 2
2 2 2
3 2 2
In SQL Server you can write a query as:
select SupplierID ,
CategoryID ,
Total
from (
select
SupplierID ,
CategoryID ,
Total ,
ROW_NUMBER() over (partition by SupplierID order by Total desc) as rownum
from (
SELECT SupplierID
, CategoryID
, COUNT(CategoryID) AS Total
FROM [dbo].[Products]
GROUP BY CategoryID, SupplierID
) as Innertable
) as Outertable
where rownum = 1
order by SupplierID
First you have to generate the category counts by supplier, then you have to rank them from highest to lowest, and finally select only the highest. In the following query, I've done that by using nested queries:
-- Select only the top category counts by supplier
SELECT
[SupplierID],
[CategoryID],
[Total]
FROM (
-- Rank category counts by supplier
SELECT
*,
RANK() OVER (PARTITION BY [SupplierID] ORDER BY [Total] DESC) AS [Rank]
FROM (
-- Generate category counts by supplier
SELECT
[SupplierID],
[CategoryID],
COUNT(*) AS [Total]
FROM [Products]
GROUP BY
[SupplierID],
[CategoryID]
) AS SupplierCategoryCounts
) AS RankedSupplierCategoryCounts
WHERE [Rank] = 1
ORDER BY [SupplierID]
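One difference worth noting between this and the ROW_NUMBER() answers above: if a supplier has two categories tied for the highest count, RANK() keeps both of those rows, while ROW_NUMBER() keeps only one of them, so the choice depends on how you want ties reported.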

Generating data averages for 15min slots using SQL Server

I have an SQL Server table as below.
CREATE TABLE [dbo].[ChannelData](
[Id] [int] IDENTITY(1,1) NOT NULL,
[ChannelId] [int] NOT NULL,
[ChannelValue] [decimal](10, 2) NULL,
[ChannelDataLogTime] [datetime] NOT NULL,
[Active] [bit] NULL,
CONSTRAINT [PK_ChannelData] PRIMARY KEY CLUSTERED ([Id] ASC)
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
Sample data is as follows:
+----+-----------+--------------+-------------------------+--------+
| Id | ChannelId | ChannelValue | ChannelDataLogTime | Active |
+----+-----------+--------------+-------------------------+--------+
| 1 | 9 | 5.46 | 2015-06-09 14:00:11.463 | 1 |
| 2 | 9 | 8.46 | 2015-06-09 14:01:11.503 | 1 |
| 3 | 9 | 3.46 | 2015-06-09 14:02:27.747 | 1 |
| 4 | 9 | 6.46 | 2015-06-09 14:03:11.503 | 1 |
| 5 | 9 | 1.46 | 2015-06-09 14:04:11.530 | 1 |
| 6 | 9 | 4.46 | 2015-06-09 14:05:11.537 | 1 |
| 7 | 9 | 7.46 | 2015-06-09 14:06:11.547 | 1 |
| 8 | 9 | 2.46 | 2015-06-09 14:07:33.983 | 1 |
| 9 | 9 | 5.46 | 2015-06-09 14:08:11.570 | 1 |
| 10 | 9 | 8.46 | 2015-06-09 14:09:11.603 | 1 |
| 11 | 9 | 3.46 | 2015-06-09 14:10:11.613 | 1 |
| 12 | 9 | 6.47 | 2015-06-09 14:11:11.623 | 1 |
| 13 | 9 | 1.47 | 2015-06-09 14:12:24.497 | 1 |
| 14 | 9 | 4.47 | 2015-06-09 14:13:11.623 | 1 |
| 15 | 9 | 7.47 | 2015-06-09 14:14:11.650 | 1 |
| 16 | 9 | 2.47 | 2015-06-09 14:15:11.707 | 1 |
| 17 | 9 | 5.47 | 2015-06-09 14:16:11.707 | 1 |
| 18 | 9 | 8.47 | 2015-06-09 14:17:25.647 | 1 |
| 19 | 9 | 3.47 | 2015-06-09 14:18:11.707 | 1 |
| 20 | 9 | 6.47 | 2015-06-09 14:19:11.753 | 1 |
| 21 | 9 | 1.47 | 2015-06-09 14:20:11.760 | 1 |
| 22 | 9 | 4.47 | 2015-06-09 14:21:11.790 | 1 |
| 23 | 9 | 7.47 | 2015-06-09 14:22:29.500 | 1 |
| 24 | 9 | 2.47 | 2015-06-09 14:23:11.907 | 1 |
| 25 | 9 | 5.47 | 2015-06-09 14:24:12.057 | 1 |
| 26 | 9 | 8.47 | 2015-06-09 14:25:11.817 | 1 |
| 27 | 9 | 3.47 | 2015-06-09 14:26:11.837 | 1 |
| 28 | 9 | 6.47 | 2015-06-09 14:27:32.253 | 1 |
| 29 | 9 | 1.47 | 2015-06-09 14:28:11.870 | 1 |
| 30 | 9 | 4.47 | 2015-06-09 14:29:11.870 | 1 |
| 31 | 9 | 7.50 | 2015-06-09 16:00:13.313 | 1 |
| 32 | 9 | 2.50 | 2015-06-09 16:01:13.260 | 1 |
| 33 | 9 | 5.50 | 2015-06-09 16:02:13.290 | 1 |
| 34 | 9 | 8.50 | 2015-06-09 16:03:13.270 | 1 |
| 35 | 9 | 3.50 | 2015-06-09 16:04:32.827 | 1 |
| 36 | 9 | 6.50 | 2015-06-09 16:05:13.323 | 1 |
| 37 | 9 | 1.50 | 2015-06-09 16:06:13.330 | 1 |
| 38 | 9 | 4.50 | 2015-06-09 16:07:13.337 | 1 |
| 39 | 9 | 7.50 | 2015-06-09 16:08:13.313 | 1 |
| 40 | 9 | 2.50 | 2015-06-09 16:09:28.497 | 1 |
| 41 | 9 | 5.50 | 2015-06-09 16:10:13.370 | 1 |
| 42 | 9 | 8.50 | 2015-06-09 16:11:13.417 | 1 |
| 43 | 9 | 3.50 | 2015-06-09 16:12:13.540 | 1 |
| 44 | 9 | 6.50 | 2015-06-09 16:13:13.577 | 1 |
| 45 | 9 | 1.50 | 2015-06-09 16:14:33.880 | 1 |
| 46 | 9 | 4.50 | 2015-06-09 16:15:13.453 | 1 |
| 47 | 9 | 7.50 | 2015-06-09 16:16:13.500 | 1 |
| 48 | 9 | 2.50 | 2015-06-09 16:17:13.497 | 1 |
| 49 | 9 | 5.50 | 2015-06-09 16:18:13.503 | 1 |
| 50 | 9 | 8.50 | 2015-06-09 16:19:38.717 | 1 |
| 51 | 9 | 3.50 | 2015-06-09 16:21:13.567 | 1 |
| 52 | 9 | 6.50 | 2015-06-09 16:22:13.557 | 1 |
| 53 | 9 | 1.50 | 2015-06-09 16:23:14.163 | 1 |
| 54 | 9 | 4.50 | 2015-06-09 16:24:13.607 | 1 |
| 55 | 9 | 7.50 | 2015-06-09 16:25:38.783 | 1 |
| 56 | 9 | 2.50 | 2015-06-09 16:27:13.660 | 1 |
| 57 | 9 | 5.51 | 2015-06-09 16:28:13.710 | 1 |
| 58 | 9 | 8.51 | 2015-06-09 16:29:13.703 | 1 |
| 59 | 9 | 3.51 | 2015-06-09 16:30:13.713 | 1 |
+----+-----------+--------------+-------------------------+--------+
Now I am generating 15-minute averaged data for a period of time, with a start date and an end date, which is working fine without any issues.
I have a scenario where data is missing for some time, which in turn means the corresponding 15-minute slots are missing because there is no data for them. What I need is to list the 15-minute slots even if no data is available during that time slot, using a SQL query.
SELECT
Avg(chnldata.ChannelValue) AS ChannelValue,
DATEADD(minute,FLOOR(DATEDIFF(minute,0,ChannelDataLogTime)/15)*15,0) as HourlyDateTime,
chnldata.ChannelId as Id
FROM ChannelData as chnldata
WHERE chnldata.ChannelId in (9) AND chnldata.ChannelDataLogTime >= '06/09/2015' AND chnldata.ChannelDataLogTime < '06/11/2015 23:59:50'
GROUP BY chnldata.ChannelId, DATEADD(minute,FLOOR(DATEDIFF(minute,0,ChannelDataLogTime)/15)*15,0)
This is the existing 15-minute average query, but it doesn't return the missing 15-minute slots.
The current output is:
+--------------+-------------------------+----+
| ChannelValue | HourlyDateTime | Id |
+--------------+-------------------------+----+
| 5.129333 | 2015-06-09 14:00:00.000 | 9 |
| 4.803333 | 2015-06-09 14:15:00.000 | 9 |
| 5.033333 | 2015-06-09 16:00:00.000 | 9 |
| 5.270769 | 2015-06-09 16:15:00.000 | 9 |
| 3.510000 | 2015-06-09 16:30:00.000 | 9 |
+--------------+-------------------------+----+
Required output is:
+--------------+-------------------------+----+
| ChannelValue | HourlyDateTime | Id |
+--------------+-------------------------+----+
| 5.129333 | 2015-06-09 14:00:00.000 | 9 |
| 4.803333 | 2015-06-09 14:15:00.000 | 9 |
| NULL | 2015-06-09 14:30:00.000 | 9 |
| NULL | 2015-06-09 14:45:00.000 | 9 |
| NULL | 2015-06-09 15:00:00.000 | 9 |
| NULL | 2015-06-09 15:15:00.000 | 9 |
| NULL | 2015-06-09 15:30:00.000 | 9 |
| NULL | 2015-06-09 15:45:00.000 | 9 |
| 5.033333 | 2015-06-09 16:00:00.000 | 9 |
| 5.270769 | 2015-06-09 16:15:00.000 | 9 |
| 3.510000 | 2015-06-09 16:30:00.000 | 9 |
+--------------+-------------------------+----+
RIGHT OUTER JOIN to a CTE that has all the possible 15-minute intervals in your time range.
Build the time-range CTE in whichever way suits you; the cartesian-product method used below is probably faster than many alternatives.
If you want more speed, it might be best to build a static dates table, or separate date and time tables (see the sketch after the query below).
;WITH mins as (SELECT 0 as q union select 15 union select 30 union select 45),
dats as (SELECT MIN(ChannelDataLogTime) as t1, max(ChannelDataLogTime) as t2 from channeldata),
ranges as (SELECT CAST(t1 as date) s1 FROM dats
union all
SELECT dateadd(day,1,r.s1) from ranges r where r.s1< (select t2 from dats)
),
hrs as (select 0 h union all select h + 1 from hrs where h < 23), --hours 0 to 23
slots as (select dateadd(MINUTE,mins.q,dateadd(hour,hrs.h,cast(ranges.s1 as datetime2))) as strt from mins,ranges,hrs ),
ids as (SELECT distinct ChannelId from ChannelData),
allslot as (select channelid, strt from slots,ids)
SELECT count(0) as x,
coalesce(Avg(chnldata.ChannelValue) , 0) AS ChannelValue,
s.strt HourlyDateTime,
s.ChannelId as Id
FROM ChannelData as chnldata
RIGHT JOIN allslot s on s.strt <= ChannelDataLogTime and ChannelDataLogTime < dateadd(minute,15,s.strt) and s.ChannelId = chnldata.ChannelId
WHERE chnldata.ChannelId is null or chnldata.ChannelId in (9) AND chnldata.ChannelDataLogTime >= '20150906' AND chnldata.ChannelDataLogTime < '20151123'
GROUP BY s.ChannelId, s.strt
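As a rough sketch of the "static dates table" idea mentioned above (the digit cross-join, the variable names, and the hard-coded channel 9 filter are assumptions for this sketch, not part of the answer), the 15-minute slots can also be generated without recursion:
DECLARE @start datetime = '2015-06-09', @end datetime = '2015-06-12';
;WITH digits AS (SELECT v FROM (VALUES (0),(1),(2),(3),(4),(5),(6),(7),(8),(9)) AS d(v)),
nums AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS i
         FROM digits a CROSS JOIN digits b CROSS JOIN digits c CROSS JOIN digits d), -- 0..9999
slots AS (SELECT DATEADD(MINUTE, 15 * i, @start) AS slot_start
          FROM nums
          WHERE DATEADD(MINUTE, 15 * i, @start) < @end)
SELECT s.slot_start AS HourlyDateTime,
       AVG(cd.ChannelValue) AS ChannelValue -- stays NULL for slots with no data
FROM slots s
LEFT JOIN dbo.ChannelData cd
       ON cd.ChannelId = 9
      AND cd.ChannelDataLogTime >= s.slot_start
      AND cd.ChannelDataLogTime < DATEADD(MINUTE, 15, s.slot_start)
GROUP BY s.slot_start
ORDER BY s.slot_start;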
Take into consideration the limitations of the MAXRECURSION option.
DECLARE @StartDT DATETIME = '2015-06-09';
DECLARE @EndDT DATETIME = '2015-06-12'; -- moved to the next day to use >= and < operators correctly
;WITH
[Interval]
AS
(
SELECT
[Start] = @StartDT
,[End] = DATEADD(MINUTE, 15, @StartDT)
UNION ALL
SELECT
[Start] = [End]
,[End] = DATEADD(MINUTE, 15, [End])
FROM [Interval]
WHERE (1 = 1)
AND ([End] < @EndDT)
),
[Available]
AS
(
SELECT
[Start] = CONVERT(SMALLDATETIME, MIN([CD].[ChannelDataLogTime]))
,[End] = CONVERT(SMALLDATETIME, MAX([CD].[ChannelDataLogTime]))
FROM [dbo].[ChannelData] AS [CD]
WHERE (1 = 1)
AND (@StartDT <= [CD].[ChannelDataLogTime] AND [CD].[ChannelDataLogTime] < @EndDT)
)
SELECT
[ChannelValue] = AVG([CD].[ChannelValue])
,[HourlyDateTime] = [I].[Start]
,[Id] = [CD].[ChannelId]
FROM [Available] AS [A]
INNER JOIN [Interval] AS [I]
ON ([A].[Start] <= [I].[Start] AND [I].[Start] <= [A].[End])
LEFT OUTER JOIN [dbo].[ChannelData] AS [CD]
ON
(
([CD].[ChannelId] IN (9))
AND ([I].[Start] <= [CD].[ChannelDataLogTime] AND [CD].[ChannelDataLogTime] < [I].[End])
)
GROUP BY
[I].[Start]
,[CD].[ChannelId]
ORDER BY
[I].[Start]
OPTION (MAXRECURSION 32767);
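For reference, OPTION (MAXRECURSION n) accepts values from 0 to 32,767 (0 removes the cap), so a recursive CTE that emits one row per 15-minute interval can cover at most about 341 days per run before hitting the limit used here.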

SQL window excluding current group?

I'm trying to produce rolled-up summaries of the following data, both including only the group in question and excluding that group. I think this can be done with a window function, but I'm having problems getting the syntax down (in my case Hive SQL).
I want the following data to be aggregated
+------------+---------+--------+
| date | product | rating |
+------------+---------+--------+
| 2018-01-01 | A | 1 |
| 2018-01-02 | A | 3 |
| 2018-01-20 | A | 4 |
| 2018-01-27 | A | 5 |
| 2018-01-29 | A | 4 |
| 2018-02-01 | A | 5 |
| 2017-01-09 | B | NULL |
| 2017-01-12 | B | 3 |
| 2017-01-15 | B | 4 |
| 2017-01-28 | B | 4 |
| 2017-07-21 | B | 2 |
| 2017-09-21 | B | 5 |
| 2017-09-13 | C | 3 |
| 2017-09-14 | C | 4 |
| 2017-09-15 | C | 5 |
| 2017-09-16 | C | 5 |
| 2018-04-01 | C | 2 |
| 2018-01-13 | D | 1 |
| 2018-01-14 | D | 2 |
| 2018-01-24 | D | 3 |
| 2018-01-31 | D | 4 |
+------------+---------+--------+
Aggregated results:
+------+-------+---------+----+------------+------------------+----------+
| year | month | product | ct | avg_rating | avg_rating_other | other_ct |
+------+-------+---------+----+------------+------------------+----------+
| 2018 | 1 | A | 5 | 3.4 | 2.5 | 4 |
| 2018 | 2 | A | 1 | 5 | NULL | 0 |
| 2017 | 1 | B | 4 | 3.6666667 | NULL | 0 |
| 2017 | 7 | B | 1 | 2 | NULL | 0 |
| 2017 | 9 | B | 1 | 5 | 4.25 | 4 |
| 2017 | 9 | C | 4 | 4.25 | 5 | 1 |
| 2018 | 4 | C | 1 | 2 | NULL | 0 |
| 2018 | 1 | D | 4 | 2.5 | 3.4 | 5 |
+------+-------+---------+----+------------+------------------+----------+
I've also considered producing two aggregates, one with the product in question and one without, but I'm having trouble creating the appropriate join key.
You can do:
select year(date), month(date), product,
count(*) as ct, avg(rating) as avg_rating,
sum(count(*)) over (partition by year(date), month(date)) - count(*) as ct_other,
((sum(sum(rating)) over (partition by year(date), month(date)) - sum(rating)) /
(sum(count(*)) over (partition by year(date), month(date)) - count(*))
) as avg_other
from t
group by year(date), month(date), product;
The rating for the "other" products is a bit tricky. You need to add everything up for the month and subtract out the current group's values -- then calculate the average as that sum divided by that count.
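As a sanity check against the expected output, take 2018-01: product A has 5 ratings summing to 17 (avg 3.4) and product D has 4 ratings summing to 10 (avg 2.5), so the month totals are 9 rows and a sum of 27. For A, the "other" figures are 9 - 5 = 4 rows and (27 - 17) / 4 = 2.5; for D they are 9 - 4 = 5 rows and (27 - 10) / 5 = 3.4, matching the table above.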

SQL Getting Running Count with SUM and OVER

In SQL I have a history table for each item we have, and each item can have a record of IN or OUT with a quantity for each action. I'm trying to get a running count of how many of an item we have, based on whether the activity is OUT or IN. Here is my SQL so far:
SELECT itemid,
activitydate,
activitycode,
SUM(quantity) AS quantity,
SUM(CASE WHEN activitycode = 'IN'
THEN quantity
WHEN activitycode = 'OUT'
THEN -quantity
ELSE 0 END) OVER (PARTITION BY itemid ORDER BY activitydate rows unbounded preceding) AS runningcount
FROM itemhistory
GROUP BY itemid,
activitydate,
activitycode
This results in:
+--------+-------------------------+--------------+----------+--------------+
| itemid | activitydate | activitycode | quantity | runningcount |
+--------+-------------------------+--------------+----------+--------------+
| 1 | 2017-06-08 13:58:00.000 | IN | 1 | 1 |
| 1 | 2017-06-08 16:02:00.000 | IN | 6 | 2 |
| 1 | 2017-06-15 11:43:00.000 | OUT | 3 | 1 |
| 1 | 2017-06-19 12:36:00.000 | IN | 1 | 2 |
| 2 | 2017-06-08 13:50:00.000 | IN | 5 | 1 |
| 2 | 2017-06-12 12:41:00.000 | IN | 4 | 2 |
| 2 | 2017-06-15 11:38:00.000 | OUT | 2 | 1 |
| 2 | 2017-06-20 12:54:00.000 | IN | 15 | 2 |
| 2 | 2017-06-08 13:52:00.000 | IN | 5 | 3 |
| 2 | 2017-06-12 13:09:00.000 | IN | 1 | 4 |
| 2 | 2017-06-15 11:47:00.000 | OUT | 1 | 3 |
| 2 | 2017-06-20 13:14:00.000 | IN | 1 | 4 |
+--------+-------------------------+--------------+----------+--------------+
I want the end result to look like this:
+--------+-------------------------+--------------+----------+--------------+
| itemid | activitydate | activitycode | quantity | runningcount |
+--------+-------------------------+--------------+----------+--------------+
| 1 | 2017-06-08 13:58:00.000 | IN | 1 | 1 |
| 1 | 2017-06-08 16:02:00.000 | IN | 6 | 7 |
| 1 | 2017-06-15 11:43:00.000 | OUT | 3 | 4 |
| 1 | 2017-06-19 12:36:00.000 | IN | 1 | 5 |
| 2 | 2017-06-08 13:50:00.000 | IN | 5 | 5 |
| 2 | 2017-06-12 12:41:00.000 | IN | 4 | 9 |
| 2 | 2017-06-15 11:38:00.000 | OUT | 2 | 7 |
| 2 | 2017-06-20 12:54:00.000 | IN | 15 | 22 |
| 2 | 2017-06-08 13:52:00.000 | IN | 5 | 27 |
| 2 | 2017-06-12 13:09:00.000 | IN | 1 | 28 |
| 2 | 2017-06-15 11:47:00.000 | OUT | 1 | 27 |
| 2 | 2017-06-20 13:14:00.000 | IN | 1 | 28 |
+--------+-------------------------+--------------+----------+--------------+
You want sum(sum()), because this is an aggregation query:
SELECT itemid, activitydate, activitycode,
SUM(quantity) AS quantity,
SUM(SUM(CASE WHEN activitycode = 'IN' THEN quantity
WHEN activitycode = 'OUT' THEN -quantity
ELSE 0
END)
) OVER (PARTITION BY itemid ORDER BY activitydate ) AS runningcount
FROM itemhistory
GROUP BY itemid, activitydate, activitycode
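If two history rows for the same item can carry exactly the same activitydate, note that the default frame here is RANGE, which treats such rows as peers and gives them the same running total. A minimal variation of the same query (the activitycode tiebreaker is only an assumption for the sketch) that pins the frame to ROWS:
SELECT itemid, activitydate, activitycode,
       SUM(quantity) AS quantity,
       SUM(SUM(CASE WHEN activitycode = 'IN' THEN quantity
                    WHEN activitycode = 'OUT' THEN -quantity
                    ELSE 0
               END)
       ) OVER (PARTITION BY itemid
               ORDER BY activitydate, activitycode
               ROWS UNBOUNDED PRECEDING) AS runningcount -- row-by-row, even when dates tie
FROM itemhistory
GROUP BY itemid, activitydate, activitycode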

Count rows each month of a year - SQL Server

I have a table "Product" as:
| ProductId | ProductCatId | Price | Date | Deadline |
--------------------------------------------------------------------
| 1 | 1 | 10.00 | 2016-01-01 | 2016-01-27 |
| 2 | 2 | 10.00 | 2016-02-01 | 2016-02-27 |
| 3 | 3 | 10.00 | 2016-03-01 | 2016-03-27 |
| 4 | 1 | 10.00 | 2016-04-01 | 2016-04-27 |
| 5 | 3 | 10.00 | 2016-05-01 | 2016-05-27 |
| 6 | 3 | 10.00 | 2016-06-01 | 2016-06-27 |
| 7 | 1 | 20.00 | 2016-01-01 | 2016-01-27 |
| 8 | 2 | 30.00 | 2016-02-01 | 2016-02-27 |
| 9 | 1 | 40.00 | 2016-03-01 | 2016-03-27 |
| 10 | 4 | 15.00 | 2016-04-01 | 2016-04-27 |
| 11 | 1 | 25.00 | 2016-05-01 | 2016-05-27 |
| 12 | 5 | 55.00 | 2016-06-01 | 2016-06-27 |
| 13 | 5 | 55.00 | 2016-06-01 | 2016-01-27 |
| 14 | 5 | 55.00 | 2016-06-01 | 2016-02-27 |
| 15 | 5 | 55.00 | 2016-06-01 | 2016-03-27 |
I want to create a stored procedure that counts the rows of Product for each month, with the condition that the year equals the current year, like:
| Month| SumProducts | SumExpiredProducts |
-------------------------------------------
| 1 | 3 | 3 |
| 2 | 3 | 3 |
| 3 | 3 | 3 |
| 4 | 2 | 2 |
| 5 | 2 | 2 |
| 6 | 2 | 2 |
What should I do?
You can use a query like the following:
SELECT MONTH([Date]),
COUNT(*) AS SumProducts ,
COUNT(CASE WHEN [Date] > Deadline THEN 1 END) AS SumExpiredProducts
FROM mytable
WHERE YEAR([Date]) = YEAR(GETDATE())
GROUP BY MONTH([Date])
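Since the question asks for a stored procedure, one way to wrap that query (the procedure name, the optional @Year parameter, and the dbo.Product table name are assumptions for this sketch):
CREATE PROCEDURE dbo.usp_CountProductsByMonth
    @Year INT = NULL -- NULL means "use the current year"
AS
BEGIN
    SET NOCOUNT ON;
    SET @Year = COALESCE(@Year, YEAR(GETDATE()));

    SELECT MONTH([Date]) AS [Month],
           COUNT(*) AS SumProducts,
           COUNT(CASE WHEN [Date] > Deadline THEN 1 END) AS SumExpiredProducts
    FROM dbo.Product
    WHERE YEAR([Date]) = @Year
    GROUP BY MONTH([Date])
    ORDER BY MONTH([Date]);
END
Call it with EXEC dbo.usp_CountProductsByMonth; for the current year, or pass an explicit year, e.g. EXEC dbo.usp_CountProductsByMonth @Year = 2016;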