I'm using the OVER() clause to get the running total of payments, based on account number and the date each payment was made. I then subtract that running total from the current balance to determine the balance after each transaction.
SELECT
no_,
amount,
SUM(amount) OVER(PARTITION BY no_ ORDER BY effectiveDate DESC) AS RunningTotal,
balance - (SUM(amount) OVER(PARTITION BY no_ ORDER BY effectiveDate DESC)) + amount AS CalculatedBalance,
balance
FROM
c
WHERE
status != 'closed'
ORDER BY
no_
It works fine for positive numbers, but when the amount field is negative I get incorrect outputs like the ones below.
This happens for all negative amounts; I've checked all my positive amounts and they are correct. I looked online and can't find any reason for OVER() not accepting negative numbers.
Why are you doing the calculation in reverse? I would expect logic more like this:
SELECT no_, amount,
SUM(amount) OVER (PARTITION BY no_ ORDER BY effectiveDate ASC) AS RunningTotal,
(balance + amount +
SUM(amount) OVER (PARTITION BY no_ ORDER BY effectiveDate ASC)
) AS CalculatedBalance,
balance
FROM c
WHERE status <> 'closed'
ORDER BY no_;
In addition, because balance is negative, you want to add the amounts rather than subtract them.
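Window SUM handles negative amounts like any other value; the problem is only the frame direction and the balance arithmetic. Here is a minimal sketch using Python's bundled sqlite3 module (SQLite 3.25+ for window functions), with made-up rows whose table and column names simply mirror the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE c (no_ INT, effectiveDate TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO c VALUES (?, ?, ?)",
    [(1, "2018-01-01", 100.0),
     (1, "2018-01-02", -40.0),   # a negative amount is summed like any other value
     (1, "2018-01-03", 25.0)],
)
rows = conn.execute("""
    SELECT amount,
           SUM(amount) OVER (PARTITION BY no_ ORDER BY effectiveDate ASC) AS RunningTotal
    FROM c
    ORDER BY effectiveDate
""").fetchall()
print(rows)  # [(100.0, 100.0), (-40.0, 60.0), (25.0, 85.0)]
```

With ORDER BY ... ASC the total accumulates in chronological order, and the -40.00 row simply reduces it from 100 to 60.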
I think you should be made aware of the ROWS BETWEEN PRECEDING clause.
In the window frame clause, you indicate the window frame units (ROWS
or RANGE) and the window frame extent (the delimiters). With the ROWS
window frame unit, you can indicate the delimiters as one of three
options:
- UNBOUNDED PRECEDING or FOLLOWING, meaning the beginning or end of the partition, respectively
- CURRENT ROW, obviously representing the current row
- n PRECEDING or FOLLOWING, meaning n rows before or after the current row, respectively
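To make those delimiter options concrete, here is a small sketch using Python's sqlite3 module (the table t and its columns are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (seq INT, amount INT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(i, 10) for i in range(1, 6)])
rows = conn.execute("""
    SELECT seq,
           -- everything from the start of the partition up to this row
           SUM(amount) OVER (ORDER BY seq
               ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS running,
           -- a sliding window: the current row plus (at most) the 2 before it
           SUM(amount) OVER (ORDER BY seq
               ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS last3
    FROM t ORDER BY seq
""").fetchall()
print(rows)  # [(1, 10, 10), (2, 20, 20), (3, 30, 30), (4, 40, 30), (5, 50, 30)]
```

The first frame keeps growing; the second caps out at three rows once enough preceding rows exist.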
Example with syntax:
TEMP QUERY TO GROUP THE DAYS FOUND IN THE DATASET (1..7..infinity)
, DENSE_RANK()
OVER(PARTITION BY tbl.Account ORDER BY tbl.Date DESC)
AS [calcDayRankdenseGroup]
FINAL RESULTS:
select
cte.*
, SUM(cte.Amount)
OVER(PARTITION BY cte.Account ORDER BY cte.calcDayRankdenseGroup ASC
ROWS BETWEEN 7 PRECEDING and CURRENT ROW)
AS [Amount_RunningTotalDaysLast7]
from cteDateRankAdded cte
order by Account, DATE DESC
Results returned from my test data (shown below):
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| Account | Amount | DATE | calcDayRankGroup_DoNotUse | calcDayRankdenseGroup | calcDayCountInGroup | Amount_RunningTotalDaysLast7 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
| 1 | 30.00 | 2018-09-16 | 1 | 1 | 1 | 30.00 |
| 1 | 69.00 | 2018-09-16 | 1 | 1 | 2 | 99.00 |
| 1 | 6.00 | 2018-09-13 | 3 | 2 | 1 | 105.00 |
| 1 | 57.00 | 2018-09-12 | 4 | 3 | 1 | 162.00 |
| 1 | 13.00 | 2018-09-12 | 4 | 3 | 2 | 175.00 |
| 1 | 98.00 | 2018-09-12 | 4 | 3 | 3 | 273.00 |
| 1 | 47.00 | 2018-09-03 | 7 | 4 | 1 | 320.00 |
| 1 | 90.00 | 2018-09-02 | 8 | 5 | 1 | 410.00 |
| 1 | 90.00 | 2018-09-02 | 8 | 5 | 2 | 470.00 |
| 1 | 32.00 | 2018-08-29 | 10 | 6 | 1 | 433.00 |
| 1 | 50.00 | 2018-08-24 | 11 | 7 | 1 | 477.00 |
| 1 | 48.00 | 2018-08-24 | 11 | 7 | 2 | 468.00 |
| 1 | 100.00 | 2018-08-23 | 13 | 8 | 1 | 555.00 |
| 1 | 63.00 | 2018-08-20 | 14 | 9 | 1 | 520.00 |
| 1 | 49.00 | 2018-08-19 | 15 | 10 | 1 | 522.00 |
| 2 | 38.00 | 2018-09-16 | 1 | 1 | 1 | 38.00 |
| 2 | 3.00 | 2018-09-16 | 1 | 1 | 2 | 41.00 |
| 2 | 83.00 | 2018-09-13 | 3 | 2 | 1 | 124.00 |
| 2 | 28.00 | 2018-09-12 | 4 | 3 | 1 | 152.00 |
| 2 | 78.00 | 2018-09-12 | 4 | 3 | 2 | 230.00 |
| 2 | 32.00 | 2018-09-12 | 4 | 3 | 3 | 262.00 |
| 2 | 29.00 | 2018-09-03 | 7 | 4 | 1 | 291.00 |
| 2 | 64.00 | 2018-09-02 | 8 | 5 | 1 | 355.00 |
| 2 | 81.00 | 2018-09-02 | 8 | 5 | 2 | 398.00 |
| 2 | 60.00 | 2018-08-29 | 10 | 6 | 1 | 455.00 |
| 2 | 82.00 | 2018-08-24 | 11 | 7 | 1 | 454.00 |
| 2 | 66.00 | 2018-08-24 | 11 | 7 | 2 | 492.00 |
| 2 | 9.00 | 2018-08-23 | 13 | 8 | 1 | 423.00 |
| 2 | 45.00 | 2018-08-20 | 14 | 9 | 1 | 436.00 |
| 2 | 43.00 | 2018-08-19 | 15 | 10 | 1 | 450.00 |
+---------+--------+------------+---------------------------+-----------------------+---------------------+------------------------------+
Here is the sample data I created to represent your scenario, with today being 9/18/2018.
+-----+---------+------------+--------+------------------+------------------+
| Seq | Account | DATE | Amount | dataDaysAgoDelta | dataDaysAgoGroup |
+-----+---------+------------+--------+------------------+------------------+
| 1 | 1 | 2018-08-19 | 49.00 | 30 | 10 |
| 2 | 1 | 2018-08-20 | 63.00 | 29 | 9 |
| 3 | 1 | 2018-08-23 | 100.00 | 26 | 8 |
| 4 | 1 | 2018-08-24 | 50.00 | 25 | 7 |
| 5 | 1 | 2018-08-24 | 48.00 | 25 | 7 |
| 6 | 1 | 2018-08-29 | 32.00 | 20 | 6 |
| 7 | 1 | 2018-09-02 | 90.00 | 16 | 5 |
| 8 | 1 | 2018-09-02 | 90.00 | 16 | 5 |
| 9 | 1 | 2018-09-03 | 47.00 | 15 | 4 |
| 10 | 1 | 2018-09-12 | 57.00 | 6 | 3 |
| 11 | 1 | 2018-09-12 | 13.00 | 6 | 3 |
| 12 | 1 | 2018-09-12 | 98.00 | 6 | 3 |
| 13 | 1 | 2018-09-13 | 6.00 | 5 | 2 |
| 14 | 1 | 2018-09-16 | 30.00 | 2 | 1 |
| 15 | 1 | 2018-09-16 | 69.00 | 2 | 1 |
| 16 | 2 | 2018-08-19 | 43.00 | 30 | 10 |
| 17 | 2 | 2018-08-20 | 45.00 | 29 | 9 |
| 18 | 2 | 2018-08-23 | 9.00 | 26 | 8 |
| 19 | 2 | 2018-08-24 | 82.00 | 25 | 7 |
| 20 | 2 | 2018-08-24 | 66.00 | 25 | 7 |
| 21 | 2 | 2018-08-29 | 60.00 | 20 | 6 |
| 22 | 2 | 2018-09-02 | 64.00 | 16 | 5 |
| 23 | 2 | 2018-09-02 | 81.00 | 16 | 5 |
| 24 | 2 | 2018-09-03 | 29.00 | 15 | 4 |
| 25 | 2 | 2018-09-12 | 28.00 | 6 | 3 |
| 26 | 2 | 2018-09-12 | 78.00 | 6 | 3 |
| 27 | 2 | 2018-09-12 | 32.00 | 6 | 3 |
| 28 | 2 | 2018-09-13 | 83.00 | 5 | 2 |
| 29 | 2 | 2018-09-16 | 38.00 | 2 | 1 |
| 30 | 2 | 2018-09-16 | 3.00 | 2 | 1 |
+-----+---------+------------+--------+------------------+------------------+
COMPLETE QUERY:
DECLARE @tblDaysLast7Found as table
(Seq int, Account int, DATE date, Amount numeric(10,2), dataDaysAgoDelta int, dataDaysAgoGroup int)
INSERT INTO @tblDaysLast7Found
(Seq, Account, DATE, Amount, dataDaysAgoDelta, dataDaysAgoGroup)
VALUES
(1, 1, '8/19/2018', 49, 30, 10)
,(2, 1, '8/20/2018', 63, 29, 9)
,(3, 1, '8/23/2018', 100, 26, 8)
,(4, 1, '8/24/2018', 50, 25, 7)
,(5, 1, '8/24/2018', 48, 25, 7)
,(6, 1, '8/29/2018', 32, 20, 6)
,(7, 1, '9/2/2018', 90, 16, 5)
,(8, 1, '9/2/2018', 90, 16, 5)
,(9, 1, '9/3/2018', 47, 15, 4)
,(10, 1, '9/12/2018', 57, 6, 3)
,(11, 1, '9/12/2018', 13, 6, 3)
,(12, 1, '9/12/2018', 98, 6, 3)
,(13, 1, '9/13/2018', 6, 5, 2)
,(14, 1, '9/16/2018', 30, 2, 1)
,(15, 1, '9/16/2018', 69, 2, 1)
,(16, 2, '8/19/2018', 43, 30, 10)
,(17, 2, '8/20/2018', 45, 29, 9)
,(18, 2, '8/23/2018', 9, 26, 8)
,(19, 2, '8/24/2018', 82, 25, 7)
,(20, 2, '8/24/2018', 66, 25, 7)
,(21, 2, '8/29/2018', 60, 20, 6)
,(22, 2, '9/2/2018', 64, 16, 5)
,(23, 2, '9/2/2018', 81, 16, 5)
,(24, 2, '9/3/2018', 29, 15, 4)
,(25, 2, '9/12/2018', 28, 6, 3)
,(26, 2, '9/12/2018', 78, 6, 3)
,(27, 2, '9/12/2018', 32, 6, 3)
,(28, 2, '9/13/2018', 83, 5, 2)
,(29, 2, '9/16/2018', 38, 2, 1)
,(30, 2, '9/16/2018', 3, 2, 1)
--select * from @tblDaysLast7Found
;WITH cteDateRankAdded AS
(
select -- *
tbl.Account
, tbl.Amount
, tbl.DATE
, RANK()
OVER(PARTITION BY tbl.Account ORDER BY tbl.Date DESC)
AS [calcDayRankGroup_DoNotUse]
, DENSE_RANK()
OVER(PARTITION BY tbl.Account ORDER BY tbl.Date DESC)
AS [calcDayRankdenseGroup]
, ROW_NUMBER()
OVER(PARTITION BY tbl.Account, tbl.Date ORDER BY tbl.Date DESC)
AS [calcDayCountInGroup]
from
@tblDaysLast7Found tbl
)
select
cte.*
, SUM(cte.Amount)
OVER(PARTITION BY cte.Account ORDER BY cte.calcDayRankdenseGroup ASC
ROWS BETWEEN 7 PRECEDING and CURRENT ROW)
AS [Amount_RunningTotalDaysLast7]
from cteDateRankAdded cte
order by Account, DATE DESC
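The same DENSE_RANK-then-framed-SUM pattern can be reproduced outside SQL Server. Below is a condensed sketch using Python's sqlite3 module, with one row per day and simplified stand-in table/column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tx (account INT, dt TEXT, amount REAL)")
conn.executemany("INSERT INTO tx VALUES (?, ?, ?)", [
    (1, "2018-09-16", 99.0),
    (1, "2018-09-13", 6.0),
    (1, "2018-09-12", 57.0),
    (1, "2018-09-03", 47.0),
])
rows = conn.execute("""
    WITH ranked AS (
        SELECT account, dt, amount,
               -- newest day gets group 1, next-newest 2, and so on
               DENSE_RANK() OVER (PARTITION BY account ORDER BY dt DESC) AS day_grp
        FROM tx
    )
    SELECT dt, amount,
           -- the frame caps the sum at the current day-group plus the 7 before it
           SUM(amount) OVER (PARTITION BY account ORDER BY day_grp ASC
               ROWS BETWEEN 7 PRECEDING AND CURRENT ROW) AS running_7day
    FROM ranked
    ORDER BY dt DESC
""").fetchall()
print(rows)
```

With only four day-groups the frame never clips anything here; with more than eight groups, the oldest ones would start dropping out of the total.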
I have two tables: the first, inv, contains invoice records; the second, pay, contains payments. I want to match the payments to the inv table by inv_amount and inv_date. There might be more than one invoice with the same amount on the same day, and also more than one payment of the same amount on the same day.
Each payment should be matched with the first matching invoice, and every payment must be matched only once.
This is my data:
Table inv
inv_id | inv_amount | inv_date | inv_number
--------+------------+------------+------------
1 | 10 | 2018-01-01 | 1
2 | 16 | 2018-01-01 | 1
3 | 12 | 2018-02-02 | 2
4 | 14 | 2018-02-03 | 3
5 | 19 | 2018-02-04 | 3
6 | 19 | 2018-02-04 | 5
7 | 5 | 2018-02-04 | 6
8 | 40 | 2018-02-04 | 7
9 | 19 | 2018-02-04 | 8
10 | 19 | 2018-02-05 | 9
11 | 20 | 2018-02-05 | 10
12 | 20 | 2018-02-07 | 11
Table pay
pay_id | pay_amount | pay_date
--------+------------+------------
1 | 10 | 2018-01-01
2 | 12 | 2018-02-02
4 | 19 | 2018-02-04
3 | 14 | 2018-02-03
5 | 5 | 2018-02-04
6 | 19 | 2018-02-04
7 | 19 | 2018-02-05
8 | 20 | 2018-02-07
My Query:
SELECT DISTINCT ON (inv.inv_id) inv.inv_id,
inv.inv_amount,
inv.inv_date,
inv.inv_number,
pay.pay_id
FROM ("2016".pay
RIGHT JOIN "2016".inv ON (((pay.pay_amount = inv.inv_amount) AND (pay.pay_date = inv.inv_date))))
ORDER BY inv.inv_id
resulting in:
inv_id | inv_amount | inv_date | inv_number | pay_id
--------+------------+------------+------------+--------
1 | 10 | 2018-01-01 | 1 | 1
2 | 16 | 2018-01-01 | 1 |
3 | 12 | 2018-02-02 | 2 | 2
4 | 14 | 2018-02-03 | 3 | 3
5 | 19 | 2018-02-04 | 3 | 4
6 | 19 | 2018-02-04 | 5 | 4
7 | 5 | 2018-02-04 | 6 | 5
8 | 40 | 2018-02-04 | 7 |
9 | 19 | 2018-02-04 | 8 | 6
10 | 19 | 2018-02-05 | 9 | 7
11 | 20 | 2018-02-05 | 10 |
12 | 20 | 2018-02-07 | 11 | 8
The record inv_id = 6 should not match with pay_id = 4, as that would mean payment 4 was entered twice.
Desired result:
inv_id | inv_amount | inv_date | inv_number | pay_id
--------+------------+------------+------------+--------
1 | 10 | 2018-01-01 | 1 | 1
2 | 16 | 2018-01-01 | 1 |
3 | 12 | 2018-02-02 | 2 | 2
4 | 14 | 2018-02-03 | 3 | 3
5 | 19 | 2018-02-04 | 3 | 4
6 | 19 | 2018-02-04 | 5 | <- should be empty
7 | 5 | 2018-02-04 | 6 | 5
8 | 40 | 2018-02-04 | 7 |
9 | 19 | 2018-02-04 | 8 | 6
10 | 19 | 2018-02-05 | 9 | 7
11 | 20 | 2018-02-05 | 10 |
12 | 20 | 2018-02-07 | 11 | 8
Disclaimer: Yes, I asked this question yesterday with the original data, but someone pointed out that my SQL was very hard to read, so I created a cleaner representation of my problem.
For convenience, here's an SQL Fiddle to test: http://sqlfiddle.com/#!17/018d7/1
After seeing the example I think I've got the query for you:
WITH payments_cte AS (
SELECT
payment_id,
payment_amount,
payment_date,
ROW_NUMBER() OVER (PARTITION BY payment_amount, payment_date ORDER BY payment_id) AS payment_row
FROM payments
), invoices_cte AS (
SELECT
invoice_id,
invoice_amount,
invoice_date,
invoice_number,
ROW_NUMBER() OVER (PARTITION BY invoice_amount, invoice_date ORDER BY invoice_id) AS invoice_row
FROM invoices
)
SELECT invoice_id, invoice_amount, invoice_date, invoice_number, payment_id
FROM invoices_cte
LEFT JOIN payments_cte
ON payment_amount = invoice_amount
AND payment_date = invoice_date
AND payment_row = invoice_row
ORDER BY invoice_id, payment_id
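The core trick is numbering the duplicates on both sides and joining on that row number, so each payment is consumed at most once. A sketch using Python's sqlite3 module, trimmed to just the ambiguous 19.00 / 2018-02-04 rows (note that which duplicate invoice ends up unmatched depends on the ORDER BY inside ROW_NUMBER):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE inv (inv_id INT, inv_amount INT, inv_date TEXT);
    CREATE TABLE pay (pay_id INT, pay_amount INT, pay_date TEXT);
    INSERT INTO inv VALUES (5, 19, '2018-02-04'), (6, 19, '2018-02-04'), (9, 19, '2018-02-04');
    INSERT INTO pay VALUES (4, 19, '2018-02-04'), (6, 19, '2018-02-04');
""")
rows = conn.execute("""
    WITH i AS (
        SELECT inv_id, inv_amount, inv_date,
               ROW_NUMBER() OVER (PARTITION BY inv_amount, inv_date ORDER BY inv_id) AS rn
        FROM inv
    ), p AS (
        SELECT pay_id, pay_amount, pay_date,
               ROW_NUMBER() OVER (PARTITION BY pay_amount, pay_date ORDER BY pay_id) AS rn
        FROM pay
    )
    SELECT i.inv_id, p.pay_id
    FROM i LEFT JOIN p
      ON p.pay_amount = i.inv_amount AND p.pay_date = i.inv_date AND p.rn = i.rn
    ORDER BY i.inv_id
""").fetchall()
print(rows)  # [(5, 4), (6, 6), (9, None)]
```

Three invoices compete for two payments; the join pairs them 1-to-1 by row number, and the leftover invoice gets NULL instead of reusing a payment.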
Hopefully this is a simple question, explained clearly enough.
I have the following Query:
--DECLARE DATES
DECLARE @Date datetime
DECLARE @DaysInMonth INT
DECLARE @i INT
--GIVE VALUES
SET @Date = Getdate()
SELECT @DaysInMonth = datepart(dd,dateadd(dd,-1,dateadd(mm,1,cast(cast(year(@Date) as varchar)+'-'+cast(month(@Date) as varchar)+'-01' as datetime))))
SET @i = 1
--MAKE TEMP TABLE
CREATE TABLE #TempDays
(
[days] VARCHAR(50)
)
WHILE @i <= @DaysInMonth
BEGIN
INSERT INTO #TempDays
VALUES(@i)
SET @i = @i + 1
END
SELECT #TempDays.days, DATEPART(dd, a.ActualDate) ActualDate, a.ActualAmount, (SELECT SUM(b.ActualAmount)
FROM UnpaidManagement..Actual b
WHERE b.ID <= a.ID) RunningTotal
FROM UnpaidManagement..Actual a
RIGHT JOIN #TempDays on a.ID = #TempDays.days
DROP TABLE #TempDays
Which produces the following output:
+------+------------+--------------+--------------+
| days | ActualDate | ActualAmount | RunningTotal |
+------+------------+--------------+--------------+
| 1 | 1 | 438706 | R 438 706 |
| 2 | 2 | 16239 | R 454 945 |
| 3 | 3 | 1611264 | R 2 066 209 |
| 4 | 4 | 1157777 | R 3 223 986 |
| 5 | 5 | 470662 | R 3 694 648 |
| 6 | 6 | 288628 | 3983276 |
| 7 | 7 | 245897 | 4229173 |
| 8 | 8 | 5235 | 4234408 |
| 9 | 10 | 375630 | 4610038 |
| 10 | 11 | 95610 | 4705648 |
| 11 | 12 | 87285 | 4792933 |
| 12 | 13 | 73399 | 4866332 |
| 13 | 14 | 59516 | 4925848 |
| 14 | 15 | 918915 | 5844763 |
| 15 | 17 | 1957285 | 7802048 |
| 16 | 18 | 489964 | 8292012 |
| 17 | 19 | 272304 | 8564316 |
| 18 | 20 | 378601 | 8942917 |
| 19 | 22 | 92374 | 9035291 |
| 20 | 23 | 198 | 9035489 |
| 21 | 24 | 1500820 | 10536309 |
| 22 | 25 | 2631057 | 13167366 |
| 23 | 26 | 6466505 | 19633871 |
| 24 | 27 | 3757350 | 23391221 |
| 25 | 28 | 3487466 | 26878687 |
| 26 | 29 | 160197 | 27038884 |
| 27 | 30 | 14000 | 27052884 |
| 28 | NULL | NULL | NULL |
| 29 | NULL | NULL | NULL |
| 30 | NULL | NULL | NULL |
| 31 | NULL | NULL | NULL |
+------+------------+--------------+--------------+
If you look closely at the table above, the "ActualDate" column is missing a few values, e.g. 9, 16, etc.
Because of this, the rows are being pushed up instead of staying aligned with their correct day number. How would I accomplish a group by / anything else to keep them in their correct rows?
DESIRED OUTPUT:
+------+------------+--------------+--------------+
| days | ActualDate | ActualAmount | RunningTotal |
+------+------------+--------------+--------------+
| 1 | 1 | 438706 | R 438 706 |
| 2 | 2 | 16239 | R 454 945 |
| 3 | 3 | 1611264 | R 2 066 209 |
| 4 | 4 | 1157777 | R 3 223 986 |
| 5 | 5 | 470662 | R 3 694 648 |
| 6 | 6 | 288628 | 3983276 |
| 7 | 7 | 245897 | 4229173 |
| 8 | 8 | 5235 | 4234408 |
| 9 | NULL | NULL | NULL |
| 10 | 10 | 375630 | 4610038 |
| 11 | 11 | 95610 | 4705648 |
| 12 | 12 | 87285 | 4792933 |
| 13 | 13 | 73399 | 4866332 |
| 14 | 14 | 59516 | 4925848 |
| 15 | 15 | 918915 | 5844763 |
| 16 | NULL | NULL | NULL |
| 17 | 17 | 1957285 | 7802048 |
| 18 | 18 | 489964 | 8292012 |
| 19 | 19 | 272304 | 8564316 |
| 20 | 20 | 378601 | 8942917 |
| 21 | NULL | NULL | NULL |
| 22 | 22 | 92374 | 9035291 |
| 23 | 23 | 198 | 9035489 |
| 24 | 24 | 1500820 | 10536309 |
| 25 | 25 | 2631057 | 13167366 |
| 26 | 26 | 6466505 | 19633871 |
| 27 | 27 | 3757350 | 23391221 |
| 28 | 28 | 3487466 | 26878687 |
| 29 | 29 | 160197 | 27038884 |
| 30 | 30 | 14000 | 27052884 |
| 31 | NULL | NULL | NULL |
+------+------------+--------------+--------------+
I know this is a long one to read, but please let me know if I have explained this clearly enough. I have been trying to group by this whole morning, but I keep getting errors.
SELECT #TempDays.days, DATEPART(dd, a.ActualDate) ActualDate, a.ActualAmount, (SELECT SUM(b.ActualAmount)
FROM UnpaidManagement..Actual b
WHERE b.ID <= a.ID) RunningTotal
FROM UnpaidManagement..Actual a
RIGHT JOIN #TempDays on DATEPART(dd, a.ActualDate) = #TempDays.days
If you select the temp table as the first table in the SELECT and join it to UnpaidManagement..Actual, the days stay in the correct row and order (a LEFT JOIN keeps the days that have no matching data):
SELECT t.days
,DATEPART(dd, a.ActualDate) ActualDate
,a.ActualAmount
,(
SELECT SUM(b.ActualAmount)
FROM UnpaidManagement..Actual b
WHERE b.ID <= a.ID
) RunningTotal
FROM #TempDays AS t
LEFT JOIN UnpaidManagement..Actual AS a ON a.IDENTITYCOL = t.days
ORDER BY t.days
After doing that, you can add a CASE WHEN to generate content for the NULL cells.
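A minimal sketch of that shape using Python's sqlite3 module, with invented toy tables and a window SUM standing in for the correlated subquery: the day list drives a LEFT JOIN, so a day with no data keeps its row, and SUM simply ignores the NULL amount.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE days (day INT);
    INSERT INTO days VALUES (1), (2), (3), (4);
    CREATE TABLE actual (id INT, day INT, amount INT);
    INSERT INTO actual VALUES (1, 1, 100), (2, 2, 50), (3, 4, 25);  -- day 3 has no data
""")
rows = conn.execute("""
    SELECT d.day, a.amount,
           SUM(a.amount) OVER (ORDER BY d.day) AS running
    FROM days d
    LEFT JOIN actual a ON a.day = d.day
    ORDER BY d.day
""").fetchall()
print(rows)  # [(1, 100, 100), (2, 50, 150), (3, None, 150), (4, 25, 175)]
```

Note that the running total carries 150 through the empty day 3 rather than showing NULL as in the desired output above; a CASE on a.amount IS NULL can blank it if needed.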
I have a table with a list of people; let's say there are 100 people in that table.
I need to filter the people using different criteria and put them in groups. The problem is that when I start excluding at the 4th-5th level, performance issues come up and it becomes slow.
with lst_tous_movements as (
select
t1.refid_eClinibase
,t1.[dthrfinmouvement]
,t1.[unite_service_id]
,t1.[unite_service_suiv_id]
from sometable t1
)
,lst_patients_hospitalisés as (
select distinct
t1.refid_eClinibase
from lst_tous_movements t1
where
t1.[dthrfinmouvement] = '4000-01-01'
)
,lst_patients_admisUIB_transferes as (
select distinct
t1.refid_eClinibase
from lst_tous_movements t1
left join lst_patients_hospitalisés t2 on t1.refid_eClinibase = t2.refid_eClinibase
where
t1.[unite_service_id] = 4
and t1.[unite_service_suiv_id] <> 0
and t2.refid_eClinibase is null
)
,lst_patients_admisUIB_nonTransferes as (
select distinct
t1.refid_eClinibase
from lst_tous_movements t1
left join lst_patients_admisUIB_transferes t2 on t1.refid_eClinibase = t2.refid_eClinibase
left join lst_patients_hospitalisés t3 on t1.refid_eClinibase = t3.refid_eClinibase
where
t1.[unite_service_id] = 4
and t1.[unite_service_suiv_id] = 0
and t2.refid_eClinibase is null
and t3.refid_eClinibase is null
)
,lst_patients_autres as (
select distinct
t1.refid_eClinibase
from lst_patients t1
left join lst_patients_admisUIB_transferes t2 on t1.refid_eClinibase = t2.refid_eClinibase
left join lst_patients_hospitalisés t3 on t1.refid_eClinibase = t3.refid_eClinibase
left join lst_patients_admisUIB_nonTransferes t4 on t1.refid_eClinibase = t4.refid_eClinibase
where
t2.refid_eClinibase is null
and t3.refid_eClinibase is null
and t4.refid_eClinibase is null
)
As you can see, I have multi-level filtering going on here:
1st I get the people where t1.[dthrfinmouvement] = '4000-01-01'
2nd I get the people matching another criterion, EXCLUDING the 1st group
3rd I get the people matching yet another criterion, EXCLUDING the 1st and
the 2nd groups
etc.
When I get to the 4th level, my query takes 6-10 seconds to complete.
Is there any way to speed this up?
this is my dataset i'm working with:
+------------------+-------------------------------+------------------+------------------+-----------------------+
| refid_eClinibase | nodossierpermanent_eClinibase | dthrfinmouvement | unite_service_id | unite_service_suiv_id |
+------------------+-------------------------------+------------------+------------------+-----------------------+
| 25611 | P0017379 | 2013-04-27 | 58 | 0 |
| 25611 | P0017379 | 2013-05-02 | 4 | 2 |
| 25611 | P0017379 | 2013-05-18 | 2 | 0 |
| 85886 | P0077918 | 2013-04-10 | 58 | 0 |
| 85886 | P0077918 | 2013-05-06 | 6 | 12 |
| 85886 | P0077918 | 4000-01-01 | 12 | 0 |
| 91312 | P0083352 | 2013-07-24 | 3 | 14 |
| 91312 | P0083352 | 2013-07-24 | 14 | 3 |
| 91312 | P0083352 | 2013-07-30 | 3 | 8 |
| 91312 | P0083352 | 4000-01-01 | 8 | 0 |
| 93835 | P0085879 | 2013-04-30 | 58 | 0 |
| 93835 | P0085879 | 2013-05-07 | 4 | 2 |
| 93835 | P0085879 | 2013-05-16 | 2 | 0 |
| 93835 | P0085879 | 2013-05-22 | 58 | 0 |
| 93835 | P0085879 | 2013-05-24 | 4 | 0 |
| 93835 | P0085879 | 2013-05-31 | 58 | 0 |
| 93836 | P0085880 | 2013-05-20 | 58 | 0 |
| 93836 | P0085880 | 2013-05-22 | 4 | 2 |
| 93836 | P0085880 | 2013-05-31 | 2 | 0 |
| 97509 | P0089576 | 2013-04-09 | 58 | 0 |
| 97509 | P0089576 | 2013-04-11 | 4 | 0 |
| 102787 | P0094886 | 2013-04-08 | 58 | 0 |
| 102787 | P0094886 | 2013-04-11 | 4 | 2 |
| 102787 | P0094886 | 2013-05-21 | 2 | 0 |
| 103029 | P0095128 | 2013-04-04 | 58 | 0 |
| 103029 | P0095128 | 2013-04-10 | 4 | 1 |
| 103029 | P0095128 | 2013-05-03 | 1 | 0 |
| 103813 | P0095922 | 2013-07-02 | 58 | 0 |
| 103813 | P0095922 | 2013-07-03 | 4 | 6 |
| 103813 | P0095922 | 2013-08-14 | 6 | 0 |
| 105106 | P0097215 | 2013-08-09 | 58 | 0 |
| 105106 | P0097215 | 2013-08-13 | 4 | 0 |
| 105106 | P0097215 | 2013-08-14 | 58 | 0 |
| 105106 | P0097215 | 4000-01-01 | 4 | 0 |
| 106223 | P0098332 | 2013-06-11 | 1 | 0 |
| 106223 | P0098332 | 2013-08-01 | 58 | 0 |
| 106223 | P0098332 | 4000-01-01 | 1 | 0 |
| 106245 | P0098354 | 2013-04-02 | 58 | 0 |
| 106245 | P0098354 | 2013-05-24 | 58 | 0 |
| 106245 | P0098354 | 2013-05-29 | 4 | 1 |
| 106245 | P0098354 | 2013-07-12 | 1 | 0 |
| 106280 | P0098389 | 2013-04-07 | 58 | 0 |
| 106280 | P0098389 | 2013-04-09 | 4 | 0 |
| 106416 | P0098525 | 2013-04-19 | 58 | 0 |
| 106416 | P0098525 | 2013-04-23 | 4 | 0 |
| 106444 | P0098553 | 2013-04-22 | 58 | 0 |
| 106444 | P0098553 | 2013-04-25 | 4 | 0 |
| 106609 | P0098718 | 2013-05-08 | 58 | 0 |
| 106609 | P0098718 | 2013-05-10 | 4 | 11 |
| 106609 | P0098718 | 2013-07-24 | 11 | 12 |
| 106609 | P0098718 | 4000-01-01 | 12 | 0 |
| 106616 | P0098725 | 2013-05-09 | 58 | 0 |
| 106616 | P0098725 | 2013-05-09 | 4 | 1 |
| 106616 | P0098725 | 2013-07-27 | 1 | 0 |
| 106698 | P0098807 | 2013-05-16 | 58 | 0 |
| 106698 | P0098807 | 2013-05-22 | 4 | 6 |
| 106698 | P0098807 | 2013-06-14 | 6 | 1 |
| 106698 | P0098807 | 2013-06-28 | 1 | 0 |
| 106714 | P0098823 | 2013-05-20 | 58 | 0 |
| 106714 | P0098823 | 2013-05-21 | 58 | 0 |
| 106714 | P0098823 | 2013-05-24 | 58 | 0 |
| 106729 | P0098838 | 2013-05-21 | 58 | 0 |
| 106729 | P0098838 | 2013-05-23 | 4 | 1 |
| 106729 | P0098838 | 2013-06-03 | 1 | 0 |
| 107038 | P0099147 | 2013-06-25 | 58 | 0 |
| 107038 | P0099147 | 2013-06-28 | 4 | 1 |
| 107038 | P0099147 | 2013-07-04 | 1 | 0 |
| 107038 | P0099147 | 2013-08-13 | 58 | 0 |
| 107038 | P0099147 | 2013-08-15 | 4 | 6 |
| 107038 | P0099147 | 4000-01-01 | 6 | 0 |
| 107082 | P0099191 | 2013-06-29 | 58 | 0 |
| 107082 | P0099191 | 2013-07-04 | 4 | 6 |
| 107082 | P0099191 | 2013-07-19 | 6 | 0 |
| 107157 | P0099267 | 4000-01-01 | 13 | 0 |
| 107336 | P0099446 | 4000-01-01 | 6 | 0 |
+------------------+-------------------------------+------------------+------------------+-----------------------+
thanks.
It is hard to understand exactly what all your rules are from the question, but the general approach should be to add a "Grouping" column in a single query, using a CASE expression to categorize the people.
The conditions in a CASE are evaluated in order, so if the first criterion is met, the subsequent criteria are not even evaluated for that row.
Here is some code to get you started....
select t1.refid_eClinibase
,t1.[dthrfinmouvement]
,t1.[unite_service_id]
,t1.[unite_service_suiv_id]
,CASE WHEN [dthrfinmouvement] = '4000-01-01' THEN 'Group1 Label'
WHEN condition2 = something THEN 'Group2 Label'
....
WHEN conditionN = something THEN 'GroupN Label'
ELSE 'Catch All Label'
END as person_category
from sometable t1
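A runnable sketch of that approach using Python's sqlite3 module (column names shortened, group labels invented). Note this categorizes each movement row; the original CTEs categorize people, so in practice you would aggregate per refid afterwards:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mv (refid INT, dthrfin TEXT, unite INT, unite_suiv INT)")
conn.executemany("INSERT INTO mv VALUES (?, ?, ?, ?)", [
    (85886, "4000-01-01", 12, 0),   # open movement -> hospitalised
    (25611, "2013-05-02", 4, 2),    # unit 4, transferred on
    (97509, "2013-04-11", 4, 0),    # unit 4, not transferred
    (106714, "2013-05-24", 58, 0),  # none of the above
])
rows = conn.execute("""
    SELECT refid,
           -- first matching WHEN wins, so each row lands in exactly one group
           CASE WHEN dthrfin = '4000-01-01' THEN 'hospitalised'
                WHEN unite = 4 AND unite_suiv <> 0 THEN 'admitted_transferred'
                WHEN unite = 4 AND unite_suiv = 0 THEN 'admitted_not_transferred'
                ELSE 'other'
           END AS person_category
    FROM mv
    ORDER BY refid
""").fetchall()
print(rows)
```

Because the WHEN branches short-circuit, the exclusion logic of the chained anti-joins is expressed in a single pass over the table, which is what makes this faster.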