I am having difficulty figuring out this dang problem. From the data and query I have given below, I am trying to find the email address that has rented the most movies during the month of September.
There are only 4 relevant tables in my database and they have been anonymized and shortened:
Table "cust":
cust_id  f_name  l_name   email
1        Jack    Daniels  jack.daniels#google.com
2        Jose    Quervo   jose.quervo#yahoo.com
5        Jim     Beam     jim.beam#protonmail.com
Table "rent":
inv_id  cust_id  rent_date
10      1        9/1/2022 10:29
11      1        9/2/2022 18:16
12      1        9/2/2022 18:17
13      1        9/17/2022 17:34
14      1        9/19/2022 6:32
15      1        9/19/2022 6:33
16      3        9/1/2022 18:45
17      3        9/1/2022 18:46
18      3        9/2/2022 18:45
19      3        9/2/2022 18:46
20      3        9/17/2022 18:32
21      3        9/19/2022 22:12
10      2        9/19/2022 11:43
11      2        9/19/2022 11:42
Table "inv":
mov_id  inv_id
22      10
23      11
24      12
25      13
26      14
27      15
28      16
29      17
30      18
31      19
31      20
32      21
Table "mov":
mov_id  titl          rate
22      Anaconda      3.99
23      Exorcist      1.99
24      Philadelphia  3.99
25      Quest         1.99
26      Sweden        1.99
27      Speed         1.99
28      Nemo          1.99
29      Zoolander     5.99
30      Truman        5.99
31      Patient       1.99
32      Racer         3.99
and here is my current query progress:
SELECT cust.email,
COUNT(DISTINCT inv.mov_id) AS "Rented_Count"
FROM cust
JOIN rent ON rent.cust_id = cust.cust_id
JOIN inv ON inv.inv_id = rent.inv_id
JOIN mov ON mov.mov_id = inv.mov_id
WHERE rent.rent_date BETWEEN '2022-09-01' AND '2022-09-31'
GROUP BY cust.email
ORDER BY "Rented_Count" DESC;
and here is what it outputs:
email                    Rented_Count
jack.daniels#google.com  6
jim.beam#protonmail.com  6
jose.quervo#yahoo.com    2
and what I want it to be outputting:
email
jack.daniels#google.com
jim.beam#protonmail.com
From the results I am actually getting, I have a tie for first place (Jim and Jack), and that is fine, but I would like it to list both tying email addresses, not just Jack's, so I don't think I can just limit the rows or use MAX.
I think it must have something to do with DENSE_RANK, but I don't know how to use it in this scenario with the COUNT and GROUP BY.
Your creativity and help would be appreciated.
You're missing the FETCH FIRST ROWS WITH TIES clause. It will work together with the ORDER BY clause to get you the highest values (FIRST ROWS), including ties (WITH TIES).
SELECT cust.email
FROM cust
INNER JOIN rent
ON rent.cust_id = cust.cust_id
INNER JOIN inv
ON inv.inv_id = rent.inv_id
INNER JOIN mov
ON mov.mov_id = inv.mov_id
WHERE rent.rent_date >= DATE '2022-09-01'
  AND rent.rent_date <  DATE '2022-10-01' -- half-open range: September has only 30 days, so '2022-09-31' is not a valid date
GROUP BY cust.email
ORDER BY COUNT(DISTINCT inv.mov_id) DESC
FETCH FIRST 1 ROWS WITH TIES
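Since you mentioned DENSE_RANK: if your database does not support WITH TIES, a roughly equivalent sketch (using your table and column names; the join to mov is left out because nothing from mov is selected) ranks the counts in a derived table and keeps rank 1, which naturally includes ties:

SELECT email
FROM (
    SELECT cust.email,
           DENSE_RANK() OVER (ORDER BY COUNT(DISTINCT inv.mov_id) DESC) AS rnk  -- rank 1 = highest count; ties share the rank
    FROM cust
    INNER JOIN rent ON rent.cust_id = cust.cust_id
    INNER JOIN inv  ON inv.inv_id   = rent.inv_id
    WHERE rent.rent_date >= DATE '2022-09-01'
      AND rent.rent_date <  DATE '2022-10-01'
    GROUP BY cust.email
) ranked
WHERE rnk = 1;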
I'm having a difficult time writing in SQL what I'm able to accomplish with Excel SUMIFS. The ultimate goal is to create a share-of-requirement (loyalty) metric for each customer. In the table below, I'm trying to create the Category_Sales column; all the other columns I already have in my SQL.
Here is what my Excel SUMIFS looks like.
=SUMIFS(Brand_Sales (range), Cust_ID (range), Cust_ID (row), First_Buy (range), ">=" & Brand_Str (row), Last_Buy (range), "<=" & Brand_End (row))
The 268 value for Innocent is attained from the sum of Innocent, Cresco, Supply, and PTS since their First_Buy & Last_Buys are all inside the range of the Innocent Brand Start & End.
State  Brand     Cust_ID  First_Buy  Last_Buy   Brand_Str  Brand_End  Brand_Sales  Category_Sales
IL     Innocent  xyz      4/9/2022   4/9/2022   4/7/2022   5/29/2022  64           268
IL     Cresco    xyz      4/15/2022  4/15/2022  1/1/2022   5/30/2022  57           446
IL     Supply    xyz      4/15/2022  4/15/2022  1/1/2022   5/30/2022  45           446
IL     Rythm     xyz      1/3/2022   1/13/2022  1/1/2022   5/30/2022  121          446
IL     Natures   xyz      1/22/2022  1/22/2022  1/1/2022   5/30/2022  57           446
IL     PTS       xyz      4/26/2022  4/26/2022  1/1/2022   5/30/2022  102          446
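One way to express that SUMIFS in SQL is a correlated subquery (equivalently a self-join) that, for each row, sums Brand_Sales over the same customer's rows whose First_Buy and Last_Buy fall inside the current row's brand window. This is only a sketch; brand_sales stands in for whatever your actual table is called:

SELECT b.State,
       b.Brand,
       b.Cust_ID,
       b.Brand_Sales,
       (SELECT SUM(x.Brand_Sales)              -- SUMIFS sum range: Brand_Sales
          FROM brand_sales x
         WHERE x.Cust_ID   = b.Cust_ID         -- Cust_ID (range) = Cust_ID (row)
           AND x.First_Buy >= b.Brand_Str      -- First_Buy (range) >= Brand_Str (row)
           AND x.Last_Buy  <= b.Brand_End      -- Last_Buy (range) <= Brand_End (row)
       ) AS Category_Sales
FROM brand_sales b;

The share-of-requirement metric would then be Brand_Sales divided by Category_Sales on each row. Against the sample data this reproduces 268 for Innocent (64 + 57 + 45 + 102) and 446 for the brands whose window runs 1/1/2022 to 5/30/2022.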
I want to add logic that calculates price per claim. Below, there are two claims: one for patient 5 and another for patient 6. The original idea is to create a unique list of patient numbers in a separate table, then sort the original table by these unique patient numbers, run conditional statements to output a single value (the reimbursement value) per patient, and iterate through the unique table until completed. Does this sound like a feasible workflow? I'm not necessarily looking for specific code, but more for a workflow/process.
For example/context:
PatNo  RevCode  CPT    BilledCharges  DRG
5      141      null   100            880
5      636      J1234  50             null
6      111      null   8000           783
6      636      J1234  300            null
PSYCH lookup table: if the claim has a DRG on this table, then calculate 75% of BilledCharges for the claim.
DRG  Service Category
876  PSYCH
880  PSYCH
881  PSYCH
882  PSYCH
883  PSYCH
884  PSYCH
885  PSYCH
886  PSYCH
887  PSYCH
C-section lookup table: if the claim has a DRG on this table, pay $5,000 for the claim.
DRG  Service
765  C-SECTION
766  C-SECTION
783  C-SECTION
784  C-SECTION
786  C-SECTION
787  C-SECTION
785  C-SECTION
788  C-SECTION
If the claim has RevCode 636, then add 50% of the charges to the claim reimbursement.
OUTPUT:
PatNo  Reimburs.
5      100
6      5150
So...
Patient 5's reimbursement is (75% of 100) + (50% of 50) = 75 + 25 = 100.
Patient 6's reimbursement is 5000 + (50% of 300) = 5000 + 150 = 5150.
Assuming you've told us all the rules...
You can LEFT JOIN the lookup tables to check whether a matching DRG is present or not, then use CASE expressions to apply the logic, and finally aggregate to sum it all up per patient:
SELECT
    yourTable.patno,
    SUM(
        CASE WHEN section.drg IS NOT NULL THEN 5000                           -- C-section DRG: flat $5,000
             WHEN psych.drg IS NOT NULL THEN 0.75 * yourTable.billedcharges   -- PSYCH DRG: 75% of charges
             WHEN yourTable.revcode = 636 THEN 0.5 * yourTable.billedcharges  -- RevCode 636: add 50% of charges
             ELSE 0
        END
    ) AS reimbursement
FROM yourTable
LEFT JOIN section ON section.drg = yourTable.drg
LEFT JOIN psych   ON psych.drg   = yourTable.drg
GROUP BY yourTable.patno
Please forgive typos, I'm on my phone.
This is a sample Data Frame
Date Items_Sold
12/29/2019 10
12/30/2019 20
12/31/2019 30
1/1/2020 40
1/2/2020 50
1/3/2020 60
1/4/2020 35
1/5/2020 56
1/6/2020 34
1/7/2020 564
1/8/2020 6
1/9/2020 45
1/10/2020 56
1/11/2020 45
1/12/2020 37
1/13/2020 36
1/14/2020 479
1/15/2020 47
1/16/2020 47
1/17/2020 578
1/18/2020 478
1/19/2020 3578
1/20/2020 67
1/21/2020 578
1/22/2020 478
1/23/2020 4567
1/24/2020 7889
1/25/2020 8999
1/26/2020 99
1/27/2020 66
1/28/2020 678
1/29/2020 889
1/30/2020 990
1/31/2020 58585
2/1/2020 585
2/2/2020 555
2/3/2020 56
2/4/2020 66
2/5/2020 66
2/6/2020 6634
2/7/2020 588
2/8/2020 2588
2/9/2020 255
I am running this query
%sql
use my_items_table;
select weekofyear(Date), count(items_sold) as Sum
from my_items_table
where year(Date)=2020
group by weekofyear(Date)
order by weekofyear(Date)
I am getting this output. (IMP: I have added random values in Sum)
Week Sum
1 | 300091
2 | 312756
3 | 309363
4 | 307312
5 | 310985
6 | 296889
7 | 315611
But I want the output to also include a column holding the start date of each week, alongside the week number, like this:
Start_Date Week Sum
12/29/2019 1 300091
1/5/2020 2 312756
1/12/2020 3 309363
1/19/2020 4 307312
1/26/2020 5 310985
2/2/2020 6 296889
2/9/2020 7 315611
I am running the query on Azure Databricks.
If you have data for all days, then just use min():
select min(date), weekofyear(Date), count(items_sold) as Sum
from my_items_table
where year(Date) = 2020
group by weekofyear(Date)
order by weekofyear(Date);
Note: year() is the calendar year starting on Jan 1, so you are not going to get dates from other years (such as the 12/29/2019 in your desired output) using this query. If that is an issue, I would suggest asking a new question about how to get the first day of the first week of the year.
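If the Sunday week starts in your desired output (including the week that begins on 12/29/2019) matter, a possible alternative is to group on a computed week-start date instead of weekofyear(). This is only a sketch for Spark SQL on Databricks, assuming Date is a date or timestamp column; dayofweek() returns 1 for Sunday, so date_sub(Date, dayofweek(Date) - 1) is the Sunday that starts each week:

with weekly as (
    select date_sub(Date, dayofweek(Date) - 1) as Start_Date,  -- Sunday of the week containing Date
           count(items_sold) as Items
    from my_items_table
    -- no year(Date) filter here, so the week starting 12/29/2019 stays in one piece
    group by date_sub(Date, dayofweek(Date) - 1)
)
select Start_Date,
       dense_rank() over (order by Start_Date) as Week,  -- consecutive week numbers across the year boundary
       Items as Sum
from weekly
order by Start_Date;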
I have a list of open invoices in Excel, organized per office. How can I use a macro to split these open invoices per office into different Excel sheets, and then save each sheet under the office number? Below is a sample list where the first column is the office number; the entire line associated with it would have to be copied over to the new sheet.
1 180 JOHN 30073 COMPANY X 15,101 173,758 255,713 8/1/2011 8,101 8/24/11 Adair has approved the rates!
1 278 ADAM 159334 COMPANY A 28,606 116,174 158,925 5/9/2011 167,631 7/18/11 Julie Levinsohn still needs to look at reduced entries to see if we can resubmit.
2 600 ROSE 113724 COMPANY 123 0 5,918 20,446 8/22/2011 6,713 8/26/11 Em M Belcher Jul invs per her request.
2 289 SUE 149232 COMPANY BC 389 5,575 12,098 4/22/2011 328 8/23/11 Em w/jun inv to R. Kos
2 169 MIKE 120126 COMPANY 98 41,907 5,218 202,756 8/18/2011 33,635 8/24/11 Jun invs to be pd mid sept per Patrice.
2 849 BOB 63068 COMPANY CB 2,862 4,889 9,271 4/25/2011 4,279 8/23/11 Called Choi re when and how much she will send.
2 849 LANEY 170318 COMPANY 34 0 4,123 6,283 6/30/2011 270 8/17/11 Robert em me re working w/cj re problem w/retainer hrs and bills.
2 707 BOB 153213 COMPANY CI 0 3,127 3,127 5/6/2011 257 4/27/11 Appeals for some of the shortpays by insurance co are pending per
2 141 SUE 65267 COMPANY Z 9,652 2,313 12,546 7/20/2011 8,380 8/16/11 Stmt em to Pat for pmt of os invs.
2 705 MIKE 173993 COMPANY X 5,020 2,240 7,294 7/19/2011 1,120 8/24/11 Pmt is processing.
2 763 JOHN 85919 COMPANY LK 3,500 0 4,500 8/22/2011 1,014 8/19/11 Inv 5637061 in a/p for pmt per cl.
2 400 MIKE 41218 COMPANY 90 3,433 0 3,433 8/24/2011 2,270 5/27/09 Per Hall, ck has been signed and mailed for Feb inv.
3 164 MIKE 133625 COMPANY LO 500 19,351 21,795 7/25/2011 636 8/16/11 Em to B. Kampas re my calling for pmt
3 178 BOB 168512 COMPANY GH 889 15,749 17,030 6/9/2011 2,322 8/24/11 L/m for M Cornejo to call re osbal due.
3 1005 SUE 164680 COMPANY TH 0 13,862 14,459 8/24/2010 5,000 07/06/11 snt ar statement.dtelles
3 164 LANEY 61383 COMPANY RT 0 11,077 65,316 7/29/2011 31,750 8/30/11 Inv 5567542 being revised per
3 171 SUE 78029 COMPANY 345 0 10,507 20,385 8/15/2011 10,165 8/26/11 May invs em to susan
3 164 JOHN 62161 COMPANY 383 14,000 10,500 73,376 8/22/2011 3,500 8/26/11 Invs 5655722 and 5629996 are flat fee bills
3 1139 MIKE 169932 COMPANY 282 145 10,401 10,546 8/24/2011 800 8/26/11 $800 recd.
3 171 CHRIS 134278 COMPANY 202 0 9,603 9,603 8/15/2011 38,300 8/11/11 JP em from Myriam that ck cut today
3 1363 CHRIS 166031 COMPANY CW 0 8,987 8,987 9/17/2010 4,104 8/3/11 em to Brad K about speaking to Charles
3 171 JOHN 139383 COMPANY WE 3,872 8,712 23,575 8/19/2011 5,608 07/06/11 snt ar stmnt.
3 198 MIKE 118294 COMPANY LC 0 3,262 3,262 3/9/2011 1,000 8/15/11 Em Cl for a $500 pmt.
3 1139 BOB 176647 COMPANY XC 0 2,673 11,648 7/26/2011 12,152 8/24/11 Em w/may inv sent to for pmt.
3 1223 BOB 163879 COMPANY NC 4,550 185 4,735 8/15/2011 32,815 8/4/11 Mar, Apr May invs revised and sent to Cl
3 1139 BOB 169094 COMPANY 321 5,000 173 12,728 7/20/2011 4,730 8/30/11 CP em Gayle W9 and inv 5625894
3 178 SUE 5416 COMPANY DW 2,670 0 9,596 8/8/2011 2,496 8/24/11 Sent stmt to V. Wu.
3 762 CHRIS 112507 COMPANY IC 6,000 0 12,293 8/8/2011 4,013 07/06/11 snt ar stmnt.
3 631 JOHH 108718 COMPANY IF 15,842 0 43,215 8/26/2011 5,515 07/06/11 snt ar stmnt
4 133 LANEY 157042 COMPANY IR 0 92,879 114,157 5/6/2011 116,483 08/18/2011,jw, emailed Carolyn
4 502 LANEY 66422 COMPANY IG 30,291 58,792 160,301 8/23/2011 24,512 9/17/10 JP sent f/u em to Robin re past due inv
4 155 CHRIS 72564 COMPANY 853 3,283 55,918 62,367 8/4/2011 500 10/19/05 lg recd call from karen,invs will be pd as of next week
4 500 BOB 128230 COMPANY URE 850 49,217 51,006 12/3/2010 2,353 06/29/2011,jw, asked Dk for collection ypdate
4 751 MIKE 174393 COMPANY KRIG 0 42,753 60,057 7/5/2011 3,658
4 1384 JOHN 143392 COMPANY IR 0 42,468 42,468 10/27/2010 -2,500 7/20/11 Account turned over to collections.
4 1135 MIKE 169399 COMPANY IGD 1,517 38,857 44,108 7/3/2011 1,539 07/07/2011,jw, emailed Jake for collecti on update
4 1135 CHRIS 151511 COMPANY IGDS 608 37,458 42,010 7/29/2011 5,691 07/07/2011,jw, emailed Jake for collection update
5 101 BOB 140464 COMPANY IDGS 0 9,226 16,185 7/20/2011 1,120 10/13/10 JP call from Sun req copy of os inv be em
5 281 JOHN 155780 COMPANY IERE 0 9,214 14,557 8/22/2011 13,097 5/3/11Inv 5440713 to clnt mi
5 288 JOHN 86325 COMPANY 832 1,140 9,178 11,458 8/22/2011 2,824 12/10/09 am received em advising they r moving
Try this; it should work, assuming the office number is in column A, the rows are sorted by office, and there is no header row:
Sub SplitInvoicesByOffice()
    Dim Source As Range
    Dim OfficeNumber As String
    Dim PrevOfficeNumber As String
    Dim CurrentSheet As Worksheet
    Dim NewSheet As Worksheet
    Dim Row As Long
    Dim NewRow As Long

    Set Source = Cells(1, 1).CurrentRegion   ' the contiguous block of invoice data starting at A1
    Set CurrentSheet = ActiveSheet

    For Row = 1 To Source.Rows.Count
        OfficeNumber = CurrentSheet.Cells(Row, 1)
        If OfficeNumber <> PrevOfficeNumber Then
            ' office number changed: create a new sheet named after the office
            NewRow = 1
            Set NewSheet = Application.Sheets.Add
            NewSheet.Name = OfficeNumber
            PrevOfficeNumber = OfficeNumber
        End If
        ' copy the entire invoice row to the office's sheet
        CurrentSheet.Cells(Row, 1).EntireRow.Copy Destination:=NewSheet.Cells(NewRow, 1)
        NewRow = NewRow + 1
    Next Row
End Sub