How to get weekly data starting from the first date of each month, and SUM accordingly, in BQ? - sql

I have an issue pulling this kind of data. I need to pull weekly data with these specifications:
The data pull will be scheduled, hence it will span multiple months
The first week of each month starts from the first date (the 1st of every month) -- Green in the pic
The last week of a month does not include dates from the next month -- Red in the pic
The raw data and the desired output will more or less look like this:
Is there any workaround to do this in BigQuery? Thanks (the data is attached below)
+-------------+-------+
| date | sales |
+-------------+-------+
| 1 Oct 2021 | 5 |
| 2 Oct 2021 | 13 |
| 3 Oct 2021 | 75 |
| 4 Oct 2021 | 3 |
| 5 Oct 2021 | 70 |
| 6 Oct 2021 | 85 |
| 7 Oct 2021 | 99 |
| 8 Oct 2021 | 90 |
| 9 Oct 2021 | 68 |
| 10 Oct 2021 | 97 |
| 11 Oct 2021 | 87 |
| 12 Oct 2021 | 56 |
| 13 Oct 2021 | 99 |
| 14 Oct 2021 | 38 |
| 15 Oct 2021 | 6 |
| 16 Oct 2021 | 43 |
| 17 Oct 2021 | 45 |
| 18 Oct 2021 | 90 |
| 19 Oct 2021 | 64 |
| 20 Oct 2021 | 26 |
| 21 Oct 2021 | 24 |
| 22 Oct 2021 | 4 |
| 23 Oct 2021 | 36 |
| 24 Oct 2021 | 68 |
| 25 Oct 2021 | 4 |
| 26 Oct 2021 | 16 |
| 27 Oct 2021 | 30 |
| 28 Oct 2021 | 89 |
| 29 Oct 2021 | 46 |
| 30 Oct 2021 | 28 |
| 31 Oct 2021 | 28 |
| 1 Nov 2021 | 47 |
| 2 Nov 2021 | 75 |
| 3 Nov 2021 | 1 |
| 4 Nov 2021 | 26 |
| 5 Nov 2021 | 26 |
| 6 Nov 2021 | 38 |
| 7 Nov 2021 | 79 |
| 8 Nov 2021 | 37 |
| 9 Nov 2021 | 83 |
| 10 Nov 2021 | 97 |
| 11 Nov 2021 | 56 |
| 12 Nov 2021 | 83 |
| 13 Nov 2021 | 14 |
| 14 Nov 2021 | 25 |
| 15 Nov 2021 | 55 |
| 16 Nov 2021 | 16 |
| 17 Nov 2021 | 80 |
| 18 Nov 2021 | 66 |
| 19 Nov 2021 | 25 |
| 20 Nov 2021 | 62 |
| 21 Nov 2021 | 36 |
| 22 Nov 2021 | 33 |
| 23 Nov 2021 | 19 |
| 24 Nov 2021 | 47 |
| 25 Nov 2021 | 14 |
| 26 Nov 2021 | 22 |
| 27 Nov 2021 | 66 |
| 28 Nov 2021 | 15 |
| 29 Nov 2021 | 96 |
| 30 Nov 2021 | 4 |
+-------------+-------+

Consider the approach below:
with temp as (
  select parse_date('%d %B %Y', date) date, sales
  from your_table
)
select format_date('%d %B %Y', weeks[ordinal(num)]) start_week, sum(sales) total_sales
from (
  select sales, weeks, range_bucket(date, weeks) num
  from temp, unnest([struct(generate_date_array(date_trunc(date, month), last_day(date, month), interval 7 day) as weeks)])
)
group by start_week
If applied to the sample data (as is) in your question, the output has one row per week starting on the 1st, 8th, 15th, 22nd and 29th of each month.
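The key trick in the query above is bucketing each date by 7-day offsets from the 1st of its own month, so no bucket ever crosses a month boundary. The same rule can be sketched outside BigQuery; this is a minimal illustration in plain Python (the `week_start` and `weekly_totals` helpers are hypothetical, not part of the question):

```python
from datetime import date, timedelta

def week_start(d: date) -> date:
    # Anchor weeks to the 1st of the month: days 1-7 map to the week starting
    # on the 1st, days 8-14 to the week starting on the 8th, and so on.
    # The last bucket (day 29+) never spills into the next month.
    return date(d.year, d.month, 1) + timedelta(days=((d.day - 1) // 7) * 7)

def weekly_totals(rows):
    # rows: iterable of (date, sales) pairs -> {week_start_date: total_sales}
    totals = {}
    for d, sales in rows:
        ws = week_start(d)
        totals[ws] = totals.get(ws, 0) + sales
    return totals
```

For example, `week_start(date(2021, 10, 31))` falls in the short final week starting on 29 Oct, mirroring what `range_bucket` does with the `generate_date_array` boundaries.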

Related

create a calculated column based on two columns in SQL

I have the table below and need to create a calculated column (RA) based on the category and month columns.
Oa Sa Ai month MDY
5 10 2 Jan J302022
16 32 38 Jan J302022
15 14 4 Feb J302022
46 32 81 Jan J302022
3 90 0 Mar J302022
51 10 21 Jan J302021
19 32 3 Jan J302021
45 16 41 Feb J302021
46 7 81 Jan J302022
30 67 14 Mar J302021
45 16 41 Apr J302021
46 7 81 Apr J302021
30 67 0 Jan J302021
56 17 0 Mar J302022
First, it should consider a category, for example J302022; then it needs to calculate the "RA" column based on the month within that category. For example, for J302022, Jan: ((5+16+46+46)+(10+32+32+7)) / (2+38+81+81) = 0.96.
Below is what the expected output looks like.
Oa Sa Ai month category RA
5 10 2 Jan J302022 0.96
16 32 38 Jan J302022 0.96
15 14 4 Feb J302022 7.25
46 32 81 Jan J302022 0.96
3 90 0 Mar J302022 0
51 10 21 Jan J302021 8.70
19 32 3 Jan J302021 8.70
45 16 41 Feb J302021 1.48
46 7 81 Jan J302022 0.96
30 67 14 Mar J302021 6.92
45 16 41 Apr J302021 1.48
46 7 81 Apr J302022 0.65
30 67 0 Jan J302021 8.70
56 17 0 Mar J302022 0
Is it possible to do it in SQL?
Thanks in advance!
select Oa, Sa, Ai, month, category,
       coalesce((ra1 + ra2) / ra3, 0) as RA
from (
  select Oa, Sa, Ai, month, mdy as category,
         sum(oa) over (partition by month, mdy) as ra1,
         sum(sa) over (partition by month, mdy) as ra2,
         sum(ai) over (partition by month, mdy) as ra3
  from WhateverYourTableNameIs
) as t;
Output on MySQL 8.0.29:
+------+------+------+-------+----------+--------+
| Oa | Sa | Ai | month | category | RA |
+------+------+------+-------+----------+--------+
| 45 | 16 | 41 | Apr | J302021 | 0.9344 |
| 46 | 7 | 81 | Apr | J302021 | 0.9344 |
| 45 | 16 | 41 | Feb | J302021 | 1.4878 |
| 15 | 14 | 4 | Feb | J302022 | 7.2500 |
| 51 | 10 | 21 | Jan | J302021 | 8.7083 |
| 19 | 32 | 3 | Jan | J302021 | 8.7083 |
| 30 | 67 | 0 | Jan | J302021 | 8.7083 |
| 5 | 10 | 2 | Jan | J302022 | 0.9604 |
| 16 | 32 | 38 | Jan | J302022 | 0.9604 |
| 46 | 32 | 81 | Jan | J302022 | 0.9604 |
| 46 | 7 | 81 | Jan | J302022 | 0.9604 |
| 30 | 67 | 14 | Mar | J302021 | 6.9286 |
| 3 | 90 | 0 | Mar | J302022 | 0.0000 |
| 56 | 17 | 0 | Mar | J302022 | 0.0000 |
+------+------+------+-------+----------+--------+
For SQL Server:
select Oa, Sa, Ai, [Month], Category,
       case when (sum(Ai) over (partition by [Month], Category) * 1.0) = 0 then 0
            else (sum(Oa) over (partition by [Month], Category) +
                  sum(Sa) over (partition by [Month], Category)) /
                 (sum(Ai) over (partition by [Month], Category) * 1.0)
       end Ra
from #temp
order by Category desc
The denominator is multiplied by 1.0 to force floating-point division.
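The windowed-sum logic in both answers can be sanity-checked quickly; here is a sketch using SQLite via Python with only the Jan/J302022 rows from the question (table and column names follow the question, the rest is a test harness):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
create table t (Oa int, Sa int, Ai int, month text, mdy text);
insert into t values
  (5, 10, 2, 'Jan', 'J302022'),
  (16, 32, 38, 'Jan', 'J302022'),
  (46, 32, 81, 'Jan', 'J302022'),
  (46, 7, 81, 'Jan', 'J302022');
""")
rows = con.execute("""
  select Oa, Sa, Ai, month, mdy as category,
         round((sum(Oa) over (partition by month, mdy)
                + sum(Sa) over (partition by month, mdy)) * 1.0
               / sum(Ai) over (partition by month, mdy), 4) as RA
  from t
""").fetchall()
# Every Jan/J302022 row gets RA = (113 + 81) / 202 = 0.9604,
# matching the 0.9604 shown in the MySQL output above.
```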

How do I edit the code that calculates the value for the four weeks of the month for all months with pl/sql?

I divided the month into four weeks and printed the amount for each week. How do I set this up with a loop for 12 months?
declare
  cursor c is
    select varis_tar, tutar
    from muhasebe.doviz_takip
    where trunc(varis_tar) between TO_DATE('01/10/2021', 'DD/MM/YYYY')
                               and TO_DATE('31/10/2021', 'DD/MM/YYYY')
    group by varis_tar, tutar;
  tutar1 number(13,2) := 0;
  tutar2 number(13,2) := 0;
  tutar3 number(13,2) := 0;
  tutar4 number(13,2) := 0;
begin
  for r in c loop
    if r.varis_tar between TO_DATE('01/10/2021', 'DD/MM/YYYY')
                       and TO_DATE('07/10/2021', 'DD/MM/YYYY') then
      tutar1 := r.tutar + tutar1;
      --message(r.tutar);
    elsif r.varis_tar between TO_DATE('07/10/2021', 'DD/MM/YYYY')
                          and TO_DATE('14/10/2021', 'DD/MM/YYYY') then
      tutar2 := r.tutar + tutar2;
      --message(r.tutar);
    elsif r.varis_tar between TO_DATE('14/10/2021', 'DD/MM/YYYY')
                          and TO_DATE('21/10/2021', 'DD/MM/YYYY') then
      tutar3 := r.tutar + tutar3;
      --message(r.tutar);
    elsif r.varis_tar between TO_DATE('21/10/2021', 'DD/MM/YYYY')
                          and TO_DATE('31/10/2021', 'DD/MM/YYYY') then
      tutar4 := r.tutar + tutar4;
      --message(r.tutar);
    end if;
  end loop;
end;
I tried to get the dates the same way for all months, but it gave wrong results:
where trunc(varis_tar) BETWEEN TO_DATE('1', 'DD')
                           AND TO_DATE('31', 'DD')
if r.varis_tar between TO_DATE('1', 'DD')
                   and TO_DATE('07', 'DD') then
elsif r.varis_tar between TO_DATE('7', 'DD')
                      and TO_DATE('14', 'DD') then
elsif r.varis_tar between TO_DATE('14', 'DD')
                      and TO_DATE('21', 'DD') then
elsif r.varis_tar between TO_DATE('21', 'DD')
                      and TO_DATE('31', 'DD') then
I don't know if I'm understanding it correctly, but try:
if extract(day from varis_tar) between 1 and 7
or, more complex (note to_char returns a string, so compare against '1', '2', ...):
l_week := to_char(varis_tar, 'W'); -- week number within the month
if l_week = '1' then -- first week
elsif l_week = '2' then -- etc...
Your code has several issues:
date in Oracle is actually a datetime, so between will not count any time after midnight of the upper boundary.
you count midnight of a week's end twice: once in the current week and once in the next week (between includes both boundaries).
you do not need any PL/SQL, and especially not a cursor loop, because it occupies resources during the calculation outside of the SQL context.
Use the datetime format model to calculate weeks, because it is easy to read and understand, then group by the corresponding components.
with a as (
  select
    date '2021-01-01' - 1 + level as dt
    , level as val
  from dual
  connect by level < 400
)
, b as (
  select
    dt
    , val
    /*Map 29, 30 and 31 to 28*/
    , to_char(
        least(dt, trunc(dt, 'mm') + 27)
        , 'yyyymmw'
      ) as w
  from a
)
select
  substr(w, 1, 4) as y
  , substr(w, 5, 2) as m
  , substr(w, -1) as w
  , sum(val) as val
  , min(dt) as dt_from
  , max(dt) as dt_to
from b
group by
  w
Y | M | W | VAL | DT_FROM | DT_TO
:--- | :- | :- | ---: | :--------- | :---------
2021 | 01 | 1 | 28 | 2021-01-01 | 2021-01-07
2021 | 01 | 2 | 77 | 2021-01-08 | 2021-01-14
2021 | 01 | 3 | 126 | 2021-01-15 | 2021-01-21
2021 | 01 | 4 | 265 | 2021-01-22 | 2021-01-31
2021 | 02 | 1 | 245 | 2021-02-01 | 2021-02-07
2021 | 02 | 2 | 294 | 2021-02-08 | 2021-02-14
2021 | 02 | 3 | 343 | 2021-02-15 | 2021-02-21
2021 | 02 | 4 | 392 | 2021-02-22 | 2021-02-28
2021 | 03 | 1 | 441 | 2021-03-01 | 2021-03-07
2021 | 03 | 2 | 490 | 2021-03-08 | 2021-03-14
2021 | 03 | 3 | 539 | 2021-03-15 | 2021-03-21
2021 | 03 | 4 | 855 | 2021-03-22 | 2021-03-31
2021 | 04 | 1 | 658 | 2021-04-01 | 2021-04-07
2021 | 04 | 2 | 707 | 2021-04-08 | 2021-04-14
2021 | 04 | 3 | 756 | 2021-04-15 | 2021-04-21
2021 | 04 | 4 | 1044 | 2021-04-22 | 2021-04-30
2021 | 05 | 1 | 868 | 2021-05-01 | 2021-05-07
2021 | 05 | 2 | 917 | 2021-05-08 | 2021-05-14
2021 | 05 | 3 | 966 | 2021-05-15 | 2021-05-21
2021 | 05 | 4 | 1465 | 2021-05-22 | 2021-05-31
2021 | 06 | 1 | 1085 | 2021-06-01 | 2021-06-07
2021 | 06 | 2 | 1134 | 2021-06-08 | 2021-06-14
2021 | 06 | 3 | 1183 | 2021-06-15 | 2021-06-21
2021 | 06 | 4 | 1593 | 2021-06-22 | 2021-06-30
2021 | 07 | 1 | 1295 | 2021-07-01 | 2021-07-07
2021 | 07 | 2 | 1344 | 2021-07-08 | 2021-07-14
2021 | 07 | 3 | 1393 | 2021-07-15 | 2021-07-21
2021 | 07 | 4 | 2075 | 2021-07-22 | 2021-07-31
2021 | 08 | 1 | 1512 | 2021-08-01 | 2021-08-07
2021 | 08 | 2 | 1561 | 2021-08-08 | 2021-08-14
2021 | 08 | 3 | 1610 | 2021-08-15 | 2021-08-21
2021 | 08 | 4 | 2385 | 2021-08-22 | 2021-08-31
2021 | 09 | 1 | 1729 | 2021-09-01 | 2021-09-07
2021 | 09 | 2 | 1778 | 2021-09-08 | 2021-09-14
2021 | 09 | 3 | 1827 | 2021-09-15 | 2021-09-21
2021 | 09 | 4 | 2421 | 2021-09-22 | 2021-09-30
2021 | 10 | 1 | 1939 | 2021-10-01 | 2021-10-07
2021 | 10 | 2 | 1988 | 2021-10-08 | 2021-10-14
2021 | 10 | 3 | 2037 | 2021-10-15 | 2021-10-21
2021 | 10 | 4 | 2995 | 2021-10-22 | 2021-10-31
2021 | 11 | 1 | 2156 | 2021-11-01 | 2021-11-07
2021 | 11 | 2 | 2205 | 2021-11-08 | 2021-11-14
2021 | 11 | 3 | 2254 | 2021-11-15 | 2021-11-21
2021 | 11 | 4 | 2970 | 2021-11-22 | 2021-11-30
2021 | 12 | 1 | 2366 | 2021-12-01 | 2021-12-07
2021 | 12 | 2 | 2415 | 2021-12-08 | 2021-12-14
2021 | 12 | 3 | 2464 | 2021-12-15 | 2021-12-21
2021 | 12 | 4 | 3605 | 2021-12-22 | 2021-12-31
2022 | 01 | 1 | 2583 | 2022-01-01 | 2022-01-07
2022 | 01 | 2 | 2632 | 2022-01-08 | 2022-01-14
2022 | 01 | 3 | 2681 | 2022-01-15 | 2022-01-21
2022 | 01 | 4 | 3915 | 2022-01-22 | 2022-01-31
2022 | 02 | 1 | 1194 | 2022-02-01 | 2022-02-03
db<>fiddle here
Or the same in columns:
with a as (
  select
    date '2021-01-01' - 1 + level as dt
    , level as val
  from dual
  connect by level < 400
)
, b as (
  select
    val
    /*Map 29, 30 and 31 to 28*/
    , to_char(dt, 'yyyymm') as m
    , to_char(
        least(dt, trunc(dt, 'mm') + 27)
        , 'w'
      ) as w
  from a
)
select
  substr(m, 1, 4) as y
  , substr(m, 5, 2) as m
  , tutar1
  , tutar2
  , tutar3
  , tutar4
from b
pivot(
  sum(val)
  for w in (
    1 as tutar1, 2 as tutar2
    , 3 as tutar3, 4 as tutar4
  )
)
Y | M | TUTAR1 | TUTAR2 | TUTAR3 | TUTAR4
:--- | :- | -----: | -----: | -----: | -----:
2021 | 01 | 28 | 77 | 126 | 265
2021 | 02 | 245 | 294 | 343 | 392
2021 | 03 | 441 | 490 | 539 | 855
2021 | 04 | 658 | 707 | 756 | 1044
2021 | 05 | 868 | 917 | 966 | 1465
2021 | 06 | 1085 | 1134 | 1183 | 1593
2021 | 07 | 1295 | 1344 | 1393 | 2075
2021 | 08 | 1512 | 1561 | 1610 | 2385
2021 | 09 | 1729 | 1778 | 1827 | 2421
2021 | 10 | 1939 | 1988 | 2037 | 2995
2021 | 11 | 2156 | 2205 | 2254 | 2970
2021 | 12 | 2366 | 2415 | 2464 | 3605
2022 | 01 | 2583 | 2632 | 2681 | 3915
2022 | 02 | 1194 | null | null | null
db<>fiddle here
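The least(dt, trunc(dt, 'mm') + 27) expression is what folds days 29, 30 and 31 into week 4. The bucketing rule behind Oracle's 'W' format element, with that cap applied, can be sketched outside SQL (the oracle_week helper below is hypothetical, for illustration only):

```python
def oracle_week(day: int) -> int:
    # Oracle's 'W' format element: week of month, where days 1-7 -> 1,
    # days 8-14 -> 2, days 15-21 -> 3, days 22-28 -> 4.
    # Capping the day at 28 (the least(...) trick) forces days 29-31
    # into week 4 instead of producing a fifth week.
    return (min(day, 28) - 1) // 7 + 1
```

This is why every month in the output above has exactly four "tutar" buckets, with the last one absorbing any trailing days.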

How to create churn table from transactional data?

Currently my Transaction Table has each customer's transaction data for each month. Account_ID identifies the customer. Order_ID is the order number of each order the customer has made. Reporting_week_start_date is the Monday-starting week in which each transaction (Date_Purchased) falls.
How do I create a new table that identifies the Customer_Status after each transaction has been made? Note that the new table runs from Reporting_week_start_date until the current date even when no transactions were made.
Customer_Status
- New : customers who made their first paid subscription
- Recurring : customers with continuous payment
- Churned : when customers' subscriptions had expired and there's no renewal within the next month/same month
- Reactivated : customers who had churned and then returned to re-subscribe
Transaction Table
Account_ID | Order_ID | Reporting_week_start_date| Date_Purchased | Data_Expired
001 | 1001 | 31 Dec 2018 | 01 Jan 2019 | 08 Jan 2019
001 | 1001 | 07 Jan 2019 | 08 Jan 2019 | 15 Jan 2019
001 | 1001 | 14 Jan 2019 | 15 Jan 2019 | 22 Jan 2019 #Transaction 1
001 | 1001 | 21 Jan 2019 | 22 Jan 2019 | 29 Jan 2019
001 | 1001 | 28 Jan 2019 | 29 Jan 2019 | 31 Jan 2019
001 | 1002 | 28 Jan 2019 | 01 Feb 2019 | 08 Feb 2019
001 | 1002 | 04 Feb 2019 | 08 Feb 2019 | 15 Feb 2019 #Transaction 2
001 | 1002 | 11 Feb 2019 | 15 Feb 2019 | 22 Feb 2019
001 | 1002 | 18 Feb 2019 | 22 Feb 2019 | 28 Feb 2019
001 | 1003 | 25 Feb 2019 | 01 Mar 2019 | 08 Mar 2019
001 | 1003 | 04 Mar 2019 | 08 Mar 2019 | 15 Mar 2019
001 | 1003 | 11 Mar 2019 | 15 Mar 2019 | 22 Mar 2019 #Transaction 3
001 | 1003 | 18 Mar 2019 | 22 Mar 2019 | 29 Mar 2019
001 | 1003 | 25 Mar 2019 | 29 Mar 2019 | 31 Mar 2019
001 | 1004 | 27 May 2019 | 01 Jun 2019 | 08 Jun 2019
001 | 1004 | 03 Jun 2019 | 08 Jun 2019 | 15 Jun 2019 #Transaction 4
001 | 1004 | 10 Jun 2019 | 15 Jun 2019 | 22 Jun 2019
001 | 1004 | 17 Jun 2019 | 22 Jun 2019 | 29 Jun 2019
001 | 1004 | 24 Jun 2019 | 29 Jun 2019 | 30 Jun 2019
Expected Output
Account_ID | Order_ID | Reporting_week_start_date| Customer_status
001 | 1001 | 31 Dec 2018 | New
001 | 1001 | 07 Jan 2019 | New #Transaction 1
001 | 1001 | 14 Jan 2019 | New
001 | 1001 | 21 Jan 2019 | New
001 | 1001 | 28 Jan 2019 | New
001 | 1002 | 28 Jan 2019 | Recurring
001 | 1002 | 04 Feb 2019 | Recurring #Transaction 2
001 | 1002 | 11 Feb 2019 | Recurring
001 | 1002 | 18 Feb 2019 | Recurring
001 | 1003 | 25 Feb 2019 | Churned
001 | 1003 | 04 Mar 2019 | Churned #Transaction 3
001 | 1003 | 11 Mar 2019 | Churned
001 | 1003 | 18 Mar 2019 | Churned
001 | 1003 | 25 Mar 2019 | Churned
001 | - | 1 Apr 2019 | Churned
001 | - | 08 Apr 2019 | Churned
001 | - | 15 Apr 2019 | Churned
001 | - | 22 Apr 2019 | Churned
001 | - | 29 Apr 2019 | Churned
001 | - | 06 May 2019 | Churned
001 | - | 13 May 2019 | Churned
001 | - | 20 May 2019 | Churned
001 | - | 27 May 2019 | Churned
001 | 1004 | 27 May 2019 | Reactivated
001 | 1004 | 03 Jun 2019 | Reactivated #Transaction 4
001 | 1004 | 10 Jun 2019 | Reactivated
001 | 1004 | 17 Jun 2019 | Reactivated
001 | 1004 | 24 Jun 2019 | Reactivated
...
...
...
current date
I think you just want window functions and case logic. Assuming the date you are referring to is Reporting_week_start_date, then the logic looks something like this:
select t.*,
       (case when Reporting_week_start_date = min(Reporting_week_start_date) over (partition by account_id)
             then 'New'
             when Reporting_week_start_date < dateadd(lag(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date), interval 1 month)
             then 'Recurring'
             when Reporting_week_start_date < dateadd(lead(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date), interval -1 month)
             then 'Churned'
             else 'Reactivated'
        end) as status
from transactions t;
These are not exactly the rules you have specified. But they seem very reasonable interpretations of what you want to do.
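The gap-based idea can be checked against the sample account in plain Python. This is a sketch under stated assumptions: the classify helper is hypothetical, one purchase date stands in for each order, a flat 31-day cutoff replaces true calendar-month arithmetic, and the Churned test is taken before Recurring so that order 1003 comes out Churned as in the expected output:

```python
from datetime import date

def classify(purchases):
    # purchases: sorted purchase dates for one account, one per order.
    statuses = []
    for i, d in enumerate(purchases):
        prev = purchases[i - 1] if i > 0 else None
        nxt = purchases[i + 1] if i + 1 < len(purchases) else None
        if prev is None:
            statuses.append("New")          # first paid subscription
        elif nxt is not None and (nxt - d).days > 31:
            statuses.append("Churned")      # no renewal within ~a month
        elif (d - prev).days <= 31:
            statuses.append("Recurring")    # continuous payment
        else:
            statuses.append("Reactivated")  # returned after churning
    return statuses
```

Fed the four purchase dates from the question (1 Jan, 1 Feb, 1 Mar, 1 Jun 2019), this yields New, Recurring, Churned, Reactivated, matching the expected statuses of orders 1001-1004.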

Average daily peak hours in a week with SQL select

I'm trying to list the weekly average of customers in different restaurants in their daily peak hours, for example:
Week | Day | Hour | Rest | Custom
20 | Mon | 08-12 | KFC | 15
20 | Mon | 12-16 | KFC | 10
20 | Mon | 16-20 | KFC | 8
20 | Tue | 08-12 | KFC | 20
20 | Tue | 12-16 | KFC | 11
20 | Tue | 16-20 | KFC | 9
20 | Mon | 08-12 | MCD | 13
20 | Mon | 12-16 | MCD | 14
20 | Mon | 16-20 | MCD | 19
20 | Tue | 08-12 | MCD | 31
20 | Tue | 12-16 | MCD | 20
20 | Tue | 16-20 | MCD | 22
20 | Mon | 08-12 | PHT | 15
20 | Mon | 12-16 | PHT | 12
20 | Mon | 16-20 | PHT | 11
20 | Tue | 08-12 | PHT | 08
20 | Tue | 12-16 | PHT | 07
20 | Tue | 16-20 | PHT | 14
The desired result should be:
WeeK | Rest | Custom
20 | KFC | 17.5
20 | MCD | 25
20 | PHT | 14.5
Is it possible to do it in one line of SQL?
This is really two steps: get the maximum customers per day per restaurant, then average those daily maxima per week:
select week, rest, avg(maxc)
from (
  select Week, Day, Rest, max(Custom) as maxc
  from t
  group by Week, Day, Rest
) wdr
group by week, rest;
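The average-of-daily-maxima approach can be verified against the question's full sample; here is a quick check with SQLite via Python (note the outer aggregate must be avg, not sum, to reproduce the desired 17.5 / 25 / 14.5):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
create table t (week int, day text, hour text, rest text, custom int);
insert into t values
  (20,'Mon','08-12','KFC',15),(20,'Mon','12-16','KFC',10),(20,'Mon','16-20','KFC',8),
  (20,'Tue','08-12','KFC',20),(20,'Tue','12-16','KFC',11),(20,'Tue','16-20','KFC',9),
  (20,'Mon','08-12','MCD',13),(20,'Mon','12-16','MCD',14),(20,'Mon','16-20','MCD',19),
  (20,'Tue','08-12','MCD',31),(20,'Tue','12-16','MCD',20),(20,'Tue','16-20','MCD',22),
  (20,'Mon','08-12','PHT',15),(20,'Mon','12-16','PHT',12),(20,'Mon','16-20','PHT',11),
  (20,'Tue','08-12','PHT',8),(20,'Tue','12-16','PHT',7),(20,'Tue','16-20','PHT',14);
""")
rows = con.execute("""
  select week, rest, avg(maxc)
  from (select week, day, rest, max(custom) as maxc
        from t
        group by week, day, rest) wdr
  group by week, rest
  order by rest
""").fetchall()
# rows -> [(20, 'KFC', 17.5), (20, 'MCD', 25.0), (20, 'PHT', 14.5)]
```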

How can I obtain sum of each row in Postgresql?

I have got some rows of results like the ones below. If the sum of every row is the same, I can assume the results are correct. How can I write a SQL clause to calculate the sum of each row? Thanks.
27 | 29 | 27 | 36 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 27 | 29 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 27 | 14 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 16 | 37 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 16 | 36 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
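No answer is attached for this question in this excerpt. A minimal sketch of one approach: since SQL has no built-in "sum across columns", add the columns explicitly in the SELECT list. The table and column names (r, c1..c4; four columns shown for brevity, extend to all twelve) are hypothetical, and the check below uses SQLite via Python, but the same SELECT works verbatim in PostgreSQL:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table r (c1 int, c2 int, c3 int, c4 int)")
con.executemany("insert into r values (?, ?, ?, ?)",
                [(27, 29, 27, 36),   # first sample row, truncated to 4 columns
                 (27, 29, 27, 29)])  # second sample row
# Sum across columns by adding them explicitly per row.
rows = con.execute("select c1 + c2 + c3 + c4 as row_total from r").fetchall()
# rows -> [(119,), (112,)]
```

Comparing row_total across rows then shows directly whether every row sums to the same value.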