There are four tables: Customer, Invoice, Supplier and Supplier_Remit. Their details are shown below.
Customer:

Customer_id  Customer_Account_number  Customer_Status  Supplier_id  Supplier_Remit_id
-----------  -----------------------  ---------------  -----------  -----------------
1            1501                     Active           11           111
2            1502                     Inactive         12           112
3            1503                     Active           13           113
4            1504                     Active           14           114
5            1505                     Inactive         15           115
Invoice:

Invoice_Date  Invoice_Amount  Invoice_Number  Payment Method  Customer_id
------------  --------------  --------------  --------------  -----------
01/01/2023    100             1000001         Cash            1
12/01/2022    150             1000002         Credit Card     1
11/09/2022    200             1000003         Credit Card     1
12/09/2022    300             1000004         Cash            2
04/15/2022    1000            1000005         Cash            2
04/15/2022    1000            1000006         Credit Card     3
10/31/2022    250             1000007         Cash            4
10/25/2022    250             1000008         Cash            4
09/20/2022    130             1000009         Credit Card     5
05/20/2022    120             10000010        Credit Card     5
Supplier:

Supplier_Name  Supplier_id
-------------  -----------
ABC            11
ACCC           12
ADEF           13
AJKL           14
AFLR           15
Supplier_Remit:

City     Country  Supplier_Remit_id  Supplier_id
-------  -------  -----------------  -----------
Boston   US       111                11
Oak      US       112                12
Albany   US       113                13
Madison  US       114                14
Los Ang  US       115                15
I need help finding the most recent payment method, the most recent invoice amount, the count of invoices missing for the current year (2023), and the count of invoices missing for the previous year (2022).
I have written a query to get the first few columns, but I am unable to extend it to produce the details mentioned above:
select c.customer_id,
       c.customer_account_number,
       c.customer_status,
       sr.country,
       max(i.invoice_date) as latest_received_invoice_date
from   customer c,
       invoice i,
       supplier s,
       supplier_remit sr
where  c.customer_status = 'Active'
and    sr.supplier_id = s.supplier_id
and    c.supplier_remit_id = sr.supplier_remit_id
and    c.customer_id = i.customer_id
group by
       c.customer_id, c.customer_account_number, c.customer_status, sr.country;
My expected output would be as below:

Customer_id  Cust_Acct_Num  Cust_Status  Country  Last_Inv_Rec_Date  Latest_Paym_Method  Latest_Inv_Amt  Count of Missing Inv for Curr Yr  Count of Missing Inv for Prev Yr
-----------  -------------  -----------  -------  -----------------  ------------------  --------------  --------------------------------  --------------------------------
1            1501           Active       US       01/01/2023         Cash                100             0                                 10
3            1503           Active       US       04/15/2022         Credit Card         1000            1                                 11
4            1504           Active       US       10/31/2022         Cash                250             1                                 11
You can use MAX(...) KEEP (DENSE_RANK LAST ORDER BY invoice_date) to get the values from the latest invoice, and conditional aggregation to count the number of months that contain invoices; subtracting that from the total number of months gives the number of missing invoices:
SELECT c.Customer_id,
       c.Customer_Account_number,
       c.Customer_Status,
       r.country,
       i.last_invoice_date,
       i.latest_payment_method,
       i.latest_invoice_amount,
       -- months elapsed so far this year minus months that contain an invoice
       EXTRACT(MONTH FROM SYSDATE) - COALESCE(i.invoiced_months_this_year, 0)
         AS missing_invoices_this_year,
       -- 12 months in the previous year minus months that contain an invoice
       12 - COALESCE(i.invoiced_months_last_year, 0)
         AS missing_invoices_last_year
FROM   customer c
       INNER JOIN supplier_remit r
       ON (c.supplier_id = r.supplier_id)
       LEFT OUTER JOIN (
         SELECT customer_id,
                MAX(invoice_date) AS last_invoice_date,
                MAX(payment_method) KEEP (DENSE_RANK LAST ORDER BY invoice_date)
                  AS latest_payment_method,
                MAX(invoice_amount) KEEP (DENSE_RANK LAST ORDER BY invoice_date)
                  AS latest_invoice_amount,
                -- distinct months of the current year that contain at least one invoice
                COUNT(
                  DISTINCT
                  CASE
                  WHEN invoice_date < SYSDATE
                  AND  invoice_date >= TRUNC(SYSDATE, 'YY')
                  THEN TRUNC(invoice_date, 'MM')
                  END
                ) AS invoiced_months_this_year,
                -- distinct months of the previous year that contain at least one invoice
                COUNT(
                  DISTINCT
                  CASE
                  WHEN invoice_date < TRUNC(SYSDATE, 'YY')
                  AND  invoice_date >= ADD_MONTHS(TRUNC(SYSDATE, 'YY'), -12)
                  THEN TRUNC(invoice_date, 'MM')
                  END
                ) AS invoiced_months_last_year
         FROM   invoice
         GROUP BY customer_id
       ) i
       ON (c.customer_id = i.customer_id)
WHERE  c.customer_status = 'Active';
Which, for the sample data:
CREATE TABLE customer (Customer_id, Customer_Account_number, Customer_Status, Supplier_id, Supplier_Remit_id) AS
SELECT 1, 1501, 'Active', 11, 111 FROM DUAL UNION ALL
SELECT 2, 1502, 'Inactive', 12, 112 FROM DUAL UNION ALL
SELECT 3, 1503, 'Active', 13, 113 FROM DUAL UNION ALL
SELECT 4, 1504, 'Active', 14, 114 FROM DUAL UNION ALL
SELECT 5, 1505, 'Inactive', 15, 115 FROM DUAL;
CREATE TABLE invoice (Invoice_Date, Invoice_Amount, Invoice_Number, Payment_Method, Customer_id) AS
SELECT DATE '2023-01-01', 100, 1000001, 'Cash', 1 FROM DUAL UNION ALL
SELECT DATE '2022-12-01', 150, 1000002, 'Credit Card', 1 FROM DUAL UNION ALL
SELECT DATE '2022-11-09', 200, 1000003, 'Credit Card', 1 FROM DUAL UNION ALL
SELECT DATE '2022-12-09', 300, 1000004, 'Cash', 2 FROM DUAL UNION ALL
SELECT DATE '2022-04-15', 1000, 1000005, 'Cash', 2 FROM DUAL UNION ALL
SELECT DATE '2022-04-15', 1000, 1000006, 'Credit Card', 3 FROM DUAL UNION ALL
SELECT DATE '2022-10-31', 250, 1000007, 'Cash', 4 FROM DUAL UNION ALL
SELECT DATE '2022-10-25', 250, 1000008, 'Cash', 4 FROM DUAL UNION ALL
SELECT DATE '2022-09-20', 130, 1000009, 'Credit Card', 5 FROM DUAL UNION ALL
SELECT DATE '2022-05-20', 120, 10000010, 'Credit Card', 5 FROM DUAL;
CREATE TABLE supplier (Supplier_Name, Supplier_id) AS
SELECT 'ABC', 11 FROM DUAL UNION ALL
SELECT 'ACCC', 12 FROM DUAL UNION ALL
SELECT 'ADEF', 13 FROM DUAL UNION ALL
SELECT 'AJKL', 14 FROM DUAL UNION ALL
SELECT 'AFLR', 15 FROM DUAL;
CREATE TABLE supplier_remit (City, Country, Supplier_Remit_id, Supplier_id) AS
SELECT 'Boston', 'US', 111, 11 FROM DUAL UNION ALL
SELECT 'Oak', 'US', 112, 12 FROM DUAL UNION ALL
SELECT 'Albany', 'US', 113, 13 FROM DUAL UNION ALL
SELECT 'Madison', 'US', 114, 14 FROM DUAL UNION ALL
SELECT 'Los Ang', 'US', 115, 15 FROM DUAL;
Outputs:
CUSTOMER_ID  CUSTOMER_ACCOUNT_NUMBER  CUSTOMER_STATUS  COUNTRY  LAST_INVOICE_DATE    LATEST_PAYMENT_METHOD  LATEST_INVOICE_AMOUNT  MISSING_INVOICES_THIS_YEAR  MISSING_INVOICES_LAST_YEAR
-----------  -----------------------  ---------------  -------  -------------------  ---------------------  ---------------------  --------------------------  --------------------------
1            1501                     Active           US       2023-01-01 00:00:00  Cash                   100                    0                           10
3            1503                     Active           US       2022-04-15 00:00:00  Credit Card            1000                   1                           11
4            1504                     Active           US       2022-10-31 00:00:00  Cash                   250                    1                           11
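As a side note on the KEEP clause used above: MAX(expr) KEEP (DENSE_RANK LAST ORDER BY col) aggregates only over the row(s) that rank last by col, so it picks expr from the most recent row rather than the largest expr overall. A minimal standalone sketch with throwaway inline data (not taken from the question) to illustrate:

SELECT customer_id,
       -- 'Cash' wins because it sits on the latest invoice_date,
       -- even though 'Credit Card' is the larger value alphabetically
       MAX(payment_method) KEEP (DENSE_RANK LAST ORDER BY invoice_date)
         AS latest_payment_method
FROM  (SELECT 1 AS customer_id, DATE '2023-01-01' AS invoice_date, 'Cash' AS payment_method FROM DUAL
       UNION ALL
       SELECT 1, DATE '2022-12-01', 'Credit Card' FROM DUAL)
GROUP BY customer_id;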
In order to find what's missing, you have to first define what should be there, so you need to create a calendar of every month. Then you can use outer joins to the invoice table to find where there aren't any records for that month for that customer. There are lots of ways to write SQL to do this. Here's one:
WITH months AS (SELECT /*+ MATERIALIZE */ *
                FROM (SELECT 'Current' year,
                             ADD_MONTHS(TRUNC(SYSDATE,'YYYY'),ROWNUM-1) month_start
                      FROM [any table with at least 12 rows]
                      WHERE ROWNUM <= 12)
                WHERE month_start < SYSDATE
                UNION ALL
                SELECT 'Previous' year,
                       ADD_MONTHS(TRUNC(ADD_MONTHS(SYSDATE,-12),'YYYY'),ROWNUM-1)
                FROM [any table with at least 12 rows]
                WHERE ROWNUM <= 12)
SELECT customer.*,
       inv.invoice_amount most_recent_invoice_amount,
       inv.payment_method most_recent_payment_method,
       (SELECT COUNT(*)
        FROM   months,
               invoice
        WHERE  months.year = 'Current'
        AND    months.month_start = TRUNC(invoice_date(+),'MM')
        AND    invoice.customer_id(+) = customer.customer_id
        AND    invoice.customer_id IS NULL) missed_current_year_months,
       (SELECT COUNT(*)
        FROM   months,
               invoice
        WHERE  months.year = 'Previous'
        AND    months.month_start = TRUNC(invoice_date(+),'MM')
        AND    invoice.customer_id(+) = customer.customer_id
        AND    invoice.customer_id IS NULL) missed_previous_year_months
FROM   customer
       OUTER APPLY (SELECT invoice_amount,
                           payment_method
                    FROM (SELECT invoice_amount,
                                 payment_method,
                                 ROW_NUMBER() OVER (ORDER BY invoice_date DESC) seq
                          FROM   invoice
                          WHERE  invoice.customer_id = customer.customer_id)
                    WHERE seq = 1) inv;
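If there is no convenient table with at least 12 rows to hand, the month calendar can also be generated from DUAL with CONNECT BY. A sketch of the 'Current' branch under that assumption (the 'Previous' branch would be built the same way, starting from ADD_MONTHS(SYSDATE, -12)):

SELECT 'Current' AS year,
       ADD_MONTHS(TRUNC(SYSDATE, 'YYYY'), LEVEL - 1) AS month_start  -- Jan..Dec of the current year
FROM   dual
CONNECT BY LEVEL <= 12;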
I have two tables, both with millions of rows.
Table A:
Store_id, Purchase_dt, Amount
-------- ----------- ------
1001 02JAN19 12.20
1001 05MAY20 13.30
1002 07JUL21 10.97
Table B:
Store_id, Valid_from, Valid_to, Profile_ID
-------- ---------- -------- ----------
1001 01JAN17 08JUL19 56
1001 09JUL19 12DEC99 60
1002 01JAN20 12DEC99 70
I need to find only the transactions from stores that have a profile id of 60 or 70, where Purchase_dt falls between Valid_from and Valid_to; the joining column is Store_id.
The expected target table is:
Store_id, Purchase_dt, Amount, Profile_ID
-------- ----------- ------ ----------
1001 05MAY20 13.30 60
1002 07JUL21 10.97 70
I tried with
Select
a.Store_id,
a.Purchase_dt,
a.Amount,
b.Profile_ID
from
table_a a,
table_b b
where
a.Store_id = b.Store_id
and
a.Purchase_dt between b.Valid_from and b.Valid_to
and
b.Profile_ID in (60,70)
but I am not getting the desired result. All dates are of the DATE data type. Any help is appreciated!
If the dates are really stored as strings (that is what the sample data you posted looks like), then, for BETWEEN to work properly, you first have to convert those strings into valid DATE datatype values (using the TO_DATE function with an appropriate format model).
Moreover, you're asking for trouble by keeping 2-digit years; didn't the Y2K bug teach us anything?
I'd suggest keeping dates in DATE datatype columns and avoiding many kinds of problems.
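For instance, if table_a really did store purchase_dt as a string, a rough migration to a DATE column might look like the sketch below (purchase_dt_new is just an illustrative name; the RR format model decides which century a 2-digit year belongs to, so check values like 99 before relying on it):

-- add a real DATE column, backfill it from the string column, then swap the names
-- (MON assumes English month abbreviations, i.e. NLS_DATE_LANGUAGE)
ALTER TABLE table_a ADD (purchase_dt_new DATE);
UPDATE table_a SET purchase_dt_new = TO_DATE(purchase_dt, 'DDMONRR');
ALTER TABLE table_a DROP COLUMN purchase_dt;
ALTER TABLE table_a RENAME COLUMN purchase_dt_new TO purchase_dt;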
As for your current problem, here you are:
Sample data:
SQL> with
2 table_a (store_id, purchase_dt, amount) as
3 (select 1001, '02JAN19', 12.20 from dual union all
4 select 1001, '05MAY20', 13.30 from dual union all
5 select 1002, '07JUL21', 10.97 from dual
6 ),
7 table_b (store_id, valid_from, valid_to, profile_id) as
8 (select 1001, '01JAN17', '08JUL19', 56 from dual union all
9 select 1001, '09JUL19', '12DEC99', 60 from dual union all
10 select 1002, '01JAN20', '12DEC99', 70 from dual
11 )
Query begins here:
12 select a.store_id, a.purchase_dt, a.amount, b.profile_id
13 from table_a a join table_b b
14 on a.store_id = b.store_id
15 and to_date(a.purchase_dt, 'ddMONyy') between
16 to_date(b.valid_from, 'ddMONyy') and to_date(b.valid_to, 'ddMONyy')
17 where b.profile_id in (60, 70);
STORE_ID PURCHAS AMOUNT PROFILE_ID
---------- ------- ---------- ----------
1001 05MAY20 13,3 60
1002 07JUL21 10,97 70
SQL>
If, as you commented, the date values really are DATEs, then it gets simpler.
Compare:
Strings:
15 and to_date(a.purchase_dt, 'ddMONyy') between
16 to_date(b.valid_from, 'ddMONyy') and
to_date(b.valid_to, 'ddMONyy')
Dates:
15 and a.purchase_dt between b.valid_from and b.valid_to
The whole query that deals with DATE datatype:
SQL> with
2 table_a (store_id, purchase_dt, amount) as
3 (select 1001, date '2019-01-02', 12.20 from dual union all
4 select 1001, date '2020-05-05', 13.30 from dual union all
5 select 1002, date '2021-07-07', 10.97 from dual
6 ),
7 table_b (store_id, valid_from, valid_to, profile_id) as
8 (select 1001, date '2017-01-01', date '2019-07-08', 56 from dual union all
9 select 1001, date '2019-07-09', date '2099-12-12', 60 from dual union all
10 select 1002, date '2020-01-01', date '2099-12-12', 70 from dual
11 )
12 select a.store_id, a.purchase_dt, a.amount, b.profile_id
13 from table_a a join table_b b
14 on a.store_id = b.store_id
15 and a.purchase_dt between b.valid_from and b.valid_to
16 where b.profile_id in (60, 70) ;
STORE_ID PURCHASE AMOUNT PROFILE_ID
---------- -------- ---------- ----------
1001 05.05.20 13,3 60
1002 07.07.21 10,97 70
SQL>
Your query, applied to the same sample data, also works:
SQL> with
2 table_a (store_id, purchase_dt, amount) as
3 (select 1001, date '2019-01-02', 12.20 from dual union all
4 select 1001, date '2020-05-05', 13.30 from dual union all
5 select 1002, date '2021-07-07', 10.97 from dual
6 ),
7 table_b (store_id, valid_from, valid_to, profile_id) as
8 (select 1001, date '2017-01-01', date '2019-07-08', 56 from dual union all
9 select 1001, date '2019-07-09', date '2099-12-12', 60 from dual union all
10 select 1002, date '2020-01-01', date '2099-12-12', 70 from dual
11 )
This is your query:
12 Select
13 a.Store_id,
14 a.Purchase_dt,
15 a.Amount,
16 b.Profile_ID
17 from
18 table_a a,
19 table_b b
20 where
21 a.Store_id = b.Store_id
22 and
23 a.Purchase_dt between b.Valid_from and b.Valid_to
24 and
25 b.Profile_ID in (60,70);
STORE_ID PURCHASE AMOUNT PROFILE_ID
---------- -------- ---------- ----------
1001 05.05.20 13,3 60
1002 07.07.21 10,97 70
SQL>
Basically, I have a Product table like this:
date price
--------- -----
02-SEP-14 50
03-SEP-14 60
04-SEP-14 60
05-SEP-14 60
07-SEP-14 71
08-SEP-14 45
09-SEP-14 45
10-SEP-14 24
11-SEP-14 60
I need to update the table so it looks like this:
date price id
--------- ----- --
02-SEP-14 50 1
03-SEP-14 60 2
04-SEP-14 60 2
05-SEP-14 60 2
07-SEP-14 71 3
08-SEP-14 45 4
09-SEP-14 45 4
10-SEP-14 24 5
11-SEP-14 60 6
What I have tried:
CREATE SEQUENCE user_id_seq
START WITH 1
INCREMENT BY 1
CACHE 20;
ALTER TABLE Product
ADD (ID number);
UPDATE Product SET ID = user_id_seq.nextval;
This updates ID in the usual sequential way: 1, 2, 3, 4, 5, ...
I have no idea how to do it using basic SQL commands. Please suggest how I can achieve this. Thank you in advance.
Here is one way to create a view from your base data. I assume you have more than one product (identified by product id), and that the price dates aren't necessarily consecutive. The sequence is separate for each product id. (Also, product should be the name of a different table - where the product id is primary key, and you have other information such as product name, category, etc. The table in your post would be more properly called something like price_history.)
alter session set nls_date_format='dd-MON-rr';
create table product ( prod_id number, dt date, price number );
insert into product ( prod_id, dt, price )
select 101, '02-SEP-14', 50 from dual union all
select 101, '03-SEP-14', 60 from dual union all
select 101, '04-SEP-14', 60 from dual union all
select 101, '05-SEP-14', 60 from dual union all
select 101, '07-SEP-14', 71 from dual union all
select 101, '08-SEP-14', 45 from dual union all
select 101, '09-SEP-14', 45 from dual union all
select 101, '10-SEP-14', 24 from dual union all
select 101, '11-SEP-14', 60 from dual union all
select 102, '02-SEP-14', 45 from dual union all
select 102, '04-SEP-14', 45 from dual union all
select 102, '05-SEP-14', 60 from dual union all
select 102, '06-SEP-14', 50 from dual union all
select 102, '09-SEP-14', 60 from dual
;
commit;
create view product_vw ( prod_id, dt, price, seq ) as
select prod_id, dt, price,
count(flag) over (partition by prod_id order by dt)
from ( select prod_id, dt, price,
case when price = lag(price) over (partition by prod_id order by dt)
then null else 1 end as flag
from product
)
;
Now check what the view looks like:
select * from product_vw;
PROD_ID DT PRICE SEQ
------- ------------------- ---------- ----------
101 02/09/0014 00:00:00 50 1
101 03/09/0014 00:00:00 60 2
101 04/09/0014 00:00:00 60 2
101 05/09/0014 00:00:00 60 2
101 07/09/0014 00:00:00 71 3
101 08/09/0014 00:00:00 45 4
101 09/09/0014 00:00:00 45 4
101 10/09/0014 00:00:00 24 5
101 11/09/0014 00:00:00 60 6
102 02/09/0014 00:00:00 45 1
102 04/09/0014 00:00:00 45 1
102 05/09/0014 00:00:00 60 2
102 06/09/0014 00:00:00 50 3
102 09/09/0014 00:00:00 60 4
NOTE: This answers the question that was originally asked. The OP changed the data.
If your data is not too large, you can use a correlated subquery:
update product p
set id = (select count(distinct p2.price)
from product p2
where p2.date <= p.date
);
If your data is larger, then merge is more appropriate.
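A sketch of what that MERGE could look like, reusing the same count-of-distinct-prices idea; it matches rows back to themselves by ROWID, and it assumes the date column is really named something legal such as dt, since DATE is a reserved word:

MERGE INTO product p
USING (SELECT p1.rowid AS rid,
              (SELECT COUNT(DISTINCT p2.price)
               FROM   product p2
               WHERE  p2.dt <= p1.dt) AS new_id   -- dt: assumed name of the date column
       FROM   product p1) src
ON (p.rowid = src.rid)
WHEN MATCHED THEN
  UPDATE SET p.id = src.new_id;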
-- Oracle has no UPDATE ... FROM; correlate on ROWID instead, and number each
-- run of consecutive equal prices (dt stands in for the date column, since
-- DATE itself is a reserved word):
UPDATE product p
SET    id = (SELECT cts.new_id
             FROM  (SELECT rid,
                           COUNT(flag) OVER (ORDER BY dt) AS new_id
                    FROM  (SELECT rowid AS rid,
                                  dt,
                                  CASE WHEN price = LAG(price) OVER (ORDER BY dt)
                                       THEN NULL ELSE 1 END AS flag
                           FROM   product)) cts
             WHERE  cts.rid = p.rowid);
This is the simplest way I can think of; there is no simpler way to do it using basic single statements.
The goal is to select the count of distinct customer_id's who have not made a purchase in the rolling 30 day period prior to every day in the calendar year 2016. I have created a calendar table in my database to join to.
Here is an example table for reference, let's say you have customers orders normalized as follows:
+-------------+------------+----------+
| customer_id | date | order_id |
+-------------+------------+----------+
| 123 | 01/25/2016 | 1000 |
+-------------+------------+----------+
| 123 | 04/27/2016 | 1025 |
+-------------+------------+----------+
| 444 | 02/02/2016 | 1010 |
+-------------+------------+----------+
| 521 | 01/23/2016 | 998 |
+-------------+------------+----------+
| 521 | 01/24/2016 | 999 |
+-------------+------------+----------+
The goal output is effectively a calendar with 1 row for every single day of 2016, with a count on each day of how many customers "lapsed" on that day, meaning their last purchase was 30 days or more prior to that day of the year. The final output will look like this:
+------------+--------------+
| date | lapsed_count |
+------------+--------------+
| 01/01/2016 | 0 |
+------------+--------------+
| 01/02/2016 | 0 |
+------------+--------------+
| ... | ... |
+------------+--------------+
| 03/01/2016 | 12 |
+------------+--------------+
| 03/02/2016 | 9 |
+------------+--------------+
| 03/03/2016 | 7 |
+------------+--------------+
This data does not exist in 2015, therefore it's not possible for Jan-01-2016 to have a count of lapsed customers because that is the first possible day to ever make a purchase.
So for customer_id #123, they purchased on 01/25/2016 and 04/27/2016. They should have 2 lapse counts because their purchases are more than 30 days apart. One lapse occurring on 2/24/2016 and another lapse on 05/27/2016.
Customer_id#444 only purchased once, so they should have one lapse count for 30 days after 02/02/2016 on 03/02/2016.
Customer_id #521 is tricky: since they purchased with a frequency of 1 day, we will not count the first purchase on 03/02/2016, so there is only one lapse, starting from their last purchase of 03/03/2016. The count for that lapse will occur on 04/02/2016 (+30 days).
If you have a table of dates, here is one expensive method:
select date,
       sum(case when prev_date < date - 30 then 1 else 0 end) as lapsed
from (select c.date, cu.customer_id, max(o.date) as prev_date
      from calendar c cross join
           (select distinct customer_id from orders) cu left join
           orders o
           on o.date <= c.date and o.customer_id = cu.customer_id
      group by c.date, cu.customer_id
     ) oc
group by date;
For each date/customer pair, it determines the latest purchase the customer made before the date. It then uses this information to count the lapsed.
To be honest, this will probably work well on a handful of dates, but not for a full year's worth.
Apologies, I didn't read your question properly the first time around. This query will give you all the lapses you have. It takes each order and uses an analytic function to work out the next order date; if the gap is greater than 30 days, a lapse is recorded:
WITH
cust_orders (customer_id , order_date , order_id )
AS
(SELECT 1, TO_DATE('01/01/2016','DD/MM/YYYY'), 1001 FROM dual UNION ALL
SELECT 1, TO_DATE('29/01/2016','DD/MM/YYYY'), 1002 FROM dual UNION ALL
SELECT 1, TO_DATE('01/03/2016','DD/MM/YYYY'), 1003 FROM dual UNION ALL
SELECT 2, TO_DATE('01/01/2016','DD/MM/YYYY'), 1004 FROM dual UNION ALL
SELECT 2, TO_DATE('29/01/2016','DD/MM/YYYY'), 1005 FROM dual UNION ALL
SELECT 2, TO_DATE('01/04/2016','DD/MM/YYYY'), 1006 FROM dual UNION ALL
SELECT 2, TO_DATE('01/06/2016','DD/MM/YYYY'), 1007 FROM dual UNION ALL
SELECT 2, TO_DATE('01/08/2016','DD/MM/YYYY'), 1008 FROM dual UNION ALL
SELECT 3, TO_DATE('01/09/2016','DD/MM/YYYY'), 1009 FROM dual UNION ALL
SELECT 3, TO_DATE('01/12/2016','DD/MM/YYYY'), 1010 FROM dual UNION ALL
SELECT 3, TO_DATE('02/12/2016','DD/MM/YYYY'), 1011 FROM dual UNION ALL
SELECT 3, TO_DATE('03/12/2016','DD/MM/YYYY'), 1012 FROM dual UNION ALL
SELECT 3, TO_DATE('04/12/2016','DD/MM/YYYY'), 1013 FROM dual UNION ALL
SELECT 3, TO_DATE('05/12/2016','DD/MM/YYYY'), 1014 FROM dual UNION ALL
SELECT 3, TO_DATE('06/12/2016','DD/MM/YYYY'), 1015 FROM dual UNION ALL
SELECT 3, TO_DATE('07/12/2016','DD/MM/YYYY'), 1016 FROM dual
)
SELECT
customer_id
,order_date
,order_id
,next_order_date
,order_date + 30 lapse_date
FROM
(SELECT
customer_id
,order_date
,order_id
,LEAD(order_date) OVER (PARTITION BY customer_id ORDER BY order_date) next_order_date
FROM
cust_orders
)
WHERE NVL(next_order_date,sysdate) - order_date > 30
;
Now join that to a set of dates and run a COUNT function (enter the year parameter as YYYY):
WITH
cust_orders (customer_id , order_date , order_id )
AS
(SELECT 1, TO_DATE('01/01/2016','DD/MM/YYYY'), 1001 FROM dual UNION ALL
SELECT 1, TO_DATE('29/01/2016','DD/MM/YYYY'), 1002 FROM dual UNION ALL
SELECT 1, TO_DATE('01/03/2016','DD/MM/YYYY'), 1003 FROM dual UNION ALL
SELECT 2, TO_DATE('01/01/2016','DD/MM/YYYY'), 1004 FROM dual UNION ALL
SELECT 2, TO_DATE('29/01/2016','DD/MM/YYYY'), 1005 FROM dual UNION ALL
SELECT 2, TO_DATE('01/04/2016','DD/MM/YYYY'), 1006 FROM dual UNION ALL
SELECT 2, TO_DATE('01/06/2016','DD/MM/YYYY'), 1007 FROM dual UNION ALL
SELECT 2, TO_DATE('01/08/2016','DD/MM/YYYY'), 1008 FROM dual UNION ALL
SELECT 3, TO_DATE('01/09/2016','DD/MM/YYYY'), 1009 FROM dual UNION ALL
SELECT 3, TO_DATE('01/12/2016','DD/MM/YYYY'), 1010 FROM dual UNION ALL
SELECT 3, TO_DATE('02/12/2016','DD/MM/YYYY'), 1011 FROM dual UNION ALL
SELECT 3, TO_DATE('03/12/2016','DD/MM/YYYY'), 1012 FROM dual UNION ALL
SELECT 3, TO_DATE('04/12/2016','DD/MM/YYYY'), 1013 FROM dual UNION ALL
SELECT 3, TO_DATE('05/12/2016','DD/MM/YYYY'), 1014 FROM dual UNION ALL
SELECT 3, TO_DATE('06/12/2016','DD/MM/YYYY'), 1015 FROM dual UNION ALL
SELECT 3, TO_DATE('07/12/2016','DD/MM/YYYY'), 1016 FROM dual
)
,calendar (date_value)
AS
(SELECT TO_DATE('01/01/'||:P_year,'DD/MM/YYYY') + (rownum -1)
FROM all_tables
WHERE rownum < (TO_DATE('31/12/'||:P_year,'DD/MM/YYYY') - TO_DATE('01/01/'||:P_year,'DD/MM/YYYY')) + 2
)
SELECT
calendar.date_value
,COUNT(*)
FROM
(
SELECT
customer_id
,order_date
,order_id
,next_order_date
,order_date + 30 lapse_date
FROM
(SELECT
customer_id
,order_date
,order_id
,LEAD(order_date) OVER (PARTITION BY customer_id ORDER BY order_date) next_order_date
FROM
cust_orders
)
WHERE NVL(next_order_date,sysdate) - order_date > 30
) lapses
,calendar
WHERE 1=1
AND calendar.date_value = TRUNC(lapses.lapse_date)
GROUP BY
calendar.date_value
;
Or, if you really want every date printed out, then use this:
WITH
cust_orders (customer_id , order_date , order_id )
AS
(SELECT 1, TO_DATE('01/01/2016','DD/MM/YYYY'), 1001 FROM dual UNION ALL
SELECT 1, TO_DATE('29/01/2016','DD/MM/YYYY'), 1002 FROM dual UNION ALL
SELECT 1, TO_DATE('01/03/2016','DD/MM/YYYY'), 1003 FROM dual UNION ALL
SELECT 2, TO_DATE('01/01/2016','DD/MM/YYYY'), 1004 FROM dual UNION ALL
SELECT 2, TO_DATE('29/01/2016','DD/MM/YYYY'), 1005 FROM dual UNION ALL
SELECT 2, TO_DATE('01/04/2016','DD/MM/YYYY'), 1006 FROM dual UNION ALL
SELECT 2, TO_DATE('01/06/2016','DD/MM/YYYY'), 1007 FROM dual UNION ALL
SELECT 2, TO_DATE('01/08/2016','DD/MM/YYYY'), 1008 FROM dual UNION ALL
SELECT 3, TO_DATE('01/09/2016','DD/MM/YYYY'), 1009 FROM dual UNION ALL
SELECT 3, TO_DATE('01/12/2016','DD/MM/YYYY'), 1010 FROM dual UNION ALL
SELECT 3, TO_DATE('02/12/2016','DD/MM/YYYY'), 1011 FROM dual UNION ALL
SELECT 3, TO_DATE('03/12/2016','DD/MM/YYYY'), 1012 FROM dual UNION ALL
SELECT 3, TO_DATE('04/12/2016','DD/MM/YYYY'), 1013 FROM dual UNION ALL
SELECT 3, TO_DATE('05/12/2016','DD/MM/YYYY'), 1014 FROM dual UNION ALL
SELECT 3, TO_DATE('06/12/2016','DD/MM/YYYY'), 1015 FROM dual UNION ALL
SELECT 3, TO_DATE('07/12/2016','DD/MM/YYYY'), 1016 FROM dual
)
,lapses
AS
(SELECT
customer_id
,order_date
,order_id
,next_order_date
,order_date + 30 lapse_date
FROM
(SELECT
customer_id
,order_date
,order_id
,LEAD(order_date) OVER (PARTITION BY customer_id ORDER BY order_date) next_order_date
FROM
cust_orders
)
WHERE NVL(next_order_date,sysdate) - order_date > 30
)
,calendar (date_value)
AS
(SELECT TO_DATE('01/01/'||:P_year,'DD/MM/YYYY') + (rownum -1)
FROM all_tables
WHERE rownum < (TO_DATE('31/12/'||:P_year,'DD/MM/YYYY') - TO_DATE('01/01/'||:P_year,'DD/MM/YYYY')) + 2
)
SELECT
calendar.date_value
,(SELECT COUNT(*)
FROM lapses
WHERE calendar.date_value = lapses.lapse_date
)
FROM
calendar
WHERE 1=1
ORDER BY
calendar.date_value
;
Here's how I'd do it:
WITH your_table AS (SELECT 123 customer_id, to_date('24/01/2016', 'dd/mm/yyyy') order_date, 12345 order_id FROM dual UNION ALL
SELECT 123 customer_id, to_date('24/01/2016', 'dd/mm/yyyy') order_date, 12346 order_id FROM dual UNION ALL
SELECT 123 customer_id, to_date('25/01/2016', 'dd/mm/yyyy') order_date, 12347 order_id FROM dual UNION ALL
SELECT 123 customer_id, to_date('24/02/2016', 'dd/mm/yyyy') order_date, 12347 order_id FROM dual UNION ALL
SELECT 123 customer_id, to_date('16/03/2016', 'dd/mm/yyyy') order_date, 12348 order_id FROM dual UNION ALL
SELECT 123 customer_id, to_date('18/04/2016', 'dd/mm/yyyy') order_date, 12349 order_id FROM dual UNION ALL
SELECT 456 customer_id, to_date('20/02/2016', 'dd/mm/yyyy') order_date, 12350 order_id FROM dual UNION ALL
SELECT 456 customer_id, to_date('01/03/2016', 'dd/mm/yyyy') order_date, 12351 order_id FROM dual UNION ALL
SELECT 456 customer_id, to_date('03/03/2016', 'dd/mm/yyyy') order_date, 12352 order_id FROM dual UNION ALL
SELECT 456 customer_id, to_date('18/04/2016', 'dd/mm/yyyy') order_date, 12353 order_id FROM dual UNION ALL
SELECT 456 customer_id, to_date('20/05/2016', 'dd/mm/yyyy') order_date, 12354 order_id FROM dual UNION ALL
SELECT 456 customer_id, to_date('23/06/2016', 'dd/mm/yyyy') order_date, 12355 order_id FROM dual UNION ALL
SELECT 456 customer_id, to_date('19/01/2017', 'dd/mm/yyyy') order_date, 12356 order_id FROM dual),
-- end of mimicking your_table with data in it
lapsed_info AS (SELECT customer_id,
order_date,
CASE WHEN TRUNC(SYSDATE) - order_date <= 30 THEN NULL
WHEN COUNT(*) OVER (PARTITION BY customer_id ORDER BY order_date RANGE BETWEEN 1 FOLLOWING AND 30 FOLLOWING) = 0 THEN order_date+30
ELSE NULL
END lapsed_date
FROM your_table),
dates AS (SELECT to_date('01/01/2016', 'dd/mm/yyyy') + LEVEL -1 dt
FROM dual
CONNECT BY to_date('01/01/2016', 'dd/mm/yyyy') + LEVEL -1 <= TRUNC(SYSDATE))
SELECT dates.dt,
COUNT(li.lapsed_date) lapsed_count
FROM dates
LEFT OUTER JOIN lapsed_info li ON dates.dt = li.lapsed_date
GROUP BY dates.dt
ORDER BY dates.dt;
Results:
DT LAPSED_COUNT
---------- ------------
01/01/2016 0
<snip>
23/01/2016 0
24/01/2016 0
25/01/2016 0
26/01/2016 0
<snip>
19/02/2016 0
20/02/2016 0
21/02/2016 0
22/02/2016 0
23/02/2016 0
24/02/2016 1
25/02/2016 0
<snip>
29/02/2016 0
01/03/2016 0
02/03/2016 0
03/03/2016 0
04/03/2016 0
<snip>
15/03/2016 0
16/03/2016 0
17/03/2016 0
<snip>
20/03/2016 0
21/03/2016 0
22/03/2016 0
<snip>
30/03/2016 0
31/03/2016 0
01/04/2016 0
02/04/2016 1
03/04/2016 0
<snip>
14/04/2016 0
15/04/2016 1
16/04/2016 0
17/04/2016 0
18/04/2016 0
19/04/2016 0
<snip>
17/05/2016 0
18/05/2016 2
19/05/2016 0
20/05/2016 0
21/05/2016 0
<snip>
18/06/2016 0
19/06/2016 1
20/06/2016 0
21/06/2016 0
22/06/2016 0
23/06/2016 0
24/06/2016 0
<snip>
22/07/2016 0
23/07/2016 1
24/07/2016 0
<snip>
18/01/2017 0
19/01/2017 0
20/01/2017 0
<snip>
08/02/2017 0
This takes your data and uses the analytic COUNT function to work out the number of rows that have a value within 30 days of (but excluding) the current row's date.
Then we apply a CASE expression: if the row's date is within 30 days of today's date, we treat it as not lapsed. If the analytic count returned 0, the row is considered lapsed and we output the lapsed date as the order_date plus 30 days. Any other count means the row has not lapsed.
The above is all worked out in the lapsed_info subquery.
Then all we need to do is list the dates (see the dates subquery) and outer join the lapsed_info subquery to it based on the lapsed_date and then do a count of the lapsed dates for each day.
I have a table with the below structure:
I would like to retrieve the results using sql in the below format
I am new to SQL and can't figure out how to go about it. Is this possible without using procedures? How do I go about achieving this? (The actual data is huge; I have given only a snapshot here.)
Part of it is pivoting. Totals by row and column (and really, even the pivoting) should be done in your reporting application, not in SQL. If you insist on doing it in SQL, there are fancier ways, but something like the silly query below will suffice.
with test_data (city, yr, ct) as (
select 'Tokyo' , 2016, 2 from dual union all
select 'Mumbai', 2013, 3 from dual union all
select 'Mumbai', 2014, 5 from dual union all
select 'Dubai' , 2011, 5 from dual union all
select 'Dubai' , 2015, 15 from dual union all
select 'Dubai' , 2016, 8 from dual union all
select 'London', 2011, 16 from dual union all
select 'London', 2012, 22 from dual union all
select 'London', 2013, 4 from dual union all
select 'London', 2014, 24 from dual union all
select 'London', 2015, 13 from dual union all
select 'London', 2016, 5 from dual
),
test_with_totals as (
select city, yr, ct from test_data union all
select city, 9999, sum(ct) from test_data group by city union all
select 'Grand Total', yr , sum(ct) from test_data group by yr union all
select 'Grand Total', 9999, sum(ct) from test_data
)
select * from test_with_totals
pivot ( sum (ct) for yr in (2011, 2012, 2013, 2014, 2015, 2016, 9999 as "Total"))
order by "Total";
Result:
CITY 2011 2012 2013 2014 2015 2016 Total
----------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
Tokyo 2 2
Mumbai 3 5 8
Dubai 5 15 8 28
London 16 22 4 24 13 5 84
Grand Total 21 22 7 29 28 15 122
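One of the "fancier ways" alluded to above: the UNION ALL branches in test_with_totals can be generated in a single pass with GROUP BY CUBE. A sketch on a small subset of the same data, assuming city and yr are never NULL in the base rows (otherwise use GROUPING() instead of NVL to label the total rows):

with test_data (city, yr, ct) as (
  select 'Tokyo' , 2016, 2 from dual union all
  select 'Mumbai', 2013, 3 from dual union all
  select 'Mumbai', 2014, 5 from dual
),
test_with_totals as (
  -- CUBE emits per-city totals (yr NULL), per-year totals (city NULL)
  -- and the grand total (both NULL); NVL relabels them as before
  select nvl(city, 'Grand Total') as city,
         nvl(yr, 9999)            as yr,
         sum(ct)                  as ct
  from   test_data
  group  by cube (city, yr)
)
select * from test_with_totals
pivot ( sum(ct) for yr in (2013, 2014, 2016, 9999 as "Total"))
order by "Total";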