I have data like this from the DB output.
yearmonth orig_region_name performance_percentage
2020-04 AMERICAS 95.45
2020-04 ASIA PACIFIC 100.00
2020-04 EUROPE 97.78
2020-04 GLOBAL 97.76
2020-05 AMERICAS 100.00
2020-05 EUROPE 97.20
2020-05 GLOBAL 97.21
But I need to add the ASIA PACIFIC region, which is missing for May. The origin region values are dynamic and will change based on the user's selection parameters from the UI. The desired output is:
yearmonth orig_region_name performance_percentage
2020-04 AMERICAS 95.45
2020-04 ASIA PACIFIC 100.00
2020-04 EUROPE 97.78
2020-04 GLOBAL 97.76
2020-05 AMERICAS 100.00
2020-05 ASIA PACIFIC 0
2020-05 EUROPE 97.20
2020-05 GLOBAL 97.21
How do I enforce this logic without impacting my existing code much? Can someone help me with this?
Query -
SELECT yearmonth, orig_region_name,
(Cast(sum_net_volume AS DECIMAL(15,4))
/ NullIf(sum_monitored_volume ,0))*100 AS performance_percentage
FROM
(
SELECT
Trim(Year(start_date)) || '-' || To_Char(start_date, 'MM') AS yearmonth,
orig_region_name,
Sum(net_volume) AS sum_net_volume,
Sum(monitored_vol) AS sum_monitored_volume
FROM TABLE
WHERE start_date BETWEEN '2020-07-01' AND '2021-09-30'
AND group_code IN ('230')
AND (filter_BILL_RGN IN ('AM','EU') OR (1=2) )
GROUP BY 1,2
) A
You can use a CTE for this purpose. I have simply used your query as the inner query in the CTE.
with cte as
(
SELECT yearmonth, orig_region_name,
(Cast(sum_net_volume AS DECIMAL(15,4))
/ NullIf(sum_monitored_volume ,0))*100 AS performance_percentage
FROM
(
SELECT
Trim(Year(start_date)) || '-' || To_Char(start_date, 'MM') AS yearmonth,
orig_region_name,
Sum(net_volume) AS sum_net_volume,
Sum(monitored_vol) AS sum_monitored_volume
FROM TABLE
WHERE start_date BETWEEN '2020-07-01' AND '2021-09-30'
AND group_code IN ('230')
AND (filter_BILL_RGN IN ('AM','EU') OR (1=2) )
GROUP BY 1,2
) A
)
,ym as (select distinct yearmonth from cte)
,regionname as (select distinct orig_region_name from cte)
,finalcte as (select * from ym cross join regionname)
select f.yearmonth, f.orig_region_name, coalesce(cte.performance_percentage, 0) as performance_percentage
from finalcte f left join cte
on f.yearmonth = cte.yearmonth and f.orig_region_name = cte.orig_region_name
Example with your provided dummy data:
create table yourtable (yearmonth varchar(10), orig_region_name varchar(50), performance_percentage float);
insert into yourtable values('2020-04' ,'AMERICAS', 95.45);
insert into yourtable values('2020-04' ,'ASIA PACIFIC', 100.00);
insert into yourtable values('2020-04' ,'EUROPE', 97.78);
insert into yourtable values('2020-04' ,'GLOBAL', 97.76);
insert into yourtable values('2020-05' ,'AMERICAS', 100.00);
insert into yourtable values('2020-05' ,'EUROPE', 97.20);
insert into yourtable values('2020-05' ,'GLOBAL', 97.21);
Query:
with cte as(
SELECT yearmonth, orig_region_name,
performance_percentage
from yourtable
)
,ym as (select distinct yearmonth from cte)
,regionname as (select distinct orig_region_name from cte)
,finalcte as (select * from ym cross join regionname)
select f.yearmonth, f.orig_region_name, coalesce(cte.performance_percentage, 0) as performance_percentage
from finalcte f left join cte
on f.yearmonth = cte.yearmonth and f.orig_region_name = cte.orig_region_name
Output:
yearmonth   orig_region_name   performance_percentage
2020-04     AMERICAS           95.45
2020-04     ASIA PACIFIC       100
2020-04     EUROPE             97.78
2020-04     GLOBAL             97.76
2020-05     AMERICAS           100
2020-05     ASIA PACIFIC       0
2020-05     EUROPE             97.2
2020-05     GLOBAL             97.21
Teradata 16.20 supports Time Series Table & Aggregations:
SELECT TO_CHAR(BEGIN(pd), 'YYYY-MM') AS YEARMONTH, orig_region_name,
(Cast(sum_net_volume AS DECIMAL(15,4))
/ NullIf(sum_monitored_volume ,0))*100 AS performance_percentage
FROM
(
SELECT
-- returns a date period
CAST($TD_TIMECODE_RANGE AS PERIOD(DATE)) AS pd,
orig_region_name,
Sum(net_volume) AS sum_net_volume,
Sum(monitored_vol) AS sum_monitored_volume
FROM TABLE
WHERE start_date BETWEEN DATE '2020-07-01' AND DATE '2021-09-30'
AND group_code IN ('230')
AND (filter_BILL_RGN IN ('AM','EU') OR (1=2) )
-- times series aggregation, one row per month/region
GROUP BY TIME (CAL_MONTHS(1) AND orig_region_name)
-- if the base table has no Primary Time Index TIMECODE must be specified
USING TIMECODE (start_date)
-- this creates the missing rows based on the date range in WHERE
FILL (0)
) A
Your percentage calculation can probably be simplified, too. Assuming the volumes are decimal/integer columns:
100.00 * sum_net_volume
/ NullIf(sum_monitored_volume ,0) AS performance_percentage
Rule of thumb: Multiply first, then divide.
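To make the rule of thumb concrete, here is a minimal, made-up illustration (the literals 45 and 47 stand in for hypothetical volume sums; the behaviour assumes integer division truncates, as it does in Teradata and several other databases):
SELECT 45 / NULLIF(47, 0) * 100    AS divide_first_pct,   -- integer division: 0 * 100 = 0
       100.00 * 45 / NULLIF(47, 0) AS multiply_first_pct; -- 4500.00 / 47 = roughly 95.74
Multiplying by the decimal literal 100.00 first promotes the whole expression to DECIMAL, so no precision is lost in the division.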
You want all regions with all months. This is a cross join to start with.
SELECT
yearmonths.yearmonth,
regions.orig_region_name,
CASE WHEN data.yearmonth IS NULL THEN 0 ELSE
(Cast(data.sum_net_volume AS DECIMAL(15,4)) /
NullIf(data.sum_monitored_volume, 0)) * 100
END AS performance_percentage
FROM
(
SELECT DISTINCT
Trim(Year(start_date)) || '-' || To_Char(start_date, 'MM') AS yearmonth
FROM TABLE
WHERE start_date BETWEEN '2020-07-01' AND '2021-09-30'
AND group_code IN ('230')
AND (filter_BILL_RGN IN ('AM','EU') OR (1=2))
) yearmonths
CROSS JOIN
(
(
SELECT DISTINCT
orig_region_name
FROM TABLE
WHERE start_date BETWEEN '2020-07-01' AND '2021-09-30'
AND group_code IN ('230')
AND (filter_BILL_RGN IN ('AM','EU') OR (1=2))
) regions
LEFT JOIN
(
SELECT
Trim(Year(start_date)) || '-' || To_Char(start_date, 'MM') AS yearmonth,
orig_region_name,
Sum(net_volume) AS sum_net_volume,
Sum(monitored_vol) AS sum_monitored_volume
FROM TABLE
WHERE start_date BETWEEN '2020-07-01' AND '2021-09-30'
AND group_code IN ('230')
AND (filter_BILL_RGN IN ('AM','EU') OR (1=2))
GROUP BY 1,2
) data ON data.yearmonth = yearmonths.yearmonth
AND data.orig_region_name = regions.orig_region_name
)
ORDER BY yearmonths.yearmonth, regions.orig_region_name;
Remove the conditions from yearmonths and regions if you don't want to restrict them to the months and regions that actually have data; see the sketch below.
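For example, the regions derived table without the data-driven filters could be as simple as this (a sketch only; TABLE is the same placeholder table name used above):
SELECT DISTINCT orig_region_name
FROM TABLE;
With that, every region appears in the cross join even if it has no rows in the selected date range.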
I'm working with a table in SAP Advantage with separate date and time fields. I want to find records with a date and time within the last 5 minutes.
This works 99% of the time:
SELECT
*
FROM
table_name
WHERE
TIMESTAMPDIFF(SQL_TSI_DAY, date_field, CURRENT_TIMESTAMP()) < 1
AND
TIMESTAMPDIFF(SQL_TSI_MINUTE, time_field, CURRENT_TIMESTAMP()) < 5
However, this won't work around midnight. For instance, at 12:00AM, any records created at 11:57PM the previous day won't match the filter.
Any ideas on how to do this? Thanks!
Sample data (from the image). Based on this data, at 12:01 AM on 7/12/19, I'd like to return the last two rows.
Created: 7/11/2019 22:54:43
Item EmpNo LastName FirstName date_field time_field
--------------------------------------------------------------------------
1 2 Nelson Roberto 7/11/2019 21:00:00
2 4 Young Bruce 7/11/2019 22:00:00
3 5 Lambert Kim 7/11/2019 23:00:00
4 8 Johnson Leslie 7/11/2019 23:56:00
5 9 Forest Phil 7/12/2019 00:00:00
The easiest way is to recombine the fields and then use TIMESTAMPDIFF():
TRY DROP TABLE #test; CATCH ALL END TRY;
CREATE TABLE #test
(
date_field DATE
, time_field TIME
);
INSERT INTO #test
SELECT '2019-07-11', '21:00:00' FROM system.iota
UNION SELECT '2019-07-11', '22:00:00' FROM system.iota
UNION SELECT '2019-07-11', '23:00:00' FROM system.iota
UNION SELECT '2019-07-11', '23:56:00' FROM system.iota
UNION SELECT '2019-07-12', '00:00:00' FROM system.iota
;
SELECT
TIMESTAMPDIFF(SQL_TSI_MINUTE,
CREATETIMESTAMP(
YEAR(date_field)
, MONTH(date_field)
, DAY(date_field)
, HOUR(time_field)
, MINUTE(time_field)
, SECOND(time_field)
, 0
)
, DATETIME'2019-07-12T00:00:00' -- CURRENT_TIMESTAMP()
)
FROM #test;
Which gives the expected result of:
180
120
4
0
It would be even simpler if ADS supported an operator or a function to combine a date and a time directly, but I can't find one in the documentation.
So if you integrate that into your original SQL code, it would be:
SELECT
*
FROM
table_name
WHERE
TIMESTAMPDIFF(SQL_TSI_MINUTE,
CREATETIMESTAMP(
YEAR(date_field)
, MONTH(date_field)
, DAY(date_field)
, HOUR(time_field)
, MINUTE(time_field)
, SECOND(time_field)
, 0
)
, CURRENT_TIMESTAMP()
) < 5
I have monthly targets defined for the different categories of items for the complete year.
Example:
January Target for A Category - 15,000
January Target for R Category - 10,000
January Target for O Category - 5,000
Actual Sales for A Category January - 18,400
Actual Sales for R Category January - 8,500
Actual Sales for O Category January - 3,821
The SQL query to compare actual sales with targets is simple, as follows:
SELECT TO_CHAR (Sales_Date, 'MM') Sales_Month,
Sales_Category,
SUM (Sales_Value) Sales_Val_Monthly,
Target_Month,
Target_Category,
Target_Value
FROM Sales_Data, Target_Data
WHERE TO_CHAR (Sales_Date, 'MM') = Target_Month
AND Sales_Category = Target_Category
GROUP BY TO_CHAR (Sales_Date, 'MM'),
Target_Month,
Target_Category,
Sales_Category,
Target_Value;
Now I have a requirement that the user will input FROM_DATE and TILL_DATE as report parameters. The start and end dates can be arbitrary and will not necessarily represent a complete month or week; for example, the start date could be 12/01/2018 and the end date 15/01/2018, i.e., data for 4 days. The result should calculate the actual data for those 4 days and prorate the target for 4 days, considering that there are 6 working days per week (Sunday is a holiday); if the date range includes a Sunday, it should not be counted.
Also, the number of days in each month should be considered, and the date parameters may contain some days from one month and some days from another, or may span more than one month.
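To make the proration rule concrete, here is a made-up example (all numbers are illustrative and are not taken from the tables below): if a category's January target is 15,000, January has 27 working days, and the requested range covers 4 of them, the prorated target is 15,000 * 4 / 27 ≈ 2,222.22. As a quick Oracle check:
-- Illustrative proration only: monthly_target * working_days_in_range / working_days_in_month
SELECT ROUND(15000 * 4 / 27, 2) AS prorated_target
FROM dual;   -- 2222.22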
Target_Table (Target_Data)
Target_Year Target_Month Target_Category Target_Value
2018 01 A 15000
2018 02 A 8500
2018 03 A 9500
2018 01 R 15000
2018 02 R 8500
2018 03 R 9500
2018 01 O 15000
2018 02 O 8500
2018 03 O 9500
Sales Table (Sales_Data)
Inv_Txn Inv_No Sales_Date Item_Code Sales_Category Qty Rate Sales_Value Inv_Locn Inv_SM_ID
A21 2018000001 02/01/2018 XXXX A 2 5.5 11 O001 XXXX
R32 2018000001 27/02/2018 XXXX R 3 9.5 28.5 O305 XXXX
O98 2018000001 12/03/2018 XXXX O 12 12.5 150 O901 XXXX
U76 2018000001 18/01/2018 XXXX A 98 5.5 539 O801 XXXX
B87 2018000001 19/02/2018 XXXX R 2 9.5 19 O005 XXXX
A21 2018000002 13/03/2018 XXXX R 45 9.5 427.5 O001 XXXX
B87 2018000002 14/03/2018 XXXX O 12 12.5 150 O005 XXXX
Desired Output (From Date: 27/02/2018 Till Date: 06/03/2018)
Target_Category Target_Value Sales_Value
A 87.52 21.88
A 96.25 24.06
A 74.25 18.56
R 100.25 25.06
R 800.2 200.05
R 25.1 6.28
O 75.5 18.88
O 98.1 24.53
O 25.5 6.38
The first step might be to see whether we can get the number of Sundays in a given month. As it turns out, we can - and we don't have to use any SQL tricks or PL/SQL:
SELECT EXTRACT( DAY FROM LAST_DAY(SYSDATE) ) AS month_day_cnt
, CEIL( ( LAST_DAY(TRUNC(SYSDATE, 'MONTH')) - NEXT_DAY(TRUNC(SYSDATE, 'MONTH')-1, 'SUN') + 1 ) / 7 ) AS sunday_cnt
FROM dual;
This will give us the number of days in a given month as well as the number of Sundays. All we need to do is subtract the latter number from the former to get the number of working days. We can work that into your initial query (by the way, I suggest using TRUNC() instead of TO_CHAR() since your users might want a date range that spans more than one calendar year):
SELECT TRUNC(s.Sales_Date, 'MONTH') AS Sales_Month
, EXTRACT( DAY FROM LAST_DAY( TRUNC(s.Sales_Date, 'MONTH') ) ) - CEIL( ( LAST_DAY(TRUNC(s.Sales_Date, 'MONTH')) - NEXT_DAY(TRUNC(s.Sales_Date, 'MONTH')-1, 'SUN') + 1 ) / 7 ) AS working_day_cnt
, s.Sales_Category, SUM(s.Sales_Value) AS Sales_Val_Monthly
, t.Target_Value -- Target_Month and Target_Category are superfluous
FROM Sales_Data s INNER JOIN Target_Data t
ON TO_CHAR(s.Sales_Date, 'MM') = t.Target_Month
AND TO_CHAR(s.Sales_Date, 'YYYY') = t.Target_Year
AND s.Sales_Category = t.Target_Category
GROUP BY TRUNC(s.Sales_Date, 'MONTH'), Sales_Category, Target_Value;
Now given a start date and an end date, we can generate the number of working days for all the months in between those dates as follows:
SELECT TRUNC(range_dt, 'MONTH'), COUNT(*) FROM (
SELECT start_dt + LEVEL - 1 AS range_dt
FROM dual
CONNECT BY start_dt + LEVEL - 1 < end_dt
) WHERE TO_CHAR(range_dt, 'DY') != 'SUN'
GROUP BY TRUNC(range_dt, 'MONTH');
where start_dt and end_dt are parameters supplied by the user. Putting this all together, we'll have something like the following:
WITH rd ( range_month, range_day_cnt ) AS (
SELECT TRUNC(range_dt, 'MONTH'), COUNT(*) FROM (
SELECT start_dt + LEVEL - 1 AS range_dt
FROM dual
CONNECT BY start_dt + LEVEL - 1 < end_dt
) WHERE TO_CHAR(range_dt, 'DY') != 'SUN'
GROUP BY TRUNC(range_dt, 'MONTH')
)
SELECT range_month, Sales_Category, Sales_Val_Monthly
, range_day_cnt, working_day_cnt, Target_Value
, Target_Value*range_day_cnt/working_day_cnt AS prorated_target_value
FROM (
SELECT rd.range_month, rd.range_day_cnt
, EXTRACT( DAY FROM LAST_DAY( TRUNC(s.Sales_Date, 'MONTH') ) ) - CEIL( ( LAST_DAY(TRUNC(s.Sales_Date, 'MONTH')) - NEXT_DAY(TRUNC(s.Sales_Date, 'MONTH')-1, 'SUN') + 1 ) / 7 ) AS working_day_cnt
, s.Sales_Category, SUM(s.Sales_Value) AS Sales_Val_Monthly
, t.Target_Value -- Target_Month and Target_Category are superfluous
FROM rd INNER JOIN Sales_Data s
ON rd.range_month = TRUNC(s.Sales_Date, 'MONTH')
INNER JOIN Target_Data t
ON TO_CHAR(s.Sales_Date, 'MM') = t.Target_Month
AND TO_CHAR(s.Sales_Date, 'YYYY') = t.Target_Year
AND s.Sales_Category = t.Target_Category
WHERE s.Sales_Date >= TRUNC(start_dt)
AND s.Sales_Date < TRUNC(end_dt+1)
GROUP BY rd.range_month, rd.range_day_cnt, s.Sales_Category, t.Target_Value
) ORDER BY range_month;
If you have a table of public holidays, then those will have to be factored in as well - both in the rd common table expression and in the calculation of working days. If the above doesn't give you a start on that, I can take another look and see how the holidays might be worked in.
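As a rough, hedged sketch (public_holidays and its holiday_date column are hypothetical names; start_dt and end_dt are the same user-supplied parameters as above), the rd CTE could exclude holidays like this:
WITH rd ( range_month, range_day_cnt ) AS (
    SELECT TRUNC(range_dt, 'MONTH'), COUNT(*) FROM (
        SELECT start_dt + LEVEL - 1 AS range_dt
        FROM dual
        CONNECT BY start_dt + LEVEL - 1 < end_dt
    )
    WHERE TO_CHAR(range_dt, 'DY') != 'SUN'
      AND NOT EXISTS ( SELECT 1
                       FROM public_holidays h     -- hypothetical holiday table
                       WHERE h.holiday_date = TRUNC(range_dt) )
    GROUP BY TRUNC(range_dt, 'MONTH')
)
SELECT * FROM rd;
The working_day_cnt expression in the main query would need the month's holidays subtracted in the same way.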
You can calculate the number of working days between two dates using the query below. I added a non-working date via a table named holiday_dates and generated a series of dates from 12/01/2018 to 15/01/2018, then removed the dates that are either a Sunday or a holiday. Please let me know if it works for you. Thanks.
create table holiday_dates(holiday_dte date, holiday_desc varchar(100));
insert into holiday_dates values(TO_DATE('13/01/2018','DD-MM-YYYY'), 'Not a Working Day');
With tmp as (
select count(*) as num_of_working_days
from ( select rownum as rn
from all_objects
where rownum <= to_date('15/01/2018','DD-MM-YYYY') - to_date('12/01/2018','DD-MM-YYYY')+1 )
where to_char( to_date('12/01/2018','DD-MM-YYYY')+rn-1, 'DY' ) not in ( 'SUN' )
and not exists ( select null from holiday_dates where holiday_dte = trunc(to_date('12/01/2018','DD-MM-YYYY') + rn - 1)))
SELECT TO_CHAR (Sales_Date, 'MM') Sales_Month,
Sales_Category,
SUM (Sales_Value) Sales_Val_Monthly,
Target_Month,
Target_Category,
Target_Value,
tmp.num_of_working_days
FROM Sales_Data, Target_Data, tmp
WHERE Sales_Date between to_date('12/01/2018','DD-MM-YYYY') and to_date('15/01/2018','DD-MM-YYYY')
AND TO_CHAR (Sales_Date, 'MM') = Target_Month
AND Sales_Category = Target_Category
GROUP BY TO_CHAR (Sales_Date, 'MM'),
Target_Month,
Target_Category,
Sales_Category,
Target_Value;
I have the following table Sales:
Date Store Sales
1/1/2015 St01 12123
1/1/2015 St02 3123
1/1/2016 St01 4213
1/1/2016 St03 2134
When I try to self-join to get this year's and last year's sales, the closed store does not show up.
The result should be like this:
Date Store This year Sales Last Year Sales
1/1/2016 St01 4213 12123
1/1/2016 St02 0 3123
1/1/2016 St03 2134 0
My query as follows:
SELECT CY.DATE,
CY.store, CY.Sales,
LY.sales
FROM sales CY,
sales LY
WHERE CY.store(+) = LY.store(+)
AND LY.DATE = CY.DATE - 365
Oracle Setup:
CREATE TABLE sales ( "DATE", Store, Sales ) AS
SELECT DATE '2015-01-01', 'St01', 12123 FROM DUAL UNION ALL
SELECT DATE '2015-01-01', 'St02', 3123 FROM DUAL UNION ALL
SELECT DATE '2016-01-01', 'St01', 4213 FROM DUAL UNION ALL
SELECT DATE '2016-01-01', 'St03', 2134 FROM DUAL;
Query:
SELECT TRUNC( SYSDATE, 'YY' ) AS "DATE",
Store,
SUM( CASE WHEN "DATE" = TRUNC( SYSDATE, 'YY' )
THEN sales END )
AS "This year sales",
SUM( CASE WHEN "DATE" = ADD_MONTHS( TRUNC( SYSDATE, 'YY' ), -12 )
THEN sales END )
AS "Last year sales"
FROM sales
GROUP BY store
ORDER BY store;
Output:
DATE STORE This year sales Last year sales
------------------- ----- --------------- ---------------
2016-01-01 00:00:00 St01 4213 12123
2016-01-01 00:00:00 St02 3123
2016-01-01 00:00:00 St03 2134
What you need is called pivoting. Although Oracle has specific clauses to do it, you can use just plain, portable SQL, like this:
SELECT store,
SUM(CASE WHEN Extract(year FROM DATE) = Extract(year FROM SYSDATE) THEN
sales
ELSE 0
END) AS "This year Sales",
SUM(CASE WHEN Extract(year FROM DATE) = Extract(year FROM SYSDATE) - 1 THEN
sales
ELSE 0
END) AS "Last year Sales"
FROM sales
WHERE Extract(year FROM DATE) >= Extract(year FROM SYSDATE) - 1
GROUP BY store
ORDER BY store
It would show:
Store This year Sales Last year Sales
St01 4213 12123
St02 0 3123
St03 2134 0
Note that it makes no sense to have the date column as the first column; you couldn't group by it and still show the output you want.
See the equivalent of this query here on fiddle: http://sqlfiddle.com/#!15/7662d8/6
Since I want the query to return day-by-day sales, I used MT0's answer and added the dates; this way I can get the data for all days of the year.
WITH AllYear AS
(select to_date('2016-01-01', 'yyyy-mm-dd') + level - 1 AS dobs
from dual
connect by level <= 366)
SELECT dobs AS "DATE",
Store,
nvl(SUM(CASE
WHEN t.Date = dobs THEN
t.sales
END),
0) AS "This Year Sales",
nvl(SUM(CASE
WHEN t.Date = dobs-365 THEN
t.sales
END),
0) AS "Last Year Sales"
FROM Sales t,AllYear
where dobs='01-Jan-2016'
GROUP BY dobs, Store
ORDER BY Store;
The general solution is a full outer join, which includes all records from both joined tables. I don't know the Oracle syntax, but in MS SQL Server it would be something like this:
SELECT ISNULL(CY.DATE, LY.DATE) as DATE,
ISNULL(CY.store, LY.store) as STORE,
isnull(cy.Sales, 0),
isnull(LY.sales, 0)
FROM sales CY FULL OUTER JOIN sales LY
ON CY.store = LY.store
AND (CY.DATE IS NULL OR
DATEPART(year, LY.DATE) = DATEPART(year, CY.DATE) - 1)
ISNULL(a, b) gives a if a IS NOT NULL, else b. DATEPART extracts the specified part of a date; I'm comparing a difference of exactly one year rather than 365 days, in case "last year" is a leap year.
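For the Oracle side of the question, a rough, untested sketch of the same full outer join idea (NVL plays the role of ISNULL, EXTRACT the role of DATEPART, and the DATE column is quoted because DATE is a reserved word in Oracle):
-- Sketch only: table and column names taken from the question's sales table.
SELECT NVL(cy."DATE", ADD_MONTHS(ly."DATE", 12)) AS sales_date,
       NVL(cy.store, ly.store)                   AS store,
       NVL(cy.sales, 0)                          AS this_year_sales,
       NVL(ly.sales, 0)                          AS last_year_sales
FROM   sales cy
FULL OUTER JOIN sales ly
       ON  cy.store = ly.store
       AND EXTRACT(YEAR FROM ly."DATE") = EXTRACT(YEAR FROM cy."DATE") - 1
-- restrict to the reporting year of interest (2016 in the question)
WHERE  NVL(cy."DATE", ADD_MONTHS(ly."DATE", 12)) = DATE '2016-01-01';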
I only work with SQL Server. If anything is different, try to apply the same logic.
Declaring a table variable to test the query:
DECLARE @Sales TABLE (
[Date] DATE,
Store NVARCHAR(10),
Sales INT
)
INSERT INTO @Sales VALUES
('1/1/2015','St01',12123),
('1/1/2015','St02',3123),
('1/1/2016','St01',4213),
('1/1/2016','St03',2134);
SELECT * FROM @Sales;
The actual query:
SELECT
CY_Date = CASE
WHEN CY.Date IS NULL THEN DATEADD(YEAR, 1, LY.Date)
ELSE CY.Date
END,
LY_Date = CASE
WHEN LY.Date IS NULL THEN DATEADD(YEAR, -1, CY.Date)
ELSE LY.Date
END,
Store = CASE
WHEN CY.Store IS NULL THEN LY.Store
ELSE CY.Store
END,
ISNULL(CY.Sales, 0) AS CY_Sales,
ISNULL(LY.Sales, 0) AS LY_Sales
FROM @Sales CY
FULL JOIN @Sales LY ON (CY.Store = LY.Store AND LY.Date = DATEADD(YEAR, -1, CY.Date))
WHERE (CY.Date = '1/1/2016' OR CY.Date IS NULL)
AND (LY.Date = DATEADD(YEAR, -1, '1/1/2016') OR LY.Date IS NULL);
Result:
CY_Date LY_Date Store CY_Sales LY_Sales
2016-01-01 2015-01-01 St01 4213 12123
2016-01-01 2015-01-01 St03 2134 0
2016-01-01 2015-01-01 St02 0 3123
How it works:
The FULL JOIN combines, by Store, the rows for the current year and the year before.
The WHERE clause filters by the current date, '1/1/2016'. NULLs are allowed because sometimes there are no rows for the current year or for the last year.
In the column list, CASE expressions fill in the dates when they are NULL (if the current-year date is NULL, take the last-year date plus one year, and vice versa), fill in the store when it is NULL, and return zero instead of NULL in the sales columns.
I have seen a few questions on Stack Overflow that pertain to comparing rows, but nothing quite like this one. I have a table with columns similar to:
- Month (Date, e.g. 01-Jan-2013)
- Country (Varchar2)
- SubCustomer (Varchar2)
- FTE (Number) - a value that is manually entered by employees each month
We may have some data such as this..
- 01-Jan-2013 USA Customer1 10
- 01-Feb-2013 USA Customer1 15
- 01-Mar-2013 USA Customer1 30
- 01-Jan-2013 BRA Customer2 100
- 01-Feb-2013 BRA Customer2 300
- 01-Mar-2013 BRA Customer2 50
My goal is to compare the FTE that is entered and provide an alert in a separate column, such as 'High Alert', 'Low Alert', or 'Ok', if the entered FTE is +/- 2x the previous month's value for each Month + Country + SubCustomer. I've been playing with different CASE statements, but I cannot seem to get the month-to-month comparison to work.
Using the data above, Customer1 would produce a "High Alert" from Feb to Mar, and Customer2 would produce a high alert from Jan to Feb and a "Low Alert" from Feb to Mar.
Sounds like you should use LAG / LEAD.
SELECT Month, Country, SubCustomer, FTE
, LAG(FTE, 1, 0) OVER (PARTITION BY Country, SubCustomer ORDER BY Month) as PrevFTE
FROM MyTable
You may find some good examples here:
http://www.oracle-base.com/articles/misc/lag-lead-analytic-functions.php
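Building on that, here is a hedged sketch of how the alert column could be derived from PrevFTE (the doubled/halved thresholds follow the question's "+/- 2x" rule; MyTable and the column names are the placeholders from the query above, and LAG is used without a default so the first month can be told apart):
SELECT Month, Country, SubCustomer, FTE, PrevFTE,
       CASE
          WHEN PrevFTE IS NULL OR PrevFTE = 0 THEN 'Ok'          -- nothing meaningful to compare against
          WHEN FTE >= 2 * PrevFTE             THEN 'High Alert'  -- at least doubled
          WHEN FTE <= PrevFTE / 2             THEN 'Low Alert'   -- at most halved
          ELSE 'Ok'
       END AS Alert
FROM (
    SELECT Month, Country, SubCustomer, FTE,
           LAG(FTE, 1) OVER (PARTITION BY Country, SubCustomer ORDER BY Month) AS PrevFTE
    FROM MyTable
);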
Try this (it also handles the possibility of a NULL or zero previous FTE):
WITH TABLEDATA AS (
    SELECT TO_DATE('01-Jan-2013', 'DD-MON-YYYY') AS MONT, 'USA' AS COUNTRY, 'Customer1' AS SUBCUST, 0 AS FTE FROM DUAL UNION ALL
    SELECT TO_DATE('01-Feb-2013', 'DD-MON-YYYY'), 'USA', 'Customer1',  15 FROM DUAL UNION ALL
    SELECT TO_DATE('01-Mar-2013', 'DD-MON-YYYY'), 'USA', 'Customer1',  30 FROM DUAL UNION ALL
    SELECT TO_DATE('01-Jan-2013', 'DD-MON-YYYY'), 'BRA', 'Customer2', 100 FROM DUAL UNION ALL
    SELECT TO_DATE('01-Feb-2013', 'DD-MON-YYYY'), 'BRA', 'Customer2', 300 FROM DUAL UNION ALL
    SELECT TO_DATE('01-Mar-2013', 'DD-MON-YYYY'), 'BRA', 'Customer2',  50 FROM DUAL
)
SELECT MONT,
       COUNTRY,
       SUBCUST,
       FTE,
       ROUND( (FTE - PREVIOUS_FTE) / NULLIF(PREVIOUS_FTE, 0), 2 ) * 100 AS CHANGE_IN_PERCENT,
       CASE
          WHEN ROUND( (FTE - PREVIOUS_FTE) / NULLIF(PREVIOUS_FTE, 0), 2 ) * 100 >= 100
          THEN 'High Alert'
          WHEN ROUND( (FTE - PREVIOUS_FTE) / NULLIF(PREVIOUS_FTE, 0), 2 ) * 100 <= -50
          THEN 'Low Alert'
          ELSE 'OK'
       END AS ALERTS
FROM (SELECT MONT,
             COUNTRY,
             SUBCUST,
             FTE,
             LAG(FTE, 1) OVER (PARTITION BY COUNTRY, SUBCUST ORDER BY MONT) AS PREVIOUS_FTE
      FROM TABLEDATA);
Results:
MONT COUNTRY SUBCUST FTE CHANGE_IN_PERCENT ALERTS
1/1/2013 BRA Customer2 100 OK
2/1/2013 BRA Customer2 300 200 High Alert
3/1/2013 BRA Customer2 50 -83 Low Alert
1/1/2013 USA Customer1 0 OK
2/1/2013 USA Customer1 15 OK
3/1/2013 USA Customer1 30 100 High Alert