SQL SELECT for Cursor using an NVL

I am trying to modify the SELECT statement used in an employee interface file because it is returning an error when it passes an employee whose END_DATE is... well, 'dated'.
I am pretty new to this, so in order to skip Oracle users that are end-dated, I feel that using the NVL expression would be best here. However, after researching it, I am just not grasping the concept too well, and I'm unsure if it is what I should be using.
Here is the statement:
FROM oaa.xxap_employee_interim ei,
hr.per_all_people_f ppf,
hr.per_all_assignments_f paf,
hr.per_addresses pa,
apps.fnd_user u
WHERE new_update = 'U'
AND ppf.person_type_id = 6
AND ei.employee_number = ppf.employee_number
AND ppf.effective_end_date >= SYSDATE
AND ppf.person_id = paf.person_id
AND paf.effective_end_date >= SYSDATE
AND ppf.person_id = pa.person_id
AND u.employee_id = ppf.person_id
--AND NVL(u.end_date, SYSDATE + 5);
A co-worker also suggested that using 'IS NULL' would not be correct in this circumstance. Is that true? If so, why?
Thanks in advance.

I'm going to guess that you want to check whether u.end_date is not null and is a date in the past. In that case I would use the following:
AND COALESCE(u.end_date, SYSDATE) < SYSDATE
The Oracle-specific way to do that is:
AND NVL(u.end_date, SYSDATE) < SYSDATE
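If instead the goal is to keep only users who are not yet end-dated (i.e. skip end-dated users, as the question describes), one common sketch is to treat a NULL end_date as "still active" by defaulting it to a future date:
-- assumption: a NULL u.end_date means the user is still active
AND NVL(u.end_date, SYSDATE + 1) > SYSDATE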


Is there a way to define a named constant/parameter in a single-statement SQL query?

Tried to do something like:
WITH
dates as (SELECT '2015-01-01' as start, '2016-01-01' as end)
SELECT * FROM my_table WHERE start_date >= dates.start AND end_date <= dates.end
But I got the error message "Relation 'dates' does not exist" (in Vertica). Is there any proper way to define a constant/parameter? In the real example the query contains multiple SELECTs over a defined time range, hence I'd like to maintain the values as constants/parameters in a single place so they can be reused in the nested subqueries.
If possible, I'd like to refrain from DECLARE/SET-like statements, where a separate line is required.
I would still do what @GordonLinoff mentioned (I doubt it has much impact on the query plan), but in case you really don't want to, or in case you like the feature I will show at the end...
You mentioned you were going to use vsql. You can do variables there.
\set start '2015-01-01'
\set end '2016-01-01'
SELECT *
FROM my_table
WHERE start_date >= :start
AND end_date <= :end;
Which is I guess kind of nice because you can also do things like:
\set start ''`date "+%Y-%m-%d %H:%M:%S"`''
Or echo the result of any command into a variable that you can use as a variable in your sql statement. Note that this variable is quite literal and you need to include any punctuation, quoting, etc. It isn't just a value, it's more like a template.
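For example, since the quoting has to live in the variable itself, you can embed the quotes when you set it (assuming vsql follows psql's \set quoting rules, which the backtick example above suggests):
\set start '''2015-01-01'''
\set end '''2016-01-01'''
SELECT * FROM my_table WHERE start_date >= :start AND end_date <= :end;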
You need to have dates in the FROM clause if you want it in the query. You can do this as:
WITH dates as (SELECT '2015-01-01' as start, '2016-01-01' as end)
SELECT t.*
FROM my_table t JOIN
dates d
ON t.start_date >= d.start AND t.end_date <= d.end;
Note: You can also do this with a CROSS JOIN. I often write queries as:
WITH params as (
SELECT '2015-01-01' as start, '2016-01-01' as end
)
SELECT t.*
FROM params CROSS JOIN
my_table t
WHERE t.start_date >= params.start AND t.end_date <= params.end;
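One caveat worth checking with either version: end (and in some dialects start) is a reserved word, so the aliases may need renaming or quoting. A sketch with renamed aliases (start_dt/end_dt are just illustrative names):
WITH params as (
SELECT '2015-01-01' as start_dt, '2016-01-01' as end_dt
)
SELECT t.*
FROM params CROSS JOIN
my_table t
WHERE t.start_date >= params.start_dt AND t.end_date <= params.end_dt;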

Start a Case Statement without an outside 'Select' - CTE's

Running this query (below) returns a 'Too Many Values' error:
select
case
when to_char(sysdate, 'yyyymmdd') = to_char(sysdate, 'yyyymm') || '01'
then (select FirstReportGroups.*, FirstReportDetails.*
from FirstReportGroups, FirstReportDetails)
when to_char(sysdate, 'yyyymmdd') = to_char(sysdate, 'yyyymm') || '16'
then (select SecondReportGroups.*, SecondReportDetails.*
from SecondReportGroups, SecondReportDetails)
end as LetsSee
from cmtmpentered t1 join cmtmpconf t2
on t1.casenumber = t2.casenumber
and t1.enc = t2.enc
;
I'm using CTEs (they are not included because it would make this very long) and it makes logical sense to me, but googling this 'Too Many Values' error gives me no substantial help. Running the CTEs individually works, so that is not the problem.
I think all would be solved if I could only get rid of the outside 'Select' statement and just keep the selects inside the Case. If I'm explaining this poorly, an example of what I'm looking for is this:
case
when to_char(sysdate, 'yyyymmdd') = to_char(sysdate, 'yyyymm') || '01'
then (select FirstReportGroups.*, FirstReportDetails.*
from FirstReportGroups, FirstReportDetails)
when to_char(sysdate, 'yyyymmdd') = to_char(sysdate, 'yyyymm') || '16'
then (select SecondReportGroups.*, SecondReportDetails.*
from SecondReportGroups, SecondReportDetails)
end as LetsSee
;
Is this doable in any capacity? This syntax obviously doesn't work, otherwise I wouldn't be asking the question - but is there a related way I can do this?
select FirstReportGroups.*, FirstReportDetails.*
from (select 1 a from dual) FirstReportGroups
cross join (select 2 b from dual) FirstReportDetails
where extract(day from sysdate) = 1
---------
union all
---------
select SecondReportGroups.*, SecondReportDetails.*
from (select 3 a from dual) SecondReportGroups
cross join (select 4 b from dual) SecondReportDetails
where extract(day from sysdate) = 16;
Replaced common table expressions with inline views. CTEs should only be used if they are referenced more than once. They may look a little nicer with small examples and for programmers only used to procedural code. Serious SQL requires multiple nested levels of inline views. And debugging is much easier without CTEs - the CTEs make it difficult to highlight and run sub-blocks of code.
Replaced case expression with predicate to filter by date. A CASE expression can only return a single value (see the small illustration below). You could probably do something fancy with types, but that would be horribly complicated. My code still makes the assumption that the two sets return the same types of values. If that's not true you'll need to do something different at the application level.
Replaced to_char with extract. Date handling can be confusing in Oracle. The trick is to keep everything in their native type. You should almost never need to use to_char or to_date for anything other than formatting for display. For any other operation there's almost certainly a function to handle it without converting types.
Replaced , with ANSI-syntax cross join. The ANSI-style syntax has several advantages. One major advantage is it makes it clear when cross joins are intentional.
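As a minimal illustration of the single-value limitation mentioned above (using only DUAL, no real tables): a scalar subquery inside CASE may return exactly one column and one row, so selecting two columns there is what raises ORA-00913 "too many values".
select case
when 1 = 1 then (select dummy from dual)        -- OK: one column, one row
--when 1 = 1 then (select dummy, 2 from dual)   -- ORA-00913: too many values
end as LetsSee
from dual;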
You aren't specifying a join condition for LetsSee. You are effectively cross joining LetsSee with the result of your from clause:
from cmtmpentered t1
join cmtmpconf t2 on t1.casenumber = t2.casenumber and t1.enc = t2.enc
I suggest you join LetsSee with t1 or t2.
Also, you are not specifying a join condition for the pairs:
FirstReportGroups and FirstReportDetails
SecondReportGroups and SecondReportDetails
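A purely hypothetical sketch of what explicit join conditions could look like (the linking columns casenumber and group_id are assumptions; substitute whatever actually relates your CTEs to t1/t2):
select g.*, d.*
from cmtmpentered t1
join cmtmpconf t2
on t1.casenumber = t2.casenumber and t1.enc = t2.enc
join FirstReportGroups g
on g.casenumber = t1.casenumber   -- assumed linking column
join FirstReportDetails d
on d.group_id = g.group_id;       -- assumed linking column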

SQL: conditional SELECT

I use the following SQL query to retrieve all entries that are older than payment_target days.
SELECT * from orders WHERE(EXTRACT(EPOCH FROM age(CURRENT_DATE,
date_trunc('day', orders.created_at))) - orders.payment_target*86400 >= 0)
However, I want to modify this query so that orders.invoice_sent_at is used as the calculation basis if it is not null. Otherwise, orders.created_at should be used.
I tried it with the following query but I guess it is more pseudo code than valid SQL. I don't know how I can set the attribute to be used in the statement block.
SELECT * from orders where (EXTRACT(EPOCH FROM age(CURRENT_DATE,
date_trunc('day', IF orders.invoice_sent_at IS NOT NULL
BEGIN orders.invoice_sent_at END
ELSE
BEGIN orders.created_at END ))) - orders.payment_target*86400 >= 0)
Instead of IF you can use COALESCE, which works on every database supporting ANSI SQL:
COALESCE(orders.invoice_sent_at, orders.created_at)
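Plugged into the original query (a sketch, assuming PostgreSQL given age() and date_trunc()):
SELECT *
FROM orders
WHERE EXTRACT(EPOCH FROM age(CURRENT_DATE,
      date_trunc('day', COALESCE(orders.invoice_sent_at, orders.created_at))))
      - orders.payment_target*86400 >= 0;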
If I understand your question correctly, you can use a declared variable.
This will also make your code more readable. Something along the lines of:
DECLARE @DUEDATE DATETIME
SET @DUEDATE = ISNULL(("GET THE VALUE FROM orders.invoice_sent_at"), ("GET THE VALUE FROM orders.created_at"))
Then use @DUEDATE in your SELECT.
SQL Fiddle
SELECT *
from orders
where CURRENT_DATE - PaymentTarget > coalesce( InvoiceSent, Created );
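Mapping that back to the original column names (a sketch; this assumes payment_target is an integer number of days, and plain date subtraction counts calendar days, which can differ slightly from the age()-based expression over long periods):
SELECT *
FROM orders
WHERE CURRENT_DATE - COALESCE(orders.invoice_sent_at, orders.created_at)::date >= orders.payment_target;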

How to reduce query execution time for table with huge data

I am running this query in production (Oracle) and it is taking more than 3 minutes. Is there any way to reduce the execution time? Both the svc_order and event tables contain almost 1 million records.
select 0 test_section, count(1) count, 'DD' test_section_value
from svc_order so, event e
where so.svc_order_id = e.svc_order_id
and so.entered_date >= to_date('01/01/2012', 'MM/DD/YYYY')
and e.event_type = 230 and e.event_level = 'O'
and e.current_sched_date between
to_date( '09/01/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
and to_date('09/29/2013 23:59:59', 'MM/DD/YYYY HH24:MI:SS')
and (((so.sots_ta = 'N') and (so.action_type = 0))
or ((so.sots_ta is null) and (so.action_type = 0))
or ((so.sots_ta = 'N') and (so.action_type is null)))
and so.company_code = 'LL'
Given that you said you cannot create indexes, I assume the query is doing full table scans. Please try a parallel hint:
select /*+ full(so) parallel(so, 4) */ 0 test_section, count(1) count, 'DD' test_section_value
from svc_order so, event e
where so.svc_order_id = e.svc_order_id
and so.entered_date >= to_date('01/01/2012', 'MM/DD/YYYY')
and e.event_type = 230 and e.event_level = 'O'
and e.current_sched_date between
to_date( '09/01/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
and to_date('09/29/2013 23:59:59', 'MM/DD/YYYY HH24:MI:SS')
and (((so.sots_ta = 'N') and (so.action_type = 0))
or ((so.sots_ta is null) and (so.action_type = 0))
or ((so.sots_ta = 'N') and (so.action_type is null)))
and so.company_code = 'LL'
You could at least avoid the triple AND/OR list by using COALESCE() (or its Oracle equivalent, NVL()). Note: unlike the original triple OR, this version also matches rows where both sots_ta and action_type are NULL.
SELECT 0 test_section, count(1) count, 'DD' test_section_value
FROM svc_order so
JOIN event e ON so.svc_order_id = e.svc_order_id
WHERE e.event_type = 230 and e.event_level = 'O'
AND so.entered_date >= to_date('01/01/2012', 'MM/DD/YYYY')
AND e.current_sched_date >= to_date('09/01/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
AND e.current_sched_date < to_date('10/01/2013 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
AND COALESCE(so.sots_ta, 'N') = 'N'
AND COALESCE(so.action_type, 0) = 0
AND so.company_code = 'LL'
I replaced the BETWEEN with a plain t >= low AND t < high test because I don't like BETWEEN's semantics. I replaced the comma-separated FROM list with a JOIN because I like joins better.
We cannot add indexes, but a table must at least have a meaningful primary key, right? So is there one? That should give us at least one index, clustered or not, to work with.
Look at what is there and try to make use of it.
If a table is a heap and we have to deal with it as it is, then we should reduce the number of rows in each table individually by applying the respective WHERE filters, and only then combine the result sets.
In your query the only result column that depends on the base tables is count(1); the other two columns are constants.
A JOIN/Cartesian product will also lead the engine to look for indexes, so instead use INTERSECT, which I feel should work better in your case.
Some other changes you can make:
Avoid using TO_DATE, or any other function, on the right-hand side of a WHERE condition. Prepare the value in a local variable and use that variable in the query.
Also check whether there is any real performance gain from using >= rather than BETWEEN.
I have modified the query and also combined one redundant WHERE condition.
Remember that even if these changes work for you right now, that doesn't mean they always will. As soon as the tables accumulate more data that qualifies for those WHERE conditions, this will again become a slow query. So this may help in the short term, but for the longer term you have to think about alternatives, for example:
1) indexed (materialized) views on top of these tables, or
2) creating copies of the tables under different names and keeping the data in sync with insert/update/delete triggers.
SELECT COUNT(1) AS "COUNT", 'DD' test_section_value, 0 test_section
FROM
(
SELECT so.svc_order_id
FROM svc_order so
WHERE so.entered_date >= to_date('01/01/2012', 'MM/DD/YYYY')
AND so.company_code = 'LL'
AND (
(( so.sots_ta = 'N' ) AND ( so.action_type IS NULL OR so.action_type = 0))
OR
(( so.sots_ta IS NULL ) AND ( so.action_type = 0 ))
--or ((so.sots_ta = 'N') and (so.action_type is null))
)
INTERSECT
SELECT e.svc_order_id
FROM event e
WHERE e.event_type = 230
AND e.event_level = 'O'
AND e.current_sched_date BETWEEN
to_date('09/01/2010 00:00:00','MM/DD/YYYY HH24:MI:SS')
AND to_date('09/29/2013 23:59:59','MM/DD/YYYY HH24:MI:SS')
) qry1
First, ensure statistics are up-to-date.
begin
dbms_stats.gather_table_stats('[schema]', 'svc_order');
dbms_stats.gather_table_stats('[schema]', 'event');
end;
/
This query is a very simple join between two small tables, but with complex predicates.
You almost certainly do not want to significantly re-write all of your queries in search of some magic syntax that will make everything run fast. Yes, there are some rare cases where BETWEEN does not work well, or moving the predicates into an inline view helps, or replacing the join with an INTERSECT might help. But that sounds like cargo-cult programming to me. Ask yourself, why would those changes make any difference? If those types of changes always improved performance, why wouldn't Oracle just translate the queries internally?
Normally, you should try to provide better information to the optimizer so it can make better decisions. Usually this is as simple as gathering statistics with the default settings. Some predicates are just too complex, and for that you should try to use
dynamic sampling, such as /*+ dynamic_sampling(6) */. Or maybe
add some histograms. Or perhaps add an expression statistic like this:
SELECT
DBMS_STATS.CREATE_EXTENDED_STATS(null, 'SVC_ORDER',
'(case when ((sots_ta = ''N'' and action_type = 0)
or (sots_ta is null and action_type = 0)
or (sots_ta = ''N'' and action_type is null)) then 1 else 0 end)'
)
FROM DUAL;
--Don't forget to re-gather statistics after this.
The optimizer is probably under-estimating the number of rows, and using a nested loop instead of a hash join. After providing it with more information, ideally it will start using a hash join. But at some point, after you've tried the above methods and possibly many other features, you can just tell it what kind of join to use, which would be @Florin Ghita's suggestion, /*+ use_hash(so e) */.
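For reference, a sketch of where such a hint goes in the original query (whether it actually helps depends on your data and the resulting plan, so treat it as something to test, not a fix):
select /*+ use_hash(so e) */   -- or, e.g., /*+ dynamic_sampling(6) */ as mentioned above
0 test_section, count(1) count, 'DD' test_section_value
from svc_order so, event e
where so.svc_order_id = e.svc_order_id
and so.entered_date >= to_date('01/01/2012', 'MM/DD/YYYY')
and e.event_type = 230 and e.event_level = 'O'
and e.current_sched_date between
to_date('09/01/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
and to_date('09/29/2013 23:59:59', 'MM/DD/YYYY HH24:MI:SS')
and (((so.sots_ta = 'N') and (so.action_type = 0))
or ((so.sots_ta is null) and (so.action_type = 0))
or ((so.sots_ta = 'N') and (so.action_type is null)))
and so.company_code = 'LL';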

SQL date Statement

I need some help figuring out an SQL statement.
I know what I want, I just can't express it.
I'm using PHP, so it doesn't need to be exclusively SQL; it's to act as a filter.
Pseudo code
$query="SELECT * FROM MyTable WHERE 'TIS' is not older than 2 days or empty = ''$ORDER"; }
TIS is the name of the column in my table where I store dates in this format: 03-12-09 (d,m,y).
The $ORDER is for ordering the values based on values from other fields not dates.
I'm looking at:
SELECT *
FROM orders
WHERE day_of_order >
(SELECT DATEADD(day,-30, (SELECT MAX(day_of_order) FROM orders)) AS "-30 Days");
But I don't quite think I'm on the right track with this.
Thanks
Try the following:
SELECT *
FROM MyTable
WHERE COALESCE(TIS, SYSDATE) > SYSDATE - INTERVAL '2' DAY
$ORDER
I don't know what database you're using - the above uses Oracle's method of dealing with time intervals. If you're using SQL Server the following should be close:
SELECT *
FROM MyTable
WHERE COALESCE(TIS, GETDATE()) > DATEADD(Day, -2, GETDATE())
$ORDER
In MySQL try this:
SELECT *
FROM MyTable
WHERE COALESCE(TIS, NOW()) > DATE_SUB(NOW(), INTERVAL 2 DAY)
$ORDER
I hope this helps.
So, I was pretty lost in all this.
How it got solved:
First, I understood that the statement I was using was not supported by MySQL, thanks to enlightenment from Bob Jarvis.
Second, in a comment, vincebowdren "strongly" advised me to change the data type of that field to DATE, which indeed I had not done; it was a string.
It was pretty dumb of me to try using SQL date operations on a field that held string values.
So I just RTFM: http://dev.mysql.com/doc/refman/5.1/en/date-and-time-functions.html
and:
mysql> SELECT something FROM tbl_name
-> WHERE DATE_SUB(CURDATE(),INTERVAL 30 DAY) <= date_col;
Then I proceeded to change the field's type to DATE.
and this is my perfectly working query:
$query="SELECT * FROM MyTable WHERE DATE_SUB(CURDATE(),INTERVAL 2 DAY) <= TIS OR TIS = 0000-00-00 $ORDER "; }
I would like to thank the posters for their aid.