Take the following query...
select
trunc(dateX)-trunc(sysdate) daysTilX,
trunc(dateY)-trunc(sysdate) daysTilY,
least(trunc(dateX)-trunc(sysdate),trunc(dateY)-trunc(sysdate)) leastOfTheTwo
from myTable
If dateX or dateY is null then least() returns null. I need to figure out how to have the leastOfTheTwo column return null only if both dateX and dateY are null; otherwise, I want the number. Any ideas?
UPDATE To be clear, I cannot nvl the dates to a fixed value because they represent due dates: -1 means one day late, 0 today, 1 tomorrow, and null means neither due date was ever set.
One approach is to swap in the other date when one is null, so least() only gets a null when both dates are null:
select
trunc(dateX)-trunc(sysdate) daysTilX,
trunc(dateY)-trunc(sysdate) daysTilY,
least(trunc(nvl(dateX, dateY))-trunc(sysdate),trunc(nvl(dateY, dateX))-trunc(sysdate)) leastOfTheTwo
from myTable
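A minimal sketch of that behaviour (Oracle; the sample_dates CTE and its rows are made up to cover the three cases: both dates set, one null, both null):
-- hypothetical sample rows covering the three cases
with sample_dates as (
  select sysdate + 3 as dateX, sysdate + 5 as dateY from dual union all
  select sysdate + 3,          null                 from dual union all
  select null,                 null                 from dual
)
select
trunc(dateX)-trunc(sysdate) daysTilX,
trunc(dateY)-trunc(sysdate) daysTilY,
least(trunc(nvl(dateX, dateY))-trunc(sysdate),trunc(nvl(dateY, dateX))-trunc(sysdate)) leastOfTheTwo
from sample_dates;
-- expected: row 1 -> 3, row 2 -> 3, row 3 -> null (null only when both dates are null)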
Another suggestion is to default each missing difference to 0, though note that this conflicts with the update above (0 already means due today):
select
trunc(dateX)-trunc(sysdate) daysTilX,
trunc(dateY)-trunc(sysdate) daysTilY,
least(nvl(trunc(dateX)-trunc(sysdate),0),nvl(trunc(dateY)-trunc(sysdate),0)) leastOfTheTwo
from myTable
Related
I use a CTE to calculate spans of time in a log as shown in this fiddle:
http://www.sqlfiddle.com/#!3/b99448/6
Note that one of the rows has a NULL value because that is the most recent log entry and no calculation could be made.
However, if I SUM these results the NULL is being treated as a zero:
http://www.sqlfiddle.com/#!3/b99448/4
How can I get this to stop ignoring NULL values?
I would expect the sum to be NULL since it is adding a NULL value.
The aggregation functions ignore NULL values. They are not treated as 0 -- the distinction is more important for AVG(), MIN(), and MAX(). So, SUM() only returns NULL when all values are NULL.
If you want to get NULL back, here is a simple expression:
select (case when count(*) = count(a.DateTimeChangedUtc) and
count(*) = count(b.DateTimeChangedUTC)
then SUM(DATEDIFF(SECOND, a.DateTimeChangedUtc, b.DateTimeChangedUTC))
end) AS TimeSpentSeconds
This returns NULL if either argument is ever NULL.
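A minimal, self-contained illustration of the same guard (T-SQL; the inline value list and the column name v are made up):
-- COUNT(*) counts every row, COUNT(v) skips NULLs, so the guard spots any NULL
SELECT CASE WHEN COUNT(*) = COUNT(v) THEN SUM(v) END AS guarded_sum
FROM (VALUES (10), (20), (NULL)) AS t(v);
-- returns NULL because one v is NULL; with no NULLs it would return the sum (30)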
Could someone explain the internal behaviour of the SUM function in Oracle when it encounters null values?
The result of
select sum(null) from dual;
is null
But when a null value appears among other values (like summing a nullable column), the null seems to be treated as 0:
select sum(value) from
(
select case when mod(level , 2) = 0 then null else level end as value from dual
connect by level <= 10
)
is 25 (the odd levels 1 + 3 + 5 + 7 + 9).
It gets more interesting when you see that the result of
select (1 + null) from dual
is null
This is because any operation involving null results in null (except for the IS NULL operator).
==========================
An update prompted by the comments:
create table odd_table as select sum(null) as some_name from dual;
Will result:
create table ODD_TABLE
(
some_name NUMBER
)
Why is the some_name column of type NUMBER?
If you are looking for a rationale for this behaviour, then it is to be found in the ANSI SQL standards which dictate that aggregate operators ignore NULL values.
If you wanted to override that behaviour then you're free to:
Sum(Coalesce(<expression>,0))
... although it would make more sense with Sum() to ...
Coalesce(Sum(<expression>),0)
You might more meaningfully use:
Avg(Coalesce(<expression>,0))
... or ...
Min(Coalesce(<expression>,0))
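A small sketch of why the placement matters more for Avg() than for Sum() (Oracle; the inline values and the column name val are made up):
-- values: 10, NULL, 20
select avg(val)              as avg_ignoring_nulls,   -- 15: the null row is excluded from the divisor
       avg(coalesce(val, 0)) as avg_nulls_as_zero,    -- 10: the null row counts as 0
       sum(val)              as sum_ignoring_nulls,   -- 30
       sum(coalesce(val, 0)) as sum_nulls_as_zero     -- 30: same result when at least one value exists
from (
  select 10 as val from dual union all
  select null      from dual union all
  select 20        from dual
);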
Other ANSI aggregation quirks:
Count() never returns null (or negative, of course)
Selecting only aggregation functions without a Group By will always return a single row, even if there is no data from which to select.
So ...
Coalesce(Count(<expression>),0)
... is a waste of a good coalesce.
SQL does not treat NULL values as zeros when calculating SUM; it ignores them:
Returns the sum of all the values, or only the DISTINCT values, in the expression. Null values are ignored.
This makes a difference in only one case: when the set being totalled up contains no numeric values at all, only NULLs. If at least one number is present, the result is numeric.
You're looking at this the wrong way around. SUM() operates on a column, and ignores nulls.
To quote from the documentation:
This function takes as an argument any numeric data type or any nonnumeric data type that can be implicitly converted to a numeric data type. The function returns the same data type as the numeric data type of the argument.
A NULL has no data type, so your first example must return null, because a NULL is not numeric.
Your second example sums the numeric values in the column. The sum of 0 + null + 1 + 2 is 3; the NULL simply means that a number does not exist here.
Your third example is not an operation on a column; remove the SUM() and the answer will be the same, because nothingness + 1 is still nothingness. You can't cast a NULL to an empty number as you can with a string, because there's no such thing as an empty number. It either exists or it doesn't.
Arithmetic aggregate functions ignore nulls.
SUM() ignores them
AVG() calculates the average as if the null rows didn't exist (nulls don't count in the total or the divisor)
As Bohemian has pointed out, both SUM and AVG exclude entries with NULL in them. Those entries do not go into the aggregate. If AVG treated NULL entries as zero, it would bias the result towards zero.
It may appear to the casual observer as though SUM is treating NULL entries as zero. It's really excluding them. If all the entries are excluded, the result is no value at all, which is NULL. Your example illustrates this.
This is incorrect: "The sum of 0 + null + 1 + 2 is 3".
select 0 + null + 1 + 2 total from dual;
Result is null!
Any similar expression gives null if any operand is null.
Here's a solution if you want to sum and NOT ignore nulls.
This solution splits the records into two groups: nulls and non-nulls. NVL2(a, 1, NULL) does this by changing all the non-nulls to 1 so they sort together identically. It then sorts those two groups to put the null group first (if there is one), then sums just the first of the two groups. If there are no nulls, there will be no null group, so that first group will contain all the rows. If, instead, there is at least one null, then that first group will only contain those nulls, and the sum of those nulls will be null.
SELECT SUM(a) AS standards_compliant_sum,
SUM(a) KEEP(DENSE_RANK FIRST ORDER BY NVL2(a, 1, NULL) DESC) AS sum_with_nulls
FROM (SELECT 41 AS a FROM DUAL UNION ALL
SELECT NULL AS a FROM DUAL UNION ALL
SELECT 42 AS a FROM DUAL UNION ALL
SELECT 43 AS a FROM DUAL);
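-- with the NULL row present: standards_compliant_sum = 126, sum_with_nulls = NULL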
You can optionally include NULLS FIRST to make it a little clearer what's going on. If you're intentionally ordering for the sake of moving nulls around, I always recommend this for code clarity.
SELECT SUM(a) AS standards_compliant_sum,
SUM(a) KEEP(DENSE_RANK FIRST ORDER BY NVL2(a, 1, NULL) DESC NULLS FIRST) AS sum_with_nulls
FROM (SELECT 41 AS a FROM DUAL UNION ALL
SELECT NULL AS a FROM DUAL UNION ALL
SELECT 42 AS a FROM DUAL UNION ALL
SELECT 43 AS a FROM DUAL);
I have the information below in a table and want to retrieve the count of rows where the difference between the two dates is >= 1.
Id testdate exdate
1 20120502 20120501 --> This should be included, because the diff is 1
2 20120601 20120601 --> This should not be included, because the diff is 0
3 20120704 20120703 --> This should be included, because the diff is 1
4 20120803 20120802 --> This should be included, because the diff is 1
Based on the above data, my select count should return 3.
I am trying the following, but it's not giving any results:
select count(to_char(testdate,'YYYYMMDD')-to_char(exdate,'YYYYMMDD')) from test ;
select count(*)
from my_table
where testdate <> exdate
You really should convert those to a date data-type though... it saves a lot of problems in the long run.
Your query does give results: it returns 4, because count() counts every row for which testdate - exdate is not null.
However, as you're not using dates, Oracle will most probably convert those values to numbers, which won't help for date comparisons should you need them in the future.
20120901 - 20120831 = 70 -- not 1
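A quick sketch of the difference (Oracle; only literal values from the sample data are used):
-- subtracting the values as numbers vs. converting them to dates first
select 20120901 - 20120831 as numeric_diff,                                          -- 70
       to_date('20120901','YYYYMMDD') - to_date('20120831','YYYYMMDD') as date_diff  -- 1
from dual;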
Okay, from your comment:
It works if I use select count(*) from test where
to_char(testdate,'YYYYMMDD') - to_char(exdate,'YYYYMMDD') >= 1. But
count is one of the columns. How do I retrieve the above select statement as
one of the columns?
you're trying something completely different.
Your dates are actually dates, which is helpful to know. You're looking for an analytic function, specifically count().
select a.*, count(*) over ( partition by 1 ) as ct
from my_table a
where trunc(exdate) <> trunc(testdate)
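-- for the sample data above, three rows pass the filter, so each returned row has ct = 3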
Note the trunc function, which, without additional parameters, will remove the time portion of the date, thus enabling a direct comparison without resorting to converting the date to a character string.
select count(*)
from test
where to_date(testdate,'YYYYMMDD') - to_date(exdate,'YYYYMMDD') >= 1;
or
select count(*)
from test
where to_date(testdate,'YYYYMMDD') <> to_date(exdate,'YYYYMMDD');
Looking at testdate and exdate, it looks more like the columns are of VARCHAR type, so you would need an appropriate date conversion.
In Oracle, if the type is DATE, you can do arithmetic with the values: 1 equals 1 day, and 1/24 equals 1 hour.
Your case is rather easy because you could even compare the strings.
SELECT count(*)
FROM test
WHERE testdate <> exdate
But it sounds like you want this to be more flexible, so you would rather convert them to dates, and then you can do:
SELECT count(*)
FROM test
WHERE to_date(testdate,'YYYYMMDD')-to_date(exdate,'YYYYMMDD') >= 1
I am not sure what you want if testdate minus exdate is -1 or lower (because exdate is after testdate). In that case you can work with ABS:
SELECT count(*)
FROM test
WHERE ABS(to_date(testdate,'YYYYMMDD')-to_date(exdate,'YYYYMMDD')) >= 1
Here is my select query:
SELECT SUM(rating) AS this_week
FROM table_name
WHERE UNIX_TIMESTAMP(created_at) >= UNIX_TIMESTAMP() - 604800
Which basically sums up the rating of an item for the last week (604800 is the number of seconds in one week).
The problem is that when there are no rows in the table, the this_week will be returned as NULL. I would like the query to return 0 in case there are no rows in the table. How to do it?
This should do the trick:
SELECT COALESCE(SUM(rating),0) AS this_week FROM table_name
WHERE UNIX_TIMESTAMP(created_at) >= UNIX_TIMESTAMP() - 604800
COALESCE is a function that returns the first non-NULL value from the list.
Can't you use IFNULL(SUM(rating), 0)?
Try this one:
SUM(IFNULL(rating, 0))
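A quick sketch of the difference between the two placements (MySQL; the derived table t is made up, and WHERE 1 = 0 simulates an empty result set): when no rows match, only the outer COALESCE yields 0, because SUM over zero rows is NULL even if the column is wrapped in IFNULL.
SELECT COALESCE(SUM(rating), 0) AS outer_coalesce,  -- 0 when no rows match
       SUM(IFNULL(rating, 0))   AS inner_ifnull     -- NULL when no rows match
FROM (SELECT 5 AS rating) AS t
WHERE 1 = 0;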
I have a bunch of tasks in a MySQL database, and one of the fields is "deadline date". Not every task has to have to a deadline date.
I'd like to use SQL to sort the tasks by deadline date, but put the ones without a deadline date in the back of the result set. As it is now, the null dates show up first, then the rest are sorted by deadline date earliest to latest.
Any ideas on how to do this with SQL alone? (I can do it with PHP if needed, but an SQL-only solution would be great.)
Thanks!
Here's a solution using only standard SQL, not ISNULL(). That function is not standard SQL, and may not work on other brands of RDBMS.
SELECT * FROM myTable
WHERE ...
ORDER BY CASE WHEN myDate IS NULL THEN 1 ELSE 0 END, myDate;
SELECT * FROM myTable
WHERE ...
ORDER BY ISNULL(myDate), myDate
SELECT foo, bar, due_date FROM tablename
ORDER BY CASE ISNULL(due_date, 0)
WHEN 0 THEN 1 ELSE 0 END, due_date
So you have two ORDER BY expressions: the first puts all the non-nulls in front, and the second then sorts by due date.
The easiest way is using the minus operator with DESC.
SELECT * FROM request ORDER BY -date DESC
In MySQL, NULL values are considered lower in order than any non-NULL value, so when sorting in ascending (ASC) order NULLs are listed first, and when descending (DESC) they are listed last.
When a - (minus) sign is added before the column name, NULL becomes -NULL.
Since -NULL == NULL, adding DESC makes all the rows sort by date in ascending order, with the NULLs last.
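A minimal sketch of the CASE-based ordering from above (MySQL; the tasks derived table and its column names are made up):
-- two dated tasks and one with no deadline
SELECT task, deadline
FROM (SELECT 'a' AS task, CAST('2024-01-05' AS DATE) AS deadline
      UNION ALL SELECT 'b', NULL
      UNION ALL SELECT 'c', CAST('2024-01-02' AS DATE)) AS tasks
ORDER BY CASE WHEN deadline IS NULL THEN 1 ELSE 0 END, deadline;
-- returns c (2024-01-02), a (2024-01-05), b (NULL): tasks without a deadline sort last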