ANSI-92 SQL mandates that comparisons with NULL evaluate to "falsy", e.g.:
SELECT * FROM table WHERE field = NULL
SELECT * FROM table WHERE field != NULL
Will both return no rows, because NULL can't be compared like that. The predicates IS NULL and IS NOT NULL have to be used instead:
SELECT * FROM table WHERE field IS NULL
SELECT * FROM table WHERE field IS NOT NULL
Research has shown me that Oracle 1, PostgreSQL, MySQL and SQLite all support the ANSI syntax. Add to that list DB2 and Firebird.
Aside from SQL Server with ANSI_NULLS turned off, what other RDBMS support the non-ANSI syntax?
1 The whole empty string = NULL mess notwithstanding.
For what it's worth, comparing something to NULL is not strictly false, it's unknown. Furthermore, NOT unknown is still unknown.
ANSI SQL-99 defines a predicate IS [NOT] DISTINCT FROM. It allows you to mix NULLs and non-null values in comparisons and always get a true or false: with IS NOT DISTINCT FROM, NULL compared to NULL is true, while any non-null compared to NULL is false. So negation works as you probably expect.
PostgreSQL, IBM DB2, and Firebird do support IS [NOT] DISTINCT FROM.
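For example, in PostgreSQL a NULL-safe comparison of two nullable columns might look like this (a minimal sketch; t, a and b are placeholder names):
SELECT * FROM t WHERE a IS NOT DISTINCT FROM b   -- true when both are NULL, false when only one of them is
SELECT * FROM t WHERE a IS DISTINCT FROM b       -- the negation: rows where a and b genuinely differ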
MySQL has a similar null-safe comparison operator <=> that returns true if the operands are the same and false if they're different.
Oracle has the hardest path. You have to get creative with use of NVL() or boolean expressions:
WHERE a = b OR (a IS NULL AND b IS NULL)
Yuck.
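The NVL() variant is a sketch along these lines, assuming you can pick a sentinel value of the same type as a and b that never occurs in the data (which is exactly what makes it fragile):
WHERE NVL(a, '~none~') = NVL(b, '~none~')   -- '~none~' is a made-up sentinel; it must never be a real value of a or b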
Here is a nice comparison of null handling in SQLite, PostgreSQL, Oracle, Informix, DB2, MS-SQL, OCELOT, MySQL 3.23.41, MySQL 4.0.16, Firebird, SQL Anywhere, and Borland Interbase
I'm using Python 3.6, PostgreSQL 9.6, and Psycopg2 2.6.2 on a Fedora 27 64-bit system.
I have a SELECT query called my_query that pulls data from about 15 columns in one table. Several of the columns may be NULL. It's written as
SELECT col_01, col_02, ... col_15
FROM my_table
WHERE (col_01, col_02, ... col_15) = (%s, %s, ... %s)
The data, called my_data, is written in a tuple as
(value_01, value_02, ... value_15)
When I run
cursor.execute(my_query, my_data)
cursor_result = cursor.fetchone()
cursor_result becomes None.
Assuming the problem was with the nullable columns, I removed them from my query and my data and cursor_result was no longer None.
I also tried a query with only nullable columns; it produced the same result as the original query: cursor_result became None.
My question: is there a way around my problem? I want to be able to SELECT columns that are never NULL, always NULL, or sometimes NULL.
Thank you in advance for your help.
The ANSI SQL-99 standard introduced the IS [NOT] DISTINCT FROM predicate, which behaves identically to the equality comparison for non-null values but treats all NULLs as being the same (NOT DISTINCT, given the curious negative phrasing).
Most database products do not yet support this operator, so far as I'm aware, but luckily PostgreSQL does.
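So one way around the problem is to rewrite my_query using the NULL-safe row comparison. A sketch, keeping the column list and placeholders exactly as in the question (PostgreSQL accepts IS NOT DISTINCT FROM between row constructors as well):
SELECT col_01, col_02, ... col_15
FROM my_table
WHERE (col_01, col_02, ... col_15) IS NOT DISTINCT FROM (%s, %s, ... %s)
With that change, cursor.execute(my_query, my_data) can match rows even where some of the supplied values are None/NULL.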
We use stored procedures for our queries. Some of our fields contain NULL values, so to get those rows back we put an ISNULL conversion inside the WHERE condition, but according to the SQL performance tool this affects the performance of our stored procedure.
Ex.
SELECT * FROM tblInfo
WHERE ISNULL(fldInfo,'') <> ''
tblInfo (sample data):
fldinfo
-------
NULL
30
NULL
20
Queries and results:
SELECT * FROM tblinfo WHERE fldinfo NOT IN (30, 20)               -- different result
SELECT * FROM tblinfo WHERE ISNULL(fldinfo, '') NOT IN (30, 20)   -- correct result
Is there any other approach we can use in the script so that we still get those values without affecting the performance of the query?
Your approach will be non-sargable: even if you have an index, it will not be used.
The right way to do this would be to use an IS NOT NULL condition:
SELECT * FROM tblInfo
WHERE fldInfo <> ''
AND fldInfo IS NOT NULL
If you don't have an index, then create an index on fldinfo to improve performance.
Update:
NOT IN fails to match NULL values: a comparison with NULL is unknown, so rows where fldinfo is NULL are not returned. Here is the correct way to do this:
SELECT *
FROM tblInfo
WHERE (fldinfo NOT IN ( 30, 20 ) OR fldinfo IS NULL)
COALESCE is one option that you can try; it behaves in the same way. However, you will have to evaluate the performance difference yourself with some tests.
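A minimal sketch of that, reusing the tblInfo/fldinfo names from the question (it remains non-sargable for the same reason as the ISNULL version):
SELECT * FROM tblinfo WHERE COALESCE(fldinfo, '') NOT IN (30, 20)   -- COALESCE returns fldinfo unless it is NULL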
Some differences between ISNULL and COALESCE are outlined here:
SQL - Difference between COALESCE and ISNULL?
EDIT: Based on tests done by multiple people, and in theory as well, ISNULL seems to be a better option than COALESCE:
Which is quicker COALESCE OR ISNULL?
No need for IIF. Simply check for NULL
WHERE fldinfo IS NOT NULL
Or, of course, use IS NULL if you want rows where this condition is met
I was supposed to get all data from the table where the column "Address" is not null,
so I made a statement that looks like this...
Select * from Table where Address is not null
Unfortunately, there are rows where the "Address" column contains only spaces, so SQL does not consider them NULL.
How can I display rows where Address is not null?
Thanks :)
Most database systems have a NULLIF() function. It was defined together with COALESCE() in the ANSI SQL-99 standard if not earlier. It is implemented in at least SQL Server, Oracle, PostgreSQL, MySQL, SQLite, DB2, Firebird.
Select * from Table where NULLIF(Address,'') is not null
But for me, I like this more
Select * from Table where Address > ''
It kills NULLs and empty strings in one go, and it even excludes strings made up entirely of spaces (' ', '  ', etc.). It also retains SARGability.
In SQLite, IS is a binary operator that behaves exactly like = except when one or both of the operands are NULL. When both operands are NULL, the IS operator evaluates to TRUE; when one operand is NULL but not the other, the IS operator evaluates to FALSE.
I was looking for a similar operator in PostgreSQL, but I could not find one. Is there an equivalent of SQLite's IS operator in PostgreSQL? If not, what is the best/least-complicated work-around?
To clarify, SELECT column1 IS column2 ... is allowed in SQLite, but PostgreSQL raises a syntax error.
Try the IS (NOT) DISTINCT FROM operator.
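Applied to the example from the question, SQLite's column1 IS column2 would be written as (the ... stands for the rest of the query, as in the question):
SELECT column1 IS NOT DISTINCT FROM column2 ...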
Apparently it's possible in PostgreSQL (see dan04's answer). The query below would work in SQL Server and other DBMSs where that syntax isn't available.
You can simulate it by doing:
WHERE (column1 is NULL and column2 is NULL)
OR column1 = column2
To add to Derek's answer:
In MySQL you can use the <=> operator to the same effect:
SELECT * FROM table1 WHERE column1 <=> column2
Previously we used DB2 as our database, but now we are migrating to Oracle. In our project we have extensively used SQL that is DB2-specific.
Is there any way to convert those DB2-specific queries to Oracle-supported queries?
Thanks
You have a lot of work ahead!
Between DB2 and Oracle, some important differences are (just an arbitrary enumeration of what I can think of):
Data types
Number data types: DB2 has many more standard types, such as SMALLINT, INTEGER, DOUBLE, etc. Those don't exist in Oracle SQL (although some exist in PL/SQL). This is important for DDL and for casting and some other use cases, such as the correctness of predicates
Date data types: Oracle's only difference between DATE and TIMESTAMP is the fact that TIMESTAMP has microseconds. But DATE may also contain time information. In DB2, DATE has no time information, I think.
Character data types: Read about the difference between VARCHAR and VARCHAR2 in Oracle
NULL. In Oracle, NULL is much more general than in DB2. Before DB2 v9.7, you had to cast NULL to an explicit type, e.g. cast(null as integer). That's not necessary in Oracle.
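A small illustration of the NULL point (SYSIBM.SYSDUMMY1 is DB2's dummy table, also used in the character-string examples further down):
SELECT CAST(NULL AS INTEGER) FROM SYSIBM.SYSDUMMY1   -- DB2 before v9.7: NULL needs an explicit type here
SELECT NULL FROM DUAL                                -- Oracle: an untyped NULL is accepted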
System objects
SYSIBM.DUAL simply becomes DUAL
Functions: They're all a bit different. You'll have to check case by case. For example, LOCATE becomes INSTR
Syntax
TRUNCATE IMMEDIATE becomes TRUNCATE
EXCEPT becomes MINUS
DB2's FETCH FIRST n ROWS ONLY: There is no such clause in Oracle. You'll have to use ROWNUM or ROW_NUMBER() OVER() filtering (see the sketch after this list)
DB2's MERGE statement is more powerful than that of Oracle, in case you use this.
DB2 supports INSERT INTO .. (..) VALUES (..), (..), (..). With Oracle, you'd have to write INSERT INTO .. SELECT .. UNION ALL SELECT .. UNION ALL SELECT ..
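For the FETCH FIRST point above, here is a sketch of the classic ROWNUM rewrite (emp and salary are made-up names, purely for illustration):
-- DB2: SELECT * FROM emp ORDER BY salary DESC FETCH FIRST 10 ROWS ONLY
-- Oracle: wrap the ordered query and filter on ROWNUM
SELECT *
FROM (SELECT * FROM emp ORDER BY salary DESC)
WHERE ROWNUM <= 10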
Advanced
If you use stored procedures, they work a bit differently, especially with advanced data types involved, but that's out of scope here.
Your most efficient shot at this might be to use SQL abstraction of some sort. If you're using Java, I would recommend you wrap your SQL statements with jOOQ (Disclaimer: I work for the company behind jOOQ). jOOQ provides API-level abstraction for all of the above facts. A great deal of SQL can be executed both on DB2 and Oracle, without adaptation. We're also working on a more independent translator product: https://www.jooq.org/translate
On a higher level of abstraction, Hibernate (or other JPA implementations) can do the same for you
I found out that there are also some differences in the handling of character strings.
DB2 ignores trailing whitespace when comparing:
/* DB2 */
SELECT CASE WHEN ('A ' = 'A') THEN 'true' ELSE 'false' END FROM SYSIBM.SYSDUMMY1
--> true
/* Oracle */
SELECT CASE WHEN ('A ' = 'A') THEN 'true' ELSE 'false' END FROM DUAL
--> false
Oracle considers the empty string '' to be NULL:
/* DB2 */
SELECT CASE WHEN ('' IS NULL) THEN 'true' ELSE 'false' END FROM SYSIBM.SYSDUMMY1
--> false
/* Oracle */
SELECT CASE WHEN ('' IS NULL) THEN 'true' ELSE 'false' END FROM DUAL
--> true