How can you use NULLIF in the where clause? - sql

I am trying to do something like this:
select
col_1, col_2, etc
from
table
where
col_1 = nullif('', '')
Am I doing this incorrectly? I am not getting any results back.
Edit:
My expected results are to get every record back where col_1 is NULL.
I know I can use where col_1 is null, but I am using SSIS and a variable. Sometimes col_1 is actually NULL and sometimes it is not.
Sample data:
collaboration    first_name  last_name  city
NULL             Bob         Smith      Chicago
Data Migration   John        Smith      Austin
NULL             Pika        Chu        Houston
Production       ash         ketchum    tokyo
Sometimes I may want to return the records where collaboration is NULL, sometimes I want to return the records where it says Production.
I'd like to use the same query, if possible, with little modification.
Edit Part 2:
I tried to experiment with this.
select
col_1, col_2, etc
from
table
where
case
when col_1 = '' then NULL
else col_1
end
But I am getting the error message:
An expression of non-boolean type specified in a context where a condition is expected, near ORDER.
Query speed is not something I am concerned with.

This is the query you need
select
col_1, col_2, etc
from
table
where
col_1 is null
IS NULL checks whether a column is NULL; NULLIF(@expr1, @expr2) could be rewritten as:
case when @expr1 = @expr2 then null else @expr1 end
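As a minimal sketch of that equivalence (my_table is just a placeholder name), the two predicates below should select the same rows; note that tacking IS NULL onto the expression is also what fixes the "non-boolean type" error from Edit Part 2:
select col_1, col_2
from my_table
where nullif(col_1, '') is null
-- equivalent CASE form:
-- where case when col_1 = '' then null else col_1 end is null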
EDIT:
you can relax the filter by adding an OR condition to the WHERE clause (TIP: remember that AND is evaluated before OR)
select
col_1, col_2, etc
from
table
where
(col_1 is null OR col_1 like 'production')
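To illustrate the precedence tip with a hedged sketch (the extra city filter is invented for the example), the parentheses start to matter as soon as another AND condition is added:
-- without parentheses this means: col_1 is null, OR (col_1 like 'production' AND city = 'Austin')
where col_1 is null OR col_1 like 'production' AND city = 'Austin'
-- with parentheses it means: (col_1 is null OR col_1 like 'production') AND city = 'Austin'
where (col_1 is null OR col_1 like 'production') AND city = 'Austin'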
if you want to decide at runtime which one you need, you could write a procedure:
create proc my_proc @var varchar(100) = 'NULL§159§' -- this defaults to the null marker; if you pass a parameter it queries with the value passed
as
select
col_1, col_2, etc
from
table
where
coalesce(col_1, 'NULL§159§') = @var
-- the §159§ symbol is appended to 'NULL' to make sure the queried string cannot occur in the database;
-- obviously the value 'NULL§159§' becomes a sort of 'reserved word' for the database, but hopefully it is odd enough not to appear in the data
GO
and call it with exec my_proc 'production'
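As a hedged usage sketch (assuming the procedure above compiles as written), the two calling modes would look like this:
exec my_proc -- no argument: uses the default marker, returns rows where col_1 is NULL
exec my_proc 'Production' -- returns rows where col_1 = 'Production'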

Try this; it handles columns with NULL values or empty strings:
SELECT
col_1, col_2, etc
FROM
Table
WHERE
ISNULL(NULLIF(col_1 ,''),'1') = '1'
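A quick trace of how that expression evaluates, assuming col_1 never legitimately holds the literal string '1':
-- col_1 = NULL : NULLIF(NULL, '') -> NULL -> ISNULL(NULL, '1') -> '1' -> row matches
-- col_1 = ''   : NULLIF('', '')   -> NULL -> ISNULL(NULL, '1') -> '1' -> row matches
-- col_1 = 'x'  : NULLIF('x', '')  -> 'x'  -> ISNULL('x', '1')  -> 'x' -> row does not match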

You can do something like
select
col_1, col_2, etc
from
table
where
col_1 IS NULL OR col_1 = ''

select
col_1, col_2, etc
from
table
where
collaboration IS NULL OR collaboration ='Production'

Crystal ball time from me. This is my guess on what the OP wants:
DECLARE @Prod varchar(15);
--SET @Prod = 'Production';
SELECT {Columns}
FROM YourTable
WHERE Col1 = @Prod
OR (Col1 IS NULL AND @Prod IS NULL);
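Mapped onto the sample data from the question (a hedged reading, since the real table isn't shown), the two modes would behave like this:
-- @Prod left NULL       : returns the Bob Smith (Chicago) and Pika Chu (Houston) rows, where collaboration IS NULL
-- @Prod = 'Production'  : returns the ash ketchum (tokyo) row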

Try this.
DECLARE @SearchValue VARCHAR(50)
SELECT col_1, col_2, etc
FROM YourTable
WHERE ISNULL(col_1,'') = ISNULL(@SearchValue,'')
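One caveat with this pattern, sketched below: because both sides collapse NULL and '' to the same token, a NULL column also matches an empty-string search value (and vice versa). If that distinction matters, substitute a marker value assumed not to occur in the data:
-- col_1 = NULL, @SearchValue = ''   : '' = '' -> matches
-- col_1 = '',   @SearchValue = NULL : '' = '' -> matches
WHERE ISNULL(col_1,'~none~') = ISNULL(@SearchValue,'~none~') -- '~none~' is an arbitrary marker assumed absent from the data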

Related

Force COALESCE(NULL,NULL) to return NULL

In SQL Server
Considering the fact that:
Col1 and Col2 contain numeric and NULL values
SELECT
COALESCE(Col1,Col2)
returns an error: "At least one of the arguments to COALESCE must be an expression that is not the NULL constant."
Considering that Col1 and Col2 are NULL, I want to force it to return a NULL value in this case.
The workaround seems inelegant/inefficient to me:
SELECT
NULLIF(COALESCE(Col1 ,Col2 ,''),'')
Note that Col1 and Col2 are numeric fields and cannot take '' as a value.
Any other suggestion ?
Thank you for your help
This code works:
SELECT COALESCE(Col1, Col2)
FROM . . . -- references here that define COL1 and COL2
If both columns are NULL, it will return NULL (typed as an integer in your case).
The only time you get the error you mention is when the arguments are literally the NULL constant. Even calculations seem to get around this:
select coalesce(null + 3, null)
--> NULL rather than an error
The following even returns NULL rather than an error:
declare #col1 int;
declare #col2 int;
select coalesce(#col1, #col2);
Here is a db<>fiddle.

SQL - conditionally set column values to NULL

I have a table, some_table, which has a number of columns, and some of them have an invalid value in some rows that needs to be transformed into NULL.
I cannot use the statement below, partly because mutating the original table is not allowed by my permissions, and also because it would need to be repeated for every column name.
UPDATE some_table SET column_name = NULL WHERE column_name = 'invalid value';
So it needs to be a SELECT operation that creates a new table with the invalid values converted to NULL. Is there a quick way to do this?
Updating with the answer from Jonny below:
NULLIF is a good option. However, is there a way to apply it to all columns rather than having to do it for each column separately? Sometimes the number of columns is pretty huge.
You could use a NULLIF
Have a look at 9.16.3. NULLIF
https://www.postgresql.org/docs/current/static/functions-conditional.html
SELECT NULLIF(column_name, 'invalid value')
FROM some_table
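On the follow-up about applying it to every column: plain SQL has no shorthand for that, so the usual approach is to spell out one NULLIF per column, for example wrapped in a view (the column names below are invented for illustration):
CREATE VIEW some_table_clean AS
SELECT
NULLIF(col_a, 'invalid value') AS col_a,
NULLIF(col_b, 'invalid value') AS col_b,
NULLIF(col_c, 'invalid value') AS col_c
FROM some_table;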
How about something like:
INSERT INTO some_table2 (column_name, ...) SELECT column_name, ... FROM some_table WHERE column_name <> 'invalid value';
INSERT INTO some_table2 (column_name, ...) SELECT null, ... FROM some_table WHERE column_name = 'invalid value';

Oracle/PL SQL/SQL null comparison on where clause

Just a question about dealing with null values in a query.
For example I have the following table with the following fields and values
TABLEX
Column1
1
2
3
4
5
---------
Column2
null
A
B
C
null
I'm passing a variableY to a specific procedure. Inside the procedure is a cursor like this:
CURSOR c_results IS
SELECT * FROM TABLEX where column2 = variableY
Now the problem is that variableY can be either null, A, B or C.
If variableY is null I want to select all records where column2 is null, otherwise those where column2 is either A, B or C.
I cannot use the above cursor/query because if variableY is null it won't work; the comparison would have to be
CURSOR c_results IS
SELECT * FROM TABLEX where column2 IS NULL
What cursor/query should I use that will accommodate either a null or a string variable?
Sorry if my question is a bit confusing. I'm not that good at explaining things. Thanks in advance.
Either produce different SQL depending on the contents of that parameter, or alter your SQL like this:
WHERE (column2 = variableY) OR (variableY IS NULL AND column2 IS NULL)
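Plugged into the cursor from the question, that would look roughly like this (a sketch only, using the names already given):
CURSOR c_results IS
SELECT * FROM TABLEX
WHERE (column2 = variableY) OR (variableY IS NULL AND column2 IS NULL);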
Oracle's Ask Tom says:
where decode( col1, col2, 1, 0 ) = 0 -- finds differences
or
where decode( col1, col2, 1, 0 ) = 1 -- finds sameness - even if both NULL
Safely Comparing NULL Columns as Equal
You could use something like:
SELECT * FROM TABLEX WHERE COALESCE(column2, '') = COALESCE(variableY, '')
(COALESCE takes the first non NULL value)
Note this will only work when the column content cannot be '' (an empty string). Otherwise this statement will fail because NULL will match '' (the empty string).
(edit)
You could also consider:
SELECT * FROM TABLEX WHERE COALESCE(column2, 'a string that never occurs') = COALESCE(variableY, 'a string that never occurs')
This will fix the '' fail hypothesis.
Below is similar to "top" answer but more concise:
WHERE ((column2 = variableY ) or COALESCE( column2, variableY) IS NULL)
May not be appropriate depending on the data you're looking at, but one trick I've seen (and used) is to compare NVL(fieldname,somenonexistentvalue).
For example, if AGE is an optional column, you could use:
if nvl(table1.AGE,-1) = nvl(table2.AGE,-1)
This relies on there being a value that you know will never be allowed. Age is a good example; so are salary, sequence numbers, and other numerics that can't be negative. Strings may be trickier of course - you may say that you'll never have anyone named 'xyzzymaryhadalittlelamb' or something like that, but the day you run with that assumption you KNOW they'll hire someone with that name!!
All that said: "where a = b or (a is null and b is null)" is the traditional way to solve it. Which is unfortunate, as even experienced programmers forget that part of it sometimes.
Try using the ISNULL() function. You can check if the variable is null and, if so, set a default return value. Comparing null to null is not really possible. Remember: null <> null
WHERE variableY is null or column2 = variableY
for example:
create table t_abc (
id number(19) not null,
name varchar(20)
);
insert into t_abc(id, name) values (1, 'name');
insert into t_abc(id, name) values (2, null);
commit;
select * from t_abc where null is null or name = null;
--get all records
select * from t_abc where 'name' is null or name = 'name';
--get one record with name = 'name'
You could use DUMP:
SELECT *
FROM TABLEX
WHERE DUMP(column2) = DUMP(variableY);
DBFiddle Demo
Warning: This is not a SARGable expression, so there will be no index usage.
With this approach you don't need to search for a value that won't exist in your data (as you do with NVL/COALESCE).

How can I update a record using a correlated subquery?

I have a function that accepts one parameter and returns a table/resultset. I want to set a field in a table to the first result of that recordset, passing in one of the table's other fields as the parameter. If that's too complicated in words, the query looks something like this:
UPDATE myTable
SET myField = (SELECT TOP 1 myFunctionField
FROM fn_doSomething(myOtherField)
WHERE someCondition = 'something')
WHERE someOtherCondition = 'somethingElse'
In this example, myField and myOtherField are fields in myTable, and myFunctionField is a field returned by fn_doSomething. This seems logical to me, but I'm getting the following strange error:
'myOtherField' is not a recognized OPTIMIZER LOCK HINTS option.
Any idea what I'm doing wrong, and how I can accomplish this?
UPDATE:
Based on Anil Soman's answer, I realized that the function is expecting a string parameter and the field being passed is an integer. I'm not sure if this should be a problem, as an explicit call to the function using an integer value works - e.g. fn_doSomething(12345) seems to automatically cast the number to a string. However, I tried to do an explicit cast:
UPDATE myTable
SET myField = (SELECT TOP 1 myFunctionField
FROM fn_doSomething(CAST(myOtherField AS varchar(1000)))
WHERE someCondition = 'something')
WHERE someOtherCondition = 'somethingElse'
Now I'm getting the following error:
Line 5: Incorrect syntax near '('.
I have never done anything like this so .... all the code I have seen uses a schema on the function name - so something like:
FROM dbo.fn_doSomething(myOtherField)
Seems like a compiler bug in SQL Server 2000.
Try it in SQL Server 2005 and join the table-valued function using CROSS APPLY or OUTER APPLY.
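For reference, a hedged sketch of what the CROSS APPLY version might look like in SQL Server 2005 and later, keeping the placeholder names from the question:
UPDATE mt
SET mt.myField = f.myFunctionField
FROM myTable AS mt
CROSS APPLY (SELECT TOP 1 d.myFunctionField
FROM fn_doSomething(mt.myOtherField) AS d
WHERE d.someCondition = 'something') AS f
WHERE mt.someOtherCondition = 'somethingElse';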
Also try this, guru guys:
CREATE FUNCTION FCN_pruebaChicaBorrame(@numerito int)
RETURNS @returnTable TABLE (numerito int)
AS
BEGIN
insert into @returnTable values(@numerito)
return
END
Select * from FCN_pruebaChicaBorrame(20)
Select col_1
from ( select 1 as col_1
union select 2
union select 3) as tablita
Select col_1, (select * from dbo.FCN_pruebaChicaBorrame(20) as fcnTable)
from ( select 1 as col_1
union select 2
union select 3) as tablita
Select col_1, (select * from dbo.FCN_pruebaChicaBorrame(col_1) as fcnTable)
from ( select 1 as col_1
union select 2
union select 3) as tablita
Select col_1, (select * from dbo.FCN_pruebaChicaBorrame(case when 1=1 then 20 else 21 end) as fcnTable)
from ( select 1 as col_1
union select 2
union select 3) as tablita
I searched on Google for this error and one person talks about missing single quotes in the search condition. Is that the case with your function code? (link to related blog)
It seems that (at least in SQL Server 2000) you can't pass a column value to a table valued function. I had to set up a scalar function to get around this.

Oracle: Is there a simple way to say "if null keep the current value" in merge/update statements?

I have a rather weak understanding of any of oracle's more advanced functionality but this should I think be possible.
Say I have a table with the following schema:
MyTable
Id INTEGER,
Col1 VARCHAR2(100),
Col2 VARCHAR2(100)
I would like to write a sproc like the following:
PROCEDURE InsertOrUpdateMyTable(p_id in integer, p_col1 in varchar2, p_col2 in varchar2)
which, in the case of an update, will not overwrite Col1 or Col2 if the corresponding value in p_col1 or p_col2 is null.
So If I have a record:
id=123, Col1='ABC', Col2='DEF'
exec InsertOrUpdateMyTable(123, 'XYZ', '098'); --results in id=123, Col1='XYZ', Col2='098'
exec InsertOrUpdateMyTable(123, NULL, '098'); --results in id=123, Col1='ABC', Col2='098'
exec InsertOrUpdateMyTable(123, NULL, NULL); --results in id=123, Col1='ABC', Col2='DEF'
Is there any simple way of doing this without having multiple SQL statements?
I am thinking there might be a way to do this with the Merge statement though I am only mildly familiar with it.
EDIT:
Cade Roux below suggests using COALESCE, which works great! Here are some examples of using the coalesce keyword.
And here is the solution for my problem:
MERGE INTO MyTable mt
USING (SELECT 1 FROM DUAL) a
ON (mt.ID = p_id)
WHEN MATCHED THEN
UPDATE
SET mt.Col1 = coalesce(p_col1, mt.Col1), mt.Col2 = coalesce(p_col2, mt.Col2)
WHEN NOT MATCHED THEN
INSERT (ID, Col1, Col2)
VALUES (p_id, p_col1, p_col2);
Change the call or the update statement to use
nvl(newValue, oldValue)
for the new field value.
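Applied to a plain UPDATE (rather than the MERGE shown in the question's edit), that suggestion would look roughly like this:
UPDATE MyTable
SET Col1 = nvl(p_col1, Col1),
Col2 = nvl(p_col2, Col2)
WHERE Id = p_id;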
Using MERGE and COALESCE? Try this link for an example
with
SET a.Col1 = COALESCE(incoming.Col1, a.Col1)
,a.Col2 = COALESCE(incoming.Col2, a.Col2)