Checking to see if array field is null or empty? - sql

In Snowflake, how can I filter for null or empty array fields in a column?
The column holds either an empty [] or a bracketed string of values. I tried array_size(column_name, 1) > 0, but array_size doesn't work like that.
Thanks

Are you trying to filter them in or out?
Either way, array_size should work, although it doesn't take a second argument.
where column_name is not null and array_size(column_name) != 0 worked for me
If you're specifically looking to filter down to the records that have an empty array, this approach works too, although it's a little odd:
where column_name = array_construct()
Edit: It seems like your issue is that your column is a string. There are a few ways to work around this:
Change the column's datatype to a variant or array
Parse the column before using array functions: array_size(TRY_PARSE_JSON(column_name)) != 0
Compare to a string instead: column_name is not null and column_name != '[]'

If the column is a string, it has to be parsed first:
SELECT *
FROM tab
WHERE ARRAY_SIZE(TRY_PARSE_JSON(column_name)) > 0;
-- excluding NULLs/empty arrays
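The string-comparison workaround can be sketched against some made-up data (Snowflake syntax; the inline table and values are illustrative only):

```sql
-- Hypothetical sample data: a string column holding JSON-style array text.
WITH tab AS (
    SELECT column1 AS column_name
    FROM VALUES ('[]'), ('["a","b"]'), (NULL)
)
SELECT column_name
FROM tab
WHERE column_name IS NOT NULL
  AND column_name != '[]';   -- keeps only the row with '["a","b"]'
```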

Related

array_length() and cardinality() of an empty array return one as the array size

I have a table like:
CREATE TABLE psdc_psr
(
id bigint not null,
available_region_code character varying(255)[],
is_valid bigint
)
I want to use array_length(), cardinality() or available_region_code = '{}' to select rows where available_region_code is empty, but it fails; the length returns one.
Why does this happen, and how can I solve it?
I don't know what causes this problem. Maybe, as jjanes says in the comments, the array contains a non-printing character.
In the end, I used (CAST(ARRAY_TO_JSON(available_region_code) AS VARCHAR) IN ('[null]', '[""]')) to check for an empty array, which I learned from Zone's answer to the question "How to check if an array is empty in Postgres".
For a similar issue I had in Presto-based SQL, I created a CASE statement like:
case when available_region_code[1] = '' then 0 else cardinality(available_region_code) end as cardinality
Not sure if this helps in your use case, but it resulted in 0's for my empty arrays, and the correct counts for all others.
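The behavior described above can be reproduced directly (PostgreSQL sketch with literal values): an array holding a single empty string is not an empty array, which is why the length comes back as one.

```sql
-- '{""}' is a one-element array whose single element is an empty string.
SELECT array_length('{""}'::text[], 1);  -- 1
-- A truly empty array has no dimensions, so array_length returns NULL...
SELECT array_length('{}'::text[], 1);    -- NULL
-- ...while cardinality returns 0.
SELECT cardinality('{}'::text[]);        -- 0
```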

TSQL - dynamic WHERE clause to check for NULL for certain columns

Using SQL Server 2012, I want to query a table and fetch only the rows where certain columns are not null and don't contain an empty string.
The columns I need to check for NULL and '' all start with either col_as or col_m followed by two digits.
At the moment I write where col_as01 is not null or ...,
which becomes difficult to maintain because of the number of columns I have to check.
Is there a more elegant way to do this? Some kind of looping?
I also use ISNULL(NULLIF([col_as01], ''), NULL) AS [col_as01] in the SELECT statement to get rid of the empty-string values.
Thank you for your help.
You should fill in the blanks.
declare @myWhereString nvarchar(max)
select @myWhereString = stuff((select 'or isnull(' + COLUMN_NAME + ','''') = '''' ' as [text()]
from Primebet.INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME = 'YourTable'
and (column_name like 'col_as%'
or
column_name like 'col_m%')
for xml path('')),1,3,'')
set @myWhereString = 'rest of your query' + @myWhereString
then execute it with sp_executesql.
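The final step might look like this (a sketch; @myWhereString and YourTable are the placeholders from the answer above, and the NOT (...) wrapper assumes you want to exclude rows where any checked column is null or empty):

```sql
DECLARE @sql nvarchar(max);
-- @myWhereString holds "isnull(colA,'') = '' or isnull(colB,'') = '' ..."
SET @sql = N'SELECT * FROM YourTable WHERE NOT (' + @myWhereString + N')';
EXEC sp_executesql @sql;
```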
You can use something like this:
WHERE DATALENGTH(col_as01) > 0
That implicitly excludes NULL values, and a length greater than 0 guarantees you retrieve non-empty strings.
PS: You could also use LEN instead of DATALENGTH, but LEN ignores trailing spaces, so you would not get values that contain only spaces.
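The difference is easy to see with an all-spaces value (T-SQL sketch with a literal):

```sql
-- LEN ignores trailing spaces; DATALENGTH counts the stored bytes.
SELECT LEN('  ')        AS len_result,     -- 0
       DATALENGTH('  ') AS datalen_result; -- 2 for varchar (4 for nvarchar)
```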
Simple as this:
WHERE col_as01 > ''

Check if value exists in Postgres array

Using Postgres 9.0, I need a way to test whether a value exists in a given array. So far I came up with something like this:
select '{1,2,3}'::int[] @> (ARRAY[]::int[] || value_variable::int)
But I keep thinking there should be a simpler way; I just can't see it. This seems better:
select '{1,2,3}'::int[] @> ARRAY[value_variable::int]
I believe it will suffice. But if you have other ways to do it, please share!
Simpler with the ANY construct:
SELECT value_variable = ANY ('{1,2,3}'::int[])
The right operand of ANY (between parentheses) can either be a set (result of a subquery, for instance) or an array. There are several ways to use it:
SQLAlchemy: how to filter on PgArray column types?
IN vs ANY operator in PostgreSQL
Important difference: Array operators (<@, @>, && et al.) expect array types as operands and support GIN or GiST indexes in the standard distribution of PostgreSQL, while the ANY construct expects an element type as its left operand and can be supported with a plain B-tree index (with the indexed expression to the left of the operator, not the other way round as in your example). Example:
Index for finding an element in a JSON array
None of this works for NULL elements. To test for NULL:
Check if NULL exists in Postgres array
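Both forms test membership, just with different operand shapes (PostgreSQL sketch with literal values):

```sql
-- ANY construct: element on the left, array on the right.
SELECT 2 = ANY ('{1,2,3}'::int[]);     -- true
-- Containment operator: array on both sides.
SELECT '{1,2,3}'::int[] @> ARRAY[2];   -- true
```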
Watch out for the trap I fell into: when checking that a certain value is not present in an array, you shouldn't do:
SELECT value_variable != ANY('{1,2,3}'::int[])
but use
SELECT value_variable != ALL('{1,2,3}'::int[])
instead.
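Spelled out with literals, the trap looks like this (sketch):

```sql
-- "!= ANY" is true if the value differs from AT LEAST ONE element,
-- which is almost always the case, so it does not mean "not present".
SELECT 2 != ANY ('{1,2,3}'::int[]);  -- true  (2 differs from 1)
-- "!= ALL" is true only if the value differs from EVERY element.
SELECT 2 != ALL ('{1,2,3}'::int[]);  -- false (2 equals one element)
```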
You can compare two arrays: if any of the values in the left array overlap the values in the right array, it returns true. It's kind of hackish, but it works.
SELECT '{1}' && '{1,2,3}'::int[]; -- true
SELECT '{1,4}' && '{1,2,3}'::int[]; -- true
SELECT '{4}' && '{1,2,3}'::int[]; -- false
In the first and second queries, the value 1 is in the right array.
Notice that the second query is true even though the value 4 is not contained in the right array.
For the third query, no values in the left array (i.e., 4) are in the right array, so it returns false.
unnest can be used as well.
It expands an array into a set of rows, and then checking whether a value exists is as simple as using IN or NOT IN.
e.g.
id => uuid
exception_list_ids => uuid[]
select * from table where id NOT IN (select unnest(exception_list_ids) from table2)
This one works fine for me; maybe it's useful for someone:
select * from your_table where array_column::text ilike ANY (ARRAY['%text_to_search%'::text]);
ANY works well. Just make sure that the ANY keyword is on the right side of the equals sign.
The statement below throws an error: ERROR: syntax error at or near "any"
select 1 where any('{hello}'::text[]) = 'hello';
whereas the example below works fine:
select 1 where 'hello' = any('{hello}'::text[]);
When testing for the existence of an element in an array, proper casting is required to get past Postgres's SQL parser. Here is one example query using the array-contains operator in the join clause (for simplicity, only the relevant part is shown):
table1.other_name text[]; -- an array of text
The join part of the SQL:
from table1 t1 join table2 t2 on t1.other_name::text[] @> ARRAY[t2.panel::text]
The following also works
on t2.panel = ANY(t1.other_name)
I am just guessing that the extra casting is required because the parser does not fetch the table definition to figure out the exact type of the column. Others, please comment on this.

Updating values in PostgreSQL array

I have some array columns in a PostgreSQL database. In an UPDATE, I want to add a new value to the array if the value doesn't already exist; otherwise, don't add anything. I don't want to overwrite the current value of the array, only append the element to it.
Is this possible in a single query, or do I need to do it inside a function? I'm using PostgreSQL.
This should be as simple as this example for an integer array (integer[]):
UPDATE tbl SET col = col || 5
WHERE (5 = ANY(col)) IS NOT TRUE;
A WHERE clause like:
WHERE 5 <> ALL(col)
would also catch the case of an empty array '{}'::int[], but fail if a NULL value appears as an element of the array.
If your arrays never contain NULL as element, consider actual array operators, possibly supported by a GIN index.
UPDATE tbl SET col = col || 5
WHERE NOT col @> '{5}';
See:
Check if value exists in Postgres array
Can PostgreSQL index array columns?
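Putting the first variant together with some sample data (a sketch; the table and values are illustrative):

```sql
CREATE TEMP TABLE tbl (id serial, col int[]);
INSERT INTO tbl (col) VALUES ('{1,2}'), ('{5}'), ('{}');

-- Append 5 only where it is not already present.
-- "IS NOT TRUE" also updates rows where the test yields NULL (e.g. empty arrays).
UPDATE tbl SET col = col || 5
WHERE (5 = ANY(col)) IS NOT TRUE;
-- col is now {1,2,5}, {5}, {5}
```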

SQL List Function Removing Precision

I am using the LIST function to create a ';'-delimited list of values. The type is numeric(19,2). For some reason the precision appears to be ignored when using the LIST function. When performing a simple SELECT on this column the values look good, e.g. "12.00". However, if I use LIST(), my results come out like "12.000000".
This is my LIST usage:
LIST(case when tblWOService.PricePerVehicle is null then ' ' else CONVERT(decimal(19,2), tblWOService.PricePerVehicle) end, ';')
The CONVERT does not change the result. Any ideas?
Thanks!
Have you tried explicitly converting your empty string?
LIST(
case when tblWOService.PricePerVehicle is null then CONVERT(decimal(19,2),' ')
else CONVERT(decimal(19,2),tblWOService.PricePerVehicle) end,';'
)
I've run into a similar datatype issue with CASE statements in T-SQL.
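Another hedged workaround (an untested sketch, assuming SQL Anywhere syntax, where LIST typically appears): format the value as a string yourself before aggregating, so LIST only ever sees text and cannot re-widen the scale:

```sql
-- Cast the converted decimal to a string so the fixed two-decimal
-- formatting survives the LIST aggregation.
LIST(
    COALESCE(CAST(CONVERT(decimal(19,2), tblWOService.PricePerVehicle) AS varchar(25)), ' '),
    ';'
)
```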