How to apply NVL function to every column in redshift query - sql

I want to apply a NVL function to every column in my query:
i.e. I want something like:
select nvl(student.*,'')
from student ;
Put another way, I would like this question answered but instead of Oracle SQL, use Redshift SQL

You have to go through the columns one by one. If a column's data type is int, use coalesce(col1, 0) as col1 instead of an empty string.
Example:
select
coalesce(col1, '') as col1,
coalesce(col2, '') as col2,
.
.
coalesce(colN, '') as colN
from table
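If the column list is long, you can generate the COALESCE expressions programmatically instead of typing them out. Here is a minimal Python sketch; the table name, column names, and fallback values are made up for illustration (in Redshift the real column list would come from a catalog table such as pg_table_def):

```python
def build_coalesce_select(table, columns):
    """Return a SELECT statement wrapping each column in COALESCE(col, fallback).

    columns maps each column name to the SQL literal to substitute for NULL,
    e.g. "''" for text columns or "0" for int columns.
    """
    exprs = ", ".join(
        f"coalesce({col}, {fallback}) as {col}"
        for col, fallback in columns.items()
    )
    return f"select {exprs} from {table}"

# Hypothetical example table with one text and one int column.
print(build_coalesce_select("student", {"name": "''", "age": "0"}))
# select coalesce(name, '') as name, coalesce(age, 0) as age from student
```

The generated string can then be pasted into (or executed against) Redshift.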

Related

PostgreSQL: subtract a comma-separated string in one column from another column

The format is like:

col1              | col2
------------------|---------
V1,V2,V3,V4,V5,V6 | V4,V1,V6
V1,V2,V3          | V2,V3
I want to create another column called col3 which contains the subtraction of two columns.
What I have tried:
UPDATE myTable
SET col3=(replace(col1,col2,''))
It works for rows like row 2, but for rows like row 1 the values of col2 do not appear as one contiguous substring of col1, so the REPLACE does nothing. I was wondering whether there is a reliable way to achieve the same goal for rows like row 1.
So the desired output would be:

col1              | col2     | col3
------------------|----------|---------
V1,V2,V3,V4,V5,V6 | V4,V1,V6 | V2,V3,V5
V1,V2,V3          | V2,V3    | V1
Any suggestions would be appreciated!
Split the values into rows, subtract the sets, and then assemble the result back into a string. All of this can be done in a single expression that defines a new query column.
with t (col1,col2) as (values
('V1,V2,V3,V4,V5,V6','V4,V1,V6'),
('V1,V2,V3','V2,V3')
)
select col1,col2
, (
select string_agg(v,',')
from (
select v from unnest(string_to_array(t.col1,',')) as a1(v)
except
select v from unnest(string_to_array(t.col2,',')) as a2(v)
) x
)
from t
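For reference, the split/EXCEPT/aggregate logic can be mirrored outside the database. A small Python sketch of the same set subtraction (note that SQL's EXCEPT does not guarantee output order, while this version keeps col1's original order):

```python
def subtract_csv(col1, col2, sep=","):
    """Remove every value listed in col2 from the col1 list,
    preserving col1's original order."""
    remove = set(col2.split(sep))
    return sep.join(v for v in col1.split(sep) if v not in remove)

print(subtract_csv("V1,V2,V3,V4,V5,V6", "V4,V1,V6"))  # V2,V3,V5
print(subtract_csv("V1,V2,V3", "V2,V3"))              # V1
```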
You will have to unnest the elements then apply an EXCEPT clause on the "unnested" rows and aggregate back:
select col1,
col2,
(select string_agg(item,',' order by item)
from (
select *
from string_to_table(col1, ',') as c1(item)
except
select *
from string_to_table(col2, ',') as c2(item)
) t)
from the_table;
I wouldn't store that result in a separate column, but if you really want to introduce even more problems by storing yet another comma-separated list:
update the_table
set col3 = (select string_agg(item,',' order by item)
from (
select *
from string_to_table(col1, ',') as c1(item)
except
select *
from string_to_table(col2, ',') as c2(item)
) t)
;
string_to_table() requires Postgres 14 or newer. If you are using an older version, use unnest(string_to_array(col1, ',')) instead.
If you need that a lot, consider creating a function:
create function remove_items(p_one text, p_other text)
returns text
as
$$
select string_agg(item,',' order by item)
from (
select *
from string_to_table(p_one, ',') as c1(item)
except
select *
from string_to_table(p_other, ',') as c2(item)
) t;
$$
language sql
immutable;
Then the above can be simplified to:
select col1, col2, remove_items(col1, col2)
from the_table;
Note: PostgreSQL is not my forte, but I thought I'd have a go at it. Try:
SELECT col1,
       col2,
       RTRIM(REGEXP_REPLACE(col1, CONCAT('\m(?:', REPLACE(col2, ',', '|'), ')\M,?'), '', 'g'), ',') AS col3
FROM myTable
The idea is to use a regular expression to remove all the values from col2, based on the following pattern:
\m - Word-boundary at start of word;
(?:V4|V1|V6) - A non-capture group that holds the alternatives from col2;
\M - Word-boundary at end of word;
,? - Optional comma.
After replacing the matches with nothing, we need to clean up a possible trailing comma with RTRIM(). Note that some regex engines do not support \m/\M and use the \b word boundary on both sides instead.
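For illustration, here is the same regex idea in Python, using \b in place of Postgres's \m/\M word boundaries (a sketch of the pattern's behavior, not the Postgres call itself):

```python
import re

def subtract_csv_regex(col1, col2):
    # Build \b(?:V4|V1|V6)\b,? from the values in col2,
    # then strip any trailing comma left behind.
    pattern = r"\b(?:" + "|".join(map(re.escape, col2.split(","))) + r")\b,?"
    return re.sub(pattern, "", col1).rstrip(",")

print(subtract_csv_regex("V1,V2,V3,V4,V5,V6", "V4,V1,V6"))  # V2,V3,V5
print(subtract_csv_regex("V1,V2,V3", "V2,V3"))              # V1
```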

Concatenate all columns, with the column names included, into one string for every row

CREATE TABLE myTable
(
COL1 int,
COL2 varchar(10),
COL3 float
)
INSERT INTO myTable
VALUES (1, 'c2r1', NULL), (2, 'c2r2', 2.335)
For every row of the table, I want one string containing all the column names and values.
Something like:
COL1=1|COL2=c2r1|COL3=NULL
COL1=2|COL2=c2r2|COL3=2.335
I have a table with a lot of columns, so it has to be dynamic (I would use it on other tables as well). Is there an easy solution where I can choose the separator and so on? It also has to deal with NULL values and numeric values.
I am using SQL Server 2019.
Since you are on 2019, use string_agg() with a bit of JSON.
Example
Select NewVal
From MyTable A
Cross Apply ( Select NewVal = string_agg([key]+'='+isnull(value,'null'),'|')
From OpenJson((Select A.* For JSON Path,Without_Array_Wrapper,INCLUDE_NULL_VALUES ))
) B
Results
NewVal
COL1=1|COL2=c2r1|COL3=null
COL1=2|COL2=c2r2|COL3=2.335000000000000e+000 -- Don't like the float
EDIT to Trap FLOATs
Select NewVal
From MyTable A
Cross Apply ( Select NewVal = string_agg([key]+'='+isnull(case when value like '%0e+0%' then concat('',convert(decimal(15,3),convert(float,value))) else value end,'null'),'|')
From OpenJson((Select A.* For JSON Path,Without_Array_Wrapper,INCLUDE_NULL_VALUES ))
) B
Results
NewVal
COL1=1|COL2=c2r1|COL3=null
COL1=2|COL2=c2r2|COL3=2.335
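The key=value pairing that the OpenJson step produces per row can be sketched in plain Python, which may help clarify what string_agg is then joining (a toy stand-in, not the T-SQL itself):

```python
def row_to_string(row, sep="|"):
    """Join a row (dict of column name -> value) into COL=val pairs;
    NULLs (None) are rendered as the literal 'null', as in the T-SQL answer."""
    return sep.join(
        f"{col}={'null' if val is None else val}" for col, val in row.items()
    )

print(row_to_string({"COL1": 1, "COL2": "c2r1", "COL3": None}))
# COL1=1|COL2=c2r1|COL3=null
print(row_to_string({"COL1": 2, "COL2": "c2r2", "COL3": 2.335}))
# COL1=2|COL2=c2r2|COL3=2.335
```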
Would one dare to abuse json for this?
SELECT REPLACE (REPLACE (REPLACE (REPLACE (REPLACE (ca.js,'":','='), ',"','|'), '"',''), '[{','') ,'}]','') AS data
FROM (SELECT col1 as id FROM myTable) AS list
CROSS APPLY
(
SELECT t.col1
, t.col2
, cast(t.col3 as decimal(16,3)) as col3
FROM myTable t
WHERE t.col1 = list.id
FOR JSON AUTO, INCLUDE_NULL_VALUES
) ca(js)
It'll work with a simple SELECT t.* in the cross apply.
But the floats tend to be a bit too long then.

What is best alternate for cursor in T-SQL?

I have a design concern that I need to implement without using a cursor.
There is a source table 'A' in which every column is of type varchar. I want to iterate over the rows, convert each column to the destination table's data type, and, if the conversion/parsing fails, log that row in a separate error table.
Any suggestions on how to proceed would be helpful.
In SQL Server, you would use try_convert():
insert into t2 ( . . . )
select . . .
from (select try_convert(?, col1) as col1,
try_convert(?, col2) as col2
from staging_t
) t
where col1 is not null and col2 is not null and . . .;
Then run a second query to get the rows where the value is NULL.
If NULL is a permitted value in the staging column, then this is a bit more complex:
insert into t2 ( . . . )
select new_col1, new_col2, . . .
from (select try_convert(?, col1) as new_col1, col1,
try_convert(?, col2) as new_col2, col2
from staging_t
) t
where (new_col1 is not null or col1 is null) and
(new_col2 is not null or col2 is null) and
. . .;
In SQL Server 2012 and up, each of these will return NULL when the conversion fails instead of raising an error:
try_convert(datatype,val)
try_cast(val as datatype)
try_parse(val as datatype [using culture])
Example: find all varcharcol values that would fail conversion to int:
select id, varcharcol
from a
where try_convert(int,varcharcol) is null
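The idea behind try_convert() can be sketched outside T-SQL: a converter that returns None (the NULL analogue) on failure, which is then used to split rows into good and bad sets, just as the two-query approach above does:

```python
def try_convert(cast, value):
    """Mimic T-SQL TRY_CONVERT: return the converted value, or None on failure."""
    try:
        return cast(value)
    except (TypeError, ValueError):
        return None

# Hypothetical staging rows: (int-like column, float-like column).
rows = [("1", "2.5"), ("oops", "3.0")]

good = [r for r in rows
        if try_convert(int, r[0]) is not None
        and try_convert(float, r[1]) is not None]
bad = [r for r in rows if r not in good]  # these would go to the error table

print(good)  # [('1', '2.5')]
print(bad)   # [('oops', '3.0')]
```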
You can also pass the rows in through a user-defined table type. Declare the procedure parameter as
@UserDefineTypeTable UserDefineTypeTable readonly
and then insert from it:
insert into table1 (col1, col2, col3)
select t.col1,
t.col2,
t.col3
from @UserDefineTypeTable as t

SQL _ wildcard not working as expected. Why?

So I have this query:
select id, col1, len(col1)
from tableA
From there, I wanted to grab all the data in col1 that has exactly 5 characters and starts with 15:
select id, col1, len(col1)
from tableA
where col1 like '15___' -- underscore 3 times
Now col1 is a nvarchar(192) and there are data that starts with 15 and are of length 5. But the second query always shows me no rows.
Why is that?
The cause could be that the field is padded with trailing spaces, such as "15123 ".
You could try another solution:
select id, col1, len(col1)
from tableA
where col1 like '15%' AND Len(col1)=5
EDIT - FOR FUTURE REFERENCE:
For the sake of comprehensiveness: char and nchar always use the full field size, so '15' stored in a char(10) is padded to "15" plus 8 trailing spaces, because the size is implicitly enforced. A varchar, on the other hand, resizes to whatever it is supplied, so '15' is simply '15'.
To get around this you could
A) Do an LTRIM/RTRIM to cut off all extra spaces:
select id, col1, len(col1)
from tableA
where rtrim(ltrim(col1)) like '15___'
B) Do a LEFT() to only grab the left 5 characters
select id, col1, len(col1)
from tableA
where left(col1,5) like '15___'
C) Cast as a varchar (a rather sloppy approach):
select id, col1, len(col1)
from tableA
where CAST(col1 AS Varchar(192)) like '15___'
Does this query return anything?
select id, col1, len(col1)
from tableA
where len(col1) = 5 and
left(col1, 2) = '15';
If not, then there are no values that match that pattern. My best guess would be trailing spaces, in which case this might work:
select id, col1, len(col1)
from tableA
where ltrim(rtrim(col1)) like '15___';
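The trailing-space theory is easy to reproduce. A small sketch using Python's built-in sqlite3, where LIKE and RTRIM behave analogously (the table and data are stand-ins for the question's setup):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table tableA (id int, col1 text)")
# '15123 ' has a trailing space, so it is six characters long
# and '15___' (exactly five characters) no longer matches.
con.execute("insert into tableA values (1, '15123 ')")

no_rows = con.execute(
    "select id from tableA where col1 like '15___'").fetchall()
trimmed = con.execute(
    "select id from tableA where rtrim(col1) like '15___'").fetchall()

print(no_rows)  # []
print(trimmed)  # [(1,)]
```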

SQL Server SELECT shows my own value not from database

I'm looking for a way to add my own data to some columns.
select co1, col2, col3 from tbl
I want col3 to exist but show only my own value:
select co1, col2, col3=3 from tbl
The output should be
1, 0, 3
I have a problem with CR9 and this is the only way, I guess.
If you want to call the third column col3, just do
select co1,col2, '3' as col3 from tbl
By the way,
select co1,col2,col=3 from tbl
was accepted (though not recommended by Microsoft) up to SQL Server 2008 R2; since SQL Server 2012 it is no longer accepted.
Just use "select co1,col2,'3' from tbl".
Try this:
select co1,col2, '3' as col3 from tbl
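A quick sanity check of the constant-column select, using Python's built-in sqlite3 (the table and values are a stand-in for the question's data):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table tbl (co1 int, col2 int)")
con.execute("insert into tbl values (1, 0)")

# A literal in the select list is returned unchanged for every row.
row = con.execute("select co1, col2, '3' as col3 from tbl").fetchone()
print(row)  # (1, 0, '3')
```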