Get values of 2 columns as a single value separated with '|' (SQL)

DECLARE @mockup TABLE(Column1 VARCHAR(1), Column2 VARCHAR(1));
INSERT INTO @mockup VALUES('1','2'),('-','2'),('1','2'),('-','-'),('2','-'),('1','2');
SELECT ISNULL(NULLIF(Column1 + '|', '-|'), '')
     + ISNULL(NULLIF(Column2, '-'), '')
FROM @mockup
The result of the above query is as below:
1|2
2
1|2
2|
1|2
I want the result exactly as above, except for row 4, where 2| should just be 2.
There should be no '|' at the start or end of the values.

Simply concatenate both fields and use REPLACE to remove |- and -|. The condition in the WHERE clause skips records where both fields are -:
select replace(replace(Column1 + '|' + Column2, '-|', ''), '|-', '') as Result
from @mockup
where coalesce(nullif(Column1, '-'), nullif(Column2, '-')) is not null
Output:
Result
1|2
2
1|2
2
1|2

Use the REPLACE function:
SELECT replace(replace(replace(Column1 + '|' + Column2, '-|', ''), '|-', ''), '-', '')
FROM @mockup
Or try using a CASE expression:
SELECT CASE
         WHEN column1 LIKE '[0-9]' AND column2 LIKE '[0-9]' THEN column1 + '|' + column2
         WHEN column1 LIKE '[0-9]' AND column2 NOT LIKE '[0-9]' THEN column1
         ELSE column2
       END
FROM @mockup
If you want to check for - instead of numbers, then:
SELECT CASE
         WHEN column1 NOT LIKE '-' AND column2 NOT LIKE '-' THEN column1 + '|' + column2
         WHEN column1 NOT LIKE '-' AND column2 LIKE '-' THEN column1
         ELSE column2
       END
FROM @mockup

I would do this as:
select stuff( (case when column1 not like '-' then '|' + column1 else '' end) +
              (case when column2 not like '-' then '|' + column2 else '' end),
              1, 1, '')
from @mockup;
This is the simplest way that I've found to emulate concat_ws() in SQL Server. concat_ws() is built into SQL Server 2017+ and other databases, where you would just do:
select concat_ws('|',
                 (case when column1 not like '-' then column1 end),
                 (case when column2 not like '-' then column2 end)
                )
from @mockup;
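For what it's worth, on SQL Server 2017+ the same idea can be written more compactly with NULLIF, since concat_ws() simply skips NULL arguments. A minimal sketch against the sample table above:
-- Assumes SQL Server 2017+ for CONCAT_WS; NULLIF turns the '-' placeholders
-- into NULLs, which CONCAT_WS skips, so no leading or trailing '|' appears.
SELECT CONCAT_WS('|', NULLIF(Column1, '-'), NULLIF(Column2, '-')) AS Result
FROM @mockup;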

Related

How to delimit string(s) and convert to column headers?

How can you take a string (or strings) and convert it to columns (dynamically, as the source strings will change)?
Example:
(select *
from table
inner join
(select column1, [dynamic field] from table) as dynamic_data
on table.column1 = dynamic_data.column1)
Column1
------
a,b,c
c,d,e
a,f,e
to this
column1 | a | b | c | d | e | f
-------------------------------
a,b,c   | x | x | x |   |   |
c,d,e   |   |   | x | x | x |
a,f,e   | x |   |   |   | x | x
Use like and case:
select column1,
(case when ',' + column1 + ',' like '%,a,%' then 'x' end) as a,
(case when ',' + column1 + ',' like '%,b,%' then 'x' end) as b,
(case when ',' + column1 + ',' like '%,c,%' then 'x' end) as c,
(case when ',' + column1 + ',' like '%,d,%' then 'x' end) as d,
(case when ',' + column1 + ',' like '%,e,%' then 'x' end) as e,
(case when ',' + column1 + ',' like '%,f,%' then 'x' end) as f
from t;
I am not sure why "dynamic" is necessary. The issue isn't the source strings, but the destination columns. If you need those to match the source strings, then you do need to use dynamic SQL . . . and that seems rather complicated.
EDIT:
The heart of the dynamic SQL is putting together the case expressions. You can do this using string_split() and string_agg() (or equivalent functions in older versions of SQL Server):
select string_agg(replace(' (case when '','' + column1 + '','' like ''%,[value],%'' then ''x'' end) as [value]',
                          '[value]', value), ',
'
       ) within group (order by value) as cols
from (select distinct value
      from t cross apply
           string_split(t.column1, ',')
     ) t
I'll let you construct the rest of the query.
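For illustration, here is one way the rest might be wired up (a sketch of my own, not part of the original answer), assuming SQL Server 2017+ and that the table is named t as above:
-- Capture the generated CASE expressions in a variable, splice them into a
-- dynamic SELECT, and run it with sp_executesql.
declare @cols nvarchar(max), @sql nvarchar(max);

select @cols = string_agg(replace(' (case when '','' + column1 + '','' like ''%,[value],%'' then ''x'' end) as [value]',
                                  '[value]', value), ',
'
              ) within group (order by value)
from (select distinct value
      from t cross apply
           string_split(t.column1, ',')
     ) t;

set @sql = N'select column1, ' + @cols + N' from t';
exec sp_executesql @sql;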

Update composite column

I just want to fill column7 of my_table with values from the other columns, so that the resulting column7 value looks like 86|WWB|2014, or 86|WWB|- in case column3 has the value 0. Here is my SQL:
UPDATE public.my_table
SET column7 =
case when column1 IS NULL then '-' else column1 end
|| '|' ||
case when column2 IS NULL then '-' else column2 end
|| '|' ||
case when column3 = '0' then '-' else column3 end
error: invalid input syntax for integer: "-"
The problem is the last CASE branch, because column3 is an integer.
Column1 and column2 are bpchar; column3 is int2.
Is there a way to solve this problem?
You have a type collision. It is easy to convert in Postgres:
UPDATE public.my_table
SET column7 = (coalesce(column1::text, '-') || '|' ||
coalesce(column2::text, '-') || '|' ||
(case when column3 = 0 then '-' else column3::text end)
);
Using concat() makes this a lot easier to read, and it automatically converts everything to text. However, the CASE expression needs to yield the same data type for all branches, so a cast to text is still needed for column3:
UPDATE public.my_table
SET column7 = concat(
    coalesce(column1, '-'), '|',
    coalesce(column2, '-'), '|',
    case when coalesce(column3, 0) = 0 then '-' else column3::text end
);
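An optional sanity check (my own suggestion, not part of the original answer): preview the generated value with a plain SELECT before running the UPDATE, using the same expression.
-- Preview the composite value before updating; same expression as in the UPDATE above.
SELECT concat(coalesce(column1, '-'), '|',
              coalesce(column2, '-'), '|',
              case when coalesce(column3, 0) = 0 then '-' else column3::text end) AS column7_preview
FROM public.my_table
LIMIT 10;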

Change column being used in WHERE condition based on IF condition

I apologize for the title; I was struggling to come up with something better.
I've been doing some research on this and did find some close examples; however, they are not quite what I need.
Basically, I have a table with two columns that I want to evaluate under certain conditions. Column 1 is an identifier that can also be null. Column 2 is a SessionId that can also change.
Primarily I key off of column1, but when column1 is null I would like to key off of column2 instead. The example I linked above doesn't change the column being evaluated in the WHERE clause, only the value being used to evaluate it.
Here is some pseudo-code to illustrate what I am trying to do:
SELECT * FROM MyTable
WHERE
    IF Column1 NOT NULL
        Column1 = @myvariable1
    ELSE
        Column2 LIKE '%' + @myvariable2 + '%'
Is something like this even possible? Can I switch the column to be evaluated in the WHERE clause based on the value of one of the columns?
I hope all that makes sense.
TIA
You could use CASE:
SELECT *
FROM MyTable
WHERE (CASE WHEN Column1 IS NOT NULL AND Column1 = @myvariable1 THEN 1
            WHEN Column2 LIKE '%' + @myvariable2 + '%' THEN 1
            ELSE 0
       END) = 1;
You should check Column1 for NULL in both branches of the WHERE clause, in case @myvariable1 is set to NULL.
SELECT * FROM MyTable
WHERE (Column1 IS NOT NULL AND Column1 = @myvariable1)
   OR (Column1 IS NULL AND Column2 LIKE '%' + @myvariable2 + '%')
WHERE Column1 = @myvariable1
   OR (Column1 IS NULL AND Column2 LIKE '%' + @myvariable2 + '%')
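Since the snippets above never declare the variables, here is a minimal self-contained sketch of the OR form; the table and column names come from the question, but the variable types are assumptions.
-- Variable names are from the question; the data types here are assumptions.
DECLARE @myvariable1 INT = 42;
DECLARE @myvariable2 VARCHAR(100) = 'abc';

SELECT *
FROM MyTable
WHERE (Column1 IS NOT NULL AND Column1 = @myvariable1)
   OR (Column1 IS NULL AND Column2 LIKE '%' + @myvariable2 + '%');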

Compare 2 different string columns in an SQL database

I have two different columns of strings, and I want to check whether one is contained in the other and create a third column with values of either 1 or 0 (1 if column2 contains column1, 0 otherwise).
For example:
Column1      Column2       Column3
Spar         I am Sparta   1
How are you  Sparta        0
How do you do this string comparison in SQL?
You could use the CHARINDEX function. Assuming you just want to check whether column1 is contained in column2:
SELECT CASE WHEN CHARINDEX(column1, column2) > 0
THEN 1
ELSE 0
END AS column3
FROM mytable
If you need to check both ways, just add another call:
SELECT CASE WHEN CHARINDEX(column1, column2) > 0 OR
CHARINDEX(column2, column1) > 0
THEN 1
ELSE 0
END AS column3
FROM mytable
Try this:
WITH temp(col1, col2) AS(
SELECT 'Spar', 'I am Sparta' UNION ALL
SELECT 'How are you', 'Sparta'
)
SELECT
*,
col3 =
CASE
WHEN col1 like '%' + col2 + '%' or col2 like '%' + col1 + '%' THEN 1
ELSE 0
END
FROM temp
It should work
select column1,column2, (case when column2 like '%' + column1 + '%' then 1 else 0 end) as column3 FROM yourtable
Try this
DECLARE @table1 TABLE
(
    Column1 VARCHAR(20),
    Column2 VARCHAR(20)
)
INSERT INTO @table1
VALUES ('Spar',
        'I am Sparta'),
       ('How are you',
        'Sparta')
SELECT column1,
       column2,
       CASE
           WHEN CHARINDEX(column1, column2) > 0
                OR CHARINDEX(column2, column1) > 0 THEN 1
           ELSE 0
       END AS column3
FROM @table1
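If the flag should live in the table itself rather than be computed in every query, a computed column is one option. A hedged sketch, assuming the data sits in a real table (here a hypothetical dbo.MyStrings) rather than the demo table variable:
-- Hypothetical table name; adds column3 as a computed 1/0 flag.
ALTER TABLE dbo.MyStrings
    ADD column3 AS CASE WHEN CHARINDEX(column1, column2) > 0 THEN 1 ELSE 0 END;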

SQL Server Comma Separated value among columns

I want to select columns as comma-separated values by doing something like:
select column1+','+column2+','+column3+','+column4 from someTable
except that if any of the columns holds a NULL value, I have to skip that column and not add a comma for it.
How do I do this in SQL Server?
[All columns are of type varchar so no casting needed]
Select
Case When Len(IsNull(Column1, '')) > 0 Then Column1 + ',' Else '' End,
Case When Len(IsNull(Column2, '')) > 0 Then Column2 + ',' Else '' End,
Case When Len(IsNull(Column3, '')) > 0 Then Column3 + ',' Else '' End,
Case When Len(IsNull(Column4, '')) > 0 Then Column4 + ',' Else '' End,
Case When Len(IsNull(ColumnN, '')) > 0 Then ColumnN + ',' Else '' End
From
SomeTable
try
Test table
create table #testCol (column1 varchar(10), column2 varchar(10),
column3 varchar(10), column4 varchar(10))
insert #testCol values('a', null,null,'b')
insert #testCol values(null,'a',null,'b' )
insert #testCol values(null,'a','Z','b' )
Query
select isnull(column1,'')+ case when column1 is null then '' else ',' end
+ isnull(column2,'')+ case when column2 is null then '' else ',' end
+ isnull(column3,'')+ case when column3 is null then '' else ',' end
+ isnull(column4,'')
from #testCol
Output
a,b
a,b
a,Z,b
Can you export to csv and then strip out all the double commas?
select isnull(column1 + ',', '') + isnull(column2 + ',', '') + isnull(column3 + ',', '') + isnull(column4, '') from someTable
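If you are on SQL Server 2017 or later, concat_ws() handles the skipping for you, since it ignores NULL arguments. A minimal sketch against the test table above:
-- Assumes SQL Server 2017+; CONCAT_WS skips NULL arguments, so no doubled
-- or trailing commas appear. Against #testCol this gives: a,b / a,b / a,Z,b
select concat_ws(',', column1, column2, column3, column4) as csv
from #testCol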