How can I use a WHERE condition in SQL when a column contains comma-separated values across multiple rows? - sql

Column1  EventTypes_pKey
Are      5,3
Test     1,4,5
test     1,3,5
If I use
Select * from table_name where EventTypes_pKey in ('5,1,4')
then I want the records where those values appear in the column.
How can I write a WHERE condition based on EventTypes_pKey? It is a varchar column.
For example, if I select 5,3,4 then all three rows should be returned.
Please help me.

If you are using Postgres, you can do this by converting the value into an array and then using the overlaps operator &&:
select *
from badly_designed_table
where string_to_array(eventtypes_pkey, ',')::int[] && array[5,3,4];
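Outside Postgres, the same "do any of the wanted keys appear in the comma-separated list?" check can be sanity-tested in plain Python. This is only a sketch of the overlap logic, using the sample rows from the question:

```python
# Sample rows from the question: (Column1, EventTypes_pKey)
rows = [("Are", "5,3"), ("Test", "1,4,5"), ("test", "1,3,5")]

wanted = {5, 3, 4}  # the keys being searched for

# A row matches when its comma-separated keys overlap the wanted set,
# mirroring Postgres's && (array overlap) operator.
matches = [name for name, keys in rows
           if wanted & {int(k) for k in keys.split(",")}]
```

All three rows match, since each list contains at least one of 5, 3, 4.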


How do I find empty values in multiple columns at once using SQL in BigQuery?

To check a single column, I can use this:
SELECT column1
FROM `dataset.table`
WHERE column1 IS NULL OR column1 = '';
But what if I have 100 columns? Instead of going through column by column, changing column1 to column2, column3, etc., I'm looking for a one-for-all solution. I'm kinda new to SQL and data cleaning.
Consider the approach below:
select *
from your_table t
where regexp_contains(to_json_string(t), r':(?:null|"")[,}]')
The above will return all rows where any column is either null or an empty string.
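The trick works because TO_JSON_STRING serializes a row compactly, so a null or empty column shows up as :null or :"" in the JSON text. A rough illustration of the same idea in Python, with json.dumps (compact separators) standing in for BigQuery's TO_JSON_STRING and invented sample rows:

```python
import json
import re

# Invented sample rows; None plays the role of SQL NULL.
rows = [{"a": 1, "b": None}, {"a": "", "b": 2}, {"a": 1, "b": 2}]

# Same pattern as the BigQuery answer: a value that is null or ""
# immediately before a ',' or '}'.
pattern = re.compile(r':(?:null|"")[,}]')

# Compact separators mimic TO_JSON_STRING's output (no spaces).
flagged = [r for r in rows
           if pattern.search(json.dumps(r, separators=(",", ":")))]
```

Only the first two rows are flagged; the third has neither a null nor an empty string.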

Print the values of a column that are common to both tables

I want to print the values of a column that are common to both tables.
The issue is that one column's substring value matches the other column's full string.
Running the subquery alone fetches the right values (proving the substring logic is correct), but I think the part of the query after the WHERE clause needs changing.
Kindly suggest.
Code:
select distinct sd.sourceworkitemid
from sdidata sd
where sd.keyid1 = 'S-20210719-00000003'
and sd.sourceworkitemid in (select substr(testmethodid, 1, instr(testmethodid, '|') - 1)
                            from u_prodstypetest);
I want to take a substring of a column's value in one table and compare it with a column in another table. Since it is a substring, a plain column1 = column2 in the WHERE clause does not suffice, so I wrote the subquery to fetch the substring, which when run throws an error at 'in' because the subquery returns more than one value.
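The substring-plus-IN comparison can be exercised end to end with Python's sqlite3, using the table and column names from the question but invented sample data (SQLite's substr/instr stand in for the original dialect's functions, with substr positions starting at 1):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table sdidata (keyid1 text, sourceworkitemid text);
    create table u_prodstypetest (testmethodid text);
    -- invented sample data
    insert into sdidata values ('S-20210719-00000003', 'TM-100');
    insert into sdidata values ('S-20210719-00000003', 'TM-999');
    insert into u_prodstypetest values ('TM-100|extra');
""")

# Compare sourceworkitemid against the part of testmethodid before '|'.
rows = conn.execute("""
    select distinct sd.sourceworkitemid
    from sdidata sd
    where sd.keyid1 = 'S-20210719-00000003'
      and sd.sourceworkitemid in (
          select substr(testmethodid, 1, instr(testmethodid, '|') - 1)
          from u_prodstypetest)
""").fetchall()
```

'TM-100' matches the prefix of 'TM-100|extra'; 'TM-999' does not, so only one row comes back.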

PostgreSQL query is only returning one row when there are two rows that match the criteria

When I run the query:
select * from table
I get 300 results, 2 of which have col_name with the same value: 10348-0. The column is of type text.
When I run
select * from table where col_name = '10348-0'
I expect two rows to be returned, because two rows meet that criterion.
Instead I only get one row, whose ID is one less than the other row's, i.e. returned row id: 4556, unreturned row id: 4557. The ID column is of type serial integer.
Why am I only getting one row returned when I add a WHERE clause to the query?
Presumably there are hidden characters or look-alike characters in the value, so what looks the same is not necessarily the same.
The most common issue is leading or trailing spaces. I would suggest that you try finding these using:
col_name like '%10348-0%'
Another common problem is em-dashes versus en-dashes. You might also try:
col_name like '10348_0'
The _ is the LIKE wildcard to match exactly one character.
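One way to confirm the hidden-character theory is to dump the code points of the suspect values. A small Python sketch with invented values (a clean one, one with a trailing space, one with an en-dash instead of a hyphen):

```python
# Invented examples: clean, trailing space, en-dash instead of hyphen.
values = ["10348-0", "10348-0 ", "10348\u20130"]

# Show each value's code points to expose invisible differences.
for v in values:
    print(repr(v), [hex(ord(c)) for c in v])

exact = [v for v in values if v == "10348-0"]            # only the clean one
trimmed = [v for v in values if v.strip() == "10348-0"]  # also catches the space
```

The en-dash value survives both filters' rejection: it looks identical on screen but compares unequal.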

Partially replace a record in SQL

I need to mask data in my tables, for example data like:
ABCDEFG
XYZABCD
LMNOPQR
Should appear like:
AB*****
XY*****
LM*****
What update query can I use? Also, can I use a single query for updating multiple columns?
You can just mask it when displaying the data:
select stuff(stuff(stuff(col,3,3,'*'),7,3,'*'),10,3,'*') as col from table
Suppose the column you want to mask is called column in table table; then you can use the following query, which is standard SQL, to update the value in the column:
update table
set column = substring(column from 1 for 2) || '****';
If on the other hand you want only to select the values to show them, you can use the following query:
select substring(column from 1 for 2) || '****'
from table;
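A quick way to try the substring-based update is an in-memory SQLite database (substring ... from ... for ... becomes substr in SQLite; the data is the sample from the question):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (col text)")
conn.executemany("insert into t values (?)",
                 [("ABCDEFG",), ("XYZABCD",), ("LMNOPQR",)])

# Keep the first two characters, replace the rest with a fixed mask.
conn.execute("update t set col = substr(col, 1, 2) || '*****'")

rows = [r[0] for r in conn.execute("select col from t order by rowid")]
```

Note that the mask length is fixed at five stars; to preserve each value's original length you would build the stars from length(col) - 2 instead.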

SQLite WHERE clause for every column?

Does SQLite offer a way to search every column of a table for a search key?
SELECT * FROM table WHERE id LIKE ...
selects all rows where ... was found in the column id. But instead of searching only the column id, I want to search every column for the search string. I believe this does not work:
SELECT * FROM table WHERE * LIKE ...
Is that possible? Or what would be the next easiest way?
I use Python 3 to query the SQLite database. Should I instead search through the returned rows in Python after the query has executed?
A simple trick you can do is:
SELECT *
FROM table
WHERE ((col1 || col2 || col3 || col4) LIKE '%something%')
This will select the record if any of these 4 columns contains the word "something". (In SQLite, use || for concatenation; + would coerce the values to numbers.)
No; you would have to list or concatenate every column in the query, or reorganize your database so that you have fewer columns.
SQLite has full-text search tables where you can search all columns at once, but such tables do not work efficiently with any other queries.
I could not comment on #raging-bull's answer, so I had to write a new one. My problem was that I have columns with null values and got no results, because the concatenated "search string" was null.
Using coalesce I could solve that problem: SQLite takes the column content, or an empty string ('') if it is null, so there is always an actual string to search.
SELECT *
FROM table
WHERE (coalesce(col1,'') || coalesce(col2,'') || coalesce(col3,'') || coalesce(col4,'')) LIKE '%something%'
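The coalesce version can be verified with Python's sqlite3 and a few invented rows, including one where the matching column's neighbours are NULL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (col1 text, col2 text, col3 text)")
conn.executemany("insert into t values (?, ?, ?)", [
    ("foo", None, "bar"),
    (None, "something here", None),  # would be lost without coalesce
    ("x", "y", "z"),
])

# NULL columns become '' so the concatenation never collapses to NULL.
rows = conn.execute("""
    select * from t
    where (coalesce(col1,'') || coalesce(col2,'') || coalesce(col3,''))
          like '%something%'
""").fetchall()
```

Without the coalesce calls, the middle row's NULLs would make the whole concatenation NULL and the row would not be returned.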
I'm not quite sure if I understood your question.
If you want the whole row returned, when id=searchkey, then:
select * from table where id=searchkey;
If you want to have specific columns from the row with the correct searchkey:
select col1, col2, col3 from table where id=searchkey;
If you want to search multiple columns for the "id": first narrow down which columns it could be found in - you don't want to search the whole table! Then:
select * from table where col1=searchkey or col2=searchkey or col3=searchkey;
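Since the asker queries SQLite from Python 3, the per-column OR condition can also be generated instead of hand-written, using PRAGMA table_info to list the columns. A sketch with an invented table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (a text, b text, c text)")
conn.executemany("insert into t values (?, ?, ?)",
                 [("key1", "x", "y"), ("x", "key1", "y"), ("x", "y", "z")])

# PRAGMA table_info lists one row per column; index 1 is the column name.
cols = [row[1] for row in conn.execute("pragma table_info(t)")]

# Build "a = ? or b = ? or c = ?" and bind the search key once per column.
# Interpolating names is safe here: they come from PRAGMA, not user input.
where = " or ".join(f"{c} = ?" for c in cols)
rows = conn.execute(f"select * from t where {where}",
                    ["key1"] * len(cols)).fetchall()
```

The search key itself is still passed as a bound parameter, so only trusted column names end up in the SQL text.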