How to query an array which is stored as a string in Postgres? - sql

I have a table called table with a column called column of datatype text, holding values like '["1","2"]'.
I need to get all records which have "1" as one of the elements.
select *
from table
where column.....?
How should the where clause be?

Simply use LIKE. Keep the double quotes around the 1 so you match "1" itself but not other numbers that merely contain that digit (such as "11").
select *
from table
where column like '%"1"%'
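For illustration, a minimal sketch with made-up inline data (the names below are placeholders, not from the question) showing that the quotes keep "11" from matching:
select *
from (values ('["1","2"]'), ('["11","3"]')) as t(col)
where col like '%"1"%';
-- only the first row is returned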

I think you can use the ? operator on the jsonb type:
select *
from (
  select '["1","2"]' union all
  select '["0"]'
) as a(data)
where a.data::jsonb ? '1'
In general, I'd consider storing your data as jsonb instead of a string.
db<>fiddle example
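Applied to the question's setup, a minimal sketch (the question only uses the placeholder names table and column, so the names here are made up) would be:
select *
from my_table
where my_column::jsonb ? '1';
-- the cast assumes every value in the column is a valid JSON array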

Related

Converting DB2 column names in a select to lowercase in a JSON file

I currently have a very simple select that my code then dumps into JSON
SELECT user, phone
FROM table t;
But the select returns all uppercase column names, resulting in uppercase JSON keys, which I don't want. Is there a way in DB2 to return lowercase column names?
If you want lowercase column names (not data) in DB2, you must use double-quoted aliases for the columns.
SELECT user as "user", phone as "phone"
FROM table t;
If you want the data in the result to be lowercase (not the column names), try this
SELECT LOWER(user), LOWER(phone)
FROM table t;
Use the LOWER() function:
SELECT LOWER(user) AS "user", LOWER(phone) AS "phone"
FROM table t;
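To be clear about the difference, a hedged sketch reusing the question's placeholder names: the quoted aliases alone are what give you lowercase JSON keys, while LOWER() changes the data itself.
-- lowercase keys, data unchanged
SELECT user AS "user", phone AS "phone"
FROM table t;
-- lowercase keys and lowercase data
SELECT LOWER(user) AS "user", LOWER(phone) AS "phone"
FROM table t;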

What is the query to display all data in my table in uppercase?

I just tried using
SELECT UPPER(*) FROM TABLE
but it didn't work.
I suggest applying the UPPER function to each varchar or char column in your table.
For example:
SELECT UPPER(Name), UPPER(LastName), UPPER(CityName) FROM ClientsTable
This way you obtain the data in uppercase.
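If you need this regularly, a possible sketch (using the same example names) is to wrap the column-by-column UPPER() calls in a view:
CREATE VIEW ClientsTableUpper AS
SELECT UPPER(Name) AS Name, UPPER(LastName) AS LastName, UPPER(CityName) AS CityName
FROM ClientsTable;
-- SELECT * FROM ClientsTableUpper now returns the data in uppercase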

PostgreSQL: return row where any column value like variable

I am trying to let the user search for a value in a SQL table and return any row that contains that value. At the moment, I can make it work with code like this:
SELECT * FROM table WHERE lower('foo') in (lower(col1), lower(col2), etc)
However, I would like it to be able to search every column and return any row LIKE 'foo'. For instance,
SELECT * FROM table WHERE (lower(col1), lower(col2), etc) like lower('%foo%')
But that doesn't work.
Any suggestions?
I believe you need multiple conditions combined with OR instead of grouping the columns into one expression. Try this:
SELECT * FROM table
WHERE lower(col1) like lower('%foo%')
OR lower(col2) like lower('%foo%')
OR etc like lower('%foo%')
You can convert the whole row to a string and then use LIKE on the result of that:
select *
from the_table
where lower(the_table::text) like '%foo%';
the_table::text returns all columns of each row as a comma-separated list enclosed in parentheses, e.g. (42,Arthur,Dent). So the above is not 100% identical to a LIKE condition applied to each column - but it probably does what you want.
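A self-contained sketch with inline sample data (names invented here) showing the whole-row cast:
select *
from (values (1, 'Arthur', 'Dent'), (2, 'Ford', 'Prefect')) as the_table(id, first_name, last_name)
where lower(the_table::text) like '%dent%';
-- matches only the first row, because its row text is (1,Arthur,Dent)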

Complete list of special characters for the Hive LIKE operator

select * from table where column like '%a|b%'
The above query matches all rows with the column having either 'a' OR 'b' as a substring.
What if I want to match the substring "a|b"?
Using the query,
select * from table where column like '%a\|b%'
yields the same result.
Can I get a complete reference for the LIKE operator in Hive? The UDF manual seems insufficient.
You can use RLIKE (a regular expression match): select * from table where column rlike '.*a\|b.*'
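One caveat worth verifying on your Hive version: backslashes inside Hive string literals are themselves escape characters, so if the single backslash does not behave as expected, try doubling it, as in this sketch:
select * from table where column rlike 'a\\|b'
-- RLIKE looks for a match anywhere in the value, so the surrounding .* are not required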
You can use select * from table where column like '%a[|]b%'

In SQLite, how do I exclude rows which contain certain strings?

Here's my SQLite table (in comma-delimited format):
ROWID,COLUMN
1,"This here is a string |||"
2,"Here is another string"
3,"And yet another string"
I want to exclude all rows under 'COLUMN' which contain '|||'. Is this possible in SQLite?
select * from table where column not like '%|||%'
this should work
select * from your_table
where your_column not like '%|||%'
SQLFiddle demo
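To sanity-check against the sample rows, a self-contained sketch (SQLite syntax, names invented here):
with t(id, col) as (
  values (1, 'This here is a string |||'),
         (2, 'Here is another string'),
         (3, 'And yet another string')
)
select * from t where col not like '%|||%';
-- returns rows 2 and 3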