I have a PostgreSQL table like this one:
Table t
id | keys (jsonb)
---+----------------
1 | ["Key1", "Key2"]
My goal is to query this table to find out if one of the keys of a list is contained in the jsonb array column "keys".
I managed to get a result using:
SELECT *
FROM t
WHERE keys ?| ARRAY['Key1', 'Key2'];
I cannot find a way to make this query broader by applying lower() to the "keys" values in the table, though.
Is there a way to iterate over the elements and apply lower() to each one?
Thanks to the replies above, I managed to find a way to do it like this:
SELECT *
FROM t, jsonb_array_elements_text(keys) key
WHERE lower(key) IN ('key1', 'key2');
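Note that this form returns one row per matching array element, so a source row can appear more than once when several keys match. A minimal sketch of the same idea wrapped in EXISTS (assuming the search terms are supplied already lowercased) returns each row of t at most once:
SELECT *
FROM t
WHERE EXISTS (
    SELECT 1
    FROM jsonb_array_elements_text(t.keys) key  -- one text value per array element
    WHERE lower(key) IN ('key1', 'key2')
);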
You might need to unnest both arrays:
select *
from t
where exists (
select 1
from jsonb_array_elements_text(t.keys) k1(val)
inner join unnest(array['Key1', 'Key2']) k2(val)
on lower(k1.val) = lower(k2.val)
)
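For reference, a quick way to try this out (table and sample row assumed to match the question):
CREATE TABLE t (id int, keys jsonb);
INSERT INTO t VALUES (1, '["Key1", "Key2"]');
-- with the query above, searching for ARRAY['KEY1'] or ARRAY['key2'] returns row 1 either way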
These are samples of the two tables I have:
Table 1
material_id (int) | codes (jsonb)
------------------+-----------------------------------
1                 | ["A-12", "B-19", "A-14", "X-22"]
2                 | ["X-106", "A-12", "X-22", "B-19"]
...

Table 2
user_id | material_list (jsonb)
--------+----------------------
1       | [2, 3]
2       | [1, 2]
...
Table 1 contains material IDs and an array of codes associated with that material.
Table 2 contains user IDs. Each user has a list of materials associated with it, and this is saved as an array of material IDs.
I want to fetch a list of user IDs for all materials having certain codes. This is the query I tried, but it threw a syntax error:
SELECT user_id from table2
WHERE material_list ?| array(SELECT material_id
FROM table1 where codes ?| ['A-12','B-19]);
I am unable to figure out how to fix it.
Your query fails for multiple reasons.
First, ['A-12','B-19] isn't a valid Postgres text array. Either use an array constant or an array constructor:
'{A-12,B-19}'
ARRAY['A-12','B-19']
See:
How to pass custom type array to Postgres function
Pass array literal to PostgreSQL function
Next, the operator ?| demands text[] to the right, while you provide int[].
Finally, it wouldn't work anyway, as the operator ?| checks for JSON strings, not numbers. The manual:
Do any of the strings in the text array exist as top-level keys or array elements?
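To illustrate that point:
SELECT '[1, 2]'::jsonb     ?| ARRAY['1'];  -- false: the array elements are JSON numbers
SELECT '["1", "2"]'::jsonb ?| ARRAY['1'];  -- true: the elements are JSON strings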
Convert the JSON array to a Postgres integer array, then use the array overlap operator &&:
SELECT user_id
FROM tbl2
WHERE ARRAY(SELECT jsonb_array_elements_text(material_list)::int)
&& ARRAY(SELECT material_id FROM tbl1 where codes ?| array['A-12','B-19']);
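For context, a minimal setup matching the question's samples that this query can be run against (table and column definitions assumed):
CREATE TABLE tbl1 (material_id int, codes jsonb);
CREATE TABLE tbl2 (user_id int, material_list jsonb);
INSERT INTO tbl1 VALUES
  (1, '["A-12", "B-19", "A-14", "X-22"]'),
  (2, '["X-106", "A-12", "X-22", "B-19"]');
INSERT INTO tbl2 VALUES
  (1, '[2, 3]'),
  (2, '[1, 2]');
-- both sample users reference a material tagged A-12 or B-19, so both user_ids come back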
I strongly suggest altering your table to convert the JSON array in material_list to a Postgres integer array (int[]) for good. See:
Altering JSON column to INTEGER[] ARRAY
How to turn JSON array into Postgres array?
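A sketch of that one-time conversion, assuming material_list always holds a flat JSON array of integers (subqueries are not allowed in a USING expression, hence the string-level translate() trick):
ALTER TABLE tbl2
  ALTER COLUMN material_list TYPE int[]
  USING translate(material_list::text, '[]', '{}')::int[];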
Then the query gets simpler:
SELECT user_id
FROM tbl2
WHERE material_list && ARRAY(SELECT material_id FROM tbl1 where codes ?| '{A-12,B-19}');
db<>fiddle here
Or - dare I say it? - properly normalize your relational design. See:
How to implement a many-to-many relationship in PostgreSQL?
This looks like a job for unnesting the JSON arrays:
select t2.user_id
from table2 t2
where exists (select 1
from table1 t1 join
jsonb_array_elements_text(t2.material_list) j(material_id)
on t1.material_id = j.material_id::int join
jsonb_array_elements_text(t1.codes) j2(code)
on j2.code in ('A-12', 'B-19')
);
Here is a db<>fiddle.
I have a column of type jsonb[] (a Postgres array of jsonb objects) and I'd like to perform a SELECT on rows where a criterion is met by at least one of the objects. Something like:
-- Schema would be something like
mytable (
id UUID PRIMARY KEY,
col2 jsonb[] NOT NULL
);
-- Query I'd like to run
SELECT
id,
x->>'field1' AS field1
FROM
mytable
WHERE
x->>'field2' = 'user' -- for any x in the array stored in col2
I've looked around at ANY and UNNEST but it's not totally clear how to achieve this, since you can't run unnest in a WHERE clause. I also don't know how I'd specify that I want the field1 from the matching object.
Do I need a WITH table with the values expanded to join against? And how would I achieve that and keep the id from the other column?
Thanks!
You need to unnest the array; then you can access each JSON value:
SELECT t.id,
c.x ->> 'field1' AS field1
FROM mytable t
cross join unnest(col2) as c(x)
WHERE c.x ->> 'field2' = 'user'
This will return one row for each json value in the array.
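For reference, a minimal assumed setup to try this against (gen_random_uuid() is built in since Postgres 13; older versions need the pgcrypto extension):
CREATE TABLE mytable (
    id   uuid PRIMARY KEY DEFAULT gen_random_uuid(),
    col2 jsonb[] NOT NULL
);
INSERT INTO mytable (col2) VALUES
    (ARRAY['{"field1": "a", "field2": "user"}'::jsonb,
           '{"field1": "b", "field2": "admin"}'::jsonb]);
-- the query above returns this row once, with field1 = 'a'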
Consider a table temp (jsondata jsonb)
Postgres provides a way to check whether a jsonb array contains a given string, using the ? operator:
SELECT jsondata
FROM temp
WHERE (jsondata->'properties'->'home') ? 'football'
But we can't use the LIKE operator in such a containment check. One way to get LIKE matching against the array elements is:
SELECT jsondata
FROM temp,jsonb_array_elements_text(temp.jsondata->'properties'->'home')
WHERE value like '%foot%'
An OR combination of LIKE conditions can be achieved with:
SELECT DISTINCT jsondata
FROM temp,jsonb_array_elements_text(temp.jsondata->'properties'->'home')
WHERE value like '%foot%' OR value like 'stad%'
But I am unable to work out an AND combination of LIKE conditions against the jsonb array.
After unnesting the array with jsonb_array_elements_text() you can check which values meet either condition and sum the matches per original row, example:
drop table if exists temp;
create table temp(id serial primary key, jsondata jsonb);
insert into temp (jsondata) values
('{"properties":{"home":["football","stadium","16"]}}'),
('{"properties":{"home":["football","player","16"]}}'),
('{"properties":{"home":["soccer","stadium","16"]}}');
select jsondata
from temp
cross join jsonb_array_elements_text(temp.jsondata->'properties'->'home')
group by jsondata
-- or better:
-- group by id
having sum((value like '%foot%' or value like 'stad%')::int) = 2
jsondata
---------------------------------------------------------
{"properties": {"home": ["football", "stadium", "16"]}}
(1 row)
Update. The above query may be expensive on a large dataset. There is a simpler and faster solution: you can cast the array to text and apply LIKE to it, e.g.:
select jsondata
from temp
where jsondata->'properties'->>'home' like all('{%foot%, %stad%}');
jsondata
---------------------------------------------------------
{"properties": {"home": ["football", "stadium", "16"]}}
(1 row)
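For completeness, the OR case from the question works the same way with any instead of all (same sample data assumed):
select jsondata
from temp
where jsondata->'properties'->>'home' like any('{%foot%, %stad%}');
-- matches all three sample rows here, since each contains either "foot..." or "stad..."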
I came up with the following, but it was a bit fiddly. There's probably a better way, but this works, I think.
The idea is to find the matching JSON array entries, then collect the results. In the join condition we check that the "matches" array has the expected number of entries.
CREATE TABLE temp (jsondata jsonb);
INSERT INTO temp VALUES ('{"properties":{"home":["football","stadium",16]}}');
SELECT jsondata FROM temp t
INNER JOIN LATERAL (
SELECT array_agg(value) AS matches
FROM jsonb_array_elements_text(t.jsondata->'properties'->'home')
WHERE value LIKE '%foo%' OR value LIKE '%sta%'
LIMIT 1
) l ON array_length(matches, 1) = 2;
jsondata
-------------------------------------------------------
{"properties": {"home": ["football", "stadium", 16]}}
(1 row)
demo: db<>fiddle
I would cast the array into text. Then you are able to search for keywords with any string operator.
Disadvantage: because it was an array, the text contains characters like brackets, quotes, and commas, so it's not that simple to search for a keyword with a certain beginning (ABC%): you always have to search with %ABC%.
SELECT jsondata
FROM (
SELECT
jsondata,
jsondata->'properties'->>'home' as a
FROM
temp
)s
WHERE
a LIKE '%stad%' AND a LIKE '%foot%'
I have a result set that returns 3 columns, of which one is varchar and two are arrays. Now I need to merge the array columns to create a new array with non-null, unique elements. I have tried different options, but none of them are working. Any suggestions?
You can concatenate the arrays and unnest into rows. Then you can use distinct to get the unique rows, and array_agg to combine them back into an array:
select id
, array_agg(nr)
from (
select distinct id
, unnest(array[col1] || col2 || col3) nr
from t1
) sub
group by
id
Example at SQL Fiddle.
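Since the question also asks for non-null elements, a variant of the same query that drops NULLs before re-aggregating (same assumed table t1):
select id
, array_agg(nr)
from (
select distinct id
, unnest(array[col1] || col2 || col3) nr
from t1
) sub
where nr is not null  -- discard NULL elements
group by
id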
Thanks for the suggestions, guys. I found a solution: the array_cat function worked for me.
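For reference, a sketch of what that could look like, assuming the array columns share a compatible element type (names borrowed from the answer above):
select id
, array_cat(array[col1], array_cat(col2, col3)) as merged
from t1
-- note: array_cat only concatenates; it does not deduplicate or drop NULLs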
I am querying a database in Postgres using psql. I have used the following query to search a field called tags that has an array of text as its data type:
select count(*) from planet_osm_ways where 'highway' = ANY(tags);
I now need to create a query that searches the tags fields for any word starting with the letter 'A'. I tried the following:
select count(*) from planet_osm_ways where 'A%' LIKE ANY(tags);
This gives me a syntax error. Any suggestions on how to use LIKE with an array of text?
Use the unnest() function to convert the array to a set of rows:
SELECT count(distinct id)
FROM (
SELECT id, unnest(tags) tag
FROM planet_osm_ways) x
WHERE tag LIKE 'A%'
The count(distinct id) should count unique entries from the planet_osm_ways table; just replace id with your primary key's name.
That being said, you should really think about storing tags in a separate table, with a many-to-one relationship with planet_osm_ways, or create a separate table for tags that has a many-to-many relationship with planet_osm_ways. The way you store tags now makes it hard to use indexes while searching for tags, which typically means that each search performs a full table scan.
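For illustration, a sketch of such a separate tags table (names assumed, and assuming id is the primary key of planet_osm_ways):
CREATE TABLE way_tags (
    way_id bigint REFERENCES planet_osm_ways (id),
    tag    text NOT NULL,
    PRIMARY KEY (way_id, tag)
);
-- text_pattern_ops lets prefix searches like LIKE 'A%' use this index under non-C collations
CREATE INDEX way_tags_tag_idx ON way_tags (tag text_pattern_ops);

SELECT count(DISTINCT way_id)
FROM way_tags
WHERE tag LIKE 'A%';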
Here is another way to do it within the WHERE clause:
SELECT COUNT(*)
FROM planet_osm_ways
WHERE (
0 < (
SELECT COUNT(*)
FROM unnest(planet_osm_ways.tags) AS tag
WHERE tag LIKE 'A%'
)
);