I have a jsonb column which has the following rows:
ROW 1:
[
  {
    "cpe23Uri": "cpe:2.3:a:sgi:irix:3.55:*:*:*:*:*:*:*",
    "active": true
  },
  {
    "cpe23Uri": "cpe:2.3:a:university_of_washington:imap:10.234:*:*:*:*:*:*:*",
    "active": true
  }
]
ROW 2:
[]
ROW 3:
[
  {
    "cpe23Uri": "cpe:2.3:o:sgi:irix:*:*:*:*:*:*:*:*",
    "active": true
  }
]
I want to find the rows that contain sgi:irix in the key cpe23Uri.
Which query should I use for best performance?
You could use an exists condition with a correlated subquery that uses jsonb_array_elements() to unnest and search the array:
select *
from mytable t
where exists (
  select 1
  from jsonb_array_elements(t.js) x
  where x ->> 'cpe23Uri' like '%sgi:irix%'
);
Demo on DB Fiddle
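For reference, a minimal sketch of the setup behind the query above, assuming the jsonb column is named js as in the answer; the table name and id column are placeholders added just to label the three rows:

-- hypothetical setup reproducing the three rows from the question
create table mytable (id int primary key, js jsonb);

insert into mytable (id, js) values
  (1, '[{"cpe23Uri": "cpe:2.3:a:sgi:irix:3.55:*:*:*:*:*:*:*", "active": true},
        {"cpe23Uri": "cpe:2.3:a:university_of_washington:imap:10.234:*:*:*:*:*:*:*", "active": true}]'),
  (2, '[]'),
  (3, '[{"cpe23Uri": "cpe:2.3:o:sgi:irix:*:*:*:*:*:*:*:*", "active": true}]');

-- the EXISTS query returns rows 1 and 3; row 2 has an empty array and is skipped
select *
from mytable t
where exists (
  select 1
  from jsonb_array_elements(t.js) x
  where x ->> 'cpe23Uri' like '%sgi:irix%'
);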
I have the following data structure in a column "recipient" of type jsonb:
{
  "phoneNumbers": [
    {
      "isDefault": true,
      "type": "MOBILE",
      "number": "3454654645"
    },
    {
      "isDefault": true,
      "type": "MOBILE",
      "number": "12423543645"
    }
  ]
}
I need to write a search by number. In the Postgres documentation I could not find how to search by a value inside an array, only how to access an element by its index, which doesn't suit me.
I wrote the query below, and it works, but are there any other ways to search through an array?
SELECT *
FROM my_table
WHERE recipient -> 'phoneNumbers' @> '[{"number": "3454654645"}]'
That's pretty much the best way, yes.
If you have a GIN index on recipient, that index would not be used by your condition. But the following could make use of such an index:
SELECT *
FROM my_table
WHERE recipient @> '{"phoneNumbers": [{"number": "3454654645"}]}'
If you are using Postgres 12 or later, you can also use a JSON path expression:
SELECT *
FROM my_table
WHERE recipient @@ '$.phoneNumbers[*].number == "12423543645"'
If you can't pass a JSON object to your query, you can use an EXISTS sub-select:
SELECT mt.*
FROM my_table mt
WHERE EXISTS (SELECT *
              FROM jsonb_array_elements(mt.recipient -> 'phoneNumbers') as x(element)
              WHERE x.element ->> 'number' = '3454654645')
The '3454654645' can be passed as a parameter to your query. This will never make use of an index though.
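A minimal sketch of such a GIN index, assuming the table and column names from the question; the containment (@>) and JSON path (@@) forms above can use it, while the EXISTS form cannot:

-- the default jsonb_ops operator class supports @>, and also @@ / @? on Postgres 12+
CREATE INDEX idx_my_table_recipient ON my_table USING gin (recipient);

-- alternative: jsonb_path_ops builds a smaller, faster index for @> containment,
-- but supports fewer operators
-- CREATE INDEX idx_my_table_recipient ON my_table USING gin (recipient jsonb_path_ops);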
I have a table with a column of the data type JSONB. Each row in the column has a JSON that looks something like this:
[
  {
    "A": {
      "AA": "something",
      "AB": false
    }
  },
  {
    "B": {
      "BA": [
        {
          "BAAA": [1, 2, 3, 4]
        },
        {
          "BABA": {
            ....
          }
        }
      ]
    }
  }
]
Note: the JSON is a complete mess of lists and objects, and it has a total of 300 lines. Not my data but I am stuck with it. :(
I am using PostgreSQL version 12.
How would I write the following queries:
Return all rows that have the value of AB set to false.
Return the values of BAAA in each row.
You can find the AB = false rows with a JSON Path query:
select *
from test
where data @@ '$[*].A.AB == false'
If you don't know where exactly the key AB is located, you can use:
select *
from test
where data @@ '$[*].**.AB == false'
To display all elements from the array as rows, you can use:
select id, e.*
from test
cross join jsonb_array_elements(jsonb_path_query_first(data, '$[*].B.BA.BAAA')) with ordinality as e(item, idx)
I include a column "id" as a placeholder for the primary key column, so that the source of the array element can be determined in the output.
Online example
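For illustration, a minimal sketch reproducing the above, assuming a table test with an id column and the jsonb column data as used in the queries; the sample row is abbreviated and the BABA content is a placeholder:

create table test (id int primary key, data jsonb);

insert into test (id, data) values
  (1, '[{"A": {"AA": "something", "AB": false}},
        {"B": {"BA": [{"BAAA": [1, 2, 3, 4]}, {"BABA": {}}]}}]');

-- rows where AB is false, wherever the key AB is located
select *
from test
where data @@ '$[*].**.AB == false';

-- one output row per element of the BAAA array: items 1,2,3,4 with ordinality 1..4
select id, e.item, e.idx
from test
cross join jsonb_array_elements(jsonb_path_query_first(data, '$[*].B.BA.BAAA'))
  with ordinality as e(item, idx);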
I'm using the following schema for the JSONB column of my table (the column is named fields). There are several of these field entries.
{
  "FIELD_NAME": {
    "value": "FIELD_VALUE",
    "meta": {
      "indexable": true
    }
  }
}
I need to find all the fields that contain this object
"meta": {
"indexable": true
}
Here is a naive attempt at using jsonb_object_keys in the WHERE clause; it doesn't work, but it illustrates what I'm trying to do.
with entry(fields) as (values('{
  "login": {
    "value": "fred",
    "meta": {
      "indexable": true
    }
  },
  "password_hash": {
    "value": "88a3d1c7463d428f0c44fb22e2d9dc06732d1a4517abb57e2b8f734ce4ef2010",
    "meta": {
      "indexable": false
    }
  }
}'::jsonb))
select * from entry where fields -> jsonb_object_keys(fields) @> '{"meta": {"indexable": "true"}}'::jsonb;
How can I query on the value of a nested object? Can I somehow join the result of jsonb_object_keys with the table itself?
demo:db<>fiddle
First way: using jsonb_each()
SELECT
    jsonb_build_object(elem.key, elem.value)        -- 3
FROM
    entry,
    jsonb_each(fields) as elem                      -- 1
WHERE
    elem.value @> '{"meta": {"indexable": true}}'   -- 2
1. Expand all subobjects into one row per "field". This creates two columns: the key and the value (in your case login and {"meta": {"indexable": true}, "value": "fred"}).
2. Filter the records by checking the value column for containing the meta object, using the @> operator as you already mentioned.
3. Recreate the JSON object (combining the key/value columns).
Second way: Using jsonb_object_keys()
SELECT
    jsonb_build_object(keys, fields -> keys)          -- 3
FROM
    entry,
    jsonb_object_keys(fields) as keys                 -- 1
WHERE
    fields -> keys @> '{"meta": {"indexable": true}}' -- 2
1. Finding all keys as you did.
2. and 3. are very similar to the first way.
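Run against the sample row from the question's CTE, both variants return the same single object; a quick check (the expected output is shown as a comment, since only login has indexable set to true):

with entry(fields) as (values('{
  "login": {"value": "fred", "meta": {"indexable": true}},
  "password_hash": {"value": "88a3d1c7463d428f0c44fb22e2d9dc06732d1a4517abb57e2b8f734ce4ef2010", "meta": {"indexable": false}}
}'::jsonb))
SELECT jsonb_build_object(elem.key, elem.value) AS field
FROM entry,
     jsonb_each(fields) AS elem
WHERE elem.value @> '{"meta": {"indexable": true}}';

-- field
-- {"login": {"meta": {"indexable": true}, "value": "fred"}}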
I have a Postgres database with a jsonb column named tags in a table. I am trying to query for rows where the confidence is >= 50.
I am unsure how to index into the predictions list to check the confidence. I have tried the query below; it executes without error but doesn't return any rows.
select * from mytable where (tags->>'confidence')::int >= 50;
Here is an example row of the jsonb:
{
  "predictions": [
    {
      "label": "Shopping",
      "confidence": 91
    },
    {
      "label": "Entertainment",
      "confidence": 4
    },
    {
      "label": "Events",
      "confidence": 2
    }
  ]
}
You need to normalize the data by un-nesting the array; then you can:
select p.d
from mytable mt
cross join lateral jsonb_array_elements(mt.tags -> 'predictions') as p(d)
where (p.d ->> 'confidence')::int >= 50;
For the above sample data, that returns:
{"label": "Shopping", "confidence": 91}
Online example: http://rextester.com/CBIAR76462
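If you want the whole base table rows rather than the matching prediction objects, an EXISTS variant of the same unnesting approach works; a sketch, assuming the table and column names used above:

-- returns the complete rows whose predictions array contains at least one
-- entry with confidence >= 50
select mt.*
from mytable mt
where exists (
  select 1
  from jsonb_array_elements(mt.tags -> 'predictions') as p(d)
  where (p.d ->> 'confidence')::int >= 50
);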
My JSON data looks like this:
[{
  "id": 1,
  "payload": {
    "location": "NY",
    "details": [{
        "name": "cafe",
        "cuisine": "mexican"
      },
      {
        "name": "foody",
        "cuisine": "italian"
      }
    ]
  }
}, {
  "id": 2,
  "payload": {
    "location": "NY",
    "details": [{
        "name": "mbar",
        "cuisine": "mexican"
      },
      {
        "name": "fdy",
        "cuisine": "italian"
      }
    ]
  }
}]
Given a text like "foo", I want to return all the tuples that contain this substring, but I cannot figure out how to write the query for it.
I followed this related answer but cannot figure out how to do LIKE.
This is what I have working right now:
SELECT r.res->>'name' AS feature_name, d.details::text
FROM restaurants r
   , LATERAL (SELECT ARRAY (
        SELECT * FROM json_populate_recordset(null::foo, r.res #> '{payload,details}')
        )
     ) AS d(details)
WHERE d.details @> '{cafe}';
Instead of passing the whole text of cafe I want to pass ca and get the results that match that text.
Your solution can be simplified some more:
SELECT r.res->>'name' AS feature_name, d.name AS detail_name
FROM restaurants r
, jsonb_populate_recordset(null::foo, r.res #> '{payload, details}') d
WHERE d.name LIKE '%oh%';
Or simpler, yet, with jsonb_array_elements() since you don't actually need the row type (foo) at all in this example:
SELECT r.res->>'name' AS feature_name, d->>'name' AS detail_name
FROM restaurants r
, jsonb_array_elements(r.res #> '{payload, details}') d
WHERE d->>'name' LIKE '%oh%';
db<>fiddle here
But that's not what you asked exactly:
I want to return all the tuples that have this substring.
You are returning all JSON array elements (0-n per base table row), where one particular key ('{payload,details,*,name}') matches (case-sensitively).
And your original question had a nested JSON array on top of this. You removed the outer array for this solution - I did the same.
Depending on your actual requirements the new text search capability of Postgres 10 might be useful.
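To pass just a fragment such as ca instead of the full name, the pattern can be concatenated from a parameter; a minimal sketch based on the jsonb_array_elements() variant above (same assumed table and layout), using ILIKE in case the match should be case-insensitive:

-- 'ca' stands in for the user-supplied fragment (normally a bind parameter)
SELECT r.res->>'name' AS feature_name
     , d->>'name'     AS detail_name
FROM   restaurants r
     , jsonb_array_elements(r.res #> '{payload, details}') d
WHERE  d->>'name' ILIKE '%' || 'ca' || '%';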
I ended up doing this (inspired by this answer: jsonb query with nested objects in an array):
SELECT r.res->>'name' AS feature_name, d.details::text
FROM restaurants r
   , LATERAL (
       SELECT * FROM json_populate_recordset(null::foo, r.res #> '{payload, details}')
     ) AS d(details)
WHERE d.details LIKE '%oh%';
Fiddle here - http://sqlfiddle.com/#!15/f2027/5