Filter Nested JSON in PostgreSQL DB - sql

{
  "List1": [
    {
      "f1": "b6ff",
      "f2": "day",
      "f3": "HO",
      "List2": [{"f1": 1.5, "f2": "RATE"}]
    }
  ]
}
This is nested JSON in which there's a list 'List2' inside another list 'List1'.
How do I filter on f1 = 1.5 in List2? I have tried the @> containment operator, but I could not get it to work with nested JSON.

Assuming you are using an up-to-date Postgres version and you want to get the rows that fulfill the condition, you can use a JSON path expression:
select *
from the_table
where the_column @@ '$.List1[*].List2[*].f1 == 1.5'
Alternatively you can use the @> containment operator, but the parameter must match the array structure in the column:
where the_column @> '{"List1": [{"List2": [{"f1": 1.5}]}]}';
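To try both variants end to end, here is a self-contained sketch (the_table and the_column are just the placeholder names used above; the @@ jsonpath form needs Postgres 12 or later):
-- throwaway table holding the JSON from the question
CREATE TEMP TABLE the_table (the_column jsonb);
INSERT INTO the_table VALUES
  ('{"List1": [{"f1": "b6ff", "f2": "day", "f3": "HO",
               "List2": [{"f1": 1.5, "f2": "RATE"}]}]}');

-- jsonpath predicate check (Postgres 12+)
SELECT * FROM the_table
WHERE  the_column @@ '$.List1[*].List2[*].f1 == 1.5';

-- containment check (also works on older versions)
SELECT * FROM the_table
WHERE  the_column @> '{"List1": [{"List2": [{"f1": 1.5}]}]}';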

You can try this in the WHERE clause to fetch the whole record and then use Python to get that element (note that it only looks at the first entry of each list):
SELECT ... WHERE
(dbfieldname #>> '{List1,0,List2,0,f1}')::numeric = 1.5
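If you would rather skip the Python step, jsonb_path_query() (Postgres 12+) can pull out just the matching List2 entries in SQL; a sketch keeping the dbfieldname placeholder from this answer and the the_table name from the first one:
-- returns each List2 element with f1 = 1.5 as its own row
SELECT jsonb_path_query(dbfieldname, '$.List1[*].List2[*] ? (@.f1 == 1.5)')
FROM   the_table;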

Related

How to remove/update a JSONB array element where key equals a value?

I'd like to remove/replace an element from a JSONB array where a property is equal to a set value. I've found a number of functions that will accomplish this, but I'd like to know if there's a way to do it without one, as I have database restrictions.
Here's an example JSONB value:
[
{ "ID": "valuea" },
{ "ID": "valueb" },
{ "ID": "valuec" }
]
I'd like to remove the second array position, where ID is equal to valueb, with a single update statement. I'd imagine this could involve finding the position/order of the element in the array and then using jsonb_set() to remove it.
It would also be helpful if there were a way to update the matching element and not just remove it. Likely a similar query, again with jsonb_set().
Unfortunately, there is no function to return the position of a JSON array element (yet) as of Postgres 15.
To remove a single matching element:
UPDATE tbl t
SET    js = t.js - (SELECT j.ord::int - 1
                    FROM   jsonb_array_elements(t.js) WITH ORDINALITY j(v, ord)
                    WHERE  j.v = '{"ID": "valueb"}'
                    LIMIT  1)
WHERE  t.js @> '[{"ID": "valueb"}]'   -- optional
AND    jsonb_typeof(t.js) = 'array';  -- optional
This UPDATE uses a correlated subquery with jsonb_array_elements().
About WITH ORDINALITY:
PostgreSQL unnest() with element number
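For a quick feel of what WITH ORDINALITY yields here, run it against a hand-written two-element array:
SELECT *
FROM   jsonb_array_elements('[{"ID": "valuea"}, {"ID": "valueb"}]'::jsonb)
       WITH ORDINALITY j(v, ord);
--         v         | ord
-- ------------------+-----
--  {"ID": "valuea"} |   1
--  {"ID": "valueb"} |   2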
Both WHERE clauses are optional.
Use the filter t.js @> '[{"ID": "valueb"}]' to suppress (potentially expensive!) empty updates and to make good use of an existing GIN index on the jsonb column.
Use the filter jsonb_typeof(t.js) = 'array' to suppress errors from non-arrays.
Note how the outer filter includes enclosing array decorators [], while the inner filter (after unnesting) does not.
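To illustrate that point, both of these containment checks are true, but the left-hand operands differ in shape:
SELECT '[{"ID": "valuea"}, {"ID": "valueb"}]'::jsonb @> '[{"ID": "valueb"}]';  -- whole array column: keep the []
SELECT '{"ID": "valueb"}'::jsonb @> '{"ID": "valueb"}';                        -- one unnested element: no []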
To remove all matching elements:
UPDATE tbl t
SET    js = (SELECT jsonb_agg(j.v)
             FROM   jsonb_array_elements(t.js) j(v)
             WHERE  NOT j.v @> '{"ID": "valueb"}')
WHERE  t.js @> '[{"ID": "valueb"}]';
The second query aggregates a new array from remaining elements.
This time, the inner filter uses @> instead of = to allow for additional keys. Choose the appropriate filter.
Aside: jsonb_set() might additionally be useful if the array in question is actually nested, unlike in your example.
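The question also asked about updating a matching element in place rather than removing it. A sketch under the same assumptions (table tbl, jsonb array column js), combining the WITH ORDINALITY lookup with jsonb_set(); the replacement value is only an example:
UPDATE tbl t
SET    js = jsonb_set(t.js,
                      ARRAY[(SELECT j.ord::int - 1
                             FROM   jsonb_array_elements(t.js) WITH ORDINALITY j(v, ord)
                             WHERE  j.v @> '{"ID": "valueb"}'
                             LIMIT  1)::text],
                      '{"ID": "valueb", "updated": true}')  -- example replacement element
WHERE  t.js @> '[{"ID": "valueb"}]'
AND    jsonb_typeof(t.js) = 'array';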

Error during query property from jsonb column with double quotation marks

I have the PostgreSQL table named 'orders' with the following columns:
id | params (jsonb type)                | valid
1  | {"value": "120", is_active: true}  | true
2  | {"value": "92", is_active: false}  | true
I'm trying to perform a SELECT query filtered by params.is_active = true.
The method which creates the filter receives a property like
filter.property = `(params ->>'is_active')::boolean`
The resulting query looks like:
select *
from "orders"
where "valid" = true
and "(params->>'is_active')::boolean" = true
limit 50
It gives me an error
ERROR: column "(params->>'is_active')::boolean" does not exist.
When I removed the double quotes around the jsonb expression:
select *
from "orders"
where "valid" = true
and (params->>'is_active')::boolean = true
limit 50
it worked fine.
My question: is it possible to provide the parameter to the filter method in a different format instead of
(params ->>'is_active')::boolean
to avoid this error? I know I could strip the double quotes from the resulting query with a regex, or change the filter method so it stops adding quotes, but maybe there is a way to simply provide a different value.
Your jsonb seems wrong:
select '{"value": "92", is_active: false}'::jsonb;
will fail because the is_active key is not quoted.
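With the key quoted, the cast goes through:
select '{"value": "92", "is_active": false}'::jsonb;  -- works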
As for whether you can provide the parameter to the filter method in a different format: you can use @> or jsonb_path_exists(). Running the following in DBeaver should be fine; I am not sure about Node.js.
WITH cte (params) AS (
    SELECT '{"value": 120, "is_active": true}'::jsonb
    UNION ALL
    SELECT '{"value": 92, "is_active": false}'::jsonb
)
SELECT params,
       params @> '{"is_active": true}'::jsonb,
       jsonb_path_exists(params, '$.is_active ? (@ == true)'),
       jsonb_path_exists(params, '$.value ? (@ >= $min)', '{"min": 120}')
FROM   cte;
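Applied to the orders table from the question (once its JSON keys are properly quoted), the jsonpath variant would look roughly like this:
select *
from   "orders"
where  "valid" = true
and    jsonb_path_exists(params, '$.is_active ? (@ == true)')
limit  50;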

PostgreSQL(function) - Implementation of where clause to get rows by a value(concatenate) in the JSONB column

I am trying to get the rows from a table where I want the condition to check for a value from a jsonb column. The column stores the data as:
[{"UserId": 420, "Permission": "Create"}, {"UserId": 369, "Permission": "View"}]
In the function, I check for the value using:
tab."Books" #> '[{"UserId":420}]'
but I want the 420 to be replaced with "ID" which I pass through the function. The only way I came across was concatenation(
tab."Books" #> '[{"UserId":'||ID||'}]'
which did not help.
Am I doing it wrong? Kindly suggest an alternative if any. Thanks.
You should be able to just cast:
tab."Books" #> ('[{"UserId":' || ID || '}]')::jsonb
Or you can use the json builder functions:
tab."Books" #> jsonb_build_array(jsonb_build_object('UserId', ID))
You can use jsonb_build_object, then convert that to a JSON array:
tab."Books" #> jsonb_build_array(jsonb_build_object('UserId', id))

PostgreSQL 9.6 jsonb query using like on arrays

I need to query a jsonb table field with the normal LIKE functions.
This is my JSON field:
"campi":[
{
"label":"testLabel",
"valore":[
"testValore",
"testValore2"
],
"idCampo":"testID",
"idCampoTP":"testCampoID",
"proprieta":[
{
"label":"testLabel",
"idProprieta":"testProp"
}
],
"idTipoCampo":"idTipoCampoID"
},
{
"label":"testLabel2",
"valore":[
"testValore3",
"testValore4"
],
"idCampo":"testID2",
"idCampoTP":"testCampoID2",
"proprieta":[
{
"label":"testLabel2",
"idProprieta":"testProp2"
}
],
"idTipoCampo":"idTipoCampoID2"
}
]
}
Is it even possible to make a query like this?
SELECT customfield from procedura WHERE customfield->'campi' @> '[{"label":"testLabel3"}]'
but with testLabel3 using LIKE wildcards: testLabel%?
Another question: is it even possible to make a query that gets the object(s) in "campi" with a "valore" of "testValore"?
My dream query would be:
SELECT customfield from procedura WHERE customfield->'campi' @> '[{"label":"testLabel%"}]'
with % as a wildcard.
EDIT:
I found a way to make some simple queries:
SELECT customfield FROM procedura, jsonb_array_elements(procedura.customfield #> '{campi}') obj
WHERE obj->>'idCampoTP' LIKE 'testCampoID3%' GROUP BY procedura.id;
but I can't figure out how to search in the valore sub-array.
EDIT:
I found this way, but to me it seems like a crude solution:
SELECT customfield FROM procedura, jsonb_array_elements(procedura.customfield #> '{campi}') obj
WHERE obj->>'valore' LIKE '%stValore5%' GROUP BY procedura.id;
Yes, it works :)
For filtering, as you have already mentioned in your question:
data->'campi' @> '[{"label":"testLabel3"}]';
For extracting entries with a valore of 'testValore':
data->'campi' @> '[{"valore": ["testValore"]}]';
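To combine LIKE with the nested valore array (the part the edits above struggled with), one option is to unnest both levels with jsonb_array_elements() / jsonb_array_elements_text(); a sketch against the procedura table from the question:
SELECT p.customfield
FROM   procedura p,
       jsonb_array_elements(p.customfield -> 'campi') campo,
       jsonb_array_elements_text(campo -> 'valore') val
WHERE  val LIKE 'testValore%'
GROUP  BY p.id, p.customfield;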

Query Postgres JSONB with traversal of nested key when first parent key is unknown

I'm trying to query using JSONB however I have a problem where I don't know what the first key could be.
Ideally I would be able to use a wildcard inside my query.
e.g. the following works:
WHERE json_data #> '{first_key,second_key}' = '"value-of-second-key"'
but I may not know the name of first_key, or I may want to match any of the nested sub-keys. Something like
WHERE json_data #> '{*,second_key}' = '"value-of-second-key"'
would be ideal, using a wildcard like '*'.
Any advice or approaches to this would be greatly appreciated.
You can't use a wildcard with the #> operator, but you can use the jsonb_each() function to unnest the first level of the JSON:
SELECT *
FROM jsonb_each('{"foo": {"second_key": "xxx"}, "bar": {"other_second_key": "xxx"}, "baz": {"second_key": "yyy"}}') AS e(key, value)
WHERE e.value @> '{"second_key": "xxx"}';
Result:
key | value
-----+-----------------------
foo | {"second_key": "xxx"}
(1 row)
If you just want to search for the row matching it though (and not the exact json element, as above) you can use EXISTS:
SELECT ...
FROM   the_table t
WHERE  EXISTS (
   SELECT 1
   FROM   jsonb_each(t.the_jsonb_column) AS e(key, value)
   WHERE  e.value @> '{"second_key": "xxx"}'
);
Logically, this approach works fine, but be warned that it can't take advantage of an index the way t.the_jsonb_column @> '{"foo": {"second_key": "xxx"}}' could, so if performance really matters, you may want to rethink your schema.
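As an aside: if you happen to be on Postgres 12 or later (the question does not say), the SQL/JSON path language does have a wildcard member accessor, which expresses the original intent directly; a sketch with a hypothetical table name:
SELECT *
FROM   the_table t
WHERE  jsonb_path_exists(t.json_data, '$.* ? (@.second_key == "value-of-second-key")');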