PostgreSQL 9.6 jsonb query using like on arrays - sql

I need to query a jsonb table column with the usual LIKE operators.
This is my JSON field:
"campi":[
{
"label":"testLabel",
"valore":[
"testValore",
"testValore2"
],
"idCampo":"testID",
"idCampoTP":"testCampoID",
"proprieta":[
{
"label":"testLabel",
"idProprieta":"testProp"
}
],
"idTipoCampo":"idTipoCampoID"
},
{
"label":"testLabel2",
"valore":[
"testValore3",
"testValore4"
],
"idCampo":"testID2",
"idCampoTP":"testCampoID2",
"proprieta":[
{
"label":"testLabel2",
"idProprieta":"testProp2"
}
],
"idTipoCampo":"idTipoCampoID2"
}
]
}
Is it even possible to make a query like this?
SELECT customfield from procedura WHERE customfield->'campi' #> '[{"label":"testLabel3"}]'
But with LIKE wildcards on testLabel3, e.g. testLabel%.
Another question: is it even possible to make a query that gets the object(s) in "campi" with a "valore" of "testValore"?
My dream query was:
SELECT customfield from procedura WHERE customfield->'campi' #> '[{"label":"testLabel%"}]'
With % as wildcard
EDIT:
I found a way to make some simple queries:
SELECT customfield FROM procedura, jsonb_array_elements(procedura.customfield #> '{campi}') obj
WHERE obj->>'idCampoTP' LIKE 'testCampoID3%' group by procedura.id;
but I can't figure out how to search inside the valore sub-array.
EDIT:
I found this way, but it seems like a poor solution to me:
SELECT customfield FROM procedura, jsonb_array_elements(procedura.customfield #> '{campi}') obj
WHERE obj->>'valore' LIKE '%stValore5%' group by procedura.id;
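A cleaner alternative is to expand the valore array itself with a second set-returning call, instead of LIKE-matching its text serialization. A sketch against the same procedura table:

```sql
-- Expand each element of "campi", then each entry of its "valore" array,
-- and apply LIKE to the individual array entries.
SELECT procedura.customfield
FROM procedura,
     jsonb_array_elements(procedura.customfield #> '{campi}') obj,
     jsonb_array_elements_text(obj -> 'valore') val
WHERE val LIKE '%stValore5%'
GROUP BY procedura.id;
```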

Yes, it works :)
For filtering on 'testLabel3' as you have already mentioned in your question (note that containment is spelled @>, not the path operator #>):
data->'campi' @> '[{"label":"testLabel3"}]';
For extracting rows with a valore of 'testValore':
data->'campi' @> '[{"valore": ["testValore"]}]';
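Put into a full query (a sketch, assuming a table named procedura with a jsonb column customfield as in the question; containment is the @> operator in Postgres):

```sql
-- @> finds rows whose "campi" array contains an object with the given
-- label; note this is an exact match, not a LIKE pattern.
SELECT customfield
FROM procedura
WHERE customfield -> 'campi' @> '[{"label": "testLabel"}]';
```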

Related

Filter Nested JSON in PostgreSQL DB

{
"List1": [
{
"f1": "b6ff",
"f2": "day",
"f3": "HO",
"List2": [{"f1": 1.5,"f2": "RATE"}]
}]
}
This is nested JSON in which a list 'List2' sits inside another list 'List1'.
How can I filter on f1 = 1.5 in List2? I tried the containment operator, but couldn't get it to work with nested JSON.
Assuming you are using an up-to-date Postgres version and you want to get the rows that fulfill the condition, you can use a JSON path expression:
select *
from the_table
where the_column @@ '$.List1[*].List2[*].f1 == 1.5'
Alternatively you can use the #> operator, but the parameter must match the array structure in the column:
where the_column #> '{"List1":[{"List2":[{"f1": 1.5}]}]}';
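On Postgres 12 or later, the same jsonpath predicate can also be written with the function form, which is easier to parameterize (a sketch against the same table and column names):

```sql
-- jsonb_path_exists() returns true when the path expression matches;
-- equivalent to the jsonpath predicate above.
select *
from the_table
where jsonb_path_exists(the_column,
                        '$.List1[*].List2[*] ? (@.f1 == 1.5)');
```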
You can try this in the WHERE clause to fetch the whole record, then use Python to pick out the element:
SELECT .... WHERE
dbfieldname #>> '{List1,0,List2,0,f1}' = '1.5'

How to extract JSON value through SQL?

"example_code": [
{
"code": "blah",
"type": "value"
}
]
In other cases we would write (meta->'example_code'->'code'),
but since it is inside [] brackets, I am not able to extract it.
Welcome to SO. Since example_code is an array, use -> 0 to access its first (and only) element. Here is the documentation on it.
The CTE the_table below mimics the real data:
with the_table(meta) as
(
values ('{"example_code":[{"code":"blah", "type":"value"}]}'::json)
)
select meta -> 'example_code' -> 0 ->> 'code' from the_table;
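If example_code may hold more than one object, expanding the array is safer than indexing element 0. A sketch reusing the same CTE:

```sql
-- json_array_elements() yields one row per array element, so every
-- "code" value is returned, not just the first.
with the_table(meta) as
(
values ('{"example_code":[{"code":"blah", "type":"value"}]}'::json)
)
select elem ->> 'code'
from the_table, json_array_elements(meta -> 'example_code') elem;
```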

Get a json attribute with variable name in Postgres

Not an SQL guru here.
I'm trying to write a query that gets a few columns of a table, plus only the "icon" value from the JSON column below (named weather). I got to a point where I can list all the attributes nested under sessions, which are timestamps, but I've had no luck iterating over them and joining them to the rest of the table.
I also have the feeling that it wasn't very clever to store that value as an attribute name, especially as it's already stored in the "dt" value.
Can anybody confirm whether this is best practice or not?
And could somebody help me get the "icon" value?
{
"lat":43.6423,
"lon":-72.2518,
"timezone":"America/New_York",
"timezone_offset":-14400,
"sessions":{
"1651078174":{
"dt":1651078174,
"sunrise":1651052825,
"sunset":1651103155,
"temp":48.45,
"feels_like":43.63,
"pressure":1009,
"humidity":68,
"dew_point":38.39,
"uvi":5,
"clouds":100,
"visibility":10000,
"wind_speed":11.5,
"wind_deg":310,
"weather":[
{
"id":804,
"main":"Clouds",
"description":"overcast clouds",
"icon":"04d"
}
]
}
}
}
If you have only one icon and multiple sessions, you can run the query below.
If you have multiple icons, you need to apply another CTE layer to extract them with json_each.
For more JSON functions see https://www.postgresql.org/docs/current/functions-json.html
WITH CTE AS (
select value from json_each('{
"lat":43.6423,
"lon":-72.2518,
"timezone":"America/New_York",
"timezone_offset":-14400,
"sessions":{
"1651078174":{
"dt":1651078174,
"sunrise":1651052825,
"sunset":1651103155,
"temp":48.45,
"feels_like":43.63,
"pressure":1009,
"humidity":68,
"dew_point":38.39,
"uvi":5,
"clouds":100,
"visibility":10000,
"wind_speed":11.5,
"wind_deg":310,
"weather":[
{
"id":804,
"main":"Clouds",
"description":"overcast clouds",
"icon":"04d"
}
]
}
}
}
') WHERE key = 'sessions')
SELECT json_data.key session,
json_data.value-> 'weather' -> 0 ->> 'icon' FROM CTE,json_each(CTE.value) json_data
session | ?column?
:--------- | :-------
1651078174 | 04d
db<>fiddle here
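To join this back to the rest of a real table, the same json_each call can be used as a lateral source. A sketch, where the table name readings and its columns id and weather are assumptions standing in for your schema:

```sql
-- One output row per session: json_each() expands the "sessions" object,
-- then the first weather entry's icon is pulled out per session.
SELECT r.id,
       s.key AS session,
       s.value -> 'weather' -> 0 ->> 'icon' AS icon
FROM readings r,
     json_each(r.weather -> 'sessions') s;
```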

Select column names and querying "LIKE" on Couchbase

I want to get all column names from a bucket.
I found a query:
SELECT ARRAY_DISTINCT(ARRAY_AGG(v)) AS column
FROM mybucket b UNNEST object_names(b) AS v
It gets the array of column names, but I also need SQL's LIKE filtering, like this:
SELECT column
FROM mybucket
WHERE column LIKE '%test%'
Is there a way to do this?
OBJECT_NAMES() only gives top-level field names (it does not include nested fields):
https://docs.couchbase.com/server/current/n1ql/n1ql-language-reference/objectfun.html
SELECT DISTINCT v AS column
FROM mybucket b UNNEST OBJECT_NAMES(b) AS v
WHERE v LIKE "%test%";
This is a tricky one, depending on what you want the resultant structure to be. And disclaimer, there might be a more succinct way to do this (but I haven't found it yet--maybe there's another way that doesn't involve OBJECT_NAMES?).
But anyway, the key to this for me was the ARRAY collection operator.
For instance, this:
SELECT ARRAY a FOR a IN ARRAY_DISTINCT(ARRAY_AGG(allFieldNames))
WHEN a LIKE '%test%' END AS filteredFieldNames
FROM mybucket b UNNEST object_names(b) AS allFieldNames
will return results like:
[
{
"filteredFieldNames": [
"testField1",
"anotherTestField"
]
}
]
If you want a different format, you can work with the ARRAY operator expression. For instance:
SELECT ARRAY { "fieldName" : a } FOR a IN
ARRAY_DISTINCT(ARRAY_AGG(allFieldNames))
WHEN a LIKE '%test%' END AS filteredFieldNames
FROM mybucket b UNNEST object_names(b) AS allFieldNames
Which would return:
[
{
"filteredFieldNames": [
{
"fieldName": "testField1"
},
{
"fieldName": "anotherTestField"
}
]
}
]

Query data inside an attribute array in a json column in Postgres 9.6

I have a table, say types, which has a JSON column, say location, that looks like this:
{ "attribute":[
{
"type": "state",
"value": "CA"
},
{
"type": "distance",
"value": "200.00"
} ...
]
}
Each row in the table has this data, and every row has a "type": "state" entry. I want to extract the value paired with "type": "state" from every row and put it in a new column. I checked out several questions on SO, like:
Query for element of array in JSON column
Index for finding an element in a JSON array
Query for array elements inside JSON type
but could not get it working. I do not need to query on this. I need the value of this column. I apologize in advance if I missed something.
create table t(data json);
insert into t values('{"attribute":[{"type": "state","value": "CA"},{"type": "distance","value": "200.00"}]}'::json);
select elem->>'value' as state
from t, json_array_elements(t.data->'attribute') elem
where elem->>'type' = 'state';
| state |
| :---- |
| CA |
dbfiddle here
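To actually put the value in a new column, as the question asks, the same expression can drive an UPDATE. A sketch, assuming the toy table t from above and that each row has exactly one state entry:

```sql
-- Add the target column, then fill it from the JSON per row.
ALTER TABLE t ADD COLUMN state text;

UPDATE t
SET state = (
  SELECT elem ->> 'value'
  FROM json_array_elements(t.data -> 'attribute') elem
  WHERE elem ->> 'type' = 'state'
  LIMIT 1
);
```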
I mainly use Redshift where there is a built-in function to do this. So on the off-chance you're there, check it out.
redshift docs
It looks like Postgres has a similar function set:
https://www.postgresql.org/docs/current/static/functions-json.html
I think you'll need to chain three functions together to make this work.
SELECT
your_field::json->'attribute'->0->'value'
FROM
your_table
What I'm doing is a JSON extract by key name, followed by a JSON array extract by index (always the first, if your example is consistent with the full data), followed finally by another extract by key name.
Edit: got it working for your example
SELECT
'{ "attribute":[
{
"type": "state",
"value": "CA"
},
{
"type": "distance",
"value": "200.00"
}
]
}'::json->'attribute'->0->'value'
Returns "CA"
2nd edit: nested querying
@McNets's is the right, better answer. But in this dive I discovered you can nest queries in Postgres! How frickin' cool!
I stored the json as a text field in a dummy table and successfully ran this:
SELECT
(SELECT value FROM json_to_recordset(
my_column::json->'attribute') as x(type text, value text)
WHERE
type = 'state'
)
FROM dummy_table