Get data from jsonb field array - sql

I have the following schema in PostgreSQL database:
CREATE TABLE survey_results (
id integer NOT NULL,
raw jsonb DEFAULT '{}'::jsonb,
created_at timestamp without time zone,
updated_at timestamp without time zone
);
INSERT INTO survey_results (id, raw, created_at, updated_at)
VALUES (1, '{ "slides": [{"id": "1", "name": "Test", "finished_at": 1517421628092}, {"id": "2", "name": "Test", "finished_at": 1517421894736}]}', now(), now());
I want to get data from raw['slides']. I want a query that returns the id and finished_at of every element of raw['slides']. So the result of the query should look like this:
id | finished_at
---+--------------
 1 | 1517421628092
 2 | 1517421894736
Here is a sqlfiddle to experiment with:
http://sqlfiddle.com/#!17/ae504
How can I do this in PostgreSQL?

You need to unnest the array, then you can access each element:
select s.slide ->> 'id' as id,
       s.slide ->> 'finished_at' as finished_at
from survey_results,
     jsonb_array_elements(raw -> 'slides') as s (slide)
http://sqlfiddle.com/#!17/ae504/80
For more details see JSON Functions and Operators in the manual.
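An alternative sketch against the same survey_results table: jsonb_to_recordset (standard in PostgreSQL 9.4+) expands the array into rows with declared column types in one step:
-- Sketch: expand raw -> 'slides' into rows with declared column types
select s.id, s.finished_at
from survey_results
     cross join lateral jsonb_to_recordset(raw -> 'slides')
                as s(id text, name text, finished_at bigint);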

Related

SQL get the value of a nested key in a jsonb field

Let's suppose I have a table my_table with a field named data, of type jsonb, which thus contains a JSON data structure.
Let's suppose that if I run
select id, data from my_table where id=10;
I get
id | data
------------------------------------------------------------------------------------------
10 | {
|"key_1": "value_1" ,
|"key_2": ["value_list_element_1", "value_list_element_2", "value_list_element_3" ],
|"key_3": {
| "key_3_1": "value_3_1",
| "key_3_2": {"key_3_2_1": "value_3_2_1", "key_3_2_2": "value_3_2_2"},
| "key_3_3": "value_3_3"
| }
| }
So, pretty-printed, the content of the data column is
{
"key_1": "value_1",
"key_2": [
"value_list_element_1",
"value_list_element_2",
"value_list_element_3"
],
"key_3": {
"key_3_1": "value_3_1",
"key_3_2": {
"key_3_2_1": "value_3_2_1",
"key_3_2_2": "value_3_2_2"
},
"key_3_3": "value_3_3"
}
}
I know that if I want to get the value of a ("level 1") key of the JSON directly in a column, I can do it with the ->> operator.
For example, if I want to get the value of key_2, what I do is
select id, data->>'key_2' alias_for_key_2 from my_table where id=10;
which returns
id | alias_for_key_2
------------------------------------------------------------------------------------------
10 |["value_list_element_1", "value_list_element_2", "value_list_element_3" ]
Now let's suppose I want to get the value of key_3_2_1, that is value_3_2_1.
How can I do it?
I have tried
select id, data->>'key_3'->>'key_3_2'->>'key_3_2_1' alias_for_key_3_2_1 from my_table where id=10;
but I get
select id, data->>'key_3'->>'key_3_2'->>'key_3_2_1' alias_for_key_3_2_1 from my_table where id=10;
^
HINT: No operators found with name and argument types provided. Types may need to be converted explicitly.
What am I doing wrong?
The problem in the query
select id, data->>'key_3'->>'key_3_2'->>'key_3_2_1' alias_for_key_3_2_1 --this is wrong!
from my_table
where id=10;
was that by using the ->> operator I was converting the JSON into a string, so that with the next ->> operator I was trying to get the key key_3_2 out of a string, which makes no sense.
Thus one has to use the -> operator, which does not convert the JSON into a string, until one reaches the "final" key.
so the query I was looking for was
select id, data->'key_3'->'key_3_2'->>'key_3_2_1' alias_for_key_3_2_1 --final ->> : this gets the value of 'key_3_2_1' as string
from my_table
where id=10;
or either
select id, data->'key_3'->'key_3_2'->'key_3_2_1' alias_for_key_3_2_1 --final -> : this gets the value of 'key_3_2_1' as json / jsonb
from my_table
where id=10;
More info on JSON Functions and Operators can be found here
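For deeper nesting, the #> / #>> path operators are an equivalent alternative; a sketch against the same my_table:
select id,
       data #>> '{key_3,key_3_2,key_3_2_1}' as alias_for_key_3_2_1  -- text result
from my_table
where id = 10;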

How do I Unnest varchar to json in Athena

I am crawling data from Google BigQuery and staging it in Athena.
One of the columns, crawled as a string, contains JSON:
{
"key": "Category",
"value": {
"string_value": "something"
}
}
I need to unnest these and flatten them to be able to use them in a query. I require the key and the string value (so in my query it will be where Category = something).
I have tried the following:
WITH dataset AS (
SELECT cast(json_column as json) as json_column
from "thedatabase"
LIMIT 10
)
SELECT
json_extract_scalar(json_column, '$.value.string_value') AS string_value
FROM dataset
which is returning null.
Casting json_column to json adds backslashes to the values:
"[{\"key\":\"something\",\"value\":{\"string_value\":\"app\"}}
If I use replace on the JSON, it is not allowed because the value is no longer a varchar.
So how do I extract the values from the some_column field?
Presto's json_extract_scalar actually supports extracting directly from a varchar (string) value:
-- sample data
WITH dataset(json_column) AS (
values ('{
"key": "Category",
"value": {
"string_value": "something"
}}')
)
--query
SELECT
json_extract_scalar(json_column, '$.value.string_value') AS string_value
FROM dataset;
Output:
string_value
something
Casting to json will encode the data as JSON (in the case of a string you will get a double-encoded one), not parse it. Use json_parse instead; in this particular case it is not needed, but there are cases where you will want it:
-- query
SELECT
json_extract_scalar(json_parse(json_column), '$.value.string_value') AS string_value
FROM dataset;
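If the column actually holds an array of such objects, as the escaped output above suggests, a sketch along these lines (table and column names are illustrative) parses the string, casts it to ARRAY(json), and unnests each element:
-- Sketch: parse the varchar, cast to ARRAY(json), then UNNEST each element
WITH dataset(json_column) AS (
    VALUES ('[{"key": "Category", "value": {"string_value": "something"}}]')
)
SELECT
    json_extract_scalar(element, '$.key')                AS item_key,
    json_extract_scalar(element, '$.value.string_value') AS string_value
FROM dataset
CROSS JOIN UNNEST(CAST(json_parse(json_column) AS ARRAY(json))) AS t(element);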

Search by json key in jsonb column PostgreSQL

I have a table with following structure (simplified):
Sometable:
id | data
---+----------------------------------------------------------------
 1 | {"data": [{"type": {"code": "S"}}, {"type": {"code": "aB"}}]}
 2 | {"data": [{"type": {"code": "B"}}]}
data is of jsonb type and the JSON structure is always the same. I need to find all records where 'code' equals a certain value, for example 'B'.
I've tried:
select * from sometable t
where 'B' in (jsonb_array_elements((t.data->'data'))#>>'{type, code}');
But that gives me an error:
set-returning functions are not allowed in WHERE.
Basically, anything I've tried in 'where' with 'jsonb_array_elements' gives that error. Is there any other way to find records by value of the 'code' key?
You can use the @> containment operator:
select *
from sometable t
where (t.data -> 'data') @> '[{"type": {"code": "B"}}]'
or
select *
from sometable t
where t.data @> '{"data": [{"type": {"code": "B"}}]}'
Online example
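If you need something more flexible than a containment literal (for example comparing against a parameter or using an inequality), an EXISTS over the unnested array keeps the set-returning function out of the outer WHERE; a sketch against the same table:
select *
from sometable t
where exists (
    select 1
    from jsonb_array_elements(t.data -> 'data') as e(elem)
    where e.elem #>> '{type,code}' = 'B'
);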

How to read JSON key values as a data column in Snowflake?

I have the below sample JSON:
{
"Id1": {
"name": "Item1.jpg",
"Status": "Approved"
},
"Id2": {
"name": "Item2.jpg",
"Status": "Approved"
}
}
and I am trying to get the following output:
_key | name      | Status
-----+-----------+---------
Id1  | Item1.jpg | Approved
Id2  | Item2.jpg | Approved
Is there any way I can achieve this in Snowflake using SQL?
You should use Snowflake's VARIANT data type in any column holding JSON data. Let's break this down step by step:
create temporary table FOO(v variant); -- Temp table to hold the JSON. Often you'll see a variant column simply called "V"
-- Insert into the variant column. Parse the JSON because variants don't hold string types. They hold semi-structured types.
insert into FOO select parse_json('{"Id1": {"name": "Item1.jpg", "Status": "Approved"}, "Id2": {"name": "Item2.jpg", "Status": "Approved"}}');
-- See how it looks in its raw state
select * from FOO;
-- Flatten the top-level JSON. The flatten function breaks down the JSON into several usable columns
select * from foo, lateral flatten(input => (foo.v)) ;
-- Now traverse the JSON using the column name and : to get to the property you want. Cast to string using ::string.
-- If you must have exact case on your column names, you need to double quote them.
select KEY as "_key",
VALUE:name::string as "name",
VALUE:Status::string as "Status"
from FOO, lateral flatten(input => (FOO.V)) ;
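Applied to a real table the same pattern holds; a sketch assuming a hypothetical table MY_TABLE with a variant column RAW:
-- Sketch with hypothetical names MY_TABLE / RAW: flatten the variant column directly
select f.key                  as "_key",
       f.value:name::string   as "name",
       f.value:Status::string as "Status"
from MY_TABLE,
     lateral flatten(input => MY_TABLE.RAW) f;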

Postgresql update record column based on its JSONB field values sum

I have a table in Postgresql 9.6.5 with some fields like:
CREATE TABLE test (
id SERIAL,
data JSONB,
amount DOUBLE PRECISION,
PRIMARY KEY(id)
);
In the data column there are JSON objects like this:
{
"Type": 1,
"CheckClose":
{"Payments":
[
{"Type": 4, "Amount": 2068.07},
{"Type": 1, "Amount": 1421.07}
]
}
}
What I need to do is to put into the amount field of each row the SUM of the Amount values of the Payments array of this data object. For example, for this particular object it should be 2068.07 + 1421.07 = 3489.14.
I've read some stuff about Postgres json and jsonb functions, so here is where I am now:
UPDATE test SET amount=sum((jsonb_array_elements(data::jsonb->'CheckClose'->'Payments')->>'Amount')::FLOAT)
That's not working - I get an error about not using aggregate functions in UPDATE.
I tried something like this:
UPDATE test SET amount=temp.sum
FROM (
SELECT sum((jsonb_array_elements(data::jsonb->'CheckClose'->'Payments')->>'Amount')::FLOAT) AS "sum"
FROM test WHERE id=test.id
) as "temp"
Now I'm getting an error: set-valued function called in context that cannot accept a set
How should I do this? I just need to calculate a sum and put it into another column, is that such a hard task?
Please, anyone, help me to figure this out. Thanks.
Trying to aggregate directly over the set-returning function:
t=# with c(j) as (values('{"Payments":
[
{"Type": 4, "Amount": 2068.07},
{"Type": 1, "Amount": 1421.07}
]
}'::jsonb))
select sum((jsonb_array_elements(j->'Payments')->>'Amount')::float) from c;
error:
ERROR: aggregate function calls cannot contain set-returning function calls
LINE 7: select sum((jsonb_array_elements(j->'Payments')->>'Amount'):...
^
HINT: You might be able to move the set-returning function into a LATERAL FROM item.
can easily be overcome with another CTE:
t=# with c(j) as (values('{"Payments":
[
{"Type": 4, "Amount": 2068.07},
{"Type": 1, "Amount": 1421.07}
]
}'::jsonb))
, a as (select (jsonb_array_elements(j->'Payments')->>'Amount')::float am from c)
select sum(am) from a;
sum
---------
3489.14
(1 row)
So now just update from the CTE:
with s as (
  SELECT (jsonb_array_elements(data::jsonb->'CheckClose'->'Payments')->>'Amount')::FLOAT AS "sm", id
  FROM test
)
, a as (select sum(sm) as "sum", id from s group by id)
UPDATE test SET amount = a.sum
FROM a
WHERE a.id = test.id;
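An alternative sketch that skips the CTEs entirely: compute the per-row sum in a correlated subquery, with the set-returning function moved into that subquery's FROM clause, which is exactly what the HINT suggested:
UPDATE test t
SET amount = (
    -- sum over the Payments array of this row's data object
    SELECT sum((p->>'Amount')::float)
    FROM jsonb_array_elements(t.data->'CheckClose'->'Payments') AS p
);
Note that rows without a Payments array end up with amount = NULL under this form, since the sum over zero rows is NULL.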