I know that this returns "2":
SELECT JSON_VALUE('{"x": "1", "y": {"a": "2"}}', '$.y.a') AS value FROM DUAL;
How do I return the "y" object (the one containing "a") from this query:
SELECT JSON_VALUE('{"x": "1", "y": {"a": "2"}}', '????') AS value FROM DUAL;
This returns null
SELECT JSON_VALUE('{"x": "1", "y": {"a": "2"}}', '$.y') AS value FROM DUAL;
Assuming you are using Oracle 12c or later (which is when JSON support was introduced): JSON_VALUE only returns scalar values, which is why '$.y' gives NULL. To get the y attribute as JSON you can use JSON_TABLE:
SELECT y
FROM   JSON_TABLE(
         '{"x": "1", "y": {"a": "2"}}',
         '$'
         COLUMNS
           y VARCHAR2(4000) FORMAT JSON PATH '$.y'
       );
Which outputs:
Y
---------
{"a":"2"}
Given I have rows in my database, with a JSONB column that holds an array of items as such:
[
{"type": "human", "name": "Alice"},
{"type": "dog", "name": "Fido"},
{"type": "dog", "name": "Pluto"}
]
I need to be able to query rows based on this column. The query I want to write is a check to see if my array argument intersects, at any point, with this column.
E.g.:
If I search for [{"type": "human", "name": "Alice"}], I should get a hit.
If I search for [{"type": "human", "name": "Alice"}, {"type": "dog", "name": "Doggy"}], I should also get a hit (since one of the objects intersects).
I've tried using the ?| operator, but according to the docs, comparison is only made by keys. I need to match the entire jsonb object.
You can use exists with a cross join:
select t.*
from tbl t
where exists (
  select 1
  from jsonb_array_elements(t.items) v
  cross join jsonb_array_elements('[{"type": "human", "name": "Alice"}, {"type": "dog", "name": "Doggy"}]'::jsonb) v1
  where v.value = v1.value
);
See fiddle.
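An alternative sketch (my variation, using the same tbl and items names as above): instead of comparing the unnested values, test whole-element containment with the @> operator:
select t.*
from tbl t
where exists (
  select 1
  from jsonb_array_elements('[{"type": "human", "name": "Alice"}, {"type": "dog", "name": "Doggy"}]'::jsonb) elem
  where t.items @> jsonb_build_array(elem)
);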
Wrapping the exists/cross join query as a function:
create or replace function get_results(param jsonb)
returns table(items jsonb)
as $$
  select t.*
  from tbl t
  where exists (
    select 1
    from jsonb_array_elements(t.items) v
    cross join jsonb_array_elements(param) v1
    where v.value = v1.value
  )
$$ language sql;
See fiddle.
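For example, it could be called with the array from the question:
select * from get_results('[{"type": "human", "name": "Alice"}, {"type": "dog", "name": "Doggy"}]'::jsonb);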
I have found this question PostgreSQL: Efficiently split JSON array into rows
I have a similar situation but for inserts instead.
Considering I do not have a table but raw JSON in an ndjson file...
{"x": 1}
{"x": 2, "y": 3}
{"x": 8, "z": 3}
{"x": 5, "y": 2, "z": 3}
I want to insert the data into a table of the following form (JSON fields which do not have a column of their own are stored in the json column):
 x | y    | json
---+------+----------
 1 | NULL | NULL
 2 | 3    | NULL
 8 | NULL | {"z": 3}
 5 | 2    | {"z": 3}
How do I define my table such that PostgreSQL does this automatically on insert or \copy?
Use the -> operator and cast the value to the proper type for values that have regular columns. Use the - (delete) operator to get the remaining JSON values.
I have used a CTE in the example. Instead, create the table json_data with a single JSONB column and copy the JSON file into it with \copy:
with json_data(json) as (
  values
    ('{"x": 1}'::jsonb),
    ('{"x": 2, "y": 3}'),
    ('{"x": 8, "z": 3}'),
    ('{"x": 5, "y": 2, "z": 3}')
)
select
  (json->'x')::int as x,
  (json->'y')::int as y,
  nullif(json - 'x' - 'y', '{}') as json
from json_data;
Read about JSON Functions and Operators in the documentation.
Note: in Postgres 10 or earlier, use the ->> operator instead of -> (casting a jsonb value directly to int requires Postgres 11 or later).
To automate the conversion when importing json data, define a trigger:
create table json_data(json jsonb);

create or replace function json_data_trigger()
returns trigger language plpgsql as $$
begin
  insert into target_table
  select
    (new.json->>'x')::int,
    (new.json->>'y')::int,
    nullif(new.json - 'x' - 'y', '{}');
  return new;
end $$;

create trigger json_data_trigger
before insert on json_data
for each row execute procedure json_data_trigger();
Test it in Db<>Fiddle.
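A rough loading sketch (the target_table definition and the file name data.ndjson are my assumptions; with the default text format of \copy, each line of the file becomes one jsonb value, so the JSON must not contain unescaped tabs or backslashes):
create table target_table(x int, y int, json jsonb);  -- assumed to match what the trigger inserts
\copy json_data (json) from 'data.ndjson'
Each loaded line then fires the trigger, which writes the converted row into target_table.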
MongoDB has a way of choosing the fields of a JSON document that are returned as a result of a query. I am looking for the same with PostgreSQL.
Let's assume that I've got a JSON like this:
{
a: valuea,
b: valueb,
c: valuec,
...
z: valuez
}
The particular values may be either simple values or subobjects with further nesting.
I want to have a way of returning JSON documents containing only the attributes I choose, something like:
SELECT json_col including_only a,b,c,g,n from table where...
I know that there is the "-" operator, which allows removing specific attributes, but is there an operator that does exactly the opposite?
In trivial cases you can use jsonb_to_record(jsonb)
with data(json_col) as (
  values
    ('{"a": 1, "b": 2, "c": 3, "d": 4}'::jsonb)
)
select *, to_jsonb(rec) as result
from data
cross join jsonb_to_record(json_col) as rec(a int, d int);
json_col | a | d | result
----------------------------------+---+---+------------------
{"a": 1, "b": 2, "c": 3, "d": 4} | 1 | 4 | {"a": 1, "d": 4}
(1 row)
See JSON Functions and Operators.
If you need a more generic tool, a function like this does the job:
create or replace function jsonb_sparse(jsonb, text[])
returns jsonb language sql immutable as $$
select $1 - (
select array_agg(key)
from jsonb_object_keys($1) as key
where key <> all($2)
)
$$;
-- use:
select jsonb_sparse('{"a": 1, "b": 2, "c": 3, "d": 4}', '{a, d}')
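For the example object this should return {"a": 1, "d": 4}, the same result as the jsonb_to_record query above.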
Test it in db<>fiddle.
I'm looking to do a query on a column in my database but the column is of type jsonb. This is an example of the structure:
select json_column->>'left' from schema.table;
[{"id": 123, "name": "Joe"},
{"id": 456, "name": "Jane"},
{"id": 789, "name": "John"},
{"id": 159, "name": "Jess"}]
Essentially I'm just trying to return all the name fields from this but I can't figure it out.
I have tried
select json_column->'left'->>'name' from schema.table
But this just returns a blank value.
I have also tried:
select elem->>'name'
from schema.table m,
jsonb_array_elements(json_column->'left') elem;
But that gives me:
ERROR: cannot extract elements from an object
This seems to work when I have a where clause inserted, for example:
select elem->>'name'
from schema.table m,
jsonb_array_elements(json_column->'left') elem
where m.id = 1;
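A guess, since the failing rows are not shown: the error suggests that in some rows json_column->'left' is a JSON object (or missing) rather than an array, and the where clause happens to exclude exactly those rows. A sketch that skips non-array values explicitly (table and column names as in the question):
select elem->>'name'
from schema.table m
cross join lateral jsonb_array_elements(
  case when jsonb_typeof(m.json_column->'left') = 'array'
       then m.json_column->'left'
       else '[]'::jsonb
  end) elem;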
I have a table in PostgreSQL 9.6.5 with some fields like:
CREATE TABLE test (
id SERIAL,
data JSONB,
amount DOUBLE PRECISION,
PRIMARY KEY(id)
);
In the data column there are JSON objects like this:
{
"Type": 1,
"CheckClose":
{"Payments":
[
{"Type": 4, "Amount": 2068.07},
{"Type": 1, "Amount": 1421.07}
]
}
}
What I need to do is to put into the amount field of each row the SUM of the Amount values of the Payments field of this data object. For example, for this particular object it should be 2068.07 + 1421.07 = 3489.14.
I've read some stuff about Postgres json and jsonb functions, so here is where I am now:
UPDATE test SET amount=sum((jsonb_array_elements(data::jsonb->'CheckClose'->'Payments')->>'Amount')::FLOAT)
That's not working - I get an error about not being able to use aggregate functions in UPDATE.
I tried to do something like this:
UPDATE test SET amount=temp.sum
FROM (
SELECT sum((jsonb_array_elements(data::jsonb->'CheckClose'->'Payments')->>'Amount')::FLOAT) AS "sum"
FROM test WHERE id=test.id
) as "temp"
Now I'm getting an error: set-valued function called in context that cannot accept a set.
How should I do this? I just need to calculate a sum and put it into another column, is that such a hard task?
Please, anyone, help me to figure this out. Thanks.
Trying to aggregate over the set-returning function:
t=# with c(j) as (values('{"Payments":
[
{"Type": 4, "Amount": 2068.07},
{"Type": 1, "Amount": 1421.07}
]
}'::jsonb))
select sum((jsonb_array_elements(j->'Payments')->>'Amount')::float) from c;
error:
ERROR: aggregate function calls cannot contain set-returning function calls
LINE 7: select sum((jsonb_array_elements(j->'Payments')->>'Amount'):...
^
HINT: You might be able to move the set-returning function into a LATERAL FROM item.
This can easily be overcome with another CTE:
t=# with c(j) as (values('{"Payments":
[
{"Type": 4, "Amount": 2068.07},
{"Type": 1, "Amount": 1421.07}
]
}'::jsonb))
, a as (select (jsonb_array_elements(j->'Payments')->>'Amount')::float am from c)
select sum(am) from a;
sum
---------
3489.14
(1 row)
So now just update from the CTE:
with s as (
  select id,
         (jsonb_array_elements(data->'CheckClose'->'Payments')->>'Amount')::float as sm
  from test
),
a as (
  select sum(sm) as sum, id
  from s
  group by id
)
update test
set amount = a.sum
from a
where a.id = test.id;
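The hint in the error message also suggests moving the set-returning function into a LATERAL FROM item; a variant sketch of the same update along those lines (the aliases t, p, elem and the column name total are mine):
update test t
set amount = p.total
from (
  select id,
         sum((elem->>'Amount')::float) as total
  from test
  cross join lateral jsonb_array_elements(data->'CheckClose'->'Payments') elem
  group by id
) p
where p.id = t.id;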