SQL to convert JSON object into array of objects in Postgres - sql

I have a column of JSON type in a Postgres table. It currently has values like this:
{"value": "abc"}
I want to write a SQL query that can change this to:
[{"value": "abc", "timestamp": 1465373673}]
The "timestamp": 1465373673 part will be hard-coded.
Any ideas on how this SQL query can be written?

You can use json_build_array and json_build_object:
update test
set a = json_build_array(
    json_build_object('value', a -> 'value', 'timestamp', 1465373673)
);
Here's a fiddle.

Use the concatenation operator and the function jsonb_build_array():
select jsonb_build_array('{"value": "abc"}'::jsonb || '{"timestamp": 1465373673}');
jsonb_build_array
---------------------------------------------
[{"value": "abc", "timestamp": 1465373673}]
(1 row)
Read JSON Functions and Operators.
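To apply this as the UPDATE the question asks for, a minimal sketch (assuming the column a is of type json, as in the first answer, hence the casts):
update test
set a = jsonb_build_array(a::jsonb || '{"timestamp": 1465373673}')::json;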


How to remove array wrapper from MariaDB server

I want to remove the array wrapper surrounding a query result, as I'm running a for loop to push the objects into an array. This is my query:
"SELECT * FROM jobs WHERE id = ? FOR JSON PATH, WITHOUT_ARRAY_WRAPPER"
but I'm getting this result in Postman:
{
  "status": "Failed",
  "message": "You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'JSON PATH, WITHOUT_ARRAY_WRAPPER' at line 1"
}
FOR JSON PATH is a feature of Microsoft SQL Server. There is a standard for JSON in SQL, but don't expect most database engines to follow it.
You can get a single JSON object for each row with json_object.
-- {"id": 2, "name": "Bar"}
select
json_object('id', id, 'name', name)
from jobs
where id = 2
Rather than querying each job individually and appending to an array, you can do this in a single query: use the in operator to select all desired rows at once, then json_arrayagg to aggregate them into a single array.
-- [{"id": 1, "name": "Foo"},{"id": 3, "name": "Baz"}]
select
json_arrayagg( json_object('id', id, 'name', name) )
from jobs
where id in (1, 3)
This is much more efficient. In general, if you're running SQL queries in a loop, there's usually a better way.
Demonstration.

PostgreSQL JSON - String_agg in json data with multiple objects

I have created a table with a JSON column in PostgreSQL, and I have inserted data with multiple objects in the JSON, like this:
[
  {
    "Type": "1",
    "Amount": "1000",
    "Occurrence": 2,
    "StartDate": "1990-01-19",
    "EndDate": "1999-04-03"
  },
  {
    "Type": "2",
    "Amount": "2000",
    "Occurrence": 2,
    "StartDate": "1984-11-19",
    "EndDate": "1997-09-29"
  }
]
Now I have to retrieve the data in the format below, in a single row, like the output of the string_agg() function:
Type   Amount
1--2   1000-2000
I have also checked the built-in JSON functions in PostgreSQL (https://www.postgresqltutorial.com/postgresql-json/) but did not find a solution for this.
You will have to unnest the array and aggregate the individual keys. The following assumes you have some kind of primary key column on the table (in addition to your JSON column):
select t.id,
       string_agg(x.element ->> 'Type', '-') as types,
       string_agg(x.element ->> 'Amount', '-') as amounts
from the_table t
cross join jsonb_array_elements(t.data) as x(element)
group by t.id;
jsonb_array_elements() extracts each array element and the main query then aggregates that back per ID using string_agg().
The above assumes your column is defined with the type jsonb (which it should be). If it is not, you need to use json_array_elements() instead.
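With a json column you can also cast on the fly instead; a sketch of the same query, assuming the same table:
select t.id,
       string_agg(x.element ->> 'Type', '-') as types,
       string_agg(x.element ->> 'Amount', '-') as amounts
from the_table t
cross join jsonb_array_elements(t.data::jsonb) as x(element)
group by t.id;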
Online example

Retrieve data within a SQL JSON object and array

I have data formatted in the following way in a column named value:
{
  "data" : [
    "AVM": "1,000",
    "location": "CA"
  ]
}
I am trying to write a simple SQL query to retrieve the AVM values for the entire dataset stored in a PostgreSQL database, which is a couple thousand records.
Does anyone know an elegant solution to be able to do this?
select p."value" -> 'data'
from table as p;
This gets me the data object, but I'm not able to dig into the array to retrieve the AVM values.
If you try the value as written:
select '{ data: [ "AVM": "1,000", "location": "CA" ] }'::json;
ERROR: invalid input syntax for type json
LINE 1: select '{ data: [ "AVM": "1,000", "location": "CA" ] }'::jso...
Assuming the data is YAML, you could pull it out of the database and use a YAML parser to extract the values. An example in Python (https://pyyaml.org/):
>>> import yaml
>>> y_str = '{ "data" : [ "AVM": "1,000", "location": "CA" ] }'
>>> y_dict = yaml.safe_load(y_str)
>>> y_dict
{'data': [{'AVM': '1,000'}, {'location': 'CA'}]}
>>> y_dict["data"][0]["AVM"]
'1,000'
If you have plpythonu or plpython3u installed in the database, you could write a function that does the same thing.
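A minimal sketch of such a function, assuming the plpython3u extension is installed and PyYAML is available to the server's Python (the function and table names are hypothetical):
CREATE OR REPLACE FUNCTION yaml_avm(doc text)
RETURNS text AS $$
    import yaml
    parsed = yaml.safe_load(doc)
    # "data" is a list of single-pair mappings; the first one holds AVM
    return parsed["data"][0]["AVM"]
$$ LANGUAGE plpython3u;
-- usage, with a placeholder table name:
-- select yaml_avm(t.value) from the_table t;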
I was able to complete the query using the following in a Postgres database:
select (t.value -> 'data' ->> 0)::json -> 'AVM' as "AVM"
from table as t;
This is the general format; table here is a placeholder name.
What you are looking for is a query similar to this:
SELECT JSON_VALUE(value, '$.data.AVM')
FROM {tableName}
You can also filter data:
SELECT JSON_VALUE(value, '$.data.AVM')
FROM {tableName}
WHERE JSON_VALUE(value, '$.data.location') = 'CA'
More details here. Note that JSON_VALUE is standard SQL/JSON syntax; PostgreSQL only supports it natively from version 17, so on older versions use the arrow operators shown above.

Get JSON_VALUE with Oracle SQL when multiple nodes share the same name

I have an issue where I have some JSON stored in my Oracle database, and I need to extract values from it.
The problem is, there are some fields that are duplicated.
When I try this, it works as there is only one firstname key in the options array:
SELECT
JSON_VALUE('{"increment_id":"2500000043","item_id":"845768","options":[{"firstname":"Kevin"},{"lastname":"Test"}]}', '$.options.firstname') AS value
FROM DUAL;
Which returns 'Kevin'.
However, when there are two values for the firstname field:
SELECT JSON_VALUE('{"increment_id":"2500000043","item_id":"845768","options":[{"firstname":"Kevin"},{"firstname":"Okay"},{"lastname":"Test"}]}', '$.options.firstname') AS value
FROM DUAL;
It only returns NULL.
Is there any way to select the first occurrence of 'firstname' in this context?
JSON_VALUE returns one SQL value from the JSON data (or SQL NULL if the key does not exist).
If you have a collection of values (a JSON array) and you want one specific item of the array, you use array subscripts (square brackets) as in JavaScript, for example [2] to select the third item; [0] selects the first item.
To get the first array item in your example, you have to change the path expression from '$.options.firstname' to '$.options[0].firstname'.
You can use this query:
SELECT JSON_VALUE('{
  "increment_id": "2500000043",
  "item_id": "845768",
  "options": [
    { "firstname": "Kevin" },
    { "firstname": "Okay" },
    { "lastname": "Test" }
  ]
}', '$.options[0].firstname') AS value
FROM DUAL;
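If you want every firstname rather than just the first, JSON_QUERY with an array wrapper returns them all; a sketch, assuming Oracle 12c or later:
SELECT JSON_QUERY(
  '{"increment_id":"2500000043","item_id":"845768","options":[{"firstname":"Kevin"},{"firstname":"Okay"},{"lastname":"Test"}]}',
  '$.options[*].firstname' WITH WRAPPER
) AS value
FROM DUAL;
-- ["Kevin","Okay"]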

Postgresql: Find values in JSON array by wildcard and comparison operators with index

I have a table with JSON array data I'd like to search.
CREATE TABLE data (id SERIAL, json JSON);
INSERT INTO data (id, json)
VALUES (1, '[{"name": "Value A", "value": 10}]');
INSERT INTO data (id, json)
VALUES (2, '[{"name": "Value B1", "value": 5}, {"name": "Value B2", "value": 15}]');
As described in this answer, I created a function, which also allows creating an index on the array data (important).
CREATE OR REPLACE FUNCTION json_val_arr(_j json, _key text)
RETURNS text[] AS
$$
SELECT array_agg(elem->>_key)
FROM json_array_elements(_j) AS x(elem)
$$
LANGUAGE sql IMMUTABLE;
This works nicely if I want to find an entire value (e.g. "Value B1"):
SELECT *
FROM data
WHERE '{"Value B1"}'::text[] <# (json_val_arr(json, 'name'));
Now my questions:
Is it possible to find values with a wildcard (e.g. "Value*")? Something like the following (naive) approach:
...
WHERE '{"Value%"}'::text[] <# (json_val_arr(json, 'name'));
Is it possible to find numeric values with comparison operators (e.g. >= 10)? Again, a naive and obviously wrong approach:
...
WHERE '{10}'::int[] >= (json_val_arr(json, 'value'));
I tried to create a new function returning int[] but that did not work.
I created a SQL Fiddle to illustrate my problem.
Or would it be better to use a different approach like the following working queries:
SELECT *
FROM data,
json_array_elements(json) jsondata
WHERE jsondata ->> 'name' LIKE 'Value%';
and
...
WHERE cast(jsondata ->> 'value' as integer) <= 10;
However, for these queries, I was not able to create any index that was actually picked up by the queries.
Also, I'd like to implement all this in PostgreSQL 9.4 with JSONB eventually, but I think for the above questions this should not be an issue.
Thank you very much!
I know it's been a while, but I was just chugging away on something similar (using wildcards to query JSON datatypes) and thought I'd share what I found.
Firstly, this was a huge point in the right direction:
http://schinckel.net/2014/05/25/querying-json-in-postgres/
The takeaway is that your method of exploding the JSON element into something else (a record set) is the way to go. It lets you query the JSON elements with normal Postgres operators.
In my case:
Table: test
ID | jsonb_column
 1 | {"name": "", "value": "reserved", "expires_in": 13732}
 2 | {"name": "poop", "value": "{\"ns\":[\"Whaaat.\"]}", "expires_in": 4554}
 3 | {"name": "dog", "value": "{\"ns\":[\"woof.\"]}", "expires_in": 4554}
Example query:
select * from test where jsonb_column ->> 'name' like '%oo%';
-- returns:
-- 2 | {"name": "poop", "value": "{\"ns\":[\"Whaaat.\"]}", "expires_in": 4554}
And to answer your question about jsonb: it looks like jsonb is the better route MOST of the time. It has more functions and faster reads (but slower writes).
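On the indexing point from the question: the LIKE and comparison forms above won't use a plain index, but exact containment matches can, via a GIN index on a jsonb expression; a sketch against the question's table:
create index data_json_gin on data using gin ((json::jsonb));
-- containment queries of this shape can use the index:
select *
from data
where json::jsonb @> '[{"name": "Value B1"}]';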
Sources:
http://www.postgresql.org/docs/9.4/static/functions-json.html
http://www.postgresql.org/docs/9.4/static/datatype-json.html
Happy hunting!