Is there a function in BigQuery to create a typed STRUCT from a JSON object? Here is an example of what I'm trying to do:
with tbl as (
SELECT
STRUCT<name STRING, age INT64>("Bob", 20) AS person_as_struct,
JSON '{"name": "Bob", "age": 20}' AS person_as_json
)
select
-- STRUCT -> JSON
TO_JSON(person_as_struct).age
-- JSON -> STRUCT
-- ???
from tbl
For your requirement, JSON_EXTRACT_SCALAR can be used, which extracts a scalar value and returns it as a STRING. You can try the sample query below.
Query:
WITH
jack AS (
SELECT
'''
[
{
"Source": "Internet Movie Database",
"Value": "7.8/10"
},
{
"Source": "Rotten Tomatoes",
"Value": "89%"
},
{
"Source": "Metacritic",
"Value": "75/100"
}
]
''' json )
SELECT
ARRAY(
SELECT
AS STRUCT JSON_EXTRACT_SCALAR(rec, '$.Source') AS SOURCE, JSON_EXTRACT_SCALAR(rec, '$.Value') AS Value
FROM
t.arr AS rec ) AS Reviews
FROM
jack,
UNNEST([STRUCT(JSON_EXTRACT_ARRAY(json) AS arr)]) t
Output:
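Applied back to the question's person_as_json value, a minimal sketch of the JSON -> STRUCT direction could rebuild the struct field by field (this assumes JSON_EXTRACT_SCALAR accepts the JSON-typed column, as in recent BigQuery, and reuses the question's tbl CTE):
-- Sketch only: rebuild the typed STRUCT from the JSON value field by field.
-- JSON_EXTRACT_SCALAR returns STRING, so age needs an explicit cast.
SELECT
  STRUCT(
    JSON_EXTRACT_SCALAR(person_as_json, '$.name') AS name,
    CAST(JSON_EXTRACT_SCALAR(person_as_json, '$.age') AS INT64) AS age
  ) AS person_from_json
FROM tbl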
My Postgres DB table BookingDetails has the columns id (bigint) and info (jsonb).
My JSON structure:
{
  "bookingType": "special",
  "travellers": [
    {
      "id": 1,
      "nationality": "SomeValue"
    }
  ],
  "someTag": "abc"
}
The travellers here is an array (and in this example, there’s only one element)
I want to:
Select records where nationality = 'CertainValue'
Update records set nationality = 'AnotherValue' where nationality = 'CertainValue'
I could retrieve the array using:
select CAST (info->'travellers' AS TEXT) from BookingDetails
[
{
"id": 1,
"nationality": "SomeValue",
}
]
I could retrieve the array first element using:
select CAST (info->'travellers'->>0 AS TEXT) from BookingDetails
{
"id": 1,
"nationality": "SomeValue"
}
But how do I do a select based on nationality = 'CertainValue'?
Try this:
SELECT jsonb_set(info, array['travellers', (a.id - 1) :: text, 'nationality'], to_jsonb('AnotherValue' :: text))
FROM BookingDetails
CROSS JOIN LATERAL jsonb_array_elements(info->'travellers') WITH ORDINALITY AS a(content, id)
WHERE a.content->>'nationality' = 'CertainValue'
see the test result in dbfiddle
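For the plain SELECT part of the question, a minimal sketch (assuming the BookingDetails(id, info) table as described) is to test each element of the travellers array:
-- Sketch: return bookings where any traveller has the given nationality.
SELECT b.id, b.info
FROM BookingDetails b
WHERE EXISTS (
  SELECT 1
  FROM jsonb_array_elements(b.info -> 'travellers') AS t(traveller)
  WHERE t.traveller ->> 'nationality' = 'CertainValue'
);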
This is my JSON data in one of the Oracle SQL columns, "jsoncol", in a table named "jsontable":
{
"Company": [
{
"Info": {
"Address": "123"
},
"Name": "ABC",
"Id": 999
},
{
"Info": {
"Address": "456"
},
"Name": "XYZ",
"Id": 888
}
]
}
I am looking for an UPDATE query to update the value of "Name" with a new value, based on a particular "Id" value.
Thanks in advance
From Oracle 19, you can use JSON_MERGEPATCH:
UPDATE jsontable j
SET jsoncol = JSON_MERGEPATCH(
jsoncol,
(
SELECT JSON_OBJECT(
KEY 'Company'
VALUE JSON_ARRAYAGG(
CASE id
WHEN 999
THEN JSON_MERGEPATCH(
json,
'{"Name":"DEF"}'
)
ELSE json
END
FORMAT JSON
RETURNING CLOB
)
FORMAT JSON
RETURNING CLOB
)
FROM jsontable jt
CROSS APPLY JSON_TABLE(
jt.jsoncol,
'$.Company[*]'
COLUMNS(
json VARCHAR2(4000) FORMAT JSON PATH '$',
id NUMBER PATH '$.Id'
)
)
WHERE jt.ROWID = j.ROWID
)
)
Which, for the sample data:
CREATE TABLE jsontable (
jsoncol CLOB CHECK (jsoncol IS JSON)
);
INSERT INTO jsontable (jsoncol)
VALUES ('{
"Company": [
{
"Info": {
"Address": "123"
},
"Name": "ABC",
"Id": 999
},
{
"Info": {
"Address": "456"
},
"Name": "XYZ",
"Id": 888
}
]
}');
Then after the UPDATE, the table contains:
JSONCOL
{"Company":[{"Info":{"Address":"123"},"Name":"DEF","Id":999},{"Info":{"Address":"456"},"Name":"XYZ","Id":888}]}
db<>fiddle here
You can use REPLACE() along with the JSON_TABLE() function in order to update the value of "Name" (from ABC to DEF) for a specific "Id" value (999), such as:
UPDATE jsontable jt0
SET jsoncol = ( SELECT REPLACE(jsoncol,jt.name,'DEF')
FROM jsontable j,
JSON_TABLE(jsoncol,
'$' COLUMNS(NESTED PATH '$.Company[*]'
COLUMNS(
name VARCHAR2 PATH '$.Name',
id INT PATH '$.Id'
)
)
) jt
WHERE jt.id = 999
AND j.id = jt0.id )
for DB versions prior to 19, provided that the identity values (id) of the table and of the JSON values are unique throughout the table and within each individual JSON value, respectively.
Demo
I have a table structure as below in a SQL Server database.
I want to populate the data from the database into something similar to the JSON below:
id: 1
aname: xyz
categories: bus
{
arnam: res
street: [s1,s2]
},
{
arnam: com
street: [c1,c2]
}
Can someone please guide me as to how I can do this in the database using a normal SQL query or procedure?
Your JSON is not valid, but according to your table I think you want to know how to parse data from nested JSON with arrays of values, using this structure:
WITH cte AS (
SELECT * FROM (VALUES
('{"id": 1, "aname": "xyz",
"categories": {
"bus": [
{"aname": "res",
"street": ["c1", "c2"]
},
{"aname": "res",
"street": ["s1", "s2"]
}]
}
}'),
('{"id": 2, "aname": "abc",
"categories": {
"bus": [
{"aname": "foo",
"street": ["c1", "c2"]
},
{"aname": "zoo",
"street": ["s1", "s2"]
}]
}
}')
) t1 ([json])
)SELECT
ROW_NUMBER() OVER(ORDER BY [id]) AS RN,
*
FROM cte AS e
CROSS APPLY OPENJSON(e.[json]) WITH (
[id] int '$.id',
[aname] VARCHAR(100) '$.aname',
[categories_jsn] NVARCHAR(MAX) '$.categories.bus' AS JSON
) AS jsn
CROSS APPLY OPENJSON([categories_jsn]) WITH (
[street_arr] NVARCHAR(MAX) '$.street' AS JSON,
[aname_lvl2] VARCHAR(20) '$.aname'
) AS jsn2
CROSS APPLY OPENJSON([street_arr]) WITH (
[street] VARCHAR(20) '$'
)
Output:
I have a jsonb field in a PostgreSQL DB table, and it has data like this:
{
"units": [
{
"id": 299872379221376,
"unitNumber": "1",
"unitFloorSpace": 1,
"createdTimeStamp": 1587994498586
},
{
"id": 299872417011074,
"unitNumber": "2",
"unitFloorSpace": 2,
"createdTimeStamp": 1588001330085
}
]
}
I just want to list all unitNumbers like below; what would be the query for that?
1,
2,
I have tried the JSON query below, but it doesn't list them:
Select form_data -> units -> unitNumbers from table where row_id =1;
Here is one way:
select jsonb_array_elements(jsonb_extract_path(jdata,'units')) ->> 'unitNumber' as UnitNumber
from tableName;
db<>fiddle here
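If you want the unit numbers rolled up into a single comma-separated list, as in the question's desired output, a minimal sketch along the same lines could aggregate them (your_table is a placeholder for the real table name; form_data and row_id are taken from the question):
-- Sketch: aggregate the unitNumber values into one comma-separated string per row.
SELECT string_agg(u ->> 'unitNumber', ', ') AS unit_numbers
FROM your_table,
     jsonb_array_elements(form_data -> 'units') AS u
WHERE row_id = 1;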
I'm using Presto and trying to extract all 'id' values where 'source' = 'dd' from a nested JSON structure like the following.
{
"results": [
{
"docs": [
{
"id": "apple1",
"source": "dd"
},
{
"id": "apple2",
"source": "aa"
},
{
"id": "apple3",
"source": "dd"
}
],
"group": 99806
}
]
}
I expect to extract the ids [apple1, apple3] into a column in Presto.
What is the right way to achieve this in a Presto query?
If your data has a regular structure as in the example you posted, you can use a combination of parsing the value as JSON, casting it to a structured SQL type (array/map/row), and then using array processing functions to filter, transform, and extract the elements you want:
WITH data(value) AS (VALUES '{
"results": [
{
"docs": [
{
"id": "apple1",
"source": "dd"
},
{
"id": "apple2",
"source": "aa"
},
{
"id": "apple3",
"source": "dd"
}
],
"group": 99806
}
]
}'),
parsed(value) AS (
SELECT cast(json_parse(value) AS row(results array(row(docs array(row(id varchar, source varchar)), "group" bigint))))
FROM data
)
SELECT
transform( -- extract the id from the resulting docs
filter( -- filter docs with source = 'dd'
flatten( -- flatten all docs arrays into a single doc array
transform(value.results, r -> r.docs) -- extract the docs arrays from the result array
),
doc -> doc.source = 'dd'),
doc -> doc.id)
FROM parsed
The query above produces:
_col0
------------------
[apple1, apple3]
(1 row)
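If you want one id per row rather than a single array value, a sketch reusing the same parsed CTE could unnest the flattened docs and filter in SQL:
-- Sketch: emit one id per row instead of an array (reuses the parsed CTE above).
SELECT id
FROM parsed
CROSS JOIN UNNEST(flatten(transform(value.results, r -> r.docs))) AS t(id, source)
WHERE source = 'dd'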