Oracle SQL : JSON_TABLE ; Get size of Array

I would like to know if there is a way to get the size of the array at a specific node in JSON data.
As an example, below is the JSON.
For the node 'Phone' there are two items. I would like to know the size of this node.
My objective is to parse the JSON array into a Java array.
If I do not know the size of the array, I will not be able to parse it.
Essentially I would like to get the value "2".
{"PONumber" : 1600,
"Reference" : "ABULL-20140421",
"Requestor" : "Alexis Bull",
"User" : "ABULL",
"CostCenter" : "A50",
"ShippingInstructions" : {"name" : "Alexis Bull",
"Address": {"street" : "200 Sporting Green",
"city" : "South San Francisco",
"state" : "CA",
"zipCode" : 99236,
"country" : "United States of America"},
"Phone" : [{"type" : "Office", "number" : "909-555-7307"},
{"type" : "Mobile", "number" : "415-555-1234"}]},
"Special Instructions" : null,
"AllowPartialShipment" : true,
"LineItems" : [{"ItemNumber" : 1,
"Part" : {"Description" : "One Magic Christmas",
"UnitPrice" : 19.95,
"UPCCode" : 13131092899},
"Quantity" : 9.0},
{"ItemNumber" : 2,
"Part" : {"Description" : "Lethal Weapon",
"UnitPrice" : 19.95,
"UPCCode" : 85391628927},
"Quantity" : 5.0}]}
As an alternative, using the code below I am able to get that data:
SELECT jt.phones
FROM j_purchaseorder,
     JSON_TABLE(po_document, '$.ShippingInstructions'
       COLUMNS
         (phones VARCHAR2(100) FORMAT JSON PATH '$.Phone')) AS jt;
However, if I use
SELECT jt.ShippingInstructions
FROM j_purchaseorder,
     JSON_TABLE(po_document, '$'
       COLUMNS
         (ShippingInstructions VARCHAR2(100) FORMAT JSON PATH '$.ShippingInstructions')) AS jt;
I get null.
So how can I get the entire ShippingInstructions object as a single value?

For the node 'Phone' there are two items. I would like to know the size of this node.
Essentially I would like to get the value "2".
From this question and this question, you can use:
SELECT jt.phones
FROM j_purchaseorder,
     JSON_TABLE(
       po_document,
       '$.ShippingInstructions'
       COLUMNS (
         phones VARCHAR2(100) FORMAT JSON WITH WRAPPER PATH '$.Phone.size()'
       )
     ) AS jt;
Which outputs:
PHONES
[2]
If you do not want the array wrapper then you can pass the return value through JSON_VALUE:
SELECT JSON_VALUE(jt.phones, '$[0]') AS phones
FROM j_purchaseorder,
     JSON_TABLE(
       po_document,
       '$.ShippingInstructions'
       COLUMNS (
         phones VARCHAR2(100) FORMAT JSON WITH WRAPPER PATH '$.Phone.size()'
       )
     ) AS jt;
Which outputs:
PHONES
2
If you are on Oracle 19c and your column is defined with an IS JSON check constraint, then you can simplify the query to:
SELECT j.po_document."ShippingInstructions"."Phone".size() phones
FROM j_purchaseorder j;
Which outputs:
PHONES
2
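For reference, a minimal sketch of a table setup that satisfies that IS JSON requirement (the constraint name and CLOB type are illustrative, and the inserted document is a cut-down version of the sample):
-- hypothetical setup: a JSON check constraint lets the dot-notation syntax work
CREATE TABLE j_purchaseorder (
  po_document CLOB
    CONSTRAINT po_document_is_json CHECK (po_document IS JSON)
);

INSERT INTO j_purchaseorder (po_document) VALUES (
  '{"ShippingInstructions":{"Phone":[{"type":"Office","number":"909-555-7307"},{"type":"Mobile","number":"415-555-1234"}]}}'
);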
If you are using an earlier Oracle version that does not support the size() function, then you can expand the array into rows and use COUNT:
SELECT COUNT(*) AS phones
FROM j_purchaseorder,
     JSON_TABLE(
       po_document,
       '$.ShippingInstructions.Phone[*]'
       COLUMNS (
         phones VARCHAR2(100) FORMAT JSON PATH '$'
       )
     ) AS jt;
Which outputs:
PHONES
2
db<>fiddle here
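As for the other part of the question (the second query returning null): a likely cause is that the serialized ShippingInstructions object does not fit in VARCHAR2(100), and JSON_TABLE columns default to NULL ON ERROR, so the overflow is silently returned as null. A sketch with a larger column (the 4000-byte size is illustrative) that should return the whole object as a single value:
SELECT jt.ShippingInstructions
FROM j_purchaseorder,
     JSON_TABLE(po_document, '$'
       COLUMNS (
         -- larger column so the serialized object is not truncated to null
         ShippingInstructions VARCHAR2(4000) FORMAT JSON PATH '$.ShippingInstructions'
       )) AS jt;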

Related

How do I Unnest varchar to json in Athena

I am crawling data from Google BigQuery and staging it in Athena.
One of the columns, crawled as a string, contains JSON:
{
  "key": "Category",
  "value": {
    "string_value": "something"
  }
}
I need to unnest these and flatten them to be able to use them in a query. I require the key and the string value (so in my query it will be where Category = something).
I have tried the following:
WITH dataset AS (
  SELECT cast(json_column as json) as json_column
  FROM "thedatabase"
  LIMIT 10
)
SELECT
  json_extract_scalar(json_column, '$.value.string_value') AS string_value
FROM dataset
which is returning null.
Casting json_column as json adds backslashes into it:
"[{\"key\":\"something\",\"value\":{\"string_value\":\"app\"}}
If I try to use replace on the JSON, it is not allowed because it is no longer a varchar object.
So how do I extract the values from the some_column field?
Presto's json_extract_scalar actually supports extracting directly from the varchar (string) value:
-- sample data
WITH dataset(json_column) AS (
  values ('{
    "key": "Category",
    "value": {
      "string_value": "something"
    }}')
)
-- query
SELECT
  json_extract_scalar(json_column, '$.value.string_value') AS string_value
FROM dataset;
Output:
string_value
something
Casting to json will encode the data as JSON (in the case of a string you will get a double-encoded one), not parse it. To parse, use json_parse (in this particular case it is not needed, but there are cases when you will want to use it):
-- query
SELECT
  json_extract_scalar(json_parse(json_column), '$.value.string_value') AS string_value
FROM dataset;
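Since the double-encoded sample in the question starts with [, the column apparently holds an array of such key/value objects; a sketch of how you might flatten that array and filter on the key (the inline sample data and aliases are illustrative):
-- parse the string, cast it to an array of JSON objects, and unnest one row per object
WITH dataset(json_column) AS (
  VALUES ('[{"key":"Category","value":{"string_value":"something"}},
            {"key":"Other","value":{"string_value":"app"}}]')
)
SELECT
  json_extract_scalar(item, '$.key')                AS item_key,
  json_extract_scalar(item, '$.value.string_value') AS string_value
FROM dataset
CROSS JOIN UNNEST(CAST(json_parse(json_column) AS ARRAY(JSON))) AS t(item)
WHERE json_extract_scalar(item, '$.key') = 'Category';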

Unable to save JSON to database with field named as order (NiFi)

I have a JSON file:
[ {
"Order" : "Nestle billboard 100%x250",
"Country" : "Russia",
"Order_ID" : 287259619,
"Country_ID" : 243,
"Order_lifetime_impressions" : "3385377",
"Total_unique_visitors" : "1090850",
"Total_reach_impressions" : "3385525",
"Average_impressions_unique_visitor" : 3.1,
"Date" : "2021-07-01"
}, {
"Order" : "Nestle_june_july 2021_ mob 300x250",
"Country" : "Russia",
"Order_ID" : 28734,
"Country_ID" : 263,
"Order_lifetime_impressions" : "1997022",
"Total_unique_visitors" : "1012116",
"Total_reach_impressions" : "1997036",
"Average_impressions_unique_visitor" : 1.97,
"Date" : "2021-07-01"
}]
And a table with the same column names. I'm using the PutDatabaseRecord processor with this configuration:
When I try to save this file, I get an error:
ERROR: syntax error (at or near: ",") Position: 110
I renamed the column in the table and in the JSON to order_name and the processor was able to save it.
But I still want to save it as order if possible.
I really don't understand why this happens. Yes, order is a keyword in SQL, but it's inside double quotes. Is it a bug? How can I fix it without renaming the columns?
If I keep Order as the column in the JSON but change the column name in the database, it works fine as well. But of course, I cannot save Order to this renamed column.
Order is a reserved word and you should absolutely avoid using it as a column name if you can. [1] [3]
If you absolutely can't, you need to set the Quote Column Identifiers property to True in the PutDatabaseRecord processor config. [2]
[1] https://www.postgresql.org/docs/current/sql-keywords-appendix.html
[2] https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.13.2/org.apache.nifi.processors.standard.PutDatabaseRecord/
[3] Postgres table column name restrictions?
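For illustration, this is roughly what the quoting changes on the PostgreSQL side (the table here is hypothetical): an unquoted Order is parsed as the reserved keyword, while "Order" is treated as an ordinary identifier.
-- hypothetical table with a column literally named "Order"
CREATE TABLE reach_stats ("Order" text, "Country" text);

-- fails with a syntax error: unquoted, Order is parsed as the reserved keyword
-- INSERT INTO reach_stats (Order, Country) VALUES ('Nestle billboard 100%x250', 'Russia');

-- works: quoting makes Order an ordinary (case-sensitive) identifier
INSERT INTO reach_stats ("Order", "Country") VALUES ('Nestle billboard 100%x250', 'Russia');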

Summing through list of jsons in PostgreSQL

I have a column with string entries with the following format (for example)
"[ { 'state' : 'CA', 'tax_amount' : 3},{ 'state' : 'AZ', 'tax_amount' : 4}]"
I want to sum through the tax_amounts in each entry to get a total tax amount for each row. How can I do this in PostgreSQL?
I would use a scalar sub-query:
select t.*,
       (select sum((item ->> 'tax_amount')::int)
        from jsonb_array_elements(t.the_column) as x(item)) as total_tax
from the_table t
Online example
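Note that the example value in the question uses single quotes, which is not valid JSON, so a direct cast of a text column holding it would fail. A rough sketch that normalizes the quotes first (fragile if the data itself contains quotes; table and column names follow the answer above):
select t.*,
       (select sum((item ->> 'tax_amount')::int)
        -- replace single quotes with double quotes before casting the text to jsonb
        from jsonb_array_elements(replace(t.the_column, '''', '"')::jsonb) as x(item)) as total_tax
from the_table t;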
You can use the JSONB_POPULATE_RECORDSET() function such as
WITH t(js) AS
(
  SELECT '[ { "state" : "CA", "tax_amount" : 3},
            { "state" : "AZ", "tax_amount" : 4}]'::JSONB
)
SELECT SUM(tax_amount) AS total_tax_amount
FROM t
CROSS JOIN JSONB_POPULATE_RECORDSET(NULL::record, js)
     AS tab(state VARCHAR(10), tax_amount INT)
total_tax_amount
----------------
7
after fixing the syntax of the JSON by replacing the single quotes with double quotes, and wrapping the whole object in single quotes instead of double quotes.
Demo

I can not extract data from a JSON type column in PostgreSQL

I want to extract data from a JSON-type column and insert it into a table in order to normalize the database.
The JSON type column is called "info" and an example of a record is the following:
[ { "major" : "International business",
"end" : "2007",
"name" : "Annamalai University",
"degree" : "Master Degree",
"start" : "2005", "desc" : ""
},
{ "major" : "Mechanical Engineering",
"end" : "1990",
"name" : "Bharathidasan University",
"degree" : "Bachelor Degree",
"start" : "1990", "desc" : ""
}
]
This is my code:
SELECT id,
(json_array_elements(info)->>'education')::json ->> 'key' AS key1
FROM perfiles
WHERE id = 1252710;
This is the result I want to obtain:
table result example
How should I do the query?
Thanks in advance
You may use cross join lateral with json_array_elements and list the fields in the select:
SELECT p.id,
       j->>'major'::text AS major,
       (j->>'end')::int AS "end",
       j->>'name' AS name,
       j->>'degree' AS degree,
       j->>'start' AS start,
       j->>'desc' AS "desc"
FROM perfiles p
CROSS JOIN LATERAL json_array_elements(info) AS j
Or use json_to_recordset by specifying the column list in the FROM clause:
select p.id, j.*
from perfiles p
cross join lateral json_to_recordset(info)
  as j(major text, "end" int, name text, degree text, start int, "desc" text);
Demo
Use json_to_recordset:
SELECT x.*
FROM pjson_table
, json_to_recordset(myjson::json) x
( major text
, "end" text
, name text
, degree text
, start text
,"desc" text
)
demo link
major                    end    name                      degree           start
International business   2007   Annamalai University      Master Degree    2005
Mechanical Engineering   1990   Bharathidasan University  Bachelor Degree  1990
Try something like this:
select ed.*
from perfiles,
     json_to_recordset(info)
       as ed(major text, "end" int, name text, degree text, start int, "desc" text)
where id = 1252710;
Ref: https://www.postgresql.org/docs/9.5/functions-json.html

Query JSON Attributes from JSON-formatted Column in Database

Here is the problem I am facing.
I have a table called GAMELOG (well, it could be an SQL table or a NoSQL column family) that looks like this:
ID INT,
REQUESTDATE DATE,
REQUESTMESSAGE VARCHAR,
RESPONSEDATE DATE,
RESPONSEMESSAGE VARCHAR
The REQUESTMESSAGE and RESPONSEMESSAGE columns are JSON-formatted.
Let's say, for example, a specific value in REQUESTMESSAGE is:
{
"name" : "John",
"specialty" : "Wizard",
"joinDate" : "17-Feb-1988"
}
and for RESPONSEMESSAGE it is:
{
"name" : "John Doe",
"specialty" : "Wizard",
"joinDate" : "17-Feb-1988",
"level" : 89,
"lastSkillLearned" : "Megindo"
}
Now, the data in my table has grown incredibly large (around a billion rows, a few terabytes of hard disk space).
What I want to do is query the rows whose REQUESTMESSAGE contains a JSON property "name" with the value "John".
From what I understand about SQL databases (well, Oracle, which I have used before), I would have to make REQUESTMESSAGE and RESPONSEMESSAGE CLOB columns and query them using LIKE, i.e.
SELECT * FROM GAMELOG WHERE REQUESTMESSAGE LIKE '%"name" : "John"%';
But the result is painfully slow.
Now I have moved to Cassandra, but I don't know how to query it properly; I haven't used Apache Hadoop yet to get the data, though I intend to use it for that later.
My question is: is there a database product that supports querying a JSON attribute inside a table/column family? As far as I know, MongoDB stores documents as JSON, but that means all of my column family would be stored as JSON, i.e.
{
"ID" : 1,
"REQUESTMESSAGE" : "{
"name" : "John",
"specialty" : "Wizard",
"joinDate" : "17-Feb-1988"
}",
"REQUESTDATE" : "17-Feb-1967"
"RESPONSEMESSAGE" : "{
"name" : "John Doe",
"specialty" : "Wizard",
"joinDate" : "17-Feb-1988",
"level" : 89,
"lastSkillLearned" : "Megindo"
}",
"RESPONSEDATE" : "17-Feb-1967"
}
and I would still have trouble getting the JSON attributes inside the REQUESTMESSAGE column (please correct me if I'm wrong).
Thank you very much
If you aren't committed to storing your data in Apache Cassandra, MySQL has SQL functions that can extract data from JSON values; in particular, you would want to look at the JSON_EXTRACT function: https://dev.mysql.com/doc/refman/5.7/en/json-search-functions.html
In your case, the query should look something like the following:
SELECT REQUESTMESSAGE, JSON_EXTRACT(REQUESTMESSAGE, "$.name")
FROM GAMELOG
WHERE JSON_EXTRACT(REQUESTMESSAGE, "$.name") = "John";
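Given the table size mentioned in the question (around a billion rows), running JSON_EXTRACT against every row will still scan the whole table. One possible sketch, assuming MySQL 5.7+ and that REQUESTMESSAGE always holds valid JSON (the REQUEST_NAME column and index name below are illustrative), is a stored generated column with an index so the name lookup can be indexed:
-- generated column materializes $.name so it can be indexed and filtered on directly
ALTER TABLE GAMELOG
  ADD COLUMN REQUEST_NAME VARCHAR(255)
    AS (JSON_UNQUOTE(JSON_EXTRACT(REQUESTMESSAGE, '$.name'))) STORED,
  ADD INDEX idx_gamelog_request_name (REQUEST_NAME);

SELECT * FROM GAMELOG WHERE REQUEST_NAME = 'John';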