JSON querying using SQL Server

I have JSON data stored in SQL Server:
{
  "group": {
    "operator": "AND",
    "rules": [
      {
        "condition": "=",
        "field": "F1",
        "table": "ATT",
        "data": "TEST",
        "readOnly": false,
        "hidden": false,
        "$$hashKey": "005"
      },
      {
        "condition": "=",
        "field": "CLASS",
        "table": "OBJ",
        "data": "A1",
        "readOnly": false,
        "hidden": false,
        "$$hashKey": "008"
      },
      {
        "group": {
          "operator": "AND",
          "rules": [
            {
              "condition": "=",
              "field": "F1",
              "table": "ATT",
              "data": "TEST2",
              "readOnly": false,
              "hidden": false,
              "$$hashKey": "00D"
            },
            {
              "condition": "=",
              "field": "F1",
              "table": "ATT",
              "data": "TEST3",
              "readOnly": false,
              "hidden": false,
              "$$hashKey": "00G"
            }
          ]
        },
        "table": "",
        "$$hashKey": "009"
      }
    ]
  }
}
How can I get the count of elements whose field value is F1 using SQL?

DECLARE @json NVARCHAR(MAX) = '{"group":{
"operator":"AND",
"rules":
[{"condition":"=",
"field":"F1",
"table":"ATT",
"data":"TEST",
"readOnly":false,
"hidden":false,
"$$hashKey":"005"},
{"condition":"=",
"field":"BANKID",
"table":"ATT",
"data":"A1",
"readOnly":false,
"hidden":false,
"$$hashKey":"008"}]}}';
SELECT COUNT(*)
FROM OPENJSON(@json, '$.group.rules')
WHERE JSON_VALUE(value, '$.field') = 'F1'
You have a table, you say? CROSS APPLY is your friend:
SELECT T.[data], rules.F1_count
FROM T CROSS APPLY (
SELECT COUNT(*) AS F1_count
FROM OPENJSON(jsonData, '$.group.rules')
WHERE JSON_VALUE(value, '$.field') = 'F1'
) AS rules
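Note that the path '$.group.rules' only scans the top-level rules array; in the question's document the nested group also contains F1 rules. A rough sketch that additionally descends one level of nesting (an assumption that matches the sample, not a general recursive solution; the readOnly/hidden/$$hashKey keys are omitted for brevity):

DECLARE @json NVARCHAR(MAX) = N'{"group":{"operator":"AND","rules":[
  {"condition":"=","field":"F1","table":"ATT","data":"TEST"},
  {"condition":"=","field":"CLASS","table":"OBJ","data":"A1"},
  {"group":{"operator":"AND","rules":[
     {"condition":"=","field":"F1","table":"ATT","data":"TEST2"},
     {"condition":"=","field":"F1","table":"ATT","data":"TEST3"}]},
   "table":""}]}}';

-- Count F1 at the top level plus one nested level of rules
SELECT COUNT(*) AS F1_count
FROM OPENJSON(@json, '$.group.rules') AS r
CROSS APPLY (
    SELECT JSON_VALUE(r.value, '$.field') AS field      -- the rule itself
    UNION ALL
    SELECT JSON_VALUE(n.value, '$.field')               -- rules one level down
    FROM OPENJSON(r.value, '$.group.rules') AS n
) AS f
WHERE f.field = 'F1';                                    -- returns 3 for this document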

Try this ->
Select Count(*)
From mytable
WHERE JSON_VALUE(Serialized, '$.field') = 'F1'

As a piece of advice, JSON data should not be stored as-is. You should create tables and columns that reflect the data you are storing. Maybe make a database called group, with a table called operator, which would have three columns: operator, rule, and rule value. It would look something like this:
Operator | Rule      | RuleValue
---------|-----------|----------
#1       | condition | =
#1       | field     | f1
#1       | table     | ATT
#2       | data      | test
You can change the table to your needs, but remember that JSON and XML are ways of storing data in files. So when storing data in a database, you generally shouldn't store it as XML or JSON.
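As an illustration only (table and column names are made up, not taken from the question), a normalized version of the rule data above might be sketched like this:

-- Hypothetical normalized schema for the rule data shown above
CREATE TABLE RuleGroup (
    GroupId       INT IDENTITY PRIMARY KEY,
    Operator      NVARCHAR(10) NOT NULL,                    -- 'AND' / 'OR'
    ParentGroupId INT NULL REFERENCES RuleGroup(GroupId)    -- supports nested groups
);

CREATE TABLE QueryRule (
    RuleId    INT IDENTITY PRIMARY KEY,
    GroupId   INT NOT NULL REFERENCES RuleGroup(GroupId),
    Condition NVARCHAR(10) NOT NULL,                        -- e.g. '='
    Field     NVARCHAR(50) NOT NULL,                        -- e.g. 'F1'
    TableName NVARCHAR(50) NOT NULL,                        -- e.g. 'ATT'
    Data      NVARCHAR(200) NULL                            -- e.g. 'TEST'
);

-- Counting F1 rules then becomes a plain relational query:
-- SELECT COUNT(*) FROM QueryRule WHERE Field = 'F1';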

Related

SQL query to run over object of an object with array of data

some_data column (abridged):
{
  "cards": [
    {
      "story-elements": [
        {
          "metadata": {
            "video-url": "somesite.com/video/x8cse6q?pubtool",
            "video-id": "x8cse6q?pubtool"
          },
          "subtype": "abc-video"
        }
      ]
    }
  ]
}
How can I filter rows on the some_data column where cards contains a video-url with query params and the subtype is abc-video?
select *
from table_name
WHERE ("some_data" #> '{"video-id"}')::varchar like '%?pubtool%';

SQL select from array in JSON

I have a table with a json field with the following json:
[
  {
    productId: '1',
    other : [
      otherId: '2'
    ]
  },
  {
    productId: '3',
    other : [
      otherId: '4'
    ]
  }
]
I am trying to select the productId and otherId for every array element like this:
select JSON_EXTRACT(items, $.items[].productId) from order;
But this is completely wrong since it takes only the first element in the array
Do I need to write a loop or something?
First of all, the data you show is not valid JSON. It has multiple mistakes that make it invalid.
Here's a demo using valid JSON:
mysql> create table orders ( items json );
mysql> insert into orders set items = '[ { "productId": "1", "other": { "otherId": "2" } }, { "productId": "3", "other" : { "otherId": "4" } } ]'
mysql> SELECT JSON_EXTRACT(items, '$[*].productId') AS productIds FROM orders;
+------------+
| productIds |
+------------+
| ["1", "3"] |
+------------+
If you want each productId on a row by itself as a scalar value instead of a JSON array, you'd have to use JSON_TABLE() in MySQL 8.0:
mysql> SELECT j.* FROM orders CROSS JOIN JSON_TABLE(items, '$[*]' COLUMNS(productId INT PATH '$.productId')) AS j;
+-----------+
| productId |
+-----------+
| 1 |
| 3 |
+-----------+
This is tested in MySQL 8.0.23.
You also tagged your question MariaDB. I don't use MariaDB, and MariaDB has its own incompatible implementation of JSON support, so I can't predict how it will work.
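The question also asked for otherId. Assuming the corrected structure above, where other is an object holding otherId, the JSON_TABLE call can be extended with a second column (a sketch along the same lines as the tested query above):

SELECT j.*
FROM orders
CROSS JOIN JSON_TABLE(
    items, '$[*]'
    COLUMNS (
        productId INT PATH '$.productId',
        otherId   INT PATH '$.other.otherId'
    )
) AS j;
-- Expected output for the sample row:
-- +-----------+---------+
-- | productId | otherId |
-- +-----------+---------+
-- |         1 |       2 |
-- |         3 |       4 |
-- +-----------+---------+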

Postgresql update JSONB object to array

I don't know why, but PHP apparently persisted some of my data as objects and some as arrays. My table looks something like this:
seller_info_address table:
ID (INT) | address (JSONB)
---------|--------------------------------------------------
1        | {"addressLines":{"0":"Technology Park",...},...}
2        | {"addressLines":["Technology Park",...],...}
Some addressLines are objects:
{
  "addressLines": {
    "0": "Technology Park",
    "1": "Blanchard Road",
    "3": "Dublin",
    "4": "2"
  },
  "companyName": "...",
  "emailAddress": [],
  "...": "..."
}
Some addressLines are arrays:
{
  "addressLines": [
    "Technology Park",
    "Blanchard Road",
    "Dublin",
    "2"
  ],
  "companyName": "...",
  "emailAddress": [],
  "...": "..."
}
I would like to make the data consistent with a SQL query, but I'm not sure how to do it. All addressLines persisted as objects should be converted to the array form.
I am grateful for help, thanks!
You can convert the objects to an array using this:
select id,
       (select jsonb_agg(e.val order by e.key::int)
        from jsonb_each(sia.address -> 'addressLines') as e(key, val))
from seller_info_address sia
where jsonb_typeof(address -> 'addressLines') = 'object'
The WHERE condition makes sure we only do this for addressLines values that are not already an array.
The aggregation used can also be used inside an UPDATE statement:
update seller_info_address
set address = jsonb_set(address, '{addressLines}',
        (select jsonb_agg(e.val order by e.key::int)
         from jsonb_each(address -> 'addressLines') as e(key, val))
    )
where jsonb_typeof(address -> 'addressLines') = 'object';
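A quick sanity check after running the UPDATE is simply to repeat the type test; it should return no rows once everything has been converted:

-- Should return 0 rows after the update
select id
from seller_info_address
where jsonb_typeof(address -> 'addressLines') = 'object';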
Ok, I have now found a solution myself. Definitely not the most eloquent solution. I'm sure there's a nicer and more efficient one, but it works...
DROP FUNCTION update_address_object_to_array(id INTEGER);

CREATE OR REPLACE FUNCTION
    update_address_object_to_array(id INTEGER)
    RETURNS VOID AS
$UPDATE_ADDRESS_OBJECT_TO_ARRAY$
BEGIN
    UPDATE seller_info_address
    SET address = jsonb_set(address, '{addressLines}', (
        SELECT CASE
                   WHEN jsonb_agg(addressLines) IS NOT NULL THEN jsonb_agg(addressLines)
                   ELSE '[]'
               END
        FROM seller_info_address sia,
             jsonb_each(address #> '{addressLines}') as t(key, addressLines)
        WHERE jsonb_typeof(sia.address -> 'addressLines') = 'object'
          AND seller_info_id = update_address_object_to_array.id
    ), true)
    WHERE seller_info_id = update_address_object_to_array.id
      AND jsonb_typeof(address -> 'addressLines') = 'object';
END
$UPDATE_ADDRESS_OBJECT_TO_ARRAY$
LANGUAGE 'plpgsql';
SELECT update_address_object_to_array(sia.seller_info_id)
FROM seller_info_address sia
WHERE jsonb_typeof(address -> 'addressLines') = 'object';
The inner SELECT fetches all entries of the addressLines object using jsonb_each and then aggregates them into an array using jsonb_agg. The conditional expression is there to prevent null cases.
The result is then written back in the UPDATE via jsonb_set at the required position in the JSON. The WHERE clauses should be self-explanatory.

How to pretty format JSON in Oracle?

I wanted to know if there is any way to format JSON in Oracle (as this web site example does).
In XML I used:
SELECT XMLSERIALIZE(Document XMLTYPE(V_RESPONSE) AS CLOB INDENT SIZE = 2)
INTO V_RESPONSE
FROM DUAL;
And it works very well.
With Oracle 12c, you can use the JSON_QUERY() function with the RETURNING ... PRETTY clause:
PRETTY: Specify PRETTY to pretty-print the return character string by inserting newline characters and indenting
Expression:
JSON_QUERY(js_value, '$' RETURNING VARCHAR2(4000) PRETTY)
Demo on DB Fiddle:
with t as (select '{"a":1, "b": [{"b1":2}, {"b2": "z"}]}' js from dual)
select json_query(js, '$' returning varchar2(4000) pretty) pretty_js, js from t;
Yields:
PRETTY_JS                 | JS
--------------------------|----------------------------------------
{                         | {"a":1, "b": [{"b1":2}, {"b2": "z"}]}
  "a" : 1,                |
  "b" :                   |
  [                       |
    {                     |
      "b1" : 2            |
    },                    |
    {                     |
      "b2" : "z"          |
    }                     |
  ]                       |
}                         |
When you're lucky enough to get to Oracle Database 19c, there's another option for pretty printing: JSON_serialize.
This allows you to convert JSON between VARCHAR2/CLOB/BLOB. And includes a PRETTY clause:
with t as (
select '{"a":1, "b": [{"b1":2}, {"b2": "z"}]}' js
from dual
)
select json_serialize (
js returning varchar2 pretty
) pretty_js,
js
from t;
PRETTY_JS                     JS
{                             {"a":1, "b": [{"b1":2}, {"b2": "z"}]}
  "a" : 1,
  "b" :
  [
    {
      "b1" : 2
    },
    {
      "b2" : "z"
    }
  ]
}

Postgres WHERE array contains empty string

I'm trying to do select * from demo where demojson->'sub'->'item' = array("") but this doesn't work. I'd like to find the following
All rows where .sub.item in the JSON column is an array containing exactly one empty string ([""])
All rows where .sub.item in the JSON column is an array that may contain more than one item, but at least one of the items is an empty string. (["not empty", "also not empty", ""])
The demojson column could contain, for example:
{
  "key": "value",
  "sub": {
    "item": [""]
  }
}
Have you tried
SELECT * from demo
WHERE demojson->'sub'->>'item' = '[""]';
Here the ->> operator gets a JSON object field as text.
And another solution:
SELECT * from demo
WHERE json_array_length(demojson->'sub'->'item') = 1 AND
demojson->'sub'->'item'->>0 = '';
Here the ->> operator gets the first element of the JSON array as text.
Since JSONLint doesn't validate the supplied example text, I've used the following:
CREATE TABLE info (id int, j JSON);
insert into info values
(1, '{"key":"k1", "sub": {"item":["i1","i2"]}}'),
(2, '{"key":"k2", "sub": {"item":[""]}}'),
(3, '{"key":"k3", "sub": {"item":["i2","i3"]}}');
Using the where clause in this way, it works:
select * from info
where j->'sub'->>'item' = '[""]';
+----+------------------------------------+
| id | j |
+----+------------------------------------+
| 2 | {"key":"k2", "sub": {"item":[""]}} |
+----+------------------------------------+
You can check it here: http://rextester.com/VEPY57423
Try the following:
SELECT * FROM demo
WHERE demojson->'sub'->'item' = to_jsonb(ARRAY['']);
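The answers above only cover the first case (the array is exactly [""]). For the second case, where an empty string merely has to appear somewhere in the array, the jsonb containment operator @> can be used; a sketch assuming demojson is (or can be cast to) jsonb:

-- Case 1: the array is exactly [""]
SELECT * FROM demo
WHERE demojson::jsonb -> 'sub' -> 'item' = '[""]'::jsonb;

-- Case 2: the array contains at least one empty string, possibly among other items
SELECT * FROM demo
WHERE demojson::jsonb -> 'sub' -> 'item' @> '[""]'::jsonb;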