How to ignore a few properties in a SQL Server JSON value

I have a table with a column containing JSON data. Every JSON object in that column has many properties. I have another table that lists some (not all) of those property names.
What I need is a query that selects the JSON data from the first table, but with the properties listed in the second table removed. For example, below is the JSON table.
Id Person
1 {FirstName:"test", LastName: "test", Email: "test@test.abc"}
2 {FirstName:"syx", LastName: "ave", Email: "yyeyd@test.abc"}
Second table with the property names:
ExclusionId ExcludedProperty
1 Email
The query should join these two tables, and the output should be:
Id Person
1 {FirstName:"test", LastName: "test"}
2 {FirstName:"syx", LastName: "ave"}

With corrected JSON, you can use JSON_MODIFY() and set the desired value to NULL (in the default lax mode, setting a property to NULL removes it).
Example
Declare @YourTable Table ([Id] int, [Person] varchar(500))
Insert Into @YourTable Values
 (1, '{"FirstName":"test", "LastName": "test", "Email": "test@test.abc"}')
,(2, '{"FirstName":"syx", "LastName": "ave", "Email": "yyeyd@test.abc"}')

Select A.Id
      ,NewValue = JSON_MODIFY([Person], '$.Email', null)
From @YourTable A
Returns
ID NewValue
1 {"FirstName":"test", "LastName": "test"}
2 {"FirstName":"syx", "LastName": "ave"}
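If the excluded property names must come from the second table rather than being hardcoded, one option is dynamic SQL that nests one JSON_MODIFY() call per excluded property. This is only a sketch: it assumes permanent tables named YourTable and ExclusionTable with the column names from the question (dynamic SQL cannot see the table variables above).

```sql
-- Sketch: build nested JSON_MODIFY() calls from the exclusion table.
-- Assumes permanent tables YourTable(Id, Person) and
-- ExclusionTable(ExclusionId, ExcludedProperty) -- hypothetical names.
DECLARE @expr nvarchar(max) = N'[Person]';

-- Wrap the expression once per excluded property, producing e.g.
-- JSON_MODIFY(JSON_MODIFY([Person], '$.Email', NULL), '$.Other', NULL)
SELECT @expr = CONCAT(N'JSON_MODIFY(', @expr, N', ''$.', ExcludedProperty, N''', NULL)')
FROM ExclusionTable;

DECLARE @sql nvarchar(max) =
    CONCAT(N'SELECT Id, ', @expr, N' AS Person FROM YourTable;');

EXEC sys.sp_executesql @sql;
```

This keeps the exclusion list data-driven at the cost of a dynamic statement; property names are concatenated into the SQL text, so they should come from a trusted table.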

This is a fully generic approach with no need for dynamic SQL:
--a mockup to simulate your issue
--credits to John Cappelletti for the mockup-code
DECLARE @YourTable TABLE (Id INT, Person NVARCHAR(500))
INSERT INTO @YourTable VALUES
 (1, '{"FirstName":"test", "LastName": "test", "Email": "test@test.abc"}')
,(2, '{"FirstName":"syx", "LastName": "ave", "Email": "yyeyd@test.abc"}');

DECLARE @ExcludeProperties TABLE (Id INT, PropName VARCHAR(500))
INSERT INTO @ExcludeProperties VALUES (1, 'Email');
--The query
WITH cte AS
(
    SELECT t.Id
          ,JsonValues.[key]   AS PropName
          ,JsonValues.[value] AS PropValue
    FROM @YourTable t
    CROSS APPLY OPENJSON(t.Person) JsonValues
    WHERE JsonValues.[key] COLLATE Latin1_General_CI_AS
          NOT IN (SELECT excl.PropName COLLATE Latin1_General_CI_AS FROM @ExcludeProperties excl)
)
SELECT cte.Id
      ,CONCAT('{',
              STUFF((
                    SELECT CONCAT(',"', cte2.PropName, '":"', cte2.PropValue, '"')
                    FROM cte cte2
                    WHERE cte2.Id = cte.Id
                    FOR XML PATH('')
                    ), 1, 1, '')
             ,'}')
FROM cte
GROUP BY cte.Id;
The idea in short:
First, a CTE creates a list of all properties within your JSON as EAV rows (Entity-Attribute-Value). From this it's easy to get rid of all attributes whose name is found in your exclude table.
As we are not allowed to use a column's value as the output column's alias, we build the JSON with string methods instead of a FOR JSON query.
A GROUP BY reduces the final output to one row per Id, and a correlated sub-query builds the JSON per Id from the corresponding EAV rows.
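On SQL Server 2017 and later, the STUFF()/FOR XML PATH aggregation can be written more directly with STRING_AGG(); a sketch against the same cte:

```sql
-- Sketch (SQL Server 2017+): same EAV cte as above, aggregated per Id
SELECT cte.Id
      ,CONCAT('{',
              STRING_AGG(CONCAT('"', cte.PropName, '":"', cte.PropValue, '"'), ','),
              '}') AS Person
FROM cte
GROUP BY cte.Id;
```

Like the original query, this quotes every value as a string, so non-string JSON values (numbers, booleans) come out quoted.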

How to query a table with JSON column with key-value pairs to match all keys and values

Say I have an Image table with a meta column in JSON:
Id Meta
1  { "size": 80, "effect": "blur" }
2  { "size": 200, "optimize": true }
3  { "color": "#abcdef", "ext": ".jpg" }
And I have a dynamic param of table type like so
Key      Value
size     200
optimize true
How should I write my query to filter the rows in which the Meta column's key-value pairs matched all the values in the param table?
SELECT Id
FROM Image
WHERE (
--?? all keys and values matched the param table
)
This is a type of relational division (with remainder) question, with the extra twist of shredding JSON at the same time.
There are a number of solutions to this type of question. One common solution is to LEFT JOIN the divisor to the dividend, group it and check for any non-matches:
DECLARE @tmp TABLE (
    "Key" NVARCHAR(8) COLLATE Latin1_General_BIN2,
    "Value" NVARCHAR(4) COLLATE Latin1_General_BIN2
);
INSERT INTO @tmp
    ("Key", "Value")
VALUES
    ('size', '200'),
    ('optimize', 'true');

SELECT *
FROM Image i
WHERE EXISTS (SELECT 1
              FROM @tmp t
              LEFT JOIN OPENJSON(i.Meta) j ON t.[Key] = j.[key] AND t.Value = j.value
              HAVING COUNT(j.value) = COUNT(*) -- all match
             );
Another solution is to use a double NOT EXISTS: there are no key/value input pairs which do not have a match
DECLARE @tmp TABLE (
    "Key" NVARCHAR(8) COLLATE Latin1_General_BIN2,
    "Value" NVARCHAR(4) COLLATE Latin1_General_BIN2
);
INSERT INTO @tmp
    ("Key", "Value")
VALUES
    ('size', '200'),
    ('optimize', 'true');

SELECT *
FROM Image i
WHERE NOT EXISTS (SELECT 1
                  FROM @tmp t
                  WHERE NOT EXISTS (SELECT 1
                                    FROM OPENJSON(i.Meta) j
                                    WHERE t.[Key] = j.[key] AND t.Value = j.value
                                   )
                 );
db<>fiddle
YMMV as to which solution is faster.
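A third common variant of relational division compares counts directly: a row qualifies when the number of matching key/value pairs equals the number of rows in the parameter table. A sketch, assuming the same Image table and @tmp parameter table as above:

```sql
-- Sketch: count-based relational division over the shredded JSON
SELECT i.Id
FROM Image i
WHERE (SELECT COUNT(*)
       FROM @tmp t
       JOIN OPENJSON(i.Meta) j
         ON t.[Key] = j.[key] AND t.[Value] = j.[value]
      ) = (SELECT COUNT(*) FROM @tmp);
```

This reads naturally ("all parameters matched") but scans the parameter table twice per candidate row.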

Include null value for parent attributes in json Response if all child values are null

I am trying to generate a JSON response from a query, and the output looks like below.
{
"id": "1",
"PassStatus": "FAILED",
"Details": {
"Id": null,
"Name": null,
"Type": null,
"Location": null
}
}
I want the json response like below. If all my child attributes are null then I need to set the value as null for the parent attribute.
{
"id": "1",
"PassStatus": "FAILED",
"Details": null
}
I am using the query below. How can I change it to achieve the desired result? Any suggestions or inputs are really appreciated.
select *
from Table
for json auto,INCLUDE_NULL_VALUES,WITHOUT_ARRAY_WRAPPER
As mentioned, the SQL you have won't generate the above JSON. But making a couple of guesses, you could use a subquery:
CREATE TABLE dbo.SomeTable (ID int,
                            PassStatus varchar(10));
CREATE TABLE dbo.OtherTable (ID int,
                             fID int,
                             [Name] varchar(10),
                             [Type] int,
                             Location varchar(10));
GO
INSERT INTO dbo.SomeTable
VALUES (1, 'FAILED'),
       (2, 'PASSED');
INSERT INTO dbo.OtherTable (ID, fID, Name)
VALUES (1, 2, 'Jane');
GO
SELECT ST.ID,
       ST.PassStatus,
       (SELECT OT.ID,
               OT.Name,
               OT.[Type],
               OT.Location
        FROM dbo.OtherTable OT
        WHERE OT.fID = ST.ID
        FOR JSON PATH, INCLUDE_NULL_VALUES) AS Details
FROM dbo.SomeTable ST
WHERE ID = 1
FOR JSON AUTO, INCLUDE_NULL_VALUES, WITHOUT_ARRAY_WRAPPER;
GO
SELECT ST.ID,
       ST.PassStatus,
       (SELECT OT.ID,
               OT.Name,
               OT.[Type],
               OT.Location
        FROM dbo.OtherTable OT
        WHERE OT.fID = ST.ID
        FOR JSON PATH, INCLUDE_NULL_VALUES) AS Details
FROM dbo.SomeTable ST
WHERE ID = 2
FOR JSON PATH, INCLUDE_NULL_VALUES, WITHOUT_ARRAY_WRAPPER;
GO
DROP TABLE dbo.SomeTable;
DROP TABLE dbo.OtherTable;
db<>fiddle
Note the subqueries require the array wrapper, as otherwise the double quotes will be escaped.
Given that Details is not an array, you don't actually need to double-nest FOR JSON; you can just specify the path for each property.
Based on @Larnu's schema:
SELECT ST.ID id,
ST.PassStatus,
OT.ID [Details.Id],
OT.Name [Details.Name],
OT.[Type] [Details.Type],
OT.Location [Details.Location]
FROM dbo.SomeTable ST
JOIN dbo.OtherTable OT ON OT.fID = ST.ID
WHERE ST.ID = 2
FOR JSON PATH,INCLUDE_NULL_VALUES,WITHOUT_ARRAY_WRAPPER;
db<>fiddle

SQL Server FOR JSON PATH, Exclude specific field if null

I have table with data like this
Id | Name | Phone | OtherField
----+---------+--------+-----------
1 | ABC | 12344 | NULL
2 | XYZ | NULL | NULL
I want a SQL query to transform it like this
[
   {
      "ID": 1,
      "Name": "ABC",
      "Phone": [
         {"Home": "12344"}
      ],
      "OtherFields": null
   },
   {
      "ID": 2,
      "Name": "XYZ",
      "OtherFields": null
   }
]
I know about INCLUDE_NULL_VALUES, but it includes all the empty fields.
I want to include nulls for all other fields except Phone.
I have edited my answer as you have changed your original request.
I don't believe you can have it both ways, keeping some NULLs and not others. The best way I can think of at the moment is to use ISNULL on columns you must keep.
For example:
DECLARE @Table TABLE ( Id INT, Name VARCHAR(10), Phone VARCHAR(10), OtherField VARCHAR(10) );
INSERT INTO @Table ( Id, Name, Phone ) VALUES
    ( 1, 'ABC', '12344' ), ( 2, 'XYZ', NULL );

SELECT
    Id, Name,
    JSON_QUERY ( CASE
        WHEN t.Phone IS NOT NULL THEN x.Phone
        ELSE NULL
    END ) AS Phone,
    ISNULL( OtherField, '' ) AS OtherFields
FROM @Table t
CROSS APPLY (
    SELECT ( SELECT Phone AS Home FOR JSON PATH ) AS Phone
) x
FOR JSON PATH;
Returns
[{
"Id": 1,
"Name": "ABC",
"Phone": [{
"Home": "12344"
}],
"OtherFields": ""
}, {
"Id": 2,
"Name": "XYZ",
"OtherFields": ""
}]
Update:
The original question was edited, and I don't think you can generate the expected output using a single FOR JSON with INCLUDE_NULL_VALUES, because the table now has more than one column with NULL values (OtherField in the example).
As a possible solution you may try a mixed approach (using FOR JSON and STRING_AGG()) to build the final JSON output and keep the NULL values for all columns, except Phones:
CREATE TABLE Data (
Id int,
Name varchar(100),
Phone varchar(100),
OtherField varchar(1)
);
INSERT INTO Data (Id, Name, Phone, OtherField)
VALUES
(1, 'ABC', '12344', NULL),
(2, 'ABC', NULL, NULL),
(3, 'ABC', NULL, NULL)
Statement:
SELECT CONCAT(
   '[',
   (
      SELECT STRING_AGG(j.Json, ',')
      FROM Data d
      CROSS APPLY (
         SELECT CASE
            WHEN Phone IS NOT NULL THEN (
               SELECT Id, Name, (SELECT Phone AS Home FOR JSON PATH) AS Phone, OtherField
               FOR JSON PATH, INCLUDE_NULL_VALUES, WITHOUT_ARRAY_WRAPPER
            )
            ELSE (
               SELECT Id, Name, OtherField
               FOR JSON PATH, INCLUDE_NULL_VALUES, WITHOUT_ARRAY_WRAPPER
            )
         END
      ) j (Json)
   ),
   ']'
)
Result:
[
{"Id":1,"Name":"ABC","Phone":[{"Home":"12344"}],"OtherField":null},
{"Id":2,"Name":"ABC","OtherField":null},
{"Id":3,"Name":"ABC","OtherField":null}
]
Original answer:
You may try the following statement:
Table:
CREATE TABLE Data (
Id int,
Name varchar(100),
Phone varchar(100)
);
INSERT INTO Data (Id, Name, Phone)
VALUES
(1, 'ABC', '12344'),
(2, 'ABC', NULL )
Statement:
SELECT
Id,
Name,
JSON_QUERY(CASE WHEN Phone IS NOT NULL THEN (SELECT Phone AS Home FOR JSON PATH) END) AS Phone
FROM Data
FOR JSON PATH
Result:
[
{"Id":1,"Name":"ABC","Phone":[{"Home":"12344"}]},
{"Id":2,"Name":"ABC"}
]

How to INSERT getDate() into a column in a table which also has columns that have data inserted with OPENJSON and CROSS APPLY

I have the following JSON that needs to be inserted into a table:
{
   "Parent": {
      "level": "Ground"
   },
   "Items": [
      { "name": "name1",
        "id": "id1"
      },
      { "name": "name2",
        "id": "id2"
      }
   ]
}
This is the query I wrote for inserting into the table:
Insert into table (name, id, level, dateInserted)
Select name, id, level, dateInserted
FROM OPENJSON(@json, N'$.Items')
WITH ( id nvarchar(max) N'$.id',
       name nvarchar(max) N'$.name')
CROSS APPLY OPENJSON(@json, N'$.Parent')
WITH ( level nvarchar(max) N'$.level')
**CROSS APPLY dateInserted dateTime getDate()**
I am having issue inserting getDate() into the dateInserted column. As dateInserted is something that is not being read from the JSON itself, I am having trouble inserting it into the table. What is the right way to do it?
It was a silly mistake; it has nothing to do with OPENJSON or CROSS APPLY.
For anyone looking for an answer:
Insert into table (name, id, level, dateInserted)
Select name, id, level, getDate()
FROM OPENJSON(@json, N'$.Items')
WITH ( id nvarchar(max) N'$.id',
       name nvarchar(max) N'$.name')
CROSS APPLY OPENJSON(@json, N'$.Parent')
WITH ( level nvarchar(max) N'$.level')
The getDate() has to be inside the SELECT clause.

Output json in dictionary (string-indexed list) notation from SQL Server

I have this result set in SQL server:
ID CUSTOMER PRODUCT DATE COUNT
A1 Walmart Widget 1/1/2020 5
B2 Amazon Thingy 1/2/2020 10
C3 Target Gadget 2/1/2020 7
I want to output it as json, which SQL server 2016+ has plenty ability to do. But I want a traditional string-indexed list ('dictionary') indexed by the id, like so:
Goal
{
"A1": {"Customer":"Walmart", "Product":"Widget", "Date":"1/1/2020", "Count":5 },
"B2": {"Customer":"Amazon", "Product":"Thingy", "Date":"1/2/2020", "Count":10},
"C3": {"Customer":"Target", "Product":"Gadget", "Date":"2/1/2020", "Count":7 }
}
However, a typical select * from table for json path outputs an unindexed array of objects:
Current State
[
{"Id":"A1", "Customer":"Walmart", "Product":"Widget", "Date":"1/1/2020", "Count":5 },
{"Id":"B2", "Customer":"Amazon", "Product":"Thingy", "Date":"1/2/2020", "Count":10},
{"Id":"C3", "Customer":"Target", "Product":"Gadget", "Date":"2/1/2020", "Count":7 }
]
The other for json modifiers, such as root, seem superficially relevant, but as far as I can tell they just do glorified string concatenation, capturing the entire object in an outer root node.
How can the above notation be done using native (performant) SQL server json functions?
I don't think that you can generate JSON output with variable key names using FOR JSON AUTO or FOR JSON PATH, but if you can upgrade to SQL Server 2017, the following approach, that uses only JSON built-in support, is a possible option:
Table:
CREATE TABLE Data (
Id varchar(2),
Customer varchar(50),
Product varchar(50),
[Date] date,
[Count] int
)
INSERT INTO Data
(Id, Customer, Product, [Date], [Count])
VALUES
('A1', 'Walmart', 'Widget', '20200101', 5),
('B2', 'Amazon', 'Thingy', '20200102', 10),
('C3', 'Target', 'Gadget', '20200201', 7)
Statement:
DECLARE @json nvarchar(max) = N'{}'

SELECT @json = JSON_MODIFY(
   @json,
   CONCAT(N'$."', ID, N'"'),
   JSON_QUERY((SELECT Customer, Product, [Date], [Count] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER))
)
FROM Data

SELECT @json
Result:
{"A1":{"Customer":"Walmart","Product":"Widget","Date":"2020-01-01","Count":5},"B2":{"Customer":"Amazon","Product":"Thingy","Date":"2020-01-02","Count":10},"C3":{"Customer":"Target","Product":"Gadget","Date":"2020-02-01","Count":7}}
Notes:
Using a variable or expression instead of a literal value for the path parameter of JSON_MODIFY() is available in SQL Server 2017+. JSON_QUERY() is used to prevent escaping of the special characters.
The question is tagged sql2016, so string_agg() won't work there ... (aggregate with FOR XML PATH or a custom aggregate instead); on SQL Server 2017+ the following works:
declare @t table
(
    Id varchar(10),
    CUSTOMER varchar(50),
    PRODUCT varchar(50),
    [DATE] date,
    [COUNT] int
);
insert into @t (Id, CUSTOMER, PRODUCT, [DATE], [COUNT])
values
    ('A1', 'Walmart', 'Widget', '20200101', 5),
    ('B2', 'Amazon', 'Thingy', '20200201', 10),
    ('C3', 'Target', 'Gadget', '20200102', 7);

select concat('{', STRING_AGG(thejson, ','), '}')
from
(
    select concat('"', STRING_ESCAPE(Id, 'json'), '":',
                  (select CUSTOMER, PRODUCT, [DATE], [COUNT] for json path, without_array_wrapper)) as thejson
    from @t
) as src;
Unfortunately, you want a JSON result that has multiple values -- A1, B2, and C3 -- derived from the data. This means that you need to aggregate the data into one row. Normally, for json path would want to create an array of values, one for each row.
So, this should do what you want:
select json_query(max(case when id = 'A1' then j.p end)) as A1,
       json_query(max(case when id = 'B2' then j.p end)) as B2,
       json_query(max(case when id = 'C3' then j.p end)) as C3
from t cross apply
     (select t.customer, t.product, t.[date], t.[count]
      for json path
     ) j(p)
for json path;
Here is a db<>fiddle.
However, it is not easily generalizable. For a general solution, you might need to do string manipulations.