I have a complex JSON document stored in a varchar(max) column. The JSON contains an array of strings.
myArray: ['one', 'two', 'three', 'four']
I am running the following update query to delete the element 'two' from the array above using JSON_MODIFY.
UPDATE MYTABLE SET MYJSONCOL = JSON_MODIFY(MYJSONCOL, '$.section[0].subsection[7].myArray[1]', null) WHERE MYID = 'ABCD';
However, the query results in:
myArray: ['one', null, 'three', 'four']
But I want it to be:
myArray: ['one', 'three', 'four']
How do I achieve this?
I tried adding lax in front of the path, but I got the same result, i.e. null instead of the element being completely removed.
UPDATE MYTABLE SET MYJSONCOL = JSON_MODIFY(MYJSONCOL, 'lax $.section[0].subsection[7].myArray[1]', null) WHERE MYID = 'ABCD';
How can I completely remove an element from a JSON array using JSON_MODIFY?
As far as I am aware, JSON_MODIFY() does not support removing items from an array. One workaround is to expand the array using OPENJSON(), remove the items you need to, rebuild the array using STRING_AGG(), and then replace your full array with the newly created one.
It feels a bit hacky, but it does work:
IF OBJECT_ID(N'tempdb..#T', 'U') IS NOT NULL
DROP TABLE #T;
CREATE TABLE #T (MyJSONCol VARCHAR(MAX));
INSERT #T (MyJSONCol)
VALUES ('{ "SomeProperty": "SomeValue", "myArray": ["one", "two", "three", "four"] }');
SELECT *
FROM #T AS t;
UPDATE t
SET t.MyJSONCol = JSON_MODIFY(t.MyJSONCol, '$.myArray', JSON_QUERY(oj.MyNewArray))
FROM #T AS t
CROSS APPLY
( SELECT CONCAT('[', STRING_AGG(CONCAT('"', oj.Value, '"'), ','), ']')
FROM OPENJSON(t.MyJSONCol, '$.myArray') AS oj
WHERE oj.[Key] <> 1
) AS oj (MyNewArray);
SELECT *
FROM #T AS t;
Example on db<>fiddle
ADDENDUM
The above will fall down if any of the items in the array are objects. To get around this, you need to check first whether the item is itself JSON, and if so, not wrap the value in quotes:
IF OBJECT_ID(N'tempdb..#T', 'U') IS NOT NULL
DROP TABLE #T;
CREATE TABLE #T (MyJSONCol VARCHAR(MAX));
INSERT #T (MyJSONCol)
VALUES
('{ "SomeProperty": "SomeValue", "myArray": ["one", "two", "three", "four"] }'),
('{ "SomeProperty": "SomeValue", "myArray": ["one", {"two": "SomeValue"}, {"three": "SomeValue"}, "four"] }'),
('{ "SomeProperty": "SomeValue", "myArray": [{"one": "SomeValue"}, {"two": "SomeValue"}, {"three": "SomeValue"}, {"four": ["one", "two", "three"]}] }');
SELECT *
FROM #T AS t;
UPDATE t
SET MyJSONCol = JSON_MODIFY(t.MyJSONCol, '$.myArray', JSON_QUERY(oj.MyNewArray))
FROM #T AS t
CROSS APPLY
( SELECT CONCAT('[', STRING_AGG(CASE WHEN ISJSON(oj.Value) = 1 THEN oj.Value ELSE CONCAT('"', oj.Value, '"') END, ','), ']')
FROM OPENJSON(t.MyJSONCol, '$.myArray') AS oj
WHERE oj.[Key] <> 1
) AS oj (MyNewArray);
SELECT *
FROM #T AS t;
Example on db<>fiddle
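One caveat: ISJSON() returns 0 for bare numbers and booleans, so the CASE above would re-quote those as strings. A variant (a sketch, not tested against every edge case) can key off OPENJSON's [type] column instead — 1 = string, 2 = number, 3 = boolean, 4 = array, 5 = object — and use STRING_ESCAPE() to escape embedded quotes in string values; it assumes SQL Server 2017+ for STRING_AGG:

```sql
UPDATE t
SET t.MyJSONCol = JSON_MODIFY(t.MyJSONCol, '$.myArray', JSON_QUERY(oj.MyNewArray))
FROM #T AS t
CROSS APPLY
(   SELECT CONCAT('[', STRING_AGG(CASE WHEN oj.[Type] = 1 -- string values need quoting
                                       THEN CONCAT('"', STRING_ESCAPE(oj.Value, 'json'), '"')
                                       ELSE oj.Value      -- numbers, booleans, arrays, objects pass through
                                  END, ','), ']')
    FROM OPENJSON(t.MyJSONCol, '$.myArray') AS oj
    WHERE oj.[Key] <> 1
) AS oj (MyNewArray);
```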
Try this to remove the whole array:
DECLARE @DataSource TABLE
(
    [Project] VARCHAR(MAX)
);
INSERT INTO @DataSource ([Project])
VALUES ('{"SomeProperty": "SomeValue","myArray":["one", "two", "three", "four"]}');
UPDATE @DataSource
SET [Project] = JSON_MODIFY([Project], '$.myArray', NULL);
SELECT [Project]
FROM @DataSource;
Say I have an Image table with a meta column in JSON:
Id  Meta
1   { "size": 80, "effect": "blur" }
2   { "size": 200, "optimize": true }
3   { "color": "#abcdef", "ext": ".jpg" }
And I have a dynamic param of table type like so
Key       Value
size      200
optimize  true
How should I write my query to filter the rows in which the Meta column's key-value pairs matched all the values in the param table?
SELECT Id
FROM Image
WHERE (
--?? all keys and values matched the param table
)
This is a type of relational division (with remainder) question, with the extra twist of shredding JSON at the same time.
There are a number of solutions to this type of question. One common solution is to LEFT JOIN the divisor to the dividend, group it and check for any non-matches:
DECLARE #tmp TABLE (
"Key" NVARCHAR(8) COLLATE Latin1_General_BIN2,
"Value" NVARCHAR(4) COLLATE Latin1_General_BIN2
);
INSERT INTO #tmp
("Key", "Value")
VALUES
('size', '200'),
('optimize', 'true');
SELECT *
FROM Image i
WHERE EXISTS (SELECT 1
FROM #tmp t
LEFT JOIN OPENJSON(i.Meta) j ON t.[Key] = j.[key] AND t.Value = j.value
HAVING COUNT(j.value) = COUNT(*) -- all match
);
Another solution is to use a double NOT EXISTS: there are no key/value input pairs which do not have a match
DECLARE #tmp TABLE (
"Key" NVARCHAR(8) COLLATE Latin1_General_BIN2,
"Value" NVARCHAR(4) COLLATE Latin1_General_BIN2
);
INSERT INTO #tmp
("Key", "Value")
VALUES
('size', '200'),
('optimize', 'true');
SELECT *
FROM Image i
WHERE NOT EXISTS (SELECT 1
FROM #tmp t
WHERE NOT EXISTS (SELECT 1
FROM OPENJSON(i.Meta) j
WHERE t.[Key] = j.[key] AND t.Value = j.value
)
);
db<>fiddle
YMMV as to which solution is faster.
I have the following JSON that needs to be inserted into a table:
{
"Parent": {
"level": "Ground"
},
"Items": [
{ "name" : "name1",
"id" : "id1"
},
{ "name" : "name2",
"id" : "id2"
}
]
}
This is the query I wrote for inserting into the table:
Insert into table (name, id, level, dateInserted)
Select name, id, level, dateInserted
FROM OPENJSON((select * from #json), N'$.Items')
WITH ( id nvarchar(max) N'$.id',
name nvarchar(max) N'$.name')
CROSS APPLY OPENJSON((select * from #json), N'$.Parent')
WITH ( level nvarchar(max) N'$.level')
**CROSS APPLY dateInserted dateTime getDate()**
I am having an issue inserting getDate() into the dateInserted column. As dateInserted is not read from the JSON itself, I am having trouble inserting it into the table. What is the right way to do it?
It was a silly mistake. It has nothing to do with the OPENJSON or CROSS APPLY
For anyone looking for an answer
Insert into table (name, id, level, dateInserted)
Select name, id, level, getDate()
FROM OPENJSON((select * from #json), N'$.Items')
WITH ( id nvarchar(max) N'$.id',
name nvarchar(max) N'$.name')
CROSS APPLY OPENJSON((select * from #json), N'$.Parent')
WITH ( level nvarchar(max) N'$.level')
The getDate() has to go in the SELECT list.
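For reference, here is a self-contained sketch of the whole pattern (the #Target table, its column types, and holding the JSON in an @json variable are assumptions for illustration):

```sql
DECLARE @json NVARCHAR(MAX) = N'{
    "Parent": { "level": "Ground" },
    "Items": [
        { "name": "name1", "id": "id1" },
        { "name": "name2", "id": "id2" }
    ]
}';

CREATE TABLE #Target (name NVARCHAR(100), id NVARCHAR(100),
                      level NVARCHAR(100), dateInserted DATETIME);

INSERT INTO #Target (name, id, level, dateInserted)
SELECT i.name, i.id, p.level, GETDATE()   -- GETDATE() sits in the select list
FROM OPENJSON(@json, N'$.Items')
     WITH (id NVARCHAR(MAX) N'$.id', name NVARCHAR(MAX) N'$.name') AS i
CROSS APPLY OPENJSON(@json, N'$.Parent')
     WITH (level NVARCHAR(MAX) N'$.level') AS p;
```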
In SQL Server, I have a stored procedure that takes a JSON parameter @ChangeSet as below.
DECLARE @ChangeSet varchar(MAX) =
'{
"Acts":
[
{"ActId":100,"ActText":"Intro","ActNumber":1},
{"ActId":0, "ActText":"Beginning","ActNumber":2},
{"ActId":0, "ActText":"Middle","ActNumber":3},
{"ActId":0, "ActText":"End","ActNumber":4}
]
}';
Within the proc, I have a MERGE statement that updates tables based on whether it is an INSERT (if ActId is 0) or an UPDATE. I would like to update the JSON @ChangeSet variable with the multiple PK ActId's returned from the INSERTED table from the MERGE so that I can return it in an OUT parameter.
ActId Type Action Value ActNumber
---------------------------------------------
100 Act UPDATE Intro 1
101 Act INSERT Beginning 2
102 Act INSERT Middle 3
103 Act INSERT End 4
I could re-query the database, outputting as JSON, but I am interested in figuring out a technique for directly updating the JSON using something like JSON_MODIFY, etc., if possible.
I looked at various samples but have not found anything similar. Anybody have any good examples?
If I understand the question correctly, you have two options:
Modify the Acts JSON array using JSON_MODIFY (but you need SQL Server 2017+ to use a variable in the path expression). This approach sets a local variable in a SELECT statement, so you must not use ORDER BY or DISTINCT in that statement.
Parse the input JSON, use a set-based approach to get the expected results as a table and output the table content as JSON using FOR JSON AUTO
JSON:
DECLARE @ChangeSet varchar(MAX) =
'{
"Acts":
[
{"ActId":100,"ActText":"Intro","ActNumber":1},
{"ActId":0, "ActText":"Beginning","ActNumber":2},
{"ActId":0, "ActText":"Middle","ActNumber":3},
{"ActId":0, "ActText":"End","ActNumber":4}
]
}';
Statement with JSON_MODIFY:
SELECT @ChangeSet = JSON_MODIFY(
@ChangeSet,
CONCAT('$.Acts[', j1.[key], '].ActId'),
v.[Id]
)
FROM OPENJSON(@ChangeSet, '$.Acts') j1
CROSS APPLY OPENJSON(j1.[value]) WITH (ActNumber int '$.ActNumber') j2
JOIN (VALUES
(100, 'Act', 'UPDATE', 'Intro', 1),
(101, 'Act', 'INSERT', 'Beginning', 2),
(102, 'Act', 'INSERT', 'Middle', 3),
(103, 'Act', 'INSERT', 'End', 4)
) v ([Id], [Type], [Action], [Value], [ActNumber]) ON v.[ActNumber] = j2.[ActNumber]
Statement with FOR JSON:
SELECT @ChangeSet = (
SELECT v.[Id] AS ActId, j.ActText, j.ActNumber
FROM OPENJSON(@ChangeSet, '$.Acts') WITH (
ActId int '$.ActId',
ActText varchar(50) '$.ActText',
ActNumber int '$.ActNumber'
) j
JOIN (VALUES
(100, 'Act', 'UPDATE', 'Intro', 1),
(101, 'Act', 'INSERT', 'Beginning', 2),
(102, 'Act', 'INSERT', 'Middle', 3),
(103, 'Act', 'INSERT', 'End', 4)
) v ([Id], [Type], [Action], [Value], [ActNumber]) ON v.[ActNumber] = j.[ActNumber]
FOR JSON AUTO, ROOT ('Acts')
)
Result:
{
"Acts":
[
{"ActId":100, "ActText":"Intro", "ActNumber":1},
{"ActId":101, "ActText":"Beginning", "ActNumber":2},
{"ActId":102, "ActText":"Middle", "ActNumber":3},
{"ActId":103, "ActText":"End", "ActNumber":4}
]
}
For completeness, here is the finished routine: it takes the OUTPUT of a MERGE statement, inserts it into a table variable, then updates the input JSON with the newly inserted ActId primary keys so that the JSON can be returned through a procedure OUT parameter.
-- SQL 2017+ REQUIRED
DECLARE @ActActions table( [ActId] int, [Action] varchar(30),
[Value] nvarchar(max), [ActNumber] int );
---------------------------------------------------------------------------------
OUTPUT COALESCE (INSERTED.ActId, DELETED.ActId), $action,
COALESCE (INSERTED.ActText, DELETED.ActText),
COALESCE (INSERTED.ActNumber, DELETED.ActNumber)
INTO @ActActions; -- Required semi-colon at end of MERGE
---------------------------------------------------------------------------------
SELECT @ChangeSetJson = JSON_QUERY(JSON_MODIFY(
@ChangeSetJson, '$.Acts[' + j1.[key] + '].ActId', a.[ActId] ) )
FROM OPENJSON(@ChangeSetJson, '$.Acts') j1 CROSS APPLY
OPENJSON(j1.[value])
WITH (ActNumber int '$.ActNumber') j2 INNER JOIN
@ActActions a
ON a.[ActNumber] = j2.[ActNumber]
WHERE a.[Action] = 'INSERT'
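For context, the OUTPUT fragment above belongs at the end of a MERGE along these lines (a hedged sketch: the dbo.Acts target table, its identity ActId key, and the matching rule are assumptions based on the question):

```sql
MERGE dbo.Acts AS tgt
USING OPENJSON(@ChangeSetJson, '$.Acts')
      WITH (ActId int, ActText nvarchar(max), ActNumber int) AS src
   ON tgt.ActId = src.ActId            -- ActId = 0 never matches, so those rows insert
WHEN MATCHED THEN
    UPDATE SET tgt.ActText = src.ActText, tgt.ActNumber = src.ActNumber
WHEN NOT MATCHED THEN
    INSERT (ActText, ActNumber) VALUES (src.ActText, src.ActNumber)
OUTPUT COALESCE(INSERTED.ActId, DELETED.ActId), $action,
       COALESCE(INSERTED.ActText, DELETED.ActText),
       COALESCE(INSERTED.ActNumber, DELETED.ActNumber)
INTO @ActActions;                      -- the table variable declared above
```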
I have a table with a column containing JSON data. Every JSON object in that column has many properties. I have another table that lists some (but not all) of those property names.
What I need is a query that selects the JSON data from the first table, but with the properties listed in the second table removed. For example, below is the JSON table.
Id Person
1 {FirstName:"test", LastName: "test", Email: "test@test.abc"}
2 {FirstName:"syx", LastName: "ave", Email: "yyeyd@test.abc"}
Second table with properties name:
ExclusionId ExcludedProperty
1 Email
Query should join these two table and output should be
Id Person
1 {FirstName:"test", LastName: "test"}
2 {FirstName:"syx", LastName: "ave"}
With corrected JSON, you can use JSON_MODIFY() and set the desired value to NULL
Example
Declare @YourTable Table ([Id] int,[Person] varchar(500))
Insert Into @YourTable Values
(1,'{"FirstName":"test", "LastName": "test", "Email": "test@test.abc"}')
,(2,'{"FirstName":"syx", "LastName": "ave", "Email": "yyeyd@test.abc"}')
Select A.ID
,NewValue = JSON_MODIFY([Person],'$.Email',null)
From @YourTable A
Returns
ID NewValue
1 {"FirstName":"test", "LastName": "test"}
2 {"FirstName":"syx", "LastName": "ave"}
This is a fully generic approach with no need for dynamic SQL:
--a mockup to simulate your issue
--credits to John Cappelletti for the mockup-code
DECLARE @YourTable TABLE (Id INT,Person NVARCHAR(500))
INSERT INTO @YourTable VALUES
(1,'{"FirstName":"test", "LastName": "test", "Email": "test@test.abc"}')
,(2,'{"FirstName":"syx", "LastName": "ave", "Email": "yyeyd@test.abc"}');
DECLARE @ExcludeProperties TABLE (Id INT,PropName VARCHAR(500))
INSERT INTO @ExcludeProperties VALUES (1,'Email');
--The query
WITH cte AS
(
SELECT t.Id
,JsonValues.[key] AS PropName
,JsonValues.[value] AS PropValue
FROM @YourTable t
CROSS APPLY OPENJSON(t.Person) JsonValues
WHERE JsonValues.[key] COLLATE Latin1_General_CI_AS NOT IN(SELECT excl.PropName COLLATE Latin1_General_CI_AS FROM @ExcludeProperties excl)
)
SELECT cte.Id
,CONCAT('{',
STUFF((
SELECT CONCAT(',"',cte2.PropName,'":"',cte2.PropValue,'"')
FROM cte cte2
WHERE cte2.Id=cte.Id
FOR XML PATH('')
),1,1,'')
,'}')
FROM cte
GROUP BY cte.Id;
The idea in short:
First we use a CTE to create a list of all properties within your JSON as EAV triples (Entity-Attribute-Value). With this it is easy to get rid of all the attributes whose names appear in your exclude table.
As we are not allowed to use a column's value as an output column's alias, we build the JSON with string methods instead of a FOR JSON query.
I use GROUP BY to reduce the final output to one row per Id, and then a correlated sub-query to build the JSON per Id from the corresponding EAV rows.
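On SQL Server 2017+, the FOR XML PATH trick can be replaced with STRING_AGG. A sketch against the same mockup (it still assumes all property values are plain strings):

```sql
SELECT t.Id,
       CONCAT('{',
              (SELECT STRING_AGG(CONCAT('"', j.[key], '":"', j.[value], '"'), ',')
               FROM OPENJSON(t.Person) j
               WHERE j.[key] NOT IN (SELECT excl.PropName FROM @ExcludeProperties excl)),
              '}') AS Person
FROM @YourTable t;
```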
I'm trying to use JSON as an input parameter for a stored procedure, to do some filtering on the returned data.
I have the following TableA:
CREATE TABLE TableA (
Id INT NULL,
Value1 VARCHAR(25) NULL,
Value2 VARCHAR(25) NULL
);
INSERT INTO TableA (Id, Value1, Value2) values (1, 'test1', 'new1')
INSERT INTO TableA (Id, Value1, Value2) values (1, 'test1', 'new2')
INSERT INTO TableA (Id, Value1, Value2) values (null, null, 'test3')
INSERT INTO TableA (Id, Value1, Value2) values (2, 'myvalue1', 'newvalue')
The JSON parameter is dynamic, representing one or more column names and values from the above table.
DECLARE @Filter NVARCHAR(MAX)
SET @Filter=N'{
"Id": 2,
"Value1": "myvalue1",
"Value2": "newvalue"
}'
And I extract the data from the JSON and inner join it with TableA to get the output I need:
...
;WITH cte AS
(
SELECT * FROM OPENJSON(@Filter)
WITH(Id INT,
Value1 VARCHAR(25),
Value2 VARCHAR(25))
)
SELECT a.* FROM cte ct
INNER JOIN TableA a
ON ct.Id = a.Id
INNER JOIN TableA b
ON ct.Value1 = b.Value1
INNER JOIN TableA c
ON ct.Value2 = c.Value2
In this particular example, I get the desired output, since I'm specifying all the columns. However, the whole reason behind using JSON as parameter is to be able to dynamically pass different columns (the actual table has much more columns).
If I would to pass the following filters, I will no longer get the desired output because of inner join.
SET @Filter=N'{
"Id": 2
}'
...
SET @Filter=N'{
"Id": 2,
"Value1": "test1"
}'
...
SET @Filter=N'{
"Value1": "test1",
"Value2": "new2"
}'
...etc
Is there a way to dynamically select all the objects from json that has values and skip the nulls? Or any other suggestion on how I can resolve this issue?
Is there a way to check whether the JSON has any properties in it? Based on the documentation, there's only one function, ISJSON, that validates the format. However, if I pass SET @Filter=N'{}', it's a valid JSON object, but it's empty.
SQLFIDDLE
You can try this:
SELECT a.*
FROM TableA a
WHERE (a.Id IS NULL AND JSON_VALUE(@Filter,N'$.Id') IS NULL OR a.Id = ISNULL(CAST(JSON_VALUE(@Filter,N'$.Id') AS INT),a.Id))
AND (a.Value1 IS NULL AND JSON_VALUE(@Filter,N'$.Value1') IS NULL OR a.Value1 = ISNULL(JSON_VALUE(@Filter,N'$.Value1'),a.Value1))
AND (a.Value2 IS NULL AND JSON_VALUE(@Filter,N'$.Value2') IS NULL OR a.Value2 = ISNULL(JSON_VALUE(@Filter,N'$.Value2'),a.Value2));
The trick is to replace a NULL with the actual column's value...
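As for the side question about detecting an empty object (@Filter = N'{}'): OPENJSON returns zero rows for an empty object, so one option (a sketch) is:

```sql
DECLARE @Filter NVARCHAR(MAX) = N'{}';
IF ISJSON(@Filter) = 1 AND NOT EXISTS (SELECT 1 FROM OPENJSON(@Filter))
    PRINT 'Valid but empty JSON object';
```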
I know this is not really a satisfying answer, but the approach I've used in such situations is to guard each optional field with:
where (value1 is null or field1 = value1)
This removes the ability to explicitly select field1 is null through the filter, and it grows your select statement by two expressions per parameter:
--SELECT * FROM TableA
DECLARE @Filter NVARCHAR(MAX)
SET @Filter=N'{
"Id": 1,
"Value1": "test1",
"Value2": "new2"
}'
;WITH cte AS
(
SELECT * FROM OPENJSON(@Filter)
WITH(Id INT,
Value1 VARCHAR(25),
Value2 VARCHAR(25))
)
SELECT a.* FROM cte filter
INNER JOIN TableA a
ON filter.Id = a.Id
and (filter.Value1 is null or filter.Value1 = a.Value1)
and (filter.Value2 is null or filter.Value2 = a.Value2)
Query Result
Id Value1 Value2
1 test1 new2
Filtering with values not set
If you have a missing value in your JSON, this will not be used as a filter criterion:
--SELECT * FROM TableA
DECLARE @Filter NVARCHAR(MAX)
SET @Filter=N'{
"Id": 1,
"Value2": "new2"
}'
;WITH cte AS
(
SELECT * FROM OPENJSON(@Filter)
WITH(Id INT,
Value1 VARCHAR(25),
Value2 VARCHAR(25))
)
SELECT a.* FROM cte filter
INNER JOIN TableA a
ON filter.Id = a.Id
and (filter.Value1 is null or filter.Value1 = a.Value1)
and (filter.Value2 is null or filter.Value2 = a.Value2)
SQL Output
Id Value1 Value2
1 test1 new2