Prevent double-escaped JSON in FOR JSON output in SQL

I have a small problem: a column can contain either plain text such as 'John' or a JSON array such as '["John","Smith"]'. How can I prevent double-escaped JSON in the FOR JSON output? I think I am doing something wrong here. Please check my example:
Create table #jsonTest(NameList varchar(max))
insert into #jsonTest(NameList)
select '["John","Smith"]'
Now, if I query it like this, it gives the correct output (no escape characters):
select JSON_QUERY(NameList) NameList from #jsonTest for json auto
Output:
[{"NameList":["John","Smith"]}]
Simple text example:
truncate table #jsonTest
insert into #jsonTest(NameList)
Select 'John'
For this row I have to change my select query to get the correct output, because JSON_QUERY, as mentioned, only returns objects and arrays. So I've changed it to this:
select case when ISJSON(NameList) = 1 then JSON_QUERY(NameList) else NameList end NameList from #jsonTest for json auto
Output:
[{"NameList":"John"}]
This gives the correct output for now, but if I insert the previous data again and run the select query above:
truncate table #jsonTest
insert into #jsonTest(NameList)
select '["John","Smith"]'
select case when ISJSON(NameList) = 1 then JSON_QUERY(NameList) else NameList end NameList from #jsonTest for json auto
Output:
[{"NameList":"[\"John\",\"Smith\"]"}]
the output contains escape characters again. What is wrong in the code?

This behaviour is explained in the documentation: if the source data contains special characters, the FOR JSON clause escapes them in the JSON output with '\'. And, as you already know, when JSON_QUERY() is used with FOR JSON AUTO, FOR JSON does not escape special characters in the JSON_QUERY return value.
Your problem is that your data is not always valid JSON. One possible approach is to generate a statement with two columns that share the same name (NameList), only one of which is non-NULL for any given row. By default FOR JSON AUTO does not include NULL values in the output, so the result is the expected JSON. Just note that you must not use INCLUDE_NULL_VALUES in the statement, or the final JSON will contain duplicate keys.
Table:
CREATE TABLE #jsonTest(NameList varchar(max))
insert into #jsonTest(NameList)
select '["John","Smith"]'
insert into #jsonTest(NameList)
Select 'John'
Statement:
SELECT
JSON_QUERY(CASE WHEN ISJSON(NameList) = 1 THEN JSON_QUERY(NameList) END) AS NameList,
CASE WHEN ISJSON(NameList) = 0 THEN NameList END AS NameList
FROM #jsonTest
FOR JSON AUTO
Result:
[{"NameList":["John","Smith"]},{"NameList":"John"}]

Related

How to manipulate String count in JSON in SQL Server

In one use case I need to write a query which returns 1 if the string count is >= 4, or else returns 0.
Below is the JSON which contains the string.
Below is the query, which returns a single row:
Select * from [Frs_def_businessobjectlayouts] where Definition like '%Open In Parent%' AND name like
'Task.ResponsiveAnalyst'
Note - Definition is the column which contains JSON data
Could someone help me out here?
First, query the JSON and store it in a variable, declare the word you want to search for in another variable, and perform the operation below. You will get the number of times it occurs in the JSON as output.
DECLARE @string VARCHAR(MAX) = 'Query your JSON from the table and assign it to this variable'
DECLARE @tosearch VARCHAR(MAX) = 'Open In Parent'
SELECT (DATALENGTH(@string) - DATALENGTH(REPLACE(@string, @tosearch, ''))) / DATALENGTH(@tosearch)
AS OccurrenceCount
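To get the 1/0 flag from the original requirement (at least 4 occurrences), here is a minimal sketch, assuming the Frs_def_businessobjectlayouts table and Definition column from the question:
DECLARE @def NVARCHAR(MAX) =
    (SELECT TOP (1) Definition
     FROM Frs_def_businessobjectlayouts
     WHERE name LIKE 'Task.ResponsiveAnalyst')
DECLARE @word NVARCHAR(MAX) = 'Open In Parent'
SELECT CASE
         WHEN (DATALENGTH(@def) - DATALENGTH(REPLACE(@def, @word, ''))) / DATALENGTH(@word) >= 4
         THEN 1 ELSE 0
       END AS HasAtLeastFourOccurrences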

How to extract value from a JSON string with no Key?

I have a JSON column in one of the tables, and the JSON column has no key or property, only the value.
I tried to parse the column with JSON_QUERY and JSON_VALUE, but both of these functions only work if the JSON string has a key; in my situation the JSON string has no key.
So how can I parse the column from the top table to the bottom table in SQL Server like the image below?
Please try this:
DECLARE #Table TABLE (ID INT, [JSONColumn] NVARCHAR(MAX));
INSERT INTO #Table(ID,[JSONColumn])VALUES
(151616,'["B0107C57WO","B066EYU4IY"]')
,(151617,'["B0088MD64S"]')
;
SELECT t.ID,j.[value]
FROM #Table t
CROSS APPLY OPENJSON(t.JSONColumn) j
;
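With the sample rows above, this returns one row per array element:
ID    |value
------|-----------
151616|B0107C57WO
151616|B066EYU4IY
151617|B0088MD64S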

Update an existing JSON value inside a JSON Array in SQL

I want to update an existing value inside a JSON array. I can append a new JSON object to the array using JSON_MODIFY. Suppose I have JSON like:
[{"id":"101","name":"John"}, {"id":"102","name":"peter"}]
But I want to update only the object with id=102.
Is it possible using JSON_MODIFY()?
EDIT:
Actual data
{"Details":{"SId":{"Type":"string","Value":"1234"},"BookList":{"Type":"List","Value":[{"id": "101", "name": "Book1"},{"id": "102", "name": "Book2"}]},"SName":{"Type":"string","Value":"john"}}}
You could use a CTE to parse it and build the path in the UPDATE part:
WITH cte AS (
-- assumes a table t(i int, c nvarchar(max)) where column c holds the JSON array, as in the DBFiddle demo
SELECT *
FROM t
CROSS APPLY OPENJSON(c) s
WHERE i = 1
AND JSON_VALUE(s.value, '$.id') = 102
)
UPDATE cte
SET c = JSON_MODIFY(c, '$[' + cte.[key] + '].name', 'Joe');
DBFiddle Demo
Output:
-- Before
[{"id":"101","name":"John"}, {"id":"102","name":"peter"}]
-- After
[{"id":"101","name":"John"}, {"id":"102","name":"Joe"}]
This works on SQL Server 2017+ or Azure SQL Database; on earlier versions you will get an error, because before SQL Server 2017 the path argument of JSON_MODIFY had to be a string literal rather than an expression.
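For the nested structure from the EDIT, the same pattern can be pointed at the inner array. A hedged sketch, assuming the document sits in column c of table t as in the demo above, and using 'NewBookName' as a placeholder value (again 2017+ only):
WITH cte AS (
SELECT t.c, b.[key]
FROM t
CROSS APPLY OPENJSON(t.c, '$.Details.BookList.Value') b
WHERE JSON_VALUE(b.value, '$.id') = '102'
)
UPDATE cte
SET c = JSON_MODIFY(c, '$.Details.BookList.Value[' + cte.[key] + '].name', 'NewBookName');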
Updating JSON Data (Postgresql)
If the column in your table contains json data and you want to update this data, you can use the following structure:
UPDATE table_name SET column_name = '{"key" : value}'::jsonb
WHERE column_name::jsonb @> '{"new_key" : new_value}'::jsonb;
Note: @> is the jsonb "contains" operator.
The best way is to generate the statement as text, as shown below.
This way you won't get the error "Cannot resolve the collation conflict between Latin1_General_BIN and SQL_Latin1_General_CP1_CI_AS".
You also won't get the error "The argument 2 of the JSON_MODIFY must be a string literal":
WITH cte AS (
SELECT
t.PrimaryKey,
JSON_VALUE([value], '$.id') as id,
t.JsonColumn,
o.*
,('UPDATE MyTable set JsonColumn = JSON_MODIFY(JsonColumn, ''$['+[key]+'].id'', ''NewVALUE'') WHERE PrimaryKey = '''+t.PrimaryKey COLLATE SQL_Latin1_General_CP1_CI_AS+ '''') as statement
FROM MyTable t
CROSS APPLY OPENJSON(JSON_QUERY(JsonColumn, '$')) o WHERE JSON_VALUE(o.value, '$.Id')= 1
)
select * from cte;
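One possible way to actually run what was generated (a sketch, assuming the same MyTable, JsonColumn and PrimaryKey names, and that PrimaryKey is a string) is to concatenate the statements and pass the batch to sp_executesql:
DECLARE @sql nvarchar(max) = N'';
SELECT @sql = @sql +
N'UPDATE MyTable SET JsonColumn = JSON_MODIFY(JsonColumn, ''$[' + o.[key] + N'].id'', ''NewVALUE'') WHERE PrimaryKey = ''' + t.PrimaryKey COLLATE SQL_Latin1_General_CP1_CI_AS + N'''; '
FROM MyTable t
CROSS APPLY OPENJSON(JSON_QUERY(t.JsonColumn, '$')) o
WHERE JSON_VALUE(o.value, '$.Id') = 1;
EXEC sp_executesql @sql;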

Updating a JSON field replaces whole document?

In SQL Server 2016 I am expecting a document to have 3000+ fields in a JSON column. Can I update one field in the document without replacing the whole document? How can I do this?
You could use the JSON_MODIFY function:
Updates the value of a property in a JSON string and returns the
updated JSON string.
JSON_MODIFY ( expression , path , newValue )
Something like:
UPDATE table_name
SET json_column = JSON_MODIFY(json_column, '$.name', 'new_name')
WHERE id = 1;
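If several properties need to change, JSON_MODIFY calls can be nested in a single statement; a small sketch with assumed column and property names:
UPDATE table_name
SET json_column = JSON_MODIFY(
                      JSON_MODIFY(json_column, '$.name', 'new_name'),
                      '$.info.city', 'London')
WHERE id = 1;
Note that SQL Server stores JSON as plain NVARCHAR text, so the column value is still written as a whole; JSON_MODIFY just lets you express the change at the property level.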

SQL - Collect all data into a variable

I need to collect all the returned data into a variable, comma separated.
Let's say I have a select command like: select * from #temptable.
It returns:
Field1|Field2
-------------
Value1|Value2
Expected result: @testvariable holds the value: 'Value1','Value2'
This table may have 2 columns, and I need to store the whole returned result in a single variable. We can easily collect a single value like: select @var = column1 from #temptable. But I need to store all of it. The problem is that the number of columns can vary; the number and names of the columns are generated by another query, so I can't mention the field names. I need a dynamic way to do it. Only one row will be returned from this table. Thanks in advance.
You can do this without dynamic SQL by using XML:
DECLARE @xml XML = (SELECT * FROM #temptable FOR XML PATH(''))
SELECT STUFF((SELECT ',' + node.value('.', 'varchar(100)')
FROM @xml.nodes('/*') AS T(node)
FOR XML PATH(''), TYPE).value('.','varchar(max)')
, 1, 1, '');
This can probably be simplified by someone more adept at XML querying than me.
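A self-contained sketch of the same idea, tweaked to produce the quoted form 'Value1','Value2' that the question expects (table name and values taken from the question):
CREATE TABLE #temptable (Field1 varchar(100), Field2 varchar(100));
INSERT INTO #temptable (Field1, Field2) VALUES ('Value1', 'Value2');

DECLARE @testvariable varchar(max);
DECLARE @xml XML = (SELECT * FROM #temptable FOR XML PATH(''));

-- wrap each value in single quotes before concatenating
SELECT @testvariable = STUFF((SELECT ',''' + node.value('.', 'varchar(100)') + ''''
FROM @xml.nodes('/*') AS T(node)
FOR XML PATH(''), TYPE).value('.','varchar(max)'), 1, 1, '');

SELECT @testvariable AS Result;   -- 'Value1','Value2'

DROP TABLE #temptable;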
Since your column names are dynamic, first collect the column names, comma separated, into a variable, and then use EXEC().
For example:
-- making comma separated column names from table B
DECLARE @var varchar(1000) = (SELECT SUBSTRING(
(SELECT ',' + Colnames
FROM TABLEB
ORDER BY Colnames
FOR XML PATH('')), 2, 200000))
-- Execute the sql statement
EXEC('select ' + @var + ' from tableA')
If you want to get the value returned after executing the SQL statement, you can use
sp_executesql (Transact-SQL).
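A minimal sketch of that pattern, capturing a value into a variable through an OUTPUT parameter (the table tableA and column Field1 are assumptions for illustration):
DECLARE @result varchar(max);
DECLARE @sql nvarchar(max) = N'SELECT @out = Field1 FROM tableA';
EXEC sp_executesql @sql, N'@out varchar(max) OUTPUT', @out = @result OUTPUT;
SELECT @result AS CapturedValue;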