I want to update an existing JSON value inside a JSON array. I can append a new JSON object to the array using JSON_MODIFY. Suppose I have JSON like:
[{"id":"101","name":"John"}, {"id":"102","name":"peter"}]
But I want to update only the object with id=102.
Is it possible using JSON_MODIFY()?
EDIT:
Actual data
{"Details":{"SId":{"Type":"string","Value":"1234"},"BookList":{"Type":"List","Value":[{"id": "101", "name": "Book1"},{"id": "102", "name": "Book2"}]},"SName":{"Type":"string","Value":"john"}}}
You could use a CTE to parse the array with OPENJSON and build the path in the UPDATE part:
WITH cte AS (
SELECT *
FROM t
CROSS APPLY OPENJSON(c) s
WHERE i = 1
AND JSON_VALUE(s.value, '$.id')=102
)
UPDATE cte
SET c = JSON_MODIFY(c, '$[' + cte.[key] + '].name', 'Joe');
DBFiddle Demo
Output:
-- Before
[{"id":"101","name":"John"}, {"id":"102","name":"peter"}]
-- After
[{"id":"101","name":"John"}, {"id":"102","name":"Joe"}]
This will work on SQL Server 2017+ or Azure SQL Database; on earlier versions the path argument of JSON_MODIFY must be a string literal, so building it from cte.[key] will raise an error.
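For the nested structure in the edit, a similar sketch should work (a sketch only, assuming the same table t with JSON column c as in the demo above; the path follows the BookList JSON in the edit):
WITH cte AS (
    SELECT c, b.[key] AS arr_index
    FROM t
    CROSS APPLY OPENJSON(c, '$.Details.BookList.Value') b
    WHERE JSON_VALUE(b.value, '$.id') = '102'
)
UPDATE cte
SET c = JSON_MODIFY(c, '$.Details.BookList.Value[' + arr_index + '].name', 'NewName');
-- the dynamically built path again requires SQL Server 2017+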
Updating JSON data (PostgreSQL)
If the column in your table contains JSON data and you want to update it, you can use the following structure:
UPDATE table_name SET column_name = '{"key" : value}'::jsonb
WHERE column_name::jsonb @> '{"existing_key" : existing_value}'::jsonb;
Note: @> is the jsonb "contains" operator (while #> extracts the value at a given path).
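A minimal concrete sketch of that pattern (assuming an illustrative table books with a jsonb column doc that holds one object per row):
-- set the "name" key for rows whose document contains id "102"
UPDATE books
SET doc = jsonb_set(doc, '{name}', '"Joe"')
WHERE doc @> '{"id": "102"}'::jsonb;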
The best way is to generate the UPDATE statements.
This way you won't get the error "Cannot resolve the collation conflict between Latin1_General_BIN and SQL_Latin1_General_CP1_CI_AS".
You also won't get the error "The argument 2 of the JSON_MODIFY must be a string literal":
WITH cte AS (
    SELECT
        t.PrimaryKey,
        JSON_VALUE([value], '$.id') AS id,
        t.JsonColumn,
        o.*,
        ('UPDATE MyTable SET JsonColumn = JSON_MODIFY(JsonColumn, ''$[' + [key] + '].id'', ''NewVALUE'') WHERE PrimaryKey = ''' + t.PrimaryKey COLLATE SQL_Latin1_General_CP1_CI_AS + '''') AS statement
    FROM MyTable t
    CROSS APPLY OPENJSON(JSON_QUERY(JsonColumn, '$')) o
    WHERE JSON_VALUE(o.value, '$.Id') = 1
)
SELECT * FROM cte;
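If you want to execute the generated statements directly instead of copying them out, one hedged option on SQL Server 2017+ is to aggregate them with STRING_AGG and run the batch with sp_executesql (same illustrative table and column names as above):
DECLARE @sql nvarchar(max);

SELECT @sql = STRING_AGG(CAST(s.statement AS nvarchar(max)), ';')
FROM (
    SELECT ('UPDATE MyTable SET JsonColumn = JSON_MODIFY(JsonColumn, ''$[' + o.[key] + '].id'', ''NewVALUE'') WHERE PrimaryKey = ''' + t.PrimaryKey COLLATE SQL_Latin1_General_CP1_CI_AS + '''') AS statement
    FROM MyTable t
    CROSS APPLY OPENJSON(JSON_QUERY(t.JsonColumn, '$')) o
    WHERE JSON_VALUE(o.value, '$.Id') = 1
) AS s;

EXEC sp_executesql @sql;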
I have a column whose type is JSON, but it contains JSON strings like this:
"{\"a\":1,\"b\":2,\"c\":3}"
I want to update the values in the column with proper JSON objects without the quotes and escapes like this:
{"a":1,"b":2,"c":3}
I've tried the following statement; even though it says it updates rows, the columns are still the same.
UPDATE table SET column = to_json(column);
It seems like the to_json function doesn't work, since the value is already a JSON string?
How can I update those values?
You could cast the JSON column as text, remove the unwanted quotes and escapes, and then cast the resulting text as JSON.
update tbl
set js = trim(replace(js::text, '\',''), '"')::json
demo
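As a quick sanity check, here is what that expression does to the sample value when run against a plain text literal (illustrative only):
select trim(replace('"{\"a\":1,\"b\":2,\"c\":3}"', '\', ''), '"')::json;
-- returns {"a":1,"b":2,"c":3}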
You can use the #>> operator to extract the string and then convert it back to JSON with ::json:
UPDATE your_table
SET your_column = (your_column #>> '{}')::json;
A fully working demo:
create table your_table (
your_column json
);
insert into your_table (your_column) values ('"{\"a\":1,\"b\":2,\"c\":3}"'::json);
select your_column, your_column #>> '{}'
from your_table ;
         your_column         |       ?column?
-----------------------------+---------------------
 "{\"a\":1,\"b\":2,\"c\":3}" | {"a":1,"b":2,"c":3}
update your_table
set your_column = (your_column #>> '{}')::json;
select *
from your_table;
     your_column
---------------------
 {"a":1,"b":2,"c":3}
I have a small problem: a column can contain either plain text such as 'John' or text holding an array such as '["John","Smith"]'. So how can I prevent double-escaped JSON in the FOR JSON output? I think I am doing something wrong here. Please check my example:
Create table #jsonTest(NameList varchar(max))
insert into #jsonTest(NameList)
select '["John","Smith"]'
Now if I select it, I get the correct output from this (without escape characters):
select JSON_QUERY(NameList) NameList from #jsonTest for json auto
Output:
[{"NameList":["John","Smith"]}]
Simple text example:
truncate table #jsonTest
insert into #jsonTest(NameList)
Select 'John'
Now for this I have to change my select query to get the correct output, because JSON_QUERY, as mentioned, only returns objects and arrays. So I've changed it to this:
select case when ISJSON(NameList) = 1 then JSON_QUERY(NameList) else NameList end NameList from #jsonTest for json auto
Output:
[{"NameList":"John"}]
This gives the correct output for now, but if I insert the previous data again and try the select query above:
truncate table #jsonTest
insert into #jsonTest(NameList)
select '["John","Smith"]'
select case when ISJSON(NameList) = 1 then JSON_QUERY(NameList) else NameList end NameList from #jsonTest for json auto
Output:
[{"NameList":"[\"John\",\"Smith\"]"}]
then I get escape characters in the output. What is wrong with the code?
This behaviour is explained in the documentation: if the source data contains special characters, the FOR JSON clause escapes them in the JSON output with '\'. And, as you already know, when JSON_QUERY() is used with FOR JSON AUTO, FOR JSON doesn't escape special characters in the JSON_QUERY return value.
Your problem is the fact that your data is not always valid JSON. So, one possible approach is to generate a statement with duplicate column names (NameList). By default FOR JSON AUTO does not include NULL values in the output, so the result is the expected JSON. Just note that you must not use INCLUDE_NULL_VALUES in the statement, or the final JSON will contain duplicate keys.
Table:
CREATE TABLE #jsonTest(NameList varchar(max))
insert into #jsonTest(NameList)
select '["John","Smith"]'
insert into #jsonTest(NameList)
Select 'John'
Statement:
SELECT
JSON_QUERY(CASE WHEN ISJSON(NameList) = 1 THEN JSON_QUERY(NameList) END) AS NameList,
CASE WHEN ISJSON(NameList) = 0 THEN NameList END AS NameList
FROM #jsonTest
FOR JSON AUTO
Result:
[{"NameList":["John","Smith"]},{"NameList":"John"}]
I'm attempting to validate some column headings before the import of a monthly data set. I've set up an Execute SQL Task that's supposed to retrieve the column headings of the prior month's table and store it in Header_Row as a single string with the field names separated by commas. The query runs just fine in SQL Server, but when running in SSIS, it throws the following error:
"The type of the value (Empty) being assigned to variable 'User:Header_Row' differs from the current variable type (String)."
1) Does this mean that I'm not getting anything back from my query?
2) Is there another method I should be using in SSIS to get the query results I'm looking for?
3) Is there an issue with me using the variable reference in my query as a portion of a string? I think the answer is yes, but would like to confirm, as my variable was still empty after changing this.
Original Query:
SELECT DISTINCT
STUFF((
SELECT
',' + COLUMN_NAME
FROM
db_Analytics.INFORMATION_SCHEMA.COLUMNS aa
WHERE
TABLE_NAME = 'dt_table_?'
ORDER BY
aa.ORDINAL_POSITION
FOR
XML PATH('')
), 1, 1, '') AS Fields
FROM
db_Analytics.INFORMATION_SCHEMA.COLUMNS a;
EDIT: After changing the variable to cover the full table name, I have a new error saying "The value type (__ComObject) can only be converted to variables of the type Object."
Final Query:
SELECT DISTINCT
CAST(STUFF((
SELECT
',' + COLUMN_NAME
FROM
db_Analytics.INFORMATION_SCHEMA.COLUMNS aa
WHERE
TABLE_NAME = ?
ORDER BY
aa.ORDINAL_POSITION
FOR
XML PATH('')
), 1, 1, '') As varchar(8000)) AS Fields
FROM
db_Analytics.INFORMATION_SCHEMA.COLUMNS a;
You are attempting to parameterize your query. Proper query parameterization is useful for avoiding SQL Injection attacks and the like.
Your query is looking for a TABLE_NAME that is literally 'dt_table_?'. That's probably not what you want.
For laziness, I'd just rewrite it as
DECLARE @tname sysname = 'dt_table_' + ?;
SELECT DISTINCT
STUFF((
SELECT
',' + COLUMN_NAME
FROM
db_Analytics.INFORMATION_SCHEMA.COLUMNS aa
WHERE
TABLE_NAME = @tname
ORDER BY
aa.ORDINAL_POSITION
FOR
XML PATH('')
), 1, 1, '') AS Fields
FROM
db_Analytics.INFORMATION_SCHEMA.COLUMNS a;
If that's not working, you might need to use an Expression to build out the query.
I'm really pretty sure that this is your problem:
TABLE_NAME = 'dt_table_?'
I'm guessing this is an attempt to parameterize the query, but having the question mark inside the single-quote will cause the question mark to be taken literally.
Try like this instead:
TABLE_NAME = ?
And when you populate the variable that you use as the parameter value, include the 'dt_table_' part in the value of the variable.
EDIT:
Also in your ResultSet assignment, try changing "Fields" to "0" in the Result Name column.
There are two issues with the query above:
1) The query in the task was not properly parameterized. I fixed this by putting the full name of the prior month's table into the variable.
2) The default length of the result was MAX, which was causing an issue when SSIS would try to put it into my variable, Header_Row. I fixed this by casting the result of the query as varchar(8000).
Thanks for the help everyone.
In SQL Server 2016 I am expecting a document to have 3000+ fields in a JSON column. Can I update one field in the document without replacing the whole document? How can I do this?
You could use the JSON_MODIFY function:
Updates the value of a property in a JSON string and returns the
updated JSON string.
JSON_MODIFY ( expression , path , newValue )
Something like:
UPDATE table_name
SET json_column = JSON_MODIFY(json_column, '$.name', 'new_name')
WHERE id = 1;
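Nested properties can be addressed with a deeper path in the same call; a small sketch with illustrative names:
UPDATE table_name
SET json_column = JSON_MODIFY(json_column, '$.address.city', 'Berlin')
WHERE id = 1;
Note that the column still receives the whole rewritten JSON string, but you only have to specify the one property you want to change.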
I need to collect all returned data into a variable, comma separated.
Let's say I have a select command like: select * from #temptable.
It returns:
Field1|Field2
-------------
Value1|Value2
Expected result: @testvariable holds the value: 'Value1','Value2'
This table may have two or more columns, and I need to store the entire result in a single variable. We can easily collect a single value like: select @var = column1 from #temptable. But I need to store all of them. The problem is that the number of columns can vary; the number and names of the columns are generated by another query, so I can't mention the field names. I need a dynamic way to do it. Only one row will be returned from this table. Thanks in advance.
You can do this without dynamic SQL using XML
DECLARE #xml XML = (SELECT * FROM #temptable FOR XML PATH(''))
SELECT stuff((SELECT ',' + node.value('.', 'varchar(100)')
FROM #xml.nodes('/*') AS T(node)
FOR XML PATH(''), type).value('.','varchar(max)')
, 1, 1, '');
This can probably be simplified by someone more adept at XML querying than me.
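On SQL Server 2017+ the second FOR XML step can be replaced with STRING_AGG; a hedged sketch using the same @xml variable as above:
DECLARE @xml XML = (SELECT * FROM #temptable FOR XML PATH(''));
DECLARE @testvariable varchar(max);

-- aggregate every element value of the single-row XML into one comma-separated string
SELECT @testvariable = STRING_AGG(node.value('.', 'varchar(100)'), ',')
FROM @xml.nodes('/*') AS T(node);

SELECT @testvariable;  -- Value1,Value2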
Since your column names are dynamic, you first have to build a comma-separated list of the column names in a variable, and then you can use EXEC().
For example:
-- making comma-separated column names from table B
DECLARE @var varchar(1000) = (SELECT SUBSTRING(
    (SELECT ',' + Colnames
     FROM TABLEB
     ORDER BY Colnames
     FOR XML PATH('')), 2, 200000));

-- execute the sql statement
EXEC('select ' + @var + ' from tableA')
If you want to get the value returned after executing the SQL statement, you can use sp_executesql (Transact-SQL).
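A hedged sketch of capturing the result with sp_executesql and an OUTPUT parameter (SQL Server 2017+ for STRING_AGG and CONCAT_WS; TABLEB, tableA and Colnames follow the example above and are assumptions):
DECLARE @collist nvarchar(max), @sql nvarchar(max), @result varchar(max);

-- build a quoted, comma-separated column list from the metadata table
SELECT @collist = STRING_AGG(QUOTENAME(Colnames), ',') WITHIN GROUP (ORDER BY Colnames)
FROM TABLEB;

-- concatenate the single row's values inside the dynamic SQL and return them through @out
SET @sql = N'SELECT @out = CONCAT_WS('','', ' + @collist + N') FROM tableA;';

EXEC sp_executesql @sql, N'@out varchar(max) OUTPUT', @out = @result OUTPUT;

SELECT @result;  -- e.g. Value1,Value2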