MSSQL Search inside a JSON - sql

How can I filter inside a JSON value in SQL Server?
I have a column called details:
{"test","source":"web"}
I want to filter by source.
What I did:
select * from TABLE_NAME
CROSS APPLY OPENJSON(details,'$.source')
where value ='web'

As per Zohar's comment, make your JSON valid, then something like:
--{"mode":"test","source":"web"}
select * from TABLE_NAME
CROSS APPLY
OPENJSON(details)
WITH (
    m varchar(256) '$.mode',
    s varchar(256) '$.source'
) j
where
    j.s = 'web'
But it might suit you better, and be simpler, to just use JSON_VALUE:
select * from TABLE_NAME
WHERE json_value(details, '$.source') = 'web'
Use CROSS APPLY OPENJSON if you want to turn each row's JSON into a pseudo-table shaped like the spec in the WITH clause. SQL Server behaves as if all the matching "rows" in each row's JSON are compounded into the pseudo-table and auto-joined to the source table according to which source row each bunch of JSON pseudo-rows came from.
Use JSON_VALUE if you only really want one value out of the JSON and can uniquely identify a single "row" in the JSON from which to get the value: either the JSON only has one "row" / is not a collection, or you want a "row" out of a JSON collection that can be referenced by a fixed path.
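For example, a minimal sketch contrasting the two; the nested sources array used here is hypothetical, purely to show a collection:
-- JSON_VALUE: one scalar, addressed by a fixed path
select * from TABLE_NAME
where JSON_VALUE(details, '$.sources[0].source') = 'web'

-- OPENJSON: expand the collection into a pseudo-table, then filter
select t.*, j.s
from TABLE_NAME t
CROSS APPLY OPENJSON(t.details, '$.sources')
WITH (s varchar(256) '$.source') j
where j.s = 'web'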

Related

BigQuery: add new key:val in json

How can I add a new key/val pair to an already existing JSON column in BigQuery using SQL (BigQuery flavor)?
BigQuery provides Data Manipulation Language (DML) statements such as the SQL Update statement. See:
https://cloud.google.com/bigquery/docs/reference/standard-sql/dml-syntax#update_statement
What you will have to do is retrieve the original value of your structured column and then run a SQL UPDATE statement that sets the column to the complete new value you want.
Take care to realize that BigQuery is an OLAP database and is optimized for queries rather than updates or deletes. Make sure you read the information on using DML statements in BigQuery found here:
https://cloud.google.com/bigquery/docs/reference/standard-sql/data-manipulation-language
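A minimal sketch of that approach, assuming a hypothetical table mydataset.mytable with an id column and a STRING column json_col holding the JSON text:
-- Hypothetical table and column names; the column is overwritten with the complete new JSON string.
UPDATE mydataset.mytable
SET json_col = '{"key1":"value1","key2":"value2","key3":"value3"}'
WHERE id = 1;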
I feel like this question is less about how to update the table and more about how to adjust existing JSON with an extra/new key:value pair (then either update the table or simply select it out).
So, I assume you have a table like below, and you might have another table with those new key:value pairs to use.
In case you don't really have a second table, you can just use a CTE like below:
with new_key_val as (
select 1 id, '{"key3":"value3"}' add_json union all
select 2 id, '{"key14":"value14"}'
)
So, having the above, you can use the approach below:
select *,
( select '{' || string_agg(trim(kv)) || ',' || trim(add_json, '{}') || '}'
from unnest(split(trim(json_col, '{}'), ',')) kv
) adjusted_json
from your_table
left join new_key_val
using(id)
with output containing the original columns plus the new adjusted_json column.
BigQuery supports JSON as a native data type but only offers a limited set of JSON functions. Unless your json data has a pre-defined, simple schema with known keys, you probably want to go the string-manipulation way.
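As a self-contained sketch (the your_table rows here are made-up sample data), the whole approach can be tried end to end like this:
with your_table as (
  select 1 id, '{"key1":"value1","key2":"value2"}' json_col union all
  select 2, '{"key5":"value5","key6":"value6","key7":"value7"}'
), new_key_val as (
  select 1 id, '{"key3":"value3"}' add_json union all
  select 2, '{"key14":"value14"}'
)
select *,
  ( select '{' || string_agg(trim(kv)) || ',' || trim(add_json, '{}') || '}'
    from unnest(split(trim(json_col, '{}'), ',')) kv
  ) adjusted_json
from your_table
left join new_key_val
using(id)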

Unable to extract property with SQL JSON_QUERY [duplicate]

This question already has an answer here:
Why is JSON_QUERY sending back a null value?
I have the following JSON data stored in a SQL table:
{"OrderNumber":"12450-OF","OrderType":"OF"}
I need to extract the OrderNumber from a SQL query.
The following statement returns null:
select
JSON_QUERY(Metadata,'$.OrderNumber') AS 'orderNumber'
from Documents
where Documents is my table, and metadata is the column where my json data is stored.
You need to use JSON_VALUE() to extract a scalar value from JSON content. JSON_QUERY() is usually used to extract an object or an array from a JSON string.
SELECT JSON_VALUE(Metadata,'$.OrderNumber') AS 'orderNumber'
FROM (VALUES
('{"OrderNumber":"12450-OF","OrderType":"OF"}')
) Documents (Metadata)
Note, that if you want to extract more values from the stored JSON, OPENJSON() with explicit schema is another option:
SELECT *
FROM Documents d
CROSS APPLY OPENJSON(d.Metadata, '$') WITH (
OrderNumber varchar(10) '$.OrderNumber',
OrderType varchar(2) '$.OrderType'
) j
You just need to replace the JSON_QUERY function with JSON_VALUE.
JSON_VALUE is used for getting a scalar value from JSON.
JSON_QUERY is used for getting an object or array from a JSON string.
For example, if you have:
"OrderNumber":["12450-OF","12450-02"]
then your query will return the object
["12450-OF","12450-02"]

Select all existing json fields from a postgres table

In my table mytable I have a json column called data, into which I inserted JSON with a lot of keys & values.
I know that it's possible to select individual fields like so:
SELECT data->'mykey' as mykey from mytable
But how can I get an overview of all the JSON keys at a certain depth? I would have expected something like
SELECT data->* from mytable
but that doesn't work. Is there something similar?
You can use the json_object_keys() function to get all the top-level keys of a json value:
SELECT keys.*
FROM mytable, json_object_keys(mytable.data) AS keys (mykey);
If you want to search at a deeper level, then first extract that deeper level from the json value using the #> operator:
SELECT keys.*
FROM mytable, json_object_keys(mytable.data #> '{level1, level2}') AS keys (mykey);
Note that the function returns a set of text, so you should invoke the function as a row source.
If you are using the jsonb data type, then use the jsonb_object_keys() function.
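For instance, a sketch that lists the distinct set of top-level keys used anywhere in the table, building on the same function and the same data column:
SELECT DISTINCT keys.mykey
FROM   mytable, json_object_keys(mytable.data) AS keys (mykey)
ORDER  BY 1;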

Can I get an average of values in a json array using postgres?

One of the great things about postgres is that it allows indexing into a json object.
I have a column of data formatted a little bit like this:
{"Items":
[
{"RetailPrice":6.1,"EffectivePrice":0,"Multiplier":1,"ItemId":"53636"},
{"RetailPrice":0.47,"EffectivePrice":0,"Multiplier":1,"ItemId":"53404"}
]
}
What I'd like to do is find the average RetailPrice of each row with these data.
Something like
select avg(json_extract_path_text(item_json, 'RetailPrice'))
but really I need to do this for each item in the items json object. So for this example, the value in the queried cell would be 3.285
How can I do this?
Could work like this:
WITH cte(tbl_id, json_items) AS (
SELECT 1
, '{"Items": [
{"RetailPrice":6.1,"EffectivePrice":0,"Multiplier":1,"ItemId":"53636"}
,{"RetailPrice":0.47,"EffectivePrice":0,"Multiplier":1,"ItemId":"53404"}]}'::json
)
SELECT tbl_id, round(avg((elem->>'RetailPrice')::numeric), 3) AS avg_retail_price
FROM cte c
, json_array_elements(c.json_items->'Items') elem
GROUP BY 1;
The CTE just substitutes for a table like:
CREATE TABLE tbl (
tbl_id serial PRIMARY KEY
, json_items json
);
json_array_elements() (Postgres 9.3+) to unnest the json array is instrumental.
I am using an implicit JOIN LATERAL here. Much like in this related example:
Query for element of array in JSON column
For an index to support this kind of query consider this related answer:
Index for finding an element in a JSON array
For details on how to best store EAV data:
Is there a name for this database structure?
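For reference, a sketch of the same query against the tbl table defined above, written with an explicit CROSS JOIN LATERAL (equivalent to the implicit comma-join used in the CTE example):
SELECT t.tbl_id, round(avg((elem->>'RetailPrice')::numeric), 3) AS avg_retail_price
FROM   tbl t
CROSS  JOIN LATERAL json_array_elements(t.json_items->'Items') elem
GROUP  BY t.tbl_id;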

Replacing JSON Formatted String in SQL

I have something like this in my table column:
{"InputDirection":0,"Mask":"AA","FormatString":null,"AutoCompleteValue":null,
"Filtered":"0123456789","AutoComplete":false,"ReadOnly":true}
What I want to do is change A to N in "Mask":"AA" and remove "Filtered":"0123456789" if they exist. Mask could be in different forms like A9A, AAAA, etc.
If it were in C# I could do it myself by parsing the JSON, but I need to do it within SQL.
I've found this article which shows how to parse JSON into a table. This gave me the idea that I could parse each field into a temp table, make the changes there, convert it back to JSON, and then update the actual column this JSON came from. However, this looks like a cumbersome process for both me and the server.
Any better ideas?
You can use this LINK.
And then use the following code
-- Shred the JSON into a temp table (parseJSON comes from the linked article)
select * into #demo from
(select * from parseJSON('{"InputDirection":0,"Mask":"AA","FormatString":null,"AutoCompleteValue":null,
"Filtered":"0123456789","AutoComplete":false,"ReadOnly":true}')) a

select * from #demo
--- CHANGE THE DATA HERE AS REQUIRED

-- JSONHierarchy is the table type from the same article; note the @ variable
DECLARE @MyHierarchy JSONHierarchy;
INSERT INTO @MyHierarchy
select * from #demo;

-- USE THIS VALUE AND UPDATE YOUR JSON COLUMN
SELECT dbo.ToJSON(@MyHierarchy)

drop table #demo
I may be getting something wrong here but why can’t you simply use REPLACE to update what’s needed and LIKE to identify JSON strings that should be updated?
update table_T
set json_string = REPLACE(json_string, '"Filtered":"0123456789",', '')
where json_string like '%"Mask":"AA"%'
Not sure I understand why you need to parse it.
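Another sketch, assuming the column always holds valid JSON and you are on SQL Server 2016 or later: the built-in JSON functions can do both changes in one statement (JSON_MODIFY deletes a key when the new value is NULL in the default lax mode):
UPDATE table_T
SET json_string =
    JSON_MODIFY(
        JSON_MODIFY(json_string, '$.Mask',
                    REPLACE(JSON_VALUE(json_string, '$.Mask'), 'A', 'N')),
        '$.Filtered', NULL)   -- NULL removes the Filtered key in lax mode
WHERE ISJSON(json_string) = 1
  AND JSON_VALUE(json_string, '$.Mask') LIKE '%A%'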