Parse JSON response and insert into Oracle - SQL

I am new to Oracle and facing a little challenge iterating through a JSON response.
I know there are many examples online that work on a JSON array element and then insert, but I tried various things and couldn't get this to work.
The JSON response looks like this:
{
  "data":
  {
    "Key_one" : "value_one",
    "Key_two" : "value_two"
  }
}
I have a stored proc:
CREATE OR REPLACE PROCEDURE TEST (JSON_TEXT_DATA IN CLOB) AS  -- the parameter needs a declared type; CLOB is one option
BEGIN
---- need a for loop here to dynamically iterate and insert.
INSERT INTO TABLE_NAME (KEY_ONE) VALUES (JSON_VALUE(JSON_TEXT_DATA, '$.data.Key_one'));
INSERT INTO TABLE_NAME (KEY_TWO) VALUES (JSON_VALUE(JSON_TEXT_DATA, '$.data.Key_two'));
END;

You can try using JSON_TABLE to insert the values into your table. Using this method will put both keys into a single row in your table.
INSERT INTO table_name (key_one, key_two)
SELECT key1, key2
FROM JSON_TABLE ('{
       "data":
       {
         "Key_one" : "value_one",
         "Key_two" : "value_two"
       }
     }',
     '$.data'
     COLUMNS (key1 VARCHAR2 PATH '$.Key_one',
              key2 VARCHAR2 PATH '$.Key_two'));
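Inside the stored procedure, the same pattern can read the JSON_TEXT_DATA parameter instead of a hard-coded literal. A rough sketch, assuming the parameter is declared as a CLOB and using the table and column names from the question:
CREATE OR REPLACE PROCEDURE test (json_text_data IN CLOB) AS
BEGIN
  -- JSON_TABLE turns the "data" object into a relational row, which is then inserted
  INSERT INTO table_name (key_one, key_two)
  SELECT jt.key1, jt.key2
  FROM JSON_TABLE (json_text_data, '$.data'
         COLUMNS (key1 VARCHAR2(4000) PATH '$.Key_one',
                  key2 VARCHAR2(4000) PATH '$.Key_two')) jt;
END;
/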


Returning a declared array in stored procedure

I'm currently stuck on a problem with a stored procedure. I have to return an array / list of calculated data from that stored procedure.
I'm retrieving it on the server with Entity Framework, but the problem isn't there.
I want to return data shaped like this:
field1 : int,
field2 : string,
field3 : int,
calculatedData : List<int>
At the moment my procedure looks like this:
CREATE PROCEDURE [dbo].[SP_GDF_UNITE_Previsionnel]
AS
DECLARE @calculatedData TABLE(jour int, valeur int)
SELECT
field1,
field2 ,
field3
FROM
(... content )
...
I would like to return calculatedData as well, but I don't have any idea how to do it.
Maybe by changing my procedure's structure?
Hope someone can help me do it.
I tried to include the array in my SELECT, but it doesn't work, and I didn't find any information on how to do it on the internet. The SELECT ... FROM has to be in the procedure.
method 1
Return a JSON structure from your SELECT statement. The nested JSON will act like an array in your JSON data. Based on your example, the JSON subquery can have one or more records.
You can then convert the JSON data to a dictionary in Python and retrieve the values recursively.
select
Field1
,Field2
,Field3
,(select * from dbo.myTable myTable2
where myTable2.Field1=myTable.Field1
for json path) as data
FROM dbo.myTable myTable
where myTable.Field1=123
for json path
method 2
Try creating a temp table (here, a table variable) instead of an array, then insert values into it.
declare @tmp_table as table(newid uniqueidentifier, field1 varchar(10));
insert into @tmp_table(newid, field1) values(newid(), 'a'), (newid(), 'b')
select * from @tmp_table
You can now use the newid as a hash key for updating the table, similar to a dictionary, which simulates an array; see the sketch below.
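To make that concrete, here is a minimal, self-contained sketch (the column names and values are just placeholders): an element is looked up through its newid and updated in place, much like a dictionary entry.
declare @tmp_table table (newid uniqueidentifier, field1 varchar(10));

insert into @tmp_table (newid, field1) values (newid(), 'a'), (newid(), 'b');

-- look the "element" up by its key, then update it, much like dict[key] = value
declare @key uniqueidentifier = (select top 1 newid from @tmp_table where field1 = 'a');
update @tmp_table set field1 = 'z' where newid = @key;

select * from @tmp_table;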

What is the right way to handle string null values in SQL Server's BULK INSERT?

For example, I have a column with type int.
The raw data source has integer values, but the null values, instead of being empty (''), are 'NIL'.
How would I handle those values when trying to BULK INSERT into MSSQL?
My code is
create table test (nid INT);
bulk insert test from #FILEPATH with (format = 'CSV', firstrow = 2);
the first 5 rows of my .csv file looks like
1
2
3
NIL
7
You can replace the NIL with '' (an empty string) directly in your data source file, or insert the data into a staging table and transform it:
BULK INSERT staging_sample_data
FROM '\\data\sample_data.dat';
INSERT INTO [sample_data]
SELECT NULLIF(ColA, 'NIL'), NULLIF(ColB, 'NIL'),...
Of course, if your field is numeric, the staging table should have a string field. Then you can do as Larnu suggests: TRY_CONVERT(INT, ColA).
Note: if there are default constraints, you may need to check how to keep nulls.
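Putting the staging step and the conversion together, the whole load might look roughly like this (a sketch only: staging_sample_data comes from the answer above, test/nid from the question, and the file path is a placeholder):
-- staging table stores the raw text so 'NIL' can be loaded as-is
CREATE TABLE staging_sample_data (nid VARCHAR(20));

BULK INSERT staging_sample_data
FROM '\\data\sample_data.dat'
WITH (FORMAT = 'CSV', FIRSTROW = 2);

-- NULLIF turns 'NIL' into NULL; TRY_CONVERT also returns NULL for anything that is not a valid INT
INSERT INTO test (nid)
SELECT TRY_CONVERT(INT, NULLIF(nid, 'NIL'))
FROM staging_sample_data;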

JSON_QUERY with FOR JSON PATH

Please see the table given below. The table contains JSON strings, and I need to create a JSON array from those strings. But when I use JSON_QUERY with FOR JSON PATH, it adds an additional header (the alias name or the source column name). How can I generate the JSON array without the alias or source column name?
Please see the example given below.
DECLARE @jsonTbl TABLE (id INT, json VARCHAR(MAX))
INSERT INTO @jsonTbl (id, json) VALUES (1,'{"id":"1A", "names":{"firstname":"Name1"}}')
INSERT INTO @jsonTbl (id, json) VALUES (1,'{"id":"2A", "names":{"firstname":"Name2"}}')
SELECT JSON_QUERY(json) AS 'someName'
FROM @jsonTbl
FOR JSON AUTO
--When I use the above select query it returns the data as
[{"someName":{"id":"1A", "names":{"firstname":"Name1"}}},{"someName":{"id":"2A", "names":{"firstname":"Name2"}}}]
Formatted JSON:
[
  {
    "someName":{
      "id":"1A",
      "names":{"firstname":"Name1"}
    }
  },
  {
    "someName":{
      "id":"2A",
      "names":{
        "firstname":"Name2"
      }
    }
  }
]
--But I need the result as follows. I do not need someName:
[
  {
    "id":"1A",
    "names":{
      "firstname":"Name1"
    }
  },
  {
    "id":"2A",
    "names":{
      "firstname":"Name2"
    }
  }
]
You can use OPENJSON() together with CROSS APPLY
SELECT j.[id], j.[names]
FROM @jsonTbl t
CROSS APPLY OPENJSON(t.json, '$') WITH ([id] VARCHAR(100),
                                        [names] NVARCHAR(MAX) AS JSON) j
FOR JSON AUTO
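As a side note, since each row already holds a valid JSON document, the bare array can also be built by concatenating the strings directly, assuming SQL Server 2017 or later for STRING_AGG (a sketch using the same @jsonTbl table variable):
-- stitch the stored JSON documents into one array without re-parsing them
SELECT '[' + STRING_AGG(t.json, ',') + ']' AS json_array
FROM @jsonTbl t;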

How to add a JSON object from a nested JSON into a DB column in Oracle DB

I need to insert JSON from a nested JSON file into a column in Oracle. For example, in the following JSON:
{
"name":"John",
"age":30,
"cars": {
"car1":"Ford",
"car2":"BMW",
"car3":"Fiat"
}
}
I need to store the whole json:
"cars": {
"car1":"Ford",
"car2":"BMW",
"car3":"Fiat"
}
in a DB column. How can I do that? I am using Oracle DB.
I have tried the following query, but it's not working (it says CLOB isn't a valid datatype):
select x.*
from json_tab t,
json_table (t.json_data, '$.[*]'
COLUMNS
name VARCHAR2(4000) PATH '$.name',
cars clob PATH '$.cars[*]') x;
I have tried the same using the VARCHAR2 datatype, but it selects NULL.
Assuming you use Oracle 12c, try experimenting with this query to achieve the result you need.
select x.*, json_object(key 'cars' value x.cars format json) cars_json
from json_tab t,
     json_table(t.json_data, '$'
       COLUMNS (name VARCHAR2(4000) PATH '$.name',
                cars VARCHAR2(4000) format json PATH '$.cars[*]')) as x;
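If only the cars object is needed, JSON_QUERY can also pull the nested fragment out directly (a sketch against the same json_tab table; JSON_QUERY is available in recent 12c releases and later):
-- JSON_QUERY returns the matching JSON fragment (the whole "cars" object) as text
select t.json_data,
       json_query(t.json_data, '$.cars') as cars_json
from json_tab t;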

Why is my insert statement not working in Postgres when trying to insert JSON?

Why is my insert statement not working?
here is my code
http://sqlfiddle.com/#!17
create table json_check (
id serial primary key not null ,
body jsonb
);
insert into json_check (body)
values ('{
"test":"naveeb",
"data":"{'a':'ss'}"}')
It is showing me a syntax error.
It seems that your JSON is invalid.
insert into json_check (body)
values ('{
"test": "naveeb",
"data": {
"a": "ss"
}
}');
You can check whether your JSON is valid at https://jsonlint.com/.
A running example is at the link below:
http://sqlfiddle.com/#!17/80ca3/6
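Once the corrected JSON is stored, the nested value can be read back with the jsonb operators, for example (a small sketch against the same table):
-- -> returns jsonb, ->> returns text
select body -> 'data'         as data_object,
       body -> 'data' ->> 'a' as a_value
from json_check;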