One of my tables contains JSON values in each row of a column.
The data is as below (example of one row):
[{"id":"30a66bec-c0aa-4655-a8ef-506e52bfcc14","type":"nps","value":"promoter","decimalValue":"10"},{"id":"37850b3b-1eac-4921-ae22-b2f6d2450897","type":"sentiment","value":"positive","decimalValue":"0.990000009536743"}]
Now I'm trying to retrieve two columns from it (id and value).
I'm writing the query below using JSON_VALUE, but I'm getting NULL values in every row of the new column:
select a.jsondata,JSON_VALUE(a.jsondata,'$.id') from table as a
Your JSON field is an array, so you need to specify which element you're after. Assuming it's always the first, you can use:
select a.jsondata,JSON_VALUE(a.jsondata,'$[0].id') from table as a
You need to change the index inside the square brackets to access the id you want from the JSON string.
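For example, with the sample row above, the first element's id and the second element's value could be read like this (yourtable is just a placeholder name):
select a.jsondata,
       JSON_VALUE(a.jsondata, '$[0].id') as nps_id,
       JSON_VALUE(a.jsondata, '$[1].value') as sentiment_value --'positive' for the sample row
from yourtable as a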
You have a JSON array. If you want to break it into multiple rows, you need to use OPENJSON:
SELECT j.*
FROM YourTable t
CROSS APPLY OPENJSON (t.Json)
    WITH (
        id uniqueidentifier,
        value varchar(100)
    ) j;
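For the sample row in the question, that would return two rows, roughly like this (the column names come from the WITH clause):
id                                     value
30a66bec-c0aa-4655-a8ef-506e52bfcc14   promoter
37850b3b-1eac-4921-ae22-b2f6d2450897   positive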
If cross apply is used for a function, but the query also uses a join, how does SQL determine which records to use for the function?
I have a table which contains some JSON strings, and I'm using OPENJSON to parse them. I'm also using a join to determine which records to pass to the OPENJSON function. However, it errors on records that aren't in the selection.
For example, in this snippet I add two JSON strings to a table, one of which is good and the other truncated. I then try to parse the good row using OPENJSON, but it fails on the bad row, even though that row isn't being selected:
--create table to store JSON strings
declare @jsontable table (id int identity,
                          JSONstring nvarchar(2000));

--add one good row, and one bad row
INSERT into @jsontable (JSONstring)
values
('{"RowNumber":1,"Field1":"Hello","Field2":"World"}'), --First row good
('{"RowNumber":2,"Field1":"Hello"'); --Second row truncated

--create table with ID of the good record
DECLARE @IDsToSelect table (id int);
INSERT into @IDsToSelect (id)
values
(1);

--parse the json in the good record
select * from @jsontable jst
inner join @IDsToSelect its
    on jst.id = its.id
cross apply openjson(jst.JSONstring);
The above example gives a 'JSON text is not properly formatted' error, and it fails because of the bad JSON in row 2, even though I'm only joining to row 1.
Does SQL apply the OPENJSON function to the entire @jsontable table, and then apply the inner join to those results? If so, how should I structure this query so that it only parses the JSON of the selected row?
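One common workaround (a sketch, not part of the original question) is to guard the parse so that OPENJSON never sees invalid JSON, no matter which rows the optimizer chooses to evaluate it against:
select *
from @jsontable jst
inner join @IDsToSelect its
    on jst.id = its.id
cross apply openjson(case when isjson(jst.JSONstring) = 1
                          then jst.JSONstring
                          else '{}' end) --rows with bad JSON parse as an empty object instead of raising an error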
I have a table that contains a list of XML tags/values. I need to join those to another table to retrieve their actual values and display the result as a CSV list.
Example varchar data:
<choice id="100"/><choice id="101"/><choice id="102"/>
However, these values actually translate to other values: red, white, blue respectively. I need to convert that list to the following list:
red,white,blue
As a recap, the "source" table column is varchar, and contains a list of xml attribute values, and those values translate to other values by joining to another table. So the other table has a primary key of id (int) with rows for 100,101,102. Each of those rows has values red,white,blue respectively. I hope this makes enough sense.
Here is the ddl to set up the scenario:
create table datatable(
    id int,
    data nvarchar(449),
    primary key (id)
);
insert into datatable(id, data)
values(1,'<choice id="100"/><choice id="101"/><choice id="102"/>')
,(2,'<choice id="100"/>')
,(3,'<choice id="101"/>')
,(4,'<choice id="102"/>');
create table choicetable(
    id int,
    choicevalue nvarchar(449),
    primary key (id)
);
insert into choicetable(id, choicevalue)
values(100,'red')
,(101,'white')
,(102,'blue');
This would be the first time I've tried parsing XML in this manner, so I'm a little stumped as to where to start. Also, I do not have control over the database I am retrieving the data from (3rd party software).
Without proper sample data it's hard to give an exact query, but you would do something like this:
Use CROSS APPLY to convert the varchar to xml
Use .nodes to shred the XML into separate rows.
Join using .value to get the id attribute
Group up, and concatenate using STRING_AGG. You may not need GROUP BY depending on your situation.
SELECT
    xt.Id,
    STRING_AGG(ot.Value, ',')
FROM XmlTable xt
CROSS APPLY (SELECT CAST(xt.XmlColumn AS xml)) v(XmlData)
CROSS APPLY v.XmlData.nodes('/choice') x1(choice)
JOIN OtherTable ot ON ot.Id = x1.choice.value('@id','int')
GROUP BY
    xt.Id;
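Adapted to the sample tables from the question, that would look roughly like this (a sketch; STRING_AGG needs SQL Server 2017 or later):
SELECT
    dt.id,
    STRING_AGG(ct.choicevalue, ',') AS choicelist
FROM datatable dt
CROSS APPLY (SELECT CAST(dt.data AS xml)) v(XmlData)
CROSS APPLY v.XmlData.nodes('/choice') x1(choice)
JOIN choicetable ct ON ct.id = x1.choice.value('@id','int')
GROUP BY
    dt.id;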
I would advise you to store XML data in an xml typed column if at all possible.
I have a column "elements" in a table; each of its rows holds a JSON array that looks like this:
elements
[{"key":12,"value":"qw"},{"key":13,"value":"fa"}]
[{"key":32,"value":"24"},{"key":321,"value":"21"}]
I want to build a column of arrays, one per row, consisting of the keys extracted from that row's JSON values. The desired column "result" would look like this:
elements                                             result
[{"key":12,"value":"qw"},{"key":13,"value":"fa"}]    {12,13}
[{"key":32,"value":"24"},{"key":321,"value":"21"}]   {32,321}
Is there a way to do this? Thank you.
Schema (PostgreSQL v13)
CREATE TABLE test (
elements json
);
INSERT INTO test VALUES ('[{"key":12,"value":"qw"},{"key":13,"value":"fa"}]');
INSERT INTO test VALUES ('[{"key":32,"value":"24"},{"key":321,"value":"21"}]');
Query #1
select elements::text, array_agg(cast(value->>'key' as integer)) as result
from test, json_array_elements(elements)
group by 1
ORDER BY 1;
elements                                             result
[{"key":12,"value":"qw"},{"key":13,"value":"fa"}]    12,13
[{"key":32,"value":"24"},{"key":321,"value":"21"}]   32,321
A shorter variant, without the cast (this returns a text[] rather than an integer[] array):
select elements::text,
       array_agg(value->>'key')
from your_table, json_array_elements(elements)
group by 1;
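If the order of the keys inside each result array matters, a possible variant (a sketch, not from the original answers) keeps the elements in their original array order:
select elements::text,
       array_agg((elem->>'key')::int order by ord) as result
from test
cross join lateral json_array_elements(elements) with ordinality as e(elem, ord)
group by 1;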
I have a table with a PK column and another id column. In some cases I need to insert a record with equal values in both columns. For the primary key values I'm using a sequence, which gives me a Field<Long> from Sequences.MY_SEQ.nextval().
How can I extract the value from the Field<Long> so that I'm guaranteed to insert the same id into both columns? Using the Field<Long> in the insert clause generates two different ids in the columns.
Here is the solution:
Long id = dsl.select(Sequences.MY_SEQ.nextval()).fetchOne().value1();
Your own solution works, of course, but it will generate two round trips to the database. One for fetching the sequence value and another one for the insert. If that's not a problem, perfect. Otherwise, you can still do it in one single query using INSERT .. SELECT:
In SQL:
(using Oracle syntax. Your SQL syntax may vary...)
INSERT INTO my_table (col1, col2, val)
SELECT t.id, t.id, 'abc'
FROM (
    SELECT my_seq.nextval AS id
    FROM dual
) t
With jOOQ
Table<?> t = table(select(MY_SEQ.nextval().as("id"))).as("t");

dsl.insertInto(MY_TABLE)
   .columns(MY_TABLE.COL1, MY_TABLE.COL2, MY_TABLE.VAL)
   .select(
        select(t.field("id"), t.field("id"), val("abc"))
        .from(t))
   .execute();
I want to insert data into one table from another table. For some of the columns I don't have data, so I want to set those columns to NULL. How should I do this?
This is the SQL:
INSERT INTO _21Appoint(
PCUCODE,PID,SEQ,
DATE_SERV,APDATE,
APTYPE,APDIAG,D_UPDATE,CID
) SELECT (
NULL,NULL,NULL,
treatment_date,appointment_date,
typeap_id,appointment_id,NULL,patient_id
) FROM cmu_treatment,cmu_appointment
WHERE cmu_treatment.treatment_id LIKE cmu_appointment.treatment_id;
Your insert is essentially correct. Just don't put the column list in parentheses:
INSERT INTO _21Appoint
(PCUCODE,PID,SEQ,DATE_SERV,APDATE,APTYPE,APDIAG,D_UPDATE,CID)
SELECT NULL,NULL,NULL,treatment_date,appointment_date,typeap_id,appointment_id,NULL,patient_id
FROM cmu_treatment,cmu_appointment
WHERE cmu_treatment.treatment_id LIKE cmu_appointment.treatment_id;
In Postgres (unlike other DBMSs), putting the select list in parentheses makes the result a single "record" rather than individual columns. The SELECT therefore returns only one column, not several, and so it doesn't match the column list of the INSERT.
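To illustrate (a minimal sketch, not part of the original answer):
-- one column of an anonymous record type: (1,2,3)
SELECT (1, 2, 3);
-- three separate columns: 1 | 2 | 3
SELECT 1, 2, 3;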
Another option is to simply leave out the columns completely:
INSERT INTO _21Appoint
(DATE_SERV,APDATE,APTYPE,APDIAG,CID)
SELECT treatment_date,appointment_date,typeap_id,appointment_id,patient_id
FROM cmu_treatment,cmu_appointment
WHERE cmu_treatment.treatment_id LIKE cmu_appointment.treatment_id;