ORACLE JSON_VALUE to build a view. Error "single-row subquery returns more than one row"

I'm writing a view where I transform a bunch of data into a cleaner state to work with, but I'm having some trouble with a couple of fields that are basically JSON stuffed into columns. They're not large, and I've figured out how to get some of it with json_value, but the other column has an array within the JSON, like this:
{"field1":"string","field2":0,"field3":[{"field1":"string","field2":true}]}
The data I need is the part in field3.
I've come up with a way of getting the portion of data I need raw (the contents of field3), and the idea is to then extract the data from it with json_value.
--code I figured out
select rtrim(ltrim(field3_raw, '['), ']')
FROM json_table(
       '{"field1":"string","field2":0,"field3":[{"field1":"string","field2":true}]}' format json,
       '$' columns (field3_raw varchar2 format json path '$.field3')
     );
which outputs
{"field1":"string","field2":true}
But the bad news comes when I swap the raw JSON literal for a query against the table, like this:
Extract of the table I want to query:
|METADATA|
{"field1":"string1","field2":1,"field3":[{"field1":"string11","field2":true}]}
{"field1":"string2","field2":2,"field3":[{"field1":"string22","field2":true}]}
{"field1":"string3","field2":3,"field3":[{"field1":"string33","field2":true}]}
{"field1":"string4","field2":4,"field3":[{"field1":"string44","field2":true}]}
Query used:
select metadata_raw
FROM json_table(
       (SELECT metadata FROM table) format json,
       '$' columns (METADATA_RAW varchar2 format json path '$.field3')
     );
Just to get this error:
SQL Error [1427] [21000]: ORA-01427: single-row subquery returns more than one row
Is there a way to retrieve, in one query, all the rows transformed the way I did it with json_table for a single literal, so I can use it to build the column with the data I need in the view?
Additional details: Oracle 19c
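A minimal sketch of the usual fix (hypothetical names my_table and metadata stand in for the real table and column): join the table to json_table and pass the column directly, so json_table is applied once per row, instead of feeding it a scalar subquery that returns one row per table row.
-- sketch: drive json_table from the column via a lateral join in the FROM clause
select rtrim(ltrim(jt.metadata_raw, '['), ']') as field3_raw
from my_table t,
     json_table(
       t.metadata format json,
       '$' columns (metadata_raw varchar2(4000) format json path '$.field3')
     ) jt;
Passed this way, Oracle evaluates json_table against each row's metadata value, so no single-row subquery is involved and the same shape works inside a view definition.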

Related

How can you filter Snowflake EXPLAIN USING TABULAR output when it's embedded in the TABLE function? Can you filter it with anything?

I have a table named Posts that I would like to count and profile in Snowflake using the current Snowsight UI.
When I return the results via EXPLAIN USING TABULAR, I am able to return the set with a combination of the TABLE, RESULT_SCAN, and LAST_QUERY_ID functions, but any predicate, filter, or column reference seems to fail.
Is there a valid way to do this in Snowflake with the TABLE function, or is there another way to query the output of EXPLAIN USING TABULAR?
-- Works
EXPLAIN using TABULAR SELECT COUNT(*) from Posts;
-- Works
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) as t;
-- Does not work
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) as t where operation = 'GlobalStats';
-- invalid identifier 'OPERATION', the column does not seem recognized.
I tried the third example and expected the predicate to apply to the function output. I don't understand why the filter works on some TABLE() results and not others.
You need to double-quote the column name:
where "operation"=
From the documentation:
Note that because the output column names from the DESC USER command were generated in lowercase, the commands use delimited identifier notation (double quotes) around the column names in the query to ensure that the column names in the query match the column names in the output that was scanned.
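Applied to the failing example above, that would look something like this (a sketch; it assumes the EXPLAIN output column really is named operation in lowercase, as the quoted documentation describes):
-- same query as the "does not work" example, but with the column name in double quotes
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) AS t WHERE t."operation" = 'GlobalStats';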

Handling JSON data in Snowflake

I have a table in which each row contains JSON file data, and it gets updated in my Snowflake table every week. I am extracting values from the JSON files into another table. When the data is loaded in JSON format there are multiple entries with the same ID, so when I extract values from the JSON into a table there are duplicate rows. How do I handle them so that I get only the distinct rows? My select query looks something like this:
select
    json_data:data[0].attributes."Additional Invoice?"::string       as "Additional Invoice?",
    json_data:data[0].attributes."Additional PO?"::string            as "Additional PO?",
    json_data:data[0].attributes."Aggregate Contract Value"::number  as "Aggregate Contract Value",
    json_data:data[0].attributes."Annualized Baseline Spend"::number as "Annualized Baseline Spend",
    json_data:data[0].id::number                                     as ID,
    json_data:data[0].type::string                                   as TYPE
from scout_projects
order by ID
A screenshot of the scout_projects file is attached.
The attached screenshot is the output from the given query, and as you can see the ID column is the same but there are only 2 unique rows. I want my query to return only those 2 unique rows.
select distinct json_data:data[0].id :: number as ID from scout_projects
What approach should I take?
I tried using a subquery, but it gave me the error "single-row subquery returns more than one row" (Snowflake error), which is understandable, so I need a way around it.
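A hedged sketch (the screenshot isn't available here, so the column list is trimmed): if the duplicate rows are fully identical, SELECT DISTINCT over the whole projection is enough; otherwise Snowflake's QUALIFY with ROW_NUMBER() keeps exactly one row per extracted ID.
-- sketch: keep one row per extracted ID (column list shortened for brevity)
select
    json_data:data[0].attributes."Additional Invoice?"::string as "Additional Invoice?",
    json_data:data[0].attributes."Additional PO?"::string      as "Additional PO?",
    json_data:data[0].id::number                                as ID
from scout_projects
qualify row_number() over (partition by json_data:data[0].id
                           order by json_data:data[0].id) = 1
order by ID;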

How to use json array in WHERE IN clause in Postgres

I have a Postgres query like this
SELECT * FROM my_table WHERE status IN (2,1);
This is part of a bigger query, but I am facing an issue with the WHERE IN part here. I am using this query inside a function and the input parameters are in JSON format. The status values I am getting are in the form of a JSON array, like status=[2,1]. I need to use this array in the WHERE clause of the query and I'm not sure how to do that. Currently I am using something like
SELECT * FROM my_table WHERE status IN (array([2,1]));
But this is giving me an error. The status column is of the smallint data type. I know this is simple, but I am very new to Postgres and could not figure out a way to use the JSON array in the WHERE IN clause. Any help will be appreciated.
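A minimal sketch of one common approach, assuming the parameter arrives as jsonb (the literal '[2,1]' below stands in for it): expand the array with jsonb_array_elements_text, cast each element to smallint, and filter with IN against that set.
-- sketch: turn the JSON array into a set of smallint values and filter on it
SELECT *
FROM my_table
WHERE status IN (
    SELECT t.elem::smallint
    FROM jsonb_array_elements_text('[2,1]'::jsonb) AS t(elem)
);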

ORA-31013 when reading a JSON_VALUE from a JSON_TABLE

When I query a JSON_TABLE for a value using a JSON_VALUE expression (and not a COLUMN in the table expression) such as:
WITH SAMPLE_TABLE AS (
SELECT '{"a":[{"b":"foo"},{"b":"bar"}]}' AS PAYLOAD FROM DUAL
)
SELECT JSON_VALUE(SUB, '$.b')
FROM SAMPLE_TABLE, JSON_TABLE(
PAYLOAD,
'$.a[*]'
COLUMNS (SUB CLOB FORMAT JSON PATH '$')
);
I get an error saying that I am using an invalid XPATH expression (ORA-31013: Invalid XPATH expression). The message alone confuses me, but if I change the select to JSON_VALUE(TO_CHAR(SUB), '$.b'), the query works and shows two rows, foo and bar, which confuses me even more.
I do not have any such problems when using the XML equivalent in Oracle, where such selects just work. I am not using COLUMNS because this approach lets me reuse a lot of what I already have for XML; beyond that, I am curious what is wrong here. I am using Oracle Database 12c Enterprise Edition Release 12.2.0.1.0.
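One workaround sketch, consistent with the TO_CHAR observation above but not verified on 12.2: declare the nested column as VARCHAR2 FORMAT JSON instead of CLOB, so JSON_VALUE receives a plain character value (this assumes the JSON fragments fit in 4000 bytes).
WITH SAMPLE_TABLE AS (
  SELECT '{"a":[{"b":"foo"},{"b":"bar"}]}' AS PAYLOAD FROM DUAL
)
SELECT JSON_VALUE(SUB, '$.b')
FROM SAMPLE_TABLE, JSON_TABLE(
  PAYLOAD,
  '$.a[*]'
  COLUMNS (SUB VARCHAR2(4000) FORMAT JSON PATH '$')  -- VARCHAR2 instead of CLOB
);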

Searching an element in top-level json array oracle

I have an array of strings stored in an Oracle column as a JSON array in the following format:
["abc", "xyz"]
["cde", "fgh"]
["xyz"]
I have to write a query to check whether a given string is present in any of the arrays in any of the rows. In the above example I would like to check whether "xyz" is present. What should the JSON path be? I know I can use a LIKE clause, but I don't think that is a neat way to do it.
Also, why is the query SELECT JSON_QUERY(my_column, '$[*]') FROM my_table always returning null?
I did the following test; this may be what you are looking for:
create table t (json_v varchar2(40));

insert into t values ('["abc", "xyz"]');
insert into t values ('["cde", "fgh"]');
insert into t values ('["xyz"]');

SELECT *
from t, json_table(t.json_v, '$[*]' columns (value PATH '$'))
WHERE value = 'xyz';
Output Result

JSON_V           value
["abc", "xyz"]   xyz
["xyz"]          xyz
As for your second question, about why the query always returns null: you have to wrap the values; see the JSON_QUERY syntax with WITH WRAPPER:
SELECT JSON_QUERY(json_v, '$[*]' WITH WRAPPER) AS value FROM myTable;