Oracle: search an array within JSON values (SQL)

I have the following stored in an Oracle database as JSON:
{
value: [1,2,3]
}
The value can be of any type (strings, integers or arrays). How would I query if the type is array and if it contains a certain value?
In pseudocode:
SELECT * FROM TABLE WHERE COLUMN_NAME.value CONTAINS 2
I can see how to query strings using Oracle functions such as json_query but cannot see how to run this specific type of query without selecting all data and searching on the client.

You may use JSON_TABLE in the FROM clause, defining the columns, and then filter rows on them in the WHERE clause.
--Test data
with t (id, j) as (
  select 1, to_clob(
    '{
       value : [1,2,3]
     }')
  from dual
)
--Test data ends--
select t.id, tbl.val
from t
cross join json_table(j, '$.value[*]' columns (val varchar2(100) path '$')) tbl
where tbl.val = 2
ID VAL
------ -------
1 2
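If you are on Oracle 12.2 or later, a JSON_EXISTS filter can express the same check without unnesting the array; a minimal sketch, assuming hypothetical names my_table and json_col for your table and JSON column:
-- @ refers to each element of the value array; the row matches if any element equals 2
SELECT *
FROM my_table
WHERE JSON_EXISTS(json_col, '$.value?(@ == 2)');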

Related

SQL: find elements in a JSON array column

Hello, I have a table column that contains a JSON array of tags, and I am trying to find a way to select the rows whose array contains some of the given values:
ID  tags
1   ["test01"]
2   ["test02","test03"]
I tried using JSON_QUERY() and JSON_VALUE():
SELECT *
FROM table_tags
WHERE JSON_QUERY(tags,'$') IN ('test01', 'test02')
returns nothing, but with JSON_VALUE() on the first array element [0]:
SELECT *
FROM table_tags
WHERE JSON_VALUE(tags,'$[0]') IN ('test01', 'test02')
it returns only the first row, the one with test01:
ID  tags
1   ["test01"]
I need a way to iterate through the tags so I can find all rows with any tag in ('test01', 'test02').
You need an OPENJSON() call to parse the stored JSON array. The result is a table with columns key, value and type, where the value column holds each element of the parsed array. That column's data type is nvarchar(max), with the same collation as the tags column.
SELECT *
FROM (VALUES
(1, '["test01"]'),
(2, '["test02","test03"]')
) table_tags (id, tags)
WHERE EXISTS (
SELECT 1 FROM OPENJSON(tags) WHERE [value] IN ('test01', 'test02')
)
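Applied to the table from the question (a sketch, assuming tags is an nvarchar column holding the JSON text), the same pattern looks like:
SELECT tt.*
FROM table_tags tt
WHERE EXISTS (
    -- OPENJSON returns one row per array element; [value] holds the element text
    SELECT 1 FROM OPENJSON(tt.tags) WHERE [value] IN ('test01', 'test02')
);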

Compare two arrays in PostgreSQL

I have a table in postgres with a value column that contains string arrays. My objective is to find all arrays that contain any of the following strings: {'cat', 'dog'}
id value
1 {'dog', 'cat', 'fish'}
2 {'elephant', 'mouse'}
3 {'lizard', 'dog', 'parrot'}
4 {'bear', 'bird', 'cat'}
The following query uses ANY() to check if 'dog' is equal to any of the items in each array and will correctly return rows 1 and 3:
select * from mytable where 'dog'=ANY(value);
I am trying to find a way to search value for any match in an array of strings. For example :
select * from mytable where ANY({'dog', 'cat'})=ANY(value);
Should return rows 1, 3, and 4. However, the above code throws an error. Is there a way to use the ANY() clause on the left side of this equation? If not, what would be the workaround to check if any of the strings in an array are in value?
You can use the && operator to find out whether two arrays overlap. It returns true if the arrays have at least one element in common.
Schema and insert statements:
create table mytable (id int, value text[]);
insert into mytable values (1,'{"dog", "cat", "fish"}');
insert into mytable values (2,'{"elephant", "mouse"}');
insert into mytable values (3,'{"lizard", "dog", "parrot"}');
insert into mytable values (4,'{"bear", "bird", "cat"}');
Query:
select * from mytable where array['dog', 'cat'] && (value);
Output:
id | value
---+---------------------
1  | {dog,cat,fish}
3  | {lizard,dog,parrot}
4  | {bear,bird,cat}
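If you instead need the rows whose array holds both values, the containment operator @> applies; a sketch against the same table:
-- @> requires every element of the right-hand array to be present in value
select * from mytable where value @> array['dog', 'cat'];
-- returns only row 1 ({dog,cat,fish}) for the sample data above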

Query JSON data from Oracle 12.1 where field values contain "."

I have a table that has JSON data stored and I'm using json_exists functions in the query. Below is my sample data from the column for one of the rows.
{"fields":["query.metrics.metric1.field1",
"query.metrics.metric1.field2",
"query.metrics.metric1.field3",
"query.metrics.metric2.field1",
"query.metrics.metric2.field2"]}
I want all rows that have a particular field, so I'm trying the query below.
SELECT COUNT(*)
FROM my_table
WHERE JSON_EXISTS(fields, '$.fields[*]."query.metrics.metric1.field1"');
It does not give me any results back. Not sure what I'm missing here. Please help.
Thanks
You can use a filter expression in which @ refers to the current element of the fields array, such as
SELECT *
FROM my_table
WHERE JSON_EXISTS(fields, '$.fields?(@ == "query.metrics.metric1.field1")')
Edit: the above works for 12.2+ (12cR2). Since it doesn't work for your version (12cR1), try JSON_TABLE() instead, such as
SELECT fields
FROM my_table,
JSON_TABLE(fields, '$.fields[*]' COLUMNS ( js VARCHAR2(90) PATH '$' ))
WHERE js = 'query.metrics.metric1.field1'
I have no idea how to "pattern match" on the array element, but just parsing the whole thing and filtering does the job.
with t(x, json) as (
select 1, q'|{"fields":["a", "b"]}|' from dual union all
select 2, q'|{"fields":["query.metrics.metric1.field1","query.metrics.metric1.field2","query.metrics.metric1.field3","query.metrics.metric2.field1","query.metrics.metric2.field2"]}|' from dual
)
select t.*
from t
where exists (
select null
from json_table(
t.json,
'$.fields[*]'
columns (
array_element varchar2(100) path '$'
)
)
where array_element = 'query.metrics.metric1.field1'
);
In your code, you are accessing the field "query.metrics.metric1.field1" of an object in the fields array, and there is no such object (the elements are strings)...
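To see the difference, the original path does match when the array holds objects keyed by that name rather than plain strings; a hypothetical illustration:
-- matches: the element is an object with a field named "query.metrics.metric1.field1"
-- does not match the question's data, where the elements are plain strings
SELECT COUNT(*)
FROM (SELECT '{"fields":[{"query.metrics.metric1.field1": 1}]}' AS fields FROM dual)
WHERE JSON_EXISTS(fields, '$.fields[*]."query.metrics.metric1.field1"');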

SQL query that searches all rows with specific items in an array column

My SQL case, in a postgres:9.6 database:
CREATE TABLE my_table (
id serial PRIMARY KEY,
numbers INT []
);
INSERT INTO my_table (numbers) VALUES ('{2, 3, 4}');
INSERT INTO my_table (numbers) VALUES ('{2, 1, 4}');
-- which means --
test=# select * from my_table;
id | numbers
----+---------
1 | {2,3,4}
2 | {2,1,4}
(2 rows)
I need to find all rows with numbers 1 and/or 2. According to this answer, I use a query like this:
SELECT * FROM my_table WHERE numbers = ANY('{1,2}'::int[]);
And got the following error:
LINE 1: SELECT * FROM my_table WHERE numbers = ANY('{1,2}'::int[]);
^
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
What does the correct SQL query look like?
Using var = ANY(array) works well for finding if a single value (var) is contained in an array.
To check whether an array shares any elements with another array, you would use the && operator:
&& -- overlap (have elements in common) -- ARRAY[1,4,3] && ARRAY[2,1] --> true
SELECT * FROM my_table WHERE numbers && '{1,2}'::int[];
To check if an array contains all members of another array, you would use the @> operator
@> -- contains -- ARRAY[1,4,3] @> ARRAY[3,1] --> true
SELECT * FROM my_table WHERE numbers @> '{1,2}'::int[];
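For the two sample rows above, the operators give different results:
-- overlap: any shared element is enough
SELECT * FROM my_table WHERE numbers && '{1,2}'::int[];
--   returns rows 1 and 2 (both arrays contain a 2, row 2 also contains a 1)
-- containment: the row's array must hold every element of '{1,2}'
SELECT * FROM my_table WHERE numbers @> '{1,2}'::int[];
--   returns only row 2 ({2,1,4})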

JSONB array contains like OR and AND operators

Consider a table temp (jsondata jsonb)
Postgres provides a way to check whether a jsonb array contains a given value, using
SELECT jsondata
FROM temp
WHERE (jsondata->'properties'->'home') ? 'football'
But we can't use the LIKE operator with that containment check. One way to apply LIKE across the array elements is:
SELECT jsondata
FROM temp,jsonb_array_elements_text(temp.jsondata->'properties'->'home')
WHERE value like '%foot%'
An OR of LIKE conditions can be achieved with:
SELECT DISTINCT jsondata
FROM temp,jsonb_array_elements_text(temp.jsondata->'properties'->'home')
WHERE value like '%foot%' OR value like 'stad%'
But I am unable to perform an AND of LIKE conditions over the jsonb array.
After unnesting the array with jsonb_array_elements_text() you can check which values meet either of the conditions and sum the matches, grouping by the original rows. Example:
drop table if exists temp;
create table temp(id serial primary key, jsondata jsonb);
insert into temp (jsondata) values
('{"properties":{"home":["football","stadium","16"]}}'),
('{"properties":{"home":["football","player","16"]}}'),
('{"properties":{"home":["soccer","stadium","16"]}}');
select jsondata
from temp
cross join jsonb_array_elements_text(temp.jsondata->'properties'->'home')
group by jsondata
-- or better:
-- group by id
having sum((value like '%foot%' or value like 'stad%')::int) = 2
jsondata
---------------------------------------------------------
{"properties": {"home": ["football", "stadium", "16"]}}
(1 row)
Update: the above query may be expensive with a large dataset. There is a simpler and faster solution: extract the array as text and apply LIKE to it, e.g.:
select jsondata
from temp
where jsondata->'properties'->>'home' like all('{%foot%, %stad%}');
jsondata
---------------------------------------------------------
{"properties": {"home": ["football", "stadium", "16"]}}
(1 row)
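The same extract-as-text trick also covers the OR case, using like any instead of like all; a sketch against the same sample table:
select jsondata
from temp
where jsondata->'properties'->>'home' like any('{%foot%, %stad%}');
-- returns all three sample rows, since each contains 'football', 'stadium' or both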
I have the following, but it was a bit fiddly. There's probably a better way, but I think this works.
The idea is to find the matching JSON array entries, then collect the results. In the join condition we check that the "matches" array has the expected number of entries.
CREATE TABLE temp (jsondata jsonb);
INSERT INTO temp VALUES ('{"properties":{"home":["football","stadium",16]}}');
SELECT jsondata FROM temp t
INNER JOIN LATERAL (
SELECT array_agg(value) AS matches
FROM jsonb_array_elements_text(t.jsondata->'properties'->'home')
WHERE value LIKE '%foo%' OR value LIKE '%sta%'
LIMIT 1
) l ON array_length(matches, 1) = 2;
jsondata
-------------------------------------------------------
{"properties": {"home": ["football", "stadium", 16]}}
(1 row)
I would cast the array to text. Then you can search for keywords with any string operator.
Disadvantage: because it was an array, the text contains characters like brackets, quotes and commas, so it's not that simple to search for a keyword with a certain beginning (ABC%): you always have to search with %ABC%.
SELECT jsondata
FROM (
SELECT
jsondata,
jsondata->'properties'->>'home' as a
FROM
temp
)s
WHERE
a LIKE '%stad%' AND a LIKE '%foot%'
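The derived table is only there to avoid repeating the expression; the same condition can be written inline (a sketch against the same sample table):
SELECT jsondata
FROM temp
WHERE jsondata->'properties'->>'home' LIKE '%stad%'
  AND jsondata->'properties'->>'home' LIKE '%foot%';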