How to filter a value of any key of json in postgres - sql

I have a table users with a jsonb field called data. I have to retrieve all the users that have a value in that data column matching a given string. For example:
user1 = data: {"property_a": "a1", "property_b": "b1"}
user2 = data: {"property_a": "a2", "property_b": "b2"}
I want to retrieve any user that has a value in data matching 'b2'; in this case that will be user2.
Any idea how to do this in an elegant way? I can retrieve all keys from data of all users and create a query manually but that will be neither fast nor elegant.
In addition, I have to retrieve the key and value matched, but first things first.

There is no easy way. Per documentation:
GIN indexes can be used to efficiently search for keys or key/value
pairs occurring within a large number of jsonb documents (datums)
Bold emphasis mine. There is no index over all values. (Those can have non-compatible data types!) If you do not know the name(s) of all key(s) you have to inspect all JSON values in every row.
If there are just two keys like you demonstrate (or just a few well-known keys), it's still easy enough:
SELECT *
FROM users
WHERE data->>'property_a' = 'b2' OR
data->>'property_b' = 'b2';
Can be supported with a simple expression index:
CREATE INDEX foo_idx ON users ((data->>'property_a'), (data->>'property_b'));
Or with a GIN index:
SELECT *
FROM users
WHERE data @> '{"property_a": "b2"}' OR
      data @> '{"property_b": "b2"}';
CREATE INDEX bar_idx ON users USING gin (data jsonb_path_ops);
If you don't know all key names, things get more complicated ...
You could use jsonb_each() or jsonb_each_text() to unnest all values into a set and then check with an ANY construct:
SELECT *
FROM users
WHERE jsonb '"b2"' = ANY (SELECT (jsonb_each(data)).value);
Or
...
WHERE 'b2' = ANY (SELECT (jsonb_each_text(data)).value);
db<>fiddle here
But there is no index support for the last one. You could instead extract all values into an array, create an expression index on that, and match that expression in queries with array operators ...
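A minimal sketch of that idea. jsonb_vals() is a hypothetical helper, not a built-in; it collects all top-level values into a text array:

```sql
-- Hypothetical helper: extract all top-level values of a jsonb object
CREATE FUNCTION jsonb_vals(j jsonb)
  RETURNS text[]
  LANGUAGE sql IMMUTABLE AS
$$ SELECT array_agg(value) FROM jsonb_each_text(j) $$;

-- Expression index on the extracted values:
CREATE INDEX users_data_vals_idx ON users USING gin (jsonb_vals(data));

-- Repeat the exact expression in queries so the planner can use the index:
SELECT * FROM users WHERE jsonb_vals(data) @> '{b2}';
```

Note that functions used in index expressions must be declared IMMUTABLE, which is safe here since the extraction depends only on the input value.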
Related:
How do I query using fields inside the new PostgreSQL JSON datatype?
Index for finding an element in a JSON array
Can PostgreSQL index array columns?

Try this query.
SELECT * FROM users
WHERE data::text LIKE '%b2%'
Of course it will also match (wrongly) when a key, rather than a value, contains that string - the cast to text makes no distinction between keys and values.

Related

How to speed up SELECT for a JSONB column in Postgres when the first level key is unknown?

I have a table with a JSONB column called "attributes" that contains a JSON object with various keys and values. The keys are dynamic and I do not know their names until the time of the query. I have over 20 million rows in this table and the queries on this column are currently very slow. Is there a way to improve the search performance in this scenario without using dynamically generated indexes?
How my data is stored:
attributes JSONB
JSON looks like this:
{
  "dynamicName1": "value",
  "dynamicName2": "value",
  "dynamicName3": "value",
  ...
}
Example of query:
SELECT * FROM table WHERE "attributes" ->> 'dynamicName1' = 'SomeValue'
SELECT * FROM table WHERE "attributes" ->> 'abcdefg' = 'SomeValue'
SELECT * FROM table WHERE "attributes" ->> 'anyPossibleName' = 'SomeValue'
Create table:
CREATE TABLE "table" ("id" SERIAL NOT NULL, "attributes" JSONB)
Explain:
Gather  (cost=1000.00..3460271.08 rows=91075 width=1178)
  Workers Planned: 2
  ->  Parallel Seq Scan on "table"  (cost=0.00..3450163.58 rows=37948 width=1178)
        Filter: (("attributes" ->> 'Beak'::text) = 'Yellow'::text)
I have attempted to research the use of indexes to improve search performance on JSONB columns, but have been unable to find any information that specifically addresses my scenario where the keys in the JSON object are dynamic and unknown until the time of the query.
You don't need to specify the keys within the jsonb object to build a useful index on its column.
create index on "table" using gin("attributes" jsonb_path_ops);
and then use the jsonpath match operator @@ or the jsonb containment operator @> that are supported by GIN. You can omit the jsonb_path_ops operator class if you'll need to use other operators with this index.
select * from "table" where "attributes" @@ '$.dynamicName1 == "SomeValue"';
select * from "table" where "attributes" @> '{"dynamicName1":"SomeValue"}'::jsonb;
Online demo where this speeds things up about three orders of magnitude on 400k random records.

Query JSONB column for any value where =?

I have a jsonb column which has the unfortunate case of being very unpredictable: in some cases its value may be an array with nested values:
["UserMailer", "applicant_setup_3", ["5cbffeb7-8d5e-4b52-a475-3cf320b2cee9"]]
Sometimes it will be something with key/values like this:
[{"reference_id": "5cbffeb7-8d5e-4b52-a475-3cf320b2cee9", "job_dictionary": ["StatusUpdater", "FollowTwitterUsersJob"]}]
Is there a way to write a query which just treats the whole column like text and does a like to see if I can find the uuid in the big text blob? I want to find all the records where a particular uuid string is present in the jsonb column.
The query doesn't need to be fast or efficient.
Postgres has the existence operator ? for jsonb, but it only checks top-level keys and array elements, so you would have to search the JSON content recursively yourself.
A possible, although not very efficient, method would be to stringify the object and use LIKE to search it:
myjsonb::text LIKE '%"5cbffeb7-8d5e-4b52-a475-3cf320b2cee9"%'
myjsonb::text LIKE '%"' || myuuid || '"%'
Demo on DB Fiddle:
The problem with the jsonb operator ? is that it only considers top-level keys (including array elements), not values, and does not descend into nested objects.
You seem to be looking for values and array elements (not keys) on any level. You can get that with a full text search on top of your json(b) column:
SELECT * FROM tbl
WHERE to_tsvector('simple', jsonb_column)
@@ tsquery '5cbffeb7-8d5e-4b52-a475-3cf320b2cee9';
db<>fiddle here
to_tsvector() extracts values and array elements on all levels - just what you need.
Requires Postgres 10 or later. json(b)_to_tsvector() in Postgres 11 offers more flexibility.
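For example, jsonb_to_tsvector() in Postgres 11 accepts a filter argument so you can restrict indexing to string values only, ignoring keys and numbers (a sketch against the same setup):

```sql
-- Postgres 11+: build the tsvector from string values only
SELECT * FROM tbl
WHERE  jsonb_to_tsvector('simple', jsonb_column, '["string"]')
       @@ tsquery '5cbffeb7-8d5e-4b52-a475-3cf320b2cee9';
```

The filter can also be '["key"]', '["numeric"]', '["boolean"]', or '["all"]'.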
That's attractive for tables of non-trivial size as it can be supported with a full text index very efficiently:
CREATE INDEX tbl_jsonb_column_fts_gin_idx ON tbl USING GIN (to_tsvector('simple', jsonb_column));
I use the 'simple' text search configuration in the example. You might want a language-specific one, like 'english'. Doesn't matter much while you only look for UUID strings, but stemming for a particular language might make the index a bit smaller ...
Related:
LIKE query on elements of flat jsonb array
Does the phrase search operator <-> work with JSONB documents or only relational tables?
While you are only looking for UUIDs, you might optimize further with a custom (IMMUTABLE) function to extract UUIDs from the JSON document as an array (uuid[]) and build a functional GIN index on top of it. (A considerably smaller index, too.) Then:
SELECT * FROM tbl
WHERE my_uuid_extractor(jsonb_column) @> '{5cbffeb7-8d5e-4b52-a475-3cf320b2cee9}';
Such a function can be expensive, but does not matter much with a functional index that stores and operates on pre-computed values.
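Such an extractor might be sketched like this. The function name matches the query above; the regex is an assumption that UUIDs appear in lowercase hex:

```sql
-- Hypothetical extractor: collect all UUID-shaped strings in the document
CREATE FUNCTION my_uuid_extractor(j jsonb)
  RETURNS uuid[]
  LANGUAGE sql IMMUTABLE AS
$$
SELECT array_agg(DISTINCT m[1]::uuid)
FROM   regexp_matches(j::text
     , '[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}'
     , 'g') AS m
$$;

CREATE INDEX tbl_uuid_gin_idx ON tbl USING gin (my_uuid_extractor(jsonb_column));
```

With no capture groups in the pattern, each row returned by regexp_matches() is a single-element array holding the whole match.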
You can split the array into elements first by using jsonb_array_elements(), and then filter the elements, cast to text, with the LIKE operator:
select q.elm
from
(
select jsonb_array_elements(js) as elm
from tab
) q
where elm::varchar like '%User%'
elm
----------------------------------------------------------------------------------------------------------------------
"UserMailer"
{"reference_id": "5cbffeb7-8d5e-4b52-a475-3cf320b2cee9", "job_dictionary": ["StatusUpdater", "FollowTwitterUsersJob"]}
Demo

Postgres: How to search for a value on _all_ fields in an object in a json-type column without knowing the keys

In a Postgres database I have a column of type jsonb which contains a generic object whose keys I cannot predict:
{"some_key":"value", "another_unpredictable_key":"another value"}
Is it possible to search for a particular value in all of the fields without knowing the keys? Something like:
select * from ... where column_whatever->>'*' = '...'
You need to turn the json value into multiple rows (of key/value pairs) and search in the result of that:
select *
from some_table t
where exists (select *
from jsonb_each_text(t.jsonb_column) as x(ky,val)
where x.val = 'some value');
jsonb_each_text() returns one row for each top-level key/value pair. This does not handle nested keys.

Search a JSON array for an object containing a value matching a pattern

I have a DB with a jsonb column where each row essentially holds an array of name value pairs. Example for a single jsonb value:
[
{"name":"foo", "value":"bar"},
{"name":"biz", "value":"baz"},
{"name":"beep", "value":"boop"}
]
How would I query for rows that contain a partial value? I.e., find rows where some object's "value" key matches ILIKE '%ba%'?
I know that I can use SELECT * FROM tbl WHERE jsoncol @> '[{"value":"bar"}]' to find rows where the JSON contains that specific value, but how would I query for rows containing a pattern?
There are no built-in jsonb operators nor any indexes supporting this kind of filter directly (yet).
I suggest an EXISTS semi-join:
SELECT t.*
FROM tbl t
WHERE EXISTS (
SELECT FROM jsonb_array_elements(t.jsoncol) elem
WHERE elem->>'value' LIKE '%ba%'
);
It avoids redundant evaluations and the final DISTINCT step you would need to get distinct rows with a plain CROSS JOIN.
If this still isn't fast enough, a way more sophisticated specialized solution for the given type of query would be to extract a concatenated string of unique values (with a delimiter that won't interfere with your search patterns) per row in an IMMUTABLE function, build a trigram GIN index on the functional expression and use the same expression in your queries.
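A sketch of that idea, assuming the pg_trgm extension is available. concat_values() is a hypothetical helper, and '|' is a delimiter chosen not to collide with the search patterns:

```sql
CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- Hypothetical helper: concatenate the unique "value" fields of the array
CREATE FUNCTION concat_values(j jsonb)
  RETURNS text
  LANGUAGE sql IMMUTABLE AS
$$ SELECT string_agg(DISTINCT elem->>'value', '|') FROM jsonb_array_elements(j) elem $$;

-- Trigram index on the functional expression:
CREATE INDEX tbl_vals_trgm_idx ON tbl USING gin (concat_values(jsoncol) gin_trgm_ops);

-- Same expression in the query, so the trigram index applies:
SELECT * FROM tbl WHERE concat_values(jsoncol) LIKE '%ba%';
```

The trigram index makes unanchored LIKE / ILIKE patterns indexable, which a plain btree cannot do.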
Related:
Search for nested values in jsonb array with greater operator
Find rows containing a key in a JSONB array of records
Create Postgres JSONB Index on Array Sub-Object
As an aside: if your jsonb values really look like the example, you could trim a lot of noise and just store:
[
{"foo":"bar"},
{"biz":"baz"},
{"beep":"boop"}
]
You can use the function jsonb_array_elements() in a lateral join and use its result value in the WHERE clause:
select distinct t.*
from my_table t
cross join jsonb_array_elements(jsoncol)
where value->>'value' like '%ba%'
Please, read How to query jsonb arrays with IN operator for notes about distinct and performance.

check if a jsonb field contains an array

I have a jsonb field in a PostgreSQL table which was supposed to contain a dictionary-like object ({}), but a few of its entries got an array due to source data issues.
I want to weed out those entries. One way is to perform the following query:
select json_field from data_table where cast(json_field as text) like '[%]'
But this requires converting each jsonb field to text. With data_table holding on the order of 200 million entries, that seems like overkill.
I investigated pg_typeof but it returns jsonb which doesn't help differentiate between a dictionary and an array.
Is there a more efficient way to achieve the above?
How about using the jsonb_typeof function? (json_typeof is the variant for plain json columns.)
select json_field from data_table where jsonb_typeof(json_field) = 'array'
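If this cleanup has to run repeatedly over 200 million rows, a partial index on the same predicate keeps later scans cheap. This is a sketch; the id column is an assumption about your schema:

```sql
-- Partial index covering only the bad rows; hypothetical "id" column
CREATE INDEX data_table_array_entries_idx
  ON data_table (id)
  WHERE jsonb_typeof(json_field) = 'array';

-- Queries repeating the predicate can then use the (small) partial index:
SELECT id, json_field FROM data_table WHERE jsonb_typeof(json_field) = 'array';
```

Since the offending rows are presumably rare, the partial index stays tiny compared to a full-column index.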