I have a field in a PostgreSQL database with a JSONB type, in the format ["tag1","tag2"], and I am trying to implement a search that will provide results for a predictive dropdown (i.e. if a user types "t" and the column above exists, both tags are returned).
Any suggestions on how to do this?
I tried the query below but it is not working:
SELECT table.tags::JSONB from table where table.tags::TEXT like 't%';
One way you can do that is with the jsonb_array_elements_text() function (https://www.postgresql.org/docs/current/static/functions-json.html).
Example test:
SELECT *
FROM jsonb_array_elements_text($$["tag1","tag2","xtag1","ytag1"]$$::jsonb)
WHERE value LIKE 't%';
value
-------
tag1
tag2
(2 rows)
Since jsonb_array_elements_text() creates a set of records, and in your case there is no condition other than LIKE, using LATERAL (https://www.postgresql.org/docs/9.5/static/queries-table-expressions.html#QUERIES-LATERAL) should help you out, like this:
SELECT T.tags
FROM table T,
LATERAL jsonb_array_elements_text(T.tags) A
WHERE A.value LIKE 't%';
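If the dropdown should list the individual matching tags rather than the whole arrays, you can select the unnested values instead; a minimal sketch, assuming the same table and tags column as above:
-- distinct tag values starting with the typed prefix
SELECT DISTINCT A.value AS tag
FROM table T,
LATERAL jsonb_array_elements_text(T.tags) A
WHERE A.value LIKE 't%'
ORDER BY tag;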
Related
I have a column called vendor of jsonb type, with JSON data like [{"id":"1","name":"Dev"}].
I want to select row data using this column in the WHERE clause, like WHERE vendor.id=1.
So how can I do that? Any help will be appreciated.
You can use the containment operator @>:
select *
from the_table
where vendor @> '[{"id":"1"}]'::jsonb;
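If this containment filter is frequent, a GIN index on the column lets PostgreSQL answer @> queries efficiently; a sketch, assuming the table and column names above (the index name is made up):
-- supports jsonb containment (@>) lookups on vendor
create index the_table_vendor_gin_idx on the_table using gin (vendor);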
I have a table column that looks like the sample below.
What SQL query statement can I use to have multiple partial match conditions?
Search by ID or Name:
if I search abc, then list row A1 and row A2
if I search test, then list row A1, row A2 and row A3
if I search ghj, then list row A2
I was trying this but nothing is returned:
SELECT * FROM table where colB LIKE '"ID":"%abc%"'
Update: the data is stored as text:
{"ItemId":"123","IDs":[{"ID":"abc","CodingSystem":"cs1"}],"Name":"test itemgh"}
{"ItemId":"123","IDs":[{"ID":"ghj","CodingSystem":"cs1"}],"Name":"test abc"}
{"ItemId":"123","IDs":[{"ID":"defg","CodingSystem":"cs1"}],"Name":"test 111"}
JSON parsing
Oracle
I looked into the JSON parsing capabilities of Oracle, and I managed to get a query like this running:
select * from table t where json_exists(t.colB, '$.IDs?(@.ID == "abc")') or json_exists(t.colB, '$?(@.Name == "abc")')
Or, with both conditions inside the same JSON path expression:
select * from table t where json_exists(t.colB, '$?(@.IDs.ID == "abc" || @.Name == "abc")')
The json_exists() function is the key here.
The first parameter can be a VARCHAR2, and I also tried with a BLOB containing text, and it works.
The second parameter is the path to the JSON attribute that needs to be tested, together with the filter condition.
I wrote two ORed conditions for the ID and for the Name, but maybe there is a better JSON query expression you can use to include them both.
More information about the json_exists() function can be found in the Oracle documentation.
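Since the question asks for partial matches, one option (an untested sketch, assuming Oracle 12c+ and the same table/column names) is to keep json_exists() for the exact ID match and use json_value() with LIKE for the Name:
-- exact ID match OR partial Name match
select * from table t
where json_exists(t.colB, '$.IDs?(@.ID == "abc")')
or json_value(t.colB, '$.Name') like '%abc%'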
Postgres
There is a JSON datatype in Postgres that supports parsing in queries.
So, if your colB column is declared as JSON you can do something like this:
select * from table where colB->>'Name' LIKE '%abc%';
And in order to have available the array elements of the IDs array, you should use the function json_array_elements().
select * from table, json_array_elements(colB->'IDs') e where colB->>'Name' LIKE '%abc%' or e->>'ID' = 'abc';
Check an example I created for you here.
Here is an online tool for testing your JSON queries.
Check also this question in SO.
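One caveat: joining against json_array_elements() returns a row per array element, so a record can appear more than once if several of its IDs match. If that matters, an EXISTS subquery keeps one row per record; a sketch under the same assumptions as above:
-- one output row per record, regardless of how many IDs match
select * from table t
where t.colB->>'Name' LIKE '%abc%'
or exists (
  select 1
  from json_array_elements(t.colB->'IDs') e
  where e->>'ID' = 'abc'
);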
MSSQL Server 2017
I also made a couple of tests with MS SQL Server, and I managed to create an example that searches for a partial match on the Name field:
select * from table where JSON_VALUE(colB,'$.Name') LIKE '%abc%';
And finally I arrived at a working query that does a partial match on the Name field and a full match on the ID field, like this:
select * from table t
CROSS APPLY OPENJSON(colB, '$.IDs') WITH (
ID VARCHAR(10),
CodingSystem VARCHAR(10)
) e
where JSON_VALUE(t.colB,'$.Name') LIKE '%abc%'
or e.ID = 'abc';
The key is that we need to open the IDs array and make something like a table out of it, which can then also be queried through its columns.
The example I created is here.
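If the ID should be matched partially too (the question asks for multiple partial match conditions), the projected column can be filtered with LIKE instead of equality; a sketch under the same assumptions:
-- partial match on Name OR partial match on any ID in the array
select * from table t
CROSS APPLY OPENJSON(t.colB, '$.IDs') WITH (
ID VARCHAR(10),
CodingSystem VARCHAR(10)
) e
where JSON_VALUE(t.colB,'$.Name') LIKE '%abc%'
or e.ID LIKE '%abc%';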
LIKE text query
Your attempts are close, but the % symbols are misplaced. They have to come first and last in the string you provide:
If you want the ID to be the given value:
SELECT * FROM table where colB LIKE '%"ID":"abc"%'
If the given value can be anywhere, then don't put the "ID" part:
SELECT * FROM table where colB LIKE '%abc%'
If the given value can be only on the ID or Name field then:
SELECT * FROM table where colB LIKE '%"ID":"abc"%' OR colB LIKE '%"Name":"abc"%'
And because you are using hard-coded field identifiers (e.g. ID and Name) whose case can vary:
SELECT * FROM table where lower(colB) LIKE '%"id":"abc"%' OR lower(colB) LIKE '%"name":"abc"%'
This assumes that the number of spaces between the : character and the property names or values does not vary.
For partial matching you can use more % wildcards in between, like '%"name":"%abc%"%':
SELECT * FROM table where lower(colB) LIKE '%"id":"abc"%' OR lower(colB) LIKE '%"name":"%abc%"%'
Regular Expressions
A different option would be to test with regular expressions.
Consider checking this: Oracle extract json fields using regular expression with oracle regexp_substr
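For example (a sketch, assuming Oracle and the colB column above), a case-insensitive pattern that tolerates whitespace around the colon and partially matches either the ID or the Name could look like this:
-- 'i' makes the match case-insensitive; [[:space:]]* allows whitespace around the colon
SELECT * FROM table t
WHERE REGEXP_LIKE(t.colB, '"(ID|Name)"[[:space:]]*:[[:space:]]*"[^"]*abc[^"]*"', 'i');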
I have a jsonb field in a PostgreSQL table which was supposed to contain dictionary-like data (i.e. {}), but a few of its entries got an array due to source data issues.
I want to weed out those entries. One way is to perform the following query:
select json_field from data_table where cast(json_field as text) like '[%]'
But this requires converting each jsonb field to text. With data_table having on the order of 200 million entries, this looks like a bit of overkill.
I investigated pg_typeof but it returns jsonb which doesn't help differentiate between a dictionary and an array.
Is there a more efficient way to achieve the above?
How about using the jsonb_typeof function (the column is jsonb, so the jsonb variant applies)?
select json_field from data_table where jsonb_typeof(json_field) = 'array'
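If this check has to run repeatedly against the ~200 million rows, an expression index on the type can avoid scanning everything each time; a sketch (the index name is made up):
-- jsonb_typeof is immutable, so it can be used in an expression index
create index data_table_json_field_type_idx on data_table ((jsonb_typeof(json_field)));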
In a table called payouts, there is a column stripeResponseData where the data is in the following structure:
{"id":"tr_1BlSHbGQXLV7RqqnHJffUVO0","object":"transfer","amount":39415,"amount_reversed":0,"balance_transaction":"txn_1BlSHbGQXfV7AqqnGi2o7UiY","created":1516239215,"currency":"usd","description":null,"destination":"acct_1BWWAmAzms5xPfV9","destination_payment":"py_1BlSHbAzms5xkfV91RHAOrno","livemode":true,"metadata":{},"reversals":{"object":"list","data":[],"has_more":false,"total_count":0,"url":"/v1/transfers/tr_1BlSHbYQXLV7AqqnHJffUVO0/reversals"},"reversed":false,"source_transaction":null,"source_type":"card","transfer_group":null}
Within my SQL SELECT statement, I want to return only the value of the key "destination". How do I write my SQL query?
My desired result of the query:
SELECT "stripeResponseData" FROM payouts [...]
(where I don't know how to write [...]) should look like the following (assume we have 3 rows with different values for "destination"):
acct_1BWWAmAzms5xPfV9
acct_1AY0phDc9pCDpLR8
acct_1AwG3VL7DXxftOaS
How do I extract that value from the list within the stripeResponseData column?
See this sqlfiddle. This query will fetch the id from stripeResponseData where the id is a specific id (probably not very useful, but it does show you how to select and query):
SELECT data->>'id' FROM stripeResponseData WHERE data #> '{"id":"tr_1BlSHbGQXLV7RqqnHJffUVO0"}';
Because you mentioned your data was a string, you need to do type conversions to query/use it correctly. See this sqlfiddle:
SELECT data::jsonb->>'id' FROM stripeResponseData WHERE data::jsonb #> '{"id":"tr_1BlSHbGQXLV7RqqnHJffUVO0"}';
Per your edit, you can simply query destination in almost exactly the same way. This will get all the ids from stripeResponseData where destination = acct_1BWWAmAzms5xPfV9:
SELECT data::jsonb->>'id' FROM stripeResponseData WHERE data::jsonb #> '{"destination":"acct_1BWWAmAzms5xPfV9"}';
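To get just the destination values the question asks for, from the original payouts table, a sketch (assuming "stripeResponseData" is stored as text as described) would be:
-- cast the text column to jsonb, then extract the destination key
SELECT "stripeResponseData"::jsonb->>'destination' FROM payouts;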
In my table mytable I have a json field called data and I inserted json with a lot of keys & values.
I know that it's possible to select individual fields like so:
SELECT data->'mykey' as mykey from mytable
But how can I get an overview of all of the JSON keys at a certain depth? I would have expected something like
SELECT data->* from mytable
but that doesn't work. Is there something similar?
You can use the json_object_keys() function to get all the top-level keys of a json value:
SELECT keys.*
FROM mytable, json_object_keys(mytable.data) AS keys (mykey);
If you want to search at a deeper level, then first extract that deeper level from the json value using the #> operator:
SELECT keys.*
FROM mytable, json_object_keys(mytable.data #> '{level1, level2}') AS keys (mykey);
Note that the function returns a set of text, so you should invoke the function as a row source.
If you are using the jsonb data type, then use the jsonb_object_keys() function.
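And if the goal is a de-duplicated overview of every key that appears anywhere in the table, DISTINCT over the same row source works; a sketch, assuming the mytable/data names above:
-- one row per distinct top-level key across the whole table
SELECT DISTINCT keys.mykey
FROM mytable, json_object_keys(mytable.data) AS keys (mykey)
ORDER BY 1;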