How to perform ILIKE query against JSONB type column?

I have a column "category_products" with datatype JSONB. The data in that column is an array of objects, and each of those objects contains an array of objects.
I need to perform an ILIKE query against product_name.
example
category_products
-----------------
[{"products":[{product_name: product_one, price: 123}, {product_name: product_two, price: 999}]]

You may first flatten your data using a lateral join with jsonb_path_query and then apply an ILIKE in a WHERE clause as you need. Here is an illustration.
select id, l, l ->> 'product_name' as prod
from the_table,
lateral jsonb_path_query(category_products, '$[*].products[*]') as l;
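For example, to keep only the products whose name matches a pattern (the '%one%' pattern below is just an illustration):
select id, l ->> 'product_name' as prod
from the_table,
lateral jsonb_path_query(category_products, '$[*].products[*]') as l
where l ->> 'product_name' ilike '%one%';  -- case-insensitive match on each flattened product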
Please note that your sample data are not valid JSON at all.
Unrelated but this would be so much easier and cleaner with a normalized data design.
Edit
As jsonb_path_query does not exist in versions before PostgreSQL 12, here is an alternative:
select id, l, l ->> 'product_name' as prod
from the_table,
lateral jsonb_array_elements(category_products) as arr_ex,
lateral jsonb_array_elements(arr_ex -> 'products') as l;


Example of table function

Is the UNNEST an example of a table-function? It seems to produce a single named column if I'm understanding it correctly. Something like:
vals
-------
[1,2,3]

unnest(vals) as v

v
---
1
2
3
with Table as (
select [1,2,3] vals
) select v from Table, UNNEST(vals) as v
Is this an example of a table-function? If not, what kind of function is it? Are there any other predefined table functions in BQ?
The UNNEST operator takes an ARRAY and returns a table, with one row for each element in the ARRAY. You can also use UNNEST outside of the FROM clause with the IN operator.
So, you may call it a table function if you wish :o)
You can read more about UNNEST in the BigQuery documentation.
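As a side note, here is a minimal sketch of the IN usage mentioned above (the values are made up):
-- check whether a value appears in an array, outside the FROM clause
select 2 in unnest([1, 2, 3]) as found;  -- returns true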
It seems to produce a single named column if I'm understanding it correctly
Not exactly correct. See example below
with Table as (
select [struct(1 as a,2 as b),struct(3, 4), struct(5, 6)] vals
)
select v.* from Table, UNNEST(vals) as v
with output
Row a   b
1   1   2
2   3   4
3   5   6

Is there any way to search in json postgres column using matching clause?

I'm trying to search for a record in a Postgres JSON column. The stored data has a structure like this:
{
"contract_shipment_date": "2015-06-25T19:00:00.000Z",
"contract_product_grid": [
{
"product_name": "Axele",
"quantity": 22.58
},
{
"product_name": "Bell",
"quantity": 52.58
}
],
"lc_status": "Awaited"
}
My table name is Heap and the column name is contract_product_grid. Also, the contract_product_grid column can contain multiple product records.
I found the relevant documentation but was not able to get the desired output.
The required use case is: I have a filter in which users can select a product_name, and on the basis of the entered name, using a matching clause, the matching records should be fetched and returned to the user.
Suppose you entered Axele as the product_name input and want to return the matching value for the quantity key.
Then use:
SELECT js2->'quantity' AS quantity
FROM
(
SELECT JSON_ARRAY_ELEMENTS(value::json) AS js2
FROM heap,
JSON_EACH_TEXT(contract_product_grid) AS js
WHERE key = 'contract_product_grid'
) q
WHERE js2->> 'product_name' = 'Axele'
which expands the outermost JSON into key/value pairs through JSON_EACH_TEXT(json), and then splits out the elements of the newly formed array with the JSON_ARRAY_ELEMENTS(value::json) function.
Then filter by the specific product_name within the main query.
P.S. Don't forget to wrap the JSON column's value in curly braces.
SELECT *
FROM
(
SELECT JSON_ARRAY_ELEMENTS(contract_product_grid::json) AS js2
FROM heaps
) q
WHERE js2->> 'product_name' IN ('Axele', 'Bell')
As I mentioned in the question, my column name is 'contract_product_grid' and I only have to search within it.
Using this query, I'm able to get the contract_product_grid information with an IN clause on the entered product names.
You need to unnest the array in order to be able to use a LIKE condition on each value:
select h.*
from heap h
where exists (select *
from jsonb_array_elements(h.contract_product_grid -> 'contract_product_grid') as p(prod)
where p.prod ->> 'product_name' like 'Axe%')
If you don't really need a wildcard search (so = instead of LIKE) you can use the containment operator @> which is a lot more efficient:
select h.*
from heap h
where h.contract_product_grid -> 'contract_product_grid' @> '[{"product_name": "Axele"}]';
This can also be used to search for multiple products:
select h.*
from heap h
where h.contract_product_grid -> 'contract_product_grid' @> '[{"product_name": "Axele"}, {"product_name": "Bell"}]';
If you are using Postgres 12 you can simplify that a bit using a JSON path expression:
select *
from heap
where jsonb_path_exists(contract_product_grid, '$.contract_product_grid[*].product_name ? (@ starts with "Axe")')
Or using a regular expression:
select *
from heap
where jsonb_path_exists(contract_product_grid, '$.contract_product_grid[*].product_name ? (@ like_regex "axe.*" flag "i")')

How can I aggregate Jsonb columns in postgres using another column type

I have the following data in a postgres table,
where data is a jsonb column. I would like to get the result as
[
{field_type: "Design", briefings_count: 1, meetings_count: 13},
{field_type: "Engineering", briefings_count: 1, meetings_count: 13},
{field_type: "Data Science", briefings_count: 0, meetings_count: 3}
]
Explanation
Use the jsonb_each_text function to extract data from the jsonb column named data. Then aggregate the rows with GROUP BY to get one row for each distinct field_type. Each aggregated row also needs the meetings and briefings counts, which is done by selecting the maximum value with a CASE statement so that the two counts end up in two separate columns. On top of that, apply the coalesce function to return 0 instead of NULL if some information is missing - in your example that would be briefings for Data Science.
At the outer level of the statement, now that we have the results as a table of fields, we need to build a jsonb object per row and aggregate them all into one row. For that we're using jsonb_build_object, passing it pairs consisting of the field name and its value. That leaves us with 3 rows of data, each holding a separate jsonb value. Since we want only one row (an aggregated json) in the output, we apply jsonb_agg on top of that. This gives the result that you're looking for.
Code
select
jsonb_agg(
jsonb_build_object('field_type', field_type,
'briefings_count', briefings_count,
'meetings_count', meetings_count
)
) as agg_data
from (
select
j.k as field_type
, coalesce(max(case when t.count_type = 'briefings_count' then j.v::int end),0) as briefings_count
, coalesce(max(case when t.count_type = 'meetings_count' then j.v::int end),0) as meetings_count
from tbl t,
jsonb_each_text(data) j(k,v)
group by j.k
) t
You can aggregate columns like this and then insert the data into another table:
select array_agg(data)
from the_table
Or use one of the built-in JSON functions to create a new JSON array, for example jsonb_agg(expression).
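A minimal sketch, reusing the the_table name from above:
select jsonb_agg(data) as agg_data  -- collapses the jsonb values of all rows into a single jsonb array
from the_table;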

Select with filters on nested JSON array

Postgres 10: I have a table and a query below:
CREATE TABLE individuals (
uid character varying(10) PRIMARY KEY,
data jsonb
);
SELECT data->'files' FROM individuals WHERE uid = 'PDR7073706'
It returns this structure:
[
{"date":"2017-12-19T22-35-49","type":"indiv","name":"PDR7073706_indiv_2017-12-19T22-35-49.jpeg"},
{"date":"2017-12-19T22-35-49","type":"address","name":"PDR7073706_address_2017-12-19T22-35-49.pdf"}
]
I'm struggling with adding two filters by date and time. Like (illegal pseudo-code!):
WHERE 'type' = "indiv"
or like:
WHERE 'type' = "indiv" AND max('date')
It is probably easy, but I can't crack this nut, and need your help!
The column data is of type jsonb, as declared in your table.
Use the containment operator @> for the first clause (WHERE 'type' = "indiv"):
SELECT data->'files'
FROM individuals
WHERE uid = 'PDR7073706'
AND data -> 'files' @> '[{"type":"indiv"}]';
Can be supported with various kinds of indexes. See:
Query for array elements inside JSON type
Index for finding an element in a JSON array
The second clause (AND max('date')) is more tricky. Assuming you mean:
Get rows where the JSON array element with "type":"indiv" also has the latest "date".
SELECT i.*
FROM individuals i
JOIN LATERAL (
SELECT *
FROM jsonb_array_elements(data->'files')
ORDER BY to_timestamp(value ->> 'date', 'YYYY-MM-DD"T"HH24-MI-SS') DESC NULLS LAST
LIMIT 1
) sub ON sub.value -> 'type' = '"indiv"'::jsonb
WHERE uid = 'PDR7073706'
AND data -> 'files' @> '[{"type":"indiv"}]' -- optional; may help performance
to_timestamp(value ->> 'date', 'YYYY-MM-DD"T"HH24-MI-SS') is my educated guess at your undeclared timestamp format. Details in the manual.
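A quick sanity check of that guess against one of the sample values:
select to_timestamp('2017-12-19T22-35-49', 'YYYY-MM-DD"T"HH24-MI-SS');
-- parses to 2017-12-19 22:35:49 in the session time zone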
The last filter is redundant and optional, but it may help performance (a lot) if it is selective (only a few rows qualify) and you have a matching index, as advised:
AND data -> 'files' @> '[{"type":"indiv"}]'
Related:
Optimize GROUP BY query to retrieve latest record per user
Select first row in each GROUP BY group?
Update nth element of array using a WHERE clause

Select query in row_to_json function

For example, I use the following function to convert rows into JSON in PostgreSQL 9.2:
select row_to_json(row(productid, product)) from gtab04;
and this returns the results below:
row_to_json
---------------
{"f1":3029,"f2":"DIBIZIDE M TAB"}
{"f1":3026,"f2":"MELMET 1000 SR TAB"}
{"f1":2715,"f2":"GLUCORED FORTE"}
{"f1":3377,"f2":"AZINDICA 500 TAB"}
Unfortunately it loses the field names and replaces them with f1, f2, f3, etc.
How can I get the actual field names instead?
To work around this we must either create a row type and cast the row to that type or use a subquery. A subquery will typically be easier.
select row_to_json(t)
from (
select productid, product from gtab04
) t
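For completeness, the row-type alternative mentioned above could look roughly like this (a sketch; the type name product_row and the column types are assumptions):
create type product_row as (productid integer, product text);  -- composite type matching the selected columns
select row_to_json((productid, product)::product_row)  -- cast the row so its field names are preserved
from gtab04;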
If one wants to avoid a sub-query, json_build_object() might be a solution. It does not map the column names automatically, but lets you set the JSON keys explicitly.
Query
SELECT json_build_object('productid', productid, 'product', product) FROM gtab04;
json_build_object
------------------
{"productid":3029,"product":"DIBIZIDE M TAB"}
{"productid":3026,"product":"MELMET 1000 SR TAB"}
{"productid":2715,"product":"GLUCORED FORTE"}
{"productid":3377,"product":"AZINDICA 500 TAB"}