Update a value in a JSON array of objects column in PostgreSQL with a raw query

I have a json type column in PostgreSQL and I want to update a specific field inside it. Below is the value I want to update:
[{"id":"xyyc","answered":false,"payable":true,"productIncentiveCost":{"incentive":0,"cost":0,"dollarIncentive":0,"dollarCost":0},"reward":0,"amountInDollar":0,"delayToNextProduct":"","extraDelayToNextProduct":""}]
I want to update "reward" at index 0 of this array using a raw PostgreSQL query.
I have tried updating it using update ... set and set_json, but no luck.

update data
set value = tmp.upd_json_arr
from (
  select jsonb_agg(jsonb_row || '{"reward": 167}') as upd_json_arr
  from data,
       jsonb_array_elements(value) jsonb_row
) tmp;
Details:
First, you need to 'expand' your jsonb array using the jsonb_array_elements() function. As a result you get one jsonb row per array element.
Then, using the concatenation operator you can rewrite a jsonb field: jsonb_row || '{"reward": 167}'.
Finally, the jsonb_agg() function flattens all the jsonb rows back into an array, upd_json_arr.
upd_json_arr is then used in the set clause.
Note that the concatenation rewrites "reward" in every element of the array; if you only need to change index 0, jsonb_set(value, '{0,reward}', '167') is the simpler option.
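To see what the query is doing, here is a minimal Python sketch (purely illustrative, not part of the original answer) of the same transformation: expand the array, merge {"reward": 167} into each element (what || does), and collect the results back into an array (what jsonb_agg() does).

```python
import json

# The sample row from the question, abridged to the relevant keys.
row = json.loads('[{"id":"xyyc","answered":false,"reward":0,"amountInDollar":0}]')

# jsonb_array_elements() yields one element per row; `|| '{"reward": 167}'`
# merges the patch into each element; jsonb_agg() collects them back.
patch = {"reward": 167}
updated = [{**elem, **patch} for elem in row]

print(updated)  # every element now carries reward = 167
```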

Related

Update Array Of Strings to new JSONB column of Array Of Objects in PostgreSQL

I would like to backfill data from an array of strings in one column, columns, to another jsonb column, ordered_columns.
I'm doing it in Rails and I know it works there, but I would like to get the same behavior with raw SQL.
Example how it should work:
columns: ["city", "leaseUsers"]
ordered_columns: [{"key": "city", "visible": true}, {"key": "leaseUsers", "visible": true}]
I know I should update with jsonb_set but I'm not sure how I should generate a new array to update the new column.
You can unnest the array, then aggregate it back using jsonb_agg() and jsonb_build_object():
update the_table
set ordered_columns = (
  select jsonb_agg(jsonb_build_object('key', item, 'visible', true))
  from jsonb_array_elements(columns) as c(item)
);
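The shape of the transformation can be sketched in Python (an illustration only, not part of the answer): each string in the source array becomes an object with a key and a visible flag, and the results are aggregated back into an array.

```python
# Simulating jsonb_array_elements + jsonb_build_object + jsonb_agg.
columns = ["city", "leaseUsers"]

# jsonb_build_object('key', item, 'visible', true) per element, re-aggregated.
ordered_columns = [{"key": item, "visible": True} for item in columns]

print(ordered_columns)
```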

Alter single value in json string in text field

I have a table with one column of type text which contains a JSON string. I would like to select a bunch of rows (around 50) and, for each of them, update a single value in the JSON saved in the text field. So let's say it currently looks like this:
{"amount":"45","level":1}
I want to update the amount value for every one of these rows, to for example "level" * 5.
I can't figure out a way to do this with one query, since it does not seem possible to alter a single value inside a text field like this. Or am I missing something? Otherwise I will just have to alter it manually for every single row I need to change, which would be a pain.
You need to first cast the value to a proper jsonb value, then you can manipulate it using JSON functions.
update the_table
set the_column = jsonb_set(the_column::jsonb, '{amount}', to_jsonb((the_column::jsonb ->> 'level')::int * 5))::text
where ....
The expression (the_column::jsonb ->> 'level')::int * 5 extracts the current value of level, converts it to an integer and multiplies it by 5. The to_jsonb() around it is necessary because jsonb_set() requires a jsonb value as the new-value parameter.
The '{amount}' parameter tells jsonb_set() to put the new value (see above) into the (top level) key amount.
And finally the whole jsonb value is cast back to a text value.
If you really store JSON in that column, you should think about converting the column to the jsonb data type to avoid all that casting back and forth.
Or consider a properly normalized model, where this would be as simple as set amount = level * 5.
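The round trip can be sketched in Python (illustrative only, not part of the answer): parse the text value, replace the key, and serialize back, which mirrors the ::jsonb cast, jsonb_set(), and the cast back to text.

```python
import json

# The text column holds a JSON string; casting to jsonb ~ json.loads().
the_column = '{"amount":"45","level":1}'
doc = json.loads(the_column)

# jsonb_set(..., '{amount}', to_jsonb(level * 5)) ~ replace the "amount" key.
doc["amount"] = doc["level"] * 5

# Casting back to text ~ json.dumps().
result = json.dumps(doc)
```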

Insert an array of UUIDs using Objection.js

I am attempting to insert a new row into a table with a column defined as an array of UUIDs:
alter table medias add column "order" uuid[];
I am using Objection.js ORM and attempting to execute the following query:
const order = [
  'BFAD6B0D-D3E6-4EB3-B3AB-108244A5DD7F'
]

Medias
  .query()
  .insert({
    order: lit(order.map(id => lit(id).castType('uuid'))).castArray()
  })
But the query is malformed and therefore does not execute:
INSERT INTO xxx ("order")
VALUES (ARRAY[
  {"_value":"BFAD6B0D-D3E6-4EB3-B3AB-108244A5DD7F","_cast":"uuid","_toJson":false,"_toArray":false}
])
As can be seen, the query contains the JSON-stringified representation of the LiteralBuilder object and not something that the SQL syntax understands as a typecast.
If I skip casting the individual UUID strings and just cast the whole column value into an array, Postgres rejects the query because the column is of type uuid[] but the values are passed as text[].
How can I format this query using Objection.js ORM?
My goal is to keep the column definition untouched and be able to insert a Postgres' array of UUIDs using Objection.js, either through its API or via raw query. If this is not currently possible with Objection, I am willing, as a last resort, to re-define the column as text[], but I would like to make sure I really have no other option.

Search a JSON array for an object containing a value matching a pattern

I have a DB with a jsonb column where each row essentially holds an array of name/value pairs. Example of a single jsonb value:
[
  {"name":"foo", "value":"bar"},
  {"name":"biz", "value":"baz"},
  {"name":"beep", "value":"boop"}
]
How would I query for rows that contain a partial value, i.e. find rows where the value key of any object matches ilike '%ba%'?
I know that I can use SELECT * FROM tbl WHERE jsoncol #> '[{"value":"bar"}]' to find rows where the JSON is that specific value, but how would I query for rows containing a pattern?
There are no built-in jsonb operators nor any indexes supporting this kind of filter directly (yet).
I suggest an EXISTS semi-join:
SELECT t.*
FROM tbl t
WHERE EXISTS (
  SELECT FROM jsonb_array_elements(t.jsoncol) elem
  WHERE elem->>'value' LIKE '%ba%'
);
It avoids redundant evaluations and the final DISTINCT step you would need to get distinct rows with a plain CROSS JOIN.
If this still isn't fast enough, a way more sophisticated specialized solution for the given type of query would be to extract a concatenated string of unique values (with a delimiter that won't interfere with your search patterns) per row in an IMMUTABLE function, build a trigram GIN index on the functional expression and use the same expression in your queries.
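The EXISTS semantics can be sketched in Python (purely illustrative, not part of the original answer): a row qualifies as soon as any array element matches, and each qualifying row is returned exactly once, so no DISTINCT step is needed.

```python
# Hypothetical rows, each holding a jsonb-style array of name/value pairs.
rows = [
    [{"name": "foo", "value": "bar"}, {"name": "beep", "value": "boop"}],
    [{"name": "biz", "value": "nope"}],
]

# WHERE EXISTS (... WHERE elem->>'value' LIKE '%ba%'): keep a row as soon as
# any element matches; no per-element duplicates are produced.
matches = [row for row in rows if any("ba" in elem["value"] for elem in row)]
```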
Related:
Search for nested values in jsonb array with greater operator
Find rows containing a key in a JSONB array of records
Create Postgres JSONB Index on Array Sub-Object
Aside: if your jsonb values really look like the example, you could trim a lot of noise and just store:
[
  {"foo":"bar"},
  {"biz":"baz"},
  {"beep":"boop"}
]
You can use the function jsonb_array_elements() in a lateral join and use its result value in the WHERE clause:
select distinct t.*
from my_table t
cross join jsonb_array_elements(jsoncol)
where value->>'value' like '%ba%'
Please read How to query jsonb arrays with IN operator for notes about distinct and performance.

Check if a jsonb field contains an array

I have a jsonb field in a PostgreSQL table which was supposed to contain dictionary-like data (i.e. {}), but a few of its entries got an array due to source data issues.
I want to weed out those entries. One way is to perform the following query:
select json_field from data_table where cast(json_field as text) like '[%]'
But this requires converting each jsonb value to text. With data_table holding on the order of 200 million rows, this looks like overkill.
I investigated pg_typeof but it returns jsonb which doesn't help differentiate between a dictionary and an array.
Is there a more efficient way to achieve the above?
How about using the jsonb_typeof() function? (The json_typeof() variant only accepts json, not jsonb.)
select json_field from data_table where jsonb_typeof(json_field) = 'array'
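In Python terms (an illustrative sketch, not part of the answer), jsonb_typeof() distinguishes the top-level structure the same way json.loads() yields a list for a JSON array and a dict for a JSON object:

```python
import json

# Hypothetical column values: two objects ("dictionaries") and one array.
values = ['{"a": 1}', '[1, 2, 3]', '{"b": {}}']

# jsonb_typeof(v) = 'array' ~ the decoded top-level value is a list.
arrays = [v for v in values if isinstance(json.loads(v), list)]

print(arrays)
```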