Store SELECT output to 'json' type field

I would like to store some rows of data from the database as JSON (in a json type field) for backup purposes, before a transaction is launched.
Something like:
INSERT INTO public.backup (user_id, data) VALUES (1, (SELECT * FROM ...))
Is it possible to do this simply, without parsing the SELECT output and converting it to JSON in my application?

You can convert whole rows to json with row_to_json():
INSERT INTO public.backup (user_id, data)
SELECT 1, row_to_json(t)
FROM tbl t
WHERE ...; -- select some rows
It's not as simple to preserve column names if the source is a query rather than a plain table. See:
Return multiple columns of the same row as JSON array of objects
In Postgres 9.4 or later, consider the data type jsonb for your data column. The same query works; the json result is cast to jsonb automatically on assignment.
How do I query using fields inside the new PostgreSQL JSON datatype?
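A minimal end-to-end sketch (the backup table definition and the WHERE filter are made up for illustration):
CREATE TABLE public.backup (
   user_id int
 , data    jsonb  -- jsonb for Postgres 9.4+
);

-- row_to_json() produces json; it is coerced to jsonb
-- automatically when assigned to the jsonb column.
INSERT INTO public.backup (user_id, data)
SELECT 1, row_to_json(t)
FROM   tbl t
WHERE  t.user_id = 1;  -- hypothetical filter: select some rows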

Related

Insert an array of UUIDs using Objection.js

I am attempting to insert a new row into a table with a column defined as an array of UUIDs:
alter table medias add column "order" uuid[];
I am using Objection.js ORM and attempting to execute the following query:
const order = [
  'BFAD6B0D-D3E6-4EB3-B3AB-108244A5DD7F'
]
Medias
  .query()
  .insert({
    order: lit(order.map(id => lit(id).castType('uuid'))).castArray()
  })
But the query is malformed and therefore does not execute:
INSERT INTO xxx ("order")
VALUES (ARRAY [
{"_value":"BFAD6B0D-D3E6-4EB3-B3AB-108244A5DD7F","_cast":"uuid","_toJson":false,"_toArray":false}
])
As can be seen, the query contains the JSON-stringified representation of the LiteralBuilder object and not something that the SQL syntax understands as a typecast.
If I skip casting the individual UUID strings and just cast the whole column into an array, then Postgres rejects the query because the column is of type uuid[] but I am attempting to insert a text[] value.
How can I format this query using the Objection.js ORM?
My goal is to keep the column definition untouched and be able to insert a Postgres array of UUIDs using Objection.js, either through its API or via a raw query. If this is not currently possible with Objection, I am willing, as a last resort, to re-define the column as text[], but I would like to make sure I really have no other option.
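For reference, the SQL that Postgres does accept for such an insert is straightforward; a sketch against the medias table from the question (either form works):
-- ARRAY constructor cast to uuid[] ...
INSERT INTO medias ("order")
VALUES (ARRAY['BFAD6B0D-D3E6-4EB3-B3AB-108244A5DD7F']::uuid[]);

-- ... or a Postgres array literal with the same cast.
INSERT INTO medias ("order")
VALUES ('{BFAD6B0D-D3E6-4EB3-B3AB-108244A5DD7F}'::uuid[]);
Whatever the ORM emits has to reduce to one of these forms; the LiteralBuilder output above is serialized JSON instead, which is why the query is malformed.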

Check if a jsonb field contains an array

I have a jsonb field in a PostgreSQL table which was supposed to contain dictionary-like data (a JSON object, {}), but a few of its entries got an array due to source data issues.
I want to weed out those entries. One way is to perform the following query:
select json_field from data_table where cast(json_field as text) like '[%]'
But this requires converting each jsonb field to text. With data_table holding on the order of 200 million entries, this looks like a bit of overkill.
I investigated pg_typeof but it returns jsonb, which doesn't help differentiate between a dictionary and an array.
Is there a more efficient way to achieve the above?
How about using the jsonb_typeof function? (The field is jsonb, so the jsonb variant is needed; json_typeof only accepts json.)
select json_field from data_table where jsonb_typeof(json_field) = 'array'
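If the goal is to actually weed those entries out, the same predicate works in a DELETE; a sketch, assuming the offending rows can simply be dropped:
DELETE FROM data_table
WHERE  jsonb_typeof(json_field) = 'array';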

PostgreSQL array in form type: value

Is there any way to create an array in PostgreSQL which contains multiple data types in form type:value?
For example, one of the table records should be an array with values height:190, color:black etc.
If it isn't possible with arrays, how could I manage this another way?
https://www.postgresql.org/docs/current/static/hstore.html
This module implements the hstore data type for storing sets of
key/value pairs within a single PostgreSQL value
t=# select ('height=>190, color=>black')::hstore;
hstore
-----------------------------------
"color"=>"black", "height"=>"190"
(1 row)
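Individual values can then be fetched with hstore's -> operator, which returns the value as text:
t=# select ('height=>190, color=>black'::hstore)->'height';
?column?
----------
190
(1 row)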
https://www.postgresql.org/docs/current/static/datatype-json.html
JSON data types are for storing JSON (JavaScript Object Notation)
data, as specified in RFC 7159. Such data can also be stored as text,
but the JSON data types have the advantage of enforcing that each
stored value is valid according to the JSON rules.
t=# select '{"height":190, "color":"black"}'::json;
json
---------------------------------
{"height":190, "color":"black"}
(1 row)
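Likewise, json's ->> operator extracts a single value as text:
t=# select ('{"height":190, "color":"black"}'::json)->>'height';
?column?
----------
190
(1 row)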

Select all existing json fields from a postgres table

In my table mytable I have a json field called data, into which I have inserted JSON with a lot of keys & values.
I know that it's possible to select individual fields like so:
SELECT data->'mykey' as mykey from mytable
But how can I get an overview of all of the json keys at a certain depth? I would have expected something like
SELECT data->* from mytable
but that doesn't work. Is there something similar?
You can use the json_object_keys() function to get all the top-level keys of a json value:
SELECT keys.*
FROM mytable, json_object_keys(mytable.data) AS keys (mykey);
If you want to search at a deeper level, then first extract that deeper level from the json value using the #> operator:
SELECT keys.*
FROM mytable, json_object_keys(mytable.data #> '{level1, level2}') AS keys (mykey);
Note that the function returns a set of text, so you should invoke the function as a row source.
If you are using the jsonb data type, then use the jsonb_object_keys() function.
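For example, to list each distinct top-level key once across the whole table (reusing the mytable and data names from the question):
SELECT DISTINCT keys.mykey
FROM   mytable, json_object_keys(mytable.data) AS keys(mykey)
ORDER  BY 1;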

Can I get an average of values in a json array using Postgres?

One of the great things about postgres is that it allows indexing into a json object.
I have a column of data formatted a little bit like this:
{"Items":
[
{"RetailPrice":6.1,"EffectivePrice":0,"Multiplier":1,"ItemId":"53636"},
{"RetailPrice":0.47,"EffectivePrice":0,"Multiplier":1,"ItemId":"53404"}
]
}
What I'd like to do is find the average RetailPrice of each row with these data.
Something like
select avg(json_extract_path_text(item_json, 'RetailPrice'))
but really I need to do this for each item in the Items json array. So for this example, the queried value would be (6.1 + 0.47) / 2 = 3.285.
How can I do this?
Could work like this:
WITH cte(tbl_id, json_items) AS (
   SELECT 1
        , '{"Items": [
            {"RetailPrice":6.1,"EffectivePrice":0,"Multiplier":1,"ItemId":"53636"}
           ,{"RetailPrice":0.47,"EffectivePrice":0,"Multiplier":1,"ItemId":"53404"}]}'::json
   )
SELECT tbl_id, round(avg((elem->>'RetailPrice')::numeric), 3) AS avg_retail_price
FROM   cte c
     , json_array_elements(c.json_items->'Items') elem
GROUP  BY 1;
The CTE just substitutes for a table like:
CREATE TABLE tbl (
   tbl_id     serial PRIMARY KEY
 , json_items json
);
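Run against such a table instead of the CTE, the query reads the same (a sketch, assuming the table definition above):
SELECT tbl_id, round(avg((elem->>'RetailPrice')::numeric), 3) AS avg_retail_price
FROM   tbl
     , json_array_elements(json_items->'Items') elem
GROUP  BY 1;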
json_array_elements() (Postgres 9.3+), which unnests the json array, is instrumental here.
I am using an implicit JOIN LATERAL here. Much like in this related example:
Query for element of array in JSON column
For an index to support this kind of query consider this related answer:
Index for finding an element in a JSON array
For details on how to best store EAV data:
Is there a name for this database structure?