I have a habit of writing queries that return JSON structures directly from the PostgreSQL query,
-- Something like this...
-- The function
CREATE OR REPLACE FUNCTION get_data(_team_id UUID) RETURNS JSON AS
$$
DECLARE
_output JSON;
BEGIN
SELECT ROW_TO_JSON(rec)
INTO _output
FROM (SELECT COUNT(*) AS total,
             (SELECT ARRAY_TO_JSON(ARRAY_AGG(ROW_TO_JSON(a)))
              FROM (SELECT id,
                           name,
                           (SELECT ARRAY_TO_JSON(ARRAY_AGG(ROW_TO_JSON(b))) AS emails
                            FROM (SELECT email
                                  FROM emails
                                  WHERE user_id = users.id) b)
                    FROM users
                    WHERE active IS TRUE
                      AND team_id = _team_id
                    ORDER BY name
                    LIMIT 5 OFFSET 0) a) AS data
      FROM users
      WHERE active IS TRUE
        AND team_id = _team_id) rec;
RETURN _output;
END;
$$ LANGUAGE plpgsql;
-- The query
SELECT get_data('ee0a7ea0-3888-476b-810e-de93a58aa6f6') AS data;
This gives me the following structure for my JavaScript web application:
{
"total": 100,
"data": [
{ "id": 1, "name": "User 1", "emails": [{"email": "email1"}, {"email": "email2"}] },
{ "id": 2, "name": "User 2", "emails": [{"email": "email1"}, {"email": "email2"}] },
{ "id": 3, "name": "User 3", "emails": [{"email": "email1"}, {"email": "email2"}] },
{ "id": 4, "name": "User 4", "emails": [{"email": "email1"}, {"email": "email2"}] },
{ "id": 5, "name": "User 5", "emails": [{"email": "email1"}, {"email": "email2"}] }
]
}
I'm fairly new to writing queries like this, but building the structure directly in the query seems to save a lot of time compared to assembling the view models in JavaScript.
But I have some hesitations. Is this the best way to approach it, or not?
I have searched the internet and found nothing about this approach.
Rather than writing these SQL queries by hand, the best approach is to let an ORM such as TypeORM (TypeScript), JPA (Java, implemented by Hibernate and EclipseLink), or Eloquent (PHP) generate them for you.
With TypeORM and a matching entity model, your code would be:
const users = await entityManager.getRepository(User)
.createQueryBuilder('user')
.where('user.active = true')
.andWhere('user.team_id = :teamId', {teamId: 1})
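    // getManyAndCount() returns [entities, totalCount], i.e. the "data" and "total" from above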
.getManyAndCount();
This is much cleaner than the SQL query you have written, and it allows for code reuse.
That way, if you change the type of a column, or if you want a custom mapping (Postgres dates, for example), you can handle it in the ORM once and don't have to copy and paste the code into every query.
Related
I am working on a sensitive migration. The scenario is as follows:
I have a new table that I need to populate with data
There is an existing table with a column (type = json) that contains an array of objects such as:
[
{
"id": 0,
"name": "custom-field-0",
"label": "When is the deadline for a response?",
"type": "Date",
"options": "",
"value": "2020-10-02",
"index": 1
},
{
"id": 1,
"name": "custom-field-1",
"label": "What territory does this relate to?",
"type": "Dropdown",
"options": "UK, DE, SE, DK, BE, NL, IT, FR, ES, AT, CH, NO, US, SG, Other",
"value": " DE",
"index": 2
}
]
I need to essentially map these values in this column to my new table. I have worked with JSON data in PostgreSQL before, where I was dealing with a single object in the JSON, but never with arrays of objects and on such a large scale.
So, just to summarise: how do I iterate over every row, and every object in each array, and insert that data into a new table?
EDIT
I have been experimenting with some functions, and I found two that seem promising: json_array_elements_text and json_array_elements. These allowed me to add multiple rows to the new table from the array of objects.
However, my issue is that I need to map certain values to the new table.
INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT <<HERE IS WHERE I NEED TO EXTRACT VALUES FROM THE JSON ARRAY>>, task.form, task.workspace
FROM task;
EDIT 2
I have been playing around some more with the above functions, but have run into a slight issue.
INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT cf ->> 'name',
(cf ->> 'label')
...
FROM jsonb_array_elements(task."customFields") AS t(cf);
My issue lies in the FROM clause: customFields is the array of objects, but I also need to get the form and workspace attributes from this table too. Plus I am pretty sure that the FROM clause would not work anyway, as it will probably complain about task."customFields" not being specified or something.
Here is a SELECT statement that uses json_array_elements and a lateral join in the FROM clause to flatten the data.
select j ->> 'name' as "name", j ->> 'label' as "label",
j ->> 'type' as "inputType", j ->> 'options' as "options", form, workspace
from task
cross join lateral json_array_elements("customFields") as l(j);
The FROM clause can be less verbose:
from task, json_array_elements("customFields") as l(j)
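Putting the two pieces together, the complete INSERT might look like the sketch below; the target column list and the "customFields" column are taken from the question, so adjust the names to your actual schema.
-- Sketch: flatten each element of task."customFields" into one row of form_field_value
INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT j ->> 'name',
       j ->> 'label',
       j ->> 'type',
       j ->> 'options',
       t.form,
       t.workspace
FROM task t
CROSS JOIN LATERAL json_array_elements(t."customFields") AS l(j);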
You can try to use json_to_recordset:
select * from json_to_recordset('
[
{
"id": 0,
"name": "custom-field-0",
"label": "When is the deadline for a response?",
"type": "Date",
"options": "",
"value": "2020-10-02",
"index": 1
},
{
"id": 1,
"name": "custom-field-1",
"label": "What territory does this relate to?",
"type": "Dropdown",
"options": "UK, DE, SE, DK, BE, NL, IT, FR, ES, AT, CH, NO, US, SG, Other",
"value": " DE",
"index": 2
}
]
') as x(id int, name text, label text, type text, options text, value text, index int);
For the insert you can use SQL like this:
INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT x.name, x.label, x.type, x.options, task.form, task.workspace
FROM
  task,
  json_to_recordset(task."customFields") AS
    x (id int, name text, label text, type text, options text, value text, index int);
With a simple query:
SELECT * FROM some_table;
I can get the following result:
{
"sku0": {
"Id": "18418",
"Desc": "yes"
},
"sku1": {
"Id": "17636",
"Desc": "no"
},
"sku2": {
"Id": "206714",
"Desc": "yes"
},
"brand": "abc",
"displayName": "something"
}
First, the number of skus is not fixed. It may be sku0, sku1, sku2, sku3, sku4 ... but they all start with sku.
Then I want to find the entry whose Id is 17636 and determine whether its Desc value is yes or no. After reading the PostgreSQL JSON Functions and Operators documentation, I sadly didn't find a good way to do this.
I could convert the result into a Python dictionary and then easily achieve my requirements with Python's methods.
If the requirements can also be achieved with PostgreSQL statements, is that method more recommended than the Python dictionary?
I am not sure I completely understand what result you want. But if you want to filter on the Id, you need to unnest all the elements inside the JSON column:
select d.v ->> 'Desc' as description
from the_table t
cross join jsonb_each(t.data) as d(k,v)
where d.v ->> 'Id' = '17636'
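If all you need is a yes/no answer for that Id, you can wrap the same unnesting in EXISTS. A sketch, using the same assumed table and column names as above:
SELECT EXISTS (
   SELECT 1
   FROM the_table t
   CROSS JOIN jsonb_each(t.data) AS d(k, v)
   WHERE d.k LIKE 'sku%'            -- only the skuN keys hold Id/Desc objects
     AND d.v ->> 'Id' = '17636'
     AND d.v ->> 'Desc' = 'yes'
) AS desc_is_yes;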
You could use the new jsonpath notation of PostgreSQL v12:
SELECT data @@ '$.* ? (@.Id == "17636").Desc == "yes"'
FROM some_table;
That will start with the root of data ($), find any attribute in it (*), filter only those attributes that contain an Id with value "17636", get their Desc attribute and return TRUE only if that attribute is "yes".
Nice, isn't it?
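If you want the Desc value itself rather than just a boolean, a jsonb_path_query variant (also v12+, same assumed some_table and data column; cast with data::jsonb if the column is of type json) could look like this:
SELECT jsonb_path_query(data, '$.* ? (@.Id == "17636").Desc') AS description
FROM some_table;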
This will probably give you what you need.
select value->>'Desc' from jsonb_each('{
"sku0": {
"Id": "18418",
"Desc": "yes"
},
"sku1": {
"Id": "17636",
"Desc": "no"
},
"sku2": {
"Id": "206714",
"Desc": "yes"
},
"brand": "abc",
"displayName": "something"
}'::jsonb)
where key like 'sku%'
and value->>'Id'='17636'
Best regards,
Bjarni
Is it possible to store SQL queries in a JSON array of objects? Because when I have something like this:
[{
"id": "1",
"query": "SELECT ID FROM table"
},
{
"id": "2",
"query": "SELECT ID FROM table"
},
{
"id": "3",
"query": "SELECT USER FROM table"
}
]
the JSON file in VSCode is OK with no errors, but it gets nasty when I want to store complex queries with joins, etc.
For example, even if I format the following query correctly, the JSON file will show a formatting error
(just an example, I know it is not valid):
SELECT user, id, , count(price) as numrev
FROM price
where id = 1 and user = 0
group by user, id, price
saying that it can't be stored in a string.
It is fairly easy to do, but requires one extra step.
Simply convert/encode your raw SQL queries as base64 text.
Decode the text before you execute the queries in your code.
If the JSON file is created automatically by a program, almost all programming languages provide base64 encode/decode functions in their core library; if not, you can download a compatible package/library to achieve this automation.
var queries = [{
"id": "1",
"query": "U0VMRUNUIElEIEZST00gdGFibGU="
},
{
"id": "2",
"query": "U0VMRUNUIElEIEZST00gdGFibGU="
},
{
"id": "3",
"query": "U0VMRUNUIFVTRVIgRlJPTSB0YWJsZQ=="
},
{
"id": "4",
"query": "U0VMRUNUIHVzZXIsIGlkLCAsIGNvdW50KHByaWNlKSBhcyBudW1yZXYKICBGUk9NIHByaWNlCiAgd2hlcmUgaWQgPSAxIGFuZCB1c2VyID0gMCAKICBncm91cCBieSB1c2VyLCBpZCwgcHJpY2U="
}
];
for (let i = 0; i < queries.length; i++) {
console.log("id = " + queries[i].id + ", query = " + atob(queries[i].query));
}
When you parse your JSON array, make sure to decode the queries before you execute them.
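If you prefer to do the decoding on the database side, Postgres can handle base64 as well. A sketch, assuming the array is available as a jsonb value like the one above:
-- Decode each base64-encoded "query" string back into plain SQL text
SELECT elem ->> 'id' AS id,
       convert_from(decode(elem ->> 'query', 'base64'), 'UTF8') AS query
FROM jsonb_array_elements('[{"id": "1", "query": "U0VMRUNUIElEIEZST00gdGFibGU="}]'::jsonb) AS elem;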
Let me know if this one helped you. ☺
FYI, see http://www.utilities-online.info/base64/
I'm trying to merge some nested JSON arrays without looking at the id. Currently I'm getting this when I make a GET request to /surveyresponses:
{
"surveys": [
{
"id": 1,
"name": "survey 1",
"isGuest": true,
"house_id": 1
},
{
"id": 2,
"name": "survey 2",
"isGuest": false,
"house_id": 1
},
{
"id": 3,
"name": "survey 3",
"isGuest": true,
"house_id": 2
}
],
"responses": [
{
"question": "what is this anyways?",
"answer": "test 1"
},
{
"question": "why?",
"answer": "test 2"
},
{
"question": "testy?",
"answer": "test 3"
}
]
}
But I would like each survey to have its own questions and answers, something like this:
{
"surveys": [
{
"id": 1,
"name": "survey 1",
"isGuest": true,
"house_id": 1
"question": "what is this anyways?",
"answer": "test 1"
}
]
}
Because I'm not going to a specific id, I'm not sure how to make the relationship work. This is the current query that's producing those results.
export function getSurveyResponse(id: number): QueryBuilder {
return db('surveys')
.join('questions', 'questions.survey_id', '=', 'surveys.id')
.join('questionAnswers', 'questionAnswers.question_id', '=', 'questions.id')
    .select('surveys.name', 'questions.question', 'questionAnswers.answer')
.where({ survey_id: id, question_id: id })
}
Assuming jsonb in current Postgres 10 or 11, this query does the job:
SELECT t.data, to_jsonb(s) AS new_data
FROM tbl t
LEFT JOIN LATERAL (
SELECT jsonb_agg(s || r) AS surveys
FROM (
SELECT jsonb_array_elements(t.data->'surveys') s
, jsonb_array_elements(t.data->'responses') r
) sub
) s ON true;
I unnest both nested JSON arrays in parallel to get the desired behavior of "zipping" both directly. The number of elements in both nested JSON arrays has to match or you need to do more (else you lose data).
This builds on implementation details of how Postgres deals with multiple set-returning functions in a SELECT list to make it short and fast. See:
What is the expected behaviour for multiple set-returning functions in select clause?
One could be more explicit with a ROWS FROM expression, which works properly since Postgres 9.4:
SELECT t.data
, to_jsonb(s) AS new_data
FROM tbl t
LEFT JOIN LATERAL (
SELECT jsonb_agg(s || r) AS surveys
FROM ROWS FROM (jsonb_array_elements(t.data->'surveys')
, jsonb_array_elements(t.data->'responses')) sub(s,r)
) s ON true;
The manual about combining multiple table functions.
Or you could use WITH ORDINALITY to get original order of elements and combine as you wish:
PostgreSQL unnest() with element number
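A sketch of that WITH ORDINALITY variant, using the same assumed tbl and data layout as above; it pairs elements by position and keeps unmatched elements from either array:
SELECT t.data
     , to_jsonb(x) AS new_data
FROM tbl t
LEFT JOIN LATERAL (
   SELECT jsonb_agg(coalesce(s.elem, '{}') || coalesce(r.elem, '{}') ORDER BY ord) AS surveys
   FROM jsonb_array_elements(t.data->'surveys')   WITH ORDINALITY AS s(elem, ord)
   FULL JOIN jsonb_array_elements(t.data->'responses') WITH ORDINALITY AS r(elem, ord)
        USING (ord)
) x ON true;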
I am trying to build a query which combines rows of one table into a JSON array; I then want that array to be part of the returned row.
I know how to do a simple query like
SELECT *
FROM public.template
WHERE id=1
And I have worked out how to produce the JSON array that I want
SELECT array_to_json(array_agg(to_json(fields)))
FROM (
SELECT id, name, format, data
FROM public.field
WHERE template_id = 1
) fields
However, I cannot work out how to combine the two so that the result is a number of fields from public.template, with the output of the second query as one of the returned fields.
I am using PostgreSQL 9.6.6.
Edit: as requested, here is more information: definitions of the field and template tables and a sample of each query's output.
Currently, I have a JSONB column on the template table which I am using to store an array of fields, but I want to move the fields to their own table so that I can more easily enforce a schema on them.
Template table contains:
id
name
data
organisation_id
But I would like to remove data and replace it with the field table, which contains the following (a sketch of both tables follows this list):
id
name
format
data
template_id
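For reference, a minimal sketch of the two tables in the target state described above (the JSONB data column removed from template); the column types are assumptions, not the actual DDL:
-- Hypothetical definitions matching the column lists above
CREATE TABLE template (
    id              serial PRIMARY KEY,
    name            text,
    organisation_id integer
);

CREATE TABLE field (
    id          serial PRIMARY KEY,
    name        text,
    format      text,
    data        jsonb,
    template_id integer REFERENCES template (id)
);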
At the moment the output of the first query is:
{
"id": 1,
"name": "Test Template",
"data": [
{
"id": "1",
"data": null,
"name": "Assigned User",
"format": "String"
},
{
"id": "2",
"data": null,
"name": "Office",
"format": "String"
},
{
"id": "3",
"data": null,
"name": "Department",
"format": "String"
}
],
"id_organisation": 1
}
This output is what I would like to recreate using one query and both tables. The second query outputs this, but I do not know how to merge it into a single query:
[{
"id": 1,
"name": "Assigned User",
"format": "String",
"data": null
},{
"id": 2,
"name": "Office",
"format": "String",
"data": null
},{
"id": 3,
"name": "Department",
"format": "String",
"data": null
}]
The feature you're looking for is JSON concatenation. You can do that by using the || operator, which has been available since PostgreSQL 9.5.
SELECT to_jsonb(template.*)
       || jsonb_build_object('data', (SELECT jsonb_agg(to_jsonb(field))
                                      FROM field
                                      WHERE field.template_id = template.id))
FROM template;
Sorry for poorly phrasing what I was trying to achieve. After hours of Googling I have worked it out, and it was a lot simpler than I thought in my ignorance.
SELECT template.id, template.name, data.data
FROM public.template,
     (SELECT array_to_json(array_agg(to_json(fields))) AS data
      FROM (SELECT id, name, format, data
            FROM public.field
            WHERE template_id = 1) fields
     ) AS data
WHERE template.id = 1;
I wanted the result of the subquery to be a column in the output rather than compiling the entire output table as JSON.
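For completeness, the same idea can also be written with the aggregate as a correlated subquery in the select list, which avoids hard-coding the template id twice. A sketch, assuming the tables described above:
SELECT t.id,
       t.name,
       (SELECT array_to_json(array_agg(to_json(f)))
        FROM (SELECT id, name, format, data
              FROM public.field
              WHERE template_id = t.id) f) AS data
FROM public.template t
WHERE t.id = 1;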