PostgreSQL query get all entries with many-to-many relationship with another table - sql

I have a table posts which has a many-to-many relationship with tags; the pivot table is called posts_tags.
I want to be able to retrieve all posts by a list of tag ids.
Imagine
posts
id | text
--------
1 | "foo"
2 | "bar"
3 | "baz"
posts_tags
post_id | tag_id
-----------------
1 | 1
1 | 2
1 | 3
2 | 1
3 | 1
tags
id | name
--------
1 | "foo"
2 | "bar"
3 | "baz"
With tag ids [1, 2, 3], I should get back [{id: 1, text: "foo"}].
With tag ids [1], I should get back [{id: 1, text: "foo"}, {id: 2, text: "bar"}, {id: 3, text: "baz"}].
Basically, I want to retrieve all the posts that are related to every tag in the list.

You can use a subquery to filter posts that have all the specified tags:
select json_agg(json_build_object('id', p.id, 'text', p.text))
from posts p
where (select count(*)
       from json_array_elements('[1, 2, 3]') v
       join posts_tags t on t.post_id = p.id and v.value::text::int = t.tag_id
      ) = json_array_length('[1, 2, 3]')
See fiddle.
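If the tag ids arrive as a plain list rather than a JSON array, the same "posts that have all the tags" filter can be written as a relational-division query. This is only a sketch using the table names from the question, not part of the original answer:
-- keep only posts that match all three tag ids; the literal 3 must equal
-- the number of distinct ids in the list
select p.id, p.text
from posts p
join posts_tags pt on pt.post_id = p.id
where pt.tag_id in (1, 2, 3)
group by p.id, p.text
having count(distinct pt.tag_id) = 3;
Wrapping that query in json_agg(json_build_object(...)) gives the same JSON output as above.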

Related

how to obtain objects in objects (with all the key names) in SQL?

My database contains two tables, category and sport:
category
-----------------------------------
id | title | ...
1 | summer
2 | autumn
3 | spring
4 | winter
sport
--------------------------------------------------------------
id | title | description | category_id | ...
1 | ski | ... | 4
2 | surf | ... | 1
3 | snorkeling | ... | 1
4 | running | ... | 3
5 | hiking | ... | 2
...
My question is how to obtain this result with a single SQL (PostgreSQL) query:
{
category: [
{
c.id: 1,
c.title: summer,
sport: [
{
s.id: 2,
s.title: surf,
},
{
s.id: 3,
s.title: snorkeling,
},
],
},
{
c.id: 2,
c.title: autumn,
sport: [
{
s.id: 5,
s.title: hiking
}
],
},
{
c.id: 3,
c.title: spring,
sport: [
{
s.id: 4,
s.title: running,
}
],
},
{
...
}
]
}
I've tried ARRAY_AGG, but it removes the key names, and I need them to access the values in my API.
You need to combine jsonb_agg(), to_jsonb() and jsonb_build_object():
select to_jsonb(c)||jsonb_build_object('sport', jsonb_agg(to_jsonb(s) - 'category_id'))
from category c
join sport s on s.category_id = c.id
group by c.id;
The above assumes that category.id is defined as the primary key.
It returns one row per category.
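If category.id were not declared as the primary key, PostgreSQL would reject the ungrouped whole-row reference in to_jsonb(c). A possible workaround, sketched here as an assumption rather than part of the original answer, is to group by that jsonb value directly:
-- same result without relying on a primary key: group by the category row's jsonb form
select to_jsonb(c) || jsonb_build_object('sport', jsonb_agg(to_jsonb(s) - 'category_id'))
from category c
join sport s on s.category_id = c.id
group by to_jsonb(c);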
If you really want one gigantic array of all rows, you need to aggregate in two steps:
select jsonb_build_object('category', jsonb_agg(to_jsonb(c) || sport))
from category c
join (
  select category_id,
         jsonb_build_object('sport', jsonb_agg(to_jsonb(s) - 'category_id')) as sport
  from sport s
  group by s.category_id
) s on s.category_id = c.id
Online example

How to get collection users with collection tasks inside?

User has_many tasks
Task belongs_to user
(Task has key user_id)
I'm trying to write an SQL query:
User.find_by_sql("
SELECT
users.id,
users.name,
tasks # <-- how can I get collection tasks inside each users?
FROM users
JOIN tasks
ON tasks.user_id = users.id
")
I would like to get next response
[
{
id: 1,
name: Jon,
tasks: [ # <- collection, please
{ id: 1, user_id: 1, title: ... },
{ id: 2, user_id: 1, title: ... }
]
},
{
id: 2,
name: Sofia,
tasks: [ # <- collection, please
{ id: 3, user_id: 2, title: ... },
{ id: 4, user_id: 2, title: ... }
]
}
]
With ActiveRecord, it would be User.includes(:tasks).
How can I get a collection of tasks inside each user? (SQL)
Is it possible?
To select columns from the joined table, just qualify them with the table name:
SELECT
users.id,
users.name,
tasks.id AS task_id,
tasks.name AS task_name
FROM
users
LEFT OUTER JOIN
tasks ON tasks.user_id = users.id
Since SQL is tabular, the result is a list of rows and columns:
id | name | task_id | task_name
----+----------+---------+-----------
2 | Xzbdulia | 1 | Cleaning
2 | Xzbdulia | 2 | Laundry
2 | Xzbdulia | 3 | Coding
3 | Elly | 4 | Cleaning
3 | Elly | 5 | Laundry
3 | Elly | 6 | Coding
4 | Lewis | 7 | Cleaning
4 | Lewis | 8 | Laundry
4 | Lewis | 9 | Coding
5 | Hang | 10 | Cleaning
5 | Hang | 11 | Laundry
5 | Hang | 12 | Coding
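If the database is PostgreSQL, the nesting the question asks for can also be produced in SQL itself with json_agg. This is a sketch, not part of the original answer, and it assumes the task column is called name as in the query above:
-- one row per user, with the tasks collected into a JSON array
-- ('[]' for users that have no tasks)
SELECT
  users.id,
  users.name,
  COALESCE(
    json_agg(json_build_object('id', tasks.id, 'name', tasks.name))
      FILTER (WHERE tasks.id IS NOT NULL),
    '[]'
  ) AS tasks
FROM users
LEFT OUTER JOIN tasks ON tasks.user_id = users.id
GROUP BY users.id, users.name;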
If you really want to rough it, run this as a raw SQL query, and process the results yourself, you can do:
sql = <<~SQL
SELECT
users.id,
users.name,
tasks.id AS task_id,
tasks.name AS task_name
FROM
users
LEFT OUTER JOIN
tasks ON tasks.user_id = users.id;
SQL
results = User.connection.execute(sql).each_with_object({}) do |row, memo|
id = row["id"]
memo[id] ||= {
id: id,
name: row["name"],
tasks: []
}
memo[id][:tasks].push({
id: row["task_id"],
name: row["task_name"]
}) unless row["task_id"].nil?
end.values
[{:id=>2, :name=>"Xzbdulia", :tasks=>[{:id=>1, :name=>"Cleaning"}, {:id=>2, :name=>"Laundry"}, {:id=>3, :name=>"Coding"}]}, {:id=>3, :name=>"Elly", :tasks=>[{:id=>4, :name=>"Cleaning"}, {:id=>5, :name=>"Laundry"}, {:id=>6, :name=>"Coding"}]}, {:id=>4, :name=>"Lewis", :tasks=>[{:id=>7, :name=>"Cleaning"}, {:id=>8, :name=>"Laundry"}, {:id=>9, :name=>"Coding"}]}, {:id=>5, :name=>"Hang", :tasks=>[{:id=>10, :name=>"Cleaning"}, {:id=>11, :name=>"Laundry"}, {:id=>12, :name=>"Coding"}]}]
But what's the point when you can just use ActiveRecord and ActiveModel::Serializers::JSON and do this in two lines:
@users = User.includes(:tasks).select(:id, :name)
  .as_json(only: [:id, :name], include: { tasks: { only: [:id, :name] } })
Yeah, it does two queries and selects a few more columns on tasks, but I doubt there will be any really noticeable performance difference, plus it's a lot more maintainable. There are better things to spend your time on, like tests.

SQLite convert String of values delimited by commas into a List

I have a field called 'languageids' which can contain multiple ids, e.g. 0,1,2,3, or a single value, e.g. 2. I need to be able to select movies that have a particular language ID. I am limited to the functions available within SQLite, and I am not allowed to restructure the existing database or modify the entries.
Structure
+----+-------+-------+-------------+
| id | title | genre | languageids |
+----+-------+-------+-------------+
Example Data
+----+-----------+-----------+--------------+
| id | title | genre | languageids |
+----+-----------+-----------+--------------+
| 1 | "Movie 1" | "Action" | "1,2,5,8,10" |
+----+-----------+-----------+--------------+
| 2 | "Movie 2" | "Romance" | "2,4" |
+----+-----------+-----------+--------------+
| 3 | "Movie 3" | "Comedy" | "3,8,21" |
+----+-----------+-----------+--------------+
If I were to fetch all movies with the language id 2, I should get back the movies with IDs 1 and 2 but not 3. Is there any way to split this string up into a list so that I can do "languageId IN languageids"? If not, is there an alternative solution?
You can try this:
select * from tablename where ',' || languageids || ',' like '%,2,%'
It will give you all records containing language id 2. A self-contained demonstration:
with t (id, title, genre, languageids)
as (
  select 1, 'Movie 1', 'Action', '1,2,5,8,10' union all
  select 2, 'Movie 2', 'Romance', '2,4' union all
  select 3, 'Movie 3', 'Comedy', '3,8,21'
)
-- note: in SQLite, string concatenation is ||, not +
select * from t where (',' || languageids || ',') like '%,2,%'
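If the language id comes from application code rather than being hard-coded, the same comma-wrapping trick can take a bound parameter. A sketch, keeping the generic tablename from the answer above:
-- ? is the bound language id; wrapping both sides in commas prevents
-- id 12 from matching a search for id 2
select *
from tablename
where ',' || languageids || ',' like '%,' || ? || ',%';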

PostgreSQL json_array_elements with array indexes (keys)

Simple query works fine:
SELECT json_array_elements_text('["first", "third", "second"]'::json)
But I also want to retrieve array keys somehow so the output would be like:
key value
0 first
1 third
2 second
UPD
Seems like row_number() is a p̶r̶o̶p̶e̶r̶ solution, but I cannot figure out how to use it further.
Let's say I have a 'posts' table; each post contains an array of related comments in JSON format:
SELECT id, title, comments FROM posts
id title comments
1 Title 1 ["comment 1", "comment 2"]
2 Title 2 ["comment 3", "comment 4", "comment 5"]
3 Title 3 ["comment 6"]
The goal is to expand not only the comment values but also their keys:
Tricky SQL here
id title comment key
1 Title 1 comment 1 0
1 Title 1 comment 2 1
2 Title 2 comment 3 0
2 Title 2 comment 4 1
2 Title 2 comment 5 2
3 Title 3 comment 6 0
UPD2
Solution using row_number():
SELECT *, row_number() OVER (PARTITION BY id) - 1 AS key
FROM (
SELECT id, title, json_array_elements_text(comments::json) AS comment
FROM posts
) p
Thanks in advance!
Use the function json_array_elements_text() with ordinality:
with my_table(id, title, comments) as (
values
(1, 'Title 1', '["comment 1", "comment 2"]'::json),
(2, 'Title 2', '["comment 3", "comment 4", "comment 5"]'),
(3, 'Title 3', '["comment 6"]')
)
select id, title, value as comment, ordinality - 1 as key
from my_table
cross join json_array_elements_text(comments) with ordinality
id | title | comment | key
----+---------+-----------+-----
1 | Title 1 | comment 1 | 0
1 | Title 1 | comment 2 | 1
2 | Title 2 | comment 3 | 0
2 | Title 2 | comment 4 | 1
2 | Title 2 | comment 5 | 2
3 | Title 3 | comment 6 | 0
(6 rows)
From the documentation:
If the WITH ORDINALITY clause is specified, an additional column of type bigint will be added to the function result columns. This column numbers the rows of the function result set, starting from 1.
JSON arrays don't have keys; however, you can get the desired output by using row_number():
SELECT row_number() OVER () - 1 AS key, *
FROM json_array_elements_text('["first", "third", "second"]'::json) q
sqlfiddle
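WITH ORDINALITY also works directly on the standalone array from the original question, which avoids the window function entirely. A small sketch:
-- ordinality starts at 1, so subtract 1 for zero-based keys
select ordinality - 1 as key, value
from json_array_elements_text('["first", "third", "second"]'::json) with ordinality;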

Unique values in Sequelize

I use the Sequelize ORM for Node.js. I need to get the rows with unique values in a certain column. Example table:
id | name | group
-----------------
1 | One | 2
2 | Two | 1
3 | Three| 2
4 | Four | 3
5 | Five | 1
Query for column group and result:
id | name | group
-----------------
1 | One | 2
2 | Two | 1
4 | Four | 3
Rows One, Two and Four were the first to have unique group values. How can I do this in Sequelize?
A Sequelize raw query is one way of getting out the rows that you want:
// Raw SQL: for each groupnum, keep the row with the smallest id.
var sql = `
  SELECT r.id,
         r.name,
         r.groupnum
  FROM s04.row r
  JOIN (SELECT min(id) AS id,
               groupnum
        FROM s04.row
        GROUP BY groupnum) s
    ON r.id = s.id`;

return sq.query(sql, { type: sq.QueryTypes.SELECT });
The resulting promise will resolve to a JSON array:
[
{
"id": 1,
"name": "One",
"groupnum": 2
}
...
]
If you then need to work with these rows as Instances, you can call build on each element of the array:
Model.build({ /* attributes-hash */ }, { isNewRecord: false })
See here for an example. If I find a way of doing this via Sequelize function calls (aggregate, find*, etc.) that isn't too hideous, I'll also post that here as a separate answer.
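If the underlying database is PostgreSQL (the s04 schema in the raw query suggests it), the same "first row per group" result can also be obtained with DISTINCT ON and passed to sq.query in exactly the same way. A sketch using the table and column names from the query above, not part of the original answer:
-- one row per groupnum, keeping the row with the smallest id
SELECT DISTINCT ON (groupnum) id, name, groupnum
FROM s04.row
ORDER BY groupnum, id;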