PostgreSQL dynamic table access - sql

I have a products schema and some tables there.
Each table in the products schema has an id, and by this id I can get the table name, e.g.
products
\ product1
\ product2
\ product3
I need to select info from the appropriate product table via dynamic access, e.g.
SELECT * FROM 'products.'(SELECT id from categories WHERE id = 7);
Of course, this doesn't work...
How can I do something like that in PostgreSQL?

OK, I found a solution:
CREATE OR REPLACE FUNCTION getProductById(cid int) RETURNS RECORD AS $$
DECLARE
    result RECORD;
BEGIN
    EXECUTE 'SELECT * FROM ' || (
        SELECT ('products.' || (SELECT category_name FROM category WHERE category_id = cid) || '_view')::regclass
    ) INTO result;
    RETURN result;
END;
$$ LANGUAGE plpgsql;
and to select:
SELECT * FROM getProductById(7) AS b (category_id int, ... );
works for PostgreSQL 9.x

If you can change your database layout to use partitioning instead, that would probably be the way to go. Then you can just access the "master" table as if it were one table rather than multiple subtables.
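For example, a minimal sketch with declarative list partitioning (PostgreSQL 10+; on older versions the same idea is built with table inheritance; all names here are illustrative):
CREATE TABLE products.product (
    id          bigint,
    category_id int NOT NULL,
    name        text
) PARTITION BY LIST (category_id);

CREATE TABLE products.product_cat7
    PARTITION OF products.product FOR VALUES IN (7);

-- Queries go against the parent ("master") table and are routed to the matching partition:
SELECT * FROM products.product WHERE category_id = 7;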
You could create a view that combines the tables with an extra column corresponding to the table it's from. If all your queries specify a value for this extra column, the planner should be smart enough to skip scanning all the rest of the tables.
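A sketch of such a combined view, assuming the product tables share the same column layout (names are illustrative):
CREATE VIEW products.all_products AS
SELECT 1 AS category_id, p.* FROM products.product1 p
UNION ALL
SELECT 2 AS category_id, p.* FROM products.product2 p
UNION ALL
SELECT 3 AS category_id, p.* FROM products.product3 p;

-- With a constant filter the planner can usually discard the branches that cannot match:
SELECT * FROM products.all_products WHERE category_id = 2;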
Or you could write a function in PL/pgSQL, using the EXECUTE command to construct the appropriate query after fetching the table name. The function can even return a set so it can be used in the FROM clause just as you would a table reference. Or you could just do the same query construction in your application logic.
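For instance, a minimal set-returning sketch, assuming a categories table with id and category_name columns and product tables that all share the same layout (names are illustrative):
CREATE OR REPLACE FUNCTION get_products_by_category(cid int)
RETURNS SETOF record AS $$
BEGIN
    RETURN QUERY EXECUTE format(
        'SELECT * FROM products.%I',
        (SELECT category_name FROM categories WHERE id = cid));
END;
$$ LANGUAGE plpgsql;

-- As with any function returning record, it is called with a column definition list:
SELECT * FROM get_products_by_category(7) AS p (id int, name text, price numeric);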

To me, it sounds like you've a major schema design problem: shouldn't you only have one products table with a category_id in it?
Might you be maintaining the website mentioned in this article?
http://thedailywtf.com/Articles/Confessions-The-Shopping-Cart.aspx

Related

Convert JSONB Keys to Columns

I have a users table containing id, name, and an information column of type jsonb.
User Table:

id   | name  | information
-----+-------+--------------------------------------------
1001 | Alice | {"1":"Google","2":"1991-02-08"}
1002 | Bob   | {"1":"StackOverflow","3":"www.google.com"}
I have another table, ProfileFields, containing all the profile fields:

profilefieldid | Value
---------------+-------------
1              | Company
2              | DateOfBirth
3              | ProfileLink
The information jsonb column can only contain keys that are present in the ProfileFields table.
You can assume the data comes from the real world and that the profile fields will change over time.
I would like to export this table in the following format:
id   | name  | Company       | DateOfBirth | ProfileLink
-----+-------+---------------+-------------+----------------
1001 | Alice | Google        | 1991-02-08  |
1002 | Bob   | StackOverflow |             | www.google.com
What I have tried:
I was able to map each profilefieldid to its respective value:
SELECT
    id,
    name,
    (SELECT STRING_AGG(CONCAT((SELECT "title" FROM "profile_fields" WHERE CAST("key" AS INTEGER) = "id"), ':', REPLACE("value", '"', '')), ',')
     FROM JSONB_EACH_TEXT("profile_fields")) AS "information"
FROM "users"
ORDER BY "id";
I tried to use json_to_record(), but since the profile fields can have dynamic keys I was not able to come up with a solution, because in the AS clause I need to specify the columns in advance.
I also sometimes get errors in the SELECT statement about the subquery returning more than one column.
Any suggestions and solutions are greatly appreciated.
Let me know if I need to improve my DB structure, e.g. if it is not in second normal form or otherwise not well designed. Thank you.
There is no way you can make this dynamic. A fundamental restriction of the SQL language is that the number, names and data types of all columns of a query must be known before the database starts retrieving data.
What you can do though is to create a stored procedure that generates a view with the needed columns:
create or replace procedure create_user_info_view()
as
$$
declare
    l_columns text;
begin
    select string_agg(concat('u.information ->> ', quote_literal(profilefield_id), ' as ', quote_ident(value)), ', ')
      into l_columns
      from profile_fields;

    execute 'drop view if exists users_view cascade';
    execute 'create view users_view as select u.id, u.name, '||l_columns||' from users u';
end;
$$
language plpgsql;
After the procedure is executed, you can run select * from users_view and see all profile keys as columns.
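For example:
call create_user_info_view();
select * from users_view;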
If you want, you can create a trigger on the table profile_fields that re-creates the view each time the table is changed.
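A minimal sketch of such a trigger (the function and trigger names here are made up for illustration; requires PostgreSQL 11+, which you already need for the procedure):
create or replace function refresh_user_info_view()
returns trigger as
$$
begin
    call create_user_info_view();  -- rebuild users_view after the change
    return null;                   -- return value is ignored for AFTER ... FOR EACH STATEMENT triggers
end;
$$
language plpgsql;

create trigger profile_fields_changed
after insert or update or delete on profile_fields
for each statement
execute function refresh_user_info_view();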

Trigger keeps updating value in every row instead of particular one

I am trying to calculate a sum for each particular order. I am using this trigger, but it doesn't work properly: it updates every row with the same value instead of only the one with the proper id.
done_services table: id, service_id, price
service table: id, name
payment table: id, sum, service_id
CREATE FUNCTION make_sum() RETURNS TRIGGER
AS $$
BEGIN
    UPDATE payment
    SET sum = (select sum(price) from done_services where service_id = new.service_id);
    RETURN NULL;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER make_sum
AFTER INSERT ON basket FOR EACH ROW EXECUTE FUNCTION make_sum();
I used this command to enter an item
insert into done_services(id, service_id, price) values(uuid_generate_v4(), '76594d2f-7153-495f-9671-0ddaa331568c', 100);
But the sum changed for both rows instead of only the one with the matching service_id.
The immediate cause of the problem is the missing WHERE clause, as Edouard pointed out. Plus, prevent expensive empty updates, like this:
UPDATE payment p
SET    sum = ds.sum_price
FROM  (
   SELECT sum(d.price) AS sum_price
   FROM   done_services d
   WHERE  d.service_id = NEW.service_id
   ) ds
WHERE  p.service_id = NEW.service_id
AND    p.sum IS DISTINCT FROM ds.sum_price;
In addition to fixing the prime error, this prevents empty updates that would not change the sum, but still write a new row version at full cost.
But the whole idea is questionable.
Keeping a sum from many rows up to date via trigger is expensive and error prone. Did you cover DELETE and INSERT accordingly? What about TRUNCATE? What about concurrent write access? Race conditions, deadlocks?
To get the current sum for a set that can change dynamically, the superior solution is typically not to store that sum in the table at all. Use a VIEW or MATERIALIZED VIEW instead.
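For example, a minimal sketch of such a view over the tables above (the view and column names are illustrative):
CREATE VIEW service_price_sums AS
SELECT service_id, sum(price) AS sum_price
FROM   done_services
GROUP  BY service_id;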
Or, to get the sum for a single or few payments, use a function:
CREATE OR REPLACE FUNCTION f_payment_sum(_service_id int)
RETURNS numeric
LANGUAGE sql STABLE PARALLEL SAFE AS
$func$
SELECT sum(d.price)
FROM done_services d
WHERE d.service_id = _service_id;
$func$
Related:
Updating a row based on a value from another table?
You're just missing something in your UPDATE statement:
UPDATE payment
SET sum = (select sum(price) from done_services where service_id = new.service_id)
WHERE service_id = new.service_id;
Next time, please create a dbfiddle with your data model, sample data and queries.

Oracle SQL statement to create temp table with values from stored procedure

I need to create a temp table by looping through a table of numbers and adding to the temp table the values that pass a stored procedure check.
If this was C# I would write the following:
List<string> tempString = new List<string>();
foreach (var order in OrderList.Orders)
{
    if (ShipOrder(order) == true)
        tempString.Add(order.OrderId);
}
Or maybe pseudo code would be a better explanation.
Loop through each order in a table of orders. For each order for which the stored procedure ShipOrder returns true, add it to a temp table of OrdersToShip. Later, use the OrdersToShip table to do an update.
Please note this is a simple version of what I am doing. The procedure to determine if something should be shipped is rather complicated.
Oracle SQL has no boolean data type, so your function ShipOrder will have to return something else, perhaps the numbers 1/0 or the strings 'Y'/'N' or 'TRUE'/'FALSE'. Apart from that, you simply want to select order IDs.
To insert rows into an existing table:
insert into orders_to_ship (orderid)
select orderid
from orders
where shiporder(orderid) = 'TRUE';
Or to create the not yet existing table on-the-fly:
create table orders_to_ship as
select orderid
from orders
where shiporder(orderid) = 'TRUE';
You see you don't need loops in SQL, because you are just describing the data sets you want. And usually you don't need temporary tables either :-)
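For the later update mentioned in the question, you can then reference that table in a subquery; a sketch, where the shipped_flag column is made up for illustration:
update orders
set    shipped_flag = 'Y'
where  orderid in (select orderid from orders_to_ship);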

INSERT or UPDATE on PostgreSQL views

I'm starting with PostgreSQL views, since they are useful for my use case and will provide better performance than functions.
(This isn't relevant, but I'm using Django 1.7 on Heroku Postgres, just in case.)
I have already created a view and can query it fine. I'd like to write a Django wrapper around the view so I can treat it like a table, and query and write to it accordingly. I've been reviewing the
Postgres docs on INSERT and UPDATE for views but to be honest I find their docs so hard to read I can barely parse through what they're saying.
Let's say I have the following view:
CREATE OR REPLACE VIEW links AS
SELECT
    listing.id AS listing_id,
    CONCAT('/i-', industry.slug, '-j-', listing.slug, '/') AS link,
    'https://www.example.com' || CONCAT(industry.slug, '-SEP-', listing.slug, '/') AS full_link,
    listing.slug AS listing_slug,
    industry.slug AS industry_slug
FROM listing
INNER JOIN company ON company.id = listing.company_id
INNER JOIN industry ON industry.id = company.industry_id;
Here, I'm using industry.slug and listing.slug to build links. I'd like to be able to update those two fields from this view, as such:
UPDATE links
SET listing_slug = 'my-new-slug'
WHERE listing_id = 5;
How do I create the rules to do this properly?
Because of the double join, it is better for you to use a trigger procedure. To update the industry table, you first need to find industry.id by following the foreign keys from listing to company to industry.
A procedure could look like so:
CREATE OR REPLACE FUNCTION update_listing_and_industry() RETURNS TRIGGER AS
$$
DECLARE
    _company_id  int;
    _industry_id int;
BEGIN
    _company_id  := (SELECT company_id FROM listing WHERE id = OLD.listing_id);
    _industry_id := (SELECT industry_id FROM company WHERE id = _company_id);
    UPDATE listing  SET slug = NEW.listing_slug  WHERE id = OLD.listing_id;
    UPDATE industry SET slug = NEW.industry_slug WHERE id = _industry_id;
    RETURN NEW;
END;
$$
LANGUAGE plpgsql;
NOTE: a trigger function is a normal function that returns TRIGGER. Depending on what the trigger does, it must return NEW or OLD (in this case NEW).
And the trigger with the INSTEAD OF UPDATE clause:
CREATE trigger update_view INSTEAD OF UPDATE ON links
FOR EACH ROW EXECUTE PROCEDURE update_listing_and_industry();
The only column in your view that can be updated is listing_slug.
Updating the other columns is impossible or pointless (e.g. it makes no sense to update industry_slug as there is no primary key for industry in the view).
In such a case you should use a conditional rule which precludes the ability to update the other columns.
As described in the documentation, there must be an unconditional INSTEAD rule for each action you wish to allow on the view. Therefore you should create a dummy INSTEAD rule for UPDATE and a conditional ALSO rule:
CREATE RULE update_links_default
AS ON UPDATE TO links DO INSTEAD NOTHING;
CREATE RULE update_links_listing
AS ON UPDATE TO links
WHERE NEW.listing_slug <> OLD.listing_slug
DO ALSO
UPDATE listing
SET slug = NEW.listing_slug
WHERE id = OLD.listing_id;
If you add the column industry.id as industry_id to the view, you can define an appropriate rule for the industry table:
CREATE RULE update_links_industry
AS ON UPDATE TO links
WHERE NEW.industry_slug <> OLD.industry_slug
DO ALSO
UPDATE industry
SET slug = NEW.industry_slug
WHERE id = OLD.industry_id;
Generally, you need one ALSO rule per table.

Is it possible to write a PostgreSQL query that exports different tables by using a loop?

I would like to export several tables, but each of them is created under a different condition and with a different file name. For example, it is possible to create a table under the condition that a certain id is equal to x and to export this table to a file called table_x.txt. This can be done with the following (simple) query:
COPY (
    SELECT *
    FROM table
    WHERE table.id = x
)
TO '/home/table_x.txt';
However, instead of doing this about 100 times, I would like to generate a loop which performs this query for different values of x. Is this possible? I don't need to save the different tables in my database, I only want to export them to different text files.
You can use something like:
DO $$
DECLARE
    r record;
BEGIN
    FOR r IN
        SELECT id FROM id_list
    LOOP
        -- COPY only accepts a literal file name, so build and run the statement dynamically
        EXECUTE format(
            'COPY (SELECT * FROM table WHERE table.id = %s) TO %L',
            r.id, '/home/table_' || r.id || '.txt');
    END LOOP;
END $$;
It is an anonymous PL/pgSQL block that loops through the ids returned by SELECT id FROM id_list and executes a dynamically built COPY for every id (COPY itself only accepts a literal file name, hence the EXECUTE).