Postgres: Dynamic creation of a temporary table - sql

I would like to create a report based on a few variables, one of them being months in a year. I would like to create a temporary table containing only the desired periods. Here's what I came up with:
The parts I'm having trouble with are placed between << and >>.
DO $$
DECLARE
    myPeriodFrom INTEGER := 7;
    myPeriodTo INTEGER := 9;
    myColumn VARCHAR(255) := 'Period';
BEGIN
    DROP TABLE IF EXISTS tmp_table;
    CREATE TABLE tmp_table (tmpId INT, tmpDesc VARCHAR(255));
    FOR i IN myPeriodFrom..myPeriodTo LOOP
        -- Create dynamic column name and fill it with data
        << code here >> -- myColumn := (myColumn || CAST(i AS VARCHAR));
        ALTER TABLE tmp_table ADD COLUMN << myColumn >> INT;
        INSERT INTO tmp_table (tmpId, << myColumn >>)
        SELECT "Id", "YearAdded" FROM "Item" WHERE "Item"."MonthAdded" = i;
    END LOOP;
END $$;
SELECT * FROM tmp_table;

With some regret I have to conclude that what I want is not workable. Plain statements inside PL/pgSQL cannot take variable identifiers, so there is no workable way to create columns dynamically short of building every statement as a string and executing it.
Second, the application I'm using the query in turns out not to support queries with a dynamic set of output columns. For these reasons I had to rethink my whole approach.
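For completeness: the dynamic DDL itself is possible in PL/pgSQL by building each statement as a string and running it with EXECUTE. A minimal sketch of that technique, reusing the table and column names from my attempt above (format()'s %I safely quotes identifiers; values are passed with USING):
DO $$
DECLARE
    myPeriodFrom INTEGER := 7;
    myPeriodTo INTEGER := 9;
    myColumn VARCHAR(255);
BEGIN
    DROP TABLE IF EXISTS tmp_table;
    CREATE TABLE tmp_table (tmpId INT, tmpDesc VARCHAR(255));
    FOR i IN myPeriodFrom..myPeriodTo LOOP
        myColumn := 'Period' || i;
        -- identifiers cannot be parameterized, so they are interpolated with %I
        EXECUTE format('ALTER TABLE tmp_table ADD COLUMN %I INT', myColumn);
        EXECUTE format(
            'INSERT INTO tmp_table (tmpId, %I)
             SELECT "Id", "YearAdded" FROM "Item" WHERE "Item"."MonthAdded" = $1',
            myColumn
        ) USING i;
    END LOOP;
END $$;
This still leaves the result set with a variable column list, which my application cannot consume, so it did not save my approach.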
For those who read it and gave a suggestion, thanks.

Related

Create Temp Table in Each Loop and Union After Loop Completion

Using BigQuery's standard SQL scripting functionality, I want to 1) create a temp table for each iteration of a loop, and 2) union those temp tables after the loop is complete. I've tried something like the following:
DECLARE i INT64 DEFAULT 1;
DECLARE ttable_name STRING;
WHILE i < 10 DO
SET ttable_name = CONCAT('temp_table_', CAST(i AS STRING));
CREATE OR REPLACE TEMP TABLE ttable_name AS
SELECT * FROM my_table AS mt WHERE mt.my_col = 1;
SET i = i + 1;
END WHILE;
SELECT * FROM temp_table_*; -- wildcard table to union all results
But I get the following error:
Exceeded rate limits: too many table update operations for this table.
How can I accomplish this task?
Your script does not work the way you think it does!
Instead of writing each iteration into a separate table named temp_table_N, you are actually writing into the very same temp table, literally named ttable_name - hence the Exceeded rate limits error.
BigQuery does not allow using variables for object names.
Don't create new tables. Add to an existing one with an INSERT INTO, or hold data in a variable (if it's not too much data), as in:
DECLARE steps INT64 DEFAULT 1;
-- start with an empty array and append one row per iteration
DECLARE table_holder ARRAY<STRUCT<steps INT64, x INT64, y ARRAY<INT64>>> DEFAULT [];
LOOP
  SET table_holder = ARRAY_CONCAT(
    table_holder,
    [STRUCT(steps AS steps, 1 AS x, [1,2,3] AS y)]
  );
  SET steps = steps + 1;
  IF steps = 30 THEN LEAVE; END IF;
END LOOP;
-- a single table operation at the end, instead of one per iteration
CREATE TABLE temp.results AS
SELECT *
FROM UNNEST(table_holder);
Related: https://stackoverflow.com/a/59314390/132438
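An aside on dynamic names: if a distinctly named table per iteration were truly required, BigQuery scripting can assemble a statement as a string and run it with EXECUTE IMMEDIATE. A sketch under assumptions (my_dataset is a placeholder, and temporary tables generally cannot be created through EXECUTE IMMEDIATE, hence the ordinary dataset table):
DECLARE i INT64 DEFAULT 1;
WHILE i < 10 DO
  -- the table name is baked into the statement string, since object names can't be variables
  EXECUTE IMMEDIATE CONCAT(
    'CREATE OR REPLACE TABLE my_dataset.temp_table_', CAST(i AS STRING),
    ' AS SELECT * FROM my_table WHERE my_col = 1');
  SET i = i + 1;
END WHILE;
This only addresses the naming; the variable-based approach above is what avoids the per-iteration table operations.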
Question asker/OP here. While I have selected @Felipe Hoffa's answer, as I believe it will be best for future readers of this question, I have actually gone a different route in solving my problem:
BEGIN
DECLARE i INT64 DEFAULT 1;
CREATE OR REPLACE TEMP TABLE ttable AS
SELECT
CAST(NULL AS INT64) AS col1 -- cast NULL as the type of the target col
,CAST(NULL AS FLOAT64) AS col2
,CAST(NULL AS DATE) AS col3
LIMIT 0; -- LIMIT 0 keeps the schema but drops the all-NULL seed row
WHILE i < 10 DO
-- overwrite `ttable` with its previous contents union'ed
-- with new data results from current loop iteration
CREATE OR REPLACE TEMP TABLE ttable AS
SELECT mt.col1, mt.col2, mt.col3 FROM my_table AS mt WHERE mt.other_col = i
UNION ALL
SELECT * FROM ttable;
SET i = i + 1;
END WHILE;
SELECT * FROM ttable; -- UNION'ed results
DROP TABLE IF EXISTS ttable;
END;
Why? I find it easier to stay in "table land" than to venture into "STRUCT/ARRAY land".

Store result of a query inside a function

I have the following function:
DO
$do$
DECLARE
maxgid integer;
tableloop integer;
obstacle geometry;
simplifyedobstacle geometry;
BEGIN
select max(gid) from public.terrain_obstacle_temp into maxgid;
FOR tableloop IN 1 .. maxgid
LOOP
insert into public.terrain_obstacle (tse_coll,tse_height,geom) select tse_coll,tse_height,geom
from public.terrain_obstacle_temp where gid = tableloop;
END LOOP;
END
$do$;
I need to modify this function in order to execute different queries according to the type of a column of public.terrain_obstacle_temp.
This is a temporary table created by reading a shapefile, and I need to know the type of the geom column of that table. I have a query that gives me that information:
SELECT type
FROM geometry_columns
WHERE f_table_schema = 'public'
AND f_table_name = 'terrain_obstacle'
and f_geometry_column = 'geom';
It returns a character varying value (in this case MULTIPOLYGON).
How can I modify the function to get the result of that query and write an if statement that executes different code depending on the result?
Is the intention to copy all the records from the temp table to the actual table? If so, you may be able to skip the loop:
insert into public.terrain_obstacle (tse_coll, tse_height, geom)
select tse_coll, tse_height, geom
from public.terrain_obstacle_temp
;
Do terrain_obstacle and terrain_obstacle_temp have the same structure? If so, then the "insert into ... select ..." should work fine provided the column types are the same.
If the behaviour needs to vary with the geometry type, one option is a CASE WHEN expression inside the insert:
v_type geometry_columns.type%TYPE;
...
SELECT type
INTO v_type
FROM geometry_columns
WHERE f_table_schema = 'public'
AND f_table_name = 'terrain_obstacle'
AND f_geometry_column = 'geom'
;
insert into public.terrain_obstacle (tse_coll, tse_height, geom)
select tse_coll
,tse_height
,CASE WHEN v_type = 'MULTIPOLYGON' THEN my_func1(geom)
WHEN v_type = 'POINT' THEN my_func2(geom)
ELSE my_default(geom)
END
from public.terrain_obstacle_temp
;
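Alternatively, if you want to run entirely different statements per type (the "if statement" asked about), here is a sketch of a DO block using IF/ELSIF; the per-type branches are illustrative:
DO
$do$
DECLARE
    v_type geometry_columns.type%TYPE;
BEGIN
    SELECT type
    INTO v_type
    FROM geometry_columns
    WHERE f_table_schema = 'public'
      AND f_table_name = 'terrain_obstacle'
      AND f_geometry_column = 'geom';

    IF v_type = 'MULTIPOLYGON' THEN
        INSERT INTO public.terrain_obstacle (tse_coll, tse_height, geom)
        SELECT tse_coll, tse_height, geom
        FROM public.terrain_obstacle_temp;
    ELSIF v_type = 'POINT' THEN
        -- different handling for point geometries goes here
        NULL;
    ELSE
        RAISE NOTICE 'Unhandled geometry type: %', v_type;
    END IF;
END
$do$;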

SQL Oracle: get all the columns of a cursor to use in an insert into

I want to create a procedure in which I use a cursor to select certain rows and then insert them into another table. I wonder whether there is a shorter notation for this.
For instance here is the complete procedure
create or replace procedure myProc as
Cursor lines is
select * from table1 where c = '2';
begin
for line in lines loop
insert into table2 values(line.a, line.b, line.c, line.d ....);
end loop;
end;
/
I want to know if I can replace the 'insert into' line with something like
insert into table2 values(line.something);
or
insert into tables2 values(something(line));
(I think a view could be more effective but it's not the question here.)
Absolutely:
create or replace procedure myProc as
begin
insert into table2( . . .)
select a, b, c, d, . .
from table1
where c = '2';
end;
/
You should list the columns in table2 as well. That is what the table2( . . . ) means.
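With the columns from the question, that would read something like this (assuming table2 has the same column names; the names are illustrative):
create or replace procedure myProc as
begin
    insert into table2 (a, b, c, d)
    select a, b, c, d
    from table1
    where c = '2';
end;
/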
Although the following code works when the structures of both tables are identical...
INSERT INTO target_table
SELECT * FROM source_table;
... you should avoid this way of programming, because any column added to the source or target table will make the SQL invalid.
Looks like you're looking for a short-cut in order to avoid qualifying column names in an insert statement.
Well, although I do not recommend this, it can be done, but it's a little tedious. Here is an example:
Say you have a table employee (employee_id, employee_name, designation). You'll need to create two types in the database:
a. An object type which mirrors employee:
create type employee_obj as object (employee_id number(28,0),
employee_name varchar2(100),
designation varchar2(30));
b. Create the nested table type of that object:
create type employee_obj_set is table of employee_obj;
c. In your PL/SQL procedure you can use something like:
DECLARE
empset employee_obj_set;
-- More variables here
BEGIN
-- other operations
-- populate your empset
-- other operations
INSERT INTO employee
SELECT * FROM table(empset);
-- other operations
END;
/
For faster inserts, use BULK COLLECT with a FORALL insert instead of row-by-row inserts. A sample:
DECLARE
  CURSOR lines IS SELECT * FROM table1 WHERE c = '2';
  TYPE lines_ty IS TABLE OF table2%ROWTYPE;
  l_lines_t2 lines_ty;
BEGIN
  OPEN lines;
  LOOP
    FETCH lines BULK COLLECT INTO l_lines_t2 LIMIT 500; -- fetch in batches of 500 rows
    FORALL i IN 1..l_lines_t2.COUNT
      INSERT INTO table2 VALUES l_lines_t2(i);
    EXIT WHEN lines%NOTFOUND; -- exits after the final, possibly partial, batch is processed
  END LOOP;
  CLOSE lines;
END;
/

value limitation in an IN clause Oracle

I work for a company that has a DW - ETL setup. I need to write a query that checks for over 2,500 values in a CASE WHEN ... IN list and over 1,000 values in a WHERE ... IN list. Basically it would look like the following:
SELECT
 user_id
,CASE WHEN user_id IN ('user_n', +2500 user_[n+1] ) THEN 1
 ELSE 0
 END AS user_group
,item_id
FROM user_table
WHERE item_id IN ('item_n', +1000 item_[n+1] );
As you probably already know, Oracle allows a maximum of 1,000 literal values in an IN list (ORA-01795), so I tried chaining OR'ed IN lists (as suggested in other Stack Overflow threads):
SELECT
 user_id
,CASE WHEN user_id IN ('user_n', +999 user_[n+1] )
OR user_id IN ('user_n', +999 user_[n+1] )
OR user_id IN ('user_n', +999 user_[n+1] ) THEN 1
ELSE 0 END AS user_group
,item_id
FROM user_table
WHERE item_id IN ('item_n', +999 item_[n+1] )
OR item_id IN ('item_n', +999 item_[n+1] );
NOTE: I know the math is off in the examples above, but you get the point.
The problem is that queries have a maximum execution time of 120 minutes, after which the job is automatically killed. So I googled for solutions, and it seems temporary tables could be what I'm looking for, but honestly none of the examples I found is clear on how to load the values I want into the table and how to use that table in my original query. Not even the Oracle documentation was much help.
Another potential problem is that I have limited rights and I've seen other people mention that in their companies they don't have the rights to create temporary tables.
Some of the info I found in my research:
ORACLE documentation
StackOverflow thread
StackOverflow thread 2
Another solution I found was using tuples instead, as mentioned in THIS thread; I haven't tried it because, as another user mentions, performance seems to be greatly affected.
Any guidance on how to use a Temporary Table or if anyone has another way of dealing with this limitation would be greatly appreciated.
Create a global temporary table, which keeps the data private to your session and generates far less redo than a permanent table:
CREATE GLOBAL TEMPORARY TABLE <table_name> (
<column_name> <column_data_type>,
<column_name> <column_data_type>,
<column_name> <column_data_type>)
ON COMMIT DELETE ROWS;
then, depending on how the user list arrives, import the data into a holding table and run:
SELECT 'INSERT INTO global_temporary_table (<column>) VALUES ('
       || holding_table.column
       || ');'
FROM holding_table;
This outputs one INSERT statement per row, which you then run to load the data.
then
SELECT <some_column>
FROM <some_table>
WHERE <some_value> IN
(SELECT <some_column> FROM <global_temporary_table>);
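If the holding table is already populated, a single set-based insert avoids generating statements altogether (same placeholder names as above):
INSERT INTO <global_temporary_table> (<column>)
SELECT <column>
FROM holding_table;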
Use a collection:
CREATE TYPE Ints_Table AS TABLE OF INT;
CREATE TYPE IDs_Table AS TABLE OF CHAR(5);
Something like this:
SELECT user_id,
CASE WHEN user_id MEMBER OF Ints_Table( 1, 2, 3, /* ... */ 2500 )
THEN 1
ELSE 0
END
,item_id
FROM user_table
WHERE item_id MEMBER OF IDs_table( 'ABSC2', 'DITO9', 'KMKM9', /* ... */ 'QD3R5' );
Or you can use PL/SQL to populate a collection:
VARIABLE cur REFCURSOR;
DECLARE
  t_users Ints_Table := Ints_Table(); -- collections must be initialised before EXTEND
  t_items IDs_Table  := IDs_Table();
  f       UTL_FILE.FILE_TYPE;
  line    VARCHAR2(4000);
BEGIN
  t_users.EXTEND( 2500 );
  FOR i IN 1 .. 2500 LOOP
    t_users( i ) := i;
  END LOOP;
  -- load data from a file
  f := UTL_FILE.FOPEN('DIRECTORY_HANDLE','datafile.txt','R');
  IF UTL_FILE.IS_OPEN(f) THEN
    BEGIN
      LOOP
        UTL_FILE.GET_LINE(f,line); -- raises NO_DATA_FOUND at end of file
        t_items.EXTEND;
        t_items( t_items.COUNT ) := line;
      END LOOP;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN NULL;
    END;
    UTL_FILE.FCLOSE(f);
  END IF;
  OPEN :cur FOR
    SELECT user_id,
           CASE WHEN user_id MEMBER OF t_users
                THEN 1
                ELSE 0
           END
          ,item_id
      FROM user_table
     WHERE item_id MEMBER OF t_items;
END;
/
PRINT cur;
Or if you are using another language to call the query then you could pass the collections as a bind value (as shown here).
In PL/SQL you could use a collection type. You could create your own like this:
create type string_table is table of varchar2(100);
Or use an existing type such as SYS.DBMS_DEBUG_VC2COLL which is a table of VARCHAR2(1000).
Now you can declare a collection of this type for each of your lists, populate it, and use it in the query - something like this:
declare
strings1 SYS.DBMS_DEBUG_VC2COLL := SYS.DBMS_DEBUG_VC2COLL();
strings2 SYS.DBMS_DEBUG_VC2COLL := SYS.DBMS_DEBUG_VC2COLL();
procedure add_string1 (p_string varchar2) is
begin
strings1.extend();
strings1(strings1.count) := p_string;
end;
procedure add_string2 (p_string varchar2) is
begin
strings2.extend();
strings2(strings2.count) := p_string;
end;
begin
add_string1('1');
add_string1('2');
add_string1('3');
-- and so on...
add_string1('2500');
add_string2('1');
add_string2('2');
add_string2('3');
-- and so on...
add_string2('1400');
for r in (
select user_id
     , case when user_id in (select column_value from table(strings2)) then 1 else 0 end as indicator
     , item_id
from user_table
where item_id in (select column_value from table(strings1))
)
loop
dbms_output.put_Line(r.user_id||' '||r.indicator);
end loop;
end;
/
You can use the example below to understand global temporary tables and the two types of GTT (ON COMMIT PRESERVE ROWS versus ON COMMIT DELETE ROWS).
CREATE GLOBAL TEMPORARY TABLE GTT_PRESERVE_ROWS (ID NUMBER) ON COMMIT PRESERVE ROWS;
INSERT INTO GTT_PRESERVE_ROWS VALUES (1);
COMMIT;
SELECT * FROM GTT_PRESERVE_ROWS;
DELETE FROM GTT_PRESERVE_ROWS;
COMMIT;
TRUNCATE TABLE GTT_PRESERVE_ROWS;
DROP TABLE GTT_PRESERVE_ROWS; -- won't work if you did not truncate the table first, or if the table is in use by another session
CREATE GLOBAL TEMPORARY TABLE GTT_DELETE_ROWS (ID NUMBER) ON COMMIT DELETE ROWS;
INSERT INTO GTT_DELETE_ROWS VALUES (1);
SELECT * FROM GTT_DELETE_ROWS;
COMMIT;
SELECT * FROM GTT_DELETE_ROWS;
DROP TABLE GTT_DELETE_ROWS;
However, as you mentioned, you receive the input in an Excel file, so you can simply create a table and load the data into it. Once the data is loaded you can use it in the IN clause of your query:
select * from employee where empid in (select empid from temptable);
create temporary table userids (userid int);
insert into userids(...)
then use a join or an IN subquery:
select ...
where user_id in (select userid from userids);
drop temporary table userids;
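Note the syntax above is generic (MySQL-style); in Oracle the equivalent is a global temporary table, created once up front. A sketch with illustrative names:
CREATE GLOBAL TEMPORARY TABLE userids (userid INT) ON COMMIT PRESERVE ROWS;

INSERT INTO userids (userid) VALUES (1); -- repeat or bulk-load the 2500 values

SELECT user_id
FROM user_table
WHERE user_id IN (SELECT userid FROM userids);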

In SAP HANA how can I generate a range of numbers, eg from 1 to 10?

In SAP HANA I wish to have a view which has a range of numbers from 1 to 10, or 1 to n where n is any number. So when I select from the view I can select n records to get the first n numbers of the range.
I was able to create a table with 1000 rows with an incrementing ID using this stored procedure. Is there an easier way?
DROP PROCEDURE "DEMO_PROC";
CREATE PROCEDURE "DEMO_PROC"(
IN ID INTEGER )
LANGUAGE SQLSCRIPT AS
/*********BEGIN PROCEDURE SCRIPT ************/
BEGIN
DECLARE
START_ID INTEGER;
DROP TABLE TEST_TABLE;
CREATE COLUMN TABLE "TEST_TABLE" (ID INTEGER, NAME VARCHAR(10));
START_ID := 0;
WHILE START_ID < 1000 DO
START_ID := START_ID + 1;
INSERT INTO "TEST_TABLE" VALUES(:START_ID, '');
END WHILE;
END;
CALL "DEMO_PROC"(1);
SELECT * FROM "TEST_TABLE";
Using a generator is the preferred way (note that the upper bound of SERIES_GENERATE_INTEGER is exclusive, so 1001 yields 1 through 1000):
INSERT INTO "TEST_TABLE" SELECT GENERATED_PERIOD_START as ID, '' as NAME from SERIES_GENERATE_INTEGER(1,1,1001);
is much easier and faster.
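And since the goal is a view, you can select from the generator directly without materializing a table at all (a sketch; the view name is illustrative):
CREATE VIEW "NUMBERS_1_TO_1000" AS
SELECT GENERATED_PERIOD_START AS ID
FROM SERIES_GENERATE_INTEGER(1, 1, 1001);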
I think a FOR loop is easier than WHILE:
FOR START_ID IN 1..1000 DO
INSERT INTO "TEST_TABLE" VALUES(START_ID,'');
END FOR;