I have two tables, let's call them Table1 and Table2.
I have a variable named: role
Now I need to do a select statement from Table1 (which can return several rows):
SELECT roles INTO role FROM TABLE1
And then I need to insert the role in Table2
INSERT INTO Table2(RELATION_ID, ROLES) VALUES (maxID+1, role);
Now my problem is that the first select statement returns more than one row, so I cannot put the value into the role variable. I need something like an ArrayList or DataSet to hold the values, since I need to loop over them and insert them into another table. How do I use a list and iterate over it in Oracle SQL?
Below is my code, simplified a bit so that only my issue is in focus. Of course the code will not compile; I am just trying to make clear what I need. Thanks.
CREATE OR REPLACE PROCEDURE "myProcedure"(eventID IN NUMBER)
IS
roleNumber NUMBER;
maxID NUMBER;
BEGIN
SELECT roles FROM TABLE1;
SELECT MAX(RELATION_ID) INTO maxID FROM SOMEOTHERTABLE;
IF maxID IS NULL
THEN
maxID := 0;
END IF;
FOR counter IN theListThatINeed LOOP
INSERT INTO Table2(RELATION_ID, ROLES) VALUES (maxID+1, roleAtCounter);
END LOOP;
END "myProcedure";
In this situation I would use just a single SQL statement. There is no need to context-switch back and forth between the SQL engine and another language, and definitely no need to process something like this one row at a time!
INSERT INTO Table2(RELATION_ID, ROLES)
SELECT (SELECT NVL(MAX(RELATION_ID),0) FROM SOMEOTHERTABLE)+rownum, roles
FROM table1;
Use a collection and get your job done. See below:
CREATE OR REPLACE PROCEDURE myProcedure (eventID IN NUMBER)
IS
ID_max NUMBER;
CURSOR cur
IS
SELECT roles FROM TABLE1;
TYPE x IS TABLE OF cur%ROWTYPE
INDEX BY PLS_INTEGER;
var x;
BEGIN
OPEN CUR;
FETCH cur BULK COLLECT INTO var;
CLOSE cur;
SELECT MAX (RELATION_ID) INTO ID_max FROM SOMEOTHERTABLE;
IF ID_max IS NULL
THEN
ID_max := 0;
END IF;
FOR i IN 1 .. var.COUNT
LOOP
INSERT INTO Table2 (RELATION_ID, ROLES)
VALUES ( (ID_max + i), var(i).roles);
END LOOP;
COMMIT;
END myProcedure;
I know this is not exactly what you're asking about, but a solution for your problem is:
create sequence seqid start with 1 increment by 1;
INSERT INTO Table2(RELATION_ID, ROLES)
select seqid.nextval, roles from table1;
If some other process can also insert into table2 and you really need max(relation_id), you can go with (note that the WITH clause must sit inside the subquery, after INSERT INTO):
INSERT INTO Table2(RELATION_ID, ROLES)
with maxid as (select max(relation_id) mid from table2)
select mid+rownum, roles from table1, maxid;
Related
I wrote a stored procedure (SP) containing 2 separate inner procedures, one for each insertion from a table. Both the temp and main tables contain more than 25 columns each. Below is the query:
create or replace procedure sp_main as
procedure tbl1_ld as
cursor c1 is select * from tmp1;
type t_rec1 is table of c1%rowtype;
v_rec1 t_rec1;
begin
open c1;
loop
fetch c1 bulk collect into v_rec1 limit 1000;
exit when v_rec1.count=0;
insert into tbl1 values v_rec1;
end loop;
close c1;
end tbl1_ld;
procedure tbl2_ld as
cursor c2 is select * from tmp2;
type t_rec2 is table of c2%rowtype;
v_rec2 t_rec2;
begin
open c2;
loop
fetch c2 bulk collect into v_rec2 limit 1000;
exit when v_rec2.count=0;
insert into tbl2 values v_rec2;
end loop;
close c2;
end tbl2_ld;
begin
null;
end sp_main;
/
I also tried EXECUTE IMMEDIATE 'insert into tbl1 select * from tmp1'; for the insertion inside both procedures tbl1_ld and tbl2_ld instead of using a cursor; the SP compiled, but no records were inserted.
Well, you didn't actually run any of these procedures. The last few lines of your code should be
<snip>
end tbl2_ld;
begin
tbl1_ld; --> this
tbl2_ld; --> this
end sp_main;
/
On the other hand, I prefer avoiding insert into ... select * from because it just loves to fail when you modify a table's structure and don't fix the code that uses it.
Yes, I know - it is just boring to name all 25 columns, but - in my opinion - it's worth it. Therefore, I'd just
begin
insert into tbl1 (id, name, address, phone, ... all 25 columns)
select id, name, address, phone, ... all 25 columns
from tmp1;
insert into tbl2 (id, name, address, phone, ... all 25 columns)
select id, name, address, phone, ... all 25 columns
from tmp2;
end;
In other words, no cursors, types, loops, ... nothing. It could have been pure SQL (i.e. no PL/SQL). If you want to restrict the number of rows inserted, use e.g. ... where rownum <= 1000 (if that's why you used the limit clause).
As for the dynamic SQL you mentioned (execute immediate): why would you use it? There's nothing dynamic in the code you wrote.
I have a trigger whose purpose is to fire whenever there is a DELETE on a particular table and insert the deleted data into another table in json format.
The trigger works fine if I am specifying each column explicitly. Is there any way to access the entire table row?
This is my code.
CREATE OR REPLACE TRIGGER TRIGGER1
AFTER DELETE
ON QUESTION
FOR EACH ROW
DECLARE
json_doc CLOB;
BEGIN
select json_arrayagg (
json_object ('code' VALUE :old.id,
'name' VALUE :old.text,
'description' VALUE :old.text) returning clob
) into json_doc
from dual;
PROCEDURE1(json_doc);
END;
This works fine. However, what I want is something like the following: instead of explicitly specifying each column, I want to convert the entire :OLD record.
CREATE OR REPLACE TRIGGER TRIGGER1
AFTER DELETE
ON QUESTION
FOR EACH ROW
DECLARE
json_doc CLOB;
BEGIN
select json_arrayagg (
json_object (:old) returning clob
) into json_doc
from dual;
PROCEDURE1(json_doc);
END;
Any suggestions, please?
The short and correct answer is you can't. We have a few tables in our application where we do this and the developer is responsible for updating the trigger when they add a column: this is enforced with code reviews and is probably the cleanest solution for this scenario.
The long answer is you can get close, but I wouldn't do this in production for several reasons:
Triggers are terrible for performance
Triggers are terrible for code clarity
This requires reading the row again using a flashback query, so:
You aren't getting the values of this row from inside your current transaction: if you update the row in your transaction and then delete it the JSON will show what the values were BEFORE your update
There is a performance penalty for reading from UNDO
There is potential that UNDO won't be available and your trigger will fail
Your user needs permission to execute flashback queries
Your database needs to meet all the prerequisites to support flashback queries
Deleting a lot of rows will cause the ROWID collection to get large and consume PGA
There are probably more reasons, but in the interest of "can it be done" here you go...
DROP TABLE t1;
DROP TABLE t2;
DROP TRIGGER t1_ad;
CREATE TABLE t1 (
id NUMBER,
name VARCHAR2(100),
description VARCHAR2(100)
);
CREATE TABLE t2 (
dt TIMESTAMP(9),
json_data CLOB
);
INSERT INTO t1 VALUES (1, 'A','aaaa');
INSERT INTO t1 VALUES (2, 'B','bbbb');
INSERT INTO t1 VALUES (3, 'C','cccc');
INSERT INTO t1 VALUES (4, 'D','dddd');
CREATE OR REPLACE TRIGGER t1_ad
FOR DELETE ON t1
COMPOUND TRIGGER
TYPE t_rowid_tab IS TABLE OF ROWID;
v_rowid_tab t_rowid_tab := t_rowid_tab();
AFTER EACH ROW IS
BEGIN
v_rowid_tab.extend;
v_rowid_tab(v_rowid_tab.last) := :old.rowid;
END AFTER EACH ROW;
AFTER STATEMENT IS
v_scn v$database.current_scn%TYPE := dbms_flashback.get_system_change_number;
v_json_data CLOB;
v_sql CLOB;
BEGIN
-- build the dynamic SQL once, before the loop; it does not depend on the loop variable
SELECT 'SELECT json_arrayagg(json_object(' ||
listagg('''' || lower(t.column_name) || ''' VALUE ' ||
lower(t.column_name),
', ') within GROUP(ORDER BY t.column_id) || ') RETURNING CLOB) FROM t1 AS OF SCN :scn WHERE rowid = :r'
INTO v_sql
FROM user_tab_columns t
WHERE t.table_name = 'T1';
FOR i IN 1 .. v_rowid_tab.count
LOOP
EXECUTE IMMEDIATE v_sql
INTO v_json_data
USING v_scn, v_rowid_tab(i);
INSERT INTO t2
VALUES
(current_timestamp,
v_json_data);
END LOOP;
END AFTER STATEMENT;
END t1_ad;
/
UPDATE t1
SET NAME = 'zzzz' -- not captured
WHERE id = 2;
DELETE FROM t1 WHERE id < 3;
SELECT *
FROM t2;
-- 13-NOV-20 01.08.15.955426000 PM [{"id":1,"name":"A","description":"aaaa"}]
-- 13-NOV-20 01.08.15.969755000 PM [{"id":2,"name":"B","description":"bbbb"}]
I work for a company that has a DW - ETL setup. I need to write a query that checks more than 2,500 values in a CASE WHEN ... IN clause and more than 1,000 values in a WHERE ... IN clause. Basically it would look like the following:
SELECT user_id
,CASE WHEN user_id IN ('user_n', +2500 user_[n+1] ) THEN 1
ELSE 0
END AS user_group
,item_id
FROM user_table
WHERE item_id IN ('item_n', +1000 item_[n+1] );
As you probably already know, Oracle allows a maximum of 1000 literals in an IN clause (ORA-01795), so I tried adding OR'd IN clauses (as suggested in other StackOverflow threads):
SELECT user_id
,CASE WHEN user_id IN ('user_n', +999 user_[n+1] )
OR user_id IN ('user_n', +999 user_[n+1] )
OR user_id IN ('user_n', +999 user_[n+1] ) THEN 1
ELSE 0 END AS user_group
,item_id
FROM user_table
WHERE item_id IN ('item_n', +999 item_[n+1] )
OR item_id IN ('item_n', +999 item_[n+1] );
NOTE: I know the math is erroneous in the examples above, but you get the point.
The problem is that queries have a maximum execution time of 120 minutes, after which the job is automatically killed. So I googled for solutions, and it seems temporary tables could be what I'm looking for, but in all honesty none of the examples I found is clear enough on how to get the values I want into the table, or on how to use that table in my original query. Not even the Oracle documentation was much help.
Another potential problem is that I have limited rights and I've seen other people mention that in their companies they don't have the rights to create temporary tables.
Some of the info I found in my research:
ORACLE documentation
StackOverflow thread
StackOverflow thread 2
Another solution I found was using tuples instead, as mentioned in THIS thread, which I haven't tried because, as another user mentions, performance seems greatly affected.
Any guidance on how to use a Temporary Table or if anyone has another way of dealing with this limitation would be greatly appreciated.
Create a global temporary table; its contents are private to your session and it generates far less redo than a permanent table:
CREATE GLOBAL TEMPORARY TABLE <table_name> (
<column_name> <column_data_type>,
<column_name> <column_data_type>,
<column_name> <column_data_type>)
ON COMMIT DELETE ROWS;
then depending on how the user list arrives import the data into a holding table and then run
select 'INSERT INTO global_temporary_table (<column>) values ('''
|| holding_table.column
|| ''');'
FROM holding_table;
This gives you INSERT statements as output, which you then run to load the data.
then
SELECT <some_column>
FROM <some_table>
WHERE <some_value> IN
(SELECT <some_column> FROM <global_temporary_table>);
Use a collection:
CREATE TYPE Ints_Table AS TABLE OF INT;
CREATE TYPE IDs_Table AS TABLE OF CHAR(5);
Something like this:
SELECT user_id,
CASE WHEN user_id MEMBER OF Ints_Table( 1, 2, 3, /* ... */ 2500 )
THEN 1
ELSE 0
END
,item_id
FROM user_table
WHERE item_id MEMBER OF IDs_table( 'ABSC2', 'DITO9', 'KMKM9', /* ... */ 'QD3R5' );
Or you can use PL/SQL to populate a collection:
VARIABLE cur REFCURSOR;
DECLARE
t_users Ints_Table := Ints_Table(); -- collections must be initialized before EXTEND
t_items IDs_Table := IDs_Table();
f UTL_FILE.FILE_TYPE;
line VARCHAR2(4000);
BEGIN
t_users.EXTEND( 2500 );
FOR i IN 1 .. 2500 LOOP
t_users( i ) := i;
END LOOP;
-- load data from a file; GET_LINE raises NO_DATA_FOUND at end-of-file
f := UTL_FILE.FOPEN('DIRECTORY_HANDLE','datafile.txt','R');
BEGIN
LOOP
UTL_FILE.GET_LINE(f,line);
t_items.EXTEND;
t_items( t_items.COUNT ) := line;
END LOOP;
EXCEPTION
WHEN NO_DATA_FOUND THEN
UTL_FILE.FCLOSE(f);
END;
OPEN :cur FOR
SELECT user_id,
CASE WHEN user_id MEMBER OF t_users
THEN 1
ELSE 0
END
,item_id
FROM user_table
WHERE item_id MEMBER OF t_items;
END;
/
PRINT cur;
Or if you are using another language to call the query then you could pass the collections as a bind value (as shown here).
In PL/SQL you could use a collection type. You could create your own like this:
create type string_table is table of varchar2(100);
Or use an existing type such as SYS.DBMS_DEBUG_VC2COLL which is a table of VARCHAR2(1000).
Now you can declare a collection of this type for each of your lists, populate it, and use it in the query - something like this:
declare
strings1 SYS.DBMS_DEBUG_VC2COLL := SYS.DBMS_DEBUG_VC2COLL();
strings2 SYS.DBMS_DEBUG_VC2COLL := SYS.DBMS_DEBUG_VC2COLL();
procedure add_string1 (p_string varchar2) is
begin
strings1.extend();
strings1(strings1.count) := p_string;
end;
procedure add_string2 (p_string varchar2) is
begin
strings2.extend();
strings2(strings2.count) := p_string;
end;
begin
add_string1('1');
add_string1('2');
add_string1('3');
-- and so on...
add_string1('2500');
add_string2('1');
add_string2('2');
add_string2('3');
-- and so on...
add_string2('1400');
for r in (
select user_id
, case when user_id in (select column_value from table(strings2)) then 1 else 0 end as indicator
, item_id
from user_table
where item_id in (select column_value from table(strings1))
)
loop
dbms_output.put_Line(r.user_id||' '||r.indicator);
end loop;
end;
/
You can use the examples below to understand global temporary tables and the two types of GTT.
CREATE GLOBAL TEMPORARY TABLE GTT_PRESERVE_ROWS (ID NUMBER) ON COMMIT PRESERVE ROWS;
INSERT INTO GTT_PRESERVE_ROWS VALUES (1);
COMMIT;
SELECT * FROM GTT_PRESERVE_ROWS;
DELETE FROM GTT_PRESERVE_ROWS;
COMMIT;
TRUNCATE TABLE GTT_PRESERVE_ROWS;
DROP TABLE GTT_PRESERVE_ROWS; -- WON'T WORK IF YOU DIDN'T TRUNCATE THE TABLE FIRST OR IF THE TABLE IS IN USE IN ANOTHER SESSION
CREATE GLOBAL TEMPORARY TABLE GTT_DELETE_ROWS (ID NUMBER) ON COMMIT DELETE ROWS;
INSERT INTO GTT_DELETE_ROWS VALUES (1);
SELECT * FROM GTT_DELETE_ROWS;
COMMIT;
SELECT * FROM GTT_DELETE_ROWS;
DROP TABLE GTT_DELETE_ROWS;
However, as you mentioned, you receive the input in an Excel file, so you can simply create a table and load the data into it. Once the data is loaded, you can use it in the IN clause of your query.
select * from employee where empid in (select empid from temptable);
create global temporary table userids (userid int) on commit preserve rows;
insert into userids(...)
then use a join or an IN subquery:
select ...
where user_id in (select userid from userids);
truncate table userids;
drop table userids;
Here is my first procedure (sample)
CREATE OR REPLACE PROCEDURE GPTOWNER_CORP_AMF.testt1
AS
po_status VARCHAR2(100);
po_cur_1 SYS_REFCURSOR;
po_cur_2 SYS_REFCURSOR;
BEGIN
OPEN po_cur_1 FOR
select app_var_row_seq,app_var_name,app_var_value,app_var_description,r_date
from TMP_PMT_APP_VARIABLES_REF
where ROWNUM < 5;
OPEN po_cur_2 FOR
select config_to_lob_row_seq,config_row_seq,lobref_row_seq,r_date
from TMP_PMT_CONFIG_TO_LOB_DAT
where ROWNUM < 6;
TESTT2(po_cur_1,po_cur_2,po_status);
DBMS_output.put_line(po_status);
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE(SQLERRM||SQLCODE);
END;
Here is my second procedure (sample)
CREATE OR REPLACE procedure GPTOWNER_CORP_AMF.testt2 (pi_cur_1 IN sys_refcursor, pi_cur_2 IN sys_refcursor,po_status OUT VARCHAR2)
AS
app_var_row_seq NUMBER;
app_var_name VARCHAR2(100);
app_var_value VARCHAR2(1000);
app_var_description VARCHAR2(1000);
r_date1 DATE;
config_to_lob_row_seq NUMBER;
config_row_seq VARCHAR2(100);
lobref_row_seq NUMBER;
r_date2 DATE;
BEGIN
LOOP
FETCH pi_cur_1 into app_var_row_seq,app_var_name,app_var_value,app_var_description,r_date1;
FETCH pi_cur_2 into config_to_lob_row_seq,config_row_seq,lobref_row_seq,r_date2;
EXIT WHEN (pi_cur_2%NOTFOUND AND pi_cur_1%NOTFOUND ) ;
INSERT INTO testt1testt2 (colid,col1,col2,col3,col4,col5,col6,col7,col8,col9)
VALUES(colid.nextval,app_var_row_seq,app_var_name,app_var_value,app_var_description,r_date1,config_to_lob_row_seq,config_row_seq,lobref_row_seq,r_date2);
END LOOP;
DBMS_OUTPUT.PUT_LINE ('rows inserted: ' || pi_cur_1%ROWCOUNT || ' and ' || pi_cur_2%ROWCOUNT);
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE(SQLERRM||SQLCODE);
END;
My problem statement: the first procedure returns two ref cursors as output, and the second procedure reads them and puts the rows into a temp table that another procedure will use. I can't UNION the two select statements, as they return different sets of columns. With my approach, when I run the first procedure (say the first select returns 4 rows and the second returns 6), the requirement is that 6 rows be inserted into the temp table, with the columns read from the first select inserted as NULL once that cursor has no more rows to fetch; but in my case a duplicate row gets inserted instead. Is there a better mechanism for this? Any help would be appreciated, and do ask if you need more info.
If I understand you right, then you don't really need to union them - but join them.
Since there is no real relation between the 2 tables and you want nulls on both sides, you need to FULL OUTER JOIN them.
I will not ask why you want them both in the same temp table if there is no relation between them. But if you do this, why not just use an insert-select?
INSERT INTO testt1testt2 (colid,col1,col2,col3,col4,col5,col6,col7,col8,col9)
SELECT colid.nextval, app_var_row_seq,app_var_name,app_var_value,app_var_description, t1.r_date,
config_to_lob_row_seq,config_row_seq,lobref_row_seq, t2.r_date
FROM (select app_var_row_seq,app_var_name,app_var_value,app_var_description,r_date
from TMP_PMT_APP_VARIABLES_REF
where ROWNUM < 5) t1
FULL OUTER JOIN (select config_to_lob_row_seq,config_row_seq,lobref_row_seq,r_date
from TMP_PMT_CONFIG_TO_LOB_DAT
where ROWNUM < 6) t2 on 1=2
UPDATE:
If the requirement is to get 2 refcursors, then my approach isn't relevant...
What you can do though, is have 2 insert commands one like this:
INSERT INTO testt1testt2 (colid,col1,col2,col3,col4,col5,col6,col7,col8,col9)
VALUES (colid.nextval,app_var_row_seq,app_var_name,app_var_value,app_var_description,r_date1,null,null,null,null);
and the other like:
INSERT INTO testt1testt2 (colid,col1,col2,col3,col4,col5,col6,col7,col8,col9)
VALUES (colid.nextval,null,null,null,null,null,config_to_lob_row_seq,config_row_seq,lobref_row_seq,r_date2);
If you really want to do it nicely, you can use a bulk insert for performance - see the example here, and the sketch below.
I can't figure out the correct syntax for the following pseudo-sql:
INSERT INTO some_table
(column1,
column2)
SELECT col1_value,
col2_value
FROM other_table
WHERE ...
RETURNING id
INTO local_var;
I would like to insert something with the values of a subquery.
After inserting I need the new generated id.
Here's what the Oracle docs say:
Insert Statement
Returning Into
OK, I think it is not possible - RETURNING seems to work only with the VALUES clause...
Is there an alternative?
You cannot use the RETURNING BULK COLLECT from an INSERT.
This methodology can work with updates and deletes, however:
create table test2(aa number)
/
insert into test2(aa)
select level
from dual
connect by level<100
/
set serveroutput on
declare
TYPE t_Numbers IS TABLE OF test2.aa%TYPE
INDEX BY BINARY_INTEGER;
v_Numbers t_Numbers;
begin
update test2
set aa = aa+1
returning aa bulk collect into v_Numbers;
for v_count in 1..v_Numbers.count loop
dbms_output.put_line('v_Numbers := ' || v_Numbers(v_count));
end loop;
end;
/
You can get it to work with a few extra steps (doing a FORALL INSERT utilizing TREAT)
as described in this article:
returning with insert..select
To adapt the example they create to the test2 test table:
CREATE or replace TYPE ot AS OBJECT
( aa number);
/
CREATE TYPE ntt AS TABLE OF ot;
/
set serveroutput on
DECLARE
nt_passed_in ntt;
nt_to_return ntt;
FUNCTION pretend_parameter RETURN ntt IS
nt ntt;
BEGIN
SELECT ot(level) BULK COLLECT INTO nt
FROM dual
CONNECT BY level <= 5;
RETURN nt;
END pretend_parameter;
BEGIN
nt_passed_in := pretend_parameter();
FORALL i IN 1 .. nt_passed_in.COUNT
INSERT INTO test2(aa)
VALUES
( TREAT(nt_passed_in(i) AS ot).aa
)
RETURNING ot(aa)
BULK COLLECT INTO nt_to_return;
FOR i IN 1 .. nt_to_return.COUNT LOOP
DBMS_OUTPUT.PUT_LINE(
'Sequence value = [' || TO_CHAR(nt_to_return(i).aa) || ']'
);
END LOOP;
END;
/
Unfortunately that's not possible. RETURNING is only available for INSERT...VALUES statements. See this Oracle forum thread for a discussion of this subject.
You can't, BUT at least in Oracle 19c, you can specify a SELECT subquery inside the VALUES clause and so use RETURNING! This can be a good workaround, even if you may have to repeat the WHERE clause for every field:
INSERT INTO some_table
(column1,
column2)
VALUES((SELECT col1_value FROM other_table WHERE ...),
(SELECT col2_value FROM other_table WHERE ...))
RETURNING id
INTO local_var;
Because the insert is based on a select, Oracle assumes you are permitting a multiple-row insert with that syntax. In that case, look at the multiple-row version of the RETURNING clause documentation; it demonstrates that you need to use BULK COLLECT to retrieve the values from all inserted rows into a collection of results.
After all, if your insert query creates two rows, which returned value would it put into a single variable?
EDIT - Turns out this doesn't work as I had thought.... darn it!
This isn't as easy as you may think, and certainly not as easy as it is in MySQL. Oracle doesn't keep track of the last inserts in a way that lets you ping back the result.
You will need to work out some other way of doing this, you can do it using ROWID - but this has its pitfalls.
This link discussed the issue: http://forums.oracle.com/forums/thread.jspa?threadID=352627