I have two very large tables and I need to process a small result set drawn from them. However, the processing is done in several functions, and each function must do some joining in order to format the data properly.
I definitely need to cache the initial result sets somehow so they can be reused by the functions. What I would like to do is put the first result set in one collection, the second result set in another collection, and then manipulate these collections through SQL queries as if they were real SQL tables.
Can you suggest how this can be done?
Sounds like a job for temp tables:
CREATE GLOBAL TEMPORARY TABLE table_name (...) ON ...
The ON COMMIT clause has two options, with different impacts:
ON COMMIT DELETE ROWS makes the temporary table transaction-specific: data persists in the table only until the end of the transaction. When the transaction ends, the database truncates the table (deletes all rows), so if you issue a COMMIT or run any DDL, the data in the temporary table is lost. This is the default.
ON COMMIT PRESERVE ROWS makes the temporary table session-specific: data persists in the table until the end of the session. When the session ends, the database truncates the table (deletes all rows), so if you type EXIT in SQL*Plus, the data in the temporary table is lost.
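For illustration, a minimal sketch of both variants (the table and column names here are invented, not taken from the question):
CREATE GLOBAL TEMPORARY TABLE tmp_results_tx (
  id    NUMBER,
  label VARCHAR2(100)
) ON COMMIT DELETE ROWS;    -- transaction-specific: emptied at COMMIT/ROLLBACK (the default)

CREATE GLOBAL TEMPORARY TABLE tmp_results_sess (
  id    NUMBER,
  label VARCHAR2(100)
) ON COMMIT PRESERVE ROWS;  -- session-specific: emptied when the session ends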
Reference:
ASKTOM
Create Temporary Table in Oracle
...but it is possible that you don't need temporary tables at all. Derived tables/inline views/subqueries (or perhaps pipelined functions) may do what you want, but the information given is too vague for me to recommend a particular approach.
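For illustration only (the table names and filter conditions below are invented), a subquery-factoring (WITH clause) sketch that derives the two small result sets and joins them in a single query might look like this:
with small_a as (
  select id, col1
  from   big_table_1
  where  created_date >= date '2010-01-01'
),
small_b as (
  select id, col2
  from   big_table_2
  where  status = 'ACTIVE'
)
select a.id, a.col1, b.col2
from   small_a a
join   small_b b on b.id = a.id;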
If your collections are declared with a SQL type, you can use them in SQL statements via the TABLE() function. In Oracle 10g we can also merge collections using the MULTISET UNION operator. The following code shows examples of both techniques...
SQL> declare
2 v1 sys.dbms_debug_vc2coll;
3 v2 sys.dbms_debug_vc2coll;
4 v3 sys.dbms_debug_vc2coll := sys.dbms_debug_vc2coll();
5 begin
6 select ename
7 bulk collect into v1
8 from emp;
9 select dname
10 bulk collect into v2
11 from dept;
12
13 -- manipulate collections using SQL
14
15 for r in ( select * from table(v1)
16 intersect
17 select * from table(v2)
18 )
19 loop
20 dbms_output.put_line('Employee '|| r.column_value ||' has same name as a department');
21 end loop;
22
23 -- combine two collections into one
24
25 dbms_output.put_line('V3 has '|| v3.count() ||' elements');
26 v3 := v1 multiset union v2;
27 dbms_output.put_line('V3 now has '|| v3.count() ||' elements');
28 end;
29 /
Employee SALES has same name as a department
V3 has 0 elements
V3 now has 23 elements
PL/SQL procedure successfully completed.
SQL>
There are a number of other approaches you could employ. As a rule it is better to use SQL rather than PL/SQL, so OMG Ponies' suggestion of temporary tables might be appropriate. It really depends on the precise details of your processing needs.
You need to create a schema-level type (not inside a package) as a nested table. You can populate such collections and then use them in your queries as if they were normal tables, via the TABLE() operator.
This link explains what you need. A quick example:
create type foo as table of number;  -- a schema-level nested table type
/
declare
  myfoo1 foo := foo(1, 2, 3);
  myfoo2 foo := foo(3, 4, 5);
  bar    number;
begin
  -- use the collections as if they were tables
  select column_value
  into   bar
  from   table(myfoo1) join table(myfoo2) using (column_value);
end;
/
I have an Oracle SQL database with two scripts that I uploaded to Oracle; one contains the tables, the other contains the registries.
I have a table DEPARTMENTS with 2 columns, CODE and NAME.
From Oracle 10g SQL home > SQL workshop > SQL commands, I type in this statement:
SELECT CODE, NAME
FROM DEPARTMENTS
The result is
"No data found"
I cannot retrieve results correctly in either Oracle 10g or livesql.oracle.com.
Is my statement incorrect? Is it not retrieving the data uploaded?
Result should actually be "no rows selected", not "no data found".
You said you have two scripts; I presume that one of them contains something like this:
SQL> create table departments
2 (code number,
3 name varchar2(20));
Table created.
Basically, you created an empty table. If you query it, there's nothing in there:
SQL> select code, name from departments;
no rows selected
Therefore, the second script most probably contains statements that populate tables created in the first script. Is that what you call "registries"?
SQL> insert into departments (code, name)
2 select 1, 'Accounting' from dual union all
3 select 2, 'Finance' from dual;
2 rows created.
Let's repeat the previous select statement:
SQL> select code, name from departments;
CODE NAME
---------- ----------
1 Accounting
2 Finance
SQL>
Right; now we got the result.
On the other hand, maybe you actually ran both scripts, created the tables and inserted rows, but didn't commit at the end of the second script. If you then opened a new session (in Oracle 10g SQL home > SQL workshop > SQL commands), you would still see the tables (CREATE TABLE is DDL and modifies the data dictionary, so your SELECT didn't raise "ORA-00942: table or view does not exist"), but INSERT is DML and you have to COMMIT it - otherwise only the session that ran the inserts sees the data.
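In other words, the second (insert) script should end with an explicit commit, e.g.:
-- at the end of the script that runs the INSERT statements
commit;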
Finally, check whether the insert statements completed successfully; if not, there's a chance that you didn't insert anything and will first have to fix the errors. Which ones? No idea - I don't have those scripts, nor have I seen the screen (or a log file, if there is one).
I have a table, participated, which has a trigger that returns the total damage amount for a driver id when a new record is inserted into the table.
create or replace trigger display
after insert on participated
for each row
declare
t_id integer;
total integer;
begin
select driver_id, sum(damage_amount)
into t_id, total
from participated
where :new.driver_id = driver_id
group by driver_id;
dbms_output.put_line(' total amount ' || total || ' driver id ' || t_id);
end;
/
The trigger is created, but it returns this error:
ORA-04091: table SQL_HSATRWHKNJHKDFMGWCUISUEEE.PARTICIPATED is mutating,
trigger/function may not see it ORA-06512: at
"SQL_HSATRWHKNJHKDFMGWCUISUEEE.DISPLAY", line 5
ORA-06512: at "SYS.DBMS_SQL", line 1721
Please help with this trigger.
As commented above, this feels like a code smell. A row-level trigger cannot query or change the table that is being changed (that is what the mutating-table error tells you); if it could change it, that would fire the trigger again and end up in an endless loop of trigger calls.
Changing this to a statement-level trigger would not do the same thing.
Preferred solutions:
1) Put this in the application logic and calculate the total after the row has been inserted - this is trivial, as #kfinity mentioned.
2) Earmark newly inserted rows and use a statement-level trigger. For example, add an extra column, say is_new default 1, so that all newly inserted rows carry this flag. Then use a statement-level trigger, as suggested by #hbourchi, to calculate and update the totals for all drivers where is_new is 1, and finally set the flag back to zero.
3) The logic in 2) can also be implemented using PL/SQL and in-memory PL/SQL tables: a row-level trigger collects the affected driver ids into a PL/SQL table, and a statement-level trigger then updates the totals for just those drivers. Tom Kyte has lots of examples of this; it is not rocket science, but if you lack PL/SQL knowledge it is probably not your way. (For the record: PL/SQL is super important when using Oracle - without it, Oracle is just an expensive spreadsheet like any other database. It is worth using.)
4) You could also revise your data model - and the problem solves itself. The participated table holds multiple rows per driver id, and you want one total row per driver id - why would you put that summary in the same table? Simply add a new table, say participated_total, with driver_id and damage_amount columns, and then feel free to insert into or update that table from your trigger, as you originally planned (see the sketch after this list).
5) In fact, you can calculate these totals on the fly (depending on the number of rows and your performance expectations) simply by crafting the right SQL at query time - that way there is no need to store pre-calculated totals (also shown below).
6) And if you do want Oracle to store these totals for you, you can combine 5) with materialized views. These are in fact tables that Oracle updates and maintains automatically, so the query from 5) does not need to calculate anything on the fly; it can read the automatically pre-calculated data from the materialized view.
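A rough sketch of options 4) and 5), assuming a separate summary table named participated_total (the table name and the MERGE-based trigger are my illustration, not part of the original post):
create table participated_total (
  driver_id     integer primary key,
  damage_amount integer not null
);

create or replace trigger participated_total_trg
after insert on participated
for each row
begin
  -- maintain the summary table instead of re-querying participated,
  -- so the mutating-table error cannot occur
  merge into participated_total t
  using (select :new.driver_id as driver_id, :new.damage_amount as amt from dual) s
  on (t.driver_id = s.driver_id)
  when matched then
    update set t.damage_amount = t.damage_amount + s.amt
  when not matched then
    insert (driver_id, damage_amount) values (s.driver_id, s.amt);
end;
/

-- option 5: compute the totals on the fly instead of storing them
select driver_id, sum(damage_amount) as total_damage
from   participated
group by driver_id;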
How about just no triggers at all?
SQL> create table participated (
2 incident_id int primary key,
3 driver_id int not null,
4 damage_amount int not null
5 );
Table created.
SQL>
SQL> insert into participated
2 select rownum, mod(rownum,10), dbms_random.value(1000,2000)
3 from dual
4 connect by level <= 200;
200 rows created.
SQL> create materialized view log on participated with rowid, (driver_id,damage_amount), sequence including new values;
Materialized view log created.
SQL> create materialized view driver_tot
2 refresh fast on commit
3 with primary key
4 as
5 select driver_id,
6 sum(damage_amount) tot,
7 count(*) cnt
8 from participated
9 group by driver_id;
Materialized view created.
SQL> select driver_id, tot
2 from driver_tot
3 order by 1;
DRIVER_ID TOT
---------- ----------
0 32808
1 29847
2 28585
3 29714
4 32148
5 30491 <====
6 29258
7 32103
8 30131
9 26834
10 rows selected.
SQL>
SQL> insert into participated values (9999,5,1234);
1 row created.
SQL> commit;
Commit complete.
SQL>
SQL> select driver_id, tot
2 from driver_tot
3 order by 1;
DRIVER_ID TOT
---------- ----------
0 32808
1 29847
2 28585
3 29714
4 32148
5 31725 <====
6 29258
7 32103
8 30131
9 26834
10 rows selected.
SQL>
You didn't post your trigger definition, but normally you cannot query a table inside a trigger while records in that same table are being modified.
Try using an AFTER UPDATE trigger; it might work in your case. Something like this:
CREATE OR REPLACE TRIGGER my_trigger AFTER UPDATE ON my_table FOR EACH ROW
DECLARE
...
BEGIN
...
END;
Another option would be to make your trigger an AUTONOMOUS_TRANSACTION:
CREATE OR REPLACE TRIGGER my_trigger BEFORE INSERT ON my_table FOR EACH ROW
DECLARE
PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
...
END;
However, read this before choosing this option:
https://docs.oracle.com/cd/B14117_01/appdev.101/b10807/13_elems002.htm
Is there a way to generically copy a row, in particular WITHOUT specifying all the columns?
In my situation I have a large table from which I would like to copy all the columns except the ID and one other column. In fact, the data is copied from year to year at the start of the year. The table has 50+ columns, so it would be more flexible and robust against schema changes if I did not have to specify all the columns.
This is closely related to the question : copy row while updating one field
In that question Kevin Cline's comment essentially asks my question, but no solution was actually provided for this more general case.
EDIT to provide more detail as requested, here is an example of what is needed:
-- setup test table
create table my_table(pk, v1,v2,v3,v4 ... v50) as
select 17 pk, 1 v1,2 v2,3 v3... 50 v50 from dual;
On the above table copy the row and set pk to 18 and v2 to 10.
An easy way to do this is an anonymous PL/SQL block and the use of %ROWTYPE:
-- setup test table
create table my_table(pk, value) as
select 17 pk, 'abc' value from dual;
declare
l_data my_table%rowtype;
begin
-- fetch the row we want to copy
select * into l_data from my_table tbl where tbl.pk = 17;
-- update all fields that need to change
l_data.pk := 18;
-- note the lack of parens around l_data in the next line
insert into my_table values l_data;
end;
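Applied to the wider table from the question (assuming columns pk and v1 .. v50 as in the setup at the top of the question), the same pattern might look like this:
declare
  l_data my_table%rowtype;
begin
  -- fetch the row we want to copy
  select * into l_data from my_table where pk = 17;
  -- change only the columns that differ in the copy
  l_data.pk := 18;
  l_data.v2 := 10;
  insert into my_table values l_data;
end;
/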
I already tried this:
CREATE GLOBAL TEMPORARY TABLE tempTable AS
SELECT * FROM realTable;
But then tempTable has only the structure of realTable, but not the elements themselves.
"But then tempTable has only the structure of realTable, but not the
elements themselves."
What makes a global temporary table temporary is that the data is transient. Firstly, the data is only visible within the session which inserts it; any other session will see an empty table. Secondly, the data can persist for either a transaction or the session, depending on the ON COMMIT clause; the default is ON COMMIT DELETE ROWS. Find out more.
Now the thing is, a DDL statement in Oracle issues two commits, one before and one after the statement in question. So a DDL statement is a complete, discrete transaction. Hence this ...
CREATE GLOBAL TEMPORARY TABLE tempTable AS
SELECT * FROM realTable;
... is a transaction and, as it doesn't specify the ON COMMIT clause, it will apply the default which is DELETE ROWS. So an empty table is the expected behaviour.
The solution is simple: specify the ON COMMIT clause with session-level retention:
SQL> select count(*) from t23;
COUNT(*)
----------
11
SQL> create global temporary table gtt23
2 as select * from t23
3 /
Table created.
SQL> select count(*) from gtt23;
COUNT(*)
----------
0
SQL> drop table gtt23;
Table dropped.
SQL> create global temporary table gtt23
2 on commit preserve rows
3 as select * from t23
4 /
Table created.
SQL> select count(*) from gtt23;
COUNT(*)
----------
11
SQL>
Generally, I think that a policy of CREATE GLOBAL TEMPORARY TABLE using SELECT * FROM indicates a misunderstanding of the construct. GTTs in Oracle are permanent data structures; only the records are temporary. They are not disposable objects like temporary tables in T-SQL. If that's the sort of thing you want, you should probably be using PL/SQL collections instead. Find out more.
Global temporary tables can have either transaction-level scope or session-level scope. The default is to have transaction-level scope which means that the data disappears after the transaction completes. If you do a CREATE TABLE AS SELECT to create your global temporary table, the data will be inserted but, since CREATE is DDL, the data will be removed as soon as the statement completes.
One option would be to create the structure using a query that doesn't return any data:
CREATE GLOBAL TEMPORARY TABLE tempTable AS
SELECT *
FROM realTable
WHERE 1=0;
then insert the data
INSERT INTO tempTable
SELECT *
FROM realTable;
Of course, given how infrequently global temporary tables are used in Oracle (particularly in comparison to other databases), I'd want to be very certain that you really need to create a temporary table from a permanent table in the first place.
I have two tables
moduleprogress which contains fields:
studentid
modulecode
moduleyear
modules which contains fields:
modulecode
credits
I need a trigger to run when the user is attempting to insert or update data in the moduleprogress table.
The trigger needs to:
look at the studentid that the user has input and look at all modules that they have taken in moduleyear "1".
take the modulecode the user input and look at the modules table and find the sum of the credits field for all these modules (each module is worth 10 or 20 credits).
if the value is above 120 (the yearly credit limit) then it needs to raise an error; otherwise the input is OK.
Does this make sense? Is this possible?
#a_horse_with_no_name
This looks like it will work, but I will only be using the database to input data manually, so it needs to raise the error on input. I'm trying to get a trigger similar to this one to solve the problem (the trigger doesn't work); ignore the "UOS_" prefix on everything - it just helps me with my database and other functions.
CREATE OR REPLACE TRIGGER "UOS_TESTINGS"
BEFORE UPDATE OR INSERT ON UOS_MODULE_PROGRESS
REFERENCING NEW AS NEW OLD AS OLD
DECLARE
MODULECREDITS INTEGER;
BEGIN
SELECT
m.UOS_CREDITS,
mp.UOS_MODULE_YEAR,
SUM(m.UOS_CREDITS)
INTO MODULECREDITS
FROM UOS_MODULE_PROGRESS mp JOIN UOS_MODULES m
ON m.UOS_MODULE_CODE = mp.UOS_MODULE_CODE
WHERE mp.UOS_MODULE_YEAR = 1;
IF MODULECREDITS >= 120 THEN
RAISE_APPLICATION_ERROR(-20000, 'Students are only allowed to take upto 120 credits per year');
END IF;
END;
I get the error message:
8 23 PL/SQL: ORA-00947: not enough values
4 1 PL/SQL: SQL Statement ignored
I'm not sure I understand your description, but the way I understand it, this can be solved using a materialized view, which might give better transactional behaviour than the trigger:
CREATE MATERIALIZED VIEW LOG
ON moduleprogress WITH ROWID (modulecode, studentid, moduleyear)
INCLUDING NEW VALUES;
CREATE MATERIALIZED VIEW LOG
ON modules with rowid (modulecode, credits)
INCLUDING NEW VALUES;
CREATE MATERIALIZED VIEW mv_module_credits
REFRESH FAST ON COMMIT WITH ROWID
AS
SELECT pr.studentid,
SUM(m.credits) AS total_credits
FROM moduleprogress pr
JOIN modules m ON pr.modulecode = m.modulecode
WHERE pr.moduleyear = 1
GROUP BY pr.studentid;
ALTER TABLE mv_module_credits
ADD CONSTRAINT check_total_credits CHECK (total_credits <= 120);
Depending on the size of the tables, however, this might be slower than a pure trigger-based solution.
The only drawback of this solution is that the error is thrown at commit time, not when the insert happens (because the materialized view is only refreshed on commit, and the check constraint is evaluated then).
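A hedged usage sketch of that behaviour (the student id and module code are invented, and the exact error numbers are approximate):
insert into moduleprogress (studentid, modulecode, moduleyear)
values (1, 'MOD999', 1);

commit;  -- if student 1 now exceeds 120 credits, the on-commit refresh fails here,
         -- typically raising ORA-12008 with a nested ORA-02290 (check constraint violated)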