Create UNION ALL statements via a loop - sql

Manually, I can select partitions in an inner query as in the first code block below. Is there a more elegant way to do this via a loop? I'm showing 3 partitions here, but I have about 200, and the partitions are based on a date column, so the partition names will need to change when I run this query again at a future date.
SELECT *
FROM (
SELECT * FROM RSS_ACQ.TRX_ARQ PARTITION("SYS_P211048") UNION ALL
SELECT * FROM RSS_ACQ.TRX_ARQ PARTITION("SYS_P210329") UNION ALL
SELECT * FROM RSS_ACQ.TRX_ARQ PARTITION("SYS_P176323")
) TRX_ARQ
;
With this statement, I've created a loop that outputs the UNION ALL statements.
BEGIN
FOR ALL_TAB_PARTITIONS IN
(
SELECT PARTITION_NAME
FROM ALL_TAB_PARTITIONS
where TABLE_OWNER = 'TABLEOWNER'
AND TABLE_NAME = 'TABLENAME'
AND PARTITION_POSITION > 123
ORDER BY partition_position DESC
)
LOOP
DBMS_OUTPUT.PUT_LINE( 'SELECT * FROM RSS_ACQ.TRX_ARQ PARTITION("'
|| ALL_TAB_PARTITIONS.PARTITION_NAME || '") UNION ALL');
END LOOP;
END;
And in this block, I've attempted to use the loop inside the inner query. It's not yet formatted correctly and I'll need to avoid having UNION ALL for the very last partition.
SELECT *
FROM (
BEGIN
FOR ALL_TAB_PARTITIONS IN
(
SELECT PARTITION_NAME
FROM ALL_TAB_PARTITIONS
where TABLE_OWNER = 'TABLEOWNER'
AND TABLE_NAME = 'TABLENAME'
AND PARTITION_POSITION > 123
ORDER BY partition_position DESC
)
LOOP
DBMS_OUTPUT.PUT_LINE( 'SELECT * FROM RSS_ACQ.TRX_ARQ PARTITION(\"'
|| ALL_TAB_PARTITIONS.PARTITION_NAME || '\") UNION ALL');
END LOOP;
END;
) TRX_ARQ
;
Here are some of the errors, but there were many more. They are syntax errors pointing to other parts of the query, so I would expect that I have an issue with escaping the quotes.
Error starting at line : 99 in command -
END LOOP
Error report -
Unknown Command
Error starting at line : 100 in command -
END
Error report -
Unknown Command
Error starting at line : 101 in command -
)
Error report -
Unknown Command
Error starting at line : 102 in command -
) TABLENAME
Error report -
Unknown Command

This is a bit of a guess, but it's too long for a comment.
I am assuming your table is interval partitioned. In that case, getting all the data from partition positions > 123 is the same as getting all the rows with a higher date than the highest date in partition 123.
You can obtain that date from ALL_TAB_PARTITIONS and then use it to query the table. Like this:
WITH FUNCTION get_high_value RETURN DATE IS
l_high_val_expr ALL_TAB_PARTITIONS.HIGH_VALUE%TYPE;
l_high_value DATE;
BEGIN
SELECT high_value
INTO l_high_val_expr
FROM all_tab_partitions
WHERE table_owner = 'RSS_ACQ'
AND table_Name = 'TRX_ARQ'
and partition_position = 123;
EXECUTE IMMEDIATE 'SELECT ' || l_high_val_expr || ' FROM DUAL' INTO l_high_value;
RETURN l_high_value;
END;
SELECT * FROM rss_acq.trx_arq
-- Replace "partitioned_date_column" with the name of the column on which the
-- table is interval partitioned.
WHERE partitioned_date_column > get_high_value;

We can't execute an anonymous PL/SQL block in a SELECT statement.
What you need to do is spool the output of the ALL_TAB_PARTITIONS loop to a file (or a SQL worksheet if you're using an IDE like SQL Developer). This will give you a script you can run separately after editing it (you need to trim the UNION ALL from the final generated SELECT).
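For example, a rough sketch of the spool approach in SQL*Plus or SQL Developer could look like this (the file name and formatting settings are assumptions, and a plain SELECT is used instead of the DBMS_OUTPUT loop so nothing extra ends up in the spooled file):
set pagesize 0 linesize 200 feedback off
spool union_query.sql
SELECT 'SELECT * FROM RSS_ACQ.TRX_ARQ PARTITION("' || partition_name || '") UNION ALL'
FROM all_tab_partitions
WHERE table_owner = 'RSS_ACQ'
AND table_name = 'TRX_ARQ'
AND partition_position > 123
ORDER BY partition_position DESC;
spool off
Open union_query.sql, trim the UNION ALL from the last line, wrap the statements in your outer query, and run it.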
Probably there are more elegant ways of achieving the same thing, but the task seems sufficiently wrong that it doesn't strike me as being worth the effort. You want to query 200 partitions in a single statement. That is a brute-force operation and there isn't much to be gained from querying named partitions. In fact, producing a union of 200 separate queries may be more expensive than a single query. So why not try something like this?
select * from RSS_ACQ.TRX_ARQ
where partition_key_col >= date '2018-08-01' -- or whatever
"I think you are overlooking the 12c feature of using PL/SQL in the WITH clause"
That 12c feature is for functions not procedures, so it won't help the OP run their code. It would be possible to use a WITH clause function but that would require:
creating a type with the same projection as the target table
and a nested table type based on that type
a WITH clause function which assembles and executes a dynamic SQL statement
we can't use REF CURSORs in SQL so ...
the function has to execute the dynamic select INTO a local collection variable ...
then loop over the collection and PIPE ROW to output those rows ...
so the main query can call the function with a table() call
Can a WITH clause function be pipelined? I can't find anything in the documentation to say we can't (don't have access to 12c right now to test).
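For what it's worth, here is a rough, untested sketch of those steps using a schema-level pipelined function instead of a WITH clause function (the object type and its columns are assumptions; use the target table's real projection):
CREATE TYPE trx_arq_row AS OBJECT (
  -- same projection as the target table; these columns are hypothetical
  trx_id   NUMBER,
  trx_date DATE
);
/
CREATE TYPE trx_arq_tab AS TABLE OF trx_arq_row;
/
CREATE OR REPLACE FUNCTION trx_by_partition (p_position IN NUMBER)
  RETURN trx_arq_tab PIPELINED
IS
  l_rows trx_arq_tab;
BEGIN
  FOR p IN (SELECT partition_name
              FROM all_tab_partitions
             WHERE table_owner = 'RSS_ACQ'
               AND table_name  = 'TRX_ARQ'
               AND partition_position > p_position)
  LOOP
    -- assemble and execute a dynamic SELECT into a local collection ...
    EXECUTE IMMEDIATE
      'SELECT trx_arq_row(trx_id, trx_date) FROM rss_acq.trx_arq PARTITION("'
      || p.partition_name || '")'
      BULK COLLECT INTO l_rows;
    -- ... then loop over the collection and PIPE ROW each row to the caller
    FOR i IN 1 .. l_rows.COUNT LOOP
      PIPE ROW (l_rows(i));
    END LOOP;
  END LOOP;
  RETURN;
END;
/
-- the main query calls the function with a table() call
SELECT * FROM TABLE(trx_by_partition(123));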

Related

How to create a Procedure with a specific WITH clause in Oracle?

I have a problem creating a procedure with a specific WITH clause. I use an Oracle database. I need a temp table for mapping specific values at execution time.
In addition, I have the following Oracle procedure, which will be created and executed on the database:
How can I create this kind of Oracle SQL procedure? Many thanks for helping me!
Use your WITH clause in a cursor FOR loop in your procedure:
DECLARE
PROCEDURE myProcedure(pinSchema IN VARCHAR2) AS
BEGIN
FOR aRow IN (WITH t AS (SELECT 'myFirstOldSchema' AS OLD_SCHEMA,
'myFirstNewSchema' AS NEW_SCHEMA
FROM DUAL UNION ALL
SELECT 'mySecondOldSchema' AS OLD_SCHEMA,
'mySecondNewSchema' AS NEW_SCHEMA
FROM DUAL UNION ALL
SELECT 'myThirdOldSchema' AS OLD_SCHEMA,
'myThirdNewSchema' AS NEW_SCHEMA
FROM DUAL)
SELECT t.*
FROM t
WHERE t.OLD_SCHEMA = pinSchema)
LOOP
-- Here a specific value from column "OLD_SCHEMA" must be inserted
EXECUTE IMMEDIATE 'INSERT INTO ' || aRow.OLD_SCHEMA ||
'.myTable(column_1, column_2, column_3)
(SELECT extern_column_1,
extern_column_2,
extern_column_3
FROM ' || aRow.NEW_SCHEMA || '.myExternTable)';
END LOOP;
END myProcedure;
BEGIN
FOR S IN (SELECT * FROM ROOT_SCHEMA.myTableWithSchema)
LOOP
-- First loop S.mySchemata represent the value 'myFirstOldSchema'
-- Second loop S.mySchemata represent the value 'mySecondOldSchema'
-- Third loop S.mySchemata represent the value 'myThirdOldSchema'
myProcedure(S.mySchemata);
END LOOP;
COMMIT;
END;
Not tested on animals - you'll be first!

How can I loop to get counts from multiple tables?

I am trying to derive a table with counts from multiple tables. The tables are not on my schema. The table names on the schema that I am interested in all start with 'STAF_' and end with '_TS'. The criterion I am looking for is where SEP = 'MO'. So, for example, the query in its base form is:
select area, count(SEP) areacount
from mous.STAF_0001_TS
where SEP = 'MO'
group by area;
I have about 1000 tables that I'd like to do this for.
Ultimately, I'd like the output to be a table on my schema that looks like the following:
area| areacount
0001| 3
0002| 7
0003| 438
Thank you.
As a first step I'd write an SQL query that generates an SQL query:
SELECT 'SELECT area, count(*) AS areacount FROM ' || c.owner || '.' || c.table_name
       || ' WHERE SEP = ''MO'' GROUP BY area UNION ALL' as run_me
FROM all_tables c
WHERE c.owner = 'MOUS'
AND c.table_name LIKE 'STAF\_%\_TS' escape '\'
Running this will produce output that is another SQL query. Copy the result text out of your results grid and paste it back into your query pane, delete the final UNION ALL, and run it.
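For example, with three matching tables the generated script would look something like this before that final edit (table names are illustrative):
SELECT area, count(*) AS areacount FROM MOUS.STAF_0001_TS WHERE SEP = 'MO' GROUP BY area UNION ALL
SELECT area, count(*) AS areacount FROM MOUS.STAF_0002_TS WHERE SEP = 'MO' GROUP BY area UNION ALL
SELECT area, count(*) AS areacount FROM MOUS.STAF_0003_TS WHERE SEP = 'MO' GROUP BY area UNION ALL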
Once you get the hang of writing an SQL query that generates an SQL query, you can look at turning it into a view, or creating a dynamic query in a string.
Gotta say, this is a horrible way to store data; you'd be better off using ONE table with an extra column containing whatever is in the xxx of STAF_xxx_TS right now.
In Oracle 12c, you can embed a FUNCTION in your query's WITH clause that will query the number of rows in any given table. Then you can use that function in your main query. Here is an example:
WITH FUNCTION cnt ( p_owner VARCHAR2, p_table_name VARCHAR2 ) RETURN NUMBER IS
l_cnt NUMBER;
BEGIN
EXECUTE IMMEDIATE 'SELECT count(*) FROM ' || p_owner || '.' || p_table_name INTO l_cnt;
RETURN l_cnt;
EXCEPTION WHEN OTHERS THEN
RETURN NULL; -- This will happen for entries in ALL_TABLES that are not directly accessible (e.g., IOT overflow tables)
END cnt;
SELECT t.owner, t.table_name, cnt(t.owner, t.table_name)
FROM all_tables t
where t.table_name like 'STAF\_%\_TS' escape '\';

Need to build a query with column names stored in another table

I have a table as shown below. This table will be generated dynamically and I have no prior idea about what value it is going to hold.
------------------------------------------
TABLE_NAME COLUMN_NAME CHAR_LENGTH
------------------------------------------
EMPLOYEE COL1 100
EMPLOYEE COL2 200
EMPLOYEE COL3 300
EMPLOYEE COL4 400
Based on this table, I want to build a query that returns those columns whose data has a character length greater than the CHAR_LENGTH value.
For example, if COL2 contains data with a character length of 500 (> 200), then the query would return COL2.
I don't have any draft code to show my attempt, as I have no idea how I would do this.
I don't think this is possible in pure SQL due to the dynamic nature of your requirement. You'll need some form of PL/SQL.
Assuming you're ok with simply outputting the desired results, here is a PL/SQL block that will get the job done:
declare
wExists number(1);
begin
for rec in (select * from your_dynamic_table)
loop
execute immediate 'select count(*)
from dual
where exists (select null
from ' || rec.table_name || ' t
where length(t.' || rec.column_name || ') > ' || rec.char_length || ')'
into wExists;
if wExists = 1 then
dbms_output.put_line(rec.column_name);
end if;
end loop;
end;
You'll also notice the use of the exists clause to optimize the query, so that it doesn't iterate over the whole table unnecessarily.
Alternatively, if you want the results to be queryable, you can consider converting the code to a pipelined function.
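A minimal sketch of that conversion, reusing the same your_dynamic_table and existence check from the block above (the collection type and function name are assumptions):
CREATE TYPE varchar2_tab AS TABLE OF VARCHAR2(128);
/
CREATE OR REPLACE FUNCTION oversized_columns RETURN varchar2_tab PIPELINED IS
  l_exists NUMBER(1);
BEGIN
  FOR rec IN (SELECT table_name, column_name, char_length FROM your_dynamic_table) LOOP
    -- same existence check as in the anonymous block above
    EXECUTE IMMEDIATE 'select count(*) from dual where exists (select null from '
      || rec.table_name || ' t where length(t.' || rec.column_name || ') > '
      || rec.char_length || ')'
      INTO l_exists;
    IF l_exists = 1 THEN
      PIPE ROW (rec.column_name);  -- expose the offending column to SQL
    END IF;
  END LOOP;
  RETURN;
END;
/
-- then: SELECT * FROM TABLE(oversized_columns);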
select column_name
from (
select statement that builds the table output
) A
where char_length<length(column_name)
will that help?
You would need a procedure to achieve this:
Here I am using the all_tab_columns view in Oracle, which is a built-in view with a structure much like your reference example (try select * from all_tab_columns). Its structure is much like yours, except that you will never find a varchar record whose value exceeds its data length (an obvious database-level constraint). Date fields may exceed the data length, and they do show up in this procedure's output. I am searching for all columns in EMPLOYEES whose size exceeds what is specified.
DECLARE
cursor c is select column_name,data_length,table_name from all_tab_columns where table_name=:Table_name;
V_INDEX_NAME all_tab_columns.column_name%type;
v_data_length all_tab_columns.data_length%type;
V_NUMBER PLS_INTEGER;
v_table_name all_tab_columns.table_name%type;
BEGIN
open c;
LOOP
FETCH c into v_index_name,v_data_length,v_table_name;
EXIT when c%NOTFOUND;
v_number :=0;
execute immediate 'select count(*) from '|| :Table_name ||' where length('||v_index_name||')>'||v_data_length into v_number;
if v_number > 0 then
dbms_output.put_line(v_index_name||' has values greater than specified'||' '||V_INDEX_NAME||' '||v_data_length);
end if;
END LOOP;
close c;
END;
/
Replace all_tab_columns and its respective columns with the column name of your table.
DEFECTS: The table name is hard-coded. I am trying to make the code generic with EXECUTE IMMEDIATE or some other trick, and will update soon.
EDIT: Defect fixed.

Collecting the last updates of multiple tables into a single table

I have a problem in that I want my output to be a single table (let's call it Output) with 2 columns: one for the "TableName" and one for the DateTime of the last update (using scn_to_timestamp(max(ora_rowscn))).
I have 100 tables and I want to pull in the last update date/times for all these tables into the Output table.
So I can do this:
insert into Output(TableName)
select table_name
from all_tables;
which will put all the tables from my database into the TableName column. But I don't know how to loop through each entry, use the table name as a variable, and pass it into scn_to_timestamp(max(ora_rowscn)).
I thought I would try something like below:
for counter in Output(TableName) LOOP
insert into Output(UpdateDate)
select scn_to_timestamp(max(ora_rowscn))
from counter;
END LOOP;
Any suggestions?
Thank you
This query is a little bit clumsy as it uses dbms_xmlgen to execute dynamic SQL inside a query, but it might work for you.
select x.*
from all_tables t,
xmltable('/ROWSET/ROW' passing
dbms_xmlgen.getxmltype('select ''' || t.table_name ||
''' tab_name, max(ora_rowscn) as la from ' ||
t.table_name)
COLUMNS tab_name varchar2(30) PATH 'TAB_NAME',
max_scn number PATH 'LA') x
Here is a sqlfiddle demo
You can also use PL/SQL and then use EXECUTE IMMEDIATE.
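For example, a rough PL/SQL sketch along those lines might be (assuming the Output table already exists with TableName and UpdateDate columns, and that you only want your own tables):
DECLARE
  l_ts TIMESTAMP;
BEGIN
  FOR t IN (SELECT table_name FROM all_tables WHERE owner = USER) LOOP
    BEGIN
      EXECUTE IMMEDIATE
        'SELECT scn_to_timestamp(MAX(ora_rowscn)) FROM ' || t.table_name
        INTO l_ts;
      INSERT INTO Output (TableName, UpdateDate) VALUES (t.table_name, l_ts);
    EXCEPTION
      WHEN OTHERS THEN
        NULL;  -- skip tables that are empty or whose SCN is too old to convert
    END;
  END LOOP;
  COMMIT;
END;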

How to choose tables on select from all_tables?

I have the following table name pattern; there are several tables with the same name and a number at the end, fmj.backup_semaforo_geo_THENUMBER, for example:
select * from fmj.backup_semaforo_geo_06391442
select * from fmj.backup_semaforo_geo_06398164
...
Let's say I need to select a column from every table that matches the 'fmj.backup_semaforo_geo_%' filter. I tried this:
SELECT calle --This column is from the backup_semaforo_geo_# tables
FROM (SELECT table_name
FROM all_tables
WHERE owner = 'FMJ' AND table_name LIKE 'BACKUP_SEMAFORO_GEO_%');
But I'm getting the table names from all_tables instead:
TABLE_NAME
----------
BACKUP_SEMAFORO_GEO_06391442
BACKUP_SEMAFORO_GEO_06398164
...
How can I achieve that without getting the all_tables output?
Thanks.
Presumably your current query is getting ORA-00904: "CALLE": invalid identifier, because the subquery doesn't have a column called CALLE. You can't provide a table name to a query at runtime like that, unfortunately, and have to resort to dynamic SQL.
Something like this will loop through all the tables and for each one will get all the values of CALLE from each one, which you can then loop through. I've used DBMS_OUTPUT to display them, assuming you're doing this in SQL*Plus or something that can deal with that; but you may want to do something else with them.
set serveroutput on
declare
-- declare a local collection type we can use for bulk collect; use any table
-- that has the column, or if there isn't a stable one use the actual data
-- type, varchar2(30) or whatever is appropriate
type t_values is table of table.calle%type;
-- declare an instance of that type
l_values t_values;
-- declare a cursor to generate the dynamic SQL; where this is done is a
-- matter of taste (can use 'open x for select ...', then fetch, etc.)
-- If you run the query on its own you'll see the individual selects from
-- all the tables
cursor c1 is
select table_name,
'select calle from ' || owner ||'.'|| table_name as query
from all_tables
where owner = 'FMJ'
and table_name like 'BACKUP_SEMAFORO_GEO%'
order by table_name;
begin
-- loop around all the dynamic queries from the cursor
for r1 in c1 loop
-- for each one, execute it as dynamic SQL, with a bulk collect into
-- the collection type created above
execute immediate r1.query bulk collect into l_values;
-- loop around all the elements in the collection, and print each one
for i in 1..l_values.count loop
dbms_output.put_line(r1.table_name ||': ' || l_values(i));
end loop;
end loop;
end;
/
Maybe use dynamic SQL in a PL/SQL program:
for a in (SELECT table_name
FROM all_tables
WHERE owner = 'FMJ' AND table_name LIKE 'BACKUP_SEMAFORO_GEO_%')
LOOP
sql_stmt := 'SELECT calle FROM ' || a.table_name;
EXECUTE IMMEDIATE sql_stmt;
...
...
END LOOP;