How to choose tables on select from all_tables? - sql

I have tables whose names follow a template: the same base name with a number at the end, fmj.backup_semaforo_geo_THENUMBER. For example:
select * from fmj.backup_semaforo_geo_06391442
select * from fmj.backup_semaforo_geo_06398164
...
Let's say I need to select a column from every table that matches the 'fmj.backup_semaforo_geo_%' filter. I tried this:
SELECT calle --This column is from the backup_semaforo_geo_# tables
FROM (SELECT table_name
FROM all_tables
WHERE owner = 'FMJ' AND table_name LIKE 'BACKUP_SEMAFORO_GEO_%');
But I'm just getting the table names from all_tables back:
TABLE_NAME
----------
BACKUP_SEMAFORO_GEO_06391442
BACKUP_SEMAFORO_GEO_06398164
...
How can I achieve that without getting the all_tables output?
Thanks.

Presumably your current query is getting ORA-00904: "CALLE": invalid identifier, because the subquery doesn't have a column called CALLE. You can't provide a table name to a query at runtime like that, unfortunately, and have to resort to dynamic SQL.
Something like this will loop through all the tables and for each one will get all the values of CALLE from each one, which you can then loop through. I've used DBMS_OUTPUT to display them, assuming you're doing this in SQL*Plus or something that can deal with that; but you may want to do something else with them.
set serveroutput on

declare
    -- declare a local collection type we can use for bulk collect; use any table
    -- that has the column, or if there isn't a stable one use the actual data
    -- type, varchar2(30) or whatever is appropriate
    type t_values is table of fmj.backup_semaforo_geo_06391442.calle%type;
    -- declare an instance of that type
    l_values t_values;
    -- declare a cursor to generate the dynamic SQL; where this is done is a
    -- matter of taste (can use 'open x for select ...', then fetch, etc.)
    -- If you run the query on its own you'll see the individual selects from
    -- all the tables
    cursor c1 is
        select table_name,
               'select calle from ' || owner || '.' || table_name as query
        from all_tables
        where owner = 'FMJ'
        and table_name like 'BACKUP_SEMAFORO_GEO%'
        order by table_name;
begin
    -- loop around all the dynamic queries from the cursor
    for r1 in c1 loop
        -- for each one, execute it as dynamic SQL, with a bulk collect into
        -- the collection type created above
        execute immediate r1.query bulk collect into l_values;
        -- loop around all the elements in the collection, and print each one
        for i in 1..l_values.count loop
            dbms_output.put_line(r1.table_name || ': ' || l_values(i));
        end loop;
    end loop;
end;
/

Maybe use dynamic SQL in a PL/SQL program:
for a in (SELECT table_name
          FROM all_tables
          WHERE owner = 'FMJ' AND table_name LIKE 'BACKUP_SEMAFORO_GEO_%')
LOOP
    sql_stmt := 'SELECT calle FROM ' || a.table_name;
    EXECUTE IMMEDIATE sql_stmt;
    ...
    ...
END LOOP;
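A hedged way to flesh that skeleton out (the variable names here are illustrative) is to open the statement as a ref cursor and fetch from it, since EXECUTE IMMEDIATE on a bare SELECT parses the query but never fetches any rows:
declare
    sql_stmt varchar2(200);
    rc       sys_refcursor;
    v_calle  varchar2(100);  -- assumed datatype of CALLE; adjust as needed
begin
    for a in (SELECT owner, table_name
              FROM all_tables
              WHERE owner = 'FMJ' AND table_name LIKE 'BACKUP_SEMAFORO_GEO_%')
    LOOP
        sql_stmt := 'SELECT calle FROM ' || a.owner || '.' || a.table_name;
        open rc for sql_stmt;
        loop
            fetch rc into v_calle;
            exit when rc%notfound;
            dbms_output.put_line(a.table_name || ': ' || v_calle);
        end loop;
        close rc;
    END LOOP;
end;
/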


Using variable as table name and pass into another sql query [duplicate]

Without going into details: the tool we use for performance testing spits out several tables (the names change with every new run) with different types of data. One table holds the list of all the table names and their data types.
I am going to use an Oracle table as an example so it can be easily explained.
What I am wanting to do...
Query all_table like:
QueryA:
select table_name from all_tables where table_name like 'HZ_CUST_%'
Result will say: Table_name = 'HZ_CUST_ACCOUNTS'
Query using the QueryA result like:
Select * from [QUERYA_RESULT] WHERE creation_date > sysdate AND account_number between '500000' and '599999'
I got something like this but not quite sure how to do this.
declare variable TABLE_NAME CHAR(10);
SELECT TABLE_NAME into :V_NAME from all_tables WHERE TABLE_NAME LIKE 'HZ_CUST_ACCOUNTS';
select *
from :V_NAME;
Thank you for the help.
The easiest solution for your problem, given what you have expressed in the comment section, is to use dynamic SQL:
declare
    v_table_name varchar2(130);
    v_sql        clob;
begin
    -- get the table name
    select table_name into v_table_name from all_tables where table_name like 'HZ_CUST_%';
    -- build dynamic statement
    v_sql := 'select * from ' || v_table_name || ' where creation_date > sysdate
              and account_number between ''500000'' and ''599999'' ';
    execute immediate v_sql;
end;
/
From this small version you can expand the code to whatever you need to do, especially with the output; it depends on what you need to do with it. I could, for example, store the maximum value of a field based on this same statement.
declare
    v_table_name varchar2(130);
    v_sql        clob;
    vmaxvalue    pls_integer;
begin
    -- get the table name
    select table_name into v_table_name from all_tables where table_name like 'HZ_CUST_%';
    -- build dynamic statement
    v_sql := 'select max(table_field) from ' || v_table_name || ' where creation_date > sysdate
              and account_number between ''500000'' and ''599999'' ';
    execute immediate v_sql into vmaxvalue;
end;
/
I don't believe your case needs PTFs (Polymorphic Table Functions), which are intended to evolve the concept of pipelined functions. However, as you didn't explain in much detail what you intend to do with the result of the query, you might want to have a look at them. Check this very good example of PTFs (remember, Oracle 18c or higher).
Polymorphic Table Functions

How can I loop to get counts from multiple tables?

I am trying to derive a table with counts from multiple tables. The tables are not in my schema. The table names in the schema that I am interested in all start with 'STAF_' and end with '_TS'. The criterion I am looking for is where SEP = 'MO'. So, for example, the query in its base form is:
select area, count(SEP) areacount
from mous.STAF_0001_TS
where SEP = 'MO'
group by area;
I have about 1000 tables that I'd like to do this for.
Ultimately, I'd like the output to be a table in my schema that looks like the following:
area| areacount
0001| 3
0002| 7
0003| 438
Thank you.
As a first step I'd write an SQL query that generates an SQL query:
SELECT 'SELECT area, count(SEP) areacount FROM mous.' || c.table_name ||
       ' WHERE SEP = ''MO'' GROUP BY area UNION ALL' as run_me
FROM all_tables c
WHERE c.table_name LIKE 'STAF\_%\_TS' escape '\';
Running this will produce output that is itself another SQL query. Copy the result text out of your results grid, paste it back into your query pane, delete the final UNION ALL, and run it.
Once you get the hang of writing an SQL query that generates another SQL query, you can look at turning it into a view, or at building the dynamic query in a string (see the sketch below).
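As a rough, hedged sketch of that string-building variant (variable names are made up, and it assumes the area values fit in a varchar2(10)), you could assemble the per-table queries into one UNION ALL statement and open it as a ref cursor:
declare
    v_sql   clob;
    v_rc    sys_refcursor;
    v_area  varchar2(10);
    v_count number;
begin
    -- build one big UNION ALL query over all matching tables
    for t in (select table_name
              from all_tables
              where owner = 'MOUS'
              and table_name like 'STAF\_%\_TS' escape '\') loop
        if v_sql is not null then
            v_sql := v_sql || ' union all ';
        end if;
        v_sql := v_sql || 'select area, count(SEP) areacount from mous.' || t.table_name ||
                 ' where SEP = ''MO'' group by area';
    end loop;
    -- run it and print the results
    open v_rc for v_sql;
    loop
        fetch v_rc into v_area, v_count;
        exit when v_rc%notfound;
        dbms_output.put_line(v_area || '| ' || v_count);
    end loop;
    close v_rc;
end;
/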
Gotta say, this is a horrible way to store data; you'd be better off using ONE table with an extra column containing whatever is in the xxx part of STAF_xxx_TS right now.
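If restructuring is an option, a rough sketch of that single-table design (all names hypothetical) could look like this:
-- one table keyed by the run identifier instead of ~1000 per-run tables
create table staf_ts (
    run_id varchar2(10),   -- whatever is in the xxx part of STAF_xxx_TS today
    area   varchar2(10),
    sep    varchar2(2)
    -- ... plus the remaining columns of the STAF_xxx_TS tables
);

-- the counts then become a single static query
select run_id, area, count(sep) areacount
from staf_ts
where sep = 'MO'
group by run_id, area;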
In Oracle 12c, you can embed a FUNCTION that will query the number of rows in any given table. Then you can use that function in your main query. Here is an example:
WITH FUNCTION cnt ( p_owner VARCHAR2, p_table_name VARCHAR2 ) RETURN NUMBER IS
    l_cnt NUMBER;
BEGIN
    EXECUTE IMMEDIATE 'SELECT count(*) FROM ' || p_owner || '.' || p_table_name INTO l_cnt;
    RETURN l_cnt;
EXCEPTION WHEN OTHERS THEN
    RETURN NULL; -- This will happen for entries in ALL_TABLES that are not directly accessible (e.g., IOT overflow tables)
END cnt;
SELECT t.owner, t.table_name, cnt(t.owner, t.table_name)
FROM all_tables t
WHERE t.table_name LIKE 'STAF\_%\_TS' escape '\'
/

Select from table name defined in a loop

I have 2 database connections: a and b where they both share the same schema but not necessarily the same data. I need to check both database's total number of rows for each table using a loop and compare their counts.
Using a nested table collection, my attempt so far:
DECLARE
TYPE nt_rows IS TABLE OF NUMBER;
nt_rows1 nt_num := nt_num();
nt_rows2 nt_num := nt_num();
counter INT := 0;
CURSOR c_tables1 IS
SELECT TABLE_NAME
FROM ALL_TABLES
WHERE OWNER IN ('a')
ORDER BY TABLE_NAME ASC;
CURSOR c_tables2 IS
SELECT TABLE_NAME
FROM ALL_TABLES
WHERE OWNER IN ('b')
ORDER BY TABLE_NAME ASC;
BEGIN block: loop through each cursor and add them to each array
BEGIN
FOR i IN c_tables1 LOOP
nt_rows1.extend;
SELECT COUNT(*) FROM a.i.TABLE_NAME INTO nt_rows1(counter);
END LOOP;
The last line does not work, where 'a' is the database connection name and i is the index for each table_name.
I also tried:
nt_rows1(counter) := SELECT COUNT(*) FROM a.i.TABLE_NAME;
and it also doesn't work, any suggestions are appreciated.
If you want to query data from a table whose name is in a string (for example a local variable or a cursor field) and hence not known at compile time, you will need to use dynamic SQL.
Try replacing the line
SELECT COUNT(*) FROM a.i.TABLE_NAME INTO nt_rows1(counter);
with
EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM a.' || i.TABLE_NAME INTO nt_rows1(counter);
Note also that your code is not incrementing the counter variable. Since you are starting with counter set to 0, and 0 isn't a valid index into a nested table, I would recommend adding a line to increment counter before the line above.
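Putting both fixes together, a hedged sketch of the corrected loop body (assuming the cursor and nested-table declarations from the question) might look like:
counter := 0;
FOR i IN c_tables1 LOOP
    counter := counter + 1;          -- increment before using it as an index
    nt_rows1.extend;
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM a.' || i.TABLE_NAME
        INTO nt_rows1(counter);
END LOOP;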

Need to build a query with column names stored in another table

I have a table as shown below. This table will be generated dynamically and I have no prior idea about what value it is going to hold.
------------------------------------------
TABLE_NAME COLUMN_NAME CHAR_LENGTH
------------------------------------------
EMPLOYEE COL1 100
EMPLOYEE COL2 200
EMPLOYEE COL3 300
EMPLOYEE COL4 400
Based on this table, I want to build a query that would give me those columns that contain data whose character length is greater than the CHAR_LENGTH column value.
For example if COL2 contains data having char length 500 (>200), then query would give me COL2.
I don't have any draft code to show my attempt, as I have no idea how I would do this.
I don't think this is possible in pure SQL due to the dynamic nature of your requirement. You'll need some form of PL/SQL.
Assuming you're ok with simply outputting the desired results, here is a PL/SQL block that will get the job done:
declare
    wExists number(1);
begin
    for rec in (select * from your_dynamic_table)
    loop
        execute immediate 'select count(*)
                           from dual
                           where exists (select null
                                         from ' || rec.table_name || ' t
                                         where length(t.' || rec.column_name || ') > ' || rec.char_length || ')'
            into wExists;
        if wExists = 1 then
            dbms_output.put_line(rec.column_name);
        end if;
    end loop;
end;
/
You'll also notice the use of the exists clause to optimize the query, so as not to iterate over the whole table unnecessarily, when possible.
Alternatively, if you want the results to be queryable, you can consider converting the code to a pipelined function.
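As a hedged sketch of that pipelined-function variant (the type and function names here are made up, and your_dynamic_table is the table from the block above):
-- collection type for the column names we pipe out
CREATE TYPE t_col_list AS TABLE OF VARCHAR2(128);
/

CREATE OR REPLACE FUNCTION oversized_columns RETURN t_col_list PIPELINED IS
    l_exists NUMBER;
BEGIN
    for rec in (select table_name, column_name, char_length from your_dynamic_table) loop
        execute immediate 'select count(*) from dual
                           where exists (select null from ' || rec.table_name || ' t
                                         where length(t.' || rec.column_name || ') > ' || rec.char_length || ')'
            into l_exists;
        if l_exists = 1 then
            PIPE ROW (rec.column_name);
        end if;
    end loop;
    RETURN;
END;
/

-- then the results are queryable:
-- SELECT column_value AS column_name FROM TABLE(oversized_columns);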
select column_name
from (
select statement that builds the table output
) A
where char_length<length(column_name)
will that help?
You would need a procedure to achieve this:
Here I am using the all_tab_columns view in Oracle, which is a built-in dictionary view with a structure much like your reference example (try select * from all_tab_columns). The one difference is that you will never find a varchar2 record whose value exceeds its data length (an obvious database-level constraint). Date fields may exceed the data length, and do show up in this procedure's output. I am searching all columns in EMPLOYEES whose size exceeds what is specified.
DECLARE
    cursor c is select column_name, data_length, table_name
                from all_tab_columns
                where table_name = :Table_name;
    v_index_name  all_tab_columns.column_name%type;
    v_data_length all_tab_columns.data_length%type;
    v_number      PLS_INTEGER;
    v_table_name  all_tab_columns.table_name%type;
BEGIN
    open c;
    LOOP
        FETCH c into v_index_name, v_data_length, v_table_name;
        EXIT when c%NOTFOUND;
        v_number := 0;
        execute immediate 'select count(*) from ' || :Table_name ||
                          ' where length(' || v_index_name || ') > ' || v_data_length
            into v_number;
        if v_number > 0 then
            dbms_output.put_line(v_index_name || ' has values greater than specified ' || v_index_name || ' ' || v_data_length);
        end if;
    END LOOP;
    close c;
END;
/
Replace all_tab_columns and its respective columns with the names from your own table.
DEFECTS: The table name is hardcoded. I am trying to make the code generic with execute immediate or some other trick; will achieve soon.
EDIT: Defect fixed.

Collecting the last updates of multiple tables into a single table

I have a problem in that I want my output to be a single table (let's call it Output) with 2 columns: one for the "TableName" and one for the DateTime of the last update (using the scn_to_timestamp(max(ora_rowscn)) expression).
I have 100 tables and I want to pull in the last update date/times for all these tables into the Output table.
So I can do this:
insert into Output(TableName)
select table_name
from all_tables;
which will put all the tables I have in my database into the TableName column. But I don't know how to loop through each entry, use the table name as a variable, and pass it into scn_to_timestamp(max(ora_rowscn)).
I thought I would try something like below:
for counter in Output(TableName) LOOP
insert into Output(UpdateDate)
select scn_to_timestamp(max(ora_rowscn))
from counter;
END LOOP;
Any suggestions?
Thank you
This query is a little bit clumsy as it uses xmlgen to execute dynamic sql in a query, but it might work for you.
select x.*
from all_tables t,
     xmltable('/ROWSET/ROW'
              passing dbms_xmlgen.getxmltype('select ''' || t.table_name ||
                                             ''' tab_name, max(ora_rowscn) as la from ' ||
                                             t.table_name)
              COLUMNS tab_name varchar2(30) PATH 'TAB_NAME',
                      max_scn  number       PATH 'LA') x
Here is a sqlfiddle demo
You can also use PL/SQL with execute immediate.
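A hedged sketch of that PL/SQL route, assuming the Output table from the question has TableName and UpdateDate columns and that TableName has already been populated from all_tables:
begin
    for t in (select TableName from Output) loop
        declare
            l_updated timestamp;
        begin
            execute immediate
                'select scn_to_timestamp(max(ora_rowscn)) from ' || t.TableName
                into l_updated;
            update Output set UpdateDate = l_updated where TableName = t.TableName;
        exception
            when others then
                null;  -- e.g. empty tables, or SCNs too old for scn_to_timestamp; leave the date null
        end;
    end loop;
    commit;
end;
/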