I have a list of tables, and I want to check which ones do not currently exist in the target database, but I can't figure out a query that returns only the tables from the list that do not exist. I am running DB2 9.7.
If the list of tables you want to check against is in a form that you can query, it would be something like this. The query below returns all tables NOT in the select query (which you'll have to provide):
select * from sysibm.systables
where owner = 'SCHEMA'
and type = 'T'
and name not in ( /* select query to return your list of tables */ );
Update (after comment):
If the tables are listed in a flat file (.txt, .csv) and the number is manageable, you should be able to list them out in comma-separated form like this (at least you can with other SQL dialects I'm more familiar with):
select * from sysibm.systables
where owner = 'SCHEMA'
and type = 'T'
and name not in ( 'table1', 'table2', 'table3', 'table4', 'tableA', 'tableB' );
Otherwise you might have to build a quick temp table to import all the table names into, and still go with the first example.
Update (after second comment):
And finally, after your most recent comment, I realize I was misunderstanding your question and had it backwards. To flip it and find the tables from the list that aren't in the schema, you'd do something like this (after importing the list into a temporary table):
select mytablename from templistoftables
where mytablename not in
(select name from sysibm.systables
where owner = 'SCHEMA'
and type = 'T');
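As a minimal sketch of that temporary-table step on DB2 9.7 (templistoftables and mytablename are just the placeholder names used above, and the inserted values are hypothetical):
-- Requires a user temporary tablespace to exist in the database.
DECLARE GLOBAL TEMPORARY TABLE templistoftables (mytablename VARCHAR(128))
    ON COMMIT PRESERVE ROWS NOT LOGGED;
-- Catalog table names are stored in uppercase, so list them that way.
INSERT INTO templistoftables (mytablename)
VALUES ('TABLE1'), ('TABLE2'), ('TABLE3');
Note that declared temporary tables live in the SESSION schema, so depending on your CURRENT SCHEMA you may need to reference it as SESSION.templistoftables in the query above.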
My question is: is it possible to list all the columns from the whole database, not just from specific tables, based on 3 different criteria that are in an "OR" relationship? For example, I have a database called "Bank" and 3 criteria "Criteria1; Criteria2; Criteria3"; if any of them is true (so the relation between them is OR, not AND), I want back all the columns matching the criteria, and the output should provide "account_id" or "customer_id" from the same table.
How do I proceed in this case?
It is possible, but you probably don't want to do it. Anyway, you could write a stored procedure that finds all tables that contain the columns you want:
select distinct table_name from user_tab_cols utc
where exists (select * from user_tab_cols where table_name = utc.table_name
and column_name = 'ACCOUNT_ID')
and exists (select * from user_tab_cols where table_name = utc.table_name
and column_name = 'CUSTOMER_ID');
Given the tables you could run a query where you append table name and your criteria:
execute immediate 'select account_id, customer_id from agreement where '
|| your_criteria_here;
A bit messy and inefficient, and treat this as pseudo-code. However, if you really want to do this for an ad-hoc query, it should point you in the right direction!
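For instance, a rough (untested) PL/SQL sketch that ties the two steps together, printing the dynamic statements rather than executing them; the criteria string and the ACCOUNT_ID/CUSTOMER_ID column names are assumptions carried over from above:
declare
  l_criteria varchar2(200) := 'balance > 1000 or status = ''OPEN''';  -- hypothetical criteria
begin
  for t in (select distinct utc.table_name
              from user_tab_cols utc
             where exists (select * from user_tab_cols
                            where table_name = utc.table_name
                              and column_name = 'ACCOUNT_ID')
               and exists (select * from user_tab_cols
                            where table_name = utc.table_name
                              and column_name = 'CUSTOMER_ID')) loop
    -- print the statement; swap in EXECUTE IMMEDIATE ... BULK COLLECT INTO to actually run it
    dbms_output.put_line('select account_id, customer_id from ' || t.table_name
                         || ' where ' || l_criteria);
  end loop;
end;
/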
I have an Oracle database with many tables that have identical structure (columns are all the same). The table names are similar also. The names of the tables are like table_1, table_2, table_3...
I know this isn't the most efficient design, but I don't have the option of changing this at this time.
In this case, is it possible to write a single SQL query that extracts all rows matching the same condition across multiple tables (hundreds of tables) without explicitly using each exact table name?
I realize I could use something like
select * from table_1 UNION select * from table_2 UNION select * from table_3...select * from table_1000
But is there a more elegant SQL statement that extracts from all matching table names into one result without having to name each table explicitly?
Something like
select * from table_%
Is something like that possible? If not, what is the most efficient way to write this query?
You can use dbms_xmlgen to query tables using a pattern, which generates an XML document as a CLOB:
select dbms_xmlgen.getxml('select * from ' || table_name
|| ' where some_col like ''%Test%''') as xml_clob
from user_tables
where table_name like 'TABLE_%';
You said you wanted a condition, so I've included a dummy one, where some_col like '%Test%'.
You can then use XMLTable to extract the values back as relational data, converting the CLOB to XMLType on the way:
select x.*
from (
    select xmltype(dbms_xmlgen.getxml('select * from ' || table_name
               || ' where some_col like ''%Test%''')) as xml
    from user_tables
    where table_name like 'TABLE_%'
) t
cross join xmltable('/ROWSET/ROW'
    passing t.xml
    columns id       number       path 'ID',
            some_col varchar2(10) path 'SOME_COL'
) x;
SQL Fiddle demo which retrieves one matching row from each of two similar tables. Of course, this assumes your table names follow a useful pattern like table_%, but you suggest they do.
This is the only way I know to do something like this without resorting to PL/SQL (and having searched back a bit, was probably inspired by this answer to count multiple tables). Whether it's efficient (enough) is something you'd need to test with your data.
This is kind of messy and best performed in a middle-tier, but I suppose you could basically loop over the tables and use EXECUTE IMMEDIATE to do it.
Something like:
begin
  for t in (select table_name from all_tables where table_name like 'TABLE_%') loop
    -- the dynamic select needs an INTO / BULK COLLECT clause to actually capture rows
    execute immediate 'select blah from ' || t.table_name;
  end loop;
end;
You can write "select * from table_1 and table_2 and tabl_3;"
I have two schemas, say 'DB_Internals' and 'Network'.
Both schemas contain a table called cable. I just want to extract the column names alone from the table 'cable' in both schemas and check, using SQL, whether there is any difference in the column names.
How can I accomplish this?
In the DB_Internals schema, rename the table to cable_1.
Grant select on 'cable_1' to the 'Network' schema:
grant select on cable_1 to network;
Now, log in to the Network schema:
select column_name from dba_tab_columns where table_name = 'CABLE'
minus
select COLUMN_NAME from dba_tab_columns where table_name = 'CABLE_1';
The data needed for your query is in the information schema:
http://www.postgresql.org/docs/current/static/infoschema-columns.html
You then need to compare the two column lists using that data. There are plenty of options for doing so, e.g.:
full join one list to the other and keep rows where either side is null;
(list1 except list2) union all (list2 except list1);
list the columns of both, group by column_name having count(*) <> 2;
etc.
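As a rough sketch of the EXCEPT / UNION ALL variant against information_schema.columns, assuming the two schemas are named db_internals and network and the table is cable (adjust the names to your case):
(select column_name from information_schema.columns
  where table_schema = 'db_internals' and table_name = 'cable'
 except
 select column_name from information_schema.columns
  where table_schema = 'network' and table_name = 'cable')
union all
(select column_name from information_schema.columns
  where table_schema = 'network' and table_name = 'cable'
 except
 select column_name from information_schema.columns
  where table_schema = 'db_internals' and table_name = 'cable');
Any rows returned are column names that exist in one schema's cable table but not the other's.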
Is there some SQL that will either return a list of table names or (to cut to the chase) that would return a boolean as to whether a tablename with a certain pattern exists?
Specifically, I need to know if there is a table in the database named INV[Bla] such as INVclay, INVcherri, INVkelvin, INVmorgan, INVgrandFunk, INVgobbledygook, INV2468WhoDoWeAppreciate, etc. (the INV part is what I'm looking for; the remainder of the table name could be almost anything).
In other words, can "wildcards" be used in a SQL statement, such as:
SELECT * tables
FROM database
WHERE tableName = 'INV*'
or how would this be accomplished?
This should get you there:
SELECT *
FROM INFORMATION_SCHEMA.TABLES
where table_name LIKE 'INV%'
To check for exists:
--
-- note that the sql compiler knows that it just needs to check for existence, so this is a case where "select *" is just fine
if exists
(select *
from [sys].[tables]
where upper([name]) like N'INV%')
select N'do something appropriate because there is a table based on this pattern';
You can try the following:
SELECT name FROM sys.tables where name LIKE 'INV%';
How do I find all indexes available on a table in DB2?
db2 "select * from syscat.indexes where tabname = 'your table name here' \
and tabschema = 'your schema name here'"
You can also execute:
DESCRIBE INDEXES FOR TABLE SCHEMA.TABLE SHOW DETAIL
You can get the details of indexes with the below command.
describe indexes for table schemaname.tablename show detail
To see all indexes:
select * from user_objects
where object_type='INDEX'
To see an index and its columns on a table:
select * from USER_IND_COLUMNS where TABLE_NAME='my_table'
This depends upon which version of DB2 you are using.
We have v7r1m0 and the following query works quite well.
WITH IndexCTE (Schema, Table, Unique, Name, Type, Columns) AS
(SELECT i.table_schema, i.Table_Name, i.Is_Unique,
s.Index_Name, s.Index_Type, s.column_names
FROM qsys2.SysIndexes i
INNER JOIN qsys2.SysTableIndexStat s
ON i.table_schema = s.table_schema
and i.table_name = s.table_name
and i.index_name = s.index_name)
SELECT *
FROM IndexCTE
WHERE schema = 'LIBDEK'
AND table = 'ECOMROUT'
If you're not familiar with CTEs, they are worth getting to know. Our AS400 naming conventions are awful, so I've been using CTEs to normalize field names; I ended up making a library of CTEs and have it automatically appended to the top of all my queries.
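For example, a sketch of that kind of normalizing CTE (MYLIB.CUSTMAST and the CM* field names are made up for illustration):
WITH Customers (CustomerId, CustomerName) AS (
    SELECT CMCUSN, CMNAME
    FROM MYLIB.CUSTMAST
)
SELECT CustomerId, CustomerName
FROM Customers
WHERE CustomerName LIKE 'A%';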
For checking the indexes of a table on IBM Db2 on Cloud (previously DashDb) the following query should do it:
SELECT * FROM SYSCAT.INDEXES WHERE TABNAME = 'my_tablename' AND TABSCHEMA = 'my_table_schema'
You can also check by index name:
SELECT COUNT(*) FROM SYSCAT.INDEXES WHERE TABNAME = 'my_tablename' AND TABSCHEMA = 'my_table_schema' AND INDNAME='index_name'
The same result can be achieved by using SYSIBM.SYSINDEXES. However, this table is not referenced directly on the product documentation page.
SELECT COUNT(*) FROM SYSIBM.SYSINDEXES WHERE TBNAME = 'my_tablename' AND TBCREATOR = 'my_table_schema' AND NAME='my_index_name'
See SYSCAT.INDEXES catalog view.
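If you only want the interesting details rather than every catalog column, a narrower projection along these lines should work (UNIQUERULE and COLNAMES are standard SYSCAT.INDEXES columns; the schema and table names are placeholders as above):
SELECT INDSCHEMA, INDNAME, UNIQUERULE, COLNAMES
FROM SYSCAT.INDEXES
WHERE TABNAME = 'my_tablename' AND TABSCHEMA = 'my_table_schema';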
One more way is to generate the DDL of the table.
It will give you the complete description of the table, including the indexes on it.
Just right-click the table and click Generate DDL/Scripts.
This works in most database tools.
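If you prefer the command line on DB2, db2look can generate the same DDL; the database, schema, and table names below are placeholders:
# -e extracts the DDL statements (table, indexes, etc.) for the listed table
db2look -d MYDB -e -z MYSCHEMA -t MYTABLE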