My current task is to deep-clone (tables, indexes, sequences) a set of tables (not all of them in the same schema) from a remote (Prod) Oracle DB to my local (Dev) XE DB, including the data, and, if possible, to have it all in one script or file I can execute (if need be, I can hope they accept a compiled program).
I knew of create table <name> as select * from <name>@<link> and the equivalent insert, which is probably the best way to copy the data, but it does not copy the full definition.
I've searched around and stumbled across dbms_metadata.get_ddl(), which helped a bit, but I haven't figured out how to make it connect to the remote database, nor how to get the tables from the other schemas.
I have a total of 3 schemas: the main one for my application (we'll call it "MY"), the company's base data (we'll call it "COMP"), and a related application of a colleague's (we'll call it "COLL").
I've tried the script from here, which looked promising, but as I said, I haven't figured out how to connect to the remote database or handle the different schemas (the company one is the hardest, as I don't know if I can log into it, and I only have SELECT permission):
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, owner)
FROM all_tables WHERE owner = UPPER('&1');
When I tried GET_DDL with a different owner, SQL Developer gave me the error
"Object %s of type TABLE in schema %s not found", where the first %s is
the first table in the (second %s) schema (COMP).
DBMS_METADATA is a good option, but to use it over a database link you have to write a bit more code.
First, the remote user (the one the db link connects as) must have the SELECT_CATALOG_ROLE role.
The example below is not a complete solution; it returns only one object per request. For more details, see the DBMS_METADATA manual.
DECLARE
   h1          NUMBER;
   table_name  VARCHAR2(100) := 'table_name';   -- object to fetch
   from_schema VARCHAR2(100) := 'schema_name';  -- remote schema
   db_link     VARCHAR2(100) := 'db_link_name'; -- name of the database link
   xml         CLOB;
BEGIN
   -- open a metadata context on the remote database via the db link
   h1 := DBMS_METADATA.OPEN('TABLE', network_link => db_link);
   DBMS_METADATA.SET_FILTER(h1, 'NAME', table_name);
   DBMS_METADATA.SET_FILTER(h1, 'SCHEMA', from_schema);
   xml := DBMS_METADATA.FETCH_CLOB(h1);
   DBMS_OUTPUT.PUT_LINE(h1);  -- debug: show the context handle
   IF xml IS NOT NULL THEN
      DBMS_OUTPUT.PUT_LINE(xml);
   END IF;
   DBMS_METADATA.CLOSE(h1);
END;
/
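Building on the example above, the same context can be asked for executable DDL instead of XML, and a fetch loop (instead of a single fetch) will walk every table in the remote schema. A sketch, assuming a db link named db_link_name and the COMP schema (both placeholders):

```sql
DECLARE
   h   NUMBER;
   ddl CLOB;
BEGIN
   -- open a context on the remote database via the db link (placeholder name)
   h := DBMS_METADATA.OPEN('TABLE', network_link => 'db_link_name');
   DBMS_METADATA.SET_FILTER(h, 'SCHEMA', 'COMP');
   -- request creation DDL instead of the default XML representation,
   -- with a terminating semicolon on each statement
   DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.ADD_TRANSFORM(h, 'DDL'),
                                     'SQLTERMINATOR', TRUE);
   LOOP
      ddl := DBMS_METADATA.FETCH_CLOB(h);
      EXIT WHEN ddl IS NULL;  -- NULL signals that no more objects match
      DBMS_OUTPUT.PUT_LINE(ddl);
   END LOOP;
   DBMS_METADATA.CLOSE(h);
END;
/
```

Spooling that output per schema would give you the CREATE scripts, which the insert-over-link approach can then populate with data.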
This question has been asked here before so please forgive me for asking again; the answers did not resolve my issue.
I'm working on a report interface that runs stored procedures from an Oracle database through my .NET application. We have two Oracle database instances: Dev and Stage. I can connect to both from SQL Developer and run the stored procedures successfully.
The problem occurs when I run a report from the web UI that calls the DEV database. It breaks and returns Oracle errors telling me there are no records; but when I run the same report using the same stored procedure with the connection pointing to STAGE, it returns data with no issues.
BEGIN
-- <logic>Get current role</logic>
SELECT GRANTED_ROLE
INTO L_GRANTED_ROLE
FROM USER_ROLE_PRIVS
WHERE GRANTED_ROLE LIKE 'XYZ_%';
-- <logic>Retrieve the employee id</logic>
L_EMPLOYEE_ID := XYZ.UTILS.GET_EMPLOYEE_ID;
-- <logic>Load course profile</logic>
XYZ.UTILS.LOAD_EMP_TEACHING_PROFILE (P_COURSE_ID, NULL);
So it is very simple: in the stage DB the following query returns data, while in dev it returns no rows:
SELECT GRANTED_ROLE
--INTO L_GRANTED_ROLE
FROM USER_ROLE_PRIVS
WHERE GRANTED_ROLE LIKE 'XYZ_%';
Execute the query above and you will find the issue yourself.
You need to change the logic of your procedure to handle such cases.
Cheers!!
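Concretely, a SELECT ... INTO that finds no row raises NO_DATA_FOUND (and more than one matching row raises TOO_MANY_ROWS), so one way to make the procedure tolerate a missing grant is an explicit handler around that query; this is a sketch using the variable name from the snippet above, and what to do with a NULL role is left to the procedure's logic:

```sql
BEGIN
   SELECT granted_role
     INTO l_granted_role
     FROM user_role_privs
    WHERE granted_role LIKE 'XYZ_%';
EXCEPTION
   WHEN NO_DATA_FOUND THEN
      l_granted_role := NULL;  -- no XYZ_ role granted in this database
   WHEN TOO_MANY_ROWS THEN
      RAISE;                   -- several XYZ_ roles: decide which one applies
END;
```

Whether a NULL role should skip the report or fall back to a default is a business decision, not something the handler can guess.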
We are migrating from Oracle 11g to 12c. We have a number of DB links created in the 11g DB, but we're not sure whether they are really in use.
Since it's a production environment, we cannot simply disable the DB links and wait for jobs to fail.
Is there any way to find out whether particular DB links are in use or not?
One crude way I was thinking of is writing a script that polls gv$sql every 5 minutes, searches the SQL text for DB link names, and logs any matches. The script would run for a few days.
Is there any other way to find out?
This will search all_source for every link that is found in all_db_links.
begin
   for c_links in (select '@' || db_link as db_link from all_db_links)
   loop
      dbms_output.put_line('searching for link: ' || c_links.db_link);
      for c_source in (select * from all_source s
                       where upper(s.text) like '%' || c_links.db_link || '%')
      loop
         dbms_output.put_line('link ' || c_links.db_link || ' is used in: ' || c_source.name);
      end loop;
   end loop;
end;
/
This could help you identify which links are referenced in procedures, functions and packages.
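Note that all_source only catches static references in stored PL/SQL; SQL submitted by clients or built dynamically never appears there. The polling idea from the question can be implemented with a query along these lines (it assumes SELECT privilege on gv$sql, and the link name is an example):

```sql
-- statements currently in the shared SQL area that mention a db link
SELECT s.inst_id, s.sql_id, s.parsing_schema_name, s.sql_text
  FROM gv$sql s
 WHERE UPPER(s.sql_text) LIKE '%@MY_DB_LINK%';
```

Since the shared pool ages statements out, this has to be sampled repeatedly over a representative period, as the question already suggests.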
I need to make a copy of a schema in an Oracle Database with a slightly different name.
I can do this pretty easily with MSS with something like:
BACKUP DATABASE {DATABASE_NAME} TO DISK='{DIRECTORY}\{BACKUP_NAME}'
RESTORE FILELISTONLY FROM DISK = '{DIRECTORY}\{BACKUP_NAME}'
RESTORE DATABASE {NEW_DATABASE} FROM DISK = '{DIRECTORY}\{BACKUP_NAME}' WITH MOVE '{mdf}' TO '{DIRECTORY}\{mdf}.mdf', MOVE '{ldf}' TO '{DIRECTORY}\{ldf}.ldf'
Is there any equivalent for Oracle DB?
For reference, I'm connecting to the database with full privileges using JDBC.
MSS uses "database" to refer to several different concepts in Oracle. I think maybe you mean that you want to export one schema and re-import it into the same Oracle database with a different schema name. I would normally use datapump from the command line for this (expdp/impdp). However, there is a datapump API that you can use to do this from a SQL shell.
-- export
declare
l_dp_handle NUMBER;
BEGIN
l_dp_handle := DBMS_DATAPUMP.open('EXPORT','SCHEMA',null,'MY_EXPORT','LATEST');
DBMS_DATAPUMP.add_file(l_dp_handle,'my_export.dmp','DATA_PUMP_DIR');
DBMS_DATAPUMP.add_file(l_dp_handle,'my_export.log','DATA_PUMP_DIR',null,DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP.metadata_filter(l_dp_handle,'SCHEMA_EXPR','= ''OLD_SCHEMA_NAME''');
DBMS_DATAPUMP.start_job(l_dp_handle);
DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
-- check status with:
select * from dba_datapump_jobs;
-- import
declare
l_dp_handle NUMBER;
BEGIN
l_dp_handle := DBMS_DATAPUMP.open('IMPORT','SCHEMA',null,'MY_IMPORT','LATEST');
DBMS_DATAPUMP.add_file(l_dp_handle,'my_export.dmp','DATA_PUMP_DIR');
DBMS_DATAPUMP.add_file(l_dp_handle,'my_export.imp.log','DATA_PUMP_DIR',null,DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP.metadata_filter(l_dp_handle,'SCHEMA_EXPR','= ''OLD_SCHEMA_NAME''');
DBMS_DATAPUMP.metadata_remap(l_dp_handle,'REMAP_SCHEMA','OLD_SCHEMA_NAME','NEW_SCHEMA_NAME');
DBMS_DATAPUMP.start_job(l_dp_handle);
DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
Note that you'll need DBA privileges if you want to import into a schema other than your own. Your Oracle user will also need read/write privileges on the directory (DATA_PUMP_DIR in this example), execute privileges on DBMS_DATAPUMP, etc.
Hello everyone,
I want to create a web application using JSP, servlets, JDBC, etc.
It simply creates a table named student, containing various fields of information about students, in the database, and different pages perform different query tasks.
The problem is that I want to build a WAR file to distribute this app to my client.
When I put the SQL query that creates the table in my JSP page, it tries to create a new table every time I run the project.
I want it to create the table only the first time the user runs it, so that when I distribute the WAR file to my client, the first run creates the student table and performs the required queries, and future runs only perform the other queries without trying to create the table again.
I need your valuable guidance for solving this problem. Any help will be appreciated.
Thanks!
In Oracle, the workaround is to wrap the statement in an anonymous BEGIN-END block whose EXCEPTION handler simply swallows the "table already exists" error:
BEGIN
   EXECUTE IMMEDIATE 'CREATE TABLE ...';  -- your CREATE TABLE statement here
EXCEPTION
   WHEN OTHERS THEN
      IF SQLCODE = -955 THEN
         NULL;  -- ignore ORA-00955: name is already used by an existing object
      ELSE
         RAISE;
      END IF;
END;
/
This is one way, and preferable. Another way is to manually check whether the table exists in the data dictionary view dba_tables (note that dictionary names are stored in uppercase, and dba_tables requires elevated privileges; user_tables covers your own schema).
SELECT COUNT(*)
  INTO variable_count
  FROM dba_tables
 WHERE table_name = 'TABLE_NAME';
So,
IF variable_count = 0 THEN
   EXECUTE IMMEDIATE 'CREATE TABLE ...';
END IF;
Oracle 11gR2 (x86 Windows):
I have a db with 250 tables with indexes and constraints. I need to re-create these tables, indexes and constraints in a new db and load the data. I need to know how to do the following in SQL Plus and/or SQL Developer, unless there's a magical utility that can automate all of this. Thanks in advance!
Unload (export) all the data from the 250 tables.
Create a SQL script file containing the CREATE TABLE statements for the 250 tables.
Create a SQL script file containing the CREATE INDEX statements for the 250 tables.
Create a SQL script file containing the ALTER TABLE ADD CONSTRAINT statements for the 250 tables.
Run the script to create the tables in a new db.
Load the exported data into the tables in the new db.
Run the script to create all the indexes.
Run the script to add all the constraints.
EDIT: I'm connected to the remote desktop which links to the source db on a Windows Server 2008. The remote only has an Oracle client installed. For security reasons, I'm not allowed to link directly from my local computer to the Win Server, so can I dump the whole source db to the remote then zip it to my local target machine? I'm trying to replicate the entire db on my computer.
Starting from Oracle 10g, you can use the Data Pump command-line clients expdp and impdp to export/import data and/or schema from one DB to another. As a matter of fact, those two command-line utilities are only wrappers that "use the procedures provided in the DBMS_DATAPUMP PL/SQL package to execute export and import commands, using the parameters entered at the command line." (quoted from Oracle's documentation)
Given your needs, you will have to create a directory, then generate a full dump of your database using expdp:
SQL> CREATE OR REPLACE DIRECTORY dump_dir AS '/path/to/dump/folder/';
sh$ expdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
As the dump is written in a binary format, you will have to use the corresponding import utility to (re)import your DB, basically replacing expdp with impdp in the above command:
sh$ impdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
For a simple table dump, use this version instead:
sh$ expdp sylvain@db10g tables=DEPT,EMP directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
As you noticed, you can use it with your standard user account, provided you have access to the given directory (GRANT READ, WRITE ON DIRECTORY dump_dir TO sylvain;).
For detailed usage explanations, see
http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php.
If you can create a database link from your local database to the one that currently contains the data, you can use the DBMS_DATAPUMP package to copy the entire schema. This is an interface to Datapump (as #Sylvain Leroux mentioned) that is callable from within the database.
DECLARE
dph NUMBER;
source_schema VARCHAR2 (30) := 'SCHEMA_TO_EXPORT';
target_schema VARCHAR2 (30) := 'SCHEMA_TO_IMPORT';
job_name VARCHAR2 (30) := UPPER ('IMPORT_' || target_schema);
p_parallel NUMBER := 3;
v_start TIMESTAMP := SYSTIMESTAMP;
v_state VARCHAR2 (30);
BEGIN
dph :=
DBMS_DATAPUMP.open ('IMPORT',
'SCHEMA',
'DB_LINK_NAME',
job_name);
DBMS_OUTPUT.put_line ('dph = ' || dph);
DBMS_DATAPUMP.metadata_filter (dph,
'SCHEMA_LIST',
'''' || source_schema || '''');
DBMS_DATAPUMP.metadata_remap (dph,
'REMAP_SCHEMA',
source_schema,
target_schema);
DBMS_DATAPUMP.set_parameter (dph, 'TABLE_EXISTS_ACTION', 'REPLACE');
DBMS_DATAPUMP.set_parallel (dph, p_parallel);
DBMS_DATAPUMP.start_job (dph);
DBMS_DATAPUMP.wait_for_job (dph, v_state);
DBMS_OUTPUT.put_line ('Export/Import time: ' || (SYSTIMESTAMP - v_start));
DBMS_OUTPUT.put_line ('Final state: ' || v_state);
END;
/
The script above actually copies and renames the schema. If you want to keep the same schema name, I believe you'd just remove the metadata_remap call.
SQL Developer can help with #1 by creating INSERT statements with a formatted query result:
Select /*insert*/ *
from My_Table;
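Run as a script (F5) in SQL Developer, the hint above makes the result come back as one INSERT statement per row rather than a grid, roughly of this shape (the table and columns here are illustrative):

```sql
INSERT INTO MY_TABLE (ID, NAME) VALUES (1, 'first row');
INSERT INTO MY_TABLE (ID, NAME) VALUES (2, 'second row');
```

Spooling that output to a file gives a ready-made data-load script.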