Oracle 11gR2 (x86 Windows):
I have a db with 250 tables with indexes and constraints. I need to re-create these tables, indexes and constraints in a new db and load the data. I need to know how to do the following in SQL*Plus and/or SQL Developer, unless there's a magical utility that can automate all of this. Thanks in advance!
Unload (export) all the data from the 250 tables.
Create a SQL script file containing the CREATE TABLE statements for the 250 tables.
Create a SQL script file containing the CREATE INDEX statements for the 250 tables.
Create a SQL script file containing the ALTER TABLE ADD CONSTRAINT statements for the 250 tables.
Run the script to create the tables in a new db.
Load the exported data into the tables in the new db.
Run the script to create all the indexes.
Run the script to add all the constraints.
EDIT: I'm connected via remote desktop to a Windows Server 2008 machine that links to the source db. The remote machine only has an Oracle client installed. For security reasons, I'm not allowed to connect directly from my local computer to the server, so can I dump the whole source db on the remote machine and then zip it over to my local target machine? I'm trying to replicate the entire db on my computer.
Starting from Oracle 10g, you can use the Data Pump command-line clients expdp and impdp to export/import data and/or schemas from one DB to another. As a matter of fact, those two command-line utilities are only wrappers that "use the procedures provided in the DBMS_DATAPUMP PL/SQL package to execute export and import commands, using the parameters entered at the command line" (quoted from Oracle's documentation).
Given your needs, you will have to create a directory, then generate a full dump of your database using expdp:
SQL> CREATE OR REPLACE DIRECTORY dump_dir AS '/path/to/dump/folder/';
sh$ expdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
As the dump is written in a binary format, you will have to use the corresponding import utility to (re)import your DB. Basically, replace expdp with impdp in the above command:
sh$ impdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
For a simple table dump, use this version instead:
sh$ expdp sylvain@db10g tables=DEPT,EMP directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
As you noticed, you can use it with your standard user account, provided you have access to the given directory (GRANT READ, WRITE ON DIRECTORY dump_dir TO sylvain;).
For detailed usage explanations, see
http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php.
If you can create a database link from your local database to the one that currently contains the data, you can use the DBMS_DATAPUMP package to copy the entire schema. This is an interface to Data Pump (as @Sylvain Leroux mentioned) that is callable from within the database.
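For reference, creating such a link could look like this (the link name, credentials and TNS alias are all placeholders):

CREATE DATABASE LINK db_link_name
  CONNECT TO remote_user IDENTIFIED BY remote_password
  USING 'remote_tns_alias';

With the link in place, the copy itself: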
DECLARE
   dph             NUMBER;
   source_schema   VARCHAR2 (30) := 'SCHEMA_TO_EXPORT';
   target_schema   VARCHAR2 (30) := 'SCHEMA_TO_IMPORT';
   job_name        VARCHAR2 (30) := UPPER ('IMPORT_' || target_schema);
   p_parallel      NUMBER := 3;
   v_start         TIMESTAMP := SYSTIMESTAMP;
   v_state         VARCHAR2 (30);
BEGIN
   -- Open a schema-mode import job that pulls over the database link
   dph := DBMS_DATAPUMP.open ('IMPORT', 'SCHEMA', 'DB_LINK_NAME', job_name);
   DBMS_OUTPUT.put_line ('dph = ' || dph);

   -- Restrict the job to the source schema and remap it to the target name
   DBMS_DATAPUMP.metadata_filter (dph, 'SCHEMA_LIST', '''' || source_schema || '''');
   DBMS_DATAPUMP.metadata_remap (dph, 'REMAP_SCHEMA', source_schema, target_schema);

   DBMS_DATAPUMP.set_parameter (dph, 'TABLE_EXISTS_ACTION', 'REPLACE');
   DBMS_DATAPUMP.set_parallel (dph, p_parallel);

   DBMS_DATAPUMP.start_job (dph);
   DBMS_DATAPUMP.wait_for_job (dph, v_state);

   DBMS_OUTPUT.put_line ('Export/Import time: ' || (SYSTIMESTAMP - v_start));
   DBMS_OUTPUT.put_line ('Final state: ' || v_state);
END;
/
The script above actually copies and renames the schema. If you want to keep the same schema name, I believe you'd just remove the metadata_remap call.
SQL Developer can help with #1 (unloading the data) by generating INSERT statements from a formatted query result:
Select /*insert*/ *
from My_Table;
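Note that the hint takes effect when the query is executed as a script (F5 in SQL Developer) rather than as a single statement; the INSERT statements then appear in the Script Output pane.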
Related
My current task involves deep cloning [table, index, sequence] a set of tables (not all of them are in the same schema) from a remote (Prod) Oracle DB to my local (Dev) XE DB, including data, and - if possible - having it all in one script or file I can execute (if need be, I can hope they accept a compiled program).
I knew of create table <name> as select * from <name>@<link> and the equivalent insert, which is probably the best way to copy the data, but not the definitions.
I've searched around and stumbled across dbms_metadata.get_ddl(), which helped a bit, but I haven't figured out how to connect to the remote database with it, nor how to get the tables from the other schemas.
I have a total of 3 schemas: the main one for my application (we'll call it "MY"), the company's base data (we'll call it "COMP"), and a related application of a colleague (we'll call it "COLL").
I've tried the script from here, which looked promising, but as I said, I haven't figured out how to connect to the remote database or handle the different schemas (the company one is the hardest, as I don't know if I can log into it, and I only have select permission):
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, owner)
FROM all_tables WHERE owner = UPPER('&1');
When I tried get_ddl with a different owner, SQL Developer gave me the error "Object %s of type TABLE in schema %s not found", where the first %s is the first table in the (second %s) schema (COMP).
DBMS_METADATA is a good option, but to use it over a database link you have to write a bit more code.
First, the remote user (the user the database link connects as) needs the SELECT_CATALOG_ROLE role.
The example below is not a complete solution; it fetches one object per request. For more details, see the DBMS_METADATA manual.
DECLARE
  h1          NUMBER;
  table_name  VARCHAR2(100) := 'table_name';
  from_schema VARCHAR2(100) := 'schema_name';
  db_link     VARCHAR2(100) := 'db_link_name';
  xml         CLOB;
BEGIN
  -- Open a metadata context that reads over the database link
  h1 := DBMS_METADATA.OPEN('TABLE', network_link => db_link);
  DBMS_METADATA.SET_FILTER(h1, 'NAME', table_name);
  DBMS_METADATA.SET_FILTER(h1, 'SCHEMA', from_schema);
  -- Without a transform, FETCH_CLOB returns the metadata as an XML document
  xml := DBMS_METADATA.FETCH_CLOB(h1);
  DBMS_OUTPUT.PUT_LINE(h1);  -- print the handle, for debugging
  IF xml IS NOT NULL THEN
    DBMS_OUTPUT.PUT_LINE(xml);
  END IF;
  DBMS_METADATA.CLOSE(h1);
END;
/
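If you want the actual CREATE TABLE DDL rather than the default XML document, add the 'DDL' transform right after opening the context; FETCH_CLOB then returns DDL text instead of XML. A minimal sketch (th is an extra NUMBER variable that just receives the transform handle):

th := DBMS_METADATA.ADD_TRANSFORM(h1, 'DDL');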
Good morning,
I am rather new to SQL and I've scoured the internet trying to find a solution to my issue, to no avail. I've tried creating procedures, jobs, programs, credentials and schedules through the SQL Developer interface, modifying them as instructed by every article I could find on the subject, and I can't seem to get this working.
I'd like to run the following SQL script every 30 minutes from 0600 to 1700 Monday-Friday, so that it exports a CSV file every 30 minutes.
When I execute the script in SQL Developer, it queries the database and saves the file just as intended, but no matter how many times I've tried to get it working on a schedule, I can't seem to get it right.
Thanks in advance for the help!
SPOOL C:\Users\X\Documents\SQL\Uploads\X.CSV
SET SQLFORMAT CSV
SELECT
NAME_OF_PERSON
FROM DATABASE;
SPOOL OFF
In versions lower than 12c, Oracle's DBMS_JOB and/or DBMS_SCHEDULER can schedule execution of a stored procedure. That procedure can create a file, but you'll have to use the UTL_FILE package to do it, not SPOOL.
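For pre-12c, a hedged sketch of that UTL_FILE route, as an anonymous block for brevity (the directory object, file name and query are placeholders):

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  -- 'DUMP_DIR' must be a directory object you have write access to
  f := UTL_FILE.FOPEN('DUMP_DIR', 'x.csv', 'w');
  FOR r IN (SELECT name_of_person FROM some_table) LOOP
    UTL_FILE.PUT_LINE(f, r.name_of_person);
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/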
As you're on Oracle 12c, its DBMS_SCHEDULER offers a new job type - SQL_SCRIPT - which lets you schedule a .SQL script. That means the code you posted would be stored as a file. I can't create an example on my 11gXE, but here's a link to the ORACLE-BASE site: https://oracle-base.com/articles/12c/scheduler-enhancements-12cr1 and a script copied from it which shows how to do that:
CONN test/test@pdb1
-- Create a job with a SQL*Plus script defined in-line,
-- including an explicit connect.
SET SERVEROUTPUT ON
DECLARE
l_job_name VARCHAR2(30);
l_script VARCHAR2(32767);
BEGIN
l_job_name := DBMS_SCHEDULER.generate_job_name;
DBMS_OUTPUT.put_line('JOB_NAME=' || l_job_name);
-- Notice the explicit database connection in the script.
l_script := 'CONN test/test@pdb1
SPOOL /tmp/test.lst
SELECT SYSDATE, USER FROM dual;
SPOOL OFF';
DBMS_SCHEDULER.create_job(
job_name => l_job_name,
job_type => 'SQL_SCRIPT',
job_action => l_script,
credential_name => 'oracle_ol6_121',
enabled => TRUE
);
END;
/
Alternatively, you could use your operating system's scheduling program (Task Scheduler on MS Windows) and tell it to run a .BAT script which would establish SQL*Plus connection and run a .SQL script which contains SPOOL command and that SELECT statement.
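A minimal sketch of that approach (host, port, service, credentials and paths are all placeholders; the .SQL script should end with EXIT so the session closes):

REM export.bat - triggered by Task Scheduler
sqlplus -s username/password@host:1521/service @C:\scripts\export.sql

Note that SET SQLFORMAT is a SQLcl/SQL Developer command, not a SQL*Plus one, so with plain SQL*Plus the CSV formatting would need to be done in the query itself (or use SQLcl instead, as described below).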
Forgot to say: I wouldn't involve SQL Developer in this at all.
You can also use SQLcl with Oracle 12c.
1) Create a .sql file with your spool settings, export location and SQL commands
2) Create a .bat file with the SQLcl command:
REM cd to wherever your sql.exe file is
cd C:\oracle\product\64bit_12.1.0.2client\sqldeveloper\sqldeveloper\bin
sql username/password@server:port/SID @C:/users/.../myjob.sql
3) Create a basic job in Windows Task Scheduler to trigger the .bat file (a schtasks sketch follows below)
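For reference, a hedged sketch of creating such a task from the command line (the task name and .bat path are hypothetical; /ET is only valid with MINUTE or HOURLY schedules, and the Monday-Friday restriction is easier to configure in the GUI):

schtasks /create /tn "CsvExport" /tr "C:\jobs\myjob.bat" /sc minute /mo 30 /st 06:00 /et 17:00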
I need to make a copy of a schema in an Oracle Database with a slightly different name.
I can do this pretty easily with MSS with something like:
BACKUP DATABASE {DATABASE_NAME} TO DISK='{DIRECTORY}\{BACKUP_NAME}'
RESTORE FILELISTONLY FROM DISK = '{DIRECTORY}\{BACKUP_NAME}'
RESTORE DATABASE {NEW_DATABASE} FROM DISK = '{DIRECTORY}\{BACKUP_NAME}' WITH MOVE '{mdf}' TO '{DIRECTORY}\{mdf}.mdf', MOVE '{ldf}' TO '{DIRECTORY}\{ldf}.ldf'
Is there any equivalent for Oracle DB?
For reference, I'm connecting to the database with full privileges using JDBC.
"Database" in MSS maps to several different concepts in Oracle. I think what you mean is that you want to export one schema and re-import it into the same Oracle database under a different schema name. I would normally use Data Pump from the command line for this (expdp/impdp). However, there is a Data Pump API that you can use to do this from a SQL shell.
-- export
DECLARE
  l_dp_handle NUMBER;
BEGIN
  l_dp_handle := DBMS_DATAPUMP.open('EXPORT', 'SCHEMA', NULL, 'MY_EXPORT', 'LATEST');
  DBMS_DATAPUMP.add_file(l_dp_handle, 'my_export.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.add_file(l_dp_handle, 'my_export.log', 'DATA_PUMP_DIR', NULL, DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.metadata_filter(l_dp_handle, 'SCHEMA_EXPR', '= ''OLD_SCHEMA_NAME''');
  DBMS_DATAPUMP.start_job(l_dp_handle);
  DBMS_DATAPUMP.detach(l_dp_handle);
END;
/

-- check status with:
select * from dba_datapump_jobs;

-- import
DECLARE
  l_dp_handle NUMBER;
BEGIN
  l_dp_handle := DBMS_DATAPUMP.open('IMPORT', 'SCHEMA', NULL, 'MY_IMPORT', 'LATEST');
  DBMS_DATAPUMP.add_file(l_dp_handle, 'my_export.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.add_file(l_dp_handle, 'my_export.imp.log', 'DATA_PUMP_DIR', NULL, DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.metadata_filter(l_dp_handle, 'SCHEMA_EXPR', '= ''OLD_SCHEMA_NAME''');
  DBMS_DATAPUMP.metadata_remap(l_dp_handle, 'REMAP_SCHEMA', 'OLD_SCHEMA_NAME', 'NEW_SCHEMA_NAME');
  DBMS_DATAPUMP.start_job(l_dp_handle);
  DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
Note that you'll need DBA privileges if you want to import into a schema other than your own. Your Oracle user will also need read/write privileges on the directory (DATA_PUMP_DIR in this example), execute privileges on DBMS_DATAPUMP, etc.
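For reference, a minimal sketch of those grants, run as a DBA (my_user is a placeholder):

GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO my_user;
GRANT EXECUTE ON SYS.DBMS_DATAPUMP TO my_user;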
I have about 500 Linux scripts. I am trying to insert the source code from each script into an Oracle table:
CREATE TABLE "SCRIPT_CODE" (
"SCRIPT_NAME" VARCHAR2(200 BYTE),
"CODE_LINE" NUMBER(*,0),
"CODE_TEXT" VARCHAR2(2000 BYTE)
)
I was using a (painful) manual Excel solution, opening each script and pasting the code into a column. I ran into difficulties and switched gears.
I decided to change the table and place the entire source code of each script into a CLOB field:
CREATE TABLE "SCRIPT_CODE_CLOB" (
"SCRIPT_NAME" VARCHAR2(200 BYTE),
"CODE_TEXT" CLOB
)
Here is the Insert code that I wrote:
set define off;
DECLARE
  Code     CLOB;
  Script   VARCHAR2(100);
  sql_exec VARCHAR2(1000);
BEGIN
  Script := 'Some Script Name';
  Code := '
[pasted code here]
';
  sql_exec := 'INSERT INTO SCRIPT_CODE_CLOB VALUES (:1, :2)';
  EXECUTE IMMEDIATE sql_exec USING Script, Code;
  COMMIT;
END;
/
This was going great until I ran into a script that had 1,700 lines of code. When I pasted all the code in and ran the script, it gave me:
ORA-01704: string literal too long
I am looking for a better way of doing this. Is it possible to Import the files somehow and automate the process?
There are some external tables in the Oracle database, and I can get to the folder location that they point to.
Thanks very much for any assistance.
- Steve
Environment:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
Oracle SQL Developer Version 4.0.2.15, Build 15.21
In order to insert into the CLOB, you need to use the DBMS_LOB functions (specifically DBMS_LOB.WRITE) rather than reading it into a variable and passing that directly into your insert statement. Check out the documentation for the package. You'll need to read the data into a buffer or temporary LOB and then use that in the insert.
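Since you mention you can reach the folder your external tables point to, one way to automate the whole thing is to load each file through a BFILE instead of pasting it. A hedged sketch, assuming a directory object SCRIPTS_DIR already exists and points at that folder (the directory and file names here are hypothetical):

DECLARE
  v_bfile  BFILE := BFILENAME('SCRIPTS_DIR', 'my_script.sh');  -- hypothetical names
  v_clob   CLOB;
  dest_off INTEGER := 1;
  src_off  INTEGER := 1;
  v_lang   INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  v_warn   INTEGER;
BEGIN
  -- Create the row with an empty CLOB and grab a locator for it
  INSERT INTO SCRIPT_CODE_CLOB (SCRIPT_NAME, CODE_TEXT)
  VALUES ('my_script.sh', EMPTY_CLOB())
  RETURNING CODE_TEXT INTO v_clob;

  DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(
    dest_lob     => v_clob,
    src_bfile    => v_bfile,
    amount       => DBMS_LOB.LOBMAXSIZE,
    dest_offset  => dest_off,
    src_offset   => src_off,
    bfile_csid   => DBMS_LOB.DEFAULT_CSID,
    lang_context => v_lang,
    warning      => v_warn);
  DBMS_LOB.CLOSE(v_bfile);
  COMMIT;
END;
/

Looping over all 500 file names (from a folder listing loaded into a driver table, say) would then automate the import.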
I need to be able to drop a specific user (which may have active sessions) from a batch file without any user interaction. I don't care about the active sessions and want them to be dropped and rolled back. For Microsoft SQL I would do a similar task with a single line:
osql -E -S localhost -b -Q "use master if ((select name from sysdatabases where name='%DB%') is not null) begin alter database [%DB%] set single_user with rollback immediate drop database [%DB%] end"
How do i do it for Oracle (10g XE on Windows)?
My current batch is:
sqlplus sys/*** as SYSDBA @delete1.sql >delete.log
sqlplus sys/***@XE as SYSDBA @delete2.sql >>delete.log
where delete1.sql:
startup force;
exit;
and delete2.sql:
drop user MYUSER cascade;
exit;
This is ugly as hell and takes too long compared to the split second of the MSSQL solution.
It should work if you use the following script (here named drop_user_with_active_sessions.sql):
set verify off
begin
  for s in (
    select sid, serial#
      from v$session
     where username = '&1'
  ) loop
    execute immediate
      'alter system kill session ''' ||
      s.sid || ',' || s.serial# || ''' immediate';
  end loop;
  -- cascade is needed when the user still owns objects
  execute immediate 'drop user &1 cascade';
end;
/
exit
And then use it with:
sqlplus username/password@instance @c:\path\to\drop_user_with_active_sessions.sql MYUSER
You can run Oracle SQL via the command prompt and then do your cascade drop user.
I would recommend creating a SQL script and executing it from the command line.
Then you can wrap the command-line text up in your cmd/batch file.
But if you would like Oracle to handle the entire process, I would recommend looking into the job/scheduler environment (see the sketch below).
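A hedged sketch of that idea, assuming the kill-and-drop logic has been wrapped in a stored procedure (the job and procedure names here are hypothetical):

BEGIN
  DBMS_SCHEDULER.create_job(
    job_name   => 'DROP_MYUSER_JOB',
    job_type   => 'STORED_PROCEDURE',
    job_action => 'DROP_MYUSER',
    enabled    => TRUE);  -- with no schedule given, the job runs once, immediately
END;
/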
In addition to the "alter system kill session" mentioned above, I've also needed to precede the kill with something like:
execute immediate 'ALTER SYSTEM DISCONNECT SESSION ''' ||
  to_char(s.sid) || ', ' || to_char(s.serial#) || ''' IMMEDIATE';
It's a very, very bad idea to take a construct from one database platform and assume you can run the exact same thing on a different platform. For example, Oracle has CREATE OR REPLACE PROCEDURE; MSSS isn't quite so simple. In MSSS you can make a "temp" table with #name; in Oracle we use DDL. While dropping a user to recreate a fresh environment may have been the simplest approach on MSSS, perhaps there's a more Oracle-centric way to accomplish the same thing. It's a very good idea to ask for help on how to accomplish a task instead of asking why your way isn't working.
First, does the app being tested issue DDL against the tables and other objects?
If it only changes data, the way Oracle prefers apps to work, then why do you have to recreate all the objects? You just need to get the data back to the starting point.
Have you looked into Flashback Database? You should be able to create a restore point... do whatever you want and then flashback the database to that point in time.
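A hedged sketch of that approach (the restore point name is arbitrary; a guaranteed restore point needs a flash recovery area configured, Flashback Database is not available in every edition, and the flashback itself is done as SYSDBA with the database mounted):

-- before testing
CREATE RESTORE POINT before_test GUARANTEE FLASHBACK DATABASE;

-- after testing, rewind to it
SHUTDOWN IMMEDIATE
STARTUP MOUNT
FLASHBACK DATABASE TO RESTORE POINT before_test;
ALTER DATABASE OPEN RESETLOGS;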