Trying to Insert contents of text files into an Oracle table - sql

I have about 500 Linux scripts. I am trying to insert the source code from each script into an Oracle table:
CREATE TABLE "SCRIPT_CODE" (
"SCRIPT_NAME" VARCHAR2(200 BYTE),
"CODE_LINE" NUMBER(*,0),
"CODE_TEXT" VARCHAR2(2000 BYTE)
)
I was using a (painful) manual Excel solution, opening each script and pasting the code into a column. I ran into difficulties and switched gears.
I decided to change the table and place the entire source code from each script into a CLOB field:
CREATE TABLE "SCRIPT_CODE_CLOB" (
"SCRIPT_NAME" VARCHAR2(200 BYTE),
"CODE_TEXT" CLOB
)
Here is the Insert code that I wrote:
set define off;
Declare Code Clob;
Script Varchar2(100);
sql_exec varchar2(1000);
Begin
Script := 'Some Script Name';
Code := '
[pasted code here]
';
sql_exec := 'INSERT INTO SCRIPT_CODE_CLOB VALUES (:1, :2)';
EXECUTE IMMEDIATE(sql_exec) USING Script, Code;
COMMIT;
End;
This was going great until I ran into a script that had 1,700 lines of code. When I pasted all the code in and ran the script, it gave me:
ORA-01704: string literal too long
I am looking for a better way of doing this. Is it possible to import the files somehow and automate the process?
There are some external tables in the Oracle database, and I can get to the folder location that they point to.
Thanks very much for any assistance.
- Steve
Environment:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
Oracle SQL Developer Version 4.0.2.15, Build 15.21

In order to insert into the CLOB, you need to use the DBMS_LOB functions (specifically DBMS_LOB.WRITE) rather than reading the code into a variable and passing that directly into your insert statement. Check out the documentation for the package. You'll need to read the data into a buffer or a temporary LOB and then use that in the insert.
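To automate the import from the files themselves, one option (a minimal sketch, not the only way) is DBMS_LOB.LOADCLOBFROMFILE driven by a BFILE, which fills the CLOB straight from the file so no string literal is involved. SCRIPT_DIR and the file name below are assumptions: SCRIPT_DIR would be an Oracle DIRECTORY object pointing at the folder holding the scripts (the folder your external tables already point to would do).
DECLARE
  l_bfile        BFILE;
  l_clob         CLOB;
  l_dest_offset  INTEGER := 1;
  l_src_offset   INTEGER := 1;
  l_lang_context INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warning      INTEGER;
BEGIN
  -- SCRIPT_DIR and the file name are placeholders for this sketch
  l_bfile := BFILENAME('SCRIPT_DIR', 'some_script.sh');

  -- insert an empty CLOB and get its locator back
  INSERT INTO script_code_clob (script_name, code_text)
  VALUES ('some_script.sh', EMPTY_CLOB())
  RETURNING code_text INTO l_clob;

  -- load the file contents directly into the CLOB column
  DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(
    dest_lob     => l_clob,
    src_bfile    => l_bfile,
    amount       => DBMS_LOB.LOBMAXSIZE,
    dest_offset  => l_dest_offset,
    src_offset   => l_src_offset,
    bfile_csid   => DBMS_LOB.DEFAULT_CSID,
    lang_context => l_lang_context,
    warning      => l_warning);
  DBMS_LOB.FILECLOSE(l_bfile);
  COMMIT;
END;
/
Wrapping that block in a loop over the 500 file names would automate the whole load.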

Related

Deep Clone tables from Remote Oracle to Local Oracle XE with schemas

My current task involves deep cloning [table, index, sequence] a set of tables (not all of them are in the same schema) from a remote (Prod) Oracle DB to my local (Dev) XE Db (including data), and - if possible - even having it all be one script or file I can execute (if need be, I can hope they accept a compiled program).
I knew of create table <name> as select * from <name>@<link> and the equivalent insert, which is probably the best way of copying the data, but not the definition.
I've searched around and stumbled across dbms_metadata.get_ddl() which helped a bit, but I haven't figured out how to connect to the remote database using it, and also haven't found out how to get the tables from the other schemas.
I have a total of 3 schemas: the main one for my application (we'll call it "MY"), the company's base data ("COMP"), and a related application of a colleague ("COLL").
I've tried the script from here, which looked promising, but as I said, I haven't figured out how to connect to the remote database and the different schemas (the company schema is the hardest one, as I don't know if I can log into it, and I only have select permission):
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, owner)
FROM all_tables WHERE owner = UPPER('&1');
When I tried GET_DDL with a different owner, SQL Developer gave me the error:
"Object %s of type TABLE in schema %s not found" - where the first %s is the first table in the schema (COMP) named by the second %s.
DBMS_METADATA is a good option, but to use it with a db link you have to write a bit more code.
First, your remote user (the db link user) should have SELECT_CATALOG_ROLE.
The example below is not a complete solution; it handles one object per request. For more details, go to the DBMS_METADATA manual.
DECLARE
  h1          NUMBER;
  table_name  VARCHAR2(100) := 'table_name';
  from_schema VARCHAR2(100) := 'schema_name';
  db_link     VARCHAR2(100) := 'db_link_name';
  xml         CLOB;
BEGIN
  h1 := DBMS_METADATA.OPEN('TABLE', network_link => db_link); -- open context with db link name
  DBMS_METADATA.SET_FILTER(h1, 'NAME', table_name);
  DBMS_METADATA.SET_FILTER(h1, 'SCHEMA', from_schema);
  xml := DBMS_METADATA.FETCH_CLOB(h1);
  DBMS_OUTPUT.PUT_LINE(h1);
  IF xml IS NOT NULL THEN
    DBMS_OUTPUT.PUT_LINE(xml);
  END IF;
  DBMS_METADATA.CLOSE(h1);
END;
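If you drop the NAME filter, the same handle walks every table in the remote schema. A sketch under that assumption (the DDL transform is added so you get CREATE TABLE statements instead of the default XML; the schema and link names are placeholders):
DECLARE
  h1          NUMBER;
  th          NUMBER;
  from_schema VARCHAR2(100) := 'schema_name';
  db_link     VARCHAR2(100) := 'db_link_name';
  ddl_text    CLOB;
BEGIN
  h1 := DBMS_METADATA.OPEN('TABLE', network_link => db_link);
  DBMS_METADATA.SET_FILTER(h1, 'SCHEMA', from_schema);
  th := DBMS_METADATA.ADD_TRANSFORM(h1, 'DDL');  -- emit DDL instead of XML
  LOOP
    ddl_text := DBMS_METADATA.FETCH_CLOB(h1);    -- NULL when no tables are left
    EXIT WHEN ddl_text IS NULL;
    DBMS_OUTPUT.PUT_LINE(ddl_text);
  END LOOP;
  DBMS_METADATA.CLOSE(h1);
END;
/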

oracle sql error: Inserting a picture in blob format

I am trying to insert a picture into the following table:
create table Picture
(
pic BLOB,
title varchar2(30),
descript varchar2(200),
tags varchar2(100),
date_created varchar2(100),
actualdate date
);
I have a picture and 5 varchar2 parameters. Here is the procedure I want to insert with:
create or replace procedure addKep (pic BLOB, title varchar2,descript varchar2, tags varchar2 , date_created varchar2, hiba out varchar2)
is
my_date date;
v_blob BLOB;
begin
--get actual date
SELECT TO_DATE(SYSDATE, 'YYYY/MM/DD HH:MI:SS') INTO my_date FROM DUAL;
INSERT INTO Picture (pic)
VALUES (empty_blob());
insert into Picture Values(pic,title,descript,tags,date_created,my_date);
--hiba:='Sikeres!';
commit;
end;
Then I try to test my procedure:
declare
something varchar2(20);
BEGIN
addKep('c:\xampp\htdocs\php_web\_projekt\pic\akosfeladatok.jpg','Title','Description','tags','2020-06-15',something);
END;
But I get the following error:
PLS-00306: wrong number or types of arguments in call to 'ADDKEP'
However, I am passing the same argument list.
Thank you for your help
You don't pass a path to a file as a BLOB; you pass the actual bytes of the file - see Using PL/SQL how do you I get a file's contents in to a blob?
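For illustration, a minimal sketch of getting those bytes into a BLOB and then calling the posted procedure; IMG_DIR is an assumed Oracle DIRECTORY object pointing at the folder that holds the picture, and the file name is a placeholder:
DECLARE
  l_bfile     BFILE;
  l_blob      BLOB;
  l_dest      INTEGER := 1;
  l_src       INTEGER := 1;
  l_something VARCHAR2(20);
BEGIN
  -- IMG_DIR and the file name are assumptions for this sketch
  l_bfile := BFILENAME('IMG_DIR', 'akosfeladatok.jpg');
  DBMS_LOB.CREATETEMPORARY(l_blob, TRUE);
  DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADBLOBFROMFILE(l_blob, l_bfile, DBMS_LOB.LOBMAXSIZE, l_dest, l_src);
  DBMS_LOB.FILECLOSE(l_bfile);

  -- now the first argument really is a BLOB, so the parameter types match
  addKep(l_blob, 'Title', 'Description', 'tags', '2020-06-15', l_something);
END;
/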
Though I (personally) dare say your problematic code has the right idea: I'd recommend NOT storing your pictures inside the database. Store them in the file system as files and keep the paths in the DB. That is handy for all sorts of things, such as being able to individually back them up, manipulate or substitute them, and you don't need to write code in order to serve them over the web; web servers already know how to serve files out of the file system. As soon as you put your picture (or any byte data) into a DB, it becomes much harder to work with: you have to write all the code that pulls it out and puts it back, and that's really all you can do. Storing files in your mission-critical DB also means the DB has to dedicate resources to fetching those files any time they're needed. Files like pictures should really be put on a CDN and made available close to the users who will use them, with the job of storing, caching and serving them handed off to existing technologies dedicated to the task.
PS: there are a lot of reasonable arguments for and against at https://dba.stackexchange.com/questions/2445/should-binary-files-be-stored-in-the-database - my thoughts, personally, align with Tek's.

oracle sql developer first time user

I am new to PL/SQL and trying to use Oracle SQL Developer. I try to run a simple procedure with a DBMS_OUTPUT line and I get the following error:
ORA-00904
The code is:
create or replace PROCEDURE proc_101 IS
v_string_tx VARCHAR2(256) := 'Hello World';
BEGIN
dbms_output.put_line(v_string_tx);
END;
Whether I click Run (green) or Debug (red), I get the same error.
As you can see from the above code, the procedure doesn't access any objects, but I still get the same error.
Your procedure is fine. You may not have permission to create a procedure. If this is the case, test your procedure/code without actually creating it in the database first. For example, when I'm testing code in my production database my Oracle user cannot create procedures, packages, tables etc., so I test my procedures within my own PL/SQL blocks. When the code is good to go, I can get a database administrator to create the procedures and/or packages for me.
The screenshot below is code that simply tests the procedure.
The screenshot below is code that does much more and tests the procedure from within a PL/SQL block.
For more advanced situations this allows you to do so much more, as you can create all sorts of procedures/functions and/or cursors and test them immediately without needing to CREATE these objects in your Oracle database.
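As a rough sketch of that approach, based on the posted proc_101: an anonymous block that declares the procedure locally, so no CREATE privilege is needed.
DECLARE
  -- proc_101 is declared locally inside the anonymous block, not created in the database
  PROCEDURE proc_101 IS
    v_string_tx VARCHAR2(256) := 'Hello World';
  BEGIN
    dbms_output.put_line(v_string_tx);
  END proc_101;
BEGIN
  proc_101;  -- call it straight away
END;
/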
I'd say that there's some other code in the worksheet which raises that error, not just the CREATE PROCEDURE you posted. For example, something like this SQL*Plus example (just to show what's going on - you'd get the same result in SQL Developer):
SQL> select pixie from dual;
select pixie from dual
*
ERROR at line 1:
ORA-00904: "PIXIE": invalid identifier
SQL>
SQL> create or replace PROCEDURE proc_101 IS
2 v_string_tx VARCHAR2(256) := 'Hello World';
3 BEGIN
4 dbms_output.put_line(v_string_tx);
5 END;
6 /
Procedure created.
SQL>
See? The first part raised ORA-00904 as there's no PIXIE column in DUAL, while the procedure is created correctly.
So - remove code which fails and everything should be OK.
Check with your DBA to make sure the dbms_output package has been installed on your database, and that you have permissions on it.
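If in doubt, a quick sanity check you can run as your own user is to confirm the package is visible and valid:
SELECT owner, object_type, status
  FROM all_objects
 WHERE object_name = 'DBMS_OUTPUT';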

How best can I recreate an Oracle database?

Oracle 11gR2 (x86 Windows):
I have a db with 250 tables with indexes and constraints. I need to re-create these tables, indexes and constraints in a new db and load the data. I need to know how to do the following in SQL Plus and/or SQL Developer, unless there's a magical utility that can automate all of this. Thanks in advance!
Unload (export) all the data from the 250 tables.
Create an sql script file containing the CREATE TABLE statements for the 250 tables.
Create an sql script file containing the CREATE INDEX statements for the 250 tables.
Create an sql script file containing the ALTER TABLE ADD CONSTRAINT statements for the 250 tables.
Run the script to create the tables in a new db.
Load the exported data into the tables in the new db.
Run the script to create all the indexes.
Run the script to add all the constraints.
EDIT: I'm connected to the remote desktop which links to the source db on a Windows Server 2008. The remote only has an Oracle client installed. For security reasons, I'm not allowed to link directly from my local computer to the Win Server, so can I dump the whole source db to the remote then zip it to my local target machine? I'm trying to replicate the entire db on my computer.
Starting from Oracle 10g, you could use the Data Pump command-line clients expdp and impdp to export/import data and/or schema from one DB to another. As a matter of fact, those two command-line utilities are only wrappers that "use the procedures provided in the DBMS_DATAPUMP PL/SQL package to execute export and import commands, using the parameters entered at the command line." (quoted from Oracle's documentation)
Given your needs, you will have to create a directory and then generate a full dump of your database using expdp:
SQL> CREATE OR REPLACE DIRECTORY dump_dir AS '/path/to/dump/folder/';
sh$ expdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
As the dump is written using a binary format, you will have to use the corresponding import utility to (re)import your DB. Basically, replace expdp with impdp in the above command:
sh$ impdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
For a simple table dump, use this version instead:
sh$ expdp sylvain@db10g tables=DEPT,EMP directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
As you noticed, you can use it with your standard user account, provided you have access to the given directory (GRANT READ, WRITE ON DIRECTORY dump_dir TO sylvain;).
For detailed usage explanations, see
http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php.
If you can create a database link from your local database to the one that currently contains the data, you can use the DBMS_DATAPUMP package to copy the entire schema. This is an interface to Data Pump (as @Sylvain Leroux mentioned) that is callable from within the database.
DECLARE
dph NUMBER;
source_schema VARCHAR2 (30) := 'SCHEMA_TO_EXPORT';
target_schema VARCHAR2 (30) := 'SCHEMA_TO_IMPORT';
job_name VARCHAR2 (30) := UPPER ('IMPORT_' || target_schema);
p_parallel NUMBER := 3;
v_start TIMESTAMP := SYSTIMESTAMP;
v_state VARCHAR2 (30);
BEGIN
dph :=
DBMS_DATAPUMP.open ('IMPORT',
'SCHEMA',
'DB_LINK_NAME',
job_name);
DBMS_OUTPUT.put_line ('dph = ' || dph);
DBMS_DATAPUMP.metadata_filter (dph,
'SCHEMA_LIST',
'''' || source_schema || '''');
DBMS_DATAPUMP.metadata_remap (dph,
'REMAP_SCHEMA',
source_schema,
target_schema);
DBMS_DATAPUMP.set_parameter (dph, 'TABLE_EXISTS_ACTION', 'REPLACE');
DBMS_DATAPUMP.set_parallel (dph, p_parallel);
DBMS_DATAPUMP.start_job (dph);
DBMS_DATAPUMP.wait_for_job (dph, v_state);
DBMS_OUTPUT.put_line ('Export/Import time: ' || (SYSTIMESTAMP - v_start));
DBMS_OUTPUT.put_line ('Final state: ' || v_state);
END;
The script above actually copies and renames the schema. If you want to keep the same schema name, I believe you'd just remove the metadata_remap call.
SQL Developer can help with #1 by creating INSERT statements with a formatted query result:
Select /*insert*/ *
from My_Table;

Transfer data from Oracle table to text file

I'm interested in whether it is possible, with a PL/SQL block, to transfer the content of an Oracle table into a text file on the hard drive. I need a PL/SQL block that can download the content of a table (which will be used to store log data) into a text file.
Regards
You can use the UTL_FILE package for this.
You can try the type of block below:
declare
  p_file      utl_file.file_type;
  l_delimited varchar2(1) := '|';
begin
  -- '<file_path>' must name an Oracle DIRECTORY object (or a utl_file_dir location) the user may write to
  p_file := utl_file.fopen('<file_path>','<file_name>','W');
  for l_table in (select * from <your_table_name>) loop
    -- put_line appends the newline for you
    utl_file.put_line(p_file, l_table.col1||l_delimited||l_table.col2||l_delimited||l_table.col3||l_delimited||l_table.col4||l_delimited <continue with column list .........>);
  end loop;
  utl_file.fclose_all();
end;
pratik garg's answer is a good one.
But you might also want to consider the use of an EXTERNAL TABLE.
Basically, it's a table which is mapped to a file, so every row inserted into the table is automatically written to a file.
You can see an example here.
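For reference, a write-capable external table must use the ORACLE_DATAPUMP access driver, and it produces a binary dump file rather than plain text. A minimal sketch, with dump_dir and log_data as placeholder names:
CREATE TABLE log_data_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY dump_dir
    LOCATION ('log_data.dmp')
  )
AS SELECT * FROM log_data;
If the requirement really is a plain text file, UTL_FILE as shown above (or spooling from SQL*Plus) is the more direct route.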