How to update CLOB column from a physical file? - sql

I have a CLOB column in a table that holds a very large amount of XML data, and I need to update this column's value for one row of the table. How can I do this?
I have tried googling, but this seemingly simple task isn't explained in plain language anywhere. Can anybody please suggest an approach?
If I use the normal update syntax and paste the huge XML content inside the single quotes (with each single quote in the XML doubled), SQL Developer just disables the execute query button.
update tableName t
set t.clobField = 'How to specify physical file data'
where t.anotherField='value';

You need to create a function that reads the data from the file into a variable of the CLOB data type and returns it as the result.
Try this working example:
create table tclob (id number, filename varchar2 (64), doc clob)
/
insert into tclob values (1, 'test.xml', empty_clob ());
commit;
create or replace function clobLoader (filename varchar2) return clob is
  bf bfile := bfilename ('TEMPFILES', filename);
  cl clob;
begin
  if dbms_lob.fileexists (bf) = 1 then
    dbms_lob.createtemporary (cl, true);
    dbms_lob.fileopen (bf, dbms_lob.file_readonly);
    dbms_lob.loadfromfile (cl, bf, dbms_lob.getlength (bf));
    dbms_lob.fileclose (bf);
  else
    cl := empty_clob ();
  end if;
  return cl;
end;
/
Usage:
update tclob t
set t.doc = clobLoader (t.filename)
where t.id = 1;
1 row updated.
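Note that clobLoader assumes an Oracle directory object named TEMPFILES pointing at the OS folder that holds the file. If it does not exist yet, something like the following sketch (run with DBA privileges; the path and grantee are placeholders, adjust for your environment) would set it up:

```sql
-- Placeholder path and user - adjust for your environment.
create or replace directory TEMPFILES as '/home/oracle/tempfiles';
grant read on directory TEMPFILES to your_schema;
```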

Search the internet for "load clob from file" and you will find several examples, such as this one (http://www.anujparashar.com/blog/loading-text-file-into-clob-field-in-oracle):
DECLARE
   -- p_file_directory / p_file_name are parameters in the original blog
   -- post; declared here as constants so the block compiles stand-alone.
   p_file_directory   CONSTANT VARCHAR2 (30) := 'TEMPFILES';
   p_file_name        CONSTANT VARCHAR2 (64) := 'test.xml';
   v_bfile            BFILE;
   v_clob             CLOB;
BEGIN
   v_bfile := BFILENAME (p_file_directory, p_file_name);
   IF DBMS_LOB.FILEEXISTS (v_bfile) = 1 THEN
      DBMS_LOB.OPEN (v_bfile);
      DBMS_LOB.CREATETEMPORARY (v_clob, TRUE, DBMS_LOB.SESSION);
      DBMS_LOB.LOADFROMFILE (v_clob, v_bfile, DBMS_LOB.GETLENGTH (v_bfile));
      DBMS_LOB.CLOSE (v_bfile);
      INSERT INTO tbl_clob (clob_col) VALUES (v_clob);
   END IF;
   COMMIT;
END;
/

Related

performance issue when inserting large records

I am parsing a string into comma-separated tokens and inserting them into a global temporary table. Performance is good when inserting around 5k records, but it degrades badly at around 40k+ records. The global table has only one column. I thought using bulk fetch and FORALL would improve performance, but so far that is not the case. How can I rewrite the insertion below, or is there any other way to insert large numbers of records? Help will be highly appreciated. I tested the insert query on its own, and it takes a long time when the data size is large.
-- emp_refno holds the large comma-separated input string
CREATE OR replace PROCEDURE employee( emp_refno IN CLOB ) AS
c_limit PLS_INTEGER := 1000;
CURSOR token_cur IS
WITH inputs(str) AS
( SELECT to_clob(emp_refno)
FROM dual ),
prep(s,n,token,st_pos,end_pos ) AS
(
SELECT ','|| str || ',',-1,NULL,NULL,1
FROM inputs
UNION ALL
SELECT s, n + 1,substr(s, st_pos, end_pos - st_pos),
end_pos + 1,instr(s, ',', 1, n + 3)
FROM prep
WHERE end_pos != 0
)
SELECT token
FROM prep
WHERE n > 0;
TYPE token_t
IS
TABLE OF CLOB;
rec_token_t TOKEN_T;
BEGIN
OPEN token_cur;
LOOP
FETCH token_cur bulk collect
INTO rec_token_t limit c_limit;
IF rec_token_t.count > 0 THEN
forall rec IN rec_token_t.first ..rec_token_t.last
INSERT INTO globaltemp_emp
VALUES ( rec_token_t(rec) );
COMMIT;
END IF;
EXIT
WHEN rec_token_t.count = 0;
END LOOP;
OPEN p_resultset FOR
SELECT e.empname,
e.empaddress,
f.department
FROM employee e
join department f
ON e.emp_id = f.emp_id
AND e.emp_refno IN
(
SELECT emp_refno
FROM globaltemp_emp);  -- using the GTT in a subquery
END;
I have adapted a function which gives better performance. For 90k records, it returns in 13 seconds. Also reduce c_limit to 250.
You can adapt the below:
CREATE OR replace FUNCTION pipe_clob ( p_clob IN CLOB,
p_max_lengthb IN INTEGER DEFAULT 4000,
p_rec_delim IN VARCHAR2 DEFAULT '
' )
RETURN sys.odcivarchar2list pipelined authid current_user AS
/*
Break CLOB into VARCHAR2 sized bites.
Reduce p_max_lengthb if you need to expand the VARCHAR2
in later processing.
Last record delimiter in each bite is not returned,
but if it is a newline and the output is spooled
the newline will come back in the spooled output.
Note: this cannot work if the CLOB contains more than
<p_max_lengthb> consecutive bytes without a record delimiter.
*/
l_amount INTEGER;
l_offset INTEGER;
l_buffer VARCHAR2(32767 byte);
l_out VARCHAR2(32767 byte);
l_buff_lengthb INTEGER;
l_occurence INTEGER;
l_rec_delim_length INTEGER := length(p_rec_delim);
l_max_length INTEGER;
l_prev_length INTEGER;
BEGIN
IF p_max_lengthb > 4000 THEN
raise_application_error(-20001, 'Maximum record length (p_max_lengthb) cannot be greater than 4000.');
ELSIF p_max_lengthb < 10 THEN
raise_application_error(-20002, 'Maximum record length (p_max_lengthb) cannot be less than 10.');
END IF;
IF p_rec_delim IS NULL THEN
raise_application_error(-20003, 'Record delimiter (p_rec_delim) cannot be null.');
END IF;
/* This version is limited to 4000 byte output, so I can afford to ask for 4001
in case the record is exactly 4000 bytes long.
*/
l_max_length:=dbms_lob.instr(p_clob,p_rec_delim,1,1)-1;
l_prev_length:=0;
l_amount := l_max_length + l_rec_delim_length;
l_offset := 1;
WHILE (l_amount = l_max_length + l_rec_delim_length
AND
l_amount > 0)
LOOP
BEGIN
dbms_lob.READ ( p_clob, l_amount, l_offset, l_buffer );
EXCEPTION
WHEN no_data_found THEN
l_amount := 0;
END;
IF l_amount = 0 THEN
EXIT;
ELSIF lengthb(l_buffer) <= l_max_length THEN
pipe ROW(rtrim(l_buffer, p_rec_delim));
EXIT;
END IF;
l_buff_lengthb := l_max_length + l_rec_delim_length;
l_occurence := 0;
WHILE l_buff_lengthb > l_max_length
LOOP
l_occurence := l_occurence + 1;
l_buff_lengthb := instrb(l_buffer,p_rec_delim, -1, l_occurence) - 1;
END LOOP;
IF l_buff_lengthb < 0 THEN
IF l_amount = l_max_length + l_rec_delim_length THEN
raise_application_error( -20004, 'Input clob at offset '
||l_offset
||' for lengthb '
||l_max_length
||' has no record delimiter' );
END IF;
END IF;
l_out := substrb(l_buffer, 1, l_buff_lengthb);
pipe ROW(l_out);
l_prev_length:=dbms_lob.instr(p_clob,p_rec_delim,l_offset,1)-1;--san temp
l_offset := l_offset + nvl(length(l_out),0) + l_rec_delim_length;
l_max_length:=dbms_lob.instr(p_clob,p_rec_delim,l_offset,1)-1;--san temp
l_max_length:=l_max_length-l_prev_length;
l_amount := l_max_length + l_rec_delim_length;
END LOOP;
RETURN;
END;
and then use like the below in the cursor in your procedure
CURSOR token_cur IS
select * from table (pipe_clob(emp_refno||',',10,','));
Three quick suggestions:
Perform the commit for around 1000 records (or in batches) rather than for each one.
Replace IN with EXISTS in the ref cursor query.
Index globaltemp_emp.emp_refno if it isn't indexed already.
Additionally, I recommend running an explain plan for each DML operation to check for any odd behaviour.
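The IN-to-EXISTS rewrite suggested above would look roughly like this sketch (table and column names taken from the question; untested):

```sql
OPEN p_resultset FOR
  SELECT e.empname,
         e.empaddress,
         f.department
  FROM   employee e
  JOIN   department f ON e.emp_id = f.emp_id
  WHERE  EXISTS (SELECT 1
                 FROM   globaltemp_emp g
                 WHERE  g.emp_refno = e.emp_refno);
```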
The user uploads a text file, and I parse that text file as a comma-separated string and pass it to the Oracle DB.
You are doing a bunch of work to turn that file into a string and then another bunch of work to convert that string into a table. As many people have observed before me, the best performance comes from not doing work we don't have to do.
In this case this means you should load the file's contents directly into the database. We can do this with an external table. This is a mechanism which allows us to query data from a file on the server using SQL. It would look something like this:
create table emp_refno_load
(emp_refno varchar2(24))
organization external
(type oracle_loader
default directory file_upload_dir
access parameters
(records delimited by newline
fields (employee_number char(24)
)
)
location ('some_file.txt')
);
Then you can discard your stored procedure and temporary table and re-write your query to something like this:
SELECT e.empname,
e.empaddress,
f.department
FROM emp_refno_load l
join employee e ON l.emp_refno = e.emp_refno
join department f ON e.emp_id = f.emp_id
The one snag with external tables is that they require access to an OS directory (file_upload_dir in my example above), and some database security policies are strict about that. However, the performance benefits and simplicity of the approach should carry the day.
Find out more.
An external table is undoubtedly the most performant approach (until you hit millions of rows, and then you need SQL*Loader).

Oracle fetch XML from BLOB

I have tried lots of methods and still can't get my full XML document from the DB. What I want to achieve is displaying the XML in Oracle Apex (Display Only element), but I can't manage to get the full XML out of my blob.
SELECT
utl_raw.cast_to_varchar2(dbms_lob.substr(<blob_column>, 2000, 1))
FROM
<my_table>
WHERE <some_id> = 123
Also tried to fetch it with mimetype but had no luck. Thank you.
First, you shouldn't convert it to VARCHAR2 on the server side, since Oracle SQL has a 4K limit on VARCHAR2 string size. You can use a PL/SQL block to retrieve your data, but in that case the limit is 32K. There is a way around this issue: http://mayo-tech-ans.blogspot.com/2013/06/displaying-large-clobs-in-oracle-apex.html .
Hoping I understood the question correctly, I think the query below will help you.
First, convert the BLOB column to XMLTYPE; this also lets you check whether the XML is valid:
http://www.dba-oracle.com/t_convert_blob_to_xml_type.htm
Then, use EXTRACTVALUE to fetch the data from the XML.
select EXTRACTVALUE(xml_data,'/note/to')
from
(
select XMLTYPE.createXML('<note>
<to>Tove</to>
<from>Jani</from>
<heading>Reminder</heading>
<body>Test body</body>
</note>') xml_data from dual
)
;
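Note that EXTRACTVALUE is deprecated in recent Oracle versions in favour of XMLTABLE; an XMLTABLE equivalent of the query above would be roughly this sketch:

```sql
select x.to_whom
from   (select XMLTYPE.createXML('<note>
                 <to>Tove</to>
                 <from>Jani</from>
                 <heading>Reminder</heading>
                 <body>Test body</body>
               </note>') xml_data
        from dual) t,
       xmltable('/note' passing t.xml_data
                columns to_whom varchar2(50) path 'to') x;
```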
In order to get the full BLOB content:
Convert the BLOB into a CLOB:
create or replace FUNCTION blob_to_clob (blob_in IN BLOB)
RETURN CLOB
AS
v_clob CLOB;
v_varchar VARCHAR2(32767);
v_start PLS_INTEGER := 1;
v_buffer PLS_INTEGER := 32767;
BEGIN
DBMS_LOB.CREATETEMPORARY(v_clob, TRUE);
FOR i IN 1..CEIL(DBMS_LOB.GETLENGTH(blob_in) / v_buffer)
LOOP
v_varchar := UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(blob_in, v_buffer, v_start));
DBMS_LOB.WRITEAPPEND(v_clob, LENGTH(v_varchar), v_varchar);
v_start := v_start + v_buffer;
END LOOP;
RETURN v_clob;
END blob_to_clob;
In Oracle Apex, create a Display Only element, e.g. P5_XML, and fill it with this PL/SQL code:
declare
v_clob clob;
begin
SELECT blob_to_clob(<blob_column>) INTO v_clob FROM <table> WHERE <some_column> = <something>;
:P5_XML := v_clob;
end;

ORACLE BLOB to FILE

I am writing some pl/sql to generate pdf reports that are stored as blobs in an oracle table. I need to loop through this table which has a column for filename and blob and write the blob to the OS as a file with the corresponding filename in the table. I pretty much have completed this code but am running into a snag:
ORA-06550: line 13, column 59:
PL/SQL: ORA-00904: "SIMS_PROD"."PUBLISH_RPT_NEW"."RPT_FILE_NAME": invalid identifier
ORA-06550: line 13, column 12:
PL/SQL: SQL Statement ignored
06550. 00000 - "line %s, column %s:\n%s"
Cause: Usually a PL/SQL compilation error.
Action:
I did read the post on this site: How can I extract files from an Oracle BLOB field? However, that covers only one file; my table contains hundreds of rows, each with a blob and an associated filename, and it's looping through this table that's giving me grief.
I need to prefix the schema name, table and column explicitly since I am logged in as a DBA user and not as the owner of the schema itself. Here is my code; what am I missing or doing wrong? Thanks in advance for any help from the community; it's much appreciated.
DECLARE
t_blob BLOB;
t_len NUMBER;
t_file_name VARCHAR2(100);
t_output utl_file.file_type;
t_totalsize NUMBER;
t_position NUMBER := 1;
t_chucklen NUMBER := 4096;
t_chuck RAW(4096);
t_remain NUMBER;
BEGIN
FOR X IN (SELECT SIMS_PROD.publish_rpt_new.RPT_FILE_NAME, SIMS_PROD.publish_rpt_new.RPT_CONTENTS FROM SIMS_PROD.PUBLISH_RPT)
LOOP
-- Get length of blob
SELECT dbms_lob.Getlength (SIMS_PROD.publish_rpt_new.RPT_CONTENTS), SIMS_PROD.publish_rpt_new.RPT_FILE_NAME INTO t_totalsize, t_file_name FROM SIMS_PROD.publish_rpt_new;
t_remain := t_totalsize;
-- The directory TEMPDIR should exist before executing
t_output := utl_file.Fopen ('PDF_REP', t_file_name, 'wb', 32760);
-- Get BLOB
SELECT SIMS_PROD.publish_rpt_new.RPT_CONTENTS INTO t_blob FROM SIMS_PROD.publish_rpt_new;
-- Retrieving BLOB
WHILE t_position < t_totalsize
LOOP
dbms_lob.READ (t_blob, t_chucklen, t_position, t_chuck);
utl_file.Put_raw (t_output, t_chuck);
utl_file.Fflush (t_output);
t_position := t_position + t_chucklen;
t_remain := t_remain - t_chucklen;
IF t_remain < 4096 THEN t_chucklen := t_remain;
END IF;
END LOOP;
END LOOP;
END;
Try replacing the line 13 with this:
FOR X IN (SELECT RPT_FILE_NAME, RPT_CONTENTS FROM SIMS_PROD.PUBLISH_RPT)
This answer comes too late, but at least you will know where you made the mistake. The problem is that if you use:
FOR X IN (SELECT a, b from table) LOOP
you have to use X.a in the statements inside the loop to refer to the values of the selected rows.
So in your code you had to change SIMS_PROD.publish_rpt_new.RPT_CONTENTS to X.RPT_CONTENTS.
I haven't read the rest of your code, so there may be more mistakes.
I have done a similar thing recently; ping me for exact additional details, but the gist is this:
create/frame the Excel or PDF as a BLOB and store it into a BLOB column
use a base64 converter function to store the same data into a CLOB (as text)
use SQL*Plus from Windows/Linux to spool the text from the CLOB column
convert the spooled text back into a binary file with the desired filename using OS tools (e.g. OpenSSL can convert base64 back to binary)
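The base64 converter step could be sketched like this; this is a hypothetical helper, not the poster's actual code. The chunk size is a multiple of 3 so the base64 stream has no mid-stream padding:

```sql
create or replace function blob_to_b64 (p_blob blob) return clob is
  l_clob clob;
  l_step pls_integer := 12000;  -- multiple of 3: '=' padding only at the very end
  l_raw  raw(32767);
begin
  dbms_lob.createtemporary (l_clob, true);
  for i in 0 .. trunc((dbms_lob.getlength(p_blob) - 1) / l_step) loop
    l_raw := dbms_lob.substr (p_blob, l_step, i * l_step + 1);
    dbms_lob.append (l_clob,
        to_clob(utl_raw.cast_to_varchar2(utl_encode.base64_encode(l_raw))));
  end loop;
  return l_clob;
end;
/
```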
Maybe you could consider writing a separate procedure that saves a BLOB as a file (any BLOB from any table), and then just looping through your table passing the BLOB, directory and file name. That way it can be used independently.
Here is a function (it is a function for other reasons; you can change it to a procedure) that does it as described:
Function BLOB2FILE (mBLOB BLOB, mDir VARCHAR2, mFile VARCHAR2) RETURN VarChar2
IS
BEGIN
Declare
utlFile UTL_FILE.FILE_TYPE;
utlBuffer RAW(32767);
utlAmount BINARY_INTEGER := 32767;
utlPos INTEGER := 1;
utlBlobLen INTEGER;
mRet VarChar2(100);
Begin
utlBlobLen := DBMS_LOB.GetLength(mBLOB);
utlFile := UTL_FILE.FOPEN(mDir, mFile,'wb', 32767);
--
WHILE utlPos <= utlBlobLen LOOP
DBMS_LOB.READ(mBLOB, utlAmount, utlPos, utlBuffer);
UTL_FILE.PUT_RAW(utlFile, utlBuffer, TRUE);
utlPos := utlPos + utlAmount;
END LOOP;
--
UTL_FILE.FCLOSE(utlFile);
mRet := 'OK - file created' || mFile;
RETURN mRet;
Exception
WHEN OTHERS THEN
IF UTL_FILE.IS_OPEN(utlFile) THEN
UTL_FILE.FCLOSE(utlFile);
END IF;
mRet := 'ERR - CLOB_ETL.BLOB2FILE error message ' || Chr(10) || SQLERRM;
RETURN mRet;
End;
END BLOB2FILE;
If you can select your BLOBS from any table just loop and pass it to the function/procedure with the directory and file name...
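With that in place, the loop from the question reduces to something like this sketch (directory, table and column names taken from the question):

```sql
BEGIN
  FOR x IN (SELECT rpt_file_name, rpt_contents
            FROM   sims_prod.publish_rpt_new) LOOP
    -- BLOB2FILE returns a status string; print it per row
    dbms_output.put_line (
        BLOB2FILE (x.rpt_contents, 'PDF_REP', x.rpt_file_name));
  END LOOP;
END;
/
```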

Concatenate CLOB-rows with PL/SQL

I've got a table which has an id and a clob content like:
Create Table v_example_l (
nip number,
xmlcontent clob
);
We insert our data:
Insert into V_EXAMPLE_L (NIP,XMLCONTENT)
Values (17852,'<section><block><name>delta</name><content>548484646846484</content></block></section>');
Insert into V_EXAMPLE_L (NIP,XMLCONTENT)
Values (17852,'<section><block><name>omega</name><content>545648468484</content></block></section>');
Insert into V_EXAMPLE_L (NIP,XMLCONTENT)
Values (17852,'<section><block><name>gamma</name><content>54564846qsdqsdqsdqsd8484</content></block></section>');
I'm trying to write a function that concatenates the CLOB rows returned by a select. I don't want to pass multiple parameters such as the table name; I should only have to give it the query for the column that contains the CLOBs, and it should handle the rest.
CREATE OR REPLACE function assemble_clob(q varchar2)
return clob
is
v_clob clob;
tmp_lob clob;
hold VARCHAR2(4000);
--cursor c2 is select xmlcontent from V_EXAMPLE_L where id=17852
cur sys_refcursor;
begin
OPEN cur FOR q;
LOOP
FETCH cur INTO tmp_lob;
EXIT WHEN cur%NOTFOUND;
--v_clob := v_clob || XMLTYPE.getClobVal(tmp_lob.xmlcontent);
v_clob := v_clob || tmp_lob;
END LOOP;
return (v_clob);
--return (dbms_xmlquery.getXml( dbms_xmlquery.set_context("Select 1 from dual")) )
end assemble_clob;
The function is broken... (if anybody could give me some help, thanks a lot; I'm a noob in SQL). Thanks!
You don't really say why it's broken, but the DBMS_LOB package has an APPEND procedure that might be what you're looking for.
You'll have to explain what is happening that is not what you expect; your example works fine for me. In SQL*Plus, note the need to SET LONG to a large enough value to fetch the entire CLOB contents.
dev> set long 2000
dev> select assemble_clob('select xmlcontent from v_example_l') from dual;
ASSEMBLE_CLOB('SELECTXMLCONTENTFROMV_EXAMPLE_L')
--------------------------------------------------------------------------------
<section><block><name>delta</name><content>548484646846484</content></block></se
ction><section><block><name>omega</name><content>545648468484</content></block><
/section><section><block><name>gamma</name><content>54564846qsdqsdqsdqsd8484</co
ntent></block></section>
try using
DBMS_LOB.append (v_clob,tmp_lob);
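Note that DBMS_LOB.APPEND needs an initialized destination locator, so the fetch loop in the question would become something like this sketch:

```sql
dbms_lob.createtemporary (v_clob, true);  -- initialize the destination LOB
LOOP
  FETCH cur INTO tmp_lob;
  EXIT WHEN cur%NOTFOUND;
  dbms_lob.append (v_clob, tmp_lob);
END LOOP;
```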

Converting small-ish Oracle long raw values to other types

I have an Oracle table that contains a field of LONG RAW type that contains ASCII character data. How can I write a query or view that will convert this to a more easily consumed character string? These are always going to be single-byte characters, FWIW.
Maybe
select ...., to_lob(long_raw) from old_table
(http://www.psoug.org/reference/convert_func.html)
or
UTL_RAW.CAST_TO_VARCHAR2(b)
(http://www.dbasupport.com/forums/showthread.php?t=5342).
I found this quote:
In Oracle9i, you can even:
alter table old_table modify ( c clob );
to convert it.
See here: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1037232794454
Edit
The maximum length of a VARCHAR2 column is 4000 bytes. Is that too short?
I have found this works well on CLOB data types. I would expect the same to hold true for other LOB types.
create or replace function lob2char (clob_col clob) return varchar2 IS
  buffer varchar2(4000);
  amt    BINARY_INTEGER := 4000;
  pos    INTEGER := 1;
  l_var  varchar2(4000) := '';
begin
  LOOP
    if dbms_lob.getlength (clob_col) <= 4000 THEN
      dbms_lob.read (clob_col, amt, pos, buffer);
      l_var := l_var || buffer;
      pos := pos + amt;
    ELSE
      l_var := 'Cannot convert. Exceeded varchar2 limit';
      exit;
    END IF;
  END LOOP;
  return l_var;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    -- raised by dbms_lob.read once pos is past the end of the CLOB
    return l_var;
END;
INSERT INTO NEWTABLE (NEWCOLUMN) SELECT RTRIM(lob2char(OLDCOLUMN)) FROM OLDTABLE;