PL/SQL TEXT_IO package

I am trying to write to a local file from a PL/SQL script. In order to do this, I am attempting to use the TEXT_IO package in PL/SQL.
DECLARE
  file_out  text_io.file_type;
  len       NUMBER;
  blob_file BLOB;
  my_var    RAW(50);
  bstart    NUMBER := 1;
  bytelen   NUMBER := 50;
BEGIN
  SELECT xxx
    INTO blob_file
    FROM yyy
   WHERE zzz;
  dbms_lob.read(blob_file, bytelen, bstart, my_var);
  file_out := text_io.fopen('local_file_path', 'w');
  text_io.put_raw(file_out, my_var);
  text_io.fflush(file_out);
  text_io.fclose(file_out);
END;
/
quit
However, when I run this script I get the error,
PLS-00201: identifier 'TEXT_IO.FILE_TYPE' must be declared
Does anyone know how I can fix this error, and how I can write the contents of the blob to a file as I am attempting to do?
Thanks,
ktm

TEXT_IO exists only in Oracle Forms, which had (in the old client/server days) a client-side PL/SQL interpreter. If you are using SQL*Plus to execute PL/SQL, as it appears you are doing here, the TEXT_IO package will not be available and you will not be able to write to a file on the client machine (barring the odd setup where the server mounts a drive that your client is exposing and then writes to that mount).
Now, you can generally use SQL*Plus to write directly to a local file with the SPOOL command. Unfortunately, that is unlikely to work for a BLOB in the general case.

If you want to create a file on the server UTL_FILE is a good choice.
This package can write files in any DIRECTORY specified in the database. A DIRECTORY is created in Oracle using CREATE DIRECTORY and can be linked to any writable directory accessible by the DBMS (server-side).
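For example, a minimal sketch of exporting the question's BLOB with UTL_FILE might look like this; EXPORT_DIR and blob_out.dat are illustrative names, and the directory object must already have been created with CREATE DIRECTORY and point to a server-side path the database can write to:
-- Minimal sketch, not a drop-in solution: xxx/yyy/zzz are from the question,
-- EXPORT_DIR and blob_out.dat are assumed names.
DECLARE
  l_blob   BLOB;
  l_file   UTL_FILE.FILE_TYPE;
  l_buffer RAW(32767);
  l_amount BINARY_INTEGER;
  l_pos    INTEGER := 1;
  l_len    INTEGER;
BEGIN
  SELECT xxx INTO l_blob FROM yyy WHERE zzz;
  l_len  := DBMS_LOB.GETLENGTH(l_blob);
  -- 'wb' opens the server-side file in binary mode
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'blob_out.dat', 'wb', 32767);
  WHILE l_pos <= l_len LOOP
    l_amount := 32767;
    DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
    UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
    l_pos := l_pos + l_amount;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/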

The general approach is: write the file on the server and download it. Or even better, don't write it to disk at all, just stream it. Quite complicated, yes.

Related

Replacing a word in a db2 sql file causes DSNC105I : End of file reached while reading the command error

I have a dynamic sql file in which the value of TBCREATOR changes according to a parameter.
I use a simple python script to replace TBCREATOR=<variable here> and write the result to an output sql file.
Calling this file using db2 -td# -vf <generated sql file> gives
DSNC105I : End of file reached while reading the command
Here is the file in which I need the TBCREATOR variable replaced:
CONNECT to 204.90.115.200:5040/DALLASC user *** using ****#
select REMARKS from sysibm.SYSCOLUMNS WHERE TBCREATOR='table' AND NAME='LCODE'
#
Here is the python script:
#!/usr/bin/python3
# ------ replace table value with schema name
# print(list_of_lines)
fin = open("decrypt.sql", "rt")
# output file to write the result to
fout = open("decryptout.sql", "wt")
for line in fin:
    fout.write(line.replace('table', 'ZXP214'))
fin.close()
fout.close()
After decryptout.sql is generated I call it using db2 -td# -vf decryptout.sql
and get the error given above.
What's irritating is that I have another sql file containing exactly the same data as decryptout.sql, and it runs smoothly with the db2 -td# -vf ... command. I used the unix command cmp to compare the generated file with the one I wrote by hand (with the variable ZXP214 already replaced), but there are no differences. What is causing this error?
Here is the file (which executes without error) that I compare the generated output with:
CONNECT to 204.90.115.200:5040/DALLASC user *** using ****#
select REMARKS from sysibm.SYSCOLUMNS WHERE TBCREATOR='ZXP214' AND NAME='LCODE'
#
I found that, specifically on the https://ibmzxplore.influitive.com/ challenge, if you are using the java db2 command and working in the Zowe USS system (Unix System Services of z/OS), there is a conflict of character sets. I believe the system will generally create files in EBCDIC format, whereas if you do
echo "CONNECT ..." > syscat.clp
the resulting file will be tagged as ISO8859-1 and will not be processed properly by db2. Instead, go to the USS interface and choose "create file", give it a folder and a name, and it will create the file untagged. You can use
ls -T
to see the tags. Then edit the file to give it the commands you need, and db2 will interoperate with it properly. Because you are creating the file with python, you may be running into similar issues. When you open the new file, use something like
open(input_file_name, mode="w", encoding="cp1047")
This makes sure the file is opened as an EBCDIC file.
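For instance, a hedged variant of the replacement script above with explicit EBCDIC encoding might look like this (cp1047 is an assumption; match it to how your files are actually tagged):
#!/usr/bin/python3
# Sketch only: open both files with an explicit EBCDIC code page so the
# generated file matches what the java db2 command expects on USS.
# cp1047 is an assumed code page; adjust it to your system's tagging.
with open("decrypt.sql", "rt", encoding="cp1047") as fin, \
     open("decryptout.sql", "wt", encoding="cp1047") as fout:
    for line in fin:
        fout.write(line.replace('table', 'ZXP214'))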
If you are using the Db2-LUW CLP (command line processor), which is written in C/C++ and runs on Windows/Linux/Unix, then your syntax for CONNECT is not valid.
Unfortunately your question is ambiguously tagged, so we cannot tell which Db2-server platform you actually use.
For Db2-LUW with the classic C/C++ db2 command, the syntax for a type-1 CONNECT statement does not allow a connection-string (or partial connection string) as shown in your question. For the Db2-LUW db2 CLP, the target database must be defined externally (i.e. not inside the script), either via the legacy actions of catalog tcpip node ... combined with catalog database ..., or in the db2dsdriver.cfg configuration file as plain XML.
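For illustration, a sketch of the legacy catalog approach, reusing the host and port from the question (the node name, alias, user, and password are placeholders):
db2 catalog tcpip node mynode remote 204.90.115.200 server 5040
db2 catalog database DALLASC as dallasc at node mynode
db2 connect to dallasc user myuser using mypassword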
If you want to use connection-strings, you can use the clpplus tool, which is available with some Db2-LUW client packages and is present on currently supported Db2-LUW servers. It lets you use Oracle-style scripting with Db2. Refer to the online documentation for details.
If you are not using the classic C/C++ db2 command, and are instead using the emulated CLP written in Java that is only available with z/OS USS, then you must open a ticket with IBM support for that component, as that is not a matter for stackoverflow.

Generate a .log file plsql / oracle

What I want to do is generate a .log file that describes all my errors, start time, end time, and so on. I found a way to get something like that, but not in the correct way.
I want to generate that file automatically, without having to define it manually.
From what I understand, UTL_FILE.FOPEN creates the file when it cannot find it.
My app is working. The question is: how do I generate a file (a .log file) in PL/SQL without creating it manually?
create or replace procedure read_files(input varchar2) as
  F2 UTL_FILE.FILE_TYPE;
begin
  F2 := UTL_FILE.FOPEN('FOLDER', input || '.log', 'w');
  UTL_FILE.put_line(F2, 'Start processing file at : ' || systimestamp);
  UTL_FILE.put_line(F2, 'End processing file at : ' || systimestamp);
  -- Close file
  UTL_FILE.FCLOSE(F2);
end;
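Note that 'FOLDER' above must be an existing DIRECTORY object pointing to a path the database's OS user can write to. A minimal sketch (the path and grantee are illustrative):
CREATE OR REPLACE DIRECTORY FOLDER AS '/u01/app/oracle/logs';
GRANT READ, WRITE ON DIRECTORY FOLDER TO my_app_user;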
I found the problem! In the location where I stored my files, I had no rights to create files/folders. Thanks all!

how to give relative path of local filesystem in pl/sql block

I am trying to insert a CLOB from an XML file which is in my local file system. Below is the pl/sql block.
declare
  xmlClobFile BFILE := BFILENAME(BFILE_DIR, 'clob.xml');
  tempClob    CLOB;
begin
  EXECUTE IMMEDIATE 'CREATE OR REPLACE DIRECTORY BFILE_DIR AS '||''''||'/home/abc/data/emp/clobs'||'''';
  -- CLOB INSERT
  DBMS_LOB.createtemporary(tempClob, TRUE);
  DBMS_LOB.open(xmlClobFile, DBMS_LOB.lob_readonly);
  DBMS_LOB.loadfromfile(tempClob, xmlClobFile, DBMS_LOB.lobmaxsize);
  EXECUTE IMMEDIATE 'insert into emp_data (id, clob_data) values (1000, :1)' using tempClob;
end;
/
Here when I give the absolute path (/home/abc/data/emp/clobs) it works. But when I give a relative path (like data/emp/clobs) and run this sql from /home/abc, it doesn't work.
[exec] ERROR at line 1:
[exec] ORA-22285: non-existent directory or file for FILEOPEN operation
[exec] ORA-06512: at "SYS.DBMS_LOB", line 937
[exec] ORA-06512: at line 57
How can I provide a relative path here? I want this to run on any machine, not just mine.
The relative path, if it resolves at all, will be relative to the directory from which the Oracle "start" command was run, e.g. /home/oracle. One way to test this, and to verify that relative paths work (I have not used them myself), is to create a directory pointing to ".", run a test that creates a file, and then search for that file. The directory you find the file in will be your start path. However, I think this is unsafe, since Oracle could (potentially) be started from any folder, depending on whether it is auto-started or on whichever DBA was on hand to start it.
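A sketch of that test (directory and file names are illustrative):
CREATE OR REPLACE DIRECTORY REL_TEST_DIR AS '.';
DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('REL_TEST_DIR', 'where_am_i.txt', 'w');
  UTL_FILE.PUT_LINE(f, 'marker');
  UTL_FILE.FCLOSE(f);
END;
/
-- then, on the database server: find / -name where_am_i.txt 2>/dev/null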
It must be like this:
xmlClobFile BFILE := BFILENAME('BFILE_DIR', 'clob.xml');

stata odbc sqlfile

I am trying to load data from a database (either MS Access or SQL Server) using odbc sqlfile. The code seems to run without any error, but I am not getting any data. I am using the following code: odbc sqlfile("sqlcode.sql"), dsn("mysqlodbcdata"). Note that sqlcode.sql contains just a SELECT sql statement. The thing is that the same sql code returns data with odbc load, exec(sqlstmt) dsn("mysqlodbcdata"). Can anyone suggest how I can use odbc sqlfile to import data? This would be a great help for me.
Thanks
Joy
sqlfile doesn't load any data. It just executes the SQL (and displays the results when the loud option is specified) without bringing anything into Stata. That's somewhat counter-intuitive, but true. The reasons are somewhat opaquely explained in the pdf/dead tree manual entry for the odbc command.
Here's a more helpful answer. Suppose you have your SQL file named sqlcode.sql. You can open it in Stata (as long as it's not too long, where too long depends on your flavor of Stata). Basically, -file read- reads the SQL code line by line, storing the results in a local macro named exec. Then you pass that macro as an argument to the -odbc load- command:
Updated Code To Deal With Some Double Quotes Issues
Cut & paste the following code into a file called loadsql.ado, which you should put in a directory where Stata can see it (like ~/ado/personal). You can find such directories with the -adopath- command.
program define loadsql
*! Load the output of an SQL file into Stata, version 1.3 (dvmaster#gmail.com)
version 14.1
syntax using/, DSN(string) [User(string) Password(string) CLEAR NOQuote LOWercase SQLshow ALLSTRing DATESTRing]
#delimit;
tempname mysqlfile exec line;
file open `mysqlfile' using `"`using'"', read text;
file read `mysqlfile' `line';
while r(eof)==0 {;
local `exec' `"``exec'' ``line''"';
file read `mysqlfile' `line';
};
file close `mysqlfile';
odbc load, exec(`"``exec''"') dsn(`"`dsn'"') user(`"`user'"') password(`"`password'"') `clear' `noquote' `lowercase' `sqlshow' `allstring' `datestring';
end;
/* All done! */
The syntax in Stata is
loadsql using "./sqlfile.sql", dsn("mysqlodbcdata")
You can also add all the other odbc load options, such as clear, as well. Obviously, you will need to change the file path and the odbc parameters to reflect your setup. This code should do the same thing as -odbc sqlfile("sqlfile.sql"), dsn("mysqlodbcdata")- plus actually load the data.
I also added the functionality to specify your DB credentials like this:
loadsql using "./sqlfile.sql", dsn("mysqlodbcdata") user("user_name") password("not12345")
For "--XYZ" style comments, do something like this (assuming you don't have "--" in your SQL code):
if strpos(`"``line''"', "--") > 0 {;
local `line' = substr(`"``line''"', 1, strpos(`"``line''"', "--")-1);
};
I had to post this as an answer otherwise the formatting would've been all messed up, but it's obviously referring to Dimitriy's code.
(You could also define a local macro holding the position of the "--" string to make your code a little cleaner.)
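For instance, a sketch of that variant (dashpos is just an illustrative macro name):
if strpos(`"``line''"', "--") > 0 {;
local dashpos = strpos(`"``line''"', "--");
local `line' = substr(`"``line''"', 1, `dashpos' - 1);
};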

How to Store BLOB data in Sqlite Using Tcl

I have a Tcl/Tk application that has an SQLite back-end. I pretty much understand the syntax for inserting, manipulating, and reading string data; however, I do not understand how to store pictures or files in SQLite with Tcl.
I do know I have to create a column that holds BLOB data in SQLite. I just don't know what to do on the Tcl side of things. If anyone knows how to do this or has a good reference to suggest, I would really appreciate it.
Thank you,
Damion
In my code, I basically open the file as a binary, load its content into a Tcl variable, and stuff that into the SQLite db. So, something like this...
# load the file's contents
set fileID [open $file RDONLY]
fconfigure $fileID -translation binary
set content [read $fileID]
close $fileID
# store the data in a blob field of the db
$db eval {INSERT OR REPLACE INTO files (content) VALUES ($content)}
Obviously, you'll want to season to taste, and your table will probably contain additional columns...
The incrblob command looks like what you want: http://sqlite.org/tclsqlite.html#incrblob
The "incrblob" method
This method opens a TCL channel that can be used to read or write into a preexisting BLOB in the database. The syntax is like this:
dbcmd incrblob ?-readonly? ?DB? TABLE COLUMN ROWID
The command returns a new TCL channel for reading or writing to the BLOB. The channel is opened using the underlying sqlite3_blob_open() C-language interface. Close the channel using the close command of TCL.
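To complement the $db eval example above, a minimal sketch of loading a file through an incrblob channel might look like this (the files table, content column, and $file variable are carried over from the earlier example; the row must already exist at its final size, hence zeroblob):
# sketch only: create a zero-filled BLOB of the right size, then stream into it
set filesize [file size $file]
$db eval {INSERT INTO files (content) VALUES (zeroblob($filesize))}
set rowid [$db last_insert_rowid]
set blobchan [$db incrblob files content $rowid]
fconfigure $blobchan -translation binary
set filechan [open $file rb]
fcopy $filechan $blobchan
close $filechan
close $blobchan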