SQL file fetching value without quotes

@echo off
REM Build YYYYMMDD_hhmmss format date/time stamp for new file name
set Stamp=%DATE:~-4%%DATE:~-10,2%%DATE:~-7,2%_%TIME:~0,8%
set Stamp=%Stamp::=%
set Stamp=%Stamp: =0%
set data_file=D:\Oracle\XML\Dump\XMLBusiness-%Stamp%.xml
set log_file=D:\Oracle\XML\Log\XMLBusiness-%Stamp%.log
set SUBJECT_AREA='ENITITY'
set STATUS='COMPLETED'
exit | sqlplus -S xx/yy@database @"C:\Documents and Settings\Desktop\XML\insert_audit_table.sql" %SUBJECT_AREA% %STATUS% "SYSTIMESTAMP" > %log_file%
I am using the above code to pass values from a batch file to SQL*Plus, but I am getting this error:
old 1: INSERT INTO XML_AUDIT VALUES(&1,&2,&3)
new 1: INSERT INTO XML_AUDIT VALUES(BUSINESS_ENTITY,COMPLETED,SYSTIMESTAMP)
INSERT INTO XML_AUDIT VALUES(BUSINESS_ENTITY,COMPLETED,SYSTIMESTAMP)
*
ERROR at line 1:
ORA-00984: column not allowed here
Please help me with this. How can I enclose these values in single quotes?

I didn't completely understand your question, but per the error message Oracle cannot insert the text as given: unquoted values are parsed as column names. You should use single quotes for them. Make sure the values come through as below:
INSERT INTO XML_AUDIT VALUES('BUSINESS_ENTITY','COMPLETED',SYSTIMESTAMP)
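For illustration, the same failure mode can be reproduced with Python's built-in sqlite3 module standing in for Oracle (the table here is a made-up stand-in for XML_AUDIT): without quotes the values are parsed as column names and the insert fails, with quotes they are string literals and it succeeds.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE xml_audit (subject_area TEXT, status TEXT, ts TEXT)")

# Without quotes the values are parsed as column names and the insert
# fails -- the same class of failure Oracle reports as ORA-00984.
try:
    con.execute("INSERT INTO xml_audit VALUES(BUSINESS_ENTITY, COMPLETED, CURRENT_TIMESTAMP)")
    error = None
except sqlite3.OperationalError as e:
    error = str(e)
print("unquoted insert failed:", error)

# With single quotes the values are string literals and the insert succeeds.
con.execute("INSERT INTO xml_audit VALUES('BUSINESS_ENTITY', 'COMPLETED', CURRENT_TIMESTAMP)")
print(con.execute("SELECT subject_area, status FROM xml_audit").fetchall())
```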


TIBScript and local variables

I am working with Delphi 7 and Firebird 2.0. In my application I use TIBScript components. The problem arises when I use local variables in the script: Firebird requires the names of local variables to be preceded by a colon in some cases, and that is where the problem lies. The application stops, showing the error message:
Dynamic SQL Error
SQL error code = -104
Token unknown - line 4, column 66
?
The token in question is the colon. Here is what my script looks like:
SET TERM ^ ;
EXECUTE BLOCK AS
DECLARE test_variable INT;
BEGIN
SELECT tt.id FROM test_table tt WHERE tt.name LIKE 'abc%' INTO :test_variable;
INSERT INTO test_table2(id, test_column)
VALUES(1, :test_variable);
INSERT INTO test_table3(id, test_column)
VALUES(1, :test_variable);
...
END^
SET TERM ; ^
The same script executes without any errors when run from IBExpert.
How can I use local variables in a TIBScript? Any help would be appreciated!
I want to add that this problem occurs only with variables inside an EXECUTE BLOCK construct. There is no problem with local variables in stored procedure and trigger definitions.
After executing the method TIBSQL.PreprocessSQL (unit IBX.IBSQL, line 2362), parameters marked with a leading ":" are replaced by "?". So you should use the variables without the ":". Also, the SET TERM statements should be removed; to set the terminator, use the IBScript.Terminator property instead.
P.S. I checked unit IBX.IBSQL in Delphi 10.3 Rio.
This script:
EXECUTE BLOCK AS
DECLARE test_variable INT;
BEGIN
SELECT tt.id FROM USERS tt WHERE (tt.fname LIKE 'abc%') INTO test_variable;
END;
is executed properly when
IBScript.Terminator = ^;
Edit:
You can't execute an INSERT with parameters in an EXECUTE BLOCK using the TIBScript component.
As Mark Rotteveel commented:
Unfortunately, removing the colon is only an option in the INTO clause,
not with other occurrences of local variables or parameters.
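As an aside, the distinction between named ":var" markers and positional "?" markers exists in other database APIs as well; Python's sqlite3 driver, for instance, accepts both styles, and the "?" form is exactly what IBX rewrites ":" parameters into. A small sketch (unrelated to Firebird itself, table name borrowed from the example above):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE test_table2 (id INTEGER, test_column INTEGER)")

# Named ':var' placeholders, bound from a dict...
con.execute("INSERT INTO test_table2 VALUES (:id, :val)", {"id": 1, "val": 42})
# ...and the equivalent positional '?' form.
con.execute("INSERT INTO test_table2 VALUES (?, ?)", (2, 43))
print(con.execute("SELECT id, test_column FROM test_table2 ORDER BY id").fetchall())
```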

Need double quotes in each column while spooling data from oracle database to Excel file

I am automating a process wherein I run a SQL query through batch and the output is spooled into a csv file.
Requirement: Each field of csv file should have double quotes.
Ex :
Currently the output is PROJ_SHORT_NAME,WBS_SHORT_NAME
CGL1,CGL1
Required output is "PROJ_SHORT_NAME","WBS_SHORT_NAME"
"CGL1","CGL1"
SQL Query :
set verify off
set trimout off
set trimspool off
set feedback off
set linesize 22000
set pagesize 200
col csv_string FORMAT a1200
set colsep ','
SET UNDERLINE OFF
SET ECHO OFF
SPOOL E:\PDE_GPO\outputfile1.csv
select * from <tablename>;
SPOOL OFF
exit;
The || operator concatenates items together.
You can do:
SELECT '"'||col1||'","'||col2||'","'||...
FROM table
That would produce something like:
row 1: "col1val","col2val","col3val"...
row 2: ...
The downside is that you have to list/know every column you want to pull, but best coding practice says you should specify the columns anyway (in case columns are added or removed, you want to be sure you get what you expect).
-Jim
I found the solution for this.
I had to add SET MARKUP CSV ON to the script, and the issue was resolved.
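For comparison, outside SQL*Plus the same quote-every-field output can be produced with Python's csv module using QUOTE_ALL (the column names and values here are just the ones from the example):

```python
import csv
import io

rows = [("PROJ_SHORT_NAME", "WBS_SHORT_NAME"), ("CGL1", "CGL1")]
buf = io.StringIO()
# QUOTE_ALL wraps every field in double quotes, matching the required output.
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerows(rows)
print(buf.getvalue())
```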

which set command to display entire line in sql output to txt file

I am trying to output a query result to a txt file on Windows. The query is executed in Oracle. I want to export each record in full, but the output gets cut off at the end; the query itself displays the full line.
I thought the command SET LINESIZE 2000 would do the trick, but no luck.
Getting:
2702M11F13-XL 38550116-06 Test 3 325 http://www.test.com/clot
Should get (what shows in query output):
2702M11F13-XL 38550116-06 Text 3 325 http://www.test.com/clothing/outerwear/coats/test/hybridge-lite-vest/p/38550116 CAD
Please help.
Thanks in advance
It should be possible using SET COLSEP.
Refer to the Oracle SQL*Plus documentation for SET COLSEP.
Note that there is also SET TAB {ON|OFF}, which converts tabs to spaces.

How can I update a single field in sqlite3 with the contents of a file?

This is equivalent to my earlier question here, but for sqlite.
As before, I am trying to do the following using the sqlite3 command line client.
UPDATE my_table set my_column=CONTENT_FROM_FILE where id=1;
I have looked at the documentation on .import, but that seems to be a little heavyweight for what I am trying to do.
What is the correct way to set the value of one field from a file?
The method I seek should not impose constraints on the contents of the file.
Assuming the file content is all UTF-8 text and doesn't contain any quote characters that would be misinterpreted, you could do this (assuming a POSIX shell; on Windows, try Cygwin):
$ echo "UPDATE my_table set my_column='" > temp.sql
$ cat YourContentFile >> temp.sql
$ echo "' where id=1;" >> temp.sql
$ sqlite3
SQLite version 3.7.13 2012-07-17 17:46:21
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite> .read temp.sql
If the content does have single quotes, escape them first with a simple find-and-replace (you'd need to do that anyway).
hth!
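If scripting the update from another language is an option, a bound parameter sidesteps the quoting problem entirely, since the file content is never spliced into the SQL text. A sketch with Python's sqlite3 (the file is a throwaway created for the demo; the table mirrors the question's):

```python
import os
import sqlite3
import tempfile

# A throwaway file whose content includes characters that would break
# naive quoting: an apostrophe, double quotes, and a newline.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w", encoding="utf-8") as f:
    f.write('it\'s got "quotes" and\nnewlines')

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE my_table (id INTEGER, my_column TEXT)")
con.execute("INSERT INTO my_table VALUES (1, NULL)")

# The '?' placeholder binds the content as-is -- no escaping needed.
with open(path, encoding="utf-8") as f:
    con.execute("UPDATE my_table SET my_column = ? WHERE id = 1", (f.read(),))

print(con.execute("SELECT my_column FROM my_table WHERE id = 1").fetchone()[0])
os.remove(path)
```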
See: http://www.sqlite.org/cli.html#fileio
sqlite> INSERT INTO images(name,type,img)
...> VALUES('icon','jpeg',readfile('icon.jpg'));
In your case:
UPDATE my_table set my_column=readfile('yourfile') where id=1;
If you don't have readfile, you need to .load the module first.
Note
I found that the provided fileio module (http://www.sqlite.org/src/artifact?ci=trunk&filename=ext/misc/fileio.c) uses sqlite3_result_blob. When I used it in my project with text columns, Chinese characters were inserted into the table rather than the bytes read from the file. This can be fixed by changing it to sqlite3_result_text. See http://www.sqlite.org/loadext.html for instructions on building and loading run-time extensions.

Unix putting extra spaces/lines while passing values to sql statement

I have a unix script like this:
value1=`sqlplus -s ivr/ivr <<EOF
set heading off;
set feedback off;
set linesize 500;
set serveroutput on;
set wrap off;
SELECT FM.getSequence_n('$b_period', '$t_period', '$opr', '$fran', '$poi_s')
FROM DUAL;
exit;
EOF`
The parameters are taken as input from the user. When I run the script in debug mode using 'ksh -x filename.sh', I notice that when the unix script passes values to the SELECT statement, it breaks them like this:
SELECT FM.getSequence_n('
2010/12/01 - 2010/12/31','
2010/12/01 - 2010/12/31','TTSLAP','UWAP','TTSL-LOC')
FROM DUAL
...which gives the wrong output. When I run the same SQL statement in sqlplus with the passed values all on a single line, I get the correct output.
I need to know why unix is breaking the statement into multiple lines and how this can be removed. This has been giving me nightmares. At first I thought the values were not being passed correctly and that was why the output was wrong, but this linefeed inserted while passing the values is the actual cause of the error. Please help.
It seems like your $b_period and $t_period variables contain newline characters. If they are input directly from users, you really should do some input validation before using them.
Try stripping the newlines, something like
filtered_b_period=`echo "$b_period" | tr -d '\n'`
filtered_t_period=`echo "$t_period" | tr -d '\n'`
Probably you should add further checks as well (for example, allowing only digits, slashes, spaces, and hyphens in a date range), but at least you should be able to filter out the characters that break the statement this way.
The values of your variables include newlines. Do something like this to check:
echo "$b_period" | hexdump -C
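To see why the embedded newline breaks the statement, here is a minimal sketch in plain Python string handling (no database involved; the elided arguments are just stand-ins):

```python
# A date range that picked up a stray newline from user input.
b_period = "2010/12/01 - 2010/12/31\n"

sql = "SELECT FM.getSequence_n('%s', ...) FROM DUAL" % b_period
# The closing quote now sits on the next line -- this is what sqlplus sees.
print(repr(sql))

# Stripping the newline restores a single-line statement.
cleaned = b_period.strip()
sql_ok = "SELECT FM.getSequence_n('%s', ...) FROM DUAL" % cleaned
print(repr(sql_ok))
```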