Unix putting extra spaces/lines while passing values to sql statement - sql

I have a unix script like this:
value1=`sqlplus -s ivr/ivr <<EOF
set heading off;
set feedback off;
set linesize 500;
set serveroutput on;
set wrap off;
SELECT FM.getSequence_n('$b_period', '$t_period', '$opr', '$fran', '$poi_s')
FROM DUAL;
exit;
EOF`
The parameters are taken as input from the user. When I run the script in debug mode using 'ksh -x filename.sh', I notice that when the script passes the values to the SELECT statement, it breaks them up like this:
SELECT FM.getSequence_n('
2010/12/01 - 2010/12/31','
2010/12/01 - 2010/12/31','TTSLAP','UWAP','TTSL-LOC')
FROM DUAL
...which gives the wrong output. When I run the same SQL statement in sqlplus with the passed values all on a single line, I get the correct output.
I need to know why the shell is breaking the statement into multiple lines and how this can be prevented. This has been giving me nightmares. At first I thought the values were not being passed correctly and that was why the output was wrong, but this linefeed introduced while passing the values is the actual cause of the error. Please help.

It seems like your $b_period and $t_period variables contain newline characters. If they are input directly from users, you really should do some input validation before using them.
Try something like
filtered_b_period=`echo $b_period | sed 's|[^0-9/ -]||g'`
filtered_t_period=`echo $t_period | sed 's|[^0-9/ -]||g'`
(The character class keeps digits, slashes, spaces and hyphens, since your period values contain all of those; word splitting in the unquoted echo collapses the embedded newlines into single spaces.)
You should probably add further checks as well, but at least this way you can filter out unwanted characters.

The values of your variables include newlines. Do something like this to check:
echo "$b_period" | hexdump -C
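If the hexdump does show embedded newlines, a minimal sketch of stripping them before the value is interpolated into the heredoc (the sample value here is hypothetical, simulating what the asker's input looks like):

```shell
#!/bin/sh
# hypothetical sample: user input with a stray leading newline,
# like the one breaking the SELECT statement above
b_period='
2010/12/01 - 2010/12/31'

# delete carriage returns and newlines before using the value in SQL
b_period=$(printf '%s' "$b_period" | tr -d '\r\n')

echo "$b_period"
```

After this, the value arrives at sqlplus as one line and the SELECT is no longer split.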

Related

How to Pass a sqlplus parameter from shell script to sql file

I have a shell script with a variable that stores a date value obtained over a sqlplus connection, as below. I want to pass the shell variable 'var_Date' to the SQL file.
cat extract.sh
#!/bin/ksh
var_Date=`sqlplus -s $DB_USER/$DB_PASS@$DB_HOST:$DB_PORT/$DB_SID << EOF
SELECT MAX(SAMPLE_DATE)-1 FROM SAMPLE_CASES WHERE KEY IN ('ab','bc');
EXIT;
EOF`
export var_Date
echo $var_Date
sqlplus -s $DB_USER/$DB_PASS@$DB_HOST:$DB_PORT/$DB_SID @extract.sql $var_Date
cat extract.sql
set echo off
set trimspool on
set pagesize 0
set colsep ~
spool extract.csv
SELECT CASE_ID FROM SAMPLE_CASE1 WHERE TIME_START>='&1';
spool off
I have tried to execute this script but it is failing with "invalid identifier". Please help me understand what mistake I am making here.
After searching the internet for almost a week, I arrived at working code. Below is the working code.
cat extract.sh
#!/bin/ksh
var_Date=`sqlplus -s $DB_USER/$DB_PASS@$DB_HOST:$DB_PORT/$DB_SID << EOF
SET HEADING OFF
SELECT ''''||TRIM(MAX(SAMPLE_DATE)-1)||'''' FROM SAMPLE_CASES WHERE KEY IN ('ab','bc');
EXIT;
EOF`
export var_Date
echo $var_Date
sqlplus -s $DB_USER/$DB_PASS@$DB_HOST:$DB_PORT/$DB_SID @extract.sql $var_Date
cat extract.sql
set echo off
set trimspool on
set pagesize 0
set colsep ~
spool extract.csv
SELECT CASE_ID FROM SAMPLE_CASE1 WHERE to_char(TIME_START,'DD-MON-YYYY') >='&1';
spool off
Firstly, we need to pass the argument $var_Date to sqlplus within single quotes, which is done in the variable assignment itself by concatenating ''''. We also need to use SET HEADING OFF so that the variable var_Date holds only the date value and not the column header.
Secondly, there was an ORA error: "ORA-01858: a non-numeric character was found where a numeric was expected". To fix this, I converted the TIME_START field with TO_CHAR, after which the filter is applied and the result is stored in the CSV file.
Hope this helps others who had trouble like me.
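An alternative to concatenating '''' inside the SELECT is to add the quotes on the shell side after the value comes back. A sketch, using a hard-coded stand-in for the value sqlplus would return:

```shell
#!/bin/sh
var_Date='01-JAN-2019'       # stand-in for the value returned by sqlplus
var_Date="'${var_Date}'"     # wrap it in single quotes for the &1 substitution
echo "$var_Date"
```

Either way, the goal is the same: by the time &1 is substituted into the WHERE clause, the date is already surrounded by single quotes.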

Oracle SQLPlus: Echo without line numbers?

I'm working on a solution where several SQL and PL/SQL scripts are being run together, in a batch of sorts, via SQL*Plus.
I'm declaring SET ECHO OFF; and SET ECHO ON; at relevant points in the scripts so as to output relevant code.
Currently the output looks something like this:
SQL> DECLARE
2 ct number := 0;
3 ctChanges number := 0;
4
5 BEGIN
6 select count(*) into ct from ...
7 (...rest of code block...)
"some specific status message"
Commit executed.
We keep this output as a run-log in our build-environment, but can also access it as a plain text file.
One downside of this format however, is that if I'd like to copy a certain section of the code and run it again in an IDE (like Toad or SQL Developer), it's hard to exclude the line numbers.
Is it possible to tell SQL*Plus to output the code as above, but without including the line numbers?
You can use the sqlnumber and sqlprompt options:
set sqlprompt ''
set sqlnumber off
SET SQLN[UMBER] {ON|OFF}
SET SQLNUMBER is not supported in iSQL*Plus
Sets the prompt for the second and subsequent lines of a SQL command or PL/SQL block. ON sets the prompt to be the line number. OFF sets the prompt to the value of SQLPROMPT.

Store my "Sybase" query result /output into a script variable

I need a variable to keep the result retrieved from a (Sybase) query that's in a script.
I have built the following script; it works fine and I get the desired result when I run it.
Script: EXECUTE_DAILY:
isql -U database_dba -P password <<EOF!
select the_name from table_name where m_num="NUMB912" and date="17/01/2019"
go
quit
EOF!
echo "All Done"
Output:
"EXECUTE_DAILY" 97 lines, 293 characters
user#zp01$ ./EXECUTE_DAILY
the_name
-----------------------------------
NAME912
(1 row affected)
But now I would like to keep the output(the_name: NAME912) in a variable.
So far this is basically what I'm trying with no success.
variable=$(isql -U database_dba -P password -se "select the_name from table_name where m_num="NUMB912" and date="17/01/2019" ")
But, is not working. I can't save NAME912 in a variable.
You need to parse the output for the desired string/piece-of-data that you wish to store in your variable. I tend to make my life a bit easier by making sure I can easily/quickly search/parse out what I want.
Keeping a few issues in mind ...
I tend to use isql -s"|" -w10000 to ensure (most of the time) that a) the result set has all columns delimited with the pipe ('|') and b) a single row of data does not span multiple rows; the pipe delimiter makes it easier to parse out columns that may contain white space; obviously (?) use a different delimiter if a pipe may be part of your actual data
to make parsing of the isql output a bit easier I tend to add a unique, grep-able (literal) string to the rows that I'm looking to search/parse
some databases (eg, SQLAnywhere, Oracle) tend to mimic a literal value as the column header if said literal string has not been assigned an explicit alias/header; this means that if you do a simple search on your literal string then you'll get a match for the result set header as well as the actual data row
I tend to capture all isql output to a temporary file; this allows for easier follow-on processing, eg, error checking, data parsing, dumping contents to a logfile, etc
So, with the above in mind my code typically looks something like:
$ outfile=/tmp/.$$.isql.outfile
$ isql -s"|" -w10000 -U database_dba -P password <<-EOF > ${outfile} 2>&1
-- 'GREP'||'ME' ensures that 'GREPME' only shows up in the data row
select 'GREP'||'ME',the_name
from table_name
where m_num = "NUMB912"
and date = "17/01/2019"
go
EOF
$ cat ${outfile}
... snip ...
|'GREP'||'ME'|the_name | # notice the default column header = 'GREP'||'ME' which won't match my search for 'GREPME'
|------------|----------|
|GREPME |NAME912 | # this is the line I want to search/parse
... snip ...
$ read -r namevar < <(egrep GREPME ${outfile} | awk -F"|" '{print $3}')
$ echo ${namevar}
NAME912
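The grep/awk step can be tried standalone by simulating the pipe-delimited isql output; the rows below mimic the example output above, and the rest is the same parse pipeline:

```shell
#!/bin/sh
outfile=$(mktemp)

# simulated isql -s"|" output; note the header row's 'GREP'||'ME'
# literal will NOT match the search for GREPME
cat > "$outfile" <<'EOF'
|'GREP'||'ME'|the_name |
|------------|---------|
|GREPME      |NAME912  |
EOF

# pick the marker row, split on '|', trim trailing blanks from column 3
namevar=$(grep GREPME "$outfile" | awk -F'|' '{gsub(/ *$/, "", $3); print $3}')
echo "$namevar"

rm -f "$outfile"
```

This is why the literal is written as 'GREP'||'ME' in the SQL: the concatenation only produces GREPME in the data row, so the header never matches.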

Reuse same sql clause in script

The case is that I have an SQL clause inside a unix script like:
sqlplus -s user/pass << END_SQL >> outfile.txt
set echo off feedback off heading off tab off;
select .....
from ....
where ...
and ...
and ... ;
END_SQL
If outfile.txt is not empty, which means that I got a result from the SQL above, then I run an update statement that should change some DB elements.
Then I need to reuse the same SQL above to check whether those DB elements have indeed changed. So, is it possible to reuse this same SQL, but WITHOUT including the same SQL code again later in the script: run it again and, moreover, even put the result in another output file, e.g. outfile2.txt?
You can use the RETURNING ... INTO ... clause inside the script:
UPDATE myTable
SET col1 = <something1>
WHERE col2 = <something2>
RETURNING col3, col1 INTO v_col3, v_col1;
to return the results into the variables v_col3 and v_col1.
You could put your hairy SELECT query in a file, say select.sql. Then whenever you need to run the SQL, you could just do :
sqlplus -s user/pass @select.sql >> outfile.txt
You can adapt the output file as you wish :
sqlplus -s user/pass @select.sql >> outfile2.txt
NB: you said
If the outfile.txt is not empty, which means that I get a result from the above SQL
You probably want to use > when writing to outfile.txt : >> appends to the file, while > replaces it.
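Putting it together, the check-update-recheck flow can be sketched with a stub standing in for the database call (run_query here is hypothetical; in the real script it would be `sqlplus -s user/pass @select.sql`):

```shell
#!/bin/sh
# run_query is a stand-in for: sqlplus -s user/pass @select.sql
run_query() {
    echo "some row"
}

run_query > outfile.txt              # first check; > truncates the file
if [ -s outfile.txt ]; then
    : # run the UPDATE statement here
    run_query > outfile2.txt         # same query again, second output file
fi

result=$(cat outfile2.txt)
echo "$result"
rm -f outfile.txt outfile2.txt
```

Because the SELECT lives in its own .sql file, both invocations share one copy of the query and only the redirection target changes.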

loop sql query in a bash script

I need to loop an Oracle sqlplus query using bash.
My scenario is like this: I have a set of names in a text file, and I need to find out details for those names using a sqlplus query.
textfile.txt content:
john
robert
samuel
chris
bash script
#!/bin/bash
while read line
do
/opt/oracle/bin/sqlplus -s user@db/password @query.sql $line
done < /tmp/textfile.txt
sql query: query.sql
set verify off
set heading off
select customerid from customers where customername like '%&1%';
exit
The problem is that when I run the script, I get errors like
SP2-0734: unknown command beginning
"robert..." - rest of line ignored.
Can someone tell me how to solve this?
The way I do this all the time is as follows:
#!/bin/bash
cat textfile.txt |while read Name
do
sqlplus -s userid/password@db_name > output.log <<EOF
set verify off
set heading off
select customerid from customers where customername like '%${Name}%'
/
exit
EOF
done
Bash will automagically expand ${Name} for each line and place it into the SQL command before sending it to sqlplus.
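You can see that expansion without a database by letting cat stand in for sqlplus; each heredoc body already contains the substituted name by the time it would reach sqlplus:

```shell
#!/bin/sh
printf 'john\nrobert\n' > textfile.txt

# cat stands in for sqlplus; the heredoc it receives already
# has ${Name} expanded by the shell
out=$(while read Name
do
  cat <<EOF
select customerid from customers where customername like '%${Name}%';
EOF
done < textfile.txt)

echo "$out"
rm -f textfile.txt
```

Because the substitution happens in the shell, there is no &1 positional parameter involved, which is what tripped up the original per-line @query.sql approach.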
Do you have SET DEFINE ON? Is your substitution character &? You could check glogin.sql to find out.
And yes, establishing n connections to run n queries is probably not a good solution. Maybe it's faster for you to develop, and acceptable if you only run it once; but if not, you should think about writing a stored procedure instead.