Impossible to capture sql error with shell script - sql

I have 2 shell scripts. One has the database connection, like this (script1.sh):
#!/usr/bin/ksh
query="$1"
sqlplus -s /nolog <<EOF
whenever sqlerror exit 3
connect $user/$pass@sid
${query}
EOF
echo $?
if [ 0 -ne "$?" ]; then
exit 1
fi
and the other is a bigger shell script where I execute SQL commands like this:
#!/usr/bin/ksh
set -x
$PATH/script1.sh "
--set serveroutput on
--set feedback off
insert into table (column) values ('$1');
commit;
"
if [[ $? != 0 ]]
then
echo "Error"
exit 3
else
echo "Ok"
fi
............
..............
The problem is that this second script doesn't detect errors in the SQL commands and always continues running the rest of the code. I added traces and verified that the return code is always 0.
Could you help me detect errors when the SQL fails? Thanks!
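A likely culprit, offered as a hedged suggestion: in script1.sh the line echo $? runs before the test, so the if actually checks the exit status of that echo (always 0) rather than of the SQL client. A minimal sketch of script1.sh with the status captured first, assuming SQL*Plus is invoked as sqlplus -s /nolog (that invocation is an assumption; $user, $pass and sid come from the original script):
#!/usr/bin/ksh
query="$1"
sqlplus -s /nolog <<EOF
whenever sqlerror exit 3
connect $user/$pass@sid
${query}
EOF
rc=$?                       # capture the status before any other command overwrites $?
echo "sqlplus exit code: $rc"
if [ "$rc" -ne 0 ]; then
    exit 1                  # propagate the failure so the calling script sees a non-zero code
fi
exit 0
With that change, the $? check in the second script will see a non-zero code whenever the SQL fails.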

Related

How to run Oracle pl/sql or select query within a case statement in unix shell script

I am trying to run a select statement within a case statement in a Unix shell script, but I am getting an unexpected end of file error.
I want to run a particular select statement depending on the output of a previous SQL script run in the shell script. The output from the previous SQL script is spooled to a log, the required pattern is fetched into a variable, and that variable is used in the case statement.
My script
#!/usr/bin/sh
exec > check_company_details.log 2>&1
sqlplus username/password@database << EOF
@check_company_details.sql $1
exit;
EOF
pool=$(cat company.log | grep dbPool | awk {'print $5'})
#everything is working up to the above steps
#if the sqlplus command is removed from the case statements below, the correct echo output is returned
case $pool in
dbpool1)
    echo "DBPool is POOL1"
    sqlplus username/password@database<<EOF
    select name from v\$database;
    exit;
    EOF
    ;;
dbpool2)
    echo "DBPool is POOL1"
    sqlplus username/password@database<<EOF
    select name from v\$database;
    exit;
    EOF
    ;;
dbpool3)
    echo "DBPool is DC4POOL1"
    sqlplus username/password@database<<EOF
    select name from v\$database;
    exit;
    EOF
    ;;
*)
    echo No Results
    ;;
esac
Error message:
*./check_company_details.sh: line 37: syntax error: unexpected end of file*
A here doc end string should not have leading whitespace. This means you should rewrite
dbpool3)
    echo "DBPool is DC4POOL1"
    sqlplus username/password@database<<EOF
    select name from v\$database;
    exit;
    EOF
as
dbpool3)
    echo "DBPool is DC4POOL1"
    sqlplus username/password@database<<EOF
    select name from v\$database;
    exit;
EOF
and the same goes for the other cases.
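As an aside: if you would rather keep the here-doc body and its terminator visually indented, the shell's <<- form strips leading tab characters (tabs only, not spaces) from each line, including the line holding EOF. A sketch of the same branch, where the indentation below is meant to be tabs:
dbpool3)
	echo "DBPool is DC4POOL1"
	sqlplus username/password@database <<-EOF
	select name from v\$database;
	exit;
	EOF
	;;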
You should also say fgrep dbPool company.log instead of needlessly using cat, and instead of using grep when you are not feeding it a regex. You also have the quotes around your awk script in a weird place; it works, but it's not what it should be.
pool=$(cat company.log | grep dbPool | awk {'print $5'})
becomes
pool=$(fgrep dbPool company.log | awk '{print $5}')
You should not expand $pool without quoting it, e.g. it should be case "$pool" in. Even if you think it won't have spaces in the variable you should do this for safety.
You should get in to the habit of checking all of your shell scripts with shellcheck whether they work or not.
I think you don't require a case block. You could use an if-else statement with the pool variable.
if [ "$pool" = "dbpool1" ] || [ "$pool" = "dbpool2" ] || [ "$pool" = "dbpool3" ]
then
echo "DBPool is ${pool}"
sqlplus username/password#database<<EOF
select name from v\$database;
exit
EOF
else
echo "No Results"
fi
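If you would rather keep the case statement, the three identical branches can also be collapsed into one by combining the patterns; a sketch, reusing the same placeholder connection string:
case "$pool" in
dbpool1|dbpool2|dbpool3)
    echo "DBPool is ${pool}"
    sqlplus username/password@database<<EOF
    select name from v\$database;
    exit
EOF
    ;;
*)
    echo "No Results"
    ;;
esac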

sending email based on hive query output

How can I send an email based on a Hive query output? Say I have a table where I want to check whether a number is between two other numbers from a different table. I can check that in a SQL query and return the output as 0 or 1.
Now the question is: how can I send an email using mailx or an equivalent from the same script, based on that SQL output?
$ var=$(hive -S -e "select '0' from test;")
$ echo $var
0
$ var=$(hive -S -e "select '1' from test;")
$ echo $var
1
Option: use a shell action in Oozie to run a shell script that executes the hive command inline and captures the output as 0/1 in a variable, then use that variable in the shell to call mailx.
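A minimal sketch of such a script, under the assumption that the query returns a single 0/1 value; the query text, table name and addresses are placeholders:
#!/bin/bash
# run the check query and capture its single 0/1 result
flag=$(hive -S -e "select your_check_expression from your_table;")

if [ "$flag" -eq 1 ]; then
    # send the alert; subject, body and recipient are illustrative
    echo "Threshold check failed, please review." | mailx -s "Hive check alert" you@example.com
fi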
The workflow runner is your choice: you can use Oozie, third-party tools, or the familiar cron job. You can leverage the shell script below to send the emails based on the output of Beeline/Hive.
#!/bin/bash
# Output variable from the Hive QL
Output=$(beeline -u ${hiveConnectionSTRING} --silent=true -e "your query that pulls the output as 0 or 1")
# Condition check and sending the email with the mailx utility
# $UserName = the From address (the name before your domain, e.g. Stack@domainname)
# stackoverflow@gmail.com stands in for your cc and to addresses
if [ "$Output" -gt 0 ]
then
    echo "output is 1"
    echo -e 'your email message should be here \n\n\n\nThank you,' | mailx -r $UserName -s 'Your Subject' -c stackoverflow@gmail.com -- stackoverflow@gmail.com
else
    echo "output is 0"
    echo -e 'your email message should be here \n\n\n\nThank you,' | mailx -r $UserName -s 'Your Subject' -c stackoverflow@gmail.com -- stackoverflow@gmail.com
fi

Logging messages in Hive .hql file

I have some insert statements to run in Hive. I am planning to put them in an .hql file and run it through the beeline -f option. Is there a way I can echo some log messages in between the inserts so that I know the progress? Like:
echo "Starting the inserts ........."
insert1
echo "Insert 1 complete"
insert2
echo "Insert script is complete"
I tried putting echo statements in by using the Linux shell command echo, as
!echo ""
but it is not recognizing echo as a command.
Use !sh echo ... instead:
!sh echo "Starting the inserts ........."
insert ...
!sh echo "Insert 1 complete"
insert ...
!sh echo "Insert script is complete"
set msg = "Starting Insert";
set msg;
insert into .... ;
set msg = "Insert complete";
set msg;
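Either way, the .hql file is then run as the question describes, for example (the JDBC URL is illustrative):
beeline -u "jdbc:hive2://your-hiveserver:10000/default" -f inserts.hql 2>&1 | tee inserts.log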

get exit code of a background process that completed a while ago

This may not be a completely new topic, but I ran into a slightly odd situation.
I'm processing about 1000 files in a loop by kicking off a script in the background for each one. I want to take some action on each file based on the exit code its process returns. By the time I loop around to wait for each process to complete, I find that some of the processes are already done. I modified the script to wait only if pgrep finds the process, and just assumed the process completed successfully otherwise. The problem is that I have to know the exit code of each process in order to act on the corresponding file. Any ideas?
pid_list=()
for FILE in $SOME_FOLDER
do
    (process with FILE as parameter) &
    pid_list+=($!)    # remember the PID of the background process
done
for pid in "${pid_list[@]}"
do
    if pgrep $pid; then    # process could have just completed as we got here
        if wait $pid; then
            echo "process $pid successfully completed!" >> $logfile
        else
            echo "process $pid failed!" >> $logfile
            fnc_error_exit
        fi
    else
        echo "assumed that process $pid successfully completed but I DON'T KNOW THE EXIT CODE!" >> $logfile
        continue
    fi
done
I cannot solve your problem exactly.
Since I do not know your exact situation, could you try another method, using parent and child scripts?
For example, the topmost script is like this:
for FILE in $HOME/*.txt
do
parent.sh $FILE &
done
Then, the parent.sh is like this:
child.sh $1
RC=$?
case $RC in
0 ) echo Exit code 0 for $1 >> parent.log
;;
1 ) echo Exit code 1 for $1 >> parent.log
;;
* ) echo Other Exit code $RC for $1 >> parent.log
;;
esac
The child script is like this:
grep -q hello $1
Then, parent.sh will handle every exit code of child.sh.
Every file is handled by its own parent.sh, so no file's handling is missed.
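For reference, one more hedged option: as long as the background jobs are direct children of the script that launched them, bash and ksh93 remember each child's exit status, so wait on a stored PID returns the right code even if the process finished long ago, and the pgrep check can be dropped. A sketch under that assumption (some_processing_script is a placeholder):
pid_list=()
file_list=()
for FILE in "$SOME_FOLDER"/*
do
    some_processing_script "$FILE" &    # placeholder for the real per-file command
    pid_list+=($!)
    file_list+=("$FILE")
done

i=0
for pid in "${pid_list[@]}"
do
    wait "$pid"
    rc=$?
    if [ "$rc" -eq 0 ]; then
        echo "process $pid (${file_list[$i]}) succeeded" >> "$logfile"
    else
        echo "process $pid (${file_list[$i]}) failed with exit code $rc" >> "$logfile"
    fi
    i=$((i + 1))
done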

How to check the return value of a find statement in a shell script?

How can I check the return value of a "find" statement in a shell script?
I am using find in my script, and if the find statement doesn't find any file, the script should exit.
I want to check the return value of "find" to know whether it found any files or not.
You can redirect the output of the find command to a file called, say, output.txt, and then check whether that file is empty or not by using the -s test:
if [[ -s "output.txt" ]]
then
echo "File is not empty!"
else
echo "File is empty!"
fi
You can count the number of files found by find using the wc -l command:
export result=`find . -name '*.txt' | wc -l`
You can now check result to see how many files were found:
if [ $result == "0" ]; then echo zero found; fi