How to pass an argument to the pgloader connection string in Postgres - pgscript

LOAD CSV
FROM '/dbf/appworx/pgscript/temp_table_data.csv' (INV_ID)
INTO postgresql://username:password@hostname:port/dbname
TARGET TABLE temp_inv_table
I am running the above pgloader command file from a shell script like this:
#!/bin/ksh
echo "Start"
# environment variables
. /xzy/DBset
echo "Fetching password value"
pgloader temp_table_load2.load
but there does not seem to be an option to pass environment variables into the load file.
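One workaround is to have the shell expand the variables itself by generating the load file just before calling pgloader. This is a sketch, not a documented pgloader feature: the variable names DB_USER, DB_PASS, DB_HOST, DB_PORT and DB_NAME are assumptions standing in for whatever /xzy/DBset actually exports:
#!/bin/ksh
echo "Start"
# environment variables
. /xzy/DBset
echo "Fetching password value"
# Generate the load file; the unquoted heredoc delimiter lets the shell
# substitute ${DB_USER} and friends before pgloader ever sees the file.
cat > temp_table_load2.load <<EOF
LOAD CSV
FROM '/dbf/appworx/pgscript/temp_table_data.csv' (INV_ID)
INTO postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}
TARGET TABLE temp_inv_table
EOF
pgloader temp_table_load2.load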

Related

How to pass a parameter from shell script to SQL script?

I have 2 scripts - one shell and one sql.
My shell script is similar to this:
export nbr=&1
runsql script_name.sql
I'm trying to pass a parameter for nbr while running the script.
The corresponding sql script is as such:
insert into table1
select * from table2
where year='&1'
I get the error as below:
"enter value for year: old 22: where year='$1')
new 22: where year='commit')"
I'll show how to deal with it easily with an example based on my own scripts.
In this script I define a simple function run_sql(DESCR, SCRIPT, DATABASES), where
DESCR - short description
SCRIPT - sqlplus run arguments, i.e. the script name and its parameters
DATABASES - list of dbname defined above on which you want to run it
Then we can easily use it like this:
run_sql "1st script" "#sql1.sql param1" db1 db2
run_sql "2nd script" "#sql2.sql param1 param2 param3" db1 db2 db3
Here we execute:
"sql1.sql" with 1 argument on db1 and db2
"sql2.sql" with 3 arguments on db1,db2 and db3
And we save all output into own log files.
Full example with test output: https://gist.github.com/xtender/465951befeed7f0ae1a3fe112dcd7fe4
Simple script test.sh:
#!/bin/bash
# Here you can define your db connection strings:
db1=xtender/pass@PDB19C_11
db2=xtender/pass@PDB19C_11
db3=xtender/pass@PDB19C_11
#####################################################################
# functions:
# Function syntax: run_sql(DESCR, SCRIPT, DATABASES)
# where
# DESCR - short description
# SCRIPT - sqlplus run arguments, i.e. the script name and its parameters
# DATABASES - list of dbname defined above on which you want to run it
run_sql(){
  local DESCR="$1"; shift
  local SCRIPT="$1"; shift
  local databases=("$@")
  echo =================================================
  echo = $DESCR
  echo = Going to execute $SCRIPT...
  read -a res -p "Enter 'skip' to skip this step or press Enter to execute: "
  if [[ $res = "skip" ]]
  then
    echo Skipping $SCRIPT...
  else
    echo Executing $SCRIPT...
    for db in "${databases[@]}"
    do
      # indirection: $db holds the *name* of the variable (db1, db2, ...)
      # whose value is the connection string
      local cur=${!db}
      echo Executing $SCRIPT on $db - $cur...
      sqlplus -L -S ${cur} $SCRIPT >>log-$db.log 2>&1
      echo Done.
    done
    echo =================================================
  fi
}
#####################################################################
# Here we execute a script "sql1.sql" with one argument "param1" on db1 and db2:
run_sql "1st script" "#sql1.sql param1" db1 db2
# Here we execute a script "sql2.sql" with 3 arguments on db1,db2 and db3:
run_sql "2nd script" "#sql2.sql param1 param2 param3" db1 db2 db3
echo ============================================
echo === Done
echo ============================================
Then we can create the SQL scripts, for example sql1.sql and sql2.sql; sql1.sql takes one argument and sql2.sql takes three:
sql1.sql:
select '&1' as output from dual;
exit;
sql2.sql:
select
  '&1' out1,
  '&2' out2,
  '&3' out3
from dual;
exit;
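Applied back to the original question, the minimal form of the same idea is to pass the shell's positional parameter straight through to sqlplus, which exposes it to the script as &1. A sketch, with user/pass@db standing in for the real connection string:
#!/bin/bash
# $1 is the shell argument; sqlplus hands it to the script as &1.
sqlplus -L -S user/pass@db @script_name.sql "$1"
and script_name.sql:
insert into table1
select * from table2
where year = '&1';
commit;
exit;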

Run an OS command and set its output to a Hive variable

Is it possible to run something like this in Hive CLI?
I am trying to pass file contents as a variable to another query.
set column_list=!cat /home/user/filename.lst ;
create table tabname as select $column_list from ...
If you have a query file, you can pass variables to it as hiveconf values:
hive -hiveconf var1=abcd -f file.txt
Or you can construct the query and then pass it to the Hive CLI using -e:
hive -e "create table ..."
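Putting the two together for the original goal, a minimal sketch (tabname is from the question; source_table is a placeholder): read the file with command substitution and splice its contents into the query passed to -e:
#!/bin/bash
# Read the column list from the file, then inline it into the query text.
column_list=$(cat /home/user/filename.lst)
hive -e "create table tabname as select ${column_list} from source_table"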
Suppose filename.lst contains the single word:
line
Make a file test.sh:
temp=$(cat /home/user/filename.lst)
hive -f test.hql -hiveconf var="$temp"
Make another file, test.hql:
create table test(${hiveconf:var} string);
Then on the terminal run:
sh -x test.sh
This passes the word "line" to test.hql, which creates a table with line as its column name.
Note: all files should be in the same directory. This script passes only one variable.

Dynamically fetching a variable's value from a properties file

The Unix commands below work:
export myTempVar=myTempVar1
export myTempVar1=myTempVar2
eval echo '$'$myTempVar
This correctly prints myTempVar2.
However, what if myTempVar1=myTempVar2 is present in a properties file instead of directly in the script?
So my script will have
. $MYDIR/myProperties.properties
myTempVar=myTempVar1
myTempVar3=eval echo '$'$myTempVar
The lines above do not work; the value of myTempVar3 does not come out as myTempVar2.
myProperties.properties contains the line:
myTempVar1=myTempVar2
Using indirection is far safer than eval:
#!/bin/bash
. $MYDIR/myProperties.properties # myTempVar1=myTempVar2
myTempVar=myTempVar1
myTempVar3=${!myTempVar}
echo $myTempVar3
Gives:
myTempVar2
and you don't need the echo in eval:
eval myTempVar3='$'$myTempVar
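A self-contained sketch showing both variants side by side (the properties file is simulated by an inline assignment):
#!/bin/bash
myTempVar1=myTempVar2   # stands in for sourcing myProperties.properties
myTempVar=myTempVar1

# Variant 1: bash indirection.
myTempVar3=${!myTempVar}
echo "$myTempVar3"      # prints: myTempVar2

# Variant 2: eval without the echo.
eval myTempVar4='$'$myTempVar
echo "$myTempVar4"      # prints: myTempVar2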

Error executing shell command in pig script

I have a Pig script where, at the beginning, I would like to generate a string of the dates of the past 7 days from a certain date (later used to retrieve the log files for those days).
I attempt to do this with this line:
%declare CMD7 input= ; for i in {1..6}; do d=$(date -d "$DATE -i days" "+%Y-%m-%d"); input="\$input\$d,"; done; echo \$input
I get an error:
" ERROR 2999: Unexpected internal error. Error executing shell command: input= ; for i in {1..6}; do d=$(date -d "2012-07-10 -i days" "+%Y-%m-%d"); input="$input$d,"; done;. Command exit with exit code of 127"
However, the shell command runs perfectly fine outside of Pig. I am really not sure what is going wrong here.
Thank you!
I have a working solution, though not as streamlined as you want; essentially I could not get Pig to execute a complex shell statement in the declare.
I first wrote a shell script (let's call it 6-days-back-from.sh):
#!/bin/bash
DATE=$1
for i in {1..6}; do d=$( date -d "$DATE -$i days" +%F ) ; echo -n "$d "; done
Then a Pig script as follows (let's call it days.pig):
%declare my_date `./6-days-back-from.sh $DATE`
A = LOAD 'dual' USING PigStorage();
B = FOREACH A GENERATE '$my_date';
DUMP B;
Note that dual is a directory containing a text file with a single line of text, used only so the variable can be displayed.
I called the script as follows:
pig -x local -param DATE="2012-08-03" days.pig
and got the following output:
({(2012-08-02),(2012-08-01),(2012-07-31),(2012-07-30),(2012-07-29),(2012-07-28)})
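If you would rather keep the logic out of the Pig script entirely, the helper can also run in the invoking shell, with its output handed to Pig as a parameter; a sketch reusing the same 6-days-back-from.sh:
DATES=$(./6-days-back-from.sh "2012-08-03")
pig -x local -param my_date="$DATES" days.pig
With this, the %declare line can be dropped, since -param defines $my_date directly.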

Execute SQL from file in bash

I'm trying to load SQL from a file in bash and execute it. The SQL file needs to stay versatile, meaning it cannot be altered just to make things easy in bash (escaping special characters like *).
So I have run into some problems:
If I read my sample.sql
SELECT * FROM SAMPLETABLE
to a variable with
ab=`cat sample.sql`
and execute it
db2 `echo $ab`
I receive an SQL error because the unquoted * is expanded by the shell into all the file names in the current directory.
An easy solution would be to escape the * as \*, but I cannot do this, because the file needs to stay executable in programs like DB Visualizer etc.
Could someone give me a hint in the right direction?
The DB2 command line processor has options that accept a filename as input, so you shouldn't need to load statements from a text file into a shell variable.
This command will execute all SQL statements in the file, with newline treated as the statement terminator:
db2 -f sample.sql
This command will execute all SQL statements in the file, with semicolon treated as the statement terminator:
db2 -t -f sample.sql
Other useful CLP flags are:
-x : Suppress the column headings
-v : Echo the statement text immediately before execution
-z : Tee a copy of all CLP output to the filename immediately following this flag
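For example, a typical combined invocation (output.log is an assumed file name):
db2 -t -v -f sample.sql -z output.log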
Redirect stdin from the file.
db2 < sample.sql
In case you have variables in your script and want the shell to replace them before the SQL is executed in DB2, use this approach:
Contents of File.sql:
cat <<xEOF
insert into ${MY_SCHEMA}.${MY_TABLE} values(1,2);
select * from ${MY_SCHEMA}.${MY_TABLE};
xEOF
In command prompt do:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
Then you are all good to get it executed in DB2:
sh File.sql | db2 +p -t
The shell replaces the environment variables and then DB2 executes the result.
Hope it helps.
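A related sketch, assuming GNU gettext's envsubst is available: it performs the same substitution while letting sample.sql stay plain SQL (with ${MY_SCHEMA}-style references) rather than a heredoc wrapped in a shell script:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
# envsubst expands ${MY_SCHEMA} and ${MY_TABLE} in the SQL text,
# then the result is piped to the DB2 CLP.
envsubst < sample.sql | db2 +p -t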