Hive: passing parameters

#libjack:
Hi,
I want to pass parameters into an HQL script from a ksh script.
I went through the Apache Hive documentation on using variables. The following worked fine, where I set the values inside the HQL:
SET hivevar:parm1=valu1;
CREATE SCHEMA IF NOT EXISTS ${parm1};
But I want to pass the parameters while invoking hive from inside ksh. I have the variables and their values in a file, as below:
--hiveconf:
SET hivevar:parm1=valu1;
--Inside .ksh:
hive --hivevar "`cat ${hiveconf}`" -f hiveScript.hql
--hiveScript.hql:
CREATE SCHEMA IF NOT EXISTS ${parm1};
This fails. I tried with hiveconf as well.
How do I pass the variables to the HQL script?
Is it possible to pass as a file?
Thanks.
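One approach that may work (a sketch, not verified against your setup): instead of cat-ing the whole file into a single quoted argument, have the ksh script build one --hivevar option per key=value line. The file name /tmp/params.txt and its contents below are hypothetical stand-ins, and hive itself is not invoked here, only the command line it would receive is echoed:

```shell
#!/bin/sh
# Hypothetical parameter file: one key=value pair per line.
printf 'parm1=valu1\nparm2=valu2\n' > /tmp/params.txt

# Build one --hivevar option per line instead of passing the whole
# file content as a single argument.
args=""
while IFS= read -r line; do
  args="$args --hivevar $line"
done < /tmp/params.txt

# Show the command that would be run (hive is not actually invoked here).
echo "hive$args -f hiveScript.hql"
```

Alternatively, since your file already contains SET hivevar:... statements, it may also work to pass it unchanged as an initialization file with hive -i, which runs a file of statements before the -f script.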

Related

Newman pass variable from GitLab

I have a pipeline on GitLab with a variable, ENV_VAR. This variable changes based on the branch for the pipeline.
In the same yml file I have a script with newman, where I want to pass this variable like this -> newman run ... -e test/apis/$ENV_VAR_environment.json
But the issue I have right now is that the variable does not seem to be passed as I want. The pipeline shows an error - cannot read the test/apis/here_should_be_the_variable_name.json
Is there a way to pass this variable into the file source?
It looks like you only need to enclose the variable name in braces:
-e test/apis/${ENV_VAR}_environment.json
because in test/apis/$ENV_VAR_environment.json the shell looks for a variable named $ENV_VAR_environment, which does not exist.
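The difference is easy to see in any POSIX shell (the value dev is a stand-in for whatever the pipeline sets):

```shell
#!/bin/sh
ENV_VAR=dev

# Without braces the shell takes the longest possible variable name,
# $ENV_VAR_environment, which is unset and expands to nothing.
unbraced="test/apis/$ENV_VAR_environment.json"

# Braces delimit the variable name explicitly.
braced="test/apis/${ENV_VAR}_environment.json"

echo "$unbraced"   # test/apis/.json
echo "$braced"     # test/apis/dev_environment.json
```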

PLSQL pass parameter to .sql file from .shl script

I am trying to run a shell script with values and pass those values into a .sql file which will call my plsql procedures with these values. I've tried a lot with no luck.
What I have now:
shl script: RunSql $SQL_PATH/update_load_status.sql 'N'
sql script: execute dbms_output.put_line('&1'); (based on other code in the system; &1 gives a "bind variable not declared" error)
Will multiple values work in a similar manner?
Thanks!
Use the DEFINE keyword to capture the arguments passed from the shell script:
set serveroutput on
define first_arg=&1
define second_arg=&2
BEGIN
  dbms_output.put_line('&first_arg');
  dbms_output.put_line('&second_arg');
END;
/
(set serveroutput on is needed for dbms_output to display anything in SQL*Plus.)
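For the shell side, everything after the script name on the sqlplus command line arrives as &1, &2, and so on, so multiple values do work the same way. A sketch (the credentials and path are placeholders, and sqlplus is not actually invoked here, only the command line is shown):

```shell
#!/bin/sh
SQL_PATH=/opt/sql   # hypothetical path

# Each word after @script.sql becomes a positional substitution
# variable (&1, &2, ...) inside update_load_status.sql.
cmd="sqlplus -s user/pass @$SQL_PATH/update_load_status.sql N Y"
echo "$cmd"
```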

How to pass shell variable to pig param file

How can we pass a shell variable to a Pig param file? As an example, I have a shell variable defined as DB_NAME. I would like to define my Pig parameter file as
p_db_nm=$DB_NAME
I tried the above, which does not work, and I also tried echo $DB_NAME, which does not work either.
I'm aware that I can pass this by using -param on the command line, but I have many variables which I would like to put in a param file, where the values will be defined in the shell script. I searched many topics on Google and didn't have any luck!
My question is similar what was posted in http://grokbase.com/t/pig/user/09bdjeeftk/is-it-possible-to-use-an-env-variable-in-parameters-file but i see no workable solution is posted.
Can anyone help?
You can pass a parameter file using the -param_file option.
If a parameter file named "pig.cfg" is defined like below,
p_db_nm=$DB_NAME
then in the shell the pig command will look like this:
pig -param_file pig.cfg
Finally, in your Pig script you can use those variables by the key names from the cfg file (in this case, $p_db_nm).
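If the $DB_NAME reference inside the param file is not expanded in your Pig version, one workaround (a sketch; the file names are arbitrary) is to have the shell script generate the param file itself, so the value is already expanded before pig ever sees it:

```shell
#!/bin/sh
DB_NAME=mydb   # value that would come from the enclosing shell script

# Generate the parameter file with the variable already expanded,
# so pig never needs to resolve $DB_NAME itself.
printf 'p_db_nm=%s\n' "$DB_NAME" > /tmp/pig.cfg

cat /tmp/pig.cfg
# then: pig -param_file /tmp/pig.cfg myscript.pig
```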

Powershell: Specify file path as variable

I am running the following SQL query through a PowerShell script and need to run the script multiple times against different files. What I am trying to figure out is how to specify a file path as a variable when I run the script.
update [$Db_name].[dbo].[$BatchTable]
set [$Db_name].[dbo].[$BatchTable].Wave = 'Wave1.1'
from [$Db_name].[dbo].[$BatchTable]
inner join OPENROWSET(BULK 'FilePath\file.csv',
FORMATFILE= 'E:\import.xml') AS a
on ([$Db_name].[dbo].[$BatchTable].Name= a.Name) and
([$Db_name].[dbo].[$BatchTable].Domain = a.Domain)
The 'FilePath\file.csv' is the file path I need to define as a variable so that my code would instead look like this:
inner join OPENROWSET(BULK '$INPUTFILEPATH',
FORMATFILE= 'E:\import.xml') AS a
Any help or potentially better methods to accomplish this would help very much.
From the command line I want to be able to run the script like this:
CMD: updatescript.ps1 $INPUTFILEPATH = C:\Documents\myfile.csv
Again, I'm not sure this is the best way to go about this?
You're nearly there.
You will need to add a parameter block at the very start of your script e.g.
Param(
[Parameter(Mandatory=$true)]
[ValidateScript({Test-Path $_ -PathType 'leaf'})]
[string] $InputFilePath
)
This creates a mandatory (not optional) string parameter called InputFilePath. ValidateScript is code used to validate the parameter; in this case it checks that the file exists using the Test-Path cmdlet with a PathType of 'leaf' (to check for the existence of a directory, use 'container').
When running your script use the syntax below:
updatescript.ps1 -INPUTFILEPATH "C:\Documents\myfile.csv"
and in the script use the variable as the path exactly as in your question:
inner join OPENROWSET(BULK '$INPUTFILEPATH',
FORMATFILE= 'E:\import.xml') AS a
NOTE: in PowerShell, when passing parameters to a script you only need to type the fewest characters that uniquely identify that parameter among all the others in your param block - in this case -I works just as well as -InputFilePath.
You can pass command line parameters to the powershell script using param.
Example:
param(
[string]$INPUTFILEPATH
)
And then call the script as follows:
updatescript.ps1 -INPUTFILEPATH C:\Documents\myfile.csv

Is it possible to pass the value of a parameter to a UDF constructor?

I've written a UDF which takes a constructor parameter.
I've successfully initialized and used it in grunt as
grunt> register mylib.jar
grunt> define Function com.company.pig.udf.MyFunction('param-value');
But I can't initialize it using a Pig parameter as in
grunt> define Decrypt com.company.pig.udf.MyFunction($secret);
or
grunt> define Decrypt com.company.pig.udf.MyFunction('$secret');
I tried to initialize $secret using both -param and -param_file options.
Are Pig parameters supported as UDF constructor arguments at all?
The function I'm loading is used to decrypt some data, and the param-value contains special characters such as "%$!" but no quotes. So I'm unsure whether I'm running into a value expansion issue or whether the parameter there is just not being expanded.
I'm setting the parameter value without quotes in a param file:
secret = My$ecr%t!
It is definitely possible to pass parameter values to UDFs. The problem seems to be the $ sign.
This should be the correct syntax:
define Decrypt com.company.pig.udf.MyFunction('$secret');
In your parameter file put:
secret=My\$ecr%t!
The -dryrun option can be helpful for this: it will create a file called <nameofpigfile>.pig.substituted and not run the actual script.
pig -x local -dryrun -f mypigfile.pig -param_file myparameterfile
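The effect of the backslash can be sanity-checked from the shell before running -dryrun (the file location below is arbitrary):

```shell
#!/bin/sh
# Write the parameter file with the dollar sign escaped, so Pig's
# parameter substitution does not read "$ecr" as another parameter.
printf 'secret=My\\$ecr%%t!\n' > /tmp/myparameterfile

cat /tmp/myparameterfile   # secret=My\$ecr%t!
```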
Also, you probably need pig >= 0.11
https://issues.apache.org/jira/browse/PIG-2931