Pentaho PDI: passing a user variable on the command line

I am trying to run a Transformation/Job by passing a user variable in command line.
I have tried passing the variable value as below:
sh pan.sh -file='test.ktr' '-param:input_directory=/path/to/directory' -level=basic
where input_directory is a variable in the transformation, referenced as ${input_directory}.
But when I do this, Pan is unable to find the variable value. It throws the error below:
Could not list the contents of "file:///home/user1/pdi8.1/data-integration8.1/${input_directory}" because it is not a folder.
Can someone help me with this? Thank you.

To pass named parameters to your job or transformation, the parameters need to be defined in the properties window, shown here for a transformation. The default value is not needed, but works well for testing. Pay attention to capitalization.
So the pieces of the puzzle are:
From the command line, pass the parameter like -param:yourparam=yourvalue
Define this same parameter in the highest-level job or transformation
Use it as you would use any variable, with ${yourparam}
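Putting the pieces together, a minimal sketch using the names from the question (test.ktr and input_directory; adjust to your own):
sh pan.sh -file='test.ktr' '-param:input_directory=/path/to/directory' -level=basic
In test.ktr, open the transformation settings (CTRL-T), add input_directory on the Parameters tab (capitalization must match the command line), and reference it in a step field, for example as the directory of a Text file input step:
${input_directory}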

I think the parameter names used in the job should be of the form ${PARAM_NAME1}.
From the command line I follow the convention below:
call "{Replace with kitchen.bat File Path}" /file:"{Replace with JOB File Path}" "-param:PARAM_NAME1=PARAM_VALUE1" "-param:PARAM_NAME2=PARAM_VALUE2"

Related

SSIS is doubling up backslashes

I am loading some file names and locations as variables into SSIS, then using a Foreach Loop to run an Execute Process task.
After a few unsuccessful attempts I realized SSIS is doubling up all the backslashes in the fields I am loading into my variables, hence the network addresses are not working.
Can we stop this behavior?
What I load:
"\\BBBB001\shared\GGGG\PiMSSSRSReportsPath\THM022\HHHH-NextWorkingDay-at1530.pdf"
What I get:
"\\\\BBBB001\\shared\\GGGG\\PiMSSSRSReportsPath\\THM022\\HHHH-NextWorkingDay-at1530.pdf"
SSIS Execute Process task:
As you can see, Foxit Reader doesn't recognize the latter filename with the doubled backslashes. If I manually enter the first value, it works.
For future reference, I found a workaround:
Instead of adding variables in the Arguments section, I created a single variable containing all the parameters for the file to be printed, something like this:
/t "FileLocation\FileName.pdf" PrinterName
Then I put this variable in the Expressions section of the Execute Process task: add the Arguments property and set that final variable as its expression, like the sketch below.
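As a sketch of that workaround (the variable names here are hypothetical): create a string variable User::PrintArgs whose expression builds the whole argument list, for example
"/t \"" + @[User::FileLocation] + "\\" + @[User::FileName] + "\" " + @[User::PrinterName]
then, on the Execute Process task's Expressions page, bind the Arguments property to @[User::PrintArgs]. In the SSIS expression language \\ is the escape for a single literal backslash and \" for a literal quote.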

TFS 2015 Can build variables access other build variables?

When I define a custom variable in the new TFS 2015 team build as follows:
Name: SomeOutput
Value: $(System.DefaultWorkingDirectory)\Some
...it doesn't seem to expand $(System.DefaultWorkingDirectory).
Is there a way around this?
EDIT:
At least it seems it's not expanded everywhere.
For example, in MSBuild-Arguments, /p:OUTPUT="$(SomeOutput)" is expanded to /p:OUTPUT="C:\TfsData\BuildAgents\_work\3\s\Some", but when I add a cmd-line build task with the tool set to cmd and the parameter set to /k set, it prints
SOMEOUTPUT=$(System.DefaultWorkingDirectory)\Some
EDIT 2:
Here are my variables
This is my workflow step
And this is what the build prints
You can use the VSTS Variable Tasks extension from the Visual Studio Marketplace.
When you define a variable in the Variables screen and use other variables as value, they won't be expanded (as you may have expected). Instead the literal text is passed to the tasks in the workflow. Without this little task the following configuration won't work:
Variable: Build.DropLocation
Value: \\share\drops\$(Build.DefinitionName)\$(Build.BuildNumber)
By adding the Expand variable(s) task to the top of your workflow, it will take care of the expansion, so any task below it will receive the value you're after.
https://github.com/jessehouwing/vsts-variable-tasks/wiki/Expand-Variable
PS: The new agent (version 2.x) auto-expands variables now.
It can be achieved.
You may need to use %...% instead of $(...) to reference the variables in cmd and print the result. It is also necessary to add call in front of the command. Here is a simple example:
Note: System.DefaultWorkingDirectory is not available in cmd (not sure why); you need to use System_DefaultWorkingDirectory instead. Details can be viewed in the logs.
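A minimal sketch of such a cmd-line task (tool set to cmd; the variable name follows the question):
/c "call echo %SYSTEM_DEFAULTWORKINGDIRECTORY%\Some"
The expanded path then shows up in the build log.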
I had the same problem - wanted to piece together a path made up of several built-in variables and pass it to a PS script.
Workaround:
I ended up combining the variables in the actual script through the corresponding generated environment variables (for example $env:BUILD_SOURCESDIRECTORY).
Not what I had in mind originally, but it works at least. Drawback: if I need to change the path, I always have to change the PS script instead of a build variable.
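A minimal sketch of that workaround inside the PS script (the 'Some' segment is just an example):
# Build the path from the generated environment variable instead of a build variable
$someOutput = Join-Path $env:BUILD_SOURCESDIRECTORY 'Some'
Write-Host "Output path: $someOutput"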

Pass parameter to SQL file from within another SQL file

I thought for sure there would be an SO question on this, but I haven't been able to find one.
I have 2 SQL files, myFile1.sql and myFile2.sql. myFile1.sql calls myFile2.sql like so:
-- In myFile1.sql:
@scripts/myFile2
This works with no problem, but now I'd like to pass an argument to the file. I've tried doing the following, with no success (results in a File Not Found exception):
@scripts/myFile2 'ImAnArgument'
Does anyone know what the syntax would be to do this?
I'm guessing your problem is that scripts/myFile2.sql is a path relative to the script it is called from. If that is so, then @ follows that path from the directory where SQL*Plus was started (the current working directory), so it's not the parameter that is the issue, but rather that SQL*Plus can't find the file. In this case, you should use @@, which invokes the path relative to the file it's located in.
The parameter should work just as you proposed (documentation). Parameters provided when invoking a file are placed into substitution variables (rather than bind variables) and can be referenced by using an ampersand followed by the argument number. In your example, 'ImAnArgument' would be &1.
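Putting both points together, a minimal sketch (the SELECT is only an illustration):
-- In myFile1.sql (note the double @):
@@scripts/myFile2 'ImAnArgument'
-- In scripts/myFile2.sql, the first argument arrives as &1:
SELECT '&1' AS my_arg FROM dual;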
After many attempts, I wasn't able to pass a parameter in (and I still don't understand why not). But here is what I did to get the same effect:
-- In myFile1.sql:
DEFINE my_arg = 'ImAnArgument'
@scripts/myFile2
Then
-- In myFile2.sql
-- Reference the substitution variable with an ampersand, such as
SELECT '&my_arg' FROM my_table;

WebHCat & Pig - how to pass a parameter file to the job?

I am using HCatalog's WebHCat API to run Pig jobs, such as documented here:
https://cwiki.apache.org/confluence/display/Hive/WebHCat+Reference+Pig
I have no problem running a simple job, but I would like to attach a parameters file to the job, as one can do with the pig command line's --param_file option.
I assume this is possible through the request's arg parameter, so I tried multiple things, such as passing:
'arg': '-param_file /path/to/param.file'
or:
'arg': {'param_file': '/path/to/param.file'}
Neither seems to work, and the error stacks don't say much.
I would love to know if this is possible, and if so, how to correctly achieve this.
Many thanks
Correct usage:
'arg': ['-param_file', '/path/to/param.file']
Explanation:
By passing the value in arg as
'arg': {'-param_file': '/path/to/param.file'}
WebHCat generates only "-param_file" on the command line, and Pig throws the following error:
ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. Can not create a Path from a null string
Using a comma instead of the colon operator (a list instead of a dict) passes the path to the file as a second argument; WebHCat will generate "-param_file" "/path/to/param.file".
P.S.: I am using the Requests library in Python to make the REST calls.
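For reference, a minimal sketch of such a call with Requests (host, port, user, and paths are placeholders; a list value makes Requests send a repeated arg form field, which is what WebHCat expects):
import requests

# POST to the WebHCat Pig endpoint; each element of the 'arg' list becomes
# a separate command-line argument for Pig.
response = requests.post(
    'http://webhcat-host:50111/templeton/v1/pig',
    data={
        'user.name': 'myuser',
        'file': '/path/to/script.pig',
        'arg': ['-param_file', '/path/to/param.file'],
        'statusdir': '/tmp/pig.output',
    },
)
print(response.json())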

Kettle Spoon - variable in file name input

Does anyone know how to set a variable for the file name in 'Text File Input'?
I want the file name to depend on when I execute the transformation, for example:
D:\input_file_<variable>.txt
today = D:\input_file_20131128.txt
tomorrow = D:\input_file_20131129.txt
FYI, I'm using Kettle Spoon - 4.2.0
You can use a variable as ${Variable_Name} in the file name field.
But note the warning Kettle shows for the Set Variables step:
Please remember that the variables you define with this step can't be used in this transformation. This is simply because all steps in a transformation run in parallel without a certain order of execution.
The correct usage is therefore to set the variables you want in the first transformation of a job, and then use them in a subsequent transformation.
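Alternatively, a named parameter works as well; a minimal sketch (FILE_DATE is a hypothetical parameter name): define FILE_DATE as a named parameter of the transformation, set the Text File Input file name to
D:\input_file_${FILE_DATE}.txt
and launch with, for example:
call Pan.bat /file:"D:\my_transform.ktr" "-param:FILE_DATE=20131128"
The next day's run would pass -param:FILE_DATE=20131129 instead.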