Split transformation steps based on Parameter in Pentaho Data Integration

I have a transformation with a boolean parameter. If the parameter is 1, I want the data to flow through one transformation path. But if the parameter is 0, I want it to flow through a different transformation path. What step can I possibly use to do this?

Use "Filter Rows" step in PDI. Check the image below:
Give the parameter value as "1" or "0" in the filter rows section. Also properly assign the path for both true and false conditions.
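Because Filter Rows compares fields rather than variables, the usual pattern is to first read the parameter into a field with a "Get Variables" step. A rough sketch of the layout, with illustrative step and parameter names:

Get Variables (creates field flag_field from ${FLAG_PARAM})
  -> Filter Rows (condition: flag_field = 1)
       true branch  -> path A steps
       false branch -> path B steps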
Hope it helps :)

Related

I am facing an issue in JMeter

When I record and run the test, one request in the login transaction fails. When I checked that request, I found that a session ID in the POST data keeps changing. I created a correlation for that ID, and then I wanted to replace the hard-coded value with the variable name, but it is not replaced, because that ID is not present in any subsequent request or in any parameter value under the requests. How do I handle that ID, and how do I replace it? Please help me with this issue.
Note: when I search for that ID in the View Results Tree, it shows the ID being passed in 3 subsequent requests.
Make sure that you really did correlate that ID, i.e.:
Double-check that the variable has the anticipated value using a Debug Sampler and View Results Tree listener combination.
Make sure that the placement of the Post-Processor performing the correlation is correct; according to the JMeter Scoping Rules, a Post-Processor is applied to:
one sampler only, if it's placed as a child of that sampler;
all the samplers which reside at the same level (or lower) as the Post-Processor;
so it might be the case that you're overwriting the variable with an empty value somewhere else.
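As a sketch, the correlation would typically be a Regular Expression Extractor added as a child of the login request; the reference name and regular expression below are illustrative assumptions, not taken from the question:

Regular Expression Extractor (child of the login sampler)
Reference Name: sessionId
Regular Expression: sessionid=(.+?)[&"]
Template: $1$
Default Value: SESSION_ID_NOT_FOUND

Subsequent requests then send ${sessionId} in place of the recorded value; if the Debug Sampler shows SESSION_ID_NOT_FOUND, the extractor never matched.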

No data output when extracting part of filename in U-SQL

When I do an extract from multiple files and include part of the filename in the fields list and in the FROM clause (e.g. FROM "/input/filename-{filedate:*}.nc"), the resulting output file only contains a header row. If I remove "filedate" from the fields list and the FROM clause, I get the correct output.
I noticed in the job graph that when including "filedate", an "Empty Input" and an "Extract Cross" step are added before the "PodAggregate" step, and no data is written in the "Extract Cross" step. What is this step?
Also, if I run the original extract including "filedate" locally, I get the correct output, so this error occurs only in ADLA.
I use a custom extractor and I don't know if this has anything to do with it. I haven't tested with a built-in extractor.
We recently enabled the new "fast file set" option by default. Unfortunately, it introduced a regression for some plans. Until we fix it, please add the following statement to your script:
SET ##InternalDebug = "FileSetV2:off";
Our apologies for any inconvenience this may have caused.
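For context, here is a minimal sketch of where the workaround sits in a script; the extractor class and column names are illustrative assumptions:

// Disable the new fileset code path until the regression is fixed.
SET ##InternalDebug = "FileSetV2:off";

@data =
    EXTRACT value double,
            filedate string // virtual column bound from the file name pattern
    FROM "/input/filename-{filedate:*}.nc"
    USING new MyNamespace.MyCustomExtractor();

OUTPUT @data
TO "/output/result.csv"
USING Outputters.Csv(outputHeader: true);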

Unable to access the whole payload structure in flowVars

I have an X12 file in which I have a batch number at BHT03, which I need to put in flowVars. When I try to set it in a flow variable, I am only able to access the payload down to the "837" level of the tree structure. The structure from Heading onwards doesn't appear when I enter a dot after "837", and even after writing the path manually it fetches null. Is there any constraint that prevents setting a flowVars value from a nested tree structure?
The structure/path is as follows (I want to set the value below in flowVars):
#[payload.TransactionSets.v005010."837".Heading.0100_BHT.BHT03]
I am able to set the flowVars value as below:
#[payload.TransactionSets.v005010.837]
Try using:
#[message.payload.'TransactionSets'.'v005010'.'837'.'Heading'.'0100_BHT'.'BHT03']
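For instance, this could be wired into the flow as follows; the variable name and doc:name are illustrative assumptions:

<set-variable variableName="batchNumber"
              value="#[message.payload.'TransactionSets'.'v005010'.'837'.'Heading'.'0100_BHT'.'BHT03']"
              doc:name="Set Batch Number"/>

Quoting each path segment lets MEL resolve map keys, such as '837', that are not valid identifiers.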

Pentaho Kettle: how to pass variable from transformation to another transformation inside job

I have two transformations in the job.
In the first transformation, I get details about the file.
Now I would like to pass this information to the second transformation. I have set a variable in the parameter settings of transformation #2 and use Get Variables inside, but the values are not passed.
See attached sample: https://www.hightail.com/download/bXBiV28wMVhLVlZWeHNUQw
LOG_1 displays the file time; however, LOG_2 and LOG_3 are not producing any value.
How can I pass a variable across transformations and up to the parent job?
Try checking the box that passes parameter values down to the sub-transformation in the second transformation's job entry settings.
Remove all the logs from the Job file you shared.
Hope this will resolve your issue :)
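A sketch of the usual pattern, with illustrative step and variable names:

Transformation 1: Get File Names -> Set Variables
  Variable name: FILE_TIME, Scope: Valid in the parent job

Transformation 2: Get Variables
  Field name: file_time, Variable: ${FILE_TIME}

Job: START -> Transformation 1 -> Transformation 2

Note that a variable set in a transformation is only visible after that transformation finishes, so it can be read in the second transformation and in the parent job, but not within the transformation that sets it.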

Using Input Port in File List Component

I'm trying to pass a parameter into the File List component through input port 0. All of my attempts thus far have been met with an error:
Input edge has no effect. Disconnect edge or use metadata fields as parameters in Target URL, Source path or Target path.
Ideally, I would like the Target URL to be something along the lines of http://${S3_ACCESS_KEY}:${S3_SECRET_KEY}#${MY_BUCKET}.s3.amazonaws.com/reports/${port:0.value}/*_interestingReport.csv where ${port:0.value} is the value passed in from the input port.
What is the correct way to use data coming in on input port 0?
Passing parameters from an input edge to File List (and other file components as well) works by using the name of a metadata column from the input edge, enclosed between ${ and }.
So if the metadata on the edge has a field called directory, which contains the directory you want to use, this is the way to do it:
http://${S3_ACCESS_KEY}:${S3_SECRET_KEY}#${MY_BUCKET}.s3.amazonaws.com/reports/${directory}/*_interestingReport.csv
Let me show you an example of a very simple graph which uses a Data Generator to create the flow and sends it to the input of a File List component.
http://www.filedropper.com/inputportfilelist_1
As you can see, the input field is referenced as '${DATA_SOURCE_DIR}/${fileDir}/', where 'fileDir' is the only field contained in the metadata of the edge that connects both components. It will basically list the files located in ${PROJECT}\data\source\manifests.
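As a plain-text sketch of that graph, with illustrative names ('fileDir' being the single string field on the edge metadata):

DataGenerator (produces one record with fileDir = "manifests")
  -> edge metadata: one string field "fileDir"
  -> FileList (Source path: ${DATA_SOURCE_DIR}/${fileDir}/)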
I hope this helps.