In a Pentaho 9.1 Job, using the "Get File(s) from SFTP" step, I download a CSV file. I would like to use that downloaded file name in the subject line of the email message in the "Mail" step.
I have tried referencing it as a variable, but it is not really a variable, it is a "results" value. An example of what I tried is below...
Downloaded file name = "somefile.csv"
Syntax in the "Mail" step for "Subject" = "File Processing Complete: ${short_filename}"
When the email sends, the subject is literally "File Processing Complete: ${short_filename}", when I need it to be "File Processing Complete: somefile.csv".
We can get file information from the result, but unfortunately that step is only available in a transformation, so we need the help of a transformation to read the file name. I have prepared a solution for you; you need to fill in the correct SFTP and Mail configuration, then run the job "getFromSFTP".
[getFromSFTP.kjb] Here, I download the CSV file from SFTP and pass the file information to the transformation.
[getFileName.ktr] Here, the file information is read and the filename is passed to another job for sending the mail.
[sendMail.kjb] This job is only used to send the mail with filename = ${filename1}.
I want to test unique links taken from a CSV file in JMeter. I have a CSV file with unique values: "value 1", "value 2", ...
For each thread, I need to append the value as part of the URL (or path).
Ex: BaseURL: example.com
For thread 1, example.com/value1
For thread2, example.com/value2
How can I do this in JMeter? Any help will be highly appreciated. I have already created the CSV; I just need to know how to take that value and set it as part of the path.
Put your "base url" into "Server name or IP" tab of the HTTP Request Defaults:
Configure CSV Data Set Config to read the values from the file and store them into a JMeter Variable
Use the variable from the step 2 in "Path" field of the HTTP Request sampler:
That should be it, each thread will read the next value from the CSV file on each iteration:
In the CSV Data Set Config, add path under Variable Names and set Sharing mode to "All Threads".
In the HTTP Request "Path" field, add ${path}.
Each thread will then send a different path taken from a CSV line.
So I've got about 10 JSON files that I have to stuff into an Elasticsearch setup. I currently have 3 steps: "Get file names", "JSON Input", and "Elasticsearch bulk insert". When I look at the step metrics, I see that "Get file names" is correctly reading the 10 files, but when it comes to the JSON Input, only the first file is processed. What could be going on?
Here is an image of my setup, and I've attached the ktr file.
Link to the ktr file as it stands currently
Any help is greatly appreciated.
In the Content tab of the JSON Input step you have the "Limit" attribute set to 1. You can edit this by unchecking the "Source is from a previous step" option in the File tab, then setting "Limit" to 0.
I am using SAP DATA Services v. 4.2.
I am trying to read an XML file as input.
I created a new XML Schema starting from a .xsd file.
When I launch the job I get this error:
2076818752 FIL-052226 7/25/2017 2:56:35 PM |Data flow DF_FE_XXXX
2076818752 FIL-052226 7/25/2017 2:56:35 PM |<XML file reader->READ MESSAGE XX_INPUT_FILE OUTPUT(XX_INPUT_FILE)> cannot find file location object <%1> in repository.
24736 20092 RUN-050304 7/26/2017 9:18:39 AM Function call <raise_exception ( Error 52226 gestito in Error_handling ) > failed, due to error <50316>
What am I doing wrong?
Thanks
The problem is in the way you identify the file location in the Data File(s) section of your format; BODS thinks that you provided a File Location object and cannot find one with that name.
See the Data Services documentation for more information about "File Locations".
I am executing an SSIS package via a batch file. At the end it generates multiple log files with predefined names. I want to attach those log files in a Send Mail Task.
The log files are located at, e.g., (D:\Folder1\Folder2\Folder3\ABC.log) and (D:\Folder 1\Folder 2\Folder 3\XYZ.log).
I am using the following expression in the Send Mail Task's file attachment: "D:\FOLDER 1\FOLDER 2\FOLDER 3\*.log", but it doesn't recognize the log files.
(There are "2 slashes" in entire path)
Please help me to attach log files.
Try creating two variables.
vPath = D:\FOLDER 1\FOLDER 2\FOLDER 3\
vLogFileMask = *.log
Then combine these two variables in the expression: vPath + vLogFileMask.
I have a file in my D: drive of my computer and I want to copy this file to an SAP application server so that I am able to see my file with transaction AL11.
I know that I can create a file with AL11 but I want do this in ABAP.
Of course, in my search I found this code, but I could not solve my problem with it.
data: unixcom like rlgrap-filename.
data: begin of tabl occurs 500,
        line(400),
      end of tabl.

unixcom = 'mkdir mydir'. "command to create the directory

"execute the unix command
call 'SYSTEM' id 'COMMAND' field unixcom
              id 'TAB'     field tabl[].
To upload the file to the application server, there are three steps to follow. First, open the file for writing with the statement below:
Step 1: OPEN DATASET <file name> FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
Then write each line to the application server:
Step 2: TRANSFER <data> TO <file name>.
Don't forget to close the file once the data has been transferred:
Step 3: CLOSE DATASET <file name>.
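Putting the three steps together, a minimal sketch might look like this (the server-side path and the internal table holding the file contents are illustrative assumptions):

DATA: lv_dataset TYPE string VALUE '/tmp/somefile.csv', "assumed path on the application server
      lt_lines   TYPE TABLE OF string,                  "assumed to already hold the file contents
      lv_line    TYPE string.

"Step 1: open the dataset on the application server for writing
OPEN DATASET lv_dataset FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

"Step 2: transfer the contents line by line
LOOP AT lt_lines INTO lv_line.
  TRANSFER lv_line TO lv_dataset.
ENDLOOP.

"Step 3: close the dataset
CLOSE DATASET lv_dataset.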
Please mark as the correct answer if it helps! :)
If you want to do this using ABAP you could create a small report that uses the function module GUI_UPLOAD to get the file from your local disk into an internal table and then write it to the application server with something like this:
DATA: lv_filename TYPE string,
      lv_line     TYPE string,
      lt_contents TYPE TABLE OF string. "filled beforehand, e.g. via GUI_UPLOAD
lv_filename = '\\path\to\al11\directory\file.txt'.
OPEN DATASET lv_filename FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
LOOP AT lt_contents INTO lv_line.
  TRANSFER lv_line TO lv_filename.
ENDLOOP.
CLOSE DATASET lv_filename.
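The GUI_UPLOAD call that fills lt_contents is not shown above; a minimal sketch of it, assuming a local path on the presentation server, could be:

DATA lv_localfile TYPE string.
lv_localfile = 'D:\file.txt'. "assumed local path, adjust as needed

CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename = lv_localfile
    filetype = 'ASC'          "plain text upload
  TABLES
    data_tab = lt_contents
  EXCEPTIONS
    OTHERS   = 1.
IF sy-subrc <> 0.
  WRITE: / 'Upload from local disk failed.'.
ENDIF.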
I used the CG3Z transaction, and with it I was able to copy a file to the application server directory.