I am trying to create a transformation that validates data obtained in the Table input step, which is then written to a text file. I want this transformation to be generic, that is, I should be able to run the transformation with different table name parameters and get the text file output. How do I accomplish the validation part? So far I have only been able to validate one table input; when I give a different table name I get an error, as the validator cannot find the earlier table's fields to validate.
Is it possible to write an output parameter to a dataset?
I have a Get Metadata activity that stores the file name of an Azure Blob dataset, and I would like to write that value into another Azure Blob dataset as an additional column via a Copy activity.
Thanks
If you are looking to use the output of the previous activity as an input to the next one, you could proceed in the following manner.
I am assuming the attribute you are getting is childItems; its values can be obtained in the next activity using the following expression:
@activity('Name_of_activity').output.childItems
This would return an array of your subfolders.
The following link should help you with the expression in ADF
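As a minimal sketch (assuming your Get Metadata activity is named 'Get Metadata1' and lists a folder; the activity and pipeline names here are illustrative), the expression could feed the items property of a ForEach activity:
{
    "name": "ForEachSubfolder",
    "type": "ForEach",
    "dependsOn": [
        { "activity": "Get Metadata1", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "items": {
            "value": "@activity('Get Metadata1').output.childItems",
            "type": "Expression"
        },
        "activities": []
    }
}
Inside the loop, each subfolder is then available as @item().name.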
I'll come straight to the question.
I have two parameters: a file name and a table name. The requirement is to upload the data from the Excel sheet into the database table entered in the other parameter, at runtime, with no hardcoding of field names; the program should be flexible enough to suit any table. Please help.
I can think of two possible approaches:
Dynamic code generation -- write a program which writes a program
Use dynamic type tools
For 1. try googling
For 2. see https://wiki.scn.sap.com/wiki/display/Snippets/Example+-+create+a+dynamic+internal+table - this wiki shows a way (not sure if it is overkill as it creates the type from scratch whereas any table in your SAP system is already a defined type in the Data Dictionary).
You can easily reference a parameterised table in Open SQL, e.g. MODIFY (p_tab) ...
Perhaps you could do a generic SPLIT of each line read from the file, by the delimiter, into a table of fields - you can then use ASSIGN COMPONENT to match the fields you have read in to the fields of your internal type.
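A rough sketch of that approach, assuming p_tab holds an already-validated table name and lt_lines holds the lines read from the file (all other names here are illustrative):
" Build an internal table typed after the Data Dictionary definition of p_tab
DATA lo_data TYPE REF TO data.
FIELD-SYMBOLS: <lt_target> TYPE STANDARD TABLE,
               <ls_row>    TYPE any,
               <lv_comp>   TYPE any.

CREATE DATA lo_data TYPE STANDARD TABLE OF (p_tab).
ASSIGN lo_data->* TO <lt_target>.

LOOP AT lt_lines INTO DATA(lv_line).
  APPEND INITIAL LINE TO <lt_target> ASSIGNING <ls_row>.
  SPLIT lv_line AT ';' INTO TABLE DATA(lt_fields).
  LOOP AT lt_fields INTO DATA(lv_value).
    " Match the n-th file field to the n-th component of the row
    ASSIGN COMPONENT sy-tabix OF STRUCTURE <ls_row> TO <lv_comp>.
    IF sy-subrc = 0.
      <lv_comp> = lv_value.  " may need conversion handling for dates/numbers
    ENDIF.
  ENDLOOP.
ENDLOOP.

" Dynamic Open SQL write, as mentioned above
MODIFY (p_tab) FROM TABLE <lt_target>.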
If you are doing this, I think a whitelist of allowed tables would be wise - plus authorisation checks. Otherwise someone could upload data into SAP standard tables with no authorisation.
I am developing a migration tool using the Talend ETL tool (free edition).
Challenges faced:
Is it possible to create a Talend job that uses a dynamic schema every time it runs, i.e. with no hard-coded mappings in the tMap component?
I want the user to provide an input CSV/Excel file, and the job should create the mappings on the basis of that input file. Is this possible in Talend?
Any other free ETL tool would also be helpful, or any sample job.
Yes, this can be done in Talend, but if you do not wish to use a tMap then your table and file must match exactly. The way we have implemented it is for stage tables whose columns are all varchar. This works when you are loading raw data into a stage table, with validation done after the load, before the stage data is loaded into the data warehouse.
Here is a summary of our method:
the filenames contain the table name, so the process starts with a tFileList and parsing the table name out of the file name (see the sketch after these steps)
using tMSSQLColumnList, obtain each column's name, type, and length for the table (one way is to store these as an inline table in a tFixedFlowInput)
run this through a tSetDynamicSchema to produce your dynamic schema for that table
use a file input that references the dynamic schema.
load that into a MSSQLOutput, again referencing the dynamic schema.
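For the first step, the parsing can be done in a tJava component (or a tMap expression). This sketch assumes the tFileList is named tFileList_1 and that the file name starts with the table name followed by an underscore; the variable names are illustrative:
// Current file name exposed by tFileList, e.g. "CUSTOMER_20240101.csv"
String fileName = (String) globalMap.get("tFileList_1_CURRENT_FILE");
// Table name = everything before the first underscore (assumed naming convention)
String tableName = fileName.substring(0, fileName.indexOf('_'));
// Stash it for downstream components to read back via globalMap.get("tableName")
globalMap.put("tableName", tableName);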
One more note on data types. It may work with data types other than varchar, but our stage tables only have varchar and datetime. We had issues with datetime, so we filtered out those column types with a tMap.
Keep in mind, this is a summary to point you in the right direction, not a precise tutorial. But with this info in your hands, it can save you many hours of work while building your solution.
I have files abc.xlsx, 1234.xlsx, and xyz.xlsx in some folder. My requirement is to develop a transformation where the Microsoft Excel Input step in PDI (Pentaho Data Integration) picks a file based on the output of a SQL query. If the query's output is abc.xlsx, the Microsoft Excel Input should pick up abc.xlsx for further processing. How do I achieve this? Would really appreciate your help. Thanks.
Transformations in Kettle run asynchronously, so you're probably going to need a job for this.
Files to create
Create a transformation that performs the SQL query you're looking for and populates a variable based on the result
Create a transformation that pulls data from the Excel file, using the variable populated as the filename
Create a job that executes the first transformation, then steps into the second transformation
Jobs run sequentially, so it will execute the first transformation, perform the query, get the result, and set a variable. Variables need to be set and retrieved in different transformations because of their asynchronous nature. This is the reason for the second transformation; the job won't step into the second transformation until the first one is done running (therefore, not until the variable is populated).
This is all assuming you only want to run the transformation once, expecting a single result from the query. If you want to loop it, pulling data from a set, then setup is a little bit different.
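For reference, one way to populate the variable in the first transformation (besides the built-in Set Variables step) is a Modified Java Script Value step. This sketch assumes the query returns a field named filename; the variable name is illustrative:
// "filename" is the field coming from the Table input step (assumed name).
// Scope "r" makes the variable visible at the root job level,
// so the second transformation can read it.
setVariable("EXCEL_FILE", filename, "r");
The second transformation would then reference ${EXCEL_FILE} in the Excel input step's file name field.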
The Excel input step has an "Accept filenames from previous step" option. You can have a Table input step build the full path of the file you want to read (or build it later from the base directory and the short filename), pass the filename to the Excel input, tick that box, and specify the step and the field you want to use for the filename.
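For instance (the table and column names here are hypothetical), the Table input's query might look like:
SELECT CONCAT('/data/incoming/', file_name) AS excel_filename
FROM control_table
WHERE processed = 0
The excel_filename field is then selected as the filename field in the Excel input step.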
I have a transformation with a Table input step (which fetches data from the database) and a CSV output step (which saves the table input data into a CSV file).
And a job which runs this transformation on a weekly basis.
What I want now is that whenever my report is generated, a new dynamic password is created.
Please help me with this. I am using PDI.