SSIS Package Variables

I have a bunch of different variables; most are strings, but one is an integer. I need to grab this int from a table (where it is also of type INT). My problem is with setting the parameter for this variable. For all my other variables (which are strings) I can use parameters like {0}, {1}, {2}, etc.; however, the Int32 variable will not let me give it a value from a parameter. How do I handle this?

We store variables in a configuration table, and that way they are set at run time from the configuration for that server. We also sometimes run parent-child packages, so the variable can differ for different clients running the same process; the variable is then passed from the parent package to the child package via a parent package variable.
Alternatively, you can run an Execute SQL Task at the start of the process to grab the value from the table and map the result set to the int variable. This is often the best approach if the value will change over time and you therefore don't want it to be part of the configuration.
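As a rough sketch of that pattern outside SSIS, here is the equivalent logic in Python using the stdlib sqlite3 module as a stand-in database (the table and column names are hypothetical; in the real package this is the Execute SQL Task with ResultSet set to "Single row" and column 0 mapped to the Int32 variable):

```python
import sqlite3

# Stand-in for the configuration table the Execute SQL Task would query
# (PackageConfig, ConfigName, and ConfigValue are hypothetical names).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PackageConfig (ConfigName TEXT, ConfigValue INT)")
conn.execute("INSERT INTO PackageConfig VALUES ('BatchSize', 500)")

# Analogous to a "Single row" result set mapped to an Int32 variable:
batch_size = conn.execute(
    "SELECT ConfigValue FROM PackageConfig WHERE ConfigName = 'BatchSize'"
).fetchone()[0]

print(batch_size)  # 500
```

The key point is that the value arrives already typed as an integer from the query, so no string-to-int parameter conversion is needed.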

Related

Can I get pipeline parameters in Synapse Studio Notebook dynamically without predefining parameter names in parameter cell?

In the Pipeline that triggers the Notebook, I'm passing some base parameters in, as below.
Right now, all the materials I read instructed me to declare variables inside the Notebook parameter cell with the same names, as below.
Is there a way that I can get the Pipeline parameters dynamically without pre-defining the variables? (Similar mechanism with sys.argv, in which a list of args are returned without the need of predefined variables)
To be able to pass parameters, you should have them declared in the synapse notebooks so that the values can be received and used as per requirement.
The closest you can get to sys.argv is to declare a single parameter in the parameter cell.
Then concatenate all the values, separated by a delimiter such as a space or a comma, so that you can split the string and use the values just as you would with sys.argv. Instead of creating a new parameter each time, you simply concatenate the new value onto this one parameter.
#{pipeline().parameters.a} #{pipeline().parameters.b} #{pipeline().parameters.c}
When I run this, the values are passed as shown below. You can use split on the args variable and then use the values accordingly.
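A minimal Python sketch of the notebook side of this pattern (the parameter name `args` and the values are assumptions; Synapse overwrites the parameter cell's value with the concatenated pipeline expression at run time):

```python
# Parameter cell: this default is replaced at run time by the
# concatenated pipeline expression from the base parameters.
args = "value_a value_b value_c"

# Split on the chosen delimiter to recover a sys.argv-style list.
values = args.split(" ")
print(values)  # ['value_a', 'value_b', 'value_c']
```

From here, `values[0]`, `values[1]`, etc. behave just like positional entries in sys.argv.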

Data Factory expression substring? Is there a function similar like right?

Please help,
How could I extract 2019-04-02 out of the following string with an Azure Data Flow expression?
ABC_DATASET-2019-04-02T02:10:03.5249248Z.parquet
The first part of the string, received as a ChildItem from a GetMetadata activity, is dynamic. So in this case it is ABC_DATASET that is dynamic.
Kind regards,
D
There are several ways to approach this problem, and they are really dependent on the format of the string value. Each of these approaches uses Derived Column to either create a new column or replace the existing column's value in the Data Flow.
Static format
If the format is always the same, meaning the length of the sections is always the same, then substring is simplest:
This will parse the string like so:
Useful reminder: substring and array indexes in Data Flow are 1-based.
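For comparison, here is the fixed-offset extraction sketched in Python. Assuming a Data Flow expression along the lines of `substring(name, 13, 10)` (1-based start, length 10), the 0-based Python slice is:

```python
name = "ABC_DATASET-2019-04-02T02:10:03.5249248Z.parquet"

# Data Flow substring is 1-based: substring(name, 13, 10)
# Python slicing is 0-based, so start at index 12 and take 10 chars.
date_part = name[12:22]
print(date_part)  # 2019-04-02
```

This only works while the prefix length is constant, which is exactly why the dynamic case below needs a different approach.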
Dynamic format
If the format of the base string is dynamic, things get a tad trickier. For this answer, I will assume that the basic format of {variabledata}-{timestamp}.parquet is consistent, so we can use the hyphen as the base delimiter.
Derived Column has support for local variables, which is really useful when solving problems like this one. Let's start by creating a local variable to convert the string into an array based on the hyphen. This will lead to some other problems later since the string includes multiple hyphens thanks to the timestamp data, but we'll deal with that later. Inside the Derived Column Expression Builder, select "Locals":
On the right side, click "New" to create a local variable. We'll name it and define it using a split expression:
Press "OK" to save the local and go back to the Derived Column. Next, create another local variable for the yyyy portion of the date:
The cool part of this is I am now referencing the local variable array that I created in the previous step. I'll follow this pattern to create a local variable for MM too:
I'll do this one more time for the dd portion, but this time I have to do a bit more to get rid of all the extraneous data at the end of the string. Substring again turns out to be a good solution:
Now that I have the components I need isolated as variables, we just reconstruct them using string interpolation in the Derived Column:
Back in our data preview, we can see the results:
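The whole dynamic approach can be sketched in Python. Splitting on the hyphen and then indexing from the end of the array (a small variation on the locals above) keeps it working even when the dynamic prefix itself contains hyphens:

```python
name = "MY-DATA_SET-2019-04-02T02:10:03.5249248Z.parquet"

# Split on the hyphen; the timestamp always contributes the last
# three parts, regardless of how many hyphens the prefix adds.
parts = name.split("-")

yyyy = parts[-3]     # "2019"
mm = parts[-2]       # "04"
dd = parts[-1][:2]   # "02" -- drop the trailing time and extension

# Reconstruct the date, mirroring the string interpolation step.
date_part = f"{yyyy}-{mm}-{dd}"
print(date_part)  # 2019-04-02
```

In the Data Flow itself these would be the split local plus the yyyy/MM/dd locals, recombined in the Derived Column.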
Where else to go from here
If these solutions don't address your problem, then you have to get creative. Here are some other functions that may help:
regexSplit
left
right
dropLeft
dropRight

ODI Declare variable

I have an ODI (Oracle Data Integrator) scenario in which I need to supply some input parameters when the scenario is executed. For this I created a declared variable and used it in one of the interfaces. When I ran the scenario, the input parameters were prompted as desired and I gave the necessary values; the scenario succeeded, but I see no value under the variable, and the interface loaded zero records. Please share your thoughts.
The declared variable has a text data type, as I am passing a string value.

Pentaho Data Integration - Pass dynamic value for 'Add sequence' as Start

Can we pass a dynamic value (the max value of another table's column) as the "Start at Value" in the Add Sequence step?
Please guide me.
Yes, but as the step is written you'll have to be sneaky about it.
Create two transforms and wrap them up in a job. In the first transform, query the database to get the value you want, then store it in a variable. In the second transform, which the job executes after the first, use variable substitution on the "Start at Value" field of the Add Sequence step to sub in the value you extracted in the earlier transform.
Note that you can't do this all in one transform because there is no way to ensure that the variable will be set before the Add Sequence step (although it might seem like Wait steps would make this possible, I've tried it in the past and was unsuccessful and so had to go with the methods described above).
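As a sketch of the two-step pattern outside PDI (the query and the returned value here are hypothetical stand-ins), step one fetches the current max and step two starts the sequence just after it:

```python
import itertools

# Step 1 (first transform): fetch the current max from the table.
# In PDI this would be a Table Input step writing to a variable;
# here a hard-coded value stands in for the query result.
def fetch_max_id():
    # e.g. SELECT MAX(id) FROM target_table
    return 41  # hypothetical query result

# Step 2 (second transform): the Add Sequence step, starting at max + 1.
start = fetch_max_id() + 1
seq = itertools.count(start)

print(next(seq), next(seq), next(seq))  # 42 43 44
```

Keeping the two steps in separate transforms mirrors why PDI needs two: the variable must be fully set before the sequence begins.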

SSIS - How do I see/set the field types in a Recordset?

I'm looking at an inherited SSIS package in which a stored procedure sends records to a recordset called USER:NEW_RECORDS. It's of type Object, and the value is System.Object. It is then used to insert that data into a SQL table. We're getting an error because the numeric results of the stored procedure appear to be put into a DT_WSTR field, which then fails when the value is inserted into a decimal column in the database.
Most of the records are working, but one, which happens to have a longer number of decimal digits, is failing.
I want to see exactly what my SSIS recordset field types are, and probably change them, so I can force the data to be truncated properly and copied. Or, perhaps, I'm not even looking at this correctly. The data is put into the recordset using a SQL Task that executes the stored procedure.
Edit: It appears that this particular recordset is used twice, and this is the second use of it. I'm thinking that perhaps it has the data types of the first use. But I can't put a Data Viewer on a SQL Task, can I?
I was having the same trouble, so I directed the flow behind the recordset into a flat file.
I also made a new recordset to use, so that the original was not reused. And while I never did figure out how to see the data, I could change the data types in the parameter mapping, which was apparently what was needed. I changed a type from NUMERIC to FLOAT, and it quit complaining about some of the data.
This question may be too specific to my own problem to be of use to others. I may delete it.