I have an SSIS project which uses an XML configuration file (dtsConfig) containing the connection string to the source database. The path to the configuration file is stored in an environment variable.
Data needs to be pulled from four different databases, i.e. I now need to run the same set of packages four times using four different connection strings.
I can make four different configuration files, each with a different connection string, and update the environment variable after each run. This is how I'm doing it now and it works OK, but I don't want to keep updating the environment variable all the time.
Alternatively, I could use a single configuration file and just update its connection string after each run, but I think that's an even worse idea than having four different files.
What I would like to do is dynamically change the connection string after each run.
I have a master package which runs the set of packages I want. So I was thinking of just adding this master package four times to the control flow; after each run I'd update the connection string, which would then be used by the next run. But how do I actually do this?
A Foreach Loop container that holds the master package, loops it four times and changes the connection string after each iteration would work just as well.
To run the packages sequentially, you could simply create a table or file with the connection strings (e.g. 4 rows for the 4 data sources). You would then have a Foreach loop which loops through the connections (from the table or file) and calls the child package, passing the connection string down to it as a variable. The child package would access the variable through a Package Configuration, and that variable would drive the ConnectionString of the child package's connection manager.
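As a rough illustration, the table could be as simple as this; the table name, column names and connection string details below are placeholders rather than anything the approach requires:

    -- Minimal sketch of a table holding one connection string per source database.
    -- Names and connection string contents are assumptions; adjust to your setup.
    CREATE TABLE dbo.SourceConnections
    (
        SourceName       varchar(50)    NOT NULL PRIMARY KEY,
        ConnectionString nvarchar(4000) NOT NULL
    );

    INSERT INTO dbo.SourceConnections (SourceName, ConnectionString)
    VALUES ('Source1', N'Data Source=Server1;Initial Catalog=SourceDb1;Provider=SQLNCLI11.1;Integrated Security=SSPI;'),
           ('Source2', N'Data Source=Server2;Initial Catalog=SourceDb2;Provider=SQLNCLI11.1;Integrated Security=SSPI;'),
           ('Source3', N'Data Source=Server3;Initial Catalog=SourceDb3;Provider=SQLNCLI11.1;Integrated Security=SSPI;'),
           ('Source4', N'Data Source=Server4;Initial Catalog=SourceDb4;Provider=SQLNCLI11.1;Integrated Security=SSPI;');

An Execute SQL Task with a Full Result Set can read these rows into an Object variable, and the Foreach Loop (Foreach ADO Enumerator) then hands each ConnectionString value to the variable that is passed down to the child package.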
I have created a package that does the following:
ExecuteSQLTask: queries db table and sets package variables from data returned
DataFlowTask starts
OleDBSource: uses package variables as parameters to call stored procedure
FlatFileDestination: uses package variables to save a tab delimited file in the correct location and filename
SendEmailTask: uses package variables to email the file as attachment to recipient
I have the following vars:
FileName
sp_Param1
sp_param2
emailRecipient
SMTPServer
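For context, the Execute SQL Task in the first step runs a single-row lookup along these lines (the table name here is illustrative only):

    -- Illustrative only: a single-row query whose Result Set is mapped to the
    -- package variables listed above.
    SELECT TOP (1)
           FileName,
           sp_Param1,
           sp_Param2,
           EmailRecipient,
           SMTPServer
    FROM   dbo.PackageSettings;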
At design time, each var has dummy values. When I run the package in VS, it works perfectly. I can update the values in the db table and each execution picks up the new values and works.
The problem begins when I deploy the package to the database and execute it. It no longer appears to set the variables from the db table, and instead uses the dummy data I entered at design time. What is going on?
This is an SSIS question for advanced programmers. I have a SQL table that holds clientid, clientname, Filename, Ftplocationfolderpath and filelocationfolderpath.
This table holds a unique record for each of my clients. As my client list grows I add a new row to the table for that client.
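For reference, a sketch of that table; the exact data types and lengths are assumptions:

    -- Sketch of the client table described above; types and lengths are assumptions.
    CREATE TABLE dbo.ClientConfig
    (
        clientid               int          NOT NULL PRIMARY KEY,
        clientname             varchar(100) NOT NULL,
        Filename               varchar(255) NOT NULL,
        Ftplocationfolderpath  varchar(255) NOT NULL,
        filelocationfolderpath varchar(255) NOT NULL
    );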
My question is this: Can I use the values in my sql table and somehow reference each of them in my SSIS package variables based on client id?
The reason for the SQL table is that sometimes we get requests to change the delivery location or file name of a file we send externally. We would like to be able to change those things on the fly in the SQL table instead of having to export the package, change it manually and re-import it each time. Each client has its own SSIS package.
Let me know if this is feasible. I'd appreciate any insight.
Yes, it is possible. There are two ways to approach this, depending on how the job runs: you either run it for a single client per job run, or for multiple clients in a single job run.
Either way, you will use the Execute SQL Task to retrieve data from the database and assign it to your variables.
You are running for a single client. This is fairly straightforward. In the Result Set, select the option for Single Row and map the single row's result to the package variables and go about your processing.
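For the single-client case, the statement in the Execute SQL Task could look roughly like this, with the Result Set tab mapping the columns to your package variables (the table name and the clientid parameter are assumptions based on the question):

    -- Single Row result set: one client's settings.
    -- The ? placeholder is an OLE DB parameter mapped to a package variable
    -- holding the client id.
    SELECT clientname,
           Filename,
           Ftplocationfolderpath,
           filelocationfolderpath
    FROM   dbo.ClientConfig
    WHERE  clientid = ?;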
You are running for multiple clients. In the Result Set, select Full Result Set and assign the result to a single package variable of type Object - give it a meaningful name like ObjectRs. You will then add a Foreach Loop container and configure its enumerator:
Type: Foreach ADO Enumerator
ADO object source variable: Select the ObjectRs.
Enumerator Mode: Rows in all the tables (ADO.NET dataset only)
In Variable mappings, map all of the columns in their sequential order to the package variables. This effectively transforms the package into a series of single transactions that are looped.
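For the multiple-client case, the query simply drops the filter so the whole set lands in ObjectRs; keep the column order in mind, because the Variable Mappings in the Foreach loop are positional (table and column names are again assumptions):

    -- Full Result Set: all clients' settings, stored in the Object variable
    -- ObjectRs and iterated by the Foreach ADO Enumerator.
    -- Column order must match the Variable Mappings (indexes 0, 1, 2, ...).
    SELECT clientid,
           clientname,
           Filename,
           Ftplocationfolderpath,
           filelocationfolderpath
    FROM   dbo.ClientConfig
    ORDER  BY clientid;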
Yes.
I assume that you run your package once per client or use some loop.
At the beginning of the "per client" code, read all required values from the database into SSIS variables and then use these variables to define what you need. You should not hardcode client-specific information in the package.
I need, at runtime, to change which connection is used by a table input step.
I have 3 connections defined: STG, DWH, DM.
I want to choose at runtime between them.
I can't create a new connection with parameters for server name, database name, etc. I must use the existing connections.
I wish I could enter a variable like ${my_connection} in the step's connection field, but the field cannot be edited.
Any suggestion?
Instead of using a variable in the connection selector of the step, use variables for the host and database name in the connection configuration.
EDIT:
You can pass a variable to the KTR and test it with a Switch/Case step that routes to a Transformation Executor. Inside the executed KTR you'll have your Table Input step and a Copy Rows to Result step, whose results are captured after the Transformation Executor. You'll need 3 different KTRs, each with the Table Input step that will process the row passed on by the Switch/Case step.
If I'm not clear or you need further explanation, I can perhaps produce an example.
I am designing an SSIS package which imports data from one database to another. In reality I need to import data from multiple data sources into one destination database. One way to do this, that I know of, is to use package configurations for all the data sources (connection strings) and run multiple instances of the same package. But I want something like this: I provide as many connection strings as I need in my config file, and my package connects to each database in turn, reading the data source connection strings from the configuration and importing the data into my destination table.
Is this possible in any way?
If your Data Flow Task is going to be the same for every data source (e.g. using the same table from each data source), you could do something like this:
Create an object variable, say ConnStrList. This will hold the list of connection strings.
In a script task, loop through your config file and add each connection string to ConnStrList.
Add a Foreach Loop container and set its data source to ConnStrList. Create a string variable, say ConnStr; this will hold an individual connection string. Set ConnStr as the iteration variable of the Foreach loop.
Add your Data Flow Task inside the ForEach loop container.
Create an OLEDB connection manager for your OLEDB source. Go to Properties -> Expressions and for ConnectionString, assign the variable ConnStr.
If the DFT is going to be different for each scenario, you might want to have separate data flows for each source.
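As a variation, if the connection strings live in a table rather than a flat config file, an Execute SQL Task with a Full Result Set could fill the Object variable in place of the script task described above, with the Foreach loop then using the Foreach ADO Enumerator; the query itself is trivial (table and column names are assumptions):

    -- Assumed table: one row per source database.
    SELECT ConnectionString
    FROM   dbo.SourceConnections
    ORDER  BY SourceName;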
Please let me know if this answers your question, or if I am getting the scenario wrong.
Basically, I can't find a way to combine two configuration types. For example, suppose I want a connection to refer to a text file held in a path identified by one of the environment variables, with a filename that is a string form of the current day.
I can use SQL to set the filename, and an environment variable to set the path, but I can't seem to find a way to join the two into a full file path which can then be used as a Text File connection. Am I missing something?
Never mind, I found a way to get the same effect here:
http://www.sqlservercentral.com/articles/DTS/2851/
with an ActiveX script task.