Dynamic SQL Script path in SSIS Data Flow task

I need to store all the SQL queries under one folder and reference them in the SSIS package, to better organize the list of SQL files I am using so we can easily change a SQL file later without having to rebuild the package. This includes all queries used in "Execute SQL Task" components as well as the queries in the "OLE DB Source" under the Data Flow component.
In the Data Flow task, instead of putting the SQL query for the source database into the query window, I see four options under Data access mode for the OLE DB Source:
1. Table or View
2. Table or view name variable
3. SQL Command
4. SQL Command from variable
I understand I could store the query in a variable and reference it in an "Execute SQL Task", but I am looking for a way to store all the queries in SQL files and use them in the Data Flow component as well as in the "Execute SQL Task". I can't seem to find a way to make it dynamic in the Data Flow task. Can anyone help with this, please?

I don't think storing them as .sql files is a good option for any of these scenarios.
Here is what I would do given what you have described.
You can store the queries as varchar in a table in a database instead of as files. Then you can loop over the result set with a Foreach Loop and map each row to the variable that you then use as the query for your OLE DB source in the data flow.
So create a variable, and make a Foreach Loop that loops over "select query from dbo.queries" or whatever. Set the output to the variable you created, and inside the container create your data flow and set the source either with an expression or with "SQL command from variable".
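A minimal sketch of that table and the loop query, with all names purely illustrative:

-- Hypothetical table holding the source queries
CREATE TABLE dbo.Queries
(
    QueryId   INT IDENTITY(1, 1) PRIMARY KEY,
    QueryText VARCHAR(MAX) NOT NULL
);

INSERT INTO dbo.Queries (QueryText)
VALUES ('SELECT CustomerId, CustomerName FROM dbo.Customers'),
       ('SELECT OrderId, OrderDate FROM dbo.Orders');

-- Result set the Foreach Loop iterates over
SELECT QueryText FROM dbo.Queries;

On each iteration the loop maps QueryText into the string variable, and the OLE DB source picks it up via "SQL command from variable".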
As for the control flow queries, why not just make them stored procedures that you can change when you need to?
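For instance, a hypothetical control flow query wrapped as a procedure, so its body can be altered later without rebuilding the package:

-- Hypothetical procedure; alter its body later without touching the package
CREATE PROCEDURE dbo.usp_CleanAuditLog
AS
BEGIN
    SET NOCOUNT ON;
    DELETE FROM dbo.AuditLog
    WHERE LoggedAt < DATEADD(DAY, -30, GETDATE());
END;

-- The Execute SQL Task then only ever contains:
EXEC dbo.usp_CleanAuditLog;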

Related

Load data to multiple tables in SQL Server using SSIS

I'm trying to find out how to load data from and to multiple tables in SQL Server.
I have 30+ tables in the source and target databases (both have the same table names and columns), so I used a Foreach Loop container to loop through each of the tables with one DFT. I also have variables (SourceQuery and TargetQuery) that hold the queries to use in the OLE DB source/destination.
But when I use the TargetQuery in the OLE DB destination I get an error.
Does anyone know a proper way to use a variable in the OLE DB destination? Any other solution would be great as well.
To copy 30 tables, I recommend using the "Transfer SQL Server Objects Task" component, which is much more efficient than using a data flow.
Here you can see how to pass variables:
Transfer SQL Server Objects Task

U-SQL job to query multiple tables with dynamic names

Our challenge is the following:
In an Azure SQL database, we have multiple tables named table_num, where num is just an integer. These tables are created dynamically, so the number of tables can vary (from table_1, table_2 up to table_N). All tables have the same columns.
As part of a U-SQL script, we would like to execute the same query on all of these tables and generate an output CSV file with the combined results of all these queries.
We tried several things :
U-SQL does not allow looping, so we were thinking of creating a view in our Azure SQL database that would combine all the tables using a cursor of some sort. The U-SQL file would then query this view (via an external data source). However, a view in an Azure SQL database can only be created via a function, and a function cannot execute dynamic SQL or even call a stored procedure...
We did not find a way to call a stored procedure on the external data source directly from U-SQL.
We don't want to update our U-SQL job each time a new table is added...
Is there a way to do that in U-SQL through a custom extractor for instance? Any other ideas?
One solution I can think of is to use Azure Data Factory (v2) to assist in this.
You could create a pipeline with the following activities:
A Lookup activity configured to execute the stored procedure that returns the table names (a sketch of such a procedure follows this list)
For Each activity that uses the output of the lookup activity as a source
As a child item, use a U-SQL Activity that executes your U-SQL script, which writes the output of a single table (the item of the For Each activity) to blob or Data Lake storage
Add a Copy activity that merges the blobs from the U-SQL step into one final blob.
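A sketch of the stored procedure the Lookup activity could call, assuming the table_num naming convention from the question (the procedure name is made up):

-- Hypothetical procedure returning the dynamically created table names
CREATE PROCEDURE dbo.usp_GetDynamicTables
AS
BEGIN
    SET NOCOUNT ON;
    SELECT name AS TableName
    FROM sys.tables
    WHERE name LIKE 'table[_]%'  -- [_] matches a literal underscore
    ORDER BY name;
END;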
If you have little or no experience working with ADF v2, do bear in mind that it takes some time to get to know it, but once you do, you won't regret it. Having a GUI to create the pipeline is a nice bonus.
Edit: as @wBob mentions, another (far easier) solution is to somehow create a single table with all the rows, since all the dynamically generated tables have the same schema. You could create a stored procedure for populating this table, for example.
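As a rough sketch, such a procedure could use dynamic SQL to rebuild one combined table from all the table_num tables (all object names are assumptions; STRING_AGG is available in Azure SQL Database):

-- Hypothetical procedure; dbo.CombinedTable and the name are assumptions
CREATE PROCEDURE dbo.usp_PopulateCombinedTable
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql NVARCHAR(MAX);

    -- Build one INSERT ... SELECT per dynamically created table
    SELECT @sql = STRING_AGG(
        CONVERT(NVARCHAR(MAX),
            'INSERT INTO dbo.CombinedTable SELECT * FROM ' + QUOTENAME(name) + ';'),
        CHAR(10))
    FROM sys.tables
    WHERE name LIKE 'table[_]%';  -- [_] matches a literal underscore

    TRUNCATE TABLE dbo.CombinedTable;
    EXEC sys.sp_executesql @sql;
END;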

From an SSIS package how should I execute a sql script stored in a table as a varchar(max)?

I'm storing large (varchar(max)) SQL scripts in a table. I'd like to execute the scripts in an SSIS package.
Looking at other posts on this site, it's easy enough to get the varchar(max) into an Object variable. But then what? Is there a way for an Execute SQL Task (SQLSourceType of Variable) to specify an Object variable rather than a String variable?
Is there an approach that will work?
Here's how I might approach it:
1. Add a Data Flow task to your control flow
2. Add an ADO.NET source that connects to your database
3. Create a package-level Object variable (for the next step)
4. Add a Recordset destination that populates the Object variable created in the previous step
Back on the control flow:
5. Create a package-level String variable for the "current" query (see next step)
6. Add a Foreach Loop container using the Foreach ADO enumerator
7. Connect the previous Data Flow task to the Foreach Loop
8. Configure the Foreach Loop to use the Object variable as its source, and to map the column index holding the SQL into the String variable
9. Add an Execute SQL task inside the Foreach Loop
10. Configure it to execute a SQL command from variable, and pick the String variable containing the current query
Basically, it collects the queries from the table; then, for each collected query, it assigns the query to the variable, and the Execute SQL task pulls its command text from that variable.
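As a sketch, the source query for the ADO.NET source (step 2) could be as simple as the following, with the table and column names assumed:

-- Hypothetical scripts table; names are illustrative
SELECT ScriptText
FROM dbo.SqlScripts
ORDER BY ExecutionOrder;

In the Foreach configuration, map column index 0 (the script column) to the String variable.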

Create Job To Convert Data With SSIS

I want to convert data from OldDatabase to NewDatabase each day using SSIS.
To do this:
1. Create a script from the existing new database.
2. Change the database name in the generated script, then execute the script to create NewDatabase2 with no data.
3. Set the SSIS configuration parameters.
4. Execute the SSIS package to convert the data.
I want to run these steps as a job. How should I do this?
Your requirements are not very clear, but there are two SSIS tasks that can help you.
If you want to transfer everything, you can use a "Transfer Database Task".
If you need to specify objects, you can use the "Transfer SQL Server Objects Task", where you can specify things like whether to drop the objects first and how to deal with the data.
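For the job part, a SQL Server Agent job with an SSIS step is the usual way to run the package daily; a minimal sketch, with the job name and package path as assumptions:

-- Minimal SQL Server Agent job running the package daily (names/paths assumed)
EXEC msdb.dbo.sp_add_job
     @job_name = N'Convert OldDatabase To NewDatabase';
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Convert OldDatabase To NewDatabase',
     @step_name = N'Run SSIS package',
     @subsystem = N'SSIS',
     @command   = N'/FILE "C:\Packages\ConvertData.dtsx"';
EXEC msdb.dbo.sp_add_jobschedule
     @job_name          = N'Convert OldDatabase To NewDatabase',
     @name              = N'Daily',
     @freq_type         = 4,       -- daily
     @freq_interval     = 1,       -- every 1 day
     @active_start_time = 010000;  -- 01:00:00
EXEC msdb.dbo.sp_add_jobserver
     @job_name = N'Convert OldDatabase To NewDatabase';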

Table Variables in SSIS

In one SQL Task, can I create a table variable:
DECLARE @TableVar TABLE (...)
and then, in another SQL Task or data source/destination, select from or insert into the table variable?
The other option I have considered is using a Temp Table.
CREATE TABLE #TempTable (...)
I would prefer to use a table variable so that it stays in memory, but I can use a temp table if a table variable is not possible. Also, I cannot use the Recordset destination, as I need to perform straight SQL tasks on it later on.
The use case this is trying to solve is essentially performing a transformation in place of BizTalk. There is a very large flat-file-to-flat-file transformation that BizTalk has to perform; unfortunately, the data volume would put unacceptable load on the BizTalk server, so the idea is to offload it to SSIS. However, it is not a simple row-to-row transformation: there are different types of rows which have relations to each other. The first task in SSIS is to load the rows into appropriate (temp) tables; then, in the second data task, a select is performed with the correct format for output.
You could use some of the techniques in this post: http://consultingblogs.emc.com/jamiethomson/archive/2006/11/19/SSIS_3A00_-Using-temporary-tables.aspx
especially the ones about using RetainSameConnection=TRUE on the connection manager.
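With RetainSameConnection=TRUE, consecutive Execute SQL Tasks share one session, so something like this sketch works (column definitions are placeholders):

-- Execute SQL Task 1: create the temp table (the session stays open)
CREATE TABLE #TempTable (Id INT, Payload VARCHAR(100));

-- Execute SQL Task 2, same connection manager: the table is still visible
INSERT INTO #TempTable (Id, Payload) VALUES (1, 'example');
SELECT Id, Payload FROM #TempTable;

You will usually also need DelayValidation=True (or ValidateExternalMetadata=False on the data flow components), so the package validates even though the temp table does not exist at design time.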
I would be interested to see more information about the use case that requires you to write data out to a temp table or table variable before further SSIS processing. Couldn't you take care of all the required SQL steps in your source query before you start processing the data flow with SSIS?
Table variables are not kept solely in memory and can be written to disk under memory pressure. I tend to use table variables for very small lookups. If you need to push a table into SQL Server for necessary and complex transformations, then use a 'permanent' temp table that is truncated within the SSIS package prior to insert. Simple, and it gets what you need done.
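A sketch of that truncate-then-insert pattern, with all names assumed:

-- One-time setup: a 'permanent' staging table
CREATE TABLE dbo.FlatFileStaging (RowType CHAR(1), RawLine VARCHAR(MAX));

-- First Execute SQL Task in each package run: clear the previous load
TRUNCATE TABLE dbo.FlatFileStaging;

-- The data flow then loads the flat file into dbo.FlatFileStaging,
-- and the later select produces the output in the required format.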
The SSIS package would be run in a job; I assume it runs inside a SQL Agent job. In that case, using a temp table won't hurt. SQL Agent jobs are generally run after office hours, so it does not matter.