I'm trying to find out how to load data from/to multiple tables in SQL Server.
I have 30+ tables in the source and target databases (both have the same table names and columns), so I used a Foreach Loop container to loop through each of the tables with one DFT. I also have variables (SourceQuery and TargetQuery) that hold the queries to use in the OLE DB source/destination.
But when I use TargetQuery in the OLE DB destination, I get an error.
Does anyone know a proper way to use a variable in the OLE DB destination? Any other solution would be great as well.
To copy 30 tables, I recommend using the Transfer SQL Server Objects Task component, which is much more efficient than using a data flow.
Here you can see how to pass variables:
Transfer SQL Server Objects Task
I have two tables, A and B, in an Azure SQL Database. I have a clone of the same database running locally, and I want to populate it with the data in Azure by using the SSMS Export Data option. When using that option I specify the source and destination and then choose "Write a query to specify the data to transfer".
I then add the query "SELECT * FROM A WHERE Condition1" and select the destination table.
The issue is that if I have 5 tables to export data from, I have to do this whole process 5 times; the only differences are the queries and the destination tables. Does anyone have an idea how I can do this whole thing faster by some other means? I just need to copy data using some SELECT statements with WHERE clauses.
As per the official documentation:
When you select Write a query to specify the data to transfer, you can only copy the results of one query to one destination table.
So you have to repeat the entire process multiple times if you want to export data that way.
You can use the following ways for importing and exporting data:
Use Transact-SQL statements.
Use BCP (Bulk Copy Program) from the command prompt.
If you want to design a custom data import, you can use SQL Server Integration Services.
Use Azure Data Factory.
Use a BACPAC file. Refer to this material by AccuWebHosting to learn more about it. Rather than filtering with a query before exporting, you can export everything and then delete the unwanted rows in the destination database with a DELETE statement, as in the sketch below.
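A minimal sketch of that cleanup, assuming table A from the question; the date filter is purely illustrative and stands in for the question's Condition1:

```sql
-- After exporting all rows of table A, remove the rows the original
-- WHERE clause would have excluded. The date filter below is purely
-- illustrative; substitute the actual Condition1.
DELETE FROM dbo.A
WHERE NOT (CreatedDate >= '2022-01-01');
```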
REFERENCE:
Import & export data from SQL Server & Azure SQL Database - SQL Server | Microsoft Docs
I need suggestions for moving data from a particular table in one Azure SQL database to another Azure SQL database that has the same table structure, without using elastic query.
Use SQL Server Management Studio to connect to the Azure SQL database, right-click the source database, and select Generate Scripts.
In the wizard, after selecting the tables that you want to output to a query window, click Advanced. About halfway down the properties window there is an option called "Types of data to script". Change it to "Data only", then finish the wizard.
Then check the script, rearrange the INSERTs to satisfy constraints, and change the USE statement at the top so it runs against the target DB.
Then right-click the target database, select New Query, copy the script into it, and run it.
This will migrate the data.
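For reference, the data-only script the wizard generates looks roughly like this (table and column names are illustrative); the USE line and the INSERT order are the parts you would edit:

```sql
USE TargetDb;  -- change this so the script runs against the target DB
GO

-- Parent rows must be inserted before child rows that reference them.
SET IDENTITY_INSERT dbo.Customers ON;
INSERT dbo.Customers (CustomerId, Name) VALUES (1, N'Contoso');
INSERT dbo.Customers (CustomerId, Name) VALUES (2, N'Fabrikam');
SET IDENTITY_INSERT dbo.Customers OFF;
GO

INSERT dbo.Orders (OrderId, CustomerId, Total) VALUES (10, 1, 99.50);
```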
Please consider using the Transfer SQL Server Objects task in SSIS. You can learn about all the advantages it provides in this article.
You can use PowerShell to query each database and move data between them as needed. Here's an example article on how to get this done.
Using PowerShell with Azure has a number of other benefits in what you can do and control as well. It's a good tool to spend time learning.
In the source database, I created stored procedures to select the data from the tables.
In the target database, I created table types (available under Programmability > Types) for the tables, with the same structure as in the source.
I used an Azure Function to move the data from the source into the table types.
In the target database, I created stored procedures to insert data into the tables from their respective table types.
After confirming the transfer, I delete the moved records from the source database, again using stored procedures created for that purpose. A rough sketch of the table type and insert procedure is shown below.
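A minimal sketch of that pattern, assuming an illustrative Customers table; the type, procedure, and column names are made up for the example:

```sql
-- Target database: a table type mirroring the source table's structure.
CREATE TYPE dbo.CustomerTableType AS TABLE
(
    CustomerId INT           NOT NULL,
    Name       NVARCHAR(100) NOT NULL
);
GO

-- Target database: insert procedure that takes the table type as a
-- table-valued parameter; the Azure Function passes the source rows in.
CREATE PROCEDURE dbo.usp_InsertCustomers
    @Rows dbo.CustomerTableType READONLY
AS
BEGIN
    INSERT INTO dbo.Customers (CustomerId, Name)
    SELECT CustomerId, Name
    FROM @Rows;
END;
```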
I was wondering if someone could help me with creating a while loop to iterate through multiple databases (100 databases) and drop/truncate the tables (around 60 tables in each database) within them. Thank you.
My task is to create an SSIS package to move the data from the source database to the target database. Data in the destination needs to be truncated as part of the process: whenever I run the package, the old data must be truncated and the new data inserted.
Kindly help.
Thank you.
I will just explain how I would go about this problem.
This assumes the source and destination databases and tables have similar schemas.
1) Create a table in a separate database that holds the details of all the source and destination databases and their tables.
2) Get the database details from that table using an Execute SQL task, and use a Foreach Loop container to loop the logic: truncate the destination table, then move the data from the source database to the destination (one source database after another).
3) Use dynamic SQL and stored procedures for moving the data from the source database to the destination database, or use a Data Flow task if you don't want stored procedures or dynamic queries. A rough sketch of the dynamic SQL follows.
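A sketch of the dynamic SQL from step 3, assuming a hypothetical control table dbo.TableList with columns SourceDb, TargetDb, and TableName (in the package these values would come from the Execute SQL task and Foreach Loop variables):

```sql
DECLARE @SourceDb SYSNAME, @TargetDb SYSNAME, @Table SYSNAME,
        @Sql NVARCHAR(MAX);

-- Loop over the hypothetical control table holding all database pairs.
DECLARE table_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT SourceDb, TargetDb, TableName FROM dbo.TableList;

OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @SourceDb, @TargetDb, @Table;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Truncate the destination table, then reload it from the source.
    SET @Sql = N'TRUNCATE TABLE ' + QUOTENAME(@TargetDb) + N'.' + @Table + N';'
             + N' INSERT INTO '   + QUOTENAME(@TargetDb) + N'.' + @Table
             + N' SELECT * FROM ' + QUOTENAME(@SourceDb) + N'.' + @Table + N';';
    EXEC sys.sp_executesql @Sql;

    FETCH NEXT FROM table_cursor INTO @SourceDb, @TargetDb, @Table;
END

CLOSE table_cursor;
DEALLOCATE table_cursor;
```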
Hope it helps you! :)
I need to store all the SQL queries under one folder and reference them in the SSIS package, to better organize the list of SQL files I am using, so we can easily change a SQL file later without having to rebuild the package. This includes all queries I am using for Execute SQL tasks as well as the queries in the OLE DB source under the Data Flow component.
Under the Data Flow task, instead of putting the SQL query for the source database into the query window, I see four options under Data access mode for the OLE DB source:
1. Table or View
2. Table or view name variable
3. SQL Command
4. SQL Command from variable
I understand I could use a variable, store the query in the variable, and reference it in the Execute SQL task, but I am looking for a way to store all the queries in SQL files and use them in the Data Flow component as well as in the Execute SQL task. I can't seem to find a way to make the Data Flow task dynamic. Can anyone help with this, please?
I don't think storing them as SQL files is any good for any type of scenario.
Here is what I would do given what you have described.
You can store the queries as varchar in a database table instead of as files. Then you can loop over the result set with a Foreach Loop and map each row to the variable that you then use as the query for your OLE DB source in the data flow.
So create a variable and make a Foreach Loop that loops over "SELECT query FROM dbo.queries" or whatever. Set the output to the variable you created, and inside the container create your data flow and set the source either with an expression or with "SQL command from variable". A minimal sketch of the query table follows.
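A sketch of that table, assuming hypothetical names dbo.Queries, QueryName, and QueryText:

```sql
-- Hypothetical store for the package's queries.
CREATE TABLE dbo.Queries
(
    QueryName VARCHAR(100) NOT NULL PRIMARY KEY,
    QueryText VARCHAR(MAX) NOT NULL
);

INSERT INTO dbo.Queries (QueryName, QueryText)
VALUES ('LoadCustomers', 'SELECT CustomerId, Name FROM dbo.Customers'),
       ('LoadOrders',    'SELECT OrderId, Total FROM dbo.Orders');

-- The Execute SQL task that feeds the Foreach Loop would simply run:
SELECT QueryText FROM dbo.Queries;
```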
As for the control flow queries, why not just have them be stored procedures that you can change when you need to?
My source and destination tables exist on different servers. I am using an Execute SQL Task to write MERGE statements to synchronize them.
Could anyone explain how I can reference two different databases that exist on different servers inside my Execute SQL Task?
Possible approaches:
I would suggest the following approaches instead of trying to use a MERGE statement within an Execute SQL Task across two database servers.
Approach #1:
Create two OLE DB connection managers, one for each SQL Server instance. For example, if you have two databases SourceDB and DestinationDB, you could create connection managers named OLEDB_SourceDB and OLEDB_DestinationDB. You could also use an ADO.NET connection manager if you prefer; based on the SSIS books I have read, OLE DB performs better than ADO.NET.
Drag and drop a Data Flow Task on the Control Flow tab.
Within the Data Flow Task, configure an OLE DB Source to read the data from source database table.
Use a Lookup transformation to check whether the data already exists in the destination table, using the unique key shared by the source and destination tables.
If the source table row does not exist in the destination table, then insert the row into the destination table using an OLE DB Destination.
If the source table row exists in the destination table, then insert the row into a staging table on the destination database using another OLE DB Destination.
Place an Execute SQL Task after the Data Flow Task on the Control Flow tab. Write a query that updates the destination table from the staging table data (a sketch follows the link below).
Check the answer to the SO question below for detailed steps.
How do I optimize Upsert (Update and Insert) operation within SSIS package?
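The post-data-flow update might look like this sketch; the staging table and key column names are illustrative:

```sql
-- Update destination rows from the staging rows captured by the lookup path.
UPDATE d
SET    d.Name  = s.Name,
       d.Email = s.Email
FROM   dbo.Customers AS d
       INNER JOIN dbo.Staging_Customers AS s
               ON s.CustomerId = d.CustomerId;

-- Clear the staging table so the next package run starts empty.
TRUNCATE TABLE dbo.Staging_Customers;
```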
Approach #2:
Create two OLE DB connection managers, one for each SQL Server instance. For example, if you have two databases SourceDB and DestinationDB, you could create connection managers named OLEDB_SourceDB and OLEDB_DestinationDB.
Drag and drop a Data Flow Task on the Control Flow tab.
Within the Data Flow Task, configure an OLE DB Source to read the data from the source database table and insert it into a staging table using an OLE DB Destination.
Place an Execute SQL Task after the Data Flow Task on the Control Flow tab. Write a query that uses a MERGE statement between the staging table and the destination table, along the lines of the sketch below.
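A sketch of that MERGE, again with illustrative table and column names:

```sql
-- Upsert the destination table from the staging table in one statement.
MERGE dbo.Customers AS target
USING dbo.Staging_Customers AS source
    ON target.CustomerId = source.CustomerId
WHEN MATCHED THEN
    UPDATE SET target.Name  = source.Name,
               target.Email = source.Email
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerId, Name, Email)
    VALUES (source.CustomerId, source.Name, source.Email);
```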
See this link - http://technet.microsoft.com/en-us/library/cc280522%28v=sql.105%29.aspx
Basically, to do this you would need to get the data from the different servers into the same place with Data Flow tasks, and then use an Execute SQL task to perform the merge.
The Merge and Merge Join SSIS data flow transformations don't look like they do what you want to do.