Deleting or not deleting rows in SSIS package destination table - sql-server-2005

I used the Import/Export Wizard in SQL Server 2005 Management Studio to export rows from an Excel sheet to a SQL table and checked the "Do not delete rows in destination table" option. I saved the export operation as an SSIS package, and yes, new rows are being appended to existing ones, but now I have a requirement to delete all rows in the destination SQL table first.
When I go into BIDS to edit my package, I can't find the option to change this behavior anywhere. Does anyone know how to change this setting in the SSIS package designer?

When you enable the delete-destination-rows option, an Execute SQL Task is added to the package's Control Flow. This task executes a TRUNCATE TABLE statement against the destination. In your package, the Control Flow probably contains a single Data Flow Task. Just add an Execute SQL Task and connect it to the Data Flow Task so the Execute SQL Task runs first. Assign the destination connection manager to the Execute SQL Task and set the SQL statement to TRUNCATE TABLE [YourDestinationTable].
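For illustration, a minimal sketch of the statement the Execute SQL Task would run, assuming a hypothetical destination table dbo.ImportedRows:
-- Empty the destination before the Data Flow Task appends the new rows.
TRUNCATE TABLE dbo.ImportedRows;
-- TRUNCATE TABLE is not allowed if the table is referenced by foreign keys;
-- in that case, fall back to a plain DELETE:
-- DELETE FROM dbo.ImportedRows;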

Related

Copy 10k rows from table to table on other server

I cannot use a linked server.
Both databases on both servers have the same structure but different data.
I have 10k rows to transfer from the DB on one server to the same DB on the other. I cannot restore the DB on the other server, as it would take a huge amount of space that I don't have there.
So, I have two options that I don't know how to execute:
Backing up and restoring only one table - the table is linked to many other tables, but those other tables exist on the other server too. Can I somehow delete or remove the other tables, or make a backup of only one table?
Generating insert scripts - I need to transfer 10k rows. How is it possible to create 10k INSERT statements based on the selected 10k rows?
Can I somehow delete or remove the other tables or make a backup only over one table?
No, you cannot do this, unfortunately.
How is it possible to create 10k insert queries based on selected 10k rows?
Right-click on the database -> Tasks -> Generate Scripts -> (Introduction) Next.
Choose "Select specific database objects" -> Tables, choose the table you need -> Next.
Advanced -> find "Types of data to script" and change it from "Schema only" (the default) to "Data only" -> OK.
Choose where to save -> Next -> Next. Wait for the operation to finish.
This will generate a file with the 10k INSERTs.
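The generated file is just a series of INSERT statements, one per row, along the lines of this sketch (table and column names are hypothetical):
INSERT INTO dbo.Orders (OrderID, CustomerName) VALUES (1, N'Alice');
INSERT INTO dbo.Orders (OrderID, CustomerName) VALUES (2, N'Bob');
-- ...one INSERT per selected row, 10k in total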
Another way is to use the Import/Export Wizard (the simplest way for a one-time import/export) if you have a link between the databases.
There are many ways to choose from; here is one way, using BCP. That's a tool that ships with SQL Server to import and export bulk data.
The outline of the process:
Export the data from the source server to a file using BCP - BCP OUT for a whole table, or BCP QUERYOUT with a query to select the 10k rows you want exported (see the sketch below).
Copy the file to the destination server.
Import the data using BCP on the destination database - BCP IN.
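A minimal sketch of the two bcp calls, assuming hypothetical server names SourceServer and DestServer, a database MyDB, a table dbo.Orders, and Windows authentication (-T):
:: Export the rows to a native-format (-n) file on the source server.
bcp "SELECT TOP (10000) * FROM MyDB.dbo.Orders" queryout C:\temp\orders.dat -S SourceServer -T -n
:: Copy orders.dat to the destination server, then import it there.
bcp MyDB.dbo.Orders in C:\temp\orders.dat -S DestServer -T -n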
My suggestion would be to export these rows to Excel (you can do this by copy-pasting your query output), transfer the file to the other server, and import it there.
This is the official method:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql
and this is the unofficial method:
http://therealdanvega.com/blog/2010/08/04/create-sql-insert-statements-from-a-spreadsheet.
Here I have assumed that you only need to transfer the transactional data and that your reference data is the same on both servers, so you will need to execute only one query to export your data.
I would definitely go down the SSIS route; once you use SSIS for a task like this, you will not use anything else, as it is very simple to script up. You can use any version, and it will be a simple job and very quick.
Open a new SSIS project in whatever Visual Studio version you have available; there are many different ones, but even the 2008 version will handle this simple task. You may have to install Integration Services or something similar; it used to be called BIDS (Business Intelligence Development Studio) in 2008. Anything up to Visual Studio 2015 works, although support is nearly there in 2017.
Add a Data Flow Task.
Double-click the Data Flow Task.
At the bottom of the screen, add two connection managers (one to the source and one to the destination database).
Add an OLE DB Source pointing to the source database table.
Add an OLE DB Destination pointing to the destination database table.
Drag a line between the source and destination (it should auto-map all columns that have the same names).
Hit Start and the data will flow very quickly.
You can create a DbInstaller; using DbInstaller you can share the whole database. DbInstaller works with both ADO.NET and Entity Framework, but I have been using Entity Framework.
You can do it with a SQL Server query.
First, switch to the first database:
USE database1  -- this will be your first database
After selecting the first database, put the table rows into a temp table with this query:
SELECT * INTO #Temp FROM table1;
Now switch to the second database and insert the temp table data into the second database's table with this code (a #temp table is scoped to the session rather than the database, so this works as long as both databases are on the same instance):
USE secondDatabaseName;
INSERT INTO tableNameToInsert (col1, col2)
SELECT col1, col2 FROM #Temp;

Data transfer from a view from one database to a table in another database

I am just trying to find out whether this is the right way to do this task.
Any other suggestions to improve this are greatly appreciated.
I have the following in my SSIS package:
A Data Flow Task, with an OLE DB connection established to the source database where the view is.
An Execute SQL Task - I am executing an INSERT INTO Destination ... EXCEPT query (excluding all those records that are already there from the source); a sketch of that pattern is shown below.
A Send Mail task to send out an email.
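For reference, the INSERT ... EXCEPT pattern described above might look like this sketch (table and column names are hypothetical):
-- Insert only the source rows that are not already in the destination.
INSERT INTO dbo.DestinationTable (Col1, Col2)
SELECT Col1, Col2 FROM dbo.SourceView
EXCEPT
SELECT Col1, Col2 FROM dbo.DestinationTable;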
How do I know that the data transfer was successful, so that I can use the Send Mail task to indicate success or failure?
How do I schedule this package so that it runs automatically (every Tuesday)?
I have tried the suggestion below. Please refer to the new Data Flow Task:
OLE DB Source - points to a view on database server 1.
A Lookup gets all the rows from the OLE DB Source (the row count on the source and on the Lookup matches).
On the Lookup task, I have configured the error output to use 'Redirect row' on all the mapped columns.
The OLE DB Destination points to a destination table that already has a subset of the records from the source, so I configured the error output to send the unmatched rows there for insert.
When I execute the package, I get a primary key constraint error: cannot insert duplicate key.
Any suggestions?
You will want to double-click the connector from the Execute SQL Task to the Send Mail Task. Currently it's green, which indicates it will only take that path on Success. You will want to update the precedence constraint to fire on Completion, since you don't care whether it's Success or Failure.
It sounds like you have your data flow pulling all of the data from your source and writing to a staging table. In your Execute SQL Task, you then use a query to add data into your target table where it doesn't exist.
This can be consolidated into a single Data Flow. Between your OLE DB Source and OLE DB Destination, add a Lookup task. Since you are on 2005, the Lookup behaves a bit differently than in 2008+. You will write a query that pulls back the business keys in your target table and then compares that to what is coming from your OLE DB Source. Map those keys in the interface.
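A minimal sketch of the Lookup's reference query, assuming a hypothetical business key OrderID on a target table dbo.TargetOrders:
-- Pull back only the business keys; the Lookup compares these against the source rows.
SELECT OrderID FROM dbo.TargetOrders;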
You only want the rows that aren't matched, so you will need to get the "unmatched records" from the Lookup. In 2005, the option for Unmatched output didn't exist, so you will need to route the Error output to your OLE DB Destination.
Andy Leonard has a nice little writeup on how to accomplish this: Configuring an SSIS 2005 Lookup Transformation for a Left Outer Join. The only difference for your case is that you don't care about the matched rows. Instead of Ignore Failure, you want to select Redirect Row. Then, when you go to connect the Lookup to the OLE DB Destination, you will be presented with two options: the green connector is the matched rows, the red connector is the unmatched rows. Tie the red path to your destination.

SSIS - Extract multiple databases based on lookup table

How do I create a package that extracts multiple databases (and all tables in each database) from another server, based on a lookup table (also found on the other server) that contains a column listing all the databases I need to extract?
I need to use the lookup table because new databases are created from time to time on the source, so I cannot just create a job that extracts a "static set" of databases to a destination. It needs to be a bit dynamic...
Furthermore, I also need to extract the databases incrementally, using a timestamp that exists in all databases/tables.
I'm new to SSIS, so an "easy" guide would be appreciated.
Thanks
As a rough idea, you could work with SSIS Package Configurations and with executing packages from within packages, and then use the Transfer SQL Server Objects Task:
Make a "Main package" that iterates over the column in your lookup table.
For each entry, it should UPDATE the Package Configuration entry of your second SSIS package accordingly (see the sketch after these steps). Use the "SQL Server" configuration type for that second package.
The Main package should then execute the second package - there is also a task for this.
The second package looks at its configuration to find out which server to get databases from and uses the Transfer SQL Server Objects Task to do so.
Then the Main package continues with the next entry from your lookup table.
Ideally you would want to have your "second SSIS package" inside SQL Server's MSDB rather than on the file system. It's easier to execute.
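A minimal sketch of the Main package's UPDATE, assuming the default SQL Server configuration table [SSIS Configurations] and a hypothetical configuration filter and variable name; the ? placeholder would be mapped to the current lookup-table entry in an Execute SQL Task:
-- Point the second package's SourceServer variable at the next entry.
UPDATE dbo.[SSIS Configurations]
SET ConfiguredValue = ?
WHERE ConfigurationFilter = 'SecondPackageConfig'
  AND PackagePath = '\Package.Variables[User::SourceServer].Properties[Value]';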

How can I use MERGE statement across multiple database servers?

My source and destination tables exist on different servers. I am using an Execute SQL Task to write MERGE statements to synchronize them.
Could anyone explain how I can reference two different databases that exist on different servers inside my Execute SQL Task?
Possible approaches:
I would suggest the following approaches instead of trying to use a MERGE statement within an Execute SQL Task between two database servers.
Approach #1:
Create two OLE DB connection managers, one to each of the SQL Server instances. For example, if you have two databases SourceDB and DestinationDB, you could create two connection managers named OLEDB_SourceDB and OLEDB_DestinationDB. You could also use an ADO.NET connection manager if you prefer; based on what I have read in SSIS books, OLE DB performs better than the ADO.NET connection manager.
Drag and drop a Data Flow Task on the Control Flow tab.
Within the Data Flow Task, configure an OLE DB Source to read the data from the source database table.
Use a Lookup Transformation to check whether the data already exists in the destination table, using the unique key shared by the source and destination tables.
If the source table row does not exist in the destination table, insert it into the destination table using an OLE DB Destination.
If the source table row does exist in the destination table, insert it into a staging table on the destination database using another OLE DB Destination.
Place an Execute SQL Task after the Data Flow Task on the Control Flow tab. Write a query that updates the data in the destination table using the staging table data (a sketch is shown below).
Check the answer to the below SO question for detailed steps.
How do I optimize Upsert (Update and Insert) operation within SSIS package?
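A minimal sketch of that post-data-flow update (the Execute SQL Task query from the last step above), assuming hypothetical tables dbo.DestinationTable and dbo.StagingTable keyed on BusinessKey:
-- Apply the changed rows captured in the staging table to the destination.
UPDATE d
SET d.Col1 = s.Col1,
    d.Col2 = s.Col2
FROM dbo.DestinationTable AS d
INNER JOIN dbo.StagingTable AS s
    ON d.BusinessKey = s.BusinessKey;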
Approach #2:
Create two OLE DB connection managers, one to each of the SQL Server instances. For example, if you have two databases SourceDB and DestinationDB, you could create two connection managers named OLEDB_SourceDB and OLEDB_DestinationDB.
Drag and drop a Data Flow Task on the Control Flow tab.
Within the Data Flow Task, configure an OLE DB Source to read the data from the source database table and insert it into a staging table using an OLE DB Destination.
Place an Execute SQL Task after the Data Flow Task on the Control Flow tab. Write a query that uses the MERGE statement between the staging table and the destination table.
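A minimal MERGE sketch for that Execute SQL Task, again assuming hypothetical tables dbo.StagingTable and dbo.DestinationTable keyed on BusinessKey:
-- Upsert: update the matching rows, insert the missing ones.
MERGE dbo.DestinationTable AS tgt
USING dbo.StagingTable AS src
    ON tgt.BusinessKey = src.BusinessKey
WHEN MATCHED THEN
    UPDATE SET tgt.Col1 = src.Col1, tgt.Col2 = src.Col2
WHEN NOT MATCHED THEN
    INSERT (BusinessKey, Col1, Col2)
    VALUES (src.BusinessKey, src.Col1, src.Col2);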
See this link - http://technet.microsoft.com/en-us/library/cc280522%28v=sql.105%29.aspx
Basically, to do this, you would need to get the data from the different servers into the same place with Data Flow Tasks, and then use an Execute SQL Task to do the merge.
The Merge and Merge Join SSIS Data Flow transformations don't look like they do what you want to do.

How to back up the transaction log after the database backup every day in SQL Server 2005

How to back up the transaction log after the database backup every day in SQL Server 2005?
Why not just use SSIS to back up the log? It can do the backup and then copy the file to where it needs to be.
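A minimal sketch of the T-SQL such a package would run, assuming a hypothetical database MyDB and a local backup folder:
-- Full database backup first...
BACKUP DATABASE MyDB TO DISK = N'C:\Backups\MyDB_full.bak';
-- ...then the transaction log backup (requires the FULL or BULK_LOGGED recovery model).
BACKUP LOG MyDB TO DISK = N'C:\Backups\MyDB_log.trn';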
UPDATE:
You can look at this question; it talks about how to go from a SQL Server 2005 query to an Excel file:
http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/DTS/Q_23090779.html
The useful answer is:
1. Create a stored procedure that will have the output you need to export to Excel (a sketch follows after these steps).
2. In the DTS package, add a SQL connection and an Excel connection. The SQL connection should point to your server and DB, and the Excel connection to your file. If the file doesn't exist, just create one on the fly.
3. Create the transformation task between the SQL connection and the Excel connection.
4. Double-click on the arrow; in the Transform Data Task Properties window, on the Source tab, pick SQL query instead of Table/View. In the panel below, type EXEC sprocname, where sprocname is the name of your procedure from step 1.
5. Click on the Destination tab; if the file/worksheet doesn't exist, a dialog window for creating it will open. Edit it if you want and click OK.
6. In the Transformations tab, define your transformation by matching the columns.
7. Run.
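A minimal sketch of the stored procedure from step 1, with hypothetical names:
-- Returns the result set that the DTS transformation writes to Excel.
CREATE PROCEDURE dbo.usp_ExportForExcel
AS
BEGIN
    SET NOCOUNT ON;
    SELECT Col1, Col2
    FROM dbo.SourceTable;
END;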
If you want to run this automatically in an ongoing manner, you need to define a Dynamic Properties task where you can edit the Excel connection to generate a file name that includes a timestamp (you can use a SQL statement as well), and then use an ActiveX task to create/copy the file from an existing structure file.
So:
Dynamic Properties Task ---> ActiveX task (copy from the structure file to the newly generated file) ---> SQL conn ---> Excel conn.