Optimal way to load data from SQL Server to DB2

We have 40+ tables in a SQL Server database and we need to copy the data to an IBM DB2 database. What methods do you recommend to accomplish this?
My analysis so far:
BCP and data import - the team is trying to avoid any BCP files.
Write a stored procedure that uses a LINKED SERVER in SQL Server and inserts the data into DB2.
SSIS packages to move the data.
Please let us know if you have a better way to approach this.

Have you considered Information Integration, which is known in DB2 as federation? You can query SQL Server directly from DB2, and with this feature you can declare a cursor over the remote table and then just use the LOAD command.
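A minimal sketch of that approach, assuming the federated server and wrapper are already configured; the server, nickname, and table names (MSSQL_SRV, SQLSERVER_ORDERS, DB2_ORDERS) are placeholders:

-- Run from the DB2 command line processor; all object names are placeholders
CREATE NICKNAME SQLSERVER_ORDERS FOR MSSQL_SRV."dbo"."Orders";
DECLARE C1 CURSOR FOR SELECT * FROM SQLSERVER_ORDERS;
LOAD FROM C1 OF CURSOR INSERT INTO DB2_ORDERS;

LOAD FROM CURSOR avoids materializing any intermediate flat files, which seems to match your requirement of avoiding BCP files.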

Related

Update from linked server (mysql) to local sql database.

I am looking for a way to set up a scheduled update from a linked server I created to a local database. I am not familiar with triggers, but from what I've read you have to set them up on the originating server, and I only have read access to the MySQL database. Basically, all I am trying to do is make a local copy of two tables from the MySQL db. I can do so manually with SELECT INTO statements, but I would like some automation if possible. Any thoughts on how to achieve this? Also, I am using SQL Server 2008 R2. Thanks!
You have several options:
Copy all data from the source table (do not use this if the source table is big).
If there is a column in the source table that can be used to determine which records should be copied, use that (this is usually an auto-updated timestamp column in MySQL).
Set up a trigger to track modifications.
To copy the data, you can set up a linked server or you can use SSIS.
With a linked server you can use OPENQUERY(), as sketched after this list.
You can schedule the task with SQL Server Agent.
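A minimal sketch of the linked-server option, assuming a linked server named MYSQL_LNK and a hypothetical customers table with an auto-updated updated_at column; schedule it as a SQL Server Agent job step:

-- Incremental copy into a local table; all object names are placeholders
INSERT INTO dbo.customers_copy (id, name, updated_at)
SELECT id, name, updated_at
FROM OPENQUERY(MYSQL_LNK, 'SELECT id, name, updated_at FROM customers')
WHERE updated_at > (SELECT ISNULL(MAX(updated_at), '1900-01-01') FROM dbo.customers_copy);

Note that the WHERE filter is applied on the SQL Server side after OPENQUERY pulls the rows, so for a large source table it is better to build the timestamp predicate into the pass-through query string itself.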

SQL: export database using T-SQL

I have a database connection to database DB1. The only thing I can do is execute T-SQL statements, including stored procedures. I want to export a specific table (or even specific rows of a specific table) to my local database. As you can read above, the databases are on different servers, meaning no direct connection is possible. Therefore the question: is it possible to write a query that returns another query to execute on the local server to get the data? Also note that the table contains BLOBs. Thanks.
If you have SQL Server Management Studio, you can use the data import function on your local database to get the data. It works as long as you have Read/Select access on the tables you are trying to copy.
If you have Visual Studio you can use the database tools in there to move data between two servers as long as you can connect to both from your workstation.
Needs Ultimate or Premium though:
http://msdn.microsoft.com/en-us/library/dd193261.aspx
Red Gate has some useful tools too:
http://www.red-gate.com/products/sql-development/sql-compare/features
Maybe you should ask at https://dba.stackexchange.com/ instead.
If you can log in to the remote db (where you can only issue T-SQL), you can create a linked server on your local server pointing to the remote one and then use it directly in queries, like:
select * from [LinkedServerName].[DatabaseName].[SchemaName].[TableName]
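For completeness, a minimal sketch of setting up such a linked server with the standard system procedures; the server name, login, and table names below are placeholders:

-- Register the remote SQL Server instance as a linked server
EXEC sp_addlinkedserver @server = N'RemoteSrv', @srvproduct = N'', @provider = N'SQLNCLI', @datasrc = N'remote-host-name';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'RemoteSrv', @useself = 'FALSE', @rmtuser = N'remote_login', @rmtpassword = N'remote_password';

-- Pull a table (including BLOB columns) into a new local table
SELECT * INTO dbo.LocalCopy
FROM [RemoteSrv].[DatabaseName].[dbo].[TableName];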

Microsoft SQL Server: How to export data from a database and import them to another database?

How can I export all of the rows in a table to a SQL script in Microsoft SQL Server 2005 and then import them into another database?
Thanks in advance.
If you're moving it to another SQL Server database, you can right-click the database you want and choose Tasks -> Generate Scripts. That will launch a wizard - follow along and choose the option to script all tables and data. Then execute that script in the new database (assuming you've already created one with the same name).
If you can't find a data import/export tool that will work in your particular circumstances, it's possible to write plain SQL SELECT queries that will generate SQL INSERT statements. In this way it's possible to "export" all your data to a script file that can be run against the destination database. It's kind of an ugly hack, but it's simple and it works if you don't have a lot of data to move. See my answer to this question for details: Export SQL Server 2005 query result to SQL INSERT statement?
Note that this method assumes that the destination table already exists. But it's pretty straightforward to generate table creation scripts, as J Cory's answer has already shown.
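As a rough illustration of that hack, a query of the following shape emits one INSERT statement per row; the table and column names (dbo.Customers, CustomerID, Name) are made up for the example:

-- Each result row is itself an INSERT statement you can paste into a script file
SELECT 'INSERT INTO dbo.Customers (CustomerID, Name) VALUES ('
       + CAST(CustomerID AS varchar(10)) + ', '''
       + REPLACE(Name, '''', '''''') + ''');'
FROM dbo.Customers;

The REPLACE call doubles embedded single quotes so the generated statements stay valid; NULLs, BLOBs, and datetime columns need extra care, which is part of why this only scales to small amounts of data.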
There's a command line tool available to dump your data from particular tables into a SQL script that can be executed against a different database:
http://blog.sqlauthority.com/2007/11/16/sql-server-2005-generate-script-with-data-from-database-database-publishing-wizard/
I don't believe SQL Management Studio Express supports data scripting (as your screenshot on J Cory's answer shows), but the full version does support that feature. In either case, the command line tool should accomplish what you need.

SSIS and MySQL - Table Name Delimiter Issue

I am trying to insert rows into a MySQL database from an Access database using SQL Server 2008 SSIS.
TITLE: Microsoft SQL Server Management Studio
------------------------------
ERROR [42000] [MySQL][ODBC 5.1 Driver][mysqld-5.0.51a-community-nt]You have
an error in your SQL syntax; check the manual that corresponds to your MySQL
server version for the right syntax to use near '"orders"' at line 1
The problem is with the delimiters. I am using the 5.1 ODBC driver, and I can connect to MySql and select a table from the ADO.Net destination data source.
The MySql tables all show up delimited with double-quotes in the SSIS package editor:
"shipto addresses"
Removing the double quotes from the "Use a table or view" text box on the ADO.NET Destination Editor or replacing them with something else does not work if there is a space in the table name.
When SSIS puts the Insert query together, it retains the double quotes and adds single quotes.
The error above is shown when I click on "Preview" in the editor, and a similar error is thrown when I run the package (albeit then from the actual insert statement).
I don't seem to have control over this behavior. Any suggestions? Other package types where I can hand-code the SQL don't have this problem.
Sorry InnerJoin, I had to take the accepted answer away from you. I found a workaround here:
The solution is to reuse the connection for all tasks, and to turn ANSI quotes on for the connection before you do any inserts, with an Execute Sql task that runs the following:
set sql_mode = 'STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES';
Try using square brackets around the table names. That may help.
EDIT: If you can, I would create views (with no spaces) based on the Access tables, and use those to export. Even if it means building another Access database with linked tables, I think this is your best bet.
I've always struggled with using SSIS with MYSQL directly. Even after installing the ODBC drivers, they just don't play well in data flows. I've always ended up creating linked ODBC connections between SQL Server and MYSQL. I then rely on linked server queries to bring over data. Instead of using a SSIS data flow task, I use an Execute SQL command, usually in the form of a stored procedure that executes an OPENQUERY.
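A minimal sketch of that pattern for pushing rows into MySQL through a linked server; the linked server name (MYSQL_LNK) and the table and column names are placeholders:

-- INSERT through OPENQUERY: the pass-through SELECT defines the target columns on the MySQL side
INSERT INTO OPENQUERY(MYSQL_LNK, 'SELECT OrderID, CustomerName FROM orders')
SELECT OrderID, CustomerName
FROM dbo.StagingOrders;

Wrapping this in a stored procedure keeps the SSIS side down to a single Execute SQL task.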
One option is to load the data into a SQL Server database and use it as a staging environment before you load it into the MYSQL database. I regularly move data between SQL Server 2008 and MYSQL, and in the past I used to regularly move data between Access and SQL Server.
Another possible solution is to transform the incoming Access data before it loads into the MYSQL database. That may give you a chance to clean up the column names and the actual data that's going through to MYSQL.
Let me know if either of these work for you.
You can locate the configuration file my.ini at <<Drive>>:\ProgramData\MySQL\MySQL Server 5.6\my.ini and add ANSI_QUOTES to sql-mode,
e.g.: sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES". This should solve the issue while previewing in the SSIS editor.

Move Data from Oracle to SQL Server

I would like to copy parts of an Oracle DB to a SQL Server DB. I need to move the data because the Oracle box is being decommissioned. I only need the data for reference purposes, so I don't need indexes, stored procedures, constraints, etc. All I need is the data.
I have a link to the Oracle DB in SQL Server. I have tested the following query, which seemed to work just fine:
select *
into NewTableName
from linkedserver.OracleTable
I was wondering if there are any potential issues with using this approach?
Using SSIS (SQL Server Integration Services) may be a good alternative, especially if your table names are the same on both servers. Use the Import and Export Wizard and it should create the destination tables for you and let you edit any mappings.
The only issue I see with that is that you will need to execute it for each and every table you need. Glad you are decommissioning the Oracle server :-). Otherwise, if you are not concerned with indexes or any of the existing sprocs, I don't see any issue with what you are doing.
The "select " approach could be very slow if tables are large. Consider writing pro*C in that case or use Fastreader http://www.wisdomforce.com/products-FastReader.html
A faster and easier approach might be to use the Data Transformation Services, depending on the number of objects you're trying to copy over.