In SQL Server, I have two tables with different schemas. The tables are in two separate databases on two servers.
How can I copy the content from one table to the other? I have several million rows to move.
Does this query work for tables on different servers?
INSERT INTO table2 (column_name(s))
SELECT column_name(s)
FROM table1;
Thanks
UPDATE:
I realized I'm inserting the data into a database that is not SQL Server.
Here's the scenario:
I have a table in a SQL Server database and I want to move the data to an IBM database.
There are several million rows in my SQL Server table.
I need to unload the data and store it as a flat file. The format can be .csv or .txt.
The "Tasks > Export Data" function does not work.
Millions of records being transferred across the network will ultimately time out because of latency and the maximum connection time that is established.
So is there any other way to do this?
Thanks
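As an aside on the flat-file export in the update: one commonly used route is the bcp command-line utility that ships with SQL Server, which writes a query result straight to a delimited file without going through the export wizard. A hedged sketch (the server name, database/table/column names, and the output path are placeholders):
bcp "SELECT Col1, Col2, Col3 FROM MyDatabase.dbo.MyBigTable" queryout "C:\export\MyBigTable.csv" -c -t"," -S MyServerName -T
Here -c writes character data, -t sets the field terminator, -S names the server, and -T uses Windows authentication (use -U/-P for a SQL login). Note that bcp does not quote fields, so if the data itself contains commas a different terminator (for example -t"|") is safer; the resulting file can then be loaded on the IBM side with its own import/load tooling.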
Yes, performing this in a query is the most efficient way to do it...
You just have to qualify the table name:
myDatabase.dbo.myTable
(assuming dbo is the right schema)
Or do you mean cross-server/instance transfers? Then you need to set up a linked server and, again, fully qualify the name:
myLinkedServer.myDatabase.dbo.myTable
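For reference, a minimal sketch of setting up such a linked server and then running the cross-server insert; the server, login, database, and table names are placeholders, and the provider name may differ depending on your SQL Server version:
-- Register the remote server as a linked server (illustrative names).
EXEC sp_addlinkedserver
    @server = N'myLinkedServer',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'RemoteHostName';

-- Map local logins to a remote login (adjust to your security setup).
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'myLinkedServer',
    @useself = N'FALSE',
    @locallogin = NULL,
    @rmtuser = N'remoteUser',
    @rmtpassword = N'remotePassword';

-- The insert from the question then uses the four-part name.
INSERT INTO table2 (column_name)
SELECT column_name
FROM myLinkedServer.myDatabase.dbo.table1;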
Related
I have a remote database that I want to copy to my local SQL Server.
IMPORTANT: I only want sample data (1k rows, for instance) from each table, and there are about 130 different tables.
I have tried to use the Export Data procedure in SSMS. Put simply, I go to TASKS > EXPORT DATA > CHOOSE SOURCE (the remote DB) > CHOOSE DESTINATION (my local DB) > CHOOSE THE TABLES TO COPY > COPY.
What I have tried:
I've tried writing the SQL queries directly in this tool, something like
SELECT TOP 1000 * FROM TABLE1
GO
...
SELECT TOP 1000 * FROM TABLE130
But at the mapping step, it puts every result into a single destination table instead of creating the 130 different output tables.
FYI, the above procedure takes about 2 minutes for one table. Doing it one by one for each table would take 130 × 2 min = 260 min, well over 4 hours... plus it is so boring.
Do you have any ideas for resolving this situation?
Thank you
regards
If you only want a subset, you are going to have problems with foreign keys, if there are any in your database.
Possible approaches to extract all the data or a subset:
Use SSIS to extract the data from the source DB and load it into your local DB
Write a custom application that does the job (you can use SqlBulkCopy)
If you purely want to do it in SSMS, you can create a linked server on your local server that points to the remote server.
This way, if the tables are not yet created on your local server, you can do something like this:
SELECT TOP (1000) *
INTO [dbo].[Table1]
FROM [yourLinkedServer].[yourDatabase].[dbo].[Table1]
changing the INTO table and FROM table for each table you want to copy.
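To avoid editing that statement 130 times by hand, the statements can be generated from the remote catalog first. A sketch, assuming the linked server is [yourLinkedServer] and the remote database is [yourDatabase] (both placeholders):
-- Generate one SELECT ... INTO statement per remote base table.
SELECT 'SELECT TOP (1000) * INTO [dbo].[' + TABLE_NAME + '] '
     + 'FROM [yourLinkedServer].[yourDatabase].[' + TABLE_SCHEMA + '].[' + TABLE_NAME + '];'
FROM [yourLinkedServer].[yourDatabase].INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;
Copy the result set into a new query window and run it. Note that SELECT ... INTO creates the local tables without foreign keys, which conveniently sidesteps the dependency problem mentioned above, at the cost of losing the constraints.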
How does CSV generation affect SQL performance on SQL Server?
I want to generate CSV files from an active production database. These tables have 10 to 30 million rows stored in them. They are not particularly wide tables, but they are used actively.
The approach would be: create copies of the tables with "SELECT * INTO [newTable] FROM ..." - I think that should copy the data quickly into another table - and then export the file from the copy, so the production table does not get locked unintentionally. I'm not sure if this step is actually necessary, or if it helps at all.
Thanks!
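A minimal sketch of that staged-copy idea, assuming a production table dbo.Orders and a scratch database named Scratch (both placeholders, not from the question); the CSV export then reads the static copy instead of the busy production table:
-- Snapshot the rows to export into a scratch table; the live table is read once.
SELECT *
INTO Scratch.dbo.Orders_export
FROM dbo.Orders;
SELECT ... INTO can be minimally logged under the SIMPLE or BULK_LOGGED recovery model, but the read still competes with production traffic for I/O, so the copy step is not free; it mainly keeps the slower file-writing phase away from the live table.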
I have an application that uses a remote database connection (SQL Server) as a source, but the source table is case sensitive, and I need it not to be. To solve this I am looking to keep a local copy of that table. Values are regularly added to and removed from the source table, so I need the local table to be kept up to date hourly. What is the simplest way to accomplish this?
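One way to do this, sketched under a few assumptions: a linked server named [RemoteSrv] pointing at the source, a remote table RemoteDb.dbo.SourceTable with a key column Id and a value column Val, and a local copy dbo.LocalCopy created with a case-insensitive collation. All of these names are illustrative. A SQL Server Agent job could then run the statement hourly:
-- Upsert new/changed rows and remove rows that disappeared from the source.
MERGE dbo.LocalCopy AS target
USING (SELECT Id, Val
       FROM [RemoteSrv].[RemoteDb].[dbo].[SourceTable]) AS source
    ON target.Id = source.Id
WHEN MATCHED AND target.Val <> source.Val THEN
    UPDATE SET target.Val = source.Val
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Val) VALUES (source.Id, source.Val)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;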
How can I sync two databases and do a manual refresh of the entities in either database whenever I want?
Let's say I have two databases, DB1 (prod) and DB2 (dev). I want to update/insert only a few tables from the prod DB into the dev DB. How can I achieve this? Is there a way to do it without a DB link, since I do not have privileges to create a database link?
If you only want to do a manual refresh, set up an import/export/Data Pump script to copy the data across, provided there is not too much data involved. If there is a large amount of data, you could write some PL/SQL as described above to move only the new/changed rows. This will be easier if your data has fields such as created_on/updated_on.
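A rough sketch of the "only move the new/changed rows" idea in generic SQL, assuming the prod rows have already been staged into the dev environment (for example via an export file loaded into prod_orders_stg) and that the table has a primary key id plus an updated_on timestamp; all names are illustrative:
-- Remove dev rows that have changed in prod since the last refresh...
DELETE FROM dev_orders
WHERE id IN (SELECT s.id
             FROM prod_orders_stg s
             JOIN dev_orders d ON d.id = s.id
             WHERE s.updated_on > d.updated_on);

-- ...then insert every staged row that the dev table is now missing.
INSERT INTO dev_orders (id, amount, updated_on)
SELECT s.id, s.amount, s.updated_on
FROM prod_orders_stg s
WHERE NOT EXISTS (SELECT 1 FROM dev_orders d WHERE d.id = s.id);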
I have to copy tables with their data from one database to another using a query. I know how to copy tables with data within a database, but I am not sure how to do the same when copying between two databases.
I have to copy a huge number of tables, so I need a fast, query-based method...
Can anybody please help? Thanks in advance...
You can use the same approach as copying tables within one database, SELECT INTO, but use fully qualified table names (database.schema.object_name) instead, like so:
USE TheOtherDB;
SELECT *
INTO NewTable
FROM TheFirstDB.Schemaname.OldTable
This will create a new table NewTable in the database TheOtherDB from the table OldTable, which belongs to the database TheFirstDB.
Right-click on the database, select Tasks and click Generate Scripts.
In the resulting pop-up, choose options as required (click Advanced): drop and create table, drop if exists, etc.
Scroll down to "Types of data to script" (SQL Server 2008 R2) and choose "Schema and data" or "Data only" as required.
Save to a file and execute it on the destination DB.
Advantages -
Can be executed against the destination DB, even if it is on another server / instance
Quickly script multiple tables, with data as needed
Warning - Might take quite a while to script if the tables contain a large amount of data.
Rajan
INSERT INTO DB2.dbo.MyOtherTable (Col0, Col1)
SELECT Col0, Col1 FROM DB1.dbo.MyTable
Both tables' columns must have compatible data types.
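If MyTable holds millions of rows, copying it in batches keeps each transaction, and the transaction log growth, small. A sketch, assuming the table also has an ever-increasing positive integer key column Id (an assumption, not part of the answer above):
-- Copy 50,000 rows at a time, walking the Id key upward; other names as above.
DECLARE @batchSize INT = 50000,
        @lastId    INT = 0,
        @copied    INT = 1;

WHILE @copied > 0
BEGIN
    INSERT INTO DB2.dbo.MyOtherTable (Id, Col0, Col1)
    SELECT TOP (@batchSize) Id, Col0, Col1
    FROM DB1.dbo.MyTable
    WHERE Id > @lastId
    ORDER BY Id;

    SET @copied = @@ROWCOUNT;

    IF @copied > 0
        SELECT @lastId = MAX(Id) FROM DB2.dbo.MyOtherTable;
END;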
The SQL query below will copy a SQL Server table's structure and data from one database to another. You can always change the table name (SampleTable) in your destination database. Note that SELECT ... INTO creates the destination table with the source's column definitions and data, but does not copy indexes, constraints, or triggers.
SELECT * INTO DestinationDB.dbo.SampleTable FROM SourceDB.dbo.SampleTable