I need to copy all my data from a database on Server "x" to server "z". Both have the same database name and table names. I need to copy my prod data to my test server without losing my test server's data.
Try using SSIS to copy your databases. You can also use replication (push, pull, or even merge), or you can manually back up a DB as a *.bak file and then create and restore it onto another server.
https://www.youtube.com/watch?v=pA242aMvz6E
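For the manual backup/restore route mentioned above, a minimal sketch might look like this (the database name, paths, and logical file names are placeholders; restoring under a different name is one way to avoid overwriting the existing test data):
-- On server "x" (production): take a full backup.
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb.bak' WITH INIT;

-- Copy the .bak file to server "z", then on server "z":
-- restore it under a new name so the existing test database is untouched.
RESTORE DATABASE MyDb_prod_copy
FROM DISK = 'D:\Backups\MyDb.bak'
WITH MOVE 'MyDb'     TO 'D:\Data\MyDb_prod_copy.mdf',
     MOVE 'MyDb_log' TO 'D:\Data\MyDb_prod_copy_log.ldf';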
I am transferring 90 million rows from a source server to my staging area on the destination server.
From the staging area I transfer 20 million of them further up the ETL process by using a WHERE EXISTS filter against a table located on the destination server.
Since that filter table exists only on the destination server and not on the source server, is it possible to apply the filter when I pull the rows directly from the source server (so that I only transfer 20 million rows from the source server to my destination server)?
Besides creating a linked server on the Source Server, there are two pure SSIS approaches:
Create a temporary staging table on the destination server, copy all records from the source into this staging table, and then apply the WHERE EXISTS filter there (a sketch of this filter step follows the two options below).
In the Data Flow, create a Lookup transformation whose lookup set gets the IDs from the table on the destination server, then proceed with the matched records only. For performance reasons, you may use the Lookup in either full cache or partial cache mode; only performance testing can tell which mode is better.
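A rough sketch of the filter step in the first (staging table) approach, assuming hypothetical names: dbo.stg_SourceRows is the staging table loaded by the data flow, dbo.FilterTable is the destination table holding the wanted IDs, and dbo.FinalTable is the real target.
-- Keep only the rows whose IDs already exist in the destination filter table.
INSERT INTO dbo.FinalTable (ID, Payload)
SELECT s.ID, s.Payload
FROM dbo.stg_SourceRows AS s
WHERE EXISTS (SELECT 1 FROM dbo.FilterTable AS f WHERE f.ID = s.ID);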
Hadi's recommendation of a linked server is fine and will work. The pure SSIS approach has the advantage that it does not bring any changes to the Source Server; all connection configuration is inside SSIS, which can be beneficial in some cases. Its disadvantage is that performance can be worse than with a linked server.
If you need to transfer as few rows from the source as possible, the simplest way is the linked server approach. Otherwise, you can create a table on the Source Server (it can even be a global temp ## table created by an SSIS package task) and copy the filter IDs into it from the Destination server. The temp table should be global (##), since it is filled in one task and used in subsequent tasks. Then filter the records with an EXISTS clause on the Source server.
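A hedged sketch of that source-side variant; all object names are placeholders, and the steps map onto separate SSIS tasks that share the same source connection.
-- Task 1 (Execute SQL Task on the Source server): create the global temp table.
CREATE TABLE ##filter_ids (ID INT PRIMARY KEY);

-- Task 2 (Data Flow): copy the filter IDs from the Destination table into ##filter_ids.

-- Task 3 (Data Flow source query on the Source server): pull only matching rows,
-- so only the ~20 million filtered rows cross the network.
SELECT s.ID, s.Payload
FROM dbo.BigSourceTable AS s
WHERE EXISTS (SELECT 1 FROM ##filter_ids AS f WHERE f.ID = s.ID);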
You can do that by creating a linked server on the source machine. Linked servers allow you to join tables across different instances:
How to create and configure a linked server in SQL Server Management Studio
Create Linked Servers (SQL Server Database Engine)
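As a hedged sketch (all server, login, and table names are placeholders, and the OLE DB provider depends on what is installed), the linked server can be created on the source machine and then used in the extract query so that only matching rows ever leave the source:
-- Run on the Source server: register the Destination server as a linked server.
EXEC sp_addlinkedserver
    @server     = N'DestServer',
    @srvproduct = N'',
    @provider   = N'MSOLEDBSQL',
    @datasrc    = N'dest-host';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'DestServer',
    @useself     = N'False',
    @locallogin  = NULL,
    @rmtuser     = N'etl_user',
    @rmtpassword = N'<password>';

-- Extract query for the SSIS data-flow source: filter against the
-- destination table through the linked server.
SELECT s.ID, s.Payload
FROM dbo.BigSourceTable AS s
WHERE EXISTS (
    SELECT 1
    FROM [DestServer].[DestDb].[dbo].[FilterTable] AS f
    WHERE f.ID = s.ID
);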
I have an application that uses a remote database connection as a source (SQL Server), but the database table is case sensitive, and I need it not to be. To solve this I am looking to keep a local copy of that table. Values are regularly added and removed from the source table, so I need the local table to be kept up to date, hourly. What is the simplest way to accomplish this?
I know I can clone a DB into a new one with
CREATE DATABASE Database1_copy AS COPY OF Database1;
(https://learn.microsoft.com/en-us/azure/sql-database/sql-database-copy-transact-sql)
and this works flawlessly, except in Azure, where DB properties are managed by the Azure portal, so I am trying to find a way to copy most of the schema/resources/data into an EXISTING DB.
Something like this would be great:
CLONE DATABASE Database_test AS COPY OF Database_production
[Even my first approach was to "clone" the entire DB; in fact a few tables on the destination DB should be kept, so a better approach would be to CLONE EVERYTHING EXCEPT ('table1','table2'). I currently plan to achieve this by scripting the few tables needed on the destination DB and overwriting them after the import, but the best solution would be the other one.]
You can do this in several ways:
Through the Azure Portal
Open your database in the Azure Portal (https://portal.azure.com)
In the overview blade of your database, select the "Copy" option
Fill in the parameters, such as the server that should host the copy
Using a SQL Server client and connecting to the server
Open your SQL Server blade in Azure
Select the "Firewall" option
Click on "Add client IP"
Connect to your database with your connection string and your favorite client, such as SSMS
Execute your SQL query to clone the database on the same server
-- Copy a SQL database to the same server
-- Execute on the master database.
-- Start copying.
CREATE DATABASE Database1_copy AS COPY OF Database1;
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-copy-transact-sql
The above SQL statement works perfectly fine as expected in Azure SQL Database.
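For reference, the same documentation page also covers copying to a different logical server; a hedged sketch, with the server and database names as placeholders, run on the destination server's master database:
-- Copy a database from another Azure SQL logical server.
CREATE DATABASE Database1_copy AS COPY OF server1.Database1;

-- Monitor the progress of in-flight copy operations (run in master).
SELECT * FROM sys.dm_database_copies;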
Important Notes:
Log on to the master database (System Databases) using the server-level principal login or the login that created the database you want to copy.
Logins that are not the server-level principal must be members of the dbmanager role in order to copy databases (a short sketch follows these notes).
Use an updated version of SQL Server Management Studio.
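As a hedged illustration of the dbmanager note, where the login name copy_operator is a placeholder:
-- Run in the master database of the Azure SQL logical server.
CREATE USER copy_operator FOR LOGIN copy_operator;
ALTER ROLE dbmanager ADD MEMBER copy_operator;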
I have two backup files
1) is named 'backup.sql' with a bunch of SQL defining TABLES
2) is named 'backup' with a bunch of encoded data, which I believe are the ROWS
I need to restore these TABLES + ROWS, but all I am able to figure out is how to restore the tables.
Any tips on dealing with these files? It's the first time I have ever dealt with SQL Server.
The backup process would not create a file with actual SQL statements; it would create a binary file. So #1 is not a backup file (it's probably a script someone saved to re-create the schema).
I would try to use SQL Server Management Studio to restore the second file and see what happens. I don't think it will allow you to restore an invalid file, but I would take some basic precautions like backing up the system first.
What is the extension of the 'backup' file? Is the filename backup.bak? If you have a backup file created by SQL Server, then it 'should' contain the logic to create both the tables and restore the data, but it could depend on how the backup was created.
---Edit
It is possible for a .SQL file to contain data values as well as the logic to create the tables/columns for a database. I used to run backups of a MySQL database this way a long time ago... it just is not seen very often with SQL Server, since it has built-in backup/restore functionality.
It seems unlikely they would export all the rows from all tables into a CSV file, and given you said the data looks encoded, it makes me think that is your actual backup file.
Try this: save a copy of the "backup" file, rename it to backup.bak, and run this from SQL Server Management Studio:
restore filelistonly from disk='C:\backup.bak'
(assuming your file is saved on the root of the C: drive)
Any results/errors?
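If that returns the logical file names without errors, the follow-up step would be a full restore along these lines (the database name and paths are placeholders; the logical names come from the FILELISTONLY output):
-- Restore the renamed backup into a new database; adjust the logical names
-- and target paths to match the RESTORE FILELISTONLY output.
RESTORE DATABASE RecoveredDb
FROM DISK = 'C:\backup.bak'
WITH MOVE 'LogicalDataFileName' TO 'C:\Data\RecoveredDb.mdf',
     MOVE 'LogicalLogFileName'  TO 'C:\Data\RecoveredDb_log.ldf';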
I have backed up a database into a file using SQL Server from my old server.
Now I would like to restore that file into a new database on my new server.
I created a DB with the same name, but I am getting an error saying:
"The Backup set holds a backup of the database other than the existing '*****' database"
Any thoughts?
Thanks
Add a WITH REPLACE option to your restore:
Specifies that SQL Server should create the specified database and its related files even if another database already exists with the same name.
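A minimal sketch of what that looks like, assuming placeholder database and path names:
-- Overwrite the existing database with the contents of the backup file.
RESTORE DATABASE MyDb
FROM DISK = 'C:\Backups\MyDb.bak'
WITH REPLACE;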
Drop the new database - it's sitting in the way of the one you want to restore.
Then when you try to restore your old database, select the file to restore from, and the name will magically appear in the "to database" destination field in SSMS.
When you restore a database from backup, you are creating a new database on the SQL instance. If a database by that name is already present on that SQL instance, you will get an error--unless you select the option to overwrite any existing database, in which case the old database will be wiped out and replaced.
I was having the same issue, but even with WITH REPLACE the error occurred. I had an empty database with the same name as the backup, but the problem was that the .trn file I was restoring from had two backup sets, and I was choosing to restore both the full database AND the transaction log. I chose only the full database and it worked.