Our goal is to restore a test environment from our live environment, so basically we
would like to simply back up our current live databases and restore them
on our test server.
However... we do not have enough room to move the backups. One of our databases
is 50 GB, and we only have around 20 GB free (the backup is 40 GB uncompressed).
We were thinking of dropping that database to make room for the backup, but I'm assuming that when it restores, it will run out of space.
I am also thinking that we could just detach/attach the database files, but I am assuming that
this would mean we have to take our live database down (which we don't want to do).
Another option is to restore from a network drive, so just point the restore at \\servername\X$\RestoreFolder
But are there any things that I should be aware of if we do this?
I would like to thank everyone for your suggestions in advance.
When you say the database is 50 GB, is that just the MDF file? Or does the LDF file make up a large portion of the 50 GB? Do you need all the log information on a test machine?
To be honest, I'd be tempted to buy a new hard drive and put it in the test server. Even the cheapest one you can find should have more than enough space for several test copies of the database.
Upgrade your hardware. Even a development/test machine should have enough space to backup/restore the catalogs you're working with.
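On the network-restore option mentioned in the question: it is just a UNC path in the FROM DISK clause. The main caveat is that the SQL Server service account on the test box needs read permission on the share (a local SYSTEM account won't have it). A minimal sketch, with all names and paths as placeholders:

```sql
-- Placeholder server, share, database, and file names throughout.
-- The SQL Server service account on the test server must be able to
-- read \\servername\X$\RestoreFolder for this to work.
RESTORE DATABASE LiveDB
    FROM DISK = N'\\servername\X$\RestoreFolder\LiveDB.bak'
    WITH REPLACE,
         MOVE 'LiveDB_data' TO N'D:\Data\LiveDB.mdf',
         MOVE 'LiveDB_log'  TO N'D:\Data\LiveDB.ldf';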
I have two databases for my customers, a LIVE database and a TEST database. Periodically my customers would like the LIVE data copied over to the TEST database so it has current data that they can mess around with.
I am looking for a simple way that I could run maybe a command in my application that would copy all the data from one and move it into the other.
Currently I have to remote in with their IT department or consultant and restore from a backup of the LIVE to the TEST. I would prefer to have a button in my application that says RESTORE TEST and it will do that for me.
Any suggestions on how I could do this? Is it possible in SQL? Is there a tool out there in .NET that could help?
Thanks
If you have a backup plan, which I hope you do, you could simply restore the latest full .bak, if it is accessible to your application. However, this would require some tunneling for your application to access the latest backup file and this is generally a no-no for zones containing database servers.
Perhaps you could set up a scheduled delivery of a backup from machine to machine. Does the LIVE server have access to your TEST server? I wouldn't think that a DBA would be inclined to set up delivery of a backup unless it was to a remote backup location for disaster recovery, and that is usually to dedicated locations, not a testing server. Maybe you could work out a scenario where your TEST server doubles as an extra remote backup location for the production environment.
It would be better to be shipped a backup and then periodically or manually start a job that restores from that local backup. This would take the burden off your application. Then you would only need to kick off the SQL job from within your app as needed.
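Kicking off the job from the application is a single call. A sketch, assuming a SQL Agent job with the (hypothetical) name 'Restore TEST from LIVE' already exists on the TEST server and does the restore:

```sql
-- Sketch: the job name is a placeholder; the job itself would restore
-- the most recent local backup over the TEST database.
EXEC msdb.dbo.sp_start_job @job_name = N'Restore TEST from LIVE';

-- sp_start_job returns immediately. If the app needs to know when the
-- restore finishes, poll msdb.dbo.sysjobactivity or EXEC sp_help_job.
```

Your application's database login would need permission to execute sp_start_job in msdb (e.g. membership in the SQLAgentOperatorRole role).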
I would like to restore a 4TB SQL Server data dump currently on an external hard drive.
Now here's the problem, the hard drive on most laptops is smaller than this size and given that I don't have a server allocated yet, is there a way I can access this data?
I have SQL Server on my laptop. I would also like more than one person to be able to access this data.
Is there any way that 3 people can access this data from the hard drive at the same time? We do not have a shared network connection; all users are in different homes. We are all university students fairly new to this stuff, so details would be highly appreciated.
If you have around $650, you can buy yourself an 8TB NAS box. More than enough space. Just place it on your network.
http://www.amazon.com/Seagate-Business-Storage-Attached-STBP8000100/dp/B00B5Q79FW
You could always use some type of cloud (Azure) to restore it, but that might be more hassle since you would have to build out a SQL Server on Windows environment, copy the data over, and then restore it.
I don't know what the cost would be versus buying a NAS for the project.
I guess the main question is: is this a one-time or a continuing task?
If you don't have the space to restore your database you could try using a tool such as Idera SQL Virtual Database:
http://www.idera.com/productssolutions/sqlserver/sqlvirtualdatabase
This will (apparently) mount a backup and make it look just like a genuine MS SQL database. There's a free trial, though I don't know what limitations it has.
I have 2 databases with MyISAM tables which are updated once a week. They are quite big in size (one DB is 2GB and the other is 6GB). I currently back them up once a week with mysqldump and keep the last 2 weeks' worth of .sql dumps on the same server where the DBs are running.
I would like, however, to be able to dump the backups to another server, as they are taking up server space unnecessarily. What is the best way to achieve this? If possible, I would like to keep the databases running during the backup. (no inserts or updates take place during the backup process, just selects).
Thanks in advance,
Tim
Were I you, I would create a script that does the backup and then sends it elsewhere. I know that is kind of what you are asking how to do, but you left out some things that would be good to know, such as what OS your two systems are running.
If they are both Windows, you could mount a network drive and have the backup dump there (or copy the dump there). If they are Linux servers, I would recommend copying it across with the scp command. If it is a mix, then it gets fun and tricky.
If you are working with Linux servers, the following guide should walk you through the backup process. Click me!
If you are still scratching your head after reading that, let me know what kind of OSes you are rolling with and I can provide more detailed instructions.
Good luck!
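If both boxes are Linux, the whole thing can be a short cron-able script. A sketch only; host names, database names, paths, and the backup user are all placeholders, and since only SELECTs run during your backup window, mysqldump's default table locking is safe here:

```sh
#!/bin/sh
# Sketch: db1/db2, /tmp paths, and backuphost are placeholders.
# Dump each database compressed; --lock-tables (the default for MyISAM)
# keeps the dump consistent while the server stays up.
mysqldump --user=backup --password db1 | gzip > /tmp/db1.sql.gz
mysqldump --user=backup --password db2 | gzip > /tmp/db2.sql.gz

# Copy the compressed dumps to the other server, then free local space.
scp /tmp/db1.sql.gz /tmp/db2.sql.gz backupuser@backuphost:/backups/mysql/
rm /tmp/db1.sql.gz /tmp/db2.sql.gz
```

Compressing before the copy matters here: text dumps of 2 GB and 6 GB databases typically shrink a great deal under gzip, which cuts both the transfer time and the temporary local space needed.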
I have two MS SQL 2005 servers, one for production and one for test and both have a Recovery Model of Full. I restore a backup of the production database to the test server and then have users make changes.
I want to be able to:
Roll back all the changes made to the test SQL server
Apply all the transactions that have occurred on the production SQL server since the test server was originally restored so that the two servers have the same data
I do not want to do a full database restore from a backup file, as this takes far too long with our 200+ GB database, especially when all the changed data is less than 1 GB.
EDIT
Based on the suggestions below I have tried restoring a database with NoRecovery but you cannot create a snapshot of a database that is in that state.
I have also tried restoring it to Standby Read only mode which works and I can take a snapshot of the database then and still apply transaction logs to the original db but I cannot make the database writable again as long as there are snapshots against it.
Running:
restore database TestDB with recovery
Results in the following error:
Msg 5094, Level 16, State 2, Line 1 The operation cannot be performed on a database with database snapshots or active DBCC replicas
First off, once you've restored the backup and set the database to "recovered", that's it -- you will never be able to apply another transaction log backup to it.
However, there are database snapshots. I've never used them, but I believe you could use them for this purpose. I think you need to restore the database, leave it in "not restored" mode -- definitely not standby -- and then generate snapshots based on that. (Or was that mirroring? I read about this stuff years ago, but never had reason to use it.)
Then when you want to update the database, you drop the snapshot, restore the "next" set of transaction log backups, and create a fresh snapshot.
However, I don't think this would work very well. Above and beyond the management and maintenance overhead of doing this, if the testers/developers do a lot of modifications, your database snapshot could get very big, even bigger than the original database -- and that's hard drive space used in addition to the "original" database. For infrequently modified databases this could work, but for large OLTP systems, I have serious doubts.
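For the rollback half on its own, the snapshot mechanics look roughly like this (database, logical file, and path names are placeholders). Note the catch: reverting from a snapshot recovers the database, so production log backups can no longer be applied afterwards, which is exactly the asker's dilemma:

```sql
-- Placeholder names throughout. Create a snapshot of the writable
-- test database before handing it to the users:
CREATE DATABASE TestDB_snap ON
    (NAME = TestDB_data, FILENAME = N'C:\Data\TestDB_snap.ss')
AS SNAPSHOT OF TestDB;

-- ... users make their changes to TestDB ...

-- Revert TestDB to its state at snapshot creation, rolling back
-- everything the users did:
RESTORE DATABASE TestDB FROM DATABASE_SNAPSHOT = N'TestDB_snap';
```

The snapshot file is sparse: it only stores pages as they are changed in the source database, which is why heavy modification can make it grow toward the size of the original.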
So what you really want is a copy of Production to be made in Test. First, you must have a current backup of production somewhere? Usually on a database this size, full backups are made Sunday nights and then differential backups are made each night during the week.
Take the Sunday backup copy and restore it as a different database name on your server, say TestRestore. You should be able to kick this off at 5:00 pm and it should take about 10 hours. If it takes a lot longer see Optimizing Backup and Restore Performance in SQL Server.
When you get in in the morning restore the last differential backup from the previous night, this shouldn't take long at all.
Then kick the users off the Test database and rename Test to TestOld (someone will always need something from the old copy), then rename your TestRestore database to be the Test database. See How to rename a SQL Server Database.
The long-range solution is to do log shipping from Production to TestRestore. Then at a moment's notice you can rename things and have a fresh Test database.
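The swap at the end can be sketched in T-SQL (using the database names from the steps above; ALTER DATABASE ... MODIFY NAME works on SQL Server 2005):

```sql
-- Kick the users off the current Test database.
ALTER DATABASE Test SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Keep the old copy around under a new name.
ALTER DATABASE Test MODIFY NAME = TestOld;
ALTER DATABASE TestOld SET MULTI_USER;

-- Promote the freshly restored copy to be the Test database.
ALTER DATABASE TestRestore MODIFY NAME = Test;
```

Renaming is near-instant regardless of database size, which is the whole appeal of this approach compared with restoring 200+ GB in place.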
For the rollback, the easiest way is probably using a virtual machine and not saving changes when you close it.
For copying changes across from the production to the test, could you restore the differential backups or transaction log backups from production to the test db?
After having tried all of the suggestions offered here I have not found any means of accomplishing what I outlined in the question through SQL. If someone can find a way and post it or has another suggestion I would be happy to try something else but at this point there appears to be no way to accomplish this.
Storage vendors (such as NetApp) provide the ability to have writable snapshots.
It gives you the ability to create a snapshot within seconds on the production, do your tests, and drop/recreate the snapshot.
It's a long-term solution, but... it works.
On Server1, a job exists that compresses the latest full backup
On Server2, there's a job that performs the following steps:
Copies the compressed file to a local drive
Decompresses the file to make the full backup available
Kills all sessions to the database that is about to be restored
Restores the database
Sets the recovery model to Simple
Grants db_owner privileges to the developers
Ref: http://weblogs.sqlteam.com/tarad/archive/2009/02/25/How-to-refresh-a-SQL-Server-database-automatically.aspx
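The restore portion of the Server2 job can be sketched in T-SQL like this (database name, backup path, and the developers' group are placeholders; the sketch assumes the group already has a user in the database):

```sql
-- Placeholder names and paths throughout.

-- Kill all sessions to the database that is about to be restored.
ALTER DATABASE DevDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Restore over it from the decompressed full backup.
RESTORE DATABASE DevDB
    FROM DISK = N'D:\Restore\DevDB.bak'
    WITH REPLACE;

-- Switch to Simple recovery so the dev copy's log doesn't grow unbounded.
ALTER DATABASE DevDB SET RECOVERY SIMPLE;
ALTER DATABASE DevDB SET MULTI_USER;

-- Grant db_owner to the developers (group name is hypothetical).
USE DevDB;
EXEC sp_addrolemember N'db_owner', N'DOMAIN\Developers';
```

Scheduling this as the steps of a SQL Agent job, as the answer describes, means the refresh runs unattended.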
I would like to save my backups from my SQL 2008 server to another server location.
We have 2 servers:
Deployment server
File Server
The problem is that the deployment server doesn't have much space, and we keep 10 days of backups of our databases. Therefore we need to store our backups on an external "file server". The problem is that SQL Server doesn't permit this.
I've tried running the SQL 2008 service with an account that has admin rights on both machines (a domain account), but this still doesn't work.
Any thoughts on this one?
Otherwise we'll have to put an external hard disk on a rack server, and that's kinda silly, no?
EDIT:
I've found a way to make it work.
You have to share the folder on the file server, then grant the deployment server (the computer account itself) write permissions. This makes external backups possible with SQL Server.
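Once the share permissions are in place, the backup itself is just a UNC path in the TO DISK clause (server, share, and file names below are placeholders):

```sql
-- Placeholder UNC path. Whatever account SQL Server runs as (or the
-- machine account, as described above) needs write access to the share.
BACKUP DATABASE MyDB
    TO DISK = N'\\fileserver\SqlBackups\MyDB_full.bak'
    WITH INIT, CHECKSUM;
```

In practice the file name would carry a date stamp so that the 10 days of backups mentioned in the question can coexist on the share.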
Don't know if it's safe though; I find it kinda strange to give a computer account rights on a folder.
You can use 3rd party tools like SqlBackupAndFTP
There are several ways of doing this already described, but this one is based on my open source project, SQL Server Compressed Backup. It is a command line tool for backing up SQL Server databases, and it can write to anywhere the NT User running it can write to. Just schedule it in Scheduled Tasks running with a user with appropriate permissions.
An example of backing up to a share on another server would be:
msbp.exe backup "db(database=model)" "gzip" "local(path=\\server\share\path\model.full.bak.gz)"
All the BACKUP command options that make sense for backing up to files are available: log, differential, copy_only, checksum, and more (tape drive options are not available for instance).
You could use a scheduler to move the backups with a batch file a certain amount of time after the backup starts.
If I remember correctly there's a hack to enable SQL Server to back up to remote storage, but I don't think a hack is the way to go.
Surely the best possibility may be to use an external backup tool which supports the use of agents. They control when the backup starts and take care of moving the files around.
Sascha
You could create a nice and tidy little SQL Server Integration Services (SSIS) package to both carry out the backup and then move the data to your alternative file store.
Interestingly enough, the maintenance plans within SQL Server actually use SSIS components. These same components are available to use within the Business Intelligence Design Studio (BIDS).
I hope this is clear but let me know if you require further assistance.
Cheers, John
My experience is with older versions of MSSQL, so there may be things in SQL 2008 which help you better.
We are very tight on disk space on some of our old servers. These are machines at our ISP and their restore-from-tape lead time is not good - 24 hours is not uncommon :( so we need to keep a decent online backup history.
We take full backup on Sunday, differential backup each other night, and TLog backups every 10 minutes.
We force TLog backups every minute during table/index defrag and statistics updates - this is because these are the most TLog-intensive tasks that we run, and they were previously responsible for determining the size of the standing TLog file; since this change we've been able to run the TLog standing size about 60% smaller than before.
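The schedule above boils down to three BACKUP variants (database name and paths are placeholders; in practice each file name would be timestamped rather than overwritten):

```sql
-- Placeholder names/paths. Sunday night full backup:
BACKUP DATABASE MyDB TO DISK = N'E:\Backups\MyDB_full.bak' WITH INIT;

-- Nightly differential (everything changed since the last full):
BACKUP DATABASE MyDB TO DISK = N'E:\Backups\MyDB_diff.bak'
    WITH DIFFERENTIAL, INIT;

-- Every 10 minutes (every minute during defrag windows):
BACKUP LOG MyDB TO DISK = N'E:\Backups\MyDB_log.trn' WITH INIT;
```

Each BACKUP LOG also truncates the inactive portion of the log, which is why running them more often during log-heavy maintenance keeps the standing TLog file small.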
Worth watching the size of Diff backups though - if it turns out that by the Wednesday backup your DIFF backup is getting close to the size of the Full backup you might as well run a Full backup twice a week and have smaller Diff backups for the next day or two.
When you delete your Full or Diff backup files (to recover the disk space) make sure you delete the TLog backups that are earlier. But consider your recovery path - would you like to be able to restore last Sunday's Full backup and all TLogs since, or are you happy that for moment-in-time restore you can only go back as far as last night's DIFF backup? (i.e. to go back further you can only restore to a FULL / DIFF backup, and not to a point in time) If the latter, you can delete all earlier TLog backups as soon as you have made the DIFF backup.
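A moment-in-time restore along that path chains the files like this (all file names and the STOPAT time are placeholders):

```sql
-- Placeholder names throughout. Restore the chain without recovering
-- until the last step:
RESTORE DATABASE MyDB FROM DISK = N'E:\Backups\MyDB_full.bak'
    WITH NORECOVERY, REPLACE;
RESTORE DATABASE MyDB FROM DISK = N'E:\Backups\MyDB_diff.bak'
    WITH NORECOVERY;

-- Apply each TLog backup in sequence; STOPAT on the final one gives
-- the moment-in-time, and RECOVERY brings the database online.
RESTORE LOG MyDB FROM DISK = N'E:\Backups\MyDB_log.trn'
    WITH STOPAT = N'2010-03-01T10:15:00', RECOVERY;
```

This is exactly why deleting TLog backups older than your newest DIFF limits you: the chain must be unbroken from the restored FULL (or FULL+DIFF) up to the STOPAT time.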
(Obviously, regardless of that, you need to get the backups onto tape etc. and to your copy-server; you just don't have to be dependent on tape recovery to make your restore, but you are losing granularity of restore with time)
We may recover the last Full (or the Last Full plus Monday's DIFF) to a "temporary database" to check out something, and then drop that DB, but if we really REALLY had to restore to "last Monday 15-minutes-past-10 AM" we would live with having to get backups back from tape or off the copy-server.
All our backup files are in an NTFS directory with Compress set. (You can probably make compressed backups directly from SQL 2008?; the servers we have which are disk-starved are running SQL 2000.) I have read that this is frowned upon, but never seen good reasoning why, and we've never had a problem with it - touch wood!
Some third-party backup software allows setting specific user permissions for network locations, so it doesn't matter what permissions the SQL Server service account has. I would recommend trying EMS SQL Backup, which also supports backup compression on the fly to save storage space.