I'm looking for best practices / an efficient way to do this. Once a month I need to copy my production database to development, so I'm thinking of automating this process if possible.
The database is around 20 GB including the log file (full recovery model).
Please let me know if I need to provide more details.
Hopefully you're making regular backups of your database anyway.
So you basically just need to take the newest backup and restore it on a different server (and maybe with a different database name).
At my workplace, we are using MS SQL Server and we are doing this as well:
Our main database is backed up every evening at 9 pm (full backup).
Every day at 11 pm, a SQL Server Agent job on another server takes the newest backup from the backup folder and restores it as OurMainDatabase_Yesterday.
Here is an example script for MS SQL Server:
ALTER DATABASE OurMainDatabase_Yesterday SET SINGLE_USER WITH ROLLBACK IMMEDIATE
USE master
EXEC sp_detach_db 'OurMainDatabase_Yesterday', 'true'
-- use today's backup from the main server
declare @BackupPath as nvarchar(500)
set @BackupPath = '\\MainServer\backup\OurMainDatabase\OurMainDatabase_backup_'
    + convert(varchar(10), getdate(), 112) + '2100.BAK'
RESTORE DATABASE OurMainDatabase_Yesterday
FROM DISK = @BackupPath
WITH MOVE 'OurMainDatabase_Data'
    TO 'F:\Data\OurMainDatabase_Yesterday_Data.mdf',
MOVE 'OurMainDatabase_Log'
    TO 'G:\Logs\OurMainDatabase_Yesterday_Log.ldf',
REPLACE
ALTER DATABASE OurMainDatabase_Yesterday SET MULTI_USER
I am doing the below using SQL Server / T-SQL:
RESTORE DATABASE UAT
FROM DISK = 'E:\Databases\backup\MY_LIVE_20120720_070001.bak'
WITH REPLACE
But I want to be able to use a file location that ignores the numbers in the file name (which represent the date). There will only ever be one 'MY_LIVE_****.bak', but its number string will change each day.
The goal is to restore my UAT instance from live each week, using the latest backup; there will only ever be one file matching that string prefix, but the numbers/date will change each week.
You can use xp_cmdshell to do a dir for your file. Note, however, that xp_cmdshell is normally disabled for good reasons. Given this is UAT, that may not be an issue.
See here for more http://www.sqlusa.com/bestpractices2005/dir/
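If you do go the xp_cmdshell route, a sketch of the dir approach might look like the following. The folder path is hypothetical, and the approach relies on the date stamp in the file name sorting lexically, so ORDER BY ... DESC picks the newest file:

```sql
-- Sketch only: requires xp_cmdshell to be enabled, and assumes a
-- hypothetical backup folder. Names like MY_LIVE_YYYYMMDD_HHMMSS.bak
-- sort lexically by date, so the DESC sort finds the newest one.
DECLARE @files TABLE (fname nvarchar(260));
INSERT INTO @files (fname)
EXEC xp_cmdshell 'dir /b E:\Databases\backup\MY_LIVE_*.bak';

DECLARE @fileName nvarchar(300);
SELECT TOP 1 @fileName = 'E:\Databases\backup\' + fname
FROM @files
WHERE fname LIKE 'MY_LIVE[_]%.bak'   -- filters out NULL/"File Not Found" rows
ORDER BY fname DESC;

RESTORE DATABASE UAT FROM DISK = @fileName WITH REPLACE;
```

The msdb query in the script below avoids xp_cmdshell entirely, which is generally the safer option.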
ALTER DATABASE UAT
SET SINGLE_USER WITH ROLLBACK IMMEDIATE

declare @fileName varchar(260);
SELECT @fileName = physical_device_name
FROM msdb.dbo.backupmediafamily
WHERE media_set_id = (
    SELECT TOP 1 media_set_id
    FROM msdb.dbo.backupset
    WHERE database_name = 'MY_LIVE'
    ORDER BY backup_start_date DESC)

----Restore Database
RESTORE DATABASE UAT
FROM DISK = @fileName
WITH REPLACE

/* If the previous statement completes without error, the database will be
back in multi-user mode. If an error occurs, execute the following command
to put the database back into multi-user mode. */
ALTER DATABASE UAT SET MULTI_USER
GO
Is it possible to modify the creation date of a table? That is, the date shown on right-clicking a table > Properties > Created date, or in sys.tables.create_date.
Even though the tables were created months ago, I want it to look like they were created today.
No more than you can change your birthday, and why would you want to?
You could just
select * into #tmp from [tablename]
drop table [tablename]
select * into [tablename] from #tmp
That would rebuild the table and preserve the structure (to a point). Alternatively, you could script out a new table, copy the data, then drop the original and rename, as above.
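To illustrate the rebuild (the table name is hypothetical): create_date is reset, but note that SELECT INTO does not carry over indexes, constraints, triggers, defaults, or permissions, so "to a point" really does apply:

```sql
-- Hypothetical table name; check the date before and after the rebuild.
SELECT create_date FROM sys.tables WHERE name = 'MyTable';

SELECT * INTO #tmp FROM MyTable;
DROP TABLE MyTable;
SELECT * INTO MyTable FROM #tmp;
DROP TABLE #tmp;

-- create_date now reflects the time of the rebuild, not the original creation.
SELECT create_date FROM sys.tables WHERE name = 'MyTable';
```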
In SQL Server 2000, you could do this by hacking into the system tables with the sp_configure option 'allow updates' set to 1.
This is not possible in SQL Server 2005 and up. From Books Online:
This option is still present in the sp_configure stored procedure,
although its functionality is unavailable in Microsoft SQL Server 2005
(the setting has no effect). In SQL Server 2005, direct updates to the
system tables are not supported.
In 2005 I believe you could "game the system" by using a dedicated administrator connection, but I think that was fixed shortly after RTM (or it needs a trace flag or some other undocumented setting to work). Even using DAC and with the sp_configure option set to 1, trying this on both SQL Server 2005 SP4 and SQL Server 2008 R2 SP1 yields:
Msg 259, Level 16, State 1
Ad hoc updates to system catalogs are not allowed.
Who are you trying to fool, and why?
EDIT
Now that we have more information on the why, you could create a virtual machine that is not attached to any domain, set the clock back to whatever date you want, create a database, create your objects, back up the database, copy it to the host, and restore it. The create_date for those objects should still reflect the earlier date, though the database itself might not (I haven't tested this).
You can do this by shifting the clock back on your own system, but I don't know if I'd want to mess with my clock this way after SQL Server has been installed and has been creating objects in what will become "the future" for a short period of time. VM definitely seems safer to me.
We have a SQL Server 2000 instance where the MSDB has grown to a huge size due to the backup history never having been deleted in several years. I would like to purge the backup history completely (I don't see why it's needed) and free up the disk space used by all this data.
I realise you can use the sp_delete_backuphistory procedure, but it is far too slow (nothing happens in 2+ hours), and while it's executing, the transaction log file grows to fill the entire disk (several GB). SQL Server 2000 does not appear to support doing this database by database.
I need a way of deleting all the data without filling the disk first: either deleting in stages so the log doesn't grow too big, or perhaps using TRUNCATE TABLE somehow. I'm not sure there's a safe way to do that, and as I'm not a SQL expert, I wouldn't really know how to do it without destroying my msdb database!
Any help would be appreciated!
I use something like the following:
declare @oldest_date datetime, @newest_date datetime
select @oldest_date = min(backup_start_date) from backupset
select @newest_date = dateadd(day, -45, getdate())
while (@oldest_date <= @newest_date)
begin
    exec sp_delete_backuphistory @oldest_date
    set @oldest_date = dateadd(day, 7, @oldest_date)
end
This will delete a week's worth of history at a time until you're caught up. The nice thing is that you can stick this in a job and run it periodically (weekly, for instance) and it'll do the right thing.
Try to reduce the number of rows you delete in one go. The first parameter to sp_delete_backuphistory is the oldest day to keep.
EXEC sp_delete_backuphistory '2000-01-01'
EXEC sp_delete_backuphistory '2001-01-01'
EXEC sp_delete_backuphistory '2002-01-01'
...
It can also help to lower the recovery model to Simple if it's currently at Full.
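For example, switching msdb to simple recovery and then shrinking its log might look like this. The logical log file name MSDBLog is the usual default, but it's an assumption here; verify it first:

```sql
-- Assumes the default logical log file name for msdb; verify with:
--   SELECT name FROM msdb..sysfiles
ALTER DATABASE msdb SET RECOVERY SIMPLE;
USE msdb;
DBCC SHRINKFILE (MSDBLog, 100);  -- shrink the log to roughly 100 MB
```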
First take a backup.
Then create a database for each year, and restore each year's data into it from the backup file.
Then clear all the log files once the process is complete.
I have a SQL Server 2000 instance with 77 databases, and I want to migrate them to a new SQL Server 2008 R2 server.
I can do this operation individually with attach or restore commands. Is there a script for migrating all 77 databases to the new server running SQL Server 2008 R2?
Thank you
You could write a script to back up each database in a loop and restore it on the other server,
as long as the backup files are visible to both servers. This also allows a WITH MOVE option to allow for different drives/folders.
Backups are smaller than the MDFs/LDFs too, so there's less copying.
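A sketch of the backup half of such a loop, run on the SQL Server 2000 source (the share name is hypothetical, and sysdatabases is used because sys.databases does not exist in 2000):

```sql
-- Sketch: back up every user database to a share visible to both servers.
-- \\NewServer\migrate is a hypothetical path.
DECLARE @name sysname, @sql nvarchar(1000);
DECLARE dbs CURSOR FOR
    SELECT name FROM master..sysdatabases
    WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb');
OPEN dbs;
FETCH NEXT FROM dbs INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'BACKUP DATABASE [' + @name
             + '] TO DISK = ''\\NewServer\migrate\' + @name + '.bak''';
    EXEC (@sql);
    FETCH NEXT FROM dbs INTO @name;
END
CLOSE dbs;
DEALLOCATE dbs;
```

The restore half on the new server would loop the same way, adding WITH MOVE clauses for the new drive layout.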
You will need to produce your own script as you would really want to do more than backup and restore.
Other things you might like to do is run DBCC UpdateUsage, set the compatibility level, update the stats, run DBCC CheckDB with Data_Purity, change the page verify option to checksum. You may have replication and full text catalogues to deal with as well. All these things would probably need to go into your script.
You would need to set up a script that performs all/some/more of the things mentioned previously on a database, and then extend the script to loop through all your databases. This can be done using a combination of batch files or PowerShell files and utilizing sqlcmd.
For example this is one script I run after restoring the backups onto the new server. This is called from a windows batch file via sqlcmd.
USE [master]
GO
ALTER DATABASE [$(DATABASENAME)] SET COMPATIBILITY_LEVEL = 100
ALTER DATABASE [$(DATABASENAME)] SET PAGE_VERIFY CHECKSUM WITH NO_WAIT
GO
Use [$(DATABASENAME)]
Go
Declare @DBO sysname
--who is the sa user
Select @DBO = name
from sys.server_principals
Where principal_id = 1
--assign sa as the DB owner
exec ('sp_changedbowner ''' + @DBO + '''')
go
--fix the counts
dbcc updateusage (0)
go
--check the db include the column value integrity
dbcc checkdb(0) With Data_Purity, ALL_ERRORMSGS, NO_INFOMSGS
go
--make sure the stats are up to date
exec sp_updatestats
Go
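The calling batch file can pass each database name in via sqlcmd's -v switch, which is what populates the $(DATABASENAME) scripting variable in the script above. A minimal sketch, with hypothetical server and file names:

```bat
rem Sketch: run the post-restore script once per database listed in databases.txt
rem (one database name per line). Server name and file names are hypothetical.
for /f %%d in (databases.txt) do (
    sqlcmd -S NewServer -E -i PostRestore.sql -v DATABASENAME=%%d
)
```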
You could use a tool like SQL Compare.
Failing that, I would script them individually.
You could loop round the internal sysobjects table and build a combined script, but I wouldn't.
We had some problems this morning and need to roll back our database by about one hour. Is this possible, and how is it done?
It is a Microsoft SQL Server 2005 database.
1. Find the previous full backup of your database (BF1).
2. Take a backup of the log file (BL1).
3. Take a full backup of the database (BF2). Good to have, in case the following steps go wrong.
4. Restore the previous full backup (BF1) with NORECOVERY.
5. Restore the log file backup (BL1) with RECOVERY, specifying the point in time you want to recover to.
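In T-SQL, those restore steps might look like this. The file names and the STOPAT time are hypothetical; STOPAT rolls the log forward only up to the given moment:

```sql
-- Hypothetical file names and time; adjust to your backup locations.
RESTORE DATABASE MyDB
    FROM DISK = 'D:\Backups\MyDB_full.bak'
    WITH NORECOVERY, REPLACE;

RESTORE LOG MyDB
    FROM DISK = 'D:\Backups\MyDB_log.trn'
    WITH RECOVERY, STOPAT = '2012-07-20 09:00:00';
```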
Select your database.
Then select Tasks > Restore > Database.
On the Restore Database dialog, select the Timeline option.
Enter the time you want to revert your database to.
Click OK, then OK again.
Your database is restored successfully.
I've done some investigating, and it seems that since our database has its recovery model set to SIMPLE, a point-in-time rollback is not possible.
If the database had been set up with the full or bulk-logged recovery model, it would have been easier.