How to back up and restore a table - SQL

During my testing, I want to make a copy of a few tables within the same database before running the tests. After the tests are complete, I want to restore the original tables from the copies.
What is the best way to do this?
I also want to make sure all indexes and constraints are restored.
DECLARE @Tablename NVARCHAR(500)
DECLARE @BuildStr NVARCHAR(500)
DECLARE @SQL NVARCHAR(500)

SELECT @Tablename = 'my_Users'
SELECT @BuildStr = CONVERT(NVARCHAR(16), GETDATE(), 120)
SELECT @BuildStr = REPLACE(REPLACE(REPLACE(@BuildStr, ':', ''), '-', ''), ' ', '')

SET @SQL = 'SELECT * INTO ' + @Tablename + '_' + @BuildStr + ' FROM ' + @Tablename
SELECT @SQL
EXEC (@SQL) -- Execute the SQL statement
How do I restore the original table if I use the above to make a copy?
SQL Server 2005

Something like:
truncate table OriginalTable
insert into OriginalTable select * from CopiedTable
Depending on which database you're using, there are faster alternatives.
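For example, on SQL Server, if CopiedTable was created with an identical schema (including indexes and constraints) on the same filegroup, ALTER TABLE ... SWITCH can move the rows back as a metadata-only operation. This is a sketch under those assumptions; note that the switch empties the copy:

```sql
-- Empty the original first: SWITCH requires the target to be empty
TRUNCATE TABLE OriginalTable;

-- Metadata-only move: schemas must match exactly, same filegroup; empties CopiedTable
ALTER TABLE CopiedTable SWITCH TO OriginalTable;
```

If the copy was made with SELECT * INTO, it will not have the original's indexes and constraints, so this option only applies when the copy was created with a matching definition.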

I think the script I recently used can be useful to somebody.
To back up a table you can use the following query:
DECLARE @tableName nvarchar(max), @tableName_bck nvarchar(max)
SET @tableName = 'SomeTable';
SET @tableName_bck = 'SomeTable_bck';

-- Backup
DECLARE @insertCommand nvarchar(max)
-- SELECT * INTO SomeTable_bck FROM SomeTable
SET @insertCommand = 'SELECT * INTO ' + @tableName_bck + ' FROM ' + @tableName
PRINT @insertCommand
EXEC sp_executesql @insertCommand
For the restore, because tables often have IDENTITY columns, you need to SET IDENTITY_INSERT ON, and you also need to provide the column list when inserting records. That's why the script is a bit more complex:
DECLARE @tableName nvarchar(max), @tableName_bck nvarchar(max)
SET @tableName = 'SomeTable';
SET @tableName_bck = 'SomeTable_bck';

-- Restore
DECLARE @columnList nvarchar(max)
DECLARE @insertCommand nvarchar(max)

SELECT @columnList = SUBSTRING(
    (
        SELECT ', ' + column_name AS [text()]
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE table_name = @tableName
        ORDER BY ordinal_position
        FOR XML PATH ('')
    ), 3, 4000);

-- INSERT INTO SomeTable(Column1, Column2) SELECT Column1, Column2 FROM SomeTable_bck
SELECT @insertCommand = 'INSERT INTO ' + @tableName + '(' + @columnList + ') SELECT ' + @columnList + ' FROM ' + @tableName_bck

IF EXISTS (
    SELECT column_name, table_name
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE table_schema = 'dbo' AND table_name = @tableName
      AND COLUMNPROPERTY(object_id(table_name), column_name, 'IsIdentity') = 1
)
BEGIN
    SET @insertCommand =
        'SET IDENTITY_INSERT ' + @tableName + ' ON;'
        + 'TRUNCATE TABLE ' + @tableName + ';'
        + @insertCommand + ';'
        + 'SET IDENTITY_INSERT ' + @tableName + ' OFF;'
    /*
    SET IDENTITY_INSERT SomeTable ON
    TRUNCATE TABLE SomeTable
    INSERT INTO SomeTable(Column1, Column2) SELECT Column1, Column2 FROM SomeTable_bck
    SET IDENTITY_INSERT SomeTable OFF
    */
END
ELSE
BEGIN
    SET @insertCommand =
        'TRUNCATE TABLE ' + @tableName + ';'
        + @insertCommand
    /*
    TRUNCATE TABLE SomeTable
    INSERT INTO SomeTable(Column1, Column2) SELECT Column1, Column2 FROM SomeTable_bck
    */
END

PRINT @insertCommand
EXEC sp_executesql @insertCommand
It's easy to see that you can set @tableName and @tableName_bck to whatever you like. For example, this can go in a stored procedure, so the script is reusable.
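As a sketch, the backup half could be wrapped up like this (the procedure name is invented here; QUOTENAME is added to guard against unusual table names):

```sql
CREATE PROCEDURE dbo.usp_BackupTable
    @tableName sysname,
    @backupName sysname
AS
BEGIN
    DECLARE @cmd nvarchar(max);
    -- Build and run the SELECT INTO dynamically, quoting both names
    SET @cmd = 'SELECT * INTO ' + QUOTENAME(@backupName)
             + ' FROM ' + QUOTENAME(@tableName);
    EXEC sp_executesql @cmd;
END
```

Usage would then be a one-liner per table, e.g. `EXEC dbo.usp_BackupTable @tableName = 'SomeTable', @backupName = 'SomeTable_bck';`.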

There are MANY methods to do this, but by far the simplest is to take a backup of the database, do your work, then restore from the backup when done. (Instructions here)
Backing up the table is certainly viable, but it's not the easiest method, and once you start working with multiple tables, it gets harder. So rather than address your specific example of restoring a single table, I'm offering general advice on better management of test data.
The safest way of doing this is to NOT restore the original, but rather to not even touch the original. Take a backup of it, and then restore it to a new test server. (Instructions here) Best practices dictate that you should never be doing test or development work on a live database anyway. This is also pretty easy, as well as safe.

Have you considered using a SQL Server unit-testing framework such as the open source tSQLt framework?
See http://tsqlt.org/
A tSQLt test runs in a transaction so whatever you do within your test will get rolled back.
It has a concept of a "FakeTable", which is a copy of the original table minus the constraints, in case those get in the way of your test setup.
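A minimal tSQLt test might look like this; the test class name, table, and column names below are invented for illustration:

```sql
-- Create a test class (a schema that groups tests)
EXEC tSQLt.NewTestClass 'MyTests';
GO
CREATE PROCEDURE MyTests.[test user count after setup]
AS
BEGIN
    -- Replace dbo.my_Users with an empty, constraint-free shell for this test only
    EXEC tSQLt.FakeTable 'dbo.my_Users';

    -- Arrange: insert just the rows this test needs
    INSERT INTO dbo.my_Users (UserId, UserName) VALUES (1, 'Alice');

    -- Assert
    DECLARE @actual int;
    SELECT @actual = COUNT(*) FROM dbo.my_Users;
    EXEC tSQLt.AssertEquals @Expected = 1, @Actual = @actual;
END;
GO
EXEC tSQLt.Run 'MyTests';
```

Because the test runs inside a transaction, the fake table and any inserts are rolled back automatically when the test finishes.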

Related

SQL - Search for table name across all databases on server

I thought this would be pretty straightforward, but I have about 80 databases on the server I am looking at, and each database has 5-500 tables.
I am wondering how I can search for a TABLE NAME across everything. I tried a basic
SELECT * FROM sys.tables
but I only get 6 results.
This is a bit of a hack, but I think it should work:
sp_msforeachdb 'select ''?'' from ?.information_schema.tables where table_name=''YourTableName''';
It will output the names of the DBs that contain a table with the given name.
Here's a version using print that is a little better IMHO:
sp_msforeachdb '
if exists(select * from ?.information_schema.tables where table_name=''YourTableName'')
print ''?'' ';
The above queries use sp_MSforeachdb, an undocumented stored procedure that runs a given query against every database on the current server.
This version uses FOR XML PATH('') instead of string concatenation, eliminates the default system databases, handles databases with non-standard names and supports a search pattern.
DECLARE @pattern NVARCHAR(128) = '%yourpattern%';
DECLARE @sql NVARCHAR(max) = STUFF((
    SELECT 'union all select DatabaseName = name from ' + QUOTENAME(d.name) + '.sys.tables where name like ''' + @pattern + ''' '
    FROM sys.databases d
    WHERE d.database_id > 4
    FOR XML path('')
    ), 1, 10, '');
EXEC sp_executesql @sql;
You might need to write:
select DatabaseName = name collate Latin1_General_CI_AS
I know I did.
Just because I really dislike loops I wanted to post an alternative to answers already posted that are using cursors.
This leverages dynamic SQL and the sys.databases catalog view.
declare @SQL nvarchar(max) = ''

select @SQL = @SQL + 'select DatabaseName = name from [' + name + '].sys.tables where name = ''YourTableName'' union all '
from sys.databases

set @SQL = stuff(@SQL, len(@SQL) - 9, 11, '') -- removes the last UNION ALL
exec sp_executesql @SQL
Here's a bit of a simpler option using dynamic SQL. This will get you the names of all tables in every database in your environment:
declare @table table (idx int identity, name varchar(max))

insert @table
select name from master.sys.databases

declare @dbname varchar(max)
declare @iterator int = 1

while @iterator <= (select max(idx) from @table)
begin
    select @dbname = name from @table where idx = @iterator
    exec('use [' + @dbname + '] select name from sys.tables')
    set @iterator = @iterator + 1
end

select * from @table
From VB.NET you can build the same query against a database chosen in a combo box:
Dim sql As String = ("Select * from " & ComboboxDatabaseName.Text & ".sys.tables")

Iterate Through and Rename All Tables w/ Object Qualifiers MSSQL

I need to iterate through all of the tables that begin with a specific prefix to rename them. The code I've tried is below, but it ends with one of two results: either it crashes SSMS (sometimes), or I get the error message below for each table. I've tried with and without dbo.
Can anyone tell me what I'm doing wrong or perhaps suggest a better way to do this?
No item by the name of 'dbo.prefix_TableName' could be found in the current database 'DatabaseName', given that @itemtype was input as '(null)'.
Here's the code I'm running...
SET NOCOUNT ON;
USE [DatabaseName];
DECLARE @oq NVARCHAR(5), @tableName NVARCHAR(128), @newTableName NVARCHAR(128);
SET @oq = N'prefix_';
/*
find and rename all tables
*/
DECLARE [tableCursor] CURSOR FOR
    SELECT [TABLE_NAME] FROM INFORMATION_SCHEMA.TABLES
    WHERE [TABLE_TYPE] = 'BASE TABLE' AND [TABLE_NAME] LIKE @oq + '%'
    ORDER BY [TABLE_NAME];
OPEN [tableCursor]
FETCH NEXT FROM [tableCursor] INTO @tableName
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @newTableName = REPLACE(@tableName, @oq, N'');
    EXEC('EXEC sp_rename ''dbo.' + @tableName + ''', ''' + @newTableName + '''');
END
CLOSE [tableCursor];
DEALLOCATE [tableCursor];
A simpler solution without cursors:
declare @oq nvarchar(max) = N'prefix_'
declare @cmd nvarchar(max)

select @cmd = a from (
    select 'EXEC sp_rename ''' + TABLE_NAME + ''', ''' + REPLACE(TABLE_NAME, @oq, N'') + '''; '
    from INFORMATION_SCHEMA.TABLES
    where TABLE_TYPE = 'BASE TABLE' and TABLE_NAME like @oq + '%'
    for xml path('')
) t(a)

exec sp_executesql @cmd
In your example, nvarchar(5) causes truncation (N'prefix_' is seven characters); you probably need nvarchar(7) or nvarchar(max). Also note that your loop never executes FETCH NEXT inside the WHILE block, so it spins forever on the first row, which is what hangs SSMS.

SQL Server : query to insert data into a table from another table with a different structure

I have two tables in two different databases.
My first table is an older version and has fewer columns than the second table.
I want to copy the contents of my old table into my new table.
There are several tables like this in each database.
How can I quickly copy the data from the old tables to the new ones without having to write out the column names manually for each table?
Thanks!
You can "avoid writing the column names manually" in SSMS by dragging and dropping the "Columns" folder under the table in the Object Explorer over to a query window (just hold the dragged item over whitespace or the character position where you want the names to appear). All the column names will be displayed separated by commas.
You could also try something like this to get just the list of columns that are common between two tables (then writing the INSERT statement is trivial).
SELECT
    Substring((
        SELECT
            ', ' + S.COLUMN_NAME
        FROM
            INFORMATION_SCHEMA.COLUMNS S
            INNER JOIN INFORMATION_SCHEMA.COLUMNS D
                ON S.COLUMN_NAME = D.COLUMN_NAME
        WHERE
            S.TABLE_SCHEMA = 'dbo'
            AND S.TABLE_NAME = 'Source Table'
            AND D.TABLE_SCHEMA = 'dbo'
            AND D.TABLE_NAME = 'Destination Table'
        FOR XML PATH(''), TYPE
    ).value('.', 'nvarchar(max)'), 3, 2147483647)
;
You could also create an SSIS package that simply moves all the data from one table to the other. Column names that match would automatically be linked up. Depending on your familiarity with SSIS, this could take you 2 minutes, or it could take you 2 hours.
The following code should do the work.
Basically what it does is:
1. Collect the column names from both tables.
2. Intersect the column names, to filter out columns that exist in only one table.
3. Build a string of the column names, delimited by commas.
4. Use the string from step 3 to create the insert command.
5. Execute the command from step 4.
--BEGIN TRAN
DECLARE @oldName NVARCHAR(50) = 'OldTableName', @newName NVARCHAR(50) = 'newTableName'
DECLARE @oldDBName NVARCHAR(200) = '[OldDBName].[dbo].[' + @oldName + ']', @newDBName NVARCHAR(200) = '[newDBName].[dbo].[' + @newName + ']'

/* This table variable will hold the columns that exist in both tables */
DECLARE @tCommonColumns TABLE (
    ColumnsName NVARCHAR(max) NOT NULL
);

INSERT INTO @tCommonColumns
SELECT column_name
FROM information_schema.columns
WHERE table_name = @oldName
  AND COLUMNPROPERTY(object_id(@oldName), column_name, 'IsIdentity') = 0 -- omit identity columns
INTERSECT
SELECT column_name
FROM information_schema.columns
WHERE table_name = @newName
  AND COLUMNPROPERTY(object_id(@newName), column_name, 'IsIdentity') = 0 -- omit identity columns
--SELECT * FROM @tCommonColumns

/* Get the columns as a comma-separated string */
DECLARE @columns NVARCHAR(max)
SELECT @columns = STUFF((SELECT ', ' + cols.ColumnsName
                         FROM @tCommonColumns cols
                         FOR XML Path('')), 1, 1, '')
PRINT @columns

/* Create the insert command */
DECLARE @InsertCmd NVARCHAR(max)
SET @InsertCmd =
    'INSERT INTO ' + @newDBName + ' (' + @columns + ')
     SELECT ' + @columns + ' FROM ' + @oldDBName
PRINT @InsertCmd

/* Execute the command */
EXECUTE sp_executesql @InsertCmd
--ROLLBACK
Please note that this script might fail if you have FOREIGN KEY constraints that are satisfied in the old table but not in the new table.
Edit:
The query was updated to omit identity columns.
Edit 2:
The query was updated to support tables in different databases (make sure you set the @oldName, @newName, @oldDBName and @newDBName variables to match your actual table and database names).
Thanks all!
Here is a more generic version of it :)
--BEGIN TRAN
DECLARE @Tablename NVARCHAR(50)
SET @Tablename = 'tableName'
DECLARE @Schemaname NVARCHAR(50)
SET @Schemaname = 'schemaName'
DECLARE @Datasource NVARCHAR(50)
SET @Datasource = 'dataSource'
DECLARE @Datadest NVARCHAR(50)
SET @Datadest = 'dataDestination'

/* This table variable will hold the columns that exist in both tables */
DECLARE @tCommonColumns TABLE (
    ColumnsName NVARCHAR(max) NOT NULL
);

DECLARE @sql NVARCHAR(max)
SET @sql = 'SELECT column_name
    FROM ' + @Datasource + '.information_schema.columns
    WHERE table_name = ''' + @Tablename + '''
    AND COLUMNPROPERTY(object_id(''' + @Datasource + '.' + @Schemaname + '.' + @Tablename + '''), column_name, ''IsIdentity'') = 0' -- omit identity columns
SET @sql = @sql + ' INTERSECT
    SELECT column_name
    FROM ' + @Datadest + '.information_schema.columns
    WHERE table_name = ''' + @Tablename + '''
    AND COLUMNPROPERTY(object_id(''' + @Datadest + '.' + @Schemaname + '.' + @Tablename + '''), column_name, ''IsIdentity'') = 0' -- omit identity columns

INSERT INTO @tCommonColumns EXECUTE sp_executesql @sql
-- SELECT * FROM @tCommonColumns

/* Get the columns as a comma-separated string */
DECLARE @columns NVARCHAR(max)
SELECT @columns = STUFF((SELECT ', ' + cols.ColumnsName
                         FROM @tCommonColumns cols
                         FOR XML Path('')), 1, 1, '')
--PRINT @columns

/* Create the insert command */
DECLARE @InsertCmd NVARCHAR(max)
SET @InsertCmd =
    'INSERT INTO ' + @Datadest + '.' + @Schemaname + '.' + @Tablename + ' (' + @columns + ')
     SELECT ' + @columns + ' FROM ' + @Datasource + '.' + @Schemaname + '.' + @Tablename
PRINT @InsertCmd

/* Execute the command */
--EXECUTE sp_executesql @InsertCmd
--ROLLBACK
Something like this:
Insert into dbo.Newtbl
SELECT * FROM dbo.OldTbl

Fastest way to copy sql table

I'm looking for the fastest way to copy a table and its contents on my SQL Server: just a simple copy of the table, with the source and destination in the same server/database.
Currently, with a stored procedure running a SELECT * INTO statement, it takes 6.75 minutes to copy over 4.7 million records. This is too slow.
CREATE PROCEDURE [dbo].[CopyTable1]
AS
BEGIN
    DECLARE @mainTable VARCHAR(255),
            @backupTable VARCHAR(255),
            @sql NVARCHAR(max);
    SET NOCOUNT ON;

    SET @mainTable = 'Table1'
    SET @backupTable = @mainTable + '_Previous'

    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(@backupTable) AND type in (N'U'))
    BEGIN
        SET @sql = 'if exists (select * from sysobjects '
        SET @sql = @sql + 'where id = object_id(N''[' + @backupTable + ']'') and '
        SET @sql = @sql + 'OBJECTPROPERTY(id, N''IsUserTable'') = 1) ' + CHAR(13)
        SET @sql = @sql + 'DROP TABLE [' + @backupTable + ']'
        EXEC (@sql)
    END

    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(@mainTable) AND type in (N'U'))
    BEGIN
        SET @sql = 'SELECT * INTO dbo.[' + @backupTable + '] FROM dbo.[' + @mainTable + ']'
        EXEC (@sql)
    END
END
If you are concerned about speed, it seems you have two alternatives: copying by block, or the BCP/bulk-insert method.
Block Transfer
DECLARE @CurrentRow bigint, @RowCount bigint, @CurrentBlock bigint

SET @CurrentRow = 1

SELECT @RowCount = Count(*)
FROM oldtable WITH (NOLOCK)

WHILE @CurrentRow < @RowCount
BEGIN
    SET @CurrentBlock = @CurrentRow + 1000000

    INSERT INTO newtable (FIELDS, GO, HERE)
    SELECT FIELDS, GO, HERE
    FROM (
        SELECT FIELDS, GO, HERE,
               ROW_NUMBER() OVER (ORDER BY SomeColumn) AS RowNum
        FROM oldtable WITH (NOLOCK)
    ) AS MyDerivedTable
    WHERE MyDerivedTable.RowNum BETWEEN @CurrentRow AND @CurrentBlock

    SET @CurrentRow = @CurrentBlock + 1
END
How to copy a huge table data into another table in SQL Server
BCP/Bulk Insert
SELECT *
INTO NewTable
FROM OldTable
WHERE 1 = 2

BULK INSERT NewTable
FROM 'c:\temp\OldTable.txt'
WITH (DATAFILETYPE = 'native')
What is the fastest way to copy data from one table to another
http://www.databasejournal.com/features/mssql/article.php/3507171/Transferring-Data-from-One-Table-to-Another.htm
You seem to want to copy a table that is a heap and has no indexes. That is the easiest case to get right. Just do a
insert into Target with (tablock) select * from Source
Make sure that minimal logging for bulk operations is enabled (search for that term), and switch to the simple recovery model.
This will take up almost no log space, because only allocations are logged with minimal logging.
This just scans the source in allocation order and appends newly bulk-allocated pages to the target.
Again, you have asked about the easiest case. Things get more complicated when indexes come into play.
Why not insert in batches? It's not necessary. Log space is not an issue. And because the target is not sorted (it is a heap) we don't need sort buffers.
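As a sketch, the setup for a minimally logged heap copy might look like this; the database and table names are placeholders:

```sql
-- One-time setup: the simple recovery model lets bulk operations be minimally logged
ALTER DATABASE TestDb SET RECOVERY SIMPLE;

-- The TABLOCK hint is required for INSERT ... SELECT into a heap to qualify
INSERT INTO Target WITH (TABLOCK)
SELECT * FROM Source;
```

Switching the recovery model breaks the transaction-log backup chain, so only do this on a test or development database.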

SQL To search the entire MS SQL 2000 database for a value

I would like to search an entire MS SQL 2000 database for one value. This would be to aid development only. Keep that in mind when considering this question.
This will get all the table names and the column of the data type I'm looking for:
SELECT Columns.COLUMN_NAME, tables.TABLE_NAME
FROM INFORMATION_SCHEMA.Columns as Columns
JOIN INFORMATION_SCHEMA.TABLES as tables
On Columns.TABLE_NAME = tables.TABLE_NAME
WHERE Columns.DATA_TYPE = 'INT'
I was thinking something like this:
-- Vars
DECLARE @COUNTER INT
DECLARE @TOTAL INT
DECLARE @TABLE CHAR(128)
DECLARE @COLUMN CHAR(128)
DECLARE @COLUMNTYPE CHAR(128)
DECLARE @COLUMNVALUE INT

-- What we are looking for
SET @COLUMNTYPE = 'INT'
SET @COLUMNVALUE = 3
SET @COUNTER = 0

-- Find out how many possible columns exist
SELECT @TOTAL = COUNT(*)
FROM INFORMATION_SCHEMA.Columns as Columns
JOIN INFORMATION_SCHEMA.TABLES as tables
  On Columns.TABLE_NAME = tables.TABLE_NAME
WHERE Columns.DATA_TYPE = @COLUMNTYPE

PRINT CAST(@TOTAL AS CHAR) + ' possible columns'

WHILE @COUNTER < @TOTAL
BEGIN
    SET @COUNTER = @COUNTER + 1
    -- ADD MAGIC HERE
END
Any ideas?
UPDATE I recently found this tool that works quite well.
Since it is dev only (and probably doesn't have to be very elegant), how about using TSQL to generate a pile of TSQL that you then copy back into the query window and execute?
SELECT 'SELECT * FROM [' + tables.TABLE_NAME + '] WHERE ['
       + Columns.Column_Name + '] = ' + CONVERT(varchar(50), @COLUMNVALUE)
FROM INFORMATION_SCHEMA.Columns as Columns
INNER JOIN INFORMATION_SCHEMA.TABLES as tables
    On Columns.TABLE_NAME = tables.TABLE_NAME
WHERE Columns.DATA_TYPE = @COLUMNTYPE
It won't be pretty, but it should work... an alternative might be to insert something like the above into a table variable, then loop over the table variable using EXEC (@Sql). But for dev purposes it probably isn't worth it...
I've found this script to be helpful... but as Marc noted, it wasn't really worth it. I've only used it a handful of times since I wrote it six months ago.
It only really comes in handy because there are a couple of tables in our dev environment which cause binding errors when you query them, and I always forget which ones.
BEGIN TRAN

declare @search nvarchar(100)
set @search = 'string to search for'

-- search whole database for text
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

IF nullif(object_id('tempdb..#tmpSearch'), 0) IS NOT NULL DROP TABLE #tmpSearch
CREATE TABLE #tmpSearch (
    ListIndex int identity(1,1),
    CustomSQL nvarchar(2000)
)

Print 'Getting tables...'

INSERT #tmpSearch (CustomSQL)
select 'IF EXISTS (select * FROM [' + TABLE_NAME + '] WHERE [' + COLUMN_NAME + '] LIKE ''%' + @search + '%'') BEGIN PRINT ''Table ' + TABLE_NAME + ', Column ' + COLUMN_NAME + ''';select * FROM [' + TABLE_NAME + '] WHERE [' + COLUMN_NAME + '] LIKE ''%' + @search + '%'' END'
from information_schema.columns
where DATA_TYPE IN ('ntext', 'nvarchar', 'uniqueidentifier', 'char', 'varchar', 'text')
and TABLE_NAME NOT IN ('table_you_dont_want_to_look_in', 'and_another_one')

Print 'Searching...'

declare @index int
declare @customsql nvarchar(2000)

WHILE EXISTS (SELECT * FROM #tmpSearch)
BEGIN
    SELECT @index = min(ListIndex) FROM #tmpSearch
    SELECT @customsql = CustomSQL FROM #tmpSearch WHERE ListIndex = @index
    IF @customsql IS NOT NULL
        EXECUTE (@customsql)
    SET NOCOUNT ON
    DELETE #tmpSearch WHERE ListIndex = @index
    SET NOCOUNT OFF
END

print 'the end.'
ROLLBACK