Copy local SQL database onto Azure

I want to do the - should be simple - task of copying a database from local to live...
I have all the data and table structure I want in my local database.
I have a currently running live database that has all the backup stuff assigned to it etc., so I don't just want to create a brand new database (not that I can find how to do this either)...
I just want to remove all tables from that database and then copy all my data and tables from the local into the live azure sql database.
How do I do this???

You can try to achieve this with SQL Server Management Studio.

In SQL Server Management Studio, right click on the database you want to copy data from...
Go to Tasks > Generate Scripts
Select just your tables, not the entire database object
Open up the Azure database that you want to copy into in Visual Studio
Open a new query for that database and paste the generated script in
Execute and pray to the gods
Jump around because it worked, now run the following command on the Azure database to disable the foreign key constraints:
DECLARE @sql NVARCHAR(MAX) = N'';
;WITH x AS
(
    SELECT DISTINCT obj =
        QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id)) + '.'
        + QUOTENAME(OBJECT_NAME(parent_object_id))
    FROM sys.foreign_keys
)
SELECT @sql += N'ALTER TABLE ' + obj + ' NOCHECK CONSTRAINT ALL;
' FROM x;
EXEC sp_executesql @sql;
Now go into SQL Server Management Studio, right click on your local database and go to Tasks and then Export Data.
Export to your Azure database, but make sure to edit the mappings and tick the identity insert box.
The data is moved; now set the foreign keys back using this on your Azure database:
DECLARE @sql NVARCHAR(MAX) = N'';
;WITH x AS
(
    SELECT DISTINCT obj =
        QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id)) + '.'
        + QUOTENAME(OBJECT_NAME(parent_object_id))
    FROM sys.foreign_keys
)
SELECT @sql += N'ALTER TABLE ' + obj + ' WITH CHECK CHECK CONSTRAINT ALL;
' FROM x;
EXEC sp_executesql @sql;

Related

DENY deletion for schema except one table

We have some 50 tables, and we need to deny write permissions to every table except for one table for a particular user.
How can we do this?
Here is a way to do it with dynamic SQL. The PRINT may not show you the whole command because of output limitations in Management Studio. You'll need to update the username and exception.
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'';

SELECT @sql = @sql + '
    DENY UPDATE, DELETE, INSERT ON ' +
    QUOTENAME(SCHEMA_NAME([schema_id])) + '.'
    + QUOTENAME(name) + ' TO [username];' -- fix this username
FROM sys.tables
WHERE name <> 'exception'; -- fix this to be the one you want to allow

PRINT @sql;
-- EXEC sp_executesql @sql;
In SQL Server Management Studio:
Go to the properties page for the user, then the User Mapping tab.
Tick public and db_datareader (do not tick db_denydatawriter) for the appropriate database.
This will only grant them read access.
Then you can grant insert and update for the user to the table using a query similar to this:
GRANT INSERT ON OBJECT::dbo.MyTable TO Fred
GRANT UPDATE ON OBJECT::dbo.MyTable TO Fred

MS-SQL: Changing the FileGrowth parameters of a database generically

In our software the user can create databases as well as connect to databases that were not created by our software. The DBMS is Microsoft SQL-Server.
Now I need to update the databases that we use and set the FileGrowth parameter of all the files of all the databases to a certain value.
I know how to get the logical file names of the files of the current database from a query:
SELECT file_id, name as [logical_file_name], physical_name FROM sys.database_files
And I know how to set the desired FileGrowth value, once I know the logical file name:
ALTER DATABASE MyDB MODIFY FILE (Name='<logical file name>', FileGrowth=10%)
But I don't know how to combine these two steps into one script.
Since there are various databases I can't hard code the logical file names into the script.
And for the update process (right now) we only have the possibility to get the connection of a database and execute sql scripts on this connection, so a "pure" script solution would be best, if that's possible.
The following script receives a database name as a parameter and uses two levels of dynamic SQL: one for a cursor to cycle through the database files of the chosen database, and another to apply the proper ALTER DATABASE command, since you can't use a variable for the file name in MODIFY FILE.
The EXEC is commented out in both places and there's a PRINT instead, so you can review the commands before executing. I've just tested it on my sandbox and it's working as expected.
DECLARE @DatabaseName VARCHAR(100) = 'DBName'

DECLARE @DynamicSQLCursor VARCHAR(MAX) = '
    USE ' + @DatabaseName + ';
    DECLARE @FileName VARCHAR(100)
    DECLARE FileCursor CURSOR FOR
        SELECT S.name FROM sys.database_files AS S
    OPEN FileCursor
    FETCH NEXT FROM FileCursor INTO @FileName
    WHILE @@FETCH_STATUS = 0
    BEGIN
        DECLARE @DynamicSQLAlterDatabase VARCHAR(MAX) = ''
            ALTER DATABASE ' + @DatabaseName + ' MODIFY FILE (Name = '''''' + @FileName + '''''', FileGrowth = 10%)''
        -- EXEC (@DynamicSQLAlterDatabase)
        PRINT (@DynamicSQLAlterDatabase)
        FETCH NEXT FROM FileCursor INTO @FileName
    END
    CLOSE FileCursor
    DEALLOCATE FileCursor '

-- EXEC (@DynamicSQLCursor)
PRINT (@DynamicSQLCursor)
You might want to check for the usual dynamic SQL caveats like making sure the values being concatenated won't break the SQL and also add error handling.
As for how to apply this to several databases, you can create an SP and execute it several times, or wrap a database name cursor / while loop over this.
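For example, if you wrap the script above in a stored procedure that takes the database name as its parameter (the name dbo.usp_SetFileGrowth below is hypothetical), a minimal sketch of such a database-name cursor could look like this:
DECLARE @DbName SYSNAME

DECLARE DbCursor CURSOR FOR
    SELECT name
    FROM sys.databases
    WHERE database_id > 4 -- skip the system databases

OPEN DbCursor
FETCH NEXT FROM DbCursor INTO @DbName

WHILE @@FETCH_STATUS = 0
BEGIN
    -- hypothetical wrapper procedure around the FileGrowth script above
    EXEC dbo.usp_SetFileGrowth @DatabaseName = @DbName
    FETCH NEXT FROM DbCursor INTO @DbName
END

CLOSE DbCursor
DEALLOCATE DbCursor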

Get list of all databases that have a view named 'foo' in them

I have a few servers that have a bunch of databases in them.
Some of the databases have a view called vw_mydata.
What I want to do is create a list of all databases containing a view named vw_mydata and then execute that view and store its contents in a table that then contains all the data from all the vw_mydata views.
I know I can find all the databases containing that view using
sp_msforeachdb 'select "?" AS dbName from [?].sys.views where name like ''vw_mydata'''
But then I have as many recordsets as I have databases. How do I use that to loop through the databases?
What I would prefer is a single neat list of the database names that I can then store in a result set. Then it would be pretty straightforward.
I have thought about running the above T-SQL and storing the results in a table, but I would like to keep it all in one SSIS package and not have all kinds of tables/procedures lying around. Can I use a #table in an Execute SQL Task in SSIS?
DECLARE @Tsql VARCHAR(MAX)
SET @Tsql = ''

SELECT @Tsql = @Tsql + 'SELECT ''' + d.name + ''' AS dbName FROM [' + d.name + '].sys.views WHERE name LIKE ''vw_mydata'' UNION '
FROM master.sys.databases d

--"trim" the last UNION from the end of the tsql.
SET @Tsql = LEFT(@Tsql, LEN(@Tsql) - 6)

PRINT @Tsql

--Uncomment when ready to proceed
--EXEC (@Tsql)
To use a temp table in SSIS, you'll need to use a global temp table (##TABLE).
On the properties for the connection, I'm pretty sure you'll need to set RetainSameConnection to TRUE.
On the SQL task after you create the temp table, you'll need to set DelayValidation to TRUE.
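As a rough sketch of how that might fit together (the table name ##dbs_with_view is just an example), the generated query can feed the global temp table directly with INSERT ... EXEC inside an Execute SQL Task:
IF OBJECT_ID('tempdb..##dbs_with_view') IS NOT NULL
    DROP TABLE ##dbs_with_view;
CREATE TABLE ##dbs_with_view (dbName SYSNAME);

DECLARE @Tsql NVARCHAR(MAX) = N'';
SELECT @Tsql = @Tsql + N'SELECT ''' + d.name + N''' AS dbName FROM [' + d.name + N'].sys.views WHERE name LIKE ''vw_mydata'' UNION '
FROM master.sys.databases d;
SET @Tsql = LEFT(@Tsql, LEN(@Tsql) - 6); -- strip the trailing UNION

INSERT INTO ##dbs_with_view (dbName)
EXEC (@Tsql);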

Drop multiple databases in SQL Azure

I would like to run a script to drop multiple databases from SQL Azure as soon as I finish using them. When I tried the following,
DECLARE @dbname varchar(100);
DECLARE @stmt nvarchar(3000);
SET @dbname = '6A732E0B';

SELECT @stmt = (SELECT 'DROP DATABASE [' + name + ']; ' FROM sys.databases
    WHERE name LIKE '%' + @dbname + '%');

EXEC sp_executesql @stmt;
SQL Azure throws the error message “The DROP DATABASE statement must be the only statement in the batch”.
Can somebody help me on this?
This is a known limitation in SQL Azure - certain statements need to be in a batch by themselves to be executed. This includes CREATE DATABASE, ALTER DATABASE, DROP DATABASE and a few more.
To solve your problem, you can create a loop in your application where you iterate over all the databases and drop them by issuing the DROP DATABASE statements in separate batches.
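If you want to keep the T-SQL side simple, one workaround is to only generate the DROP statements and then have your application (or SSMS, with GO between them) run each one as its own batch. A minimal sketch of the generating query:
DECLARE @dbname varchar(100) = '6A732E0B';

-- each row returned is one statement to run in its own batch
SELECT 'DROP DATABASE [' + name + '];'
FROM sys.databases
WHERE name LIKE '%' + @dbname + '%';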
I believe this is a bug of SQL Azure. I've recently reported it to Microsoft:
https://connect.microsoft.com/SQLServer/feedback/details/684160/sp-executesql-the-drop-database-statement-must-be-the-only-statement-in-the-batch

How to move tables from one SQL Server database to another?

We have a database that has grown to about 50GB and we want to pull out a certain set of tables (about 20 of them) from within that database and move them into a new database. All of this would be on the same SQL Server. The tables that we want to pull out are about 12GB of space (6GB data, 6GB indexes).
How can we move the tables from one database to the second but make sure the tables that are created in the new database are an exact copy of the originals (indexes, keys, etc.)? Ideally I want a copy/paste from within SQL Server Management Studio but I know this does not exist, so what are my options?
To do this really easily with SQL Server 2008 Management Studio:
1.) Right click on the database (not the table) and select Tasks -> Generate Scripts
2.) Click Next on the first page
3.) If you want to copy the whole database, just click next. If you want to copy specific tables, click on "Select Specific Database Objects", select the tables you want, and then click next.
4.) Select "Save to Clipboard" or "Save to File". IMPORTANT: Click the Advanced button next to "Save to File", find "Types of data to script", and change "Schema only" to "Schema and data" (if you want to create the table) or "Data only" (if you're copying data to an existing table). This is also where you'd set other options such as exactly what keys to copy, etc.
5.) Click through the rest and you're done!
If you're moving the tables to a whole new database just because of growth, you might be better off considering using filegroups in your existing database instead. There will be a lot fewer headaches going forward than trying to deal with two separate databases.
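As a rough illustration of the filegroup route (all names, paths, and sizes below are made up, and it assumes the table already has a clustered index you can rebuild), moving a large table onto its own filegroup looks something like this:
ALTER DATABASE MyBigDb ADD FILEGROUP HistoryFG;

ALTER DATABASE MyBigDb ADD FILE
    (NAME = 'HistoryData', FILENAME = 'D:\Data\MyBigDb_History.ndf', SIZE = 5GB)
TO FILEGROUP HistoryFG;

-- rebuilding the clustered index on the new filegroup moves the table's data there
CREATE CLUSTERED INDEX IX_BigTable_Id
    ON dbo.BigTable (Id)
    WITH (DROP_EXISTING = ON)
    ON HistoryFG;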
EDIT
As I mentioned in my comments below, if you truly need a new database, depending on the total number of tables involved, it might be easier to restore a backup of the database under the new name and drop the tables you don't want.
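A rough sketch of that approach (database names, file names, and paths are placeholders; get the logical file names from RESTORE FILELISTONLY against your backup):
RESTORE DATABASE NewDb
FROM DISK = 'D:\Backups\OldDb.bak'
WITH MOVE 'OldDb_Data' TO 'D:\Data\NewDb.mdf',
     MOVE 'OldDb_Log'  TO 'D:\Data\NewDb_log.ldf';

-- then drop the tables you don't need in the copy
DROP TABLE dbo.UnwantedTable1, dbo.UnwantedTable2;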
I did also find this potential solution using SQL Server Management Studio. You can generate the scripts for the specific tables to move and then export the data using the Generate Scripts Wizard and Import/Export Wizard in SQL Server Management Studio. Then on the new database you would run the scripts to create all of the objects and then import the data. We are probably going to go with the backup/restore method as described in @Joe Stefanelli's answer but I did find this method and wanted to post it for others to see.
To generate the sql script for the objects:
SQL Server Management Studio > Databases > Database1 > Tasks > Generate Scripts...
The SQL Server Scripts Wizard will start and you can choose the objects and settings to export into scripts
By default the scripting of indexes and triggers is not included, so make sure to turn these on (and any others that you are interested in).
To export the data from the tables:
SQL Server Management Studio > Databases > Database1 > Tasks > Export Data...
Choose the source and destination databases
Select the tables to export
Make sure to check the Identity Insert checkbox for each table so that new identities are not created.
Then create the new database, run the scripts to create all of the objects, and then import the data.
If you like/have SSIS you can explore using the Copy SQL Objects Task component to do this.
Try DBSourceTools.
http://dbsourcetools.codeplex.com.
This toolset uses SMO to script tables and data to disk, and also allows you to select which tables / views / Stored procedures to include.
When using a "deployment target", it will also automatically handle dependencies.
I have used it repeatedly for exactly this type of problem, and it's extremely simple and fast.
SELECT *
INTO NewDatabase.dbo.new_table_name
FROM old_tablename
A lazy, efficient way to do this in T-SQL:
In my case, some of the tables are large, so scripting out the data is impractical.
Also, we needed to migrate just a fraction of an otherwise very large database, so I didn't want to do backup / restore.
So I went with INSERT INTO / SELECT FROM and used information_schema etc to generate the code.
Step 1: create your tables on new DB
For every table you want to migrate to new database, create that table on new database.
Either script out the tables, or use SQL Compare, dynamic sql from information_schema -- many ways to do it. dallin's answer shows one way using SSMS (but be sure to select schema only).
Step 2: create UDF on target DB to produce column list
This is just a helper function used in generation of code.
USE [staging_edw]
GO

CREATE FUNCTION dbo.udf_get_column_list
(
    @table_name VARCHAR(8000)
)
RETURNS VARCHAR(8000)
AS
BEGIN
    DECLARE @var VARCHAR(8000)

    SELECT
        @var = COALESCE(@var + ',', '', '') + c.COLUMN_NAME
    FROM INFORMATION_SCHEMA.columns c
    WHERE c.TABLE_SCHEMA + '.' + c.TABLE_NAME = @table_name
        AND c.COLUMN_NAME NOT LIKE '%hash%'

    RETURN @var
END
Step 3: create log table
The generated code will log progress into this table so you can monitor. But you have to create this log table first.
USE staging_edw
GO
IF OBJECT_ID('dbo.tmp_sedw_migration_log') IS NULL
    CREATE TABLE dbo.tmp_sedw_migration_log
    (
        step_number INT IDENTITY,
        step VARCHAR(100),
        start_time DATETIME
    )
Step 4: generate migration script
Here you generate the T-SQL that will migrate the data for you. It just generates INSERT INTO / SELECT FROM statements for every table, and logs its progress along the way.
This script does not actually modify anything. It just outputs some code, which you can inspect before executing.
USE staging_edw
GO

-- newline characters for formatting of generated code
DECLARE @n VARCHAR(100) = CHAR(13)+CHAR(10)
DECLARE @t VARCHAR(100) = CHAR(9)
DECLARE @2n VARCHAR(100) = @n + @n
DECLARE @2nt VARCHAR(100) = @n + @n + @t
DECLARE @nt VARCHAR(100) = @n + @t
DECLARE @n2t VARCHAR(100) = @n + @t + @t
DECLARE @2n2t VARCHAR(100) = @n + @n + @t + @t
DECLARE @3n VARCHAR(100) = @n + @n + @n

-- identify tables with identity columns
IF OBJECT_ID('tempdb..#identities') IS NOT NULL
    DROP TABLE #identities;

SELECT
    table_schema = s.name,
    table_name = o.name
INTO #identities
FROM sys.objects o
JOIN sys.columns c ON o.object_id = c.object_id
JOIN sys.schemas s ON s.schema_id = o.schema_id
WHERE 1=1
    AND c.is_identity = 1

-- generate the code
SELECT
    @3n + '-- ' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME,
    @n + 'BEGIN TRY',
    @2nt + IIF(i.table_schema IS NOT NULL, 'SET IDENTITY_INSERT staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' ON ', ''),
    @2nt + 'TRUNCATE TABLE staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME,
    @2nt + 'INSERT INTO staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' WITH (TABLOCKX) ( ' + f.f + ' ) ',
    @2nt + 'SELECT ' + f.f + @nt + 'FROM staging.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME,
    @2nt + IIF(i.table_schema IS NOT NULL, 'SET IDENTITY_INSERT staging_edw.' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' OFF ', ''),
    @2nt + 'INSERT INTO dbo.tmp_sedw_migration_log ( step, start_time ) VALUES ( ''' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' inserted successfully'', GETDATE() );',
    @2n + 'END TRY',
    @2n + 'BEGIN CATCH',
    @2nt + 'INSERT INTO dbo.tmp_sedw_migration_log ( step, start_time ) VALUES ( ''' + t.TABLE_SCHEMA + '.' + t.TABLE_NAME + ' FAILED'', GETDATE() );',
    @2n + 'END CATCH'
FROM INFORMATION_SCHEMA.tables t
OUTER APPLY (SELECT f = staging_edw.dbo.udf_get_column_list(t.TABLE_SCHEMA + '.' + t.TABLE_NAME)) f
LEFT JOIN #identities i ON i.table_name = t.TABLE_NAME
    AND i.table_schema = t.TABLE_SCHEMA
WHERE t.TABLE_TYPE = 'base table'
Step 5: run the code
Now you just copy the output from step 4, paste it into a new query window, and run it.
Notes
In the step 2 UDF, I exclude hash columns from the column list because those are computed columns in my situation.