EXEC to USE Database - sql

I have written a script that generates a database backup file name, backs up the database, and then restores it as a copy with the new database name. The name is based on some date/time data.
I then need to USE that database, in the script, and then disable all triggers.
However, this doesn't work:
DECLARE @Use VARCHAR(50)
SET @Use = 'USE ' + @NewDatabaseName
EXEC(@Use)
Running it manually - the database doesn't get 'USED'.
How can I execute the USE statement with a variable?
I have tried the sp_executesql proc as well, with the same result. Database didn't change.
DECLARE @sqlCommand nvarchar(1000)
SET @sqlCommand = 'USE ' + @NewDatabaseName
EXECUTE sp_executesql @sqlCommand
Looks like I might need to go into sqlcmd mode? Really hoping not to, though.

Both EXEC and sp_executesql run in their own scope, and the change of database only affects that scope. So you have to put the USE and the statements that need it into the same dynamic batch:
set @sql = 'use ' + quotename(@new_db_name) + '; disable trigger t1 on dbo.YourTable;'
exec (@sql)
As far as I know, there is no way to change the database context of the current scope to a variable database name.
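Applying that to the original goal of disabling all triggers, a minimal sketch (the database name is a placeholder, and sp_MSforeachtable is undocumented but widely used):

```sql
DECLARE @NewDatabaseName sysname = N'MyRestoredDb';  -- placeholder
DECLARE @sql nvarchar(max);

-- The USE and the trigger work must share one dynamic batch,
-- because the context change ends with the EXEC scope.
SET @sql = N'USE ' + QUOTENAME(@NewDatabaseName) + N';
EXEC sp_MSforeachtable ''ALTER TABLE ? DISABLE TRIGGER ALL'';';

EXEC (@sql);
```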

Related

stored procedure receiving DB name to work with

I am looking to write a stored procedure that receives a database name along with other parameters, and the stored procedure needs to work on the database it received. Any thoughts, please?
Something like the following should work, as long as the correct permissions are set up:
CREATE PROCEDURE dbo.sptest
    @DB VARCHAR(50)
AS
BEGIN
    DECLARE @sqlstmt NVARCHAR(MAX)  -- sp_executesql requires nvarchar
    SET @sqlstmt = N'SELECT TOP 10 * FROM ' + QUOTENAME(@DB) + N'.dbo.YourTableName'
    EXEC sp_executesql @sqlstmt
END
GO
As mentioned, be very careful when using dynamic SQL like this: only use it with trusted sources, because of the ability to wreak havoc on your DB. At a minimum, you should add some checking of the value of @DB passed in, to make sure it matches a limited list of database names that it will work with.
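One way to sketch that check at the top of the procedure body (sys.databases lists every database on the instance; comparing against your own allow-list table would be stricter still):

```sql
-- Reject any @DB value that is not a real database on this instance
IF NOT EXISTS (SELECT 1 FROM sys.databases WHERE name = @DB)
BEGIN
    RAISERROR('Unknown database: %s', 16, 1, @DB);
    RETURN;
END
```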

Creating schema for all databases in Sql Server

I'm having problems creating a schema in all databases on a SQL Server instance with one script.
declare @ssql varchar(2000)
set @ssql= 'use [?]
GO
CREATE SCHEMA [sp_schema]'
exec sp_msforeachdb @ssql
go
But I am always getting these errors:
Incorrect syntax near 'GO'.
'CREATE SCHEMA' must be the first statement in a query batch.
And if I use another statement like CREATE USER => everything works fine.
Any ideas?
Thanks.
Ok
I found it.
It should be like this:
declare @ssql varchar(2000)
set @ssql= 'use [?]
EXEC (''CREATE SCHEMA [sp_schema]'')'
exec sp_msforeachdb @ssql
go
And it works!!
Thanks Dan for your contribution!
GO is not actually a SQL Server command. It is used as a batch separator in applications like SQL Server Management Studio.
Removing the GO alone will still hit the second error, though, because CREATE SCHEMA must be the first statement in its batch. Wrapping it in a nested EXEC starts a new batch, so this should work:
declare @ssql varchar(2000);
set @ssql= '
use [?]
declare @sql varchar(100)=''create schema [sp_schema]''
exec(@sql)
'
exec sp_msforeachdb @ssql;
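If the script may be rerun, CREATE SCHEMA will fail in any database that already has the schema; a guarded variant (a sketch that checks sys.schemas first):

```sql
declare @ssql varchar(2000);
set @ssql = 'use [?];
if not exists (select 1 from sys.schemas where name = ''sp_schema'')
    exec (''create schema [sp_schema]'');';
exec sp_msforeachdb @ssql;
```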

sql2008 - Better than multiple replace statements?

Further to my previous post, I would like to copy dependent objects (such as views or procedures) to a 'static' database. However, schema names and other object prefixes are not the same between Production and Static databases...
[I've read Aaron Bertrand's articles on setting up an Audit database, but this is a little much for our needs at this time.]
After extracting the object definitions into a variable using some dynamic sql, I am running multiple replace statements for each change so that the views/procedures still run, pulling data from the Static database.
The reason for the replace statements is that the views/procedures have been created using differing naming conventions. Sometimes I find <dbname>.dbo.<objectname>, other times it's <dbname>..<objectname> or even just dbo.<objectname> !
Instead of using multiple replace statements as below (the list may grow quite large!), is there a better method? Would a table-driven approach (using a CURSOR) be wiser?
[Database/object names have been modified in the code below for simplicity]
declare @sql nvarchar(500), @parmdef nvarchar(500),
@dbname varchar(20), @objname varchar(255), @ObjDef varchar(max);
set @dbname = 'ProdC';
--declare cursor; get object name using cursor on dbo.ObjectsToUpdate
--[code removed for simplicity]
set @sql = N'USE '+quotename(@dbname) +'; ' ;
set @sql = @sql + N'SELECT @def=OBJECT_DEFINITION(OBJECT_ID(''dbo.'+@objname+ '''));'
set @parmdef = N'@def nvarchar(max) OUTPUT' ;
exec sp_executesql @sql, @parmdef, @def=@ObjDef OUTPUT;
--Carry out object definition replacements
set @ObjDef= replace(@ObjDef, 'CREATE VIEW [dbo].[', 'ALTER VIEW ['+@dbname+'].[');
set @ObjDef= replace(@ObjDef, 'Prod1.dbo.', @dbname+'.'); --replace Prod1 with @dbname
set @ObjDef= replace(@ObjDef, ' dbo.', ' '+@dbname+'.'); --replace all 'dbo.'
set @ObjDef= replace(@ObjDef, 'dbo.LookupTable1', @dbname+'.LookupTable1');
--[code removed for simplicity]
exec(@ObjDef);
--get next object name from cursor
--[remaining code removed for simplicity]
Many thanks in advance.
Another problem you have is with OBJECT_DEFINITION: you can get back only the first 4,000 characters of the object definition. The same problem exists with INFORMATION_SCHEMA.ROUTINES.
Check out this post for a discussion of the alternatives...
Dude, where's the rest of my procedure?
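One commonly suggested alternative is sys.sql_modules, whose definition column is nvarchar(max) (a sketch; dbo.YourView is a placeholder object name):

```sql
SELECT m.definition
FROM sys.sql_modules AS m
WHERE m.object_id = OBJECT_ID(N'dbo.YourView');  -- placeholder
```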
"However, schema names and other object prefixes are not the same between Production and Static databases"
That's your problem right there. Make them the same and your problem will go away. With SQL Server supporting multiple instances there should be no barrier to doing that.
@ben: the Static database pulls from several Production databases, and there is a desire to maintain the original 'source' by using the database name as the schema name in the Static database.
Closed. Duplicate of this post.

Which user account is running SQLCMD in T-SQL Script without -U and -P option?

I am using sqlcmd in a T-SQL script to write a text file to a network location. However, sqlcmd is failing to write to that location due to access permissions on the network folder. The SP is being run under my user account, which has access to the network folder.
Could you please help me understand which account sqlcmd will run under if I do not specify the -U and -P options in the T-SQL script?
Use this to find the user name:
PRINT SUSER_SNAME();
If you don't provide credentials with -U/-P, it will try to use Windows authentication, i.e. the Windows account of whoever is running it.
I often just use Process Monitor to look at what account is being used and what the permission error is.
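A quick way to see the account that child processes inherit from SQL Server (assumes xp_cmdshell is enabled, which it is not by default):

```sql
-- The output is the Windows account that processes spawned by
-- SQL Server (such as sqlcmd launched via xp_cmdshell) run under.
EXEC xp_cmdshell 'whoami';
```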
You say you are using SQLCMD in a T-SQL script; don't you mean you are using SQLCMD to run a T-SQL script? How is your script writing a text file? Does it work in Management Studio? My guess is that the user account SQL Server is running under doesn't have access to that location.
If you call a SQL script via xp_cmdshell without user and password parameters, it runs in the environment of the MSSQLSERVER service account, which is very restricted; without changing security settings you will mostly get an 'Access is denied' message instead of the results of the script.
To avoid this security conflict I use the following trick in my stored procedure create_sql_proc. I read the text of the script file and wrap it in a procedure by adding a head and a foot to it, giving me a script that creates a stored procedure named @procName from the SQL file.
If you now run that stored procedure with EXEC @procName, it runs in your security context and delivers the result you would get by running the file from a command prompt:
CREATE PROCEDURE create_sql_proc(@procName sysname, @sqlfile sysname) AS
BEGIN
    DECLARE @crlf nvarchar(2) = char(13)+char(10)
    DECLARE @scriptText nvarchar(max)
    DECLARE @cmd nvarchar(max)
        = N'SET @text = (SELECT * FROM openrowset(BULK '''+@sqlfile+''', SINGLE_CLOB) as script)'
    EXEC sp_executesql @cmd , N'@text nvarchar(max) output', @text = @scriptText OUTPUT
    DECLARE @ProcHead nvarchar(max) = N'CREATE OR ALTER PROCEDURE '+@procName+ ' AS '+@crlf+'BEGIN'+@crlf
    DECLARE @ProcTail nvarchar(max) = @crlf + N'END '
    SET @scriptText = @ProcHead + @scriptText + @ProcTail
    -- create the wrapper stored procedure --
    EXEC sys.sp_executesql @scriptText
END
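Usage might then look like this (the procedure name and file path are placeholders):

```sql
-- Wrap the script file in a stored procedure, then execute it
-- under the caller's security context.
EXEC create_sql_proc @procName = N'dbo.MyWrappedScript',
                     @sqlfile  = N'C:\scripts\export.sql';  -- placeholder path
EXEC dbo.MyWrappedScript;
```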

How do I run SQL queries on different databases dynamically?

I have a SQL Server stored procedure that I use to back up data from our database before doing an upgrade, and I'd really like to be able to run it on multiple databases by passing in the database name as a parameter. Is there an easy way to do this? The best I can figure is to dynamically build the SQL in the stored procedure, but that feels like it's the wrong way to do it.
Build a procedure that backs up the current database, whatever it is, and install this procedure in every database that you want to back up.
Write another procedure that will launch the backups. The details depend on things you haven't mentioned, such as whether you have a table containing the names of the databases to back up. Basically, all you need to do is loop over the database names and build a string like:
SET @ProcessQueryString=
'EXEC '+@DatabaseServer+'.'+@DatabaseName+'.dbo.'+'BackupProcedureName param1, param2'
and then just:
EXEC (@ProcessQueryString)
to run it remotely.
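A sketch of such a loop against the local instance (the procedure name and the database filter are placeholders; a linked-server prefix could be concatenated in the same way):

```sql
DECLARE @DatabaseName sysname, @ProcessQueryString nvarchar(max);

DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases
    WHERE name NOT IN (N'master', N'model', N'msdb', N'tempdb');

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @DatabaseName;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Assumes BackupProcedureName is installed in every target database
    SET @ProcessQueryString =
        N'EXEC ' + QUOTENAME(@DatabaseName) + N'.dbo.BackupProcedureName;';
    EXEC (@ProcessQueryString);
    FETCH NEXT FROM db_cursor INTO @DatabaseName;
END;
CLOSE db_cursor;
DEALLOCATE db_cursor;
```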
There isn't any other way to do this; dynamic SQL is the only way. If you've got strict controls over DB names and who's running it, then you're okay just concatenating everything together, but if there's any doubt, use QUOTENAME to escape the parameter safely:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = otherstuff'
EXEC sp_executesql @sql
Obviously, if there's anything more being passed through then you'll want to double-check any other input, and potentially use parameterised dynamic SQL, for example:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50),
    @someValue NVARCHAR(10)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = @pOtherStuff'
EXEC sp_executesql @sql, N'@pOtherStuff NVARCHAR(10)', @pOtherStuff = @someValue
This then makes sure that parameters for the dynamic SQL are passed through safely and the chances for injection attacks are reduced. It also improves the chances that the execution plan associated with the query will get reused.
Personally, I just use a batch file and shell out to sqlcmd for things like this. Otherwise, building the SQL in a stored proc (like you said) would work just fine; not sure why it would be "wrong" to do that.
best regards,
don
MSSQL has an OPENQUERY(linked_server, statement) function: if the server is linked, you specify it as the first parameter and it fires the statement against that server.
You could generate this OPENQUERY statement in a dynamic proc, and either have it fire the backup proc on each server or execute the statement directly.
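A sketch of the OPENQUERY form (LinkedServerName is a placeholder for a configured linked server; note that the inner query must be a string literal, so a truly dynamic version has to build the whole SELECT as dynamic SQL):

```sql
SELECT *
FROM OPENQUERY(LinkedServerName, 'SELECT name FROM master.sys.databases');
```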
Do you use SSIS? If so, you could try creating a couple of SSIS packages and scheduling them, or executing them remotely.