How can I create a database and USE it in one script? - sql

Consider the following script:
DECLARE @path varchar(MAX)
DECLARE @script varchar(MAX)
SET @path = (SELECT physical_name FROM sys.master_files where name = 'master');
SET @path = REPLACE(@path, 'master.mdf', '');
SELECT @path;
SET @script =
'CREATE DATABASE test
ON PRIMARY
(NAME = test_primary,
FILENAME = ''' + @path + 'test_primary.mdf'',
SIZE = 10MB,
FILEGROWTH = 10MB)';
exec(@script);
USE test
When I try to run it all at once I get an error:
Msg 911, Level 16, State 1, Line 31
Database 'test' does not exist. Make sure that the name is entered correctly.
If I first run exec and then separately run USE it all goes fine.
The question is, how can I work around it so that it's possible to run the whole script at once with no errors?

SQL Server compiles the code one batch at a time. When your code is compiled, the database does not exist yet.
Add a batch separator before USE test.
DECLARE @path varchar(MAX)
DECLARE @script varchar(MAX)
SET @path = (SELECT physical_name FROM sys.master_files where name = 'master');
SET @path = REPLACE(@path, 'master.mdf', '');
SELECT @path;
SET @script =
'CREATE DATABASE test
ON PRIMARY
(NAME = test_primary,
FILENAME = ''' + @path + 'test_primary.mdf'',
SIZE = 10MB,
FILEGROWTH = 10MB)';
exec(@script);
GO
USE test

If you are executing the statements from within one of the SQL Server query tools (e.g. Enterprise Manager, Management Studio, or sqlcmd), then insert the statement GO before the USE test command. This separates the commands into separate batches. If you are executing the script through one of the programmatic clients, then you must execute the batches separately by splitting up the script.
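If the goal is still to run everything as a single batch (for example from a client that does not understand GO), an alternative is to wrap the statements that need the new database context in dynamic SQL, since that inner string is only compiled when the EXEC runs, after CREATE DATABASE has completed. A minimal sketch of that approach:
EXEC('CREATE DATABASE test');
-- The USE is compiled only when this inner batch executes, so test already exists:
EXEC('USE test; SELECT DB_NAME() AS current_database;');
Note that the USE only applies inside that inner EXEC scope; once it returns, the session is back in its original database.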

Shouldn't USE be at the top of the script?
USE selects the correct database for you, and then you run the script on that database.

Related

MS-SQL: Changing the FileGrowth parameters of a database generically

In our software the user can create databases as well as connect to databases that were not created by our software. The DBMS is Microsoft SQL-Server.
Now I need to update the databases that we use and set the FileGrowth parameter of all the files of all the databases to a certain value.
I know how to get the logical file names of the files of the current database from a query:
SELECT file_id, name as [logical_file_name], physical_name FROM sys.database_files
And I know how to set the desired FileGrowth value, once I know the logical file name:
ALTER DATABASE MyDB MODIFY FILE (Name='<logical file name>', FileGrowth=10%)
But I don't know how to combine these two steps into one script.
Since there are various databases I can't hard code the logical file names into the script.
And for the update process (right now) we only have the possibility to get a connection to a database and execute SQL scripts on that connection, so a "pure" script solution would be best, if that's possible.
The following script receives a database name as a parameter and uses two dynamic SQL strings: one for a cursor to cycle through the database files of the chosen database, and another to apply the proper ALTER DATABASE command, since you can't use a variable for the file name in MODIFY FILE.
The EXEC is commented out in both places and there's a PRINT instead, so you can review before executing. I've just tested it on my sandbox and it's working as expected.
DECLARE @DatabaseName VARCHAR(100) = 'DBName'
DECLARE @DynamicSQLCursor VARCHAR(MAX) = '
USE ' + @DatabaseName + ';
DECLARE @FileName VARCHAR(100)
DECLARE FileCursor CURSOR FOR
SELECT S.name FROM sys.database_files AS S
OPEN FileCursor
FETCH NEXT FROM FileCursor INTO @FileName
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @DynamicSQLAlterDatabase VARCHAR(MAX) = ''
ALTER DATABASE ' + @DatabaseName + ' MODIFY FILE (Name = '''''' + @FileName + '''''', FileGrowth = 10%)''
-- EXEC (@DynamicSQLAlterDatabase)
PRINT (@DynamicSQLAlterDatabase)
FETCH NEXT FROM FileCursor INTO @FileName
END
CLOSE FileCursor
DEALLOCATE FileCursor '
-- EXEC (@DynamicSQLCursor)
PRINT (@DynamicSQLCursor)
You might want to check for the usual dynamic SQL caveats like making sure the values being concatenated won't break the SQL and also add error handling.
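For example, wrapping the concatenated names in QUOTENAME is a cheap guard against values that would otherwise break the generated SQL. A minimal sketch of that kind of guard (the sample values here are hypothetical, and this is not part of the original script):
DECLARE @DatabaseName sysname = 'DBName'
DECLARE @FileName sysname = 'SomeLogicalFile'  -- hypothetical logical file name
DECLARE @Alter nvarchar(MAX) =
    'ALTER DATABASE ' + QUOTENAME(@DatabaseName)
    + ' MODIFY FILE (NAME = ' + QUOTENAME(@FileName, '''') + ', FILEGROWTH = 10%)'
-- QUOTENAME brackets the identifier and doubles any embedded quotes, so odd names
-- can't break out of the statement:
PRINT @Alter  -- ALTER DATABASE [DBName] MODIFY FILE (NAME = 'SomeLogicalFile', FILEGROWTH = 10%)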
As for how to apply this to several databases, you can create an SP and execute it several times, or wrap a database name cursor / while loop over this.
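A minimal sketch of the cursor option, assuming the script above has been wrapped in a stored procedure, here called dbo.SetFileGrowth (a hypothetical name), that takes the database name as a parameter:
DECLARE @DbName sysname;
DECLARE DbCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases
    WHERE database_id > 4;        -- skip the system databases
OPEN DbCursor;
FETCH NEXT FROM DbCursor INTO @DbName;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.SetFileGrowth @DatabaseName = @DbName;  -- hypothetical wrapper SP
    FETCH NEXT FROM DbCursor INTO @DbName;
END
CLOSE DbCursor;
DEALLOCATE DbCursor;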

Backup script I've been using with no issues for a while tossing an error on new database

I've been using the code below to drop and create a new backup named (current year database)_daily at midnight to allow my team to test new scripts or updates to our student information system.
It worked all last year, and this year for reasons I can't figure out, the script is tossing an error.
Here is the script:
USE master;
GO
-- the original database (use 'SET @DB = NULL' to disable backup)
DECLARE @SourceDatabaseName varchar(200)
DECLARE @SourceDatabaseLogicalName varchar(200)
DECLARE @SourceDatabaseLogicalNameForLog varchar(200)
DECLARE @query varchar(2000)
DECLARE @DataFile varchar(2000)
DECLARE @LogFile varchar(2000)
DECLARE @BackupFile varchar(2000)
DECLARE @TargetDatabaseName varchar(200)
DECLARE @TargetDatbaseFolder varchar(2000)
-- ****************************************************************
SET @SourceDatabaseName = '[DST18000RD]' -- Name of the source database
SET @SourceDatabaseLogicalName = 'DST18000RD' -- Logical name of the DB ( check DB properties / Files tab )
SET @SourceDatabaseLogicalNameForLog = 'DST18000RD_log' -- Logical name of the DB ( check DB properties / Files tab )
SET @BackupFile = 'F:\Dev_Databases\Temp\backup.dat' -- FileName of the backup file
SET @TargetDatabaseName = 'DST18000RD_Daily' -- Name of the target database
SET @TargetDatbaseFolder = 'F:\Dev_Databases\Temp\'
-- ****************************************************************
SET @DataFile = @TargetDatbaseFolder + @TargetDatabaseName + '.mdf';
SET @LogFile = @TargetDatbaseFolder + @TargetDatabaseName + '.ldf';
-- Disconnect any users using @TargetDatabaseName
USE [master];
DECLARE @kill varchar(8000) = '';
SELECT @kill = @kill + 'kill ' + CONVERT(varchar(5), session_id) + ';'
FROM sys.dm_exec_sessions
WHERE database_id = db_id('DST18000RD_Daily')
EXEC(@kill);
-- Backup the @SourceDatabase to @BackupFile location
IF @SourceDatabaseName IS NOT NULL
BEGIN
SET @query = 'BACKUP DATABASE ' + @SourceDatabaseName + ' TO DISK = ' + QUOTENAME(@BackupFile,'''')
PRINT 'Executing query : ' + @query;
EXEC (@query)
END
PRINT 'OK!';
-- Drop @TargetDatabaseName if exists
IF EXISTS(SELECT * FROM sysdatabases WHERE name = @TargetDatabaseName)
BEGIN
SET @query = 'DROP DATABASE ' + @TargetDatabaseName
PRINT 'Executing query : ' + @query;
EXEC (@query)
END
PRINT 'OK!'
-- Restore database from @BackupFile into @DataFile and @LogFile
SET @query = 'RESTORE DATABASE ' + @TargetDatabaseName + ' FROM DISK = ' + QUOTENAME(@BackupFile,'''')
SET @query = @query + ' WITH MOVE ' + QUOTENAME(@SourceDatabaseLogicalName,'''') + ' TO ' + QUOTENAME(@DataFile ,'''')
SET @query = @query + ' , MOVE ' + QUOTENAME(@SourceDatabaseLogicalNameForLog,'''') + ' TO ' + QUOTENAME(@LogFile,'''')
PRINT 'Executing query : ' + @query
EXEC (@query)
PRINT 'OK!'
The script is not mine; I put together two scripts to get me what I needed. On our old database, DST17000RD, this script still works flawlessly. On the new database, DST18000RD, I get this error:
Executing query : BACKUP DATABASE [DST18000RD] TO DISK = 'F:\Dev_Databases\Temp\backup.dat'
Processed 1209552 pages for database 'DST18000RD', file 'DST18000RD' on file 23.
Processed 2 pages for database 'DST18000RD', file 'DST18000RD_log' on file 23.
BACKUP DATABASE successfully processed 1209554 pages in 139.942 seconds (67.525 MB/sec).
OK!
OK!
Executing query : RESTORE DATABASE DST18000RD_Daily FROM DISK = 'F:\Dev_Databases\Temp\backup.dat' WITH MOVE 'DST18000RD' TO 'F:\Dev_Databases\Temp\DST18000RD_Daily.mdf' , MOVE 'DST18000RD_log' TO 'F:\Dev_Databases\Temp\DST18000RD_Daily.ldf'
Msg 3234, Level 16, State 2, Line 3
Logical file 'DST18000RD' is not part of database 'DST18000RD_Daily'. Use RESTORE FILELISTONLY to list the logical file names.
Msg 3013, Level 16, State 1, Line 3
RESTORE DATABASE is terminating abnormally.
OK!
Some things to note that may just be me barking up the wrong tree. DST17000RD database is compatibility level SQL Server 2012 (110) and the DST18000RD database is SQL Server 2017 (140). The server was upgraded and migrated a couple months ago before the new database was created.
Any help is appreciated. From what I can tell, it feels like the script is not renaming the MDF and LDF files before it tries to copy them for the *_daily database? Honestly I'm not sure. I'm a pretend DBA, self-taught on an as-needed basis. Thank you in advance for your help!
The error is telling you that the logical name of the database file is wrong here:
SET @SourceDatabaseLogicalName = 'DST18000RD' -- Logical name of the DB ( check DB properties / Files tab )
Run:
RESTORE FILELISTONLY FROM DISK = 'F:\Dev_Databases\Temp\backup.dat'
to see the correct logical file names.
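For example, if RESTORE FILELISTONLY reports that the logical names are still the old DST17000RD ones (plausible if DST18000RD started life as a copy of the old database), the fix is just to point the script's variables at whatever names it reports. A sketch under that assumption:
SET @SourceDatabaseLogicalName = 'DST17000RD'          -- use whatever FILELISTONLY actually reports
SET @SourceDatabaseLogicalNameForLog = 'DST17000RD_log'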
The issue is that you are trying to change the file's logical name during the database restore, which is not possible even if you use the MOVE clause.
The MOVE clause allows you to change the location and names of the physical files, but it does not do anything to the logical names.
Fix
You will have to use the existing logical names for your restore, but once you have restored the database you can use the ALTER DATABASE command to change the logical names of its files, using the following command:
USE [master];
GO
ALTER DATABASE [DST18000RD]
MODIFY FILE ( NAME = DST17000RD , NEWNAME = DST18000RD );
GO

"EXECUTE" function within trigger not functioning in TSQL on SERVER

This is the code I am attempting to run inside a trigger. In the trigger it fails because of the EXEC line. I know this because when I take that line out, the code at least finishes execution and writes the sql line to the error table; if I leave it in, it doesn't even bother finishing.
I have to use an EXEC command here to write to the remote server, because I can't use an nvarchar variable to name the server: INSERT expects an explicit table or a table variable. I need to be able to write to this remote server AND I don't know its name until runtime, so I can't be explicit. How do I use an EXEC inside a trigger, or is there another way to skin this cat?
DECLARE @acctNum int
DECLARE @tickets nvarchar(MAX)
DECLARE @server nvarchar(64)
DECLARE @sql nvarchar(MAX)
if EXISTS (SELECT * FROM dbo.queueNames WHERE queueName = 'queue1')
BEGIN
SELECT @server = queueServer FROM dbo.queueNames WHERE queueName = 'queue1'
SET @server = @server + '.[AMEETING].dbo.tblMembers'
SELECT TOP 1 @acctNum = AccountNum, @tickets = Tickets FROM dbo.queue1 ORDER BY AccountNum
SET @sql = 'UPDATE ' + @server + ' SET Present = ''1'', Tickets = ''' + @tickets + ''' WHERE AccountNum = ' + convert(nvarchar(64),@acctNum)
INSERT INTO dbo.errors values (@sql)
EXEC (@sql)
DELETE FROM dbo.queue1 WHERE AccountNum = @acctNum
END
I am giving up on this and have gone with using text files on the local server and running processes in the background there to queue the sql statements.
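For anyone hitting the same wall: if it is the string concatenation rather than the trigger context that is breaking things, one variation worth testing is sp_executesql with the values passed as parameters, so only the four-part object name is concatenated. This is a sketch only, assuming @server holds the name built in the code above; it is not a confirmed fix for the trigger issue:
DECLARE @sql nvarchar(MAX) =
    N'UPDATE ' + @server +
    N' SET Present = ''1'', Tickets = @t WHERE AccountNum = @a'
-- The values go in as typed parameters, which avoids quoting problems with @tickets:
EXEC sp_executesql @sql, N'@t nvarchar(MAX), @a int', @t = @tickets, @a = @acctNum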

Executing a "sp_executesql #sqlcommand" Syntax Error

I am setting up a SQL script that creates a database from a variable name and then restores that newly created database from a .bak file. I am having issues with the syntax of one of the commands I am setting up, and wanted to ask if anybody could help me spot my syntax error. I am only going to paste my troubled snippet of code and its declarations; if I am correct, the issue lies in the way that I am declaring the file name paths. I have tried setting the paths to variables, but I still received errors due to the apostrophe placement. Thanks!!!
declare @DBname varchar(10), @sqlcommand Nvarchar(max)
set @DBname = 'testdb'
Code to create database, and set new database to single user mode
--restore database
set @sqlcommand = N'Restore DATAbase ' + @DBname + ' from disk = ''C:/loc_smartz_db0_template.bak'' with move '
+ @DBname + ' to ''C:/ProgramFiles/Microsoft SQL Server/MSSQL/Data/TestDatabase1.mdf'', move ' + @DBname + ' to ''C:/ProgramFiles/Microsoft SQL Server/MSSQL/Data/TestDatabase1.ldf'', Replace'
EXECUTE sp_executesql @sqlcommand
Code that sets database back to multiuser, and prints that the database was successfully created
It looks like the previous posters have fixed your problem, but this may have been avoided if you had used dynamic sql in the 'best practice' manner. Concatenating the string together as a mixture of variables and string literals is not ideal as it makes working with apostrophes difficult (as shown here).
A better way is to write your sql as
declare @DBname nvarchar(255) = 'testdb'
,@BakName nvarchar(255) = 'C:\loc_smartz_db0_template.bak'
,@MovemdfName nvarchar(255) = 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\TestDatabase1.mdf'
,@MoveldfName nvarchar(255) = 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\TestDatabase1.ldf'
,@sqlcommand nvarchar(max)
,@paramList nvarchar(max)
set @paramList = '@DBname nvarchar(255), @BakName nvarchar(255), @MovemdfName nvarchar(255), @MoveldfName nvarchar(255)'
set @sqlcommand = N'Restore DATAbase @DBname from disk = @BakName with move @DBname to @MovemdfName, move @DBname to @MoveldfName, Replace'
exec sp_executesql @statement = @sqlcommand
,@params = @paramList
,@DBname = @DBname
,@BakName = @BakName
,@MovemdfName = @MovemdfName
,@MoveldfName = @MoveldfName
This way, your sql command is very easy to read and maintain. Note that you don't have to mess around with escaping the apostrophes in the variable values either if you have spaces in your pathnames.
It also has the advantage (if you have the code in a stored proc) of allowing SQL Server to reuse execution plans which will improve performance.
See here for more information.
Two things.
First, you have to put single quotes around the database file logical name, e.g.
from
...with move testdb to 'C:/ProgramFiles/Microsoft SQL Server/MSSQL/Data/TestDatabase1.mdf'
to
...with move 'testdb' to 'C:/ProgramFiles/Microsoft SQL Server/MSSQL/Data/TestDatabase1.mdf'
making it
set @sqlcommand = N'Restore DATAbase ' + @DBname + ' from disk = ''C:\loc_smartz_db0_template.bak'' with move '''
+ @DBname + ''' to ''C:\ProgramFiles\Microsoft SQL Server\MSSQL\Data\TestDatabase1.mdf'', move '''
+ @DBname + ''' to ''C:\ProgramFiles\Microsoft SQL Server\MSSQL\Data\TestDatabase1.ldf'', Replace'
Second, use backslashes \, not slashes. (Maybe this works, but it didn't in my quick tests.)

Delete multiple files from folder using T-SQL without using cursor

I am writing a cleanup script. This script will run on the weekend and clean up the db. The tables are related to emails, and the paths of the attachments are stored in a table. As part of cleaning up the tables I also have to delete the files from the folder.
The path of files is like following.
\\xxx.xxx.xxx.xxx\EmailAttachments\Some Confirmation for xyz Children Centre_9FW4ZE1C57324B70EC79WZ15FT9FA19E.pdf
I can delete multiple files like the following:
xp_cmdshell 'del c:\xyz.txt, abc.txt'
BUT when I create a CSV from the table using FOR XML PATH('') the string gets cut off at the end. There might be thousands of rows to delete, so I don't want to use a cursor to delete the files from the folder.
How can I delete files from the folder without using a cursor?
What permissions do I need on the network folder to delete files using T-SQL from SQL Server?
EDIT:
I have used a cursor and it looks ok, not taking that much time. One problem which I am facing is that SQL Server considers a file name with a space as two files, so the following statement
xp_cmdshell 'del E:\Standard Invite.doc'
throws the error:
Could Not Find E:\Standard
Could Not Find C:\Windows\system32\Invite.doc
NULL
Thanks.
Personally, I wouldn't worry too much about using a cursor here. Cursors are only 'mostly evil'; as your task isn't a set-based operation, a cursor may be the most effective solution.
Although you have a comment stating that it will take an "awful lot of time" to use a cursor, in this case the biggest overhead is the actual deletion of the file (not the cursor).
Note: The file deletion is done by the operating system, not by the RDBMS.
As the delete is being done by calling xp_cmdshell, and because it is a procedure (not a function, etc.), you can't call it and pass in a table's contents.
What you could do is build up a string and execute that. But note, you are limited to a maximum of 8000 characters in that string. As you have already said that you may have thousands of files, you will certainly not fit them within 8000 characters.
This means that you are going to need a loop no matter what.
DECLARE
@command VARCHAR(8000),
@next_id INT,
@next_file VARCHAR(8000),
@total_len INT
SELECT
@command = 'DEL ',
@total_len = 4
SELECT TOP 1
@next_id = id,
@next_file = file_name + ', '
FROM
table_of_files_to_delete
ORDER BY
id DESC
WHILE (@next_file IS NOT NULL)
BEGIN
WHILE ((@total_len + LEN(@next_file)) <= 8000) AND (@next_file IS NOT NULL)
BEGIN
SELECT
@command = @command + @next_file,
@total_len = @total_len + LEN(@next_file)
SELECT
@next_file = NULL
SELECT TOP 1
@next_id = id,
@next_file = file_name + ', '
FROM
table_of_files_to_delete
WHERE
id < @next_id
ORDER BY
id DESC
END
SET @command = SUBSTRING(@command, 1, @total_len - 2) -- remove the last ', '
EXEC xp_cmdshell @command
SELECT
@command = 'DEL ',
@total_len = 4
END
Not pretty, huh?
What you may be able to do, depending on what needs deleting, is to use wild-cards. For example:
EXEC xp_cmdshell 'DELETE C:\abc\def\*.txt'
To delete files with a space in the name you need to enclose the filename in double quotes:
xp_cmdshell 'del "E:\Standard Invite.doc"'
DECLARE @deleteSql varchar(500)
,@myPath varchar(500) = '\\DestinationFolder\'
SET @deleteSql = 'EXEC master..xp_cmdshell ''del '+@myPath +'*.csv'''
EXEC(@deleteSql)
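On the permissions part of the question: xp_cmdshell runs under the SQL Server service account (or under the configured xp_cmdshell proxy account for non-sysadmin callers), so that Windows account is the one that needs delete rights on the network share. The feature also has to be enabled on the instance; a minimal sketch of enabling it (requires sysadmin):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;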