Unable to shrink Data File in SQL Server (Taking too much time) - sql-server-2016

I am using:
SQL Server 2016 Standard edition
Windows Server 2012 R2 (Standard)
Database size: 1.4TB
Used space in the primary data file: 587GB
Unused space in the primary data file: 852GB
Recovery model: Simple
I am trying to shrink the data file to 687GB using the following command:
USE [TestDB]
GO
DBCC SHRINKFILE (N'TestDB' , 687017)
GO
There is no blocking, and no other activity is happening on this database.
The shrink operation has been running for the last 19 hours and has still not completed.
Can anyone tell me what needs to be done at this point?
How long should this shrink operation take?

When shrinking a data file is taking forever, what needs to be done?
Well, you need to do the following things:
Rebuild the indexes of the database before performing the shrink operation.
If the size of the file is too large for your environment, try to shrink the file in small chunks (a sketch of this follows the EMPTYFILE example below).
Another option to resolve this issue, and the one I like, is emptying the file.
- EMPTYFILE :
Migrates all data from the specified file to other files in the same filegroup and guarantees that no new data will be added to the file. The emptied file can then be removed by using the ALTER DATABASE statement.
Example: Emptying a file
The following example demonstrates the procedure for emptying a file so that it can be removed from the database. For the purposes of this example, a data file is first created and it is assumed that the file contains data.
USE AdventureWorks2012;
GO
-- Create a data file and assume it contains data.
ALTER DATABASE AdventureWorks2012
ADD FILE (
NAME = Test1data,
FILENAME = 'C:\t1data.mdf',
SIZE = 5MB
);
GO
-- Empty the data file.
DBCC SHRINKFILE (Test1data, EMPTYFILE);
GO
-- Remove the data file from the database.
ALTER DATABASE AdventureWorks2012
REMOVE FILE Test1data;
GO
That was taken straight from BOL.
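As a concrete illustration of the "small chunks" advice above, here is a minimal sketch using the file name and target from the question (the intermediate step sizes are assumptions; pick whatever increment suits your environment):
USE [TestDB]
GO
-- Walk the file down in bites instead of asking for one huge shrink.
DBCC SHRINKFILE (N'TestDB', 1400000)
GO
DBCC SHRINKFILE (N'TestDB', 1300000)
GO
-- ...continue stepping down until you reach the 687017 MB target:
DBCC SHRINKFILE (N'TestDB', 687017)
GO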
How long should this shrink operation take?
DBCC SHRINKFILE
is a single-threaded operation: it does not take advantage of multiple CPUs, and the amount of available RAM has little effect on it.
However, if you rebuild the indexes before running DBCC SHRINKFILE, the shrink operation will take relatively less time.
Index rebuild operations do take advantage of multiple CPUs.
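For example, a minimal sketch of that rebuild-first step (dbo.YourLargeTable is a hypothetical name; in practice you would repeat this for each large table, or generate the statements from sys.tables):
USE [TestDB]
GO
-- Rebuilding first compacts the pages that the shrink would otherwise have to shuffle.
ALTER INDEX ALL ON dbo.YourLargeTable REBUILD;
GO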
You can also check the DBCC SHRINKFILE completion progress by using the following command:
SELECT percent_complete, estimated_completion_time
FROM sys.dm_exec_requests
WHERE session_id = <spid running the shrink>;
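If you don't know the session id, a variation like the following should also work; as far as I know a shrink shows up in this DMV under the DbccFilesCompact (and sometimes DbccSpaceReclaim) command name, and the time columns are in milliseconds:
SELECT session_id,
command,
percent_complete,
total_elapsed_time / 60000.0 AS elapsed_minutes,
estimated_completion_time / 60000.0 AS estimated_remaining_minutes
FROM sys.dm_exec_requests
WHERE command IN ('DbccFilesCompact', 'DbccSpaceReclaim');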

Assuming all of your tables have been reindexed, I find that reducing the database in smaller chunks works better. I have created a little procedure which helps me shrink databases, and which might be helpful:
CREATE PROCEDURE Shrinkdb @dbname VARCHAR(128), @targetpercentfree FLOAT
AS
CREATE TABLE #dbresults
(dbname VARCHAR(128),
[Filename] VARCHAR(128),
type_desc VARCHAR(32),
CurrentSizeMB FLOAT,
FreeSpaceMB FLOAT,
percentagefree FLOAT)
DECLARE @statement NVARCHAR(MAX)
SET @statement = '
use ' + @dbname + ' ;
INSERT into #dbresults
SELECT DB_NAME() AS DbName,
name AS FileName,
type_desc,
size/128.0 AS CurrentSizeMB,
size/128.0 - CAST(FILEPROPERTY(name, ''SpaceUsed'') AS INT)/128.0 AS FreeSpaceMB,
0 as percentagefree
FROM sys.database_files
WHERE type IN (0,1)
AND name not LIKE ''%log%'' '
EXECUTE sp_executesql @statement
IF @@ERROR > 0
BEGIN
PRINT @statement
END
UPDATE #dbresults
SET percentagefree = FreeSpaceMB / (FreeSpaceMB + CurrentSizeMB) * 100
DECLARE @filename NVARCHAR(128), @currentsize INT, @targetsize FLOAT, @percentagefree FLOAT, @freespaceMB FLOAT
-- note: this assumes a single data file; with several files, add a WHERE clause
SELECT @filename = [Filename],
@currentsize = CurrentSizeMB,
@freespaceMB = FreeSpaceMB,
@percentagefree = percentagefree
FROM #dbresults
SELECT * FROM #dbresults
SET @targetsize = (@currentsize - @freespaceMB) * (1 + @targetpercentfree / 100)
SELECT @targetsize AS TargetSize
-- target percentage should be 10% free; if above the target, shrink
IF @percentagefree > @targetpercentfree
BEGIN
WHILE 1 = 1
BEGIN
-- doing this in 1 GB chunks means that it is much likelier to finish
SET @currentsize = @currentsize - 1000
IF @currentsize < @targetsize
BEGIN
BREAK
END
SET @statement = ' USE ' + @dbname + '; DBCC SHRINKFILE ( ''' + @filename + ''', ' + CONVERT(VARCHAR, @currentsize) + ' )'
EXEC sp_executesql @statement
IF @@ERROR > 0
BEGIN
PRINT @statement
END
END
END
GO
EXECUTE Shrinkdb @dbname = 'databasename', @targetpercentfree = 5

Related

MS-SQL: Changing the FileGrowth parameters of a database generically

In our software the user can create databases as well as connect to databases that were not created by our software. The DBMS is Microsoft SQL-Server.
Now I need to update the databases that we use and set the FileGrowth parameter of all the files of all the databases to a certain value.
I know how to get the logical file names of the files of the current database from a query:
SELECT file_id, name as [logical_file_name], physical_name FROM sys.database_files
And I know how to set the desired FileGrowth value, once I know the logical file name:
ALTER DATABASE MyDB MODIFY FILE (Name='<logical file name>', FileGrowth=10%)
But I don't know how to combine these two steps into one script.
Since there are various databases I can't hard code the logical file names into the script.
And for the update process (right now) we only have the possibility to get the connection of a database and execute sql scripts on this connection, so a "pure" script solution would be best, if that's possible.
The following script receives a database name as a parameter and uses dynamic SQL twice: once for a cursor to cycle through the database files of the chosen database, and once to apply the proper ALTER DATABASE command, since you can't use a variable for the file name in MODIFY FILE.
The EXEC is commented out in both places and there's a PRINT instead, so you can review the statements before executing them. I've just tested it on my sandbox and it's working as expected.
DECLARE @DatabaseName VARCHAR(100) = 'DBName'
DECLARE @DynamicSQLCursor VARCHAR(MAX) = '
USE ' + @DatabaseName + ';
DECLARE @FileName VARCHAR(100)
DECLARE FileCursor CURSOR FOR
SELECT S.name FROM sys.database_files AS S
OPEN FileCursor
FETCH NEXT FROM FileCursor INTO @FileName
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @DynamicSQLAlterDatabase VARCHAR(MAX) = ''
ALTER DATABASE ' + @DatabaseName + ' MODIFY FILE (Name = '''''' + @FileName + '''''', FileGrowth = 10%)''
-- EXEC (@DynamicSQLAlterDatabase)
PRINT (@DynamicSQLAlterDatabase)
FETCH NEXT FROM FileCursor INTO @FileName
END
CLOSE FileCursor
DEALLOCATE FileCursor '
-- EXEC (@DynamicSQLCursor)
PRINT (@DynamicSQLCursor)
You might want to check for the usual dynamic SQL caveats like making sure the values being concatenated won't break the SQL and also add error handling.
As for how to apply this to several databases, you can create an SP and execute it several times, or wrap a database-name cursor / WHILE loop around it, as in the sketch below.
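A hedged, set-based sketch of that outer loop: since sys.master_files already lists every file of every database, you can generate all the ALTER statements in one pass (the database_id > 4 filter to skip the system databases is my assumption; review the PRINT output before running the EXEC, as in the script above):
DECLARE @AllAlters NVARCHAR(MAX) = N'';
SELECT @AllAlters = @AllAlters
+ N'ALTER DATABASE ' + QUOTENAME(DB_NAME(database_id))
+ N' MODIFY FILE (NAME = ''' + name + N''', FILEGROWTH = 10%);' + CHAR(10)
FROM sys.master_files
WHERE database_id > 4;
PRINT @AllAlters;
-- EXEC (@AllAlters);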

Backup script I've been using with no issues for a while tossing an error on new database

I've been using the code below to drop and create a new backup named (current year database)_daily at midnight to allow my team to test new scripts or updates to our student information system.
It worked all last year, and this year for reasons I can't figure out, the script is tossing an error.
Here is the script:
USE master;
GO
-- the original database (use 'SET @DB = NULL' to disable backup)
DECLARE @SourceDatabaseName varchar(200)
DECLARE @SourceDatabaseLogicalName varchar(200)
DECLARE @SourceDatabaseLogicalNameForLog varchar(200)
DECLARE @query varchar(2000)
DECLARE @DataFile varchar(2000)
DECLARE @LogFile varchar(2000)
DECLARE @BackupFile varchar(2000)
DECLARE @TargetDatabaseName varchar(200)
DECLARE @TargetDatbaseFolder varchar(2000)
-- ****************************************************************
SET @SourceDatabaseName = '[DST18000RD]' -- Name of the source database
SET @SourceDatabaseLogicalName = 'DST18000RD' -- Logical name of the DB ( check DB properties / Files tab )
SET @SourceDatabaseLogicalNameForLog = 'DST18000RD_log' -- Logical name of the DB ( check DB properties / Files tab )
SET @BackupFile = 'F:\Dev_Databases\Temp\backup.dat' -- FileName of the backup file
SET @TargetDatabaseName = 'DST18000RD_Daily' -- Name of the target database
SET @TargetDatbaseFolder = 'F:\Dev_Databases\Temp\'
-- ****************************************************************
SET @DataFile = @TargetDatbaseFolder + @TargetDatabaseName + '.mdf';
SET @LogFile = @TargetDatbaseFolder + @TargetDatabaseName + '.ldf';
-- Disconnect any users using @TargetDatabaseName
USE [master];
DECLARE @kill varchar(8000) = '';
SELECT @kill = @kill + 'kill ' + CONVERT(varchar(5), session_id) + ';'
FROM sys.dm_exec_sessions
WHERE database_id = db_id('DST18000RD_Daily')
EXEC(@kill);
-- Backup the @SourceDatabase to @BackupFile location
IF @SourceDatabaseName IS NOT NULL
BEGIN
SET @query = 'BACKUP DATABASE ' + @SourceDatabaseName + ' TO DISK = ' + QUOTENAME(@BackupFile,'''')
PRINT 'Executing query : ' + @query;
EXEC (@query)
END
PRINT 'OK!';
-- Drop @TargetDatabaseName if exists
IF EXISTS(SELECT * FROM sysdatabases WHERE name = @TargetDatabaseName)
BEGIN
SET @query = 'DROP DATABASE ' + @TargetDatabaseName
PRINT 'Executing query : ' + @query;
EXEC (@query)
END
PRINT 'OK!'
-- Restore database from @BackupFile into @DataFile and @LogFile
SET @query = 'RESTORE DATABASE ' + @TargetDatabaseName + ' FROM DISK = ' + QUOTENAME(@BackupFile,'''')
SET @query = @query + ' WITH MOVE ' + QUOTENAME(@SourceDatabaseLogicalName,'''') + ' TO ' + QUOTENAME(@DataFile ,'''')
SET @query = @query + ' , MOVE ' + QUOTENAME(@SourceDatabaseLogicalNameForLog,'''') + ' TO ' + QUOTENAME(@LogFile,'''')
PRINT 'Executing query : ' + @query
EXEC (@query)
PRINT 'OK!'
The script is not mine; I put together two scripts to get what I needed. For our old database, DST17000RD, the script still works flawlessly. On the new database, DST18000RD, I get this error:
Executing query : BACKUP DATABASE [DST18000RD] TO DISK = 'F:\Dev_Databases\Temp\backup.dat'
Processed 1209552 pages for database 'DST18000RD', file 'DST18000RD' on file 23.
Processed 2 pages for database 'DST18000RD', file 'DST18000RD_log' on file 23.
BACKUP DATABASE successfully processed 1209554 pages in 139.942 seconds (67.525 MB/sec).
OK!
OK!
Executing query : RESTORE DATABASE DST18000RD_Daily FROM DISK = 'F:\Dev_Databases\Temp\backup.dat' WITH MOVE 'DST18000RD' TO 'F:\Dev_Databases\Temp\DST18000RD_Daily.mdf' , MOVE 'DST18000RD_log' TO 'F:\Dev_Databases\Temp\DST18000RD_Daily.ldf'
Msg 3234, Level 16, State 2, Line 3
Logical file 'DST18000RD' is not part of database 'DST18000RD_Daily'. Use RESTORE FILELISTONLY to list the logical file names.
Msg 3013, Level 16, State 1, Line 3
RESTORE DATABASE is terminating abnormally.
OK!
Some things to note that may just be me barking up the wrong tree: the DST17000RD database is at compatibility level SQL Server 2012 (110) and the DST18000RD database is at SQL Server 2017 (140). The server was upgraded and migrated a couple of months ago, before the new database was created.
Any help is appreciated. From what I can tell, it feels like the script is not renaming the MDF and LDF files before it tries to copy them for the *_daily database? Honestly I'm not sure. I'm a pretend DBA, self-taught on an as-needed basis. Thank you in advance for your help!
The error is telling you that you got the logical name of the db file wrong here:
SET @SourceDatabaseLogicalName = 'DST18000RD' -- Logical name of the DB ( check DB properties / Files tab )
and to run:
RESTORE FILELISTONLY FROM DISK = 'F:\Dev_Databases\Temp\backup.dat'
to see the correct logical file names.
The issue is that you are trying to change the logical file name during the database restore, which is not possible, even with the MOVE clause.
The MOVE clause lets you change the location and names of the physical files, but it does nothing for the logical names.
Fix
You will have to use the existing logical names for the restore, but once you have restored the database, use the ALTER DATABASE command to change the logical names of its files, as follows:
USE [master];
GO
ALTER DATABASE [DST18000RD_Daily]
MODIFY FILE ( NAME = DST17000RD , NEWNAME = DST18000RD );
GO
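If you want the daily job to survive future logical-name changes altogether, one option is to read the names from the backup itself instead of hard-coding them. A sketch, assuming the RESTORE FILELISTONLY column layout of SQL Server 2016/2017 (older versions return fewer columns, so adjust the table definition) and a backup containing exactly one data file and one log file:
DECLARE @filelist TABLE (
LogicalName nvarchar(128), PhysicalName nvarchar(260), [Type] char(1),
FileGroupName nvarchar(128), Size numeric(20,0), MaxSize numeric(20,0),
FileID bigint, CreateLSN numeric(25,0), DropLSN numeric(25,0),
UniqueID uniqueidentifier, ReadOnlyLSN numeric(25,0), ReadWriteLSN numeric(25,0),
BackupSizeInBytes bigint, SourceBlockSize int, FileGroupID int,
LogGroupGUID uniqueidentifier, DifferentialBaseLSN numeric(25,0),
DifferentialBaseGUID uniqueidentifier, IsReadOnly bit, IsPresent bit,
TDEThumbprint varbinary(32), SnapshotURL nvarchar(360));
INSERT INTO @filelist
EXEC ('RESTORE FILELISTONLY FROM DISK = ''F:\Dev_Databases\Temp\backup.dat''');
DECLARE @DataLogicalName nvarchar(128) = (SELECT TOP 1 LogicalName FROM @filelist WHERE [Type] = 'D');
DECLARE @LogLogicalName nvarchar(128) = (SELECT TOP 1 LogicalName FROM @filelist WHERE [Type] = 'L');
-- ...then build the RESTORE ... WITH MOVE statement from these two variables,
-- exactly as the original script does.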

SSIS OPENROWSET query flat file

I currently have a variable named InvoiceFileName that is used while creating .csv files through a foreach loop. A list of .csv files is then output to a folder.
I then need to query each .csv file, selecting the header and the first row of data from each one.
I believe I need to use OPENROWSET to query the .csv files. I have two questions:
What is the syntax for querying the file named by the variable InvoiceFileName?
Is it possible to select the header field and the first row of data with OPENROWSET without inserting into a table?
Below is a simple OPENROWSET that only provides the header of the file.
SELECT
top 1 *
FROM OPENROWSET(BULK N'\\myservername\f$\reports\Invoices\CokeFiles\54ASBSd.csv', SINGLE_CLOB) AS Report
What kind of privs do you have on the database? If you have or can get slightly elevated privs, you can use BULK INSERT and xp_cmdShell to accomplish this, but like @scsimon said, you will have to use dynamic SQL. Here's a quick example:
-----------------------------------------------------------------------------------------------------
-- Set up your variables
-----------------------------------------------------------------------------------------------------
DECLARE
@folderPath AS VARCHAR(100) = '\\some\folder\path\here\',
@cmd AS VARCHAR(150), -- Will populate this with a command to get a list of files in a directory
@InvoiceFileName AS VARCHAR(100), -- Will be used in cursor loop
@targetTable AS VARCHAR(50) = 'SomeTable',
@fieldTerminator AS CHAR(1) = ',',
@rowTerminator AS CHAR(2) = '\n',
@sql AS VARCHAR(MAX) -- Will hold the dynamically generated BULK INSERT statement
-----------------------------------------------------------------------------------------------------
-- Create a temp table to store the file names
-----------------------------------------------------------------------------------------------------
IF OBJECT_ID('tempdb..#FILE_LIST') IS NOT NULL
DROP TABLE #FILE_LIST
--
CREATE TABLE #FILE_LIST(FILE_NAME VARCHAR(255))
-----------------------------------------------------------------------------------------------------
-- Get a list of the files and store them in the temp table:
-- NOTE: this DOES require elevated permissions
-----------------------------------------------------------------------------------------------------
SET @cmd = 'dir "' + @folderPath + '" /b'
--
INSERT INTO #FILE_LIST(FILE_NAME)
EXEC Master..xp_cmdShell @cmd
--------------------------------------------------------------------------------
-- Here we remove any null values
--------------------------------------------------------------------------------
DELETE #FILE_LIST WHERE FILE_NAME IS NULL
-----------------------------------------------------------------------------------------------------
-- Set up our cursor and loop through the files
-----------------------------------------------------------------------------------------------------
DECLARE c1 CURSOR FOR SELECT FILE_NAME FROM #FILE_LIST
OPEN c1
FETCH NEXT FROM c1 INTO @InvoiceFileName
WHILE @@FETCH_STATUS <> -1
BEGIN -- Begin WHILE loop
BEGIN TRY
-- Bulk insert won't take a variable name, so dynamically generate the
-- SQL statement and execute it instead (note: dir /b returns bare file
-- names, so prepend the folder path):
SET @sql = 'BULK INSERT ' + @targetTable + ' FROM ''' + @folderPath + @InvoiceFileName + ''' '
+ ' WITH (
FIELDTERMINATOR = ''' + @fieldTerminator + ''',
ROWTERMINATOR = ''' + @rowTerminator + ''',
FIRSTROW = 1,
LASTROW = 2
) '
EXEC (@sql)
END TRY
BEGIN CATCH
-- Handle errors here
END CATCH
-- Continue your loop
FETCH NEXT FROM c1 INTO @InvoiceFileName
END -- End WHILE loop
-- Do what you need to do here with the data in your target table
A few disclaimers:
I have not tested this code. Only copied from a slightly more complex proc I've used in the past that works for exactly this kind of scenario.
You will need elevated privs for BULK INSERT and xp_cmdShell.
I know people frown on using xp_cmdShell (and for good reason) but this is a quick and dirty solution making a lot of assumptions about what your environment is like.
This is assuming you're not grabbing the data as you get each file in your variable. If you are, you can skip the first part of this code.
This code also assumes you are doing your own error handling in places other than the one try/catch block you see. I've omitted a lot of that for simplicity.
For doing this through SSIS, ideally you'd probably need to use a format file for the bulk operation, but you'd have to have consistently formatted files and remove the SINGLE_CLOB option as well. A really hacky and non-ideal way to do this would be to do something like this:
Let's say your file contains this data:
Col1,Col2,Col3,Col4
Here's,The,First,Line
Here's,The,Second,Line
Here's,The,Third,Line
Here's,The,Fourth,Line
Then you could basically just parse the data doing something like this:
SELECT SUBSTRING(OnlyColumn, 0, CHARINDEX(CHAR(10), OnlyColumn, CHARINDEX(CHAR(10), OnlyColumn, 0)+1) )
FROM OPENROWSET(BULK '\\location\of\myFile.csv', SINGLE_CLOB) AS Report (OnlyColumn)
And your result would be this:
Col1,Col2,Col3,Col4 Here's,The,First,Line
This is obviously dependent on your line endings being consistent, but if you want the results in a single column and single row (as is the behavior of the bulk operation with the SINGLE_CLOB option), that should get you what you need.
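If you would rather get the header and the first data row back as two separate values, a variation on the same CHARINDEX idea should work (this assumes LF or CRLF line endings, hence the REPLACE of CHAR(13)):
SELECT REPLACE(LEFT(OnlyColumn, CHARINDEX(CHAR(10), OnlyColumn) - 1), CHAR(13), '') AS HeaderRow,
REPLACE(SUBSTRING(OnlyColumn,
CHARINDEX(CHAR(10), OnlyColumn) + 1,
CHARINDEX(CHAR(10), OnlyColumn, CHARINDEX(CHAR(10), OnlyColumn) + 1) - CHARINDEX(CHAR(10), OnlyColumn) - 1),
CHAR(13), '') AS FirstDataRow
FROM OPENROWSET(BULK '\\location\of\myFile.csv', SINGLE_CLOB) AS Report (OnlyColumn)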
You can take a look at the solution on this SO post for info on how to pass the SSIS variable value as a parameter to your query.
Use a Foreach Loop container to process all files in a folder. You can use wildcards for the file name, or use the variables in your DTS to set the properties of the components.
Inside the loop container you place a Data Flow Task with your source file connection, your transformations, and your destination.
You can modify the file names and paths of all these objects by setting their properties to variables in your DTS.
With an Expression Task inside the loop, you can change the path of the CSV file connection.

Scripted Restore Using xp_DirTree For Transient Logical BAK File Name SQL Server

Hi, I am trying to restore a DB from one server to another, where the name of the .bak file changes daily with a new timestamp. So far I have found success in determining this name using the following SQL script provided by Jeff Moden here: http://www.sqlservercentral.com/Forums/Topic1200360-391-1.aspx
--===== Create a holding table for the file names
CREATE TABLE #File
(
FileName SYSNAME,
Depth TINYINT,
IsFile TINYINT
)
;
--===== Capture the names in the desired directory
-- (Change "C:\Temp" to the directory of your choice)
INSERT INTO #File
(FileName, Depth, IsFile)
EXEC xp_DirTree '\\filepath\',1,1
;
--===== Find the latest file using the "constant" characters
-- in the file name and the ISO style date.
SELECT TOP 1
FileName
FROM #File
WHERE IsFile = 1
AND FileName LIKE '%.bak' ESCAPE '_'
ORDER BY FileName DESC
;
DROP TABLE #File
My question now is: how do I use this as the basis of a scripted restore operation? Any help would be very much appreciated!
I have found success by extending the above to cache the directory path, ordering the .bak files chronologically to determine which one to use, and then combining that with the restore operation, with MOVE for the logs.
--==CHECK IF DB EXISTS; IF IT DOES, DROP IT
USE [master]
IF EXISTS(SELECT * FROM sys.databases WHERE name = 'insert db name')
DROP DATABASE [insert db name]
--==START THE RESTORE PROCESS
DECLARE @FileName varchar(255), @PathToBackup varchar(255), @RestoreFilePath varchar(1000)
DECLARE @Files TABLE (subdirectory varchar(255), depth int, [file] int)
SET NOCOUNT ON
--==SET THE FILEPATH
SET @PathToBackup = '\\insert path to back up'
-- insert into memory table using dirtree at a single file level
INSERT INTO @Files
EXEC master.dbo.xp_DirTree @PathToBackup,1,1
SELECT TOP 1
@FileName = [subdirectory]
FROM
@Files
WHERE
-- get where it is a file
[file] = 1
AND
--==FIND THE NAME OF THE BAK FILE FROM THE CHRONOLOGICALLY ORDERED LIST
subdirectory LIKE '%.bak'
ORDER BY
-- order descending so newest file will be first by naming convention
subdirectory DESC
IF LEFT(REVERSE(@PathToBackup), 1) != '\'
BEGIN
SET @PathToBackup = @PathToBackup + '\'
END
SET @RestoreFilePath = @PathToBackup + @FileName
-- Grab the file path to restore from
SELECT @RestoreFilePath
--BEGIN THE RESTORE TO THE DESIGNATED SERVER
RESTORE DATABASE [insert name of database to restore]
FROM DISK = @RestoreFilePath
WITH
FILE = 1,
--Create transactional log files on target
MOVE 'mdf_file_name' TO 'file_path\file.mdf',
MOVE 'log_file_name' TO 'file_path\file.ldf', REPLACE;
Here is a script that I have partly written and partly collected.
Features include:
Imports exactly one backup file that has any file name ending in .bak; if there are more files, it imports only one, without any errors.
If no files exist, it gives an error.
Kicks users out of the database.
Deletes the backup file.
DECLARE @DBName nvarchar(255), @FileName nvarchar(255), @PathToBackup nvarchar(255), @RestoreFilePath nvarchar(1000)
DECLARE @Files TABLE (subdirectory nvarchar(255), depth int, [file] int)
SET XACT_ABORT, NOCOUNT ON
SET @PathToBackup = N'I:\Folder'
-- insert into our memory table using dirtree and a single file level
INSERT INTO @Files
EXEC master.dbo.xp_DirTree @PathToBackup,1,1
SELECT
@FileName = [subdirectory]
FROM
@Files
WHERE
-- get where it is a file
[file] = 1
AND
subdirectory LIKE N'%.bak'
ORDER BY
-- order descending so newest file will be first by naming convention
subdirectory DESC
IF LEFT(REVERSE(@PathToBackup), 1) != N'\'
BEGIN
SET @PathToBackup = @PathToBackup + N'\'
END
SET @RestoreFilePath = @PathToBackup + @FileName
SET @DBName = LEFT(@FileName, LEN(@FileName) - 4)
-- SELECT 'Replace AdventureWorks2016CTP3 in this script with @DBName'
SELECT @RestoreFilePath
BEGIN TRY
-- You can try to check if this command works "already":
-- ALTER DATABASE [@DBName] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [AdventureWorks2016CTP3] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
-- You can try to check if this command works "already":
-- RESTORE DATABASE [@DBName]
RESTORE DATABASE [AdventureWorks2016CTP3]
FROM DISK = @RestoreFilePath
WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10
END TRY
BEGIN CATCH
-- You can try to check if this command works "already":
-- ALTER DATABASE [@DBName] SET MULTI_USER;
ALTER DATABASE [AdventureWorks2016CTP3] SET MULTI_USER;
; THROW
END CATCH
-- You can try to check if this command works "already":
-- ALTER DATABASE [@DBName] SET MULTI_USER;
ALTER DATABASE [AdventureWorks2016CTP3] SET MULTI_USER;
-- This script is especially for the case where you replicate from one location to another using backup and restore.
-- Typically you don't need transaction log backups, as all changes will be wiped out on the next transfer.
-- You can try to check if this command works "already":
-- ALTER DATABASE [@DBName] SET RECOVERY SIMPLE;
ALTER DATABASE [AdventureWorks2016CTP3] SET RECOVERY SIMPLE;
-- Delete file(s)
-- NOTE: This works only if you give deletion permissions as defined in https://learn.microsoft.com/en-us/sql/database-engine/configure-windows/xp-cmdshell-server-configuration-option?view=sql-server-2017
-- EXAMPLE: exec xp_cmdshell 'del "I:\Directory\AdventureWorks2016CTP3___.bak"'
-- xp_cmdshell cannot take a concatenated expression directly, so build the full command string first:
DECLARE @cmd NVARCHAR(MAX) = 'xp_cmdshell ''del "' + @PathToBackup + @FileName + '"''';
-- SELECT @cmd
EXEC (@cmd)

Delete multiple files from folder using T-SQL without using cursor

I am writing a cleanup script. This script will run on the weekend and clean up the db. The tables are related to emails, and the paths of the attachments are stored in a table. As part of cleaning up the tables I also have to delete the files from the folder.
The paths of the files look like the following:
\\xxx.xxx.xxx.xxx\EmailAttachments\Some Confirmation for xyz Children Centre_9FW4ZE1C57324B70EC79WZ15FT9FA19E.pdf
I can delete multiple files like the following:
xp_cmdshell 'del c:\xyz.txt, abc.txt'
BUT when I create a comma-separated list from the table using FOR XML PATH(''), the string gets cut off at the end. There might be thousands of rows to delete, so I don't want to use a cursor to delete the files from the folder.
How can I delete the files from the folder without using a cursor?
What permissions do I need on the network folder to delete files using T-SQL from SQL Server?
EDIT:
I have used a cursor and it looks OK; it is not taking much time. One problem I am facing is that SQL Server treats a file name with a space as two files, so the following statement
xp_cmdshell 'del E:\Standard Invite.doc'
throws this error:
Could Not Find E:\Standard
Could Not Find C:\Windows\system32\Invite.doc
NULL
Thanks.
Personally, I wouldn't worry too much about using a cursor here. Cursors are only 'mostly evil'; as your task isn't a set-based operation a cursor may be the most effective solution.
Although you have a comment stating that it will take an "awful lot of time" to use a cursor, in this case the biggest overhead is the actual delete of the file (not the cursor).
Note: the file deletion is done by the operating system, not by the RDBMS.
As the delete is being done by calling xp_cmdshell, and because it is a procedure (not a function, etc.), you can't call it and pass in a table's contents.
What you could do is build up a string and execute that. But note that you are limited to a maximum of 8000 characters in this string. As you have already said that you may have thousands of files, you will certainly not fit them all within 8000 characters.
This means that you are going to need a loop no matter what.
DECLARE
@command VARCHAR(8000),
@next_id INT,
@next_file VARCHAR(8000),
@total_len INT
SELECT
@command = 'DEL ',
@total_len = 4
SELECT TOP 1
@next_id = id,
@next_file = file_name + ', '
FROM
table_of_files_to_delete
ORDER BY
id DESC
WHILE (@next_file IS NOT NULL)
BEGIN
WHILE ((@total_len + LEN(@next_file)) <= 8000) AND (@next_file IS NOT NULL)
BEGIN
SELECT
@command = @command + @next_file,
@total_len = @total_len + LEN(@next_file)
SELECT
@next_file = NULL
SELECT TOP 1
@next_id = id,
@next_file = file_name + ', '
FROM
table_of_files_to_delete
WHERE
id < @next_id
ORDER BY
id DESC
END
SET @command = SUBSTRING(@command, 1, @total_len - 2) -- remove the last ', '
EXEC xp_cmdshell @command
SELECT
@command = 'DEL ',
@total_len = 4
END
Not pretty, huh?
What you may be able to do, depending on what needs deleting, is to use wildcards. For example:
EXEC xp_cmdshell 'DELETE C:\abc\def\*.txt'
To delete files with a space in the name, you need to enclose the file name in double quotes:
xp_cmdshell 'del "E:\Standard Invite.doc"'
DECLARE @deleteSql varchar(500)
,@myPath varchar(500) = '\\DestinationFolder\'
SET @deleteSql = 'EXEC master..xp_cmdshell ''del ' + @myPath + '*.csv'''
EXEC(@deleteSql)
EXEC(#deleteSql)