Escaping command parameters passed via xp_cmdshell to dtexec

I am calling an SSIS package remotely using a stored procedure and a call to xp_cmdshell:
declare @cmd varchar(5000)
set @cmd = '"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\dtexec.exe" /Rep E /Sql Package /SET \Package.Variables[User::ImportFileName].Value;c:\foo.xlsx'
print @cmd
exec xp_cmdshell @cmd
This works fine; however, I cannot guarantee that the variable value (c:\foo.xlsx) will never contain spaces, so I would like to wrap it in quotes like below:
set @cmd = '"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\dtexec.exe" /Rep E /Sql Package /SET \Package.Variables[User::ImportFileName].Value;"c:\foo.xlsx"'
But by doing this I get the error
'C:\Program' is not recognized as an internal or external command, operable program or batch file.
Both of the above commands work fine if executed within cmd.exe so I am guessing that SQL Server is interpreting my double quotes and changing something, but I can't figure out what.

In a nutshell, put CMD /S /C " at the beginning and a closing " at the end. In between you can have as many quotes as you like.
Here is how you do it:
declare @cmd varchar(8000)
-- Note you can use CMD builtins and output redirection etc with this technique,
-- as we are going to pass the whole thing to CMD to execute
set @cmd = 'echo "Test" > "c:\my log directory\logfile.txt" 2> "c:\my other directory\err.log" '
declare @retVal int
declare @output table(
ix int identity primary key,
txt varchar(max)
)
-- Magic goes here:
set @cmd = 'CMD /S /C " ' + @cmd + ' " '
insert into @output(txt)
exec @retVal = xp_cmdshell @cmd
insert @output(txt) select '(Exit Code: ' + cast(@retVal as varchar(10))+')'
select * from @output
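Applied to the dtexec call from the question, the same trick looks like this (a sketch; the file name with spaces is a hypothetical example):

declare @cmd varchar(8000)
set @cmd = '"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\dtexec.exe" /Rep E /Sql Package /SET \Package.Variables[User::ImportFileName].Value;"c:\my import files\foo.xlsx"'
-- wrap the whole line so CMD strips only the outermost quotes
set @cmd = 'CMD /S /C " ' + @cmd + ' " '
exec xp_cmdshell @cmd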

After looking into this, it appears you have to use the DOS 8.3 notation with xp_cmdshell (e.g. c:\progra~1\...), and you can only have one set of quotes in the arguments.
To get around this limitation, either use the older DOS notation or put your code in a batch file instead, which will run fine.
Source: http://social.msdn.microsoft.com/forums/en-US/sqlintegrationservices/thread/4e7024bb-9362-49ca-99d7-1b74f7178a65
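For reference, a minimal sketch of the 8.3 workaround applied to the command from the question. The short names progra~2 and micros~1 are assumptions and vary by machine, so verify each path component with dir /x:

declare @cmd varchar(5000)
-- the executable path uses 8.3 short names; only the argument keeps its quotes
set @cmd = 'c:\progra~2\micros~1\100\DTS\Binn\dtexec.exe /Rep E /Sql Package /SET \Package.Variables[User::ImportFileName].Value;"c:\foo.xlsx"'
exec xp_cmdshell @cmd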

Related

bcp outfile not found

I'm trying to run the script below, but it returns null. When I run the DOS command, it generates the file normally.
DECLARE @str VARCHAR(1000)
SET @str = 'bcp "Select * FROM WDG.dbo.Facilidade" queryout "w:\xyzTable.txt" -S "WDG-NOTE24\MSSQLWDG" -T -c -t ; '
EXEC xp_cmdshell @str
GO
I need to produce a txt file delimited by ';' with the query data.
Thanks!
Given the error in your comments, follow this link; as I described earlier, it's an access problem on your folder.
Remember the permission should be granted to your SQL Server service account, not to yourself.
BCP unable to open BCP host access
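As an illustration only (the account name NT Service\MSSQLSERVER and the w:\ folder are assumptions; check which account your SQL Server service actually runs under), granting the service account modify rights on the output folder might look like:

EXEC master..xp_cmdshell 'icacls "w:\" /grant "NT Service\MSSQLSERVER":(OI)(CI)M'

Run the same icacls command from an elevated command prompt if xp_cmdshell itself lacks the rights to change permissions.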
I managed to make it work. I found the command below, which tests whether the service account has access to a directory.
For some reason it did not accept the old path, so I created another one on another disk and it worked.
EXEC master..xp_cmdshell 'DIR C:\sql'
Thank you very much for the help.
I have a problem: at the end of my import file there is the text '--END--', and when the bulk insert runs it reports an unexpected end-of-file message.
Is there a parameter I can add so that the import stops when it reaches that text?
declare @sql varchar(max)
-- @@FullPath holds the full path of the import file (note the space before WITH)
set @sql = 'BULK INSERT Temp_Facilite FROM ''' + @@FullPath + ''' WITH (FIRSTROW = 2, CODEPAGE = ''RAW'', FIELDTERMINATOR = '';'', ROWTERMINATOR = ''0x0A'', MAXERRORS = 3, KEEPNULLS);'
exec (@sql)

SQL Server Calling a stored procedure from another stored procedure at the command line

I have been playing around with database backup automation scripts and in particular the one at this link:
http://support.microsoft.com/kb/2019698
I got everything working fine and even added automated compression using 7zip, logging, and, with the help of VBScript, a scheduled email notification. However, even without all that, you can see this is a bit heavy. It's now easily reaching 400 lines of code.
I am really not comfortable having all my stuff in one block like this, and I want to separate it out. So I can have, say, a compression file called BackupCompress.sql and a log file called BackupLogReport.sql, all of which would be called from inside the main Backup.sql script.
The Backup.sql script is in turn run from a Backup.bat file which is set to run in the scheduler.
All of this works like a charm. But I am at a loss as to how to call BackupCompress.sql from within BackupLogReport.sql and pass in parameters and get a return value.
In the Backup.bat file I use this command to spin everything up and pass parameters to it:
SQLCMD -S %SQLDATABASE% -d master -i %BACKUP_FOLDER%\Backup.sql -v Param1="%Param1%"
In the Backup.sql file I get those parameters simply by:
DECLARE @Param1 NVARCHAR(256) = '$(Param1)'
From then on, as my script runs, it uses whatever I pass in.
I tried using standard sql stored procedure logic to call another procedure like this:
EXEC BackupCompress.sql
@AnotherParam = @Param1
I also tried:
EXECUTE sp_executesql BackupCompress.sql @Param1
Finally I tried:
SET @cmd = 'SQLCMD -S ' + @@ServerName + ' -d master -i $(BACKUP_FOLDER)\BackupCompress.sql -v Param1 = ' + @Param1
EXEC xp_cmdshell @cmd, no_output
but it doesn't work, and the files that were being compressed simply don't get compressed. I get no error message; everything else continues to work fine.
EDIT: I was getting an error message on the last one, but I fixed it. However, I still don't get my zip file. I even put PRINT statements into the file to see if it was actually being executed, but it does not seem to be.
EDIT2: Another option I have tried almost works, but I can't figure out how to pass parameters from within one sql file to the other. As a result it generates an error saying it can't find the file, as it's treating the path as a literal string instead of the variable value I want to pass.
:!!SQLCMD -S @@ServerName -d master -i @CFG_BACKUP_PATH\BackupCompress.sql -v Param1 = @Param1
xp_cmdshell can return values. These values can be captured into a table variable that you could use to "see" the results, and perhaps determine where the problem lies:
DECLARE @cmd VARCHAR(255);
DECLARE @Param1 NVARCHAR(256) = '$(Param1)';
DECLARE @Results TABLE
(
ResultsText NVARCHAR(MAX)
);
SET @cmd = 'SQLCMD -S ' + @@ServerName + ' -d master -i $(BACKUP_FOLDER)\BackupCompress.sql -v Param1 = ' + @Param1;
-- For an initial test, overwrite the command with something trivial:
SET @cmd = 'DIR \';
INSERT INTO @Results (ResultsText)
EXEC xp_cmdshell @cmd;
SELECT *
FROM @Results;
You need to ensure xp_cmdshell is enabled for the instance, by executing:
EXEC sp_configure 'xp_cmdshell',1;
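If that call complains that xp_cmdshell is an advanced option, enable advanced options first and reconfigure:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;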

Problem with BCP writing to .txt file from SQL

I'm using SQL Server 2008, trying to run this BCP command, but it never creates the file.
-- Export query
DECLARE @qry2 VARCHAR(1000)
SET @qry2 = 'SELECT * FROM @SkippedProductsTable'
-- Folder we will be putting the file in
DECLARE @incomingfolder VARCHAR(1000)
SET @incomingfolder = 'c:\Logs'
DECLARE @bcpCommand VARCHAR(2000)
SET @bcpCommand = 'bcp "'+@qry2+'" queryout "'+@incomingfolder+'\SkippedProducts-'+CAST(@StoreMatchCode AS VARCHAR)+'-'+'.txt" -c -T'
PRINT @bcpCommand
EXEC MASTER..xp_cmdshell @bcpCommand, no_output
The created command looks like:
bcp "SELECT * FROM #SkippedProductsTable" queryout "c:\Logs\SkippedProducts-1330-.txt" -c -T
Can anyone suggest what could be going wrong? I've never used BCP before and not really sure where to start looking.
As a start, I know that the folder definitely exists at that location.
I think the problem is the SELECT.
You are SELECTing from a table variable that is not declared in the query, so there's nothing for BCP to do.
Table variables only persist for the context they are called in, so even if you have one in a query, and you have dynamic sql or a subproc within that first query, they won't be able to see the table variable.
See this for more info.
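One workaround (my own sketch, not from the linked page): copy the rows into a global temp table first. bcp logs in on its own connection, but a global temp table stays visible to it because the session that created it is still alive while xp_cmdshell blocks. Assuming @SkippedProductsTable and @StoreMatchCode exist as in the question:

SELECT * INTO ##SkippedProducts FROM @SkippedProductsTable;
DECLARE @bcpCommand VARCHAR(2000);
SET @bcpCommand = 'bcp "SELECT * FROM ##SkippedProducts" queryout "c:\Logs\SkippedProducts-' + CAST(@StoreMatchCode AS VARCHAR) + '-.txt" -c -T';
EXEC MASTER..xp_cmdshell @bcpCommand;
DROP TABLE ##SkippedProducts;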

TransactSQL to run another TransactSQL script

I have 10 transact SQL scripts that each create a table and fill it with data.
I am attempting to create 1 master sql script that will run each of the 10 other scripts.
Is there a way with TSQL / TRANSACTSQL for Microsoft SQL Server 2008 to execute another tsql script from within the current tsql script?
This is intended to be run through the SQL Server Management Studio (SSMS).
Thanks!
Try this if you are trying to execute a .sql file in SSMS:
:r C:\Scripts\Script1.sql
:r C:\Scripts\Script2.sql
:r C:\Scripts\Script3.sql
...
note: for this to run, turn on SQLCMD mode (Query > SQLCMD Mode)
If these are scripts you run fairly often you might consider dropping them in a stored proc and running them that way...
You can also do it through sqlcmd (which I believe is more common):
sqlcmd -S serverName\instanceName -i C:\Scripts\Script1.sql
Or just use openrowset to read your script into a variable and execute it:
DECLARE @SQL varchar(MAX)
SELECT @SQL = BulkColumn
FROM OPENROWSET
( BULK 'MyPath\MyScript.sql'
, SINGLE_BLOB ) AS MYTABLE
--PRINT @SQL
EXEC (@SQL)
I find it useful to define a variable with the path, if I want to execute a set of scripts, say to run a test, something like:
:setvar path "C:\code\branch-qa"
:r $(path)\tables\client.sql
:r $(path)\tables\item.sql
:r $(path)\proc\clientreport.sql
exec clientreport
You can use osql, or better yet the newer sqlcmd, almost interchangeably. I am using osql in this example only because I happened to have a code sample sitting around, but in production I am using sqlcmd. Here is a snippet of code out of a larger procedure I use to run update scripts against databases. They are ordered by major, minor, release, build, as I name my scripts using that convention to track releases. You are obviously missing all of my error handling, the parts where I pull available scripts from the database, variable setup, etc., but you may still find this snippet useful.
The main part I like about using osql or sqlcmd is that you can run this code in ssms, or in a stored procedure (called on a scheduled basis maybe) or from a batch file. Very flexible.
--Use cursor to run upgrade scripts
DECLARE OSQL_cursor CURSOR
READ_ONLY
FOR SELECT FileName
FROM #Scripts
ORDER BY Major, Minor, Release, Build
OPEN OSQL_cursor
FETCH NEXT FROM OSQL_cursor INTO @name
WHILE (@@FETCH_STATUS <> -1)
BEGIN
IF ((@@FETCH_STATUS <> -2) AND (@result = 0))
BEGIN
SET @CommandString = 'osql -S ' + @@ServerName + ' -E -n -b -d ' + @DbName + ' -i "' + @Dir + @name + '"'
EXEC @result = master.dbo.xp_cmdshell @CommandString, NO_OUTPUT
IF (@result = 0)
BEGIN
SET @Seconds = DATEDIFF(s, @LastTime, GETDATE())
SET @Minutes = @Seconds / 60
SET @Seconds = @Seconds - (@Minutes * 60)
PRINT 'Successfully applied ' + @name + ' in ' + cast(@Minutes as varchar)
+ ' minutes ' + cast(@Seconds as varchar) + ' seconds.'
SET @LastTime = GETDATE()
END
ELSE
BEGIN
SET @errMessage = 'Error applying ' + @name + '! The database is in an unknown state and the schema may not match the version.'
SET @errMessage = @errMessage + char(13) + 'To find the error restore the database to version ' + @StartingVersion
SET @errMessage = @errMessage + ', set @UpToVersion = the last version successfully applied, then run ' + @name
SET @errMessage = @errMessage + ' manually in Query Analyzer.'
END
IF @name = (@UpToVersion + '.sql')
GOTO CleanUpCursor --Quit if the final script specified has been run.
END
FETCH NEXT FROM OSQL_cursor INTO @name
END
The simplest way would be to make your scripts stored procedures, and to call (via the EXECUTE command) each procedure in turn from a central procedure. This is ideal if you're going to run the exact same script(s) over and over again (or the same script with different parameters passed in).
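A minimal sketch of that approach (all procedure names here are hypothetical):

CREATE PROCEDURE dbo.RunAllLoads
AS
BEGIN
    -- each of the 10 scripts becomes its own procedure, executed in order
    EXECUTE dbo.Load_Table01;
    EXECUTE dbo.Load_Table02;
    EXECUTE dbo.Load_Table03;
    -- ... and so on for the remaining scripts
END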
If your scripts are .sql (or any kind of text) files, as @Abe Miesller says (upvoted) you can run them from within SSMS via the :r command, when SQLCMD mode is enabled. You would have to know and script the exact file path and name. This cannot be done from within a stored procedure.
A last alternative, usable with "known" file names and necessary for arbitrary file names (say, all files currently loaded in a subfolder), is to leverage the power of the extended procedure XP_CMDSHELL. Such solutions can get complex pretty fast (use it to retrieve the list of files, build and execute via xp_cmdshell a string calling SQLCMD for each file in turn, manage results and errors via output files; it goes on and on), so I'd only do this as a last resort. A rough sketch follows.
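For what it's worth, a rough sketch of that last-resort approach (the folder C:\Scripts is hypothetical; error handling omitted):

DECLARE @files TABLE (ix INT IDENTITY PRIMARY KEY, fname VARCHAR(260));
-- capture the directory listing; xp_cmdshell returns one row per line of output
INSERT INTO @files (fname)
EXEC master..xp_cmdshell 'dir /b "C:\Scripts\*.sql"';
DECLARE @fname VARCHAR(260), @cmd VARCHAR(1000);
DECLARE file_cursor CURSOR FOR
    SELECT fname FROM @files WHERE fname LIKE '%.sql' ORDER BY ix;
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @fname;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'SQLCMD -S ' + @@SERVERNAME + ' -E -b -i "C:\Scripts\' + @fname + '"';
    EXEC master..xp_cmdshell @cmd, NO_OUTPUT;
    FETCH NEXT FROM file_cursor INTO @fname;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;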
Assuming you want to keep the 10 scripts in their own individual files, I would say the easiest way to do what you want would be to create a batch file that executes osql.exe to execute the 10 scripts in the order you want.

Transfering the content of a variable to file in SQL Server 2k5

How do we transfer the content of a string variable to a file from an SP? I need to do the following:
Create an empty text file from an SP
Push the content of a variable (whose length is 25487) to the newly created file
The variable is declared Varchar(Max), and this is the code I am trying to make work (but it isn't):
Declare @cmd sysname
Declare @ReqContent Varchar(max)
SET @cmd = 'echo ' + @ReqContent + ' > C:\DD4TRD.txt'
EXEC master..xp_cmdshell @cmd ,NO_OUTPUT
Thanks,
Sayan
Check out this sproc from Reading and Writing Files in SQL Server using T-SQL.
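If the linked procedure is not an option, a minimal alternative sketch reusing the bcp queryout technique from earlier in this thread (the output path is from the question; bcp here assumes a default local instance, so add -S as needed). Note the echo approach will truncate long content anyway, since xp_cmdshell command strings are limited to 8000 characters:

DECLARE @ReqContent VARCHAR(MAX);
SET @ReqContent = REPLICATE(CONVERT(VARCHAR(MAX), 'x'), 25487); -- stand-in for the real content
IF OBJECT_ID('tempdb..##ReqContent') IS NOT NULL DROP TABLE ##ReqContent;
CREATE TABLE ##ReqContent (txt VARCHAR(MAX));
INSERT INTO ##ReqContent (txt) VALUES (@ReqContent);
-- bcp opens its own connection; the global temp table stays visible while this session is alive
EXEC master..xp_cmdshell 'bcp "SELECT txt FROM ##ReqContent" queryout "C:\DD4TRD.txt" -c -T', NO_OUTPUT;
DROP TABLE ##ReqContent;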