I am running SQL Server 2005 on a Windows Server 2003 machine.
I have a requirement to accumulate small text files into a bigger one.
So I use
exec xp_cmdshell @sql
where @sql =
'copy /b '+@sourcePath+@sourceFile+' '+@destinationPath+@NewFileName
Both the source and destination paths are on a separate server.
Occasionally this process fails, and I don't find anything in the event logs or the SQL Server logs.
The Surface Area Configuration option for xp_cmdshell is also enabled.
Please help.
I just tested this on my SQL Server 2005 instance, and EXEC dbo.xp_cmdshell always returns output in the form of a table, even for a bogus command. For C#, if you are calling this code with ExecuteNonQuery, call it with ExecuteReader instead and read the output. Alternatively, you could dump the output into a table so that you can look at it later at your leisure. Create a table like this:
CREATE TABLE [dbo].[xp_cmdShellOutput](
[errorMsg] [nvarchar](max) NULL
)
and then use this code :
DECLARE @sql AS VARCHAR(600)
SELECT @sql = '<your command>'
INSERT dbo.xp_cmdShellOutput(errorMsg)
EXEC dbo.xp_cmdshell @sql
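After the command runs, you can inspect whatever xp_cmdshell reported, for example:
SELECT errorMsg FROM dbo.xp_cmdShellOutput;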
I am trying to build an SSIS package that dynamically rebuilds the indexes for all the tables in my database. The general idea is that the package will make sure that the table is not being updated and then execute a stored procedure that drops the old index, if it exists, and then recreates it. The logic behind the package seems sound. The problem is that when I execute the package I keep getting the error:
Cannot find object...because it does not exist or you do not have permission...
Whether the index exists should be irrelevant because of the IF EXISTS part.
The procedure looks like this (the procedure name here is illustrative):
CREATE PROCEDURE dbo.RebuildIndex
    @REFERENCE_NAME AS VARCHAR(50),
    @COLUMN_NAME AS VARCHAR(50),
    @INDEX_NAME AS VARCHAR(50)
AS
BEGIN
SET NOCOUNT ON;
DECLARE @sql NVARCHAR(MAX)
SET @sql = 'IF EXISTS (SELECT name FROM sysindexes WHERE name = '+CHAR(39)+@INDEX_NAME+CHAR(39)+') '+
           'DROP INDEX '+@INDEX_NAME+' ON '+@REFERENCE_NAME+' '+
           'CREATE INDEX '+@INDEX_NAME+' ON '+@REFERENCE_NAME+'('+@COLUMN_NAME+') ON [INDEX]'
EXEC sp_executesql @sql
END
GO
I am able to execute the procedure through SSMS just fine, with no error, and it builds the index. When I execute the package in SSIS, it errors out the minute it gets to the task that executes the stored procedure. I have made sure that SSIS is passing the variables to the Execute SQL Task, and I have verified that I have db_ddladmin rights. Beyond that I am at a loss, and I have been beating my head against the wall for a day and a half on this.
Is there something I am missing, some permissions I need to request, or some workaround for the issue?
Any information would be much appreciated.
Bartover, it's definitely not looking at the wrong database. I have checked that the proc is there, and the only connection in the package is to that specific database. Yes, I am executing the package manually with Visual Studio 2010 Shell Data Tools.
Sorrel, I tried your idea of a sanity check on the @sql statement on the drop, on both the drop and create, and on the whole @sql statement, no joy.
Gnackenson, I had that same thought, but the connection authentication method is set to Windows Authentication, same as SSMS. Do you have any ideas as to why it might use different permissions?
It looks like IF EXISTS is being ignored by the SSIS SQL Task. To fix my problem, I altered my SQL tasks from DROP - CREATE to DISABLE - ENABLE.
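For anyone hitting the same thing, the disable/enable pattern is just ALTER INDEX; a minimal sketch with illustrative index and table names (note that REBUILD is what re-enables a disabled index):
ALTER INDEX IX_MyIndex ON dbo.MyTable DISABLE;
ALTER INDEX IX_MyIndex ON dbo.MyTable REBUILD;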
I'm trying to create a temp table from stored procedures, from this link.
In the string he defines the SQL Server version. Our clients have different SQL Server versions, from 2005 through 2012.
String: 'SQLNCLI', 'Server=(local)\SQL2008;Trusted_Connection=yes;','EXEC getBusinessLineHistory'
How can I use that command independently of the SQL Server platform?
OPENROWSET creates a dynamic link to a remote server.
http://technet.microsoft.com/en-us/library/ms190312.aspx
You can create a dynamic T-SQL call to a dynamic link with changing parameters. Below is sample code. This can be converted into a stored procedure with @my_Server passed as a parameter.
Please note, this does not support multiple calls at the same time, since only one staging table exists.
You cannot use a local temp table, since there might be a scoping issue with EXEC calling sp_executesql inside a stored procedure.
These are things you will need to research.
-- Set the server info
DECLARE @my_Server SYSNAME;
SET @my_Server = 'Server=(local)\SQL2008';
-- Clear the staging table
TRUNCATE TABLE STAGE.dbo.MYTABLE;
-- Allow for dynamic server location
DECLARE @my_TSQL NVARCHAR(2048);
SET @my_TSQL =
    'INSERT INTO STAGE.dbo.MYTABLE SELECT * FROM OPENROWSET(''SQLNCLI'', ''' + @my_Server +
    ';Trusted_Connection=yes;'', ''EXEC usp_My_Stored_Procedure'')';
-- Run the dynamic remote TSQL
EXEC sp_executesql @my_TSQL;
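One caveat if you go this route: OPENROWSET with an ad hoc connection string requires the 'Ad Hoc Distributed Queries' server option, so you may need to enable it first (assuming it is not already on for your instance):
-- Enable ad hoc distributed queries (advanced option)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;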
I have a .sql script with a lot of action queries that work on some staging tables. This script needs to be run twice, with some other commands in between, i.e.:
Load the staging table from source A
Use do_stuff.sql to process it
Move the results somewhere.
Repeat Steps 1-3 for source B.
The brute force approach would be to just copy & paste do_stuff.sql as needed. While this would technically work, is there a better way?
I'm hoping there's a command like RunThisSQL 'C:\do_stuff.sql' that I haven't discovered yet.
Update
Well, it's been about 5 years and I just re-discovered this old question. I did this recently by making a cursor to loop through a master table. For each record in that master table, the script runs through an inner script using variables set from the master table.
https://www.mssqltips.com/sqlservertip/1599/sql-server-cursor-example/
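A minimal sketch of that pattern, with hypothetical table, column, and procedure names:
DECLARE @SourceName SYSNAME;
DECLARE @sql NVARCHAR(MAX);
-- One row per source; the "inner script" runs once per row
DECLARE master_cursor CURSOR FOR
    SELECT SourceName FROM dbo.MasterSourceList;
OPEN master_cursor;
FETCH NEXT FROM master_cursor INTO @SourceName;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build and run the inner script using values from the current row
    SET @sql = N'EXEC dbo.usp_DoStuff @source = @p1';
    EXEC sp_executesql @sql, N'@p1 SYSNAME', @p1 = @SourceName;
    FETCH NEXT FROM master_cursor INTO @SourceName;
END
CLOSE master_cursor;
DEALLOCATE master_cursor;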
If you use Visual Studio you can create a "SQL Server Database" project. Within the project you can create a script that lets you execute your *.sql files like this:
/*
Post-Deployment Script Template
--------------------------------------------------------------------------------------
This file contains SQL statements that will be appended to the build script.
Use SQLCMD syntax to include a file in the post-deployment script.
Example: :r .\myfile.sql
Use SQLCMD syntax to reference a variable in the post-deployment script.
Example: :setvar TableName MyTable
SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
See also: http://candordeveloper.com/2013/01/08/creating-a-sql-server-database-project-in-visual-studio-2012/
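For the scenario in the question, the post-deployment script could pull in the same file once per source, switching a SQLCMD variable in between; a sketch in SQLCMD syntax (the variable name Source is illustrative, and do_stuff.sql would reference it as $(Source)):
:setvar Source A
:r .\do_stuff.sql
:setvar Source B
:r .\do_stuff.sql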
Try using xp_cmdshell.
EXEC xp_cmdshell 'sqlcmd -S ' + @ServerName + ' -d ' + @DBName + ' -i ' + @FileName
xp_cmdshell and concatenation do not play together nicely, often resulting in an "Incorrect syntax near '+'" error. So, further to Jeotic's solution above, you will need to put the entire string you pass to xp_cmdshell into a variable, including quotes around anything that may contain a space (e.g. filepath\filename). This is mentioned in the Microsoft documentation for xp_cmdshell here. Other issues you will have to contend with are the default setup for SQL Server, which has xp_cmdshell disabled, as outlined here, and granting permission to non-system administrators to use xp_cmdshell, outlined here. The documentation generally advises against giving xp_cmdshell rights to too many people, owing to it being a vehicle for those with malicious intent, but if, like me, you have few and trustworthy database users, then it seems like a reasonable solution. One last thing that requires correct configuration is SQL Server Agent, as outlined here; the documentation states that SQL Server Agent is responsible for background scheduling (such as backups) and execution of command-line statements.
DECLARE
     @Server nvarchar(50)
    ,@Database nvarchar(50)
    ,@File nvarchar(100)
    ,@cmd nvarchar(300);
SET @Server = 'server_name';
SET @Database = 'database_name';
SET @File = 'C:\your file path with spaces';
SET @cmd = 'sqlcmd -S ' + @Server + ' -d ' + @Database + ' -i "' + @File + '"';
EXEC xp_cmdshell @cmd;
There are some security issues with enabling xp_cmdshell in SQL Server. An alternative is to create a CLR stored procedure that executes the content of the file passed to it. Such a CLR stored procedure is built for that one purpose, unlike xp_cmdshell, which can do anything the command prompt can.
issues with enabling xp_cmdshell
Create CLR stored procedure
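If you go the CLR route, the T-SQL side of registering such a procedure looks roughly like this; the assembly, class, and method names are hypothetical, and the C# body that reads and executes the file is not shown:
-- Register the compiled assembly (EXTERNAL_ACCESS allows it to read files from disk)
CREATE ASSEMBLY SqlFileRunner
FROM 'C:\clr\SqlFileRunner.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;
GO
-- Expose the CLR method as a regular stored procedure
CREATE PROCEDURE dbo.usp_RunSqlFile
    @filePath NVARCHAR(260)
AS EXTERNAL NAME SqlFileRunner.[StoredProcedures].RunSqlFile;
GO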
Is it possible, in a script executed in MS SQL Server 2005, to copy a trigger from one database to another?
I've been asked to write a test script for a trigger my project is using. Our test structure is to create an empty database containing only the object under test, then execute a script on that database that creates all the other objects needed for the test, fills them, runs whatever tests are needed, compares the results against expected results, and then drops everything except the object under test.
I can't just create a database that is empty except for the trigger, because the trigger depends on several tables. My test script currently runs the CREATE TRIGGER after all the required tables are created, but this won't do because the test script isn't allowed to contain the object under test.
What's been suggested is that, instead of running a CREATE TRIGGER, I somehow copy the trigger at that point in the script from the live database to the test database. I've had a quick Google and haven't found a way to do this. Thus my question - is this even possible, and if so, how can I do it?
You could read the text of the trigger with sp_helptext 'triggername'.
Or you can select the definition into a variable and execute that:
declare @sql nvarchar(max)
select @sql = object_definition(object_id)
from sys.triggers
where name = 'testtrigger'
EXEC (@sql)
I have a stored procedure that copies a bunch of tables to a test database. To make it less prone to mistakes that could potentially change the wrong database, I want to avoid using USE and instead explicitly specify per statement which database the trigger is copied from and to.
With the help of this answer, I came up with this solution:
DECLARE @sql NVARCHAR(MAX);
EXEC SourceDB.sys.sp_executesql
    N'SELECT @output = (SELECT OBJECT_DEFINITION(OBJECT_ID(''TriggerName'')))',
    N'@output NVARCHAR(MAX) OUTPUT',
    @output = @sql OUTPUT;
EXEC DestDB.sys.sp_executesql @sql;
I have a SQL Server stored procedure that I use to back up data from our database before doing an upgrade, and I'd really like to be able to run the stored procedure on multiple databases by passing in the database name as a parameter. Is there an easy way to do this? The best I can figure is to dynamically build the SQL in the stored procedure, but that feels like it's the wrong way to do it.
Build a procedure to back up the current database, whatever it is. Install this procedure on all databases that you want to back up.
Then write another procedure that will launch the backups. This will depend on things you have not mentioned, like whether you have a table containing the names of each database to back up. Basically, all you need to do is loop over the database names and build a string like:
SET @ProcessQueryString =
    'EXEC ' + @DatabaseServer + '.' + @DatabaseName + '.dbo.BackupProcedureName param1, param2'
and then just:
EXEC (@ProcessQueryString)
to run it remotely.
There isn't really any other way to do this; dynamic SQL is the only way. If you've got strict controls over DB names and who's running it, then you're okay just concatenating everything together, but if there's any doubt, use QUOTENAME to escape the parameter safely:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = otherstuff'
EXEC sp_executesql @sql
Obviously, if there's anything more being passed through then you'll want to double-check any other input, and potentially use parameterised dynamic SQL, for example:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50),
    @someValue NVARCHAR(10)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = @pOtherStuff'
EXEC sp_executesql @sql, N'@pOtherStuff NVARCHAR(10)', @someValue
This then makes sure that parameters for the dynamic SQL are passed through safely, and the chances of injection attacks are reduced. It also improves the chances that the execution plan associated with the query will be reused.
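Calling either version is then an ordinary procedure call; for example (the database name and value here are placeholders):
EXEC doStuff @dbName = N'MyDatabase', @someValue = N'abc'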
Personally, I just use a batch file and shell out to sqlcmd for things like this. Otherwise, building the SQL in a stored proc (like you said) would work just fine. Not sure why it would be "wrong" to do that.
best regards,
don
MSSQL has an OPENQUERY(linked_server, 'statement') function: if the server is linked, you specify it as the first parameter, and it fires the statement against that server.
You could generate this OPENQUERY statement in a dynamic proc, and either have it fire the backup proc on each server, or execute the statement directly.
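A rough sketch of the idea, with a hypothetical linked server name (note that OPENQUERY expects the pass-through query to return a rowset, so the remote proc would need to return at least one result set):
SELECT * FROM OPENQUERY([MyLinkedServer], 'EXEC MyDatabase.dbo.BackupProcedureName')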
Do you use SSIS? If so, you could try creating a couple of SSIS packages and scheduling them, or executing them remotely.