Stored procedure with bcp hangs, but works when run as a script - freeze

I've been working on exporting a table to a file, and had problems with the bcp (bulk copy program) part of the procedure locking up. The code worked fine when I ran it as a script, but would generate locked processes when I wrapped it in a stored procedure.

I seem to have found the solution: COMMIT. Namely, I had to wrap the code that truncates and repopulates the table bcp picks the data up from inside a BEGIN TRANSACTION...COMMIT. Now the procedure works.
I think it has to do with the command
exec master.dbo.xp_cmdshell @bcp
stepping outside of the SQL session to the OS. Am I correct, or is there a better explanation?
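For reference, here is a minimal sketch of the pattern described above; the staging table, columns, database and file path are all hypothetical. The point is that the repopulation is committed before xp_cmdshell runs, so the separate session that bcp opens is not blocked by an open transaction.
-- repopulate the table bcp will export, inside an explicit transaction
BEGIN TRANSACTION
    TRUNCATE TABLE dbo.ExportStaging            -- hypothetical staging table
    INSERT INTO dbo.ExportStaging (Col1, Col2)
        SELECT Col1, Col2 FROM dbo.SourceTable  -- hypothetical source
COMMIT TRANSACTION                              -- commit before bcp connects back in

-- bcp logs in again through the OS, as a new session
DECLARE @bcp varchar(4000)
SET @bcp = 'bcp MyDb.dbo.ExportStaging out "C:\Export\staging.txt" -c -T -S ' + @@servername
EXEC master.dbo.xp_cmdshell @bcp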

Related

I've generated an SQL file full of INSERTs but can't find any documentation on executing this script from a stored procedure

I'm creating a stored procedure that will delete all the data in my database and then insert the data from my SQL file. The reason I am using delete and insert instead of a restore is that a restore requires that no one is connected to the database, whereas deleting and inserting allows people to stay connected.
Stored Procedure:
CREATE PROCEDURE DropAndRestore
    -- Add the parameters for the stored procedure here
    @filepath nvarchar(200)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Insert statements for procedure here
    EXEC sp_MSFOREACHTABLE 'delete from ?'

    RESTORE DATABASE [landofbeds]  -- These lines are what needs to be replaced
    FROM DISK = @filepath          --
END
GO
The reason I am using the delete and insert instead of a restore is
because a restore requires that no one is connected to the database,
whereas deleting and inserting allows people to still be connected.
If all you need is minimal downtime, you can restore your database as db_copy, then drop your db and rename db_copy to db.
Yes, you still have to disconnect all the users to be able to drop your db, but that takes minimal time, while if you delete your data the tables will be unavailable for the whole duration of the delete, and as DELETE is always fully logged, your users will wait.
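A rough sketch of that restore-and-rename approach; the backup path and the logical file names are assumptions (check them with RESTORE FILELISTONLY):
-- restore the backup side by side as a copy
RESTORE DATABASE landofbeds_copy
FROM DISK = N'C:\Backups\landofbeds.bak'
WITH MOVE 'landofbeds' TO 'C:\Data\landofbeds_copy.mdf',      -- assumed logical data file name
     MOVE 'landofbeds_log' TO 'C:\Data\landofbeds_copy.ldf'   -- assumed logical log file name

-- short outage: kick out remaining connections, drop the old db, rename the copy
ALTER DATABASE landofbeds SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE landofbeds
ALTER DATABASE landofbeds_copy MODIFY NAME = landofbeds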
To launch your script you could use xp_cmdshell to call sqlcmd with -i, but it's not a good idea: you have no control over the script's execution, and if something goes wrong you will have even more downtime for your users.
Do your tables have FKs defined?
Exec sp_MSFOREACHTABLE 'delete from ?'
will try to delete from every table in whatever order it decides, so you may end up with errors when it tries to delete rows that are still referenced from other tables.
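A common workaround (use with care, since sp_MSforeachtable is undocumented) is to disable all foreign key checks first, delete, and then re-enable them:
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'            -- disable FK checks
EXEC sp_MSforeachtable 'DELETE FROM ?'                                   -- delete in any order
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'   -- re-enable and revalidate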
To execute your SQL file from a stored procedure you can use xp_cmdshell. See the steps below.
First, create a batch file (C:\testApps\test.bat) and execute your SQL file from there,
e.g.
osql -S TestSQlServer -E -i C:\testApps\test.sql > C:\testApps\tlog.txt
Then add this line to your calling stored procedure:
exec xp_cmdshell 'C:\testApps\test.bat'
Execute your procedure
Please note that you will need to enable xp_cmdshell first; a typical way to do that is shown below.
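xp_cmdshell is switched on through sp_configure (this requires sysadmin or ALTER SETTINGS permission):
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE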
You can use bulk insert like this:
BULK INSERT landofbeds.dbo.SalesOrderDetail
FROM '\\computer\share\folder\neworders.txt'
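If the file uses specific delimiters or a header row, you would normally add a WITH clause; the options below are only examples, not values taken from the question:
BULK INSERT landofbeds.dbo.SalesOrderDetail
FROM '\\computer\share\folder\neworders.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)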

Using msdb stored procedures in an application database's stored procedure

This seems like it would be trivial, but I have not been able to come up with a solution to this small problem.
I am attempting to create a stored procedure in my application's database. This stored procedure just executes a job that has been set up in SSMS on the same server (this seemed to be the only way to programmatically execute these jobs).
The simple code is shown below:
USE ApplicationsDatabase
GO
CREATE PROCEDURE [dbo].[procedure]
AS
BEGIN
EXEC dbo.sp_start_job N'Nightly Download'
END
When run as is, the procedure technically gets created but cannot be executed, because sp_start_job cannot be found while the session is using ApplicationsDatabase. If I try to create the procedure again (after deleting the previously created one) with the USE changed to MSDB, it tries to create it in that system database, which I do not have permission to do. Finally, I attempted to keep the original CREATE statement but add USE MSDB within the procedure (just to reach sp_start_job), but it errors saying USE statements cannot be placed within procedures.
After pondering the issue for a while (I'm obviously no SQL database expert), I could not come up with a solution and decided to solicit the advice of my peers. Any help would be greatly appreciated, thanks!
You will have to fully qualify the path to the procedure. Of course, you can only execute it if the application has permissions.
Try this:
USE ApplicationsDatabase
GO
CREATE PROCEDURE [dbo].[procedure]
AS
BEGIN
EXEC msdb.dbo.sp_start_job N'Nightly Download'
END
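Note that the caller also needs rights inside msdb to start jobs. A hedged example of granting them (the login name is made up, and the appropriate SQLAgent role depends on who owns the job):
USE msdb
GO
CREATE USER [AppUser] FOR LOGIN [AppUser]                  -- hypothetical login
EXEC sp_addrolemember N'SQLAgentOperatorRole', N'AppUser'  -- can start any local job
GO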

Using BCP to export a stored procedure result in SQL Server 2008

I'm trying to use BCP to export a stored procedure result to a text file using this query:
EXEC xp_cmdshell 'bcp "exec asmary..usp_Contract_SelectByEmpId -1,1" queryout "C:\test.txt" -w -C OEM -t$ -T -r ~ -S heba\HEBADREAMNET '
The output of this query reports this error:
Error = [Microsoft][SQL Server Native Client 10.0][SQL Server]Incorrect syntax near the keyword 'where'.
even though I'm sure that the stored procedure "usp_Contract_SelectByEmpId" works correctly.
Has anyone faced this kind of error before?
As Lynn suggested, check your stored procedure; the issue looks like it is in there.
Also ensure that a plain SELECT works with the same bcp command, and remember that C: is the database server's local drive, not necessarily your own local drive.
If the first two items work fine, then add SET FMTONLY OFF as follows:
EXEC xp_cmdshell 'bcp "set fmtonly off exec asmary..usp_Contract_SelectByEmpId -1,1" queryout "C:\test.txt" -w -C OEM -t$ -T -r ~ -S heba\HEBADREAMNET'
I have to admit that when I tried something similar on my computer it failed with a 'Function sequence error', which I found is related to a SQL Server 2008 bug fixed in 2011.
Please note also that even without SET FMTONLY OFF everything works with the BCP library (odbcbcp.dll/odbcbcp.lib). So you can have a much more generic, ODBC-wide bcp solution if you write your own wrapper executable (for instance, in C or C++).
I also found the following at http://msdn.microsoft.com/en-us/library/ms162802.aspx
The query can reference a stored procedure as long as all tables referenced inside the stored procedure exist prior to executing the bcp statement. For example, if the stored procedure generates a temp table, the bcp statement fails because the temp table is available only at run time and not at statement execution time. In this case, consider inserting the results of the stored procedure into a table and then use bcp to copy the data from the table into a data file.
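A minimal sketch of that staging-table workaround; the table and its columns are made up for illustration and must match the procedure's result set:
-- materialize the procedure's result set into a real table first
USE asmary
GO
CREATE TABLE dbo.ContractExport (ContractId int, EmpId int, StartDate datetime)  -- hypothetical columns
INSERT INTO dbo.ContractExport
    EXEC dbo.usp_Contract_SelectByEmpId -1, 1

-- then bcp out the table instead of the procedure call
EXEC xp_cmdshell 'bcp asmary.dbo.ContractExport out "C:\test.txt" -w -T -S heba\HEBADREAMNET'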
Please see also my later, separate reply - I think the whole concept of using a stored procedure for BCP/queryout is wrong.
Try this.
DECLARE @strbcpcmd NVARCHAR(max)
SET @strbcpcmd = 'bcp "EXEC asmary..usp_Contract_SelectByEmpId -1,1" queryout "C:\test.txt" -w -C OEM -t"$" -T -S'+@@servername
EXEC master..xp_cmdshell @strbcpcmd
Sorry for flooding your question with multiple answers, but I wanted to find out how much heavier (performance-wise) the use of a stored procedure is compared to a plain SELECT, and I got some very important information from
http://social.msdn.microsoft.com/Forums/en-US/transactsql/thread/b8340289-7d7e-4a8d-b570-bec7a0d73ead/
This forced me to create another (separate) answer, because the post I refer to invalidates the whole concept.
In a few words: the stored procedure may be called several (3) times in order to figure out the structure of the result set before the actual data is fetched.
Therefore (and especially if calling from a SQL Server connection rather than a client), I think it makes a lot more sense to have a stored procedure or function that returns the SELECT statement text. You can then have another generic stored procedure or function that builds and executes the full BCP command with that statement embedded. I am pretty sure that in this case BCP can use a much better execution plan. Unfortunately, I cannot verify that in practice because of the BCP bug in SQL Server 2008 R2 that I mentioned in my previous post.
N.B. Please be careful when creating dynamic queries and escape all literal strings (i.e. double every single quote) in order to avoid the notorious SQL injection. Unfortunately, there is another pitfall: you must make sure you are not escaping your queries twice or more. A sketch of the idea follows.
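Everything in this sketch is hypothetical (table, column and object names) and untested against the 2008 R2 bug mentioned above; it only illustrates the split between "build the SELECT text" and "run bcp with it":
-- hypothetical function that builds the SELECT text for given parameters
CREATE FUNCTION dbo.fn_ContractSelectText (@EmpId int, @Status int)
RETURNS nvarchar(2000)
AS
BEGIN
    -- numeric parameters only; string parameters would need their single quotes doubled
    RETURN N'SELECT * FROM asmary.dbo.Contract WHERE EmpId = '
         + CAST(@EmpId AS nvarchar(12))
         + N' AND ContractStatus = ' + CAST(@Status AS nvarchar(12))
END
GO
-- generic wrapper that embeds any statement in a bcp queryout command
CREATE PROCEDURE dbo.usp_BcpQueryOut
    @query nvarchar(2000),
    @outfile nvarchar(260)
AS
BEGIN
    DECLARE @cmd nvarchar(4000)
    SET @cmd = 'bcp "' + @query + '" queryout "' + @outfile + '" -w -T -S ' + @@servername
    EXEC master..xp_cmdshell @cmd
END
GO
-- usage
DECLARE @sql nvarchar(2000)
SET @sql = dbo.fn_ContractSelectText(-1, 1)
EXEC dbo.usp_BcpQueryOut @sql, N'C:\test.txt'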

Can we delete the physical file from the server when I delete the corresponding entry from the database?

Can we delete the physical file from the server when I delete the corresponding entry from the database?
i.e., C:\Test\Test.txt -> when deleting this record from the database, I need to delete the corresponding Test.txt file from the mentioned location.
Is there a way? I'm using SQL Server 2008.
Any help would be highly appreciated.
Thanks
The ways are:
use the xp_cmdshell proc (exec master..xp_cmdshell 'del C:\Test\Test.txt')
use an unsafe .NET CLR proc (you need to write it in a .NET language and deploy it to SQL Server; it's a long story)
Both ways are ugly.
And once again - it is the worst practice. The server should not delete user files, or any files at all, if they are not an integral part of its database.
You could use CREATE TRIGGER ... FOR DELETE to create a trigger that runs when rows are deleted. The SQL that runs on deletion can walk through the deleted table to get the deleted rows and exec xp_cmdshell for each of them. xp_cmdshell is disabled by default, so you must enable it first using exec sp_configure.
I haven't tested this, but I think it should work; a minimal sketch follows.
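Assuming a table dbo.Files with an nvarchar FilePath column (both names are made up), the trigger could look roughly like this:
CREATE TRIGGER trg_Files_DeleteFile
ON dbo.Files               -- hypothetical table holding the file paths
FOR DELETE
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @path nvarchar(260), @cmd nvarchar(400)
    -- walk through the deleted rows and remove each file from disk
    DECLARE del_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT FilePath FROM deleted
    OPEN del_cur
    FETCH NEXT FROM del_cur INTO @path
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @cmd = 'del "' + @path + '"'
        EXEC master.dbo.xp_cmdshell @cmd, no_output
        FETCH NEXT FROM del_cur INTO @path
    END
    CLOSE del_cur
    DEALLOCATE del_cur
END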
Try writing a stored procedure which takes the filename as a parameter and deletes the file using:
exec master.dbo.xp_cmdshell 'del <Filename>'
then create an AFTER DELETE trigger on the table containing the filenames which calls the stored procedure with the filename taken from the deleted table, or maybe you can run the exec master.dbo.xp_cmdshell 'del ' command directly from the trigger.
A better way would be to save the files as objects in the database instead of file paths, and when deleting you simply delete the file object.

How do I run several .sql scripts from one query?

In SQL Server Management Studio, I want to execute a number of SQL scripts (saved queries) one after the other, simply to make each of them easier to run. I could take all the scripts and combine them into one massive script and execute the lot, but I want to keep everything separate so I can easily run each piece bit by bit.
For example, something like this:
EXEC ('CreateTable1.sql')
EXEC ('CreateTable2.sql')
EXEC ('CreateSP1.sql')
EXEC ('CreateSP2.sql')
EXEC ('SetupTestData.sql')
And that way I can run each line individually and keep everything separate.
If you like, you can run them from the command line using SQLCMD -i and put the commands into a batch script; then you can call this from SQL Server Management Studio using exec xp_cmdshell. Actually, you can be brave and run a FOR command under xp_cmdshell and do the lot in one line (see the sketch below), or perhaps just run xp_cmdshell on them one by one.
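The "brave" one-liner could look roughly like this; the server name and folder are placeholders, and a single % is used because the FOR runs directly under cmd.exe rather than in a batch file:
EXEC xp_cmdshell 'for %f in (C:\Scripts\*.sql) do sqlcmd -S MyServer -E -i "%f"'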
Or you can take the Red Gate approach and read the files into variables and then call EXEC on them. In my opinion this last approach is serious overkill if all you want to do is execute a few scripts.
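For completeness, a hedged sketch of that read-into-a-variable route in plain T-SQL; the path is a placeholder, the file must be readable by the SQL Server service account, and this will not handle GO batch separators, which are not T-SQL:
DECLARE @sql nvarchar(max)
SELECT @sql = BulkColumn
FROM OPENROWSET(BULK 'C:\Scripts\CreateTable1.sql', SINGLE_CLOB) AS src   -- assumes an ANSI-encoded file
EXEC (@sql)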