I am trying to run an SSIS package from a stored procedure, without using a job, so that multiple people can kick off the package at once without hitting the "job is still in use" error. I've used the create_execution and start_execution procedures in the SSIS catalog database, which I have working, but I also need the package to inherit the permissions of the person calling the procedure so that it can move some files during its execution.
So far I'm using SELECT suser_name() to get the login name of the person calling the procedure, but I'm not sure where to use it next. Will EXECUTE AS follow through into the SSIS package if I run the start_execution procedure under it?
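Roughly what I have so far (a minimal sketch; the folder, project, and package names are placeholders):

    DECLARE @execution_id BIGINT;

    -- Create an execution for the package; the names below are placeholders.
    EXEC SSISDB.catalog.create_execution
          @folder_name     = N'MyFolder',
          @project_name    = N'MyProject',
          @package_name    = N'MyPackage.dtsx',
          @use32bitruntime = 0,
          @execution_id    = @execution_id OUTPUT;

    -- SYNCHRONIZED = 0 so the call returns immediately and multiple callers don't block each other.
    EXEC SSISDB.catalog.set_execution_parameter_value
          @execution_id,
          @object_type     = 50,
          @parameter_name  = N'SYNCHRONIZED',
          @parameter_value = 0;

    EXEC SSISDB.catalog.start_execution @execution_id;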
I have a huge amount of data in my tables, and extracting results from it with a stored procedure takes nearly 60 minutes.
Until it's done, my application is frozen, waiting.
My question is: I need to call the stored procedure from C# code, and I do not need any output from it.
All of the logic is handled in the backend; once it's done, an email is sent.
I only need the stored procedure to start executing and then alert the user that the mail will arrive once the procedure has finished.
Please describe how to run that procedure in the background without keeping my application, and the user, waiting.
One possible solution which I have used successfully is to create a SQL Agent job that runs your stored procedure.
From your application you then just start the Agent job using sp_start_job, which returns immediately; SQL Agent then runs your procedure asynchronously with no additional involvement from your application.
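For example, assuming the job you created is named 'RunLongReport' (a placeholder name), the call from your code is just:

    -- Request the job start; this returns as soon as SQL Agent accepts the request,
    -- and the job then runs asynchronously on the server.
    EXEC msdb.dbo.sp_start_job @job_name = N'RunLongReport';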
My application actually calls a stored procedure, which in turn calls an SSIS package and executes it. Initially we were doing this using xp_cmdshell and dtexec, but the client does not allow xp_cmdshell. Is there any other way to do this?
Please help.
This scenario is discussed in the SSIS documentation. Another option would be a CLR procedure that runs the package, so instead of T-SQL/xp_cmdshell you use CLR and the DTS runtime object model.
You could set up a job that runs the package and then use sp_start_job to execute the job.
http://msdn.microsoft.com/en-us/library/ms403355.aspx
You can use these commands to create the job programmatically:
http://msdn.microsoft.com/en-us/library/ms181153.aspx
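A minimal sketch of doing that in T-SQL; the job name, package path, and server name are placeholders:

    USE msdb;
    GO

    -- Create a job with a single SSIS step (names and package path are placeholders).
    EXEC dbo.sp_add_job       @job_name = N'RunMyPackage';

    EXEC dbo.sp_add_jobstep   @job_name  = N'RunMyPackage',
                              @step_name = N'Execute package',
                              @subsystem = N'SSIS',
                              @command   = N'/SQL "\MyFolder\MyPackage" /SERVER "(local)"';

    EXEC dbo.sp_add_jobserver @job_name = N'RunMyPackage', @server_name = N'(local)';

    -- Start it; the call returns immediately and the package runs under SQL Agent.
    EXEC dbo.sp_start_job     @job_name = N'RunMyPackage';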
I have an application that needs to run at the end of a series of database jobs in SQL Server 2005. The application will do processing on the data that was created by these jobs.
What would be the best way to trigger the execution of this application?
Depending on the type of application: if it's non-interactive, then you can write a CLR stored procedure that can be executed like any other stored procedure call.
If you don't mind having the application run on the database server, just add a new job step and execute it as part of the job. The step type is "Operating system command (CmdExec)".
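For example, a rough sketch of adding such a step with sp_add_jobstep (the job name and command line are placeholders):

    -- Append a CmdExec step to the existing job; job name and command line are placeholders.
    EXEC msdb.dbo.sp_add_jobstep
          @job_name          = N'NightlyLoad',
          @step_name         = N'Run post-processing app',
          @subsystem         = N'CmdExec',
          @command           = N'C:\Apps\PostProcess.exe',
          @on_success_action = 1;  -- 1 = quit the job reporting success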
I want to do something like:
exec sproc1 and sproc2 at the same time
when they are both finished, exec sproc3
I can do this in DTS.
Is there a way to do it in Transact-SQL?
Or is there a way to do it with a batch script (e.g. VBScript or PowerShell)?
You could create a CLR Stored Procedure that (using C#) would call the first two on their own threads, and then block until both are complete... then run the third one.
Are you able to use CLR sprocs in your situation? If so, I'll edit this answer to have more detail.
sp_start_job
I'm doing a similar thing at the moment, and the only way I've found to avoid using SSIS or some external shell is to split my load routine into 'threads' manually, and then fire a single master SQL Agent job which in turn executes as many sp_start_job calls as I have threads. From that point, they all run autonomously.
It's not exactly what we're looking for, but the result is the same. If you test the job status of the sub-jobs, you can implement your conditional start of sproc3 as well (see the sketch below).
What's the point in 8 cores if we can't use them all at once?
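A rough sketch of that approach, assuming the two sub-jobs are named 'Load_Thread1' and 'Load_Thread2' (placeholder names):

    -- Fire both loader jobs; sp_start_job returns immediately.
    EXEC msdb.dbo.sp_start_job @job_name = N'Load_Thread1';
    EXEC msdb.dbo.sp_start_job @job_name = N'Load_Thread2';

    -- Poll until neither job has an open execution, then run the final procedure.
    WHILE EXISTS (
        SELECT 1
        FROM msdb.dbo.sysjobactivity AS a
        JOIN msdb.dbo.sysjobs        AS j ON j.job_id = a.job_id
        WHERE j.name IN (N'Load_Thread1', N'Load_Thread2')
          AND a.start_execution_date IS NOT NULL
          AND a.stop_execution_date  IS NULL
    )
    BEGIN
        WAITFOR DELAY '00:00:10';  -- check every 10 seconds
    END

    EXEC dbo.sproc3;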
Do you absolutely need both SPs to be executed in parallel?
With simple CRUD statements within a single SP, I've found SQL Server does a very good job of determining which of them can be run in parallel, and does so. I've never seen SQL Server run two SPs in parallel when both are called sequentially from a T-SQL batch; I don't even know if that's possible.
Now then, does DTS really execute them in parallel? It could be that it simply executes them sequentially, then calls the third SP after the last one finishes successfully.
If it really does run them in parallel, you should probably stick with DTS, but then I'd like to know what it does if I have a DTS package call, say, 10 heavy-duty SPs in parallel... I may have to do some testing to learn that myself :D
You can use SSIS. The benefit of this is that the package can be stored in SQL Server and easily scheduled there.
From PowerShell or just about any external scripting language, you can use the SQL command-line tools osql or sqlcmd. This technique can also be used to schedule it on the SQL Server side by shelling out with xp_cmdshell.
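For instance, a sketch of the xp_cmdshell variant (server, database, and procedure names are placeholders, and xp_cmdshell has to be enabled on the instance):

    -- Shell out to sqlcmd from T-SQL; all names below are placeholders.
    EXEC master.dbo.xp_cmdshell
         'sqlcmd -S (local) -d MyDatabase -Q "EXEC dbo.sproc1" -b';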