Query remains active while calling a batch file from a stored procedure in SQL Server 2008 - sql

I created the following stored procedure:
CREATE PROCEDURE [dbo].[_StartWebcamStream]
AS
BEGIN
    declare @command varchar(200)
    set @command = 'C:\startStream.bat'
    exec master..xp_cmdshell @command
END
to execute the batch file startStream.bat. The batch file contains the following code:
"C:\Program Files (x86)\VideoLAN\VLC\vlc.exe" -I dummy -vvv rtsp://mycamaddress/live_mpeg4.sdp --network-caching=4096 --sout=#transcode{vcodec=mp4v,fps=15,vb=512,scale=1,height=240,width=320,acodec=mp4a,ab=128,channels=2}:duplicate{dst=http{mux=asf,dst=:11345/},dst=display} :sout-keep}
The batch file is launched correctly, but the query keeps running until VLC is stopped.
What can I do to stop the query while leaving VLC running?

DISCLAIMER: Don't run these examples in production!
SQL Server watches all processes spawned by xp_cmdshell and waits for them to finish. To see this, just run the following statement:
EXEC xp_cmdshell 'cmd /C "start cmd /K dir"'
This starts a command shell that in turn starts another command shell to execute the dir command. The first shell terminates right away after spawning the second (the /C switch), while the second executes dir and then does not terminate (the /K switch). Because of that second lingering process, the query won't return even though the process SQL Server started directly is gone. You cannot even cancel it: even if you close the window in SSMS, the request continues to run. You can check that with
SELECT r.session_id, t.text
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
The NO_OUTPUT parameter does not help either:
EXEC xp_cmdshell 'cmd /C "start cmd /K dir"', NO_OUTPUT;
You will see the same "sticky" behavior.
The only way to get rid of these processes is to restart the computer or kill them manually (which requires Task Manager to be run as administrator). Restarting the SQL Server service does not stop the spawned processes.
As a solution you can use SQL Server Agent. It has an "Operating System (CmdExec)" step type that you can use to run your program. You can create the job and then start it using sp_start_job.
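A minimal sketch of that approach (the job name StartWebcamStream is made up here; adjust the command to your environment). Because sp_start_job only queues the job and returns immediately, the calling query is not blocked while the batch file runs:
USE msdb;
-- Create a one-step job whose step type is CmdExec:
EXEC dbo.sp_add_job @job_name = N'StartWebcamStream';
EXEC dbo.sp_add_jobstep
    @job_name = N'StartWebcamStream',
    @step_name = N'Run batch file',
    @subsystem = N'CMDEXEC',
    @command = N'C:\startStream.bat';
EXEC dbo.sp_add_jobserver @job_name = N'StartWebcamStream';
-- From the stored procedure, fire and forget:
EXEC msdb.dbo.sp_start_job @job_name = N'StartWebcamStream';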
Either way, your process actually needs to finish at some point. Otherwise you will create a pile of process-"bodies" that will cause performance problems in the long run.

Related

How to catch ctrl-c behavior in DB2 stored procedure

I am using DB2 11.5.
I have a stored procedure that runs some complex tasks.
Before running the tasks, it first checks a log table to see whether the job is already running; if so, it signals SQLSTATE 75002 with an error message.
If the job is not already running, it inserts a record for it with status RUNNING, then runs the tasks.
When it finishes, it updates the status to FINISHED.
CREATE OR REPLACE PROCEDURE WORK.TEST_SP()
P1: BEGIN
  if exists(select 1 from db2inst1.job_log where job_name='abc' and job_status='RUNNING' and job_date=current date) then
    SIGNAL SQLSTATE '75002' SET MESSAGE_TEXT = 'Job abc is already running, please wait for it to finish';
  end if;
  insert into db2inst1.job_log values ('abc', 'RUNNING', current date);
  commit;
  -- Some complex tasks here
  call dbms_lock.sleep(120);
  update db2inst1.job_log set job_status='FINISHED' where job_name='abc' and job_date=current date;
  commit;
END P1
My question is: how do I handle SIGINT when the user presses Ctrl-C and aborts the stored procedure while the complex tasks are running?
I want it to update the job status to ABORTED when Ctrl-C occurs, so that the job is not left "running" forever.
Edit 1:
Users run the stored procedure with a Windows .bat file on a local machine with the Db2 client installed.
@echo off
@if "%DB2CLP%"=="" db2cmd /c /i /w "%0" && goto :EOF
db2 connect to mydb user db2inst1 using abc123
db2 "call WORK.TEST_SP()"
IF ERRORLEVEL 1 (echo Job failed) else (echo Job done)
db2 connect reset > nul
pause
If your MS-Windows batch file gets interrupted by a Ctrl-C or another signal, any already started stored procedures it invoked will continue running by default; the stored procedure is unaware that the client application has terminated. So your batch (cmd/bat) file will terminate, but any currently running stored procedure will continue to execute on the Db2-server.
You cannot send operating-system signals directly to a Db2-LUW stored procedure, as they run on the Db2-server in the background and are usually owned by a different account than the userid performing the call.
Your stored procedure should have its own condition handlers, exit handlers, or undo handlers. Usually you want to issue a rollback if a hard error happens from which your procedure itself cannot recover, although Db2 itself will issue a rollback for specific sqlcodes (e.g. -911).
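As an illustration only (this is not from the original post, and as the next answer notes it will not fire if the caller is forced off): an EXIT handler inside WORK.TEST_SP could mark the job ABORTED when an error is raised inside the procedure itself. The duplicate-run check is omitted here so that its 75002 SIGNAL is not itself caught by the handler:
CREATE OR REPLACE PROCEDURE WORK.TEST_SP()
P1: BEGIN
  -- Runs for errors raised inside the procedure; it does NOT run when the
  -- caller is forced off the database or the activity is cancelled.
  DECLARE EXIT HANDLER FOR SQLEXCEPTION
  BEGIN
    UPDATE db2inst1.job_log SET job_status = 'ABORTED'
      WHERE job_name = 'abc' AND job_date = CURRENT DATE;
    COMMIT;
    RESIGNAL;  -- re-raise the original error for the caller
  END;
  INSERT INTO db2inst1.job_log VALUES ('abc', 'RUNNING', CURRENT DATE);
  COMMIT;
  CALL dbms_lock.sleep(120);  -- stand-in for the complex tasks
  UPDATE db2inst1.job_log SET job_status = 'FINISHED'
    WHERE job_name = 'abc' AND job_date = CURRENT DATE;
  COMMIT;
END P1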
Db2-LUW also has a sysproc.cancel_work procedure which an application might use in specific situations. Refer to the Knowledge Centre for details. If WLM (workload management) or equivalent is enabled then stored procedures are subject to its configuration as regards resource consumption, and WLM also offers a wlm_cancel_activity routine.
There is no way to do this in the SP.
Control is not passed to an exception handler defined in the SP when the caller is forced off the database, when the activity is cancelled, or under some other conditions (a full transaction log, for example).
So don't put any flag / status management logic into SP exception handlers.
How is the stored procedure run? From the command line (db2)? If so, on what operating systems?
If, for instance, the command is run from Bash on Linux, you can use trap myfunc SIGINT in Bash to run a custom Bash function myfunc when the user presses Ctrl-C. myfunc could then change the job status.
On Windows, you will have more control if you switch from plain .bat files to PowerShell. Some related Stack Overflow questions:
batch script if user press Ctrl+C do a command before exiting
Gracefully stopping in Powershell

Calling batch file from SQL CLP Script

Hi, I need to run a batch file from a SQL CLP script.
The script is:
CONNECT TO MYTAB1 USER xxxx using yyyyyyy;
QUIESCE DATABASE IMMEDIATE FORCE CONNECTIONS;
CONNECT RESET;
BACKUP DATABASE MYTAB1 TO "C:\temp\bcks" WITHOUT PROMPTING;
CONNECT TO MYTAB1 USER xxxx using yyyyyyy;
UNQUIESCE DATABASE;
CONNECT RESET;
cmd.exe /c "C:\Users\xxxx\Desktop\backup_neu.bat C:\temp\bcks C:\temp\bcks\zips 7z");
It runs great until it reaches the last line.
I tried
cmd.exe /c
exec(' xp_cmdshell ''script_here');
EXEC master..xp_CMDShell '"script here "'
but nothing worked.
I have DB2 v10 running.
Any ideas on how I can get the batch file running?
Thanks for all your help.
TheVagabond
OK, I found the solution.
It is really simple: the line just needs to be
!C:\Users\xxxx\Desktop\backup_neu.bat C:\temp\bcks C:\temp\bcks\zips 7z
In a Db2 CLP script, a line starting with ! is passed to the operating-system shell, so only a leading ! was needed.

Using CMD to fire SQL rename stored procedure returns warning, stops script

I have a .BAT file that executes a few commands. Everything works fine until the SP_Rename; I get the following message returned:
Caution: Changing any part of an object name could break scripts and stored procedures.
And the next line in the CMD prompt window shows 1> with a cursor. It should be executing the next line of code, but it does not.
Any ideas on how to get around this?
The message raised is the one shown by SELECT * FROM sys.messages WHERE message_id = 15477.
You could get the procedure's text via EXEC sp_helptext sp_rename,
remove the line raiserror(15477,-1,-1), and create a new procedure, e.g. sp_rename_no_alert, to use for your batch operations.
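A rough sketch of that workaround (sp_rename_no_alert is just the name suggested above; copying and editing a system procedure is at your own risk):
-- 1. Print the source of sp_rename and copy it into the editor:
EXEC sp_helptext 'sp_rename';
-- 2. In the copied text, rename the procedure in the CREATE PROCEDURE line to
--    dbo.sp_rename_no_alert and delete the line: raiserror(15477,-1,-1)
-- 3. Run the edited script, then call the copy from the batch job:
EXEC dbo.sp_rename_no_alert 'dbo.OldName', 'NewName', 'OBJECT';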
How is your .BAT file structured? Is it a bunch of separate SQLCMD invocations one after the other, and does it stop after the first one by leaving you at the "1>" prompt?
If that's the case, I don't think the caution message is what is causing SQLCMD to leave you at the prompt. Use the -Q (uppercase) parameter instead of -q to run your query: lowercase -q runs the query and then leaves you at the prompt waiting for more input, while uppercase -Q exits once the query completes.
Example:
SQLCMD -Q "EXEC sp_rename 'old_name', 'new_name'"

Running a Pentaho command from a scheduler (Tidal)

I am trying to execute a Pentaho job on Windows through Tidal, but Tidal does not execute the job at all, even though the same command executes fine when I run it separately at the CMD prompt.
The command used is below; it does not read the parameters assigned to it.
Kindly suggest what has to be done.
E:\apps\Pentaho\data-integration\kitchen.bat /rep:Merlin_Repository /user:admin /pass:admin /dir=wwclaims /job=J-CLAIMS /level:Basic
You forgot a slash in the /dir: option, and you must use : rather than = in your command, i.e. /dir:/wwclaims and /job:J-CLAIMS.
For example, in a Windows batch script:
@echo off
SET LOG_PATHFILE=C:\logs\KITCHEN_name_of_job_%DATETIME%.log
call Kitchen.bat /rep:"name_repository" /job:"name_of_job" /dir:/foo/sub_foo1 /user:dark /pass:vador /level:Detailed >> %LOG_PATHFILE%

Executing a stored proc inside a script file using sqlcmd -i

Need some help with script files.
I have an SQL script file in the following format:
Begin tran
insert..
select..
update..
Commit
exec linked_server.db1.dbo.storedproc1
I am calling the above script file from within a .js file in the following manner:
var sCommand = "sqlcmd -i C:\\scriptfile1"
var WshShell = new ActiveXObject("WScript.Shell");
var oExec = WshShell.Exec(sCommand);
When I run the .js file, the code between the tran and commit gets executed, but storedproc1 is never called. I know for sure that storedproc1 is not called because it contains a list of insert statements whose results never show up in the table.
Have you tried running the exec storedproc1 alone? Maybe it throws an error.
You can also try adding go, like this (go is a batch separator recognized by sqlcmd rather than by the server, so the statements before it are sent as their own batch):
commit
go
exec storedproc1
You can try this in Management Studio first. After you are sure it works there, go on to run it through sqlcmd.
Edit: next, you can check the permissions of the user running the script, i.e. whether it is allowed to run the stored procedure.
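A quick way to check that last point from T-SQL (a hedged sketch; run it in the database that owns the procedure, with storedproc1 taken from the question):
SELECT HAS_PERMS_BY_NAME('dbo.storedproc1', 'OBJECT', 'EXECUTE') AS can_execute;
-- 1 = the current user may execute it, 0 = permission is missing, NULL = object not found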