Hi, I need to run a batch file from a SQL CLP script.
The script is:
CONNECT TO MYTAB1 USER xxxx using yyyyyyy;
QUIESCE DATABASE IMMEDIATE FORCE CONNECTIONS;
CONNECT RESET;
BACKUP DATABASE MYTAB1 TO "C:\temp\bcks" WITHOUT PROMPTING;
CONNECT TO MYTAB1 USER xxxx using yyyyyyy;
UNQUIESCE DATABASE;
CONNECT RESET;
cmd.exe /c "C:\Users\xxxx\Desktop\backup_neu.bat C:\temp\bcks C:\temp\bcks\zips 7z");
It runs great until it reaches the last line.
I tried
cmd.exe /c
exec(' xp_cmdshell ''script_here');
EXEC master..xp_CMDShell '"script here "'
but nothing worked.
I have DB2 v10 running.
Any ideas on how I can get the batch file running?
Thanks for all your help.
TheVagabond
OK, I found the solution... it was really simple somehow. I just needed
!C:\Users\xxxx\Desktop\backup_neu.bat C:\temp\bcks C:\temp\bcks\zips 7z
So just a leading !, that was it.
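In other words, the tail of the working script should look something like this (same paths and arguments as above, with the ! line replacing the cmd.exe line):

UNQUIESCE DATABASE;
CONNECT RESET;
!C:\Users\xxxx\Desktop\backup_neu.bat C:\temp\bcks C:\temp\bcks\zips 7z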
Related
I tried searching for the solution here but didn't find one that solves my problem. I have the following batch script:
for /f "tokens=1-3 delims=, " %%a in ('\path\batch_output.txt') do (
echo %%a, %%b, %%c
sqlcmd -S server -E -i path\spu_update_src_trg_ref.sql -v SourceName= %%a Instancname= %%b exitcode= %%c
ping 1.1.1.1 -n 1 -w 5000 > nul
)
Inside spu_update_src_trg_ref.sql I have below code:
use dbname
go
EXEC dbo.spu_update_src_trg_ref $(SourceName), $(Instancname), $(exitcode)
I am running the batch script above via a job scheduler, so I am unable to see the direct error in the cmd window. But my job is failing and the stored proc is not getting executed. If needed, the stored proc is as below:
CREATE PROCEDURE dbo.spu_update_src_trg_ref
@SourceName VARCHAR(100),
@Instancname VARCHAR(100),
@exitcode INT
AS
BEGIN
IF @exitcode=0
BEGIN
UPDATE dbo.t_ctrm_ref_src_trg SET LoadStatus='Completed' WHERE SourceTableName=@SourceName;
UPDATE dbo.t_ctrm_instance_status SET InstanceStatus='Completed' WHERE InstanceName=@Instancname;
END
END
It's a simple stored procedure that updates two tables, but I am unable to pass the input parameters from the batch script. Please advise.
Update:
Thanks everyone for the help. I just removed some spaces and the quotes ('') from '\path\batch_output.txt' and it worked just fine. Appreciate all your help.
There are syntax errors in your sqlcmd command. Remove the spaces between the variable name, the equals sign, and the value in the "-v" portion.
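With that fixed, the sqlcmd line inside the loop would look something like this (same file and variable names as in the question):

sqlcmd -S server -E -i path\spu_update_src_trg_ref.sql -v SourceName=%%a Instancname=%%b exitcode=%%c

Depending on the data, the string values may also need quoting where they are used in the .sql file (for example '$(SourceName)'), since sqlcmd scripting variables are substituted as plain text.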
I am new to Unix shell scripting and need a suggestion.
I have a shell script which connects to the Oracle DB, fetches a value, and assigns it to the outpt variable, as shown below.
outpt=$(sqlplus Username/Pass@"(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=10.255.244.15)(Port=1674))(CONNECT_DATA=(SID=abc01)))" <<EOF
set pages 0 echo off feed off
select max(date_sent) from test;
exit;
EOF
)
In Windows we close the connection explicitly after executing the query. Does the code above do that implicitly for me, or do I have to close the connection explicitly?
If I do, please advise how.
I want to make a daily dump of all the databases in MySQL using the Event Scheduler. So far I have this query to create the event:
DELIMITER $$
CREATE EVENT `DailyBackup`
ON SCHEDULE EVERY 1 DAY STARTS '2015-11-09 00:00:01'
ON COMPLETION NOT PRESERVE ENABLE
DO
BEGIN
mysqldump -user=MYUSER -password=MYPASS all-databases > CONCAT('C:\Users\User\Documents\dumps\Dump',DATE_FORMAT(NOW(),%Y %m %d)).sql
END $$
DELIMITER ;
The problem is that MySQL does not seem to recognize the command 'mysqldump' and shows me an error like this: Syntax error: missing 'colon'.
I am not an expert in SQL and I've tried to find the solution, but I couldn't; I hope someone can help me with this.
Edit:
I would also like help making this statement run as a cron task.
For Windows, create a .bat file with the needed command, and then create a scheduled task that runs that .bat file according to a schedule.
Create a .bat file in this fashion, replacing your username, password, and database name as appropriate:
mysqldump --opt --host=localhost --user=root --password=yourpassword dbname > C:\some_folder\some_file.sql
Then go to the start menu, control panel, administrative tools, task scheduler. Hit action > create task. Go to the actions tab, hit new, browse to the .bat file and add it to the task. Then go to the triggers tab, hit new, and define your daily schedule. Refer to http://windows.microsoft.com/en-US/windows/schedule-task
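If you would rather create the scheduled task from the command line than click through the UI, something along these lines should work from an elevated prompt (the task name and .bat path are just placeholders):

schtasks /create /tn "DailyMySQLDump" /tr "C:\some_folder\mysql_backup.bat" /sc daily /st 00:01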
You might want to use a tool like 7zip to compress your backups all in the same command (7zip can be invoked from the command line). An example with 7zip installed would look like:
mysqldump --opt --host=localhost --user=root --password=yourpassword dbname | 7z a -si C:\some_folder\some_file.7z
I use this to include the date and time in the filename:
rem Build a timestamp from %date% and %time%; the substring positions depend on the regional date/time format.
set _my_datetime=%date:~-4%_%date:~4,2%_%date:~7,2%_%time:~0,2%_%time:~3,2%_%time:~6,2%_%time:~9,2%_
rem Replace spaces, slashes and dots with underscores and drop colons so the value is safe to use in a file name.
set _my_datetime=%_my_datetime: =_%
set _my_datetime=%_my_datetime::=%
set _my_datetime=%_my_datetime:/=_%
set _my_datetime=%_my_datetime:.=_%
echo %_my_datetime%
mysqldump --opt --host=localhost --user=root --password=yourpassword dbname | 7z a -si C:\some_folder\backup_with_datetime_%_my_datetime%_dbname.7z
@Drew means to use a cron job. To add a cron job, just open the crontab using this command:
crontab -e
then add a new entry at the end like this:
0 0 * * * mysqldump -u username -ppassword databasename > /path/to/file.sql
This will perform a database dump every day at 00:00.
Yes, program the scheduler to run something like this:
C:/path/to/mysqldump.exe -u username -ppassword databasename > /path/to/file.sql
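One caveat: the > redirection is handled by the command interpreter, not by mysqldump itself, so if the scheduler launches mysqldump.exe directly the dump file will not be written. Wrapping the command in cmd /c (or putting it in a .bat file as suggested above) avoids that; the paths here are the placeholders from the line above:

cmd /c "C:\path\to\mysqldump.exe -u username -ppassword databasename > C:\path\to\file.sql"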
I am trying to execute a Pentaho job on Windows through TIDAL, but TIDAL does not execute the job at all. When I run it separately at the command prompt, it executes.
Below is the command used; it does not read the parameters assigned to it.
Kindly suggest what has to be done.
E:\apps\Pentaho\data-integration\kitchen.bat /rep:Merlin_Repository /user:admin /pass:admin /dir=wwclaims /job=J-CLAIMS /level:Basic
You forgot a slash in the /dir: option, and you must use : rather than = symbols in your command (a corrected version of your command is shown after the example below).
For example, in a Windows batch script:
#echo off
SET LOG_PATHFILE=C:\logs\KITCHEN_name_of_job_%DATETIME%.log
call Kitchen.bat /rep:"name_repository" /job:"name_of_job" /dir:/foo/sub_foo1 /user:dark /pass:vador /level:Detailed >> %LOG_PATHFILE%
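Applied to the command from the question, that would be something along these lines (same repository, user, and job names as posted):

E:\apps\Pentaho\data-integration\kitchen.bat /rep:Merlin_Repository /user:admin /pass:admin /dir:/wwclaims /job:J-CLAIMS /level:Basic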
I created the following stored procedure:
CREATE PROCEDURE [dbo].[_StartWebcamStream]
AS
BEGIN
declare @command varchar(200)
set @command = 'C:\startStream.bat'
exec master..xp_cmdshell @command
END
for executing the batch file startStream.bat. This batch file contains the following code:
"C:\Program Files (x86)\VideoLAN\VLC\vlc.exe" -I dummy -vvv rtsp://mycamaddress/live_mpeg4.sdp --network-caching=4096 --sout=#transcode{vcodec=mp4v,fps=15,vb=512,scale=1,height=240,width=320,acodec=mp4a,ab=128,channels=2}:duplicate{dst=http{mux=asf,dst=:11345/},dst=display} :sout-keep}
The batch file is launched correctly, but the query continues to run until VLC is stopped.
What can I do to stop the query while letting VLC keep running?
DISCLAIMER: Don't run these examples in production!
SQL Server watches all processes spawned by xp_cmdshell and waits for them to finish. To see that, you can just run this statement:
EXEC xp_cmdshell 'cmd /C "start cmd /K dir"'
This starts a command shell that in turn starts another command shell to execute the dir command. The first shell terminates right away after spawning the second (/C switch), while the second executes the "dir" and then does not terminate (/K switch). Because of that second lingering process, the query won't return even though the process SQL Server started directly is gone. You cannot even cancel it. Even if you close the window in SSMS, the request continues to run. You can check that with:
SELECT r.session_id, t.text
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
The NO_OUTPUT parameter does not help either:
EXEC xp_cmdshell 'cmd /C "start cmd /K dir"', NO_OUTPUT;
You will see the same "sticky" behavior.
The only way to get rid of these processes is to restart the computer or kill them manually (which requires Task Manager to be run as administrator). Restarting the SQL Server service does not stop the spawned processes.
As a solution you can use SQL Server Agent. It has an "Operating System (CmdExec)" step type that you can use to run your program. You can create the job once and then start it from your procedure using sp_start_job; a rough sketch follows.
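A minimal sketch of that approach, assuming a made-up job name (the procedures are the standard msdb job API):

USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'StartWebcamStream';
EXEC dbo.sp_add_jobstep
    @job_name = N'StartWebcamStream',
    @step_name = N'Run startStream.bat',
    @subsystem = N'CmdExec',
    @command = N'C:\startStream.bat';
EXEC dbo.sp_add_jobserver @job_name = N'StartWebcamStream';
GO
-- Inside the stored procedure, replace the xp_cmdshell call with this;
-- sp_start_job returns immediately, so the query no longer waits for VLC.
EXEC msdb.dbo.sp_start_job @job_name = N'StartWebcamStream';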
Either way, your process actually needs to finish at some point. Otherwise you will create a pile of process-"bodies" that will cause performance problems in the long run.