Output of sub-processes in DOS batch script not visible in Apache

I'm running Apache 2.2 (launched via console) on Vista. I have a simple batch script in cgi-bin. Unfortunately, Apache does not seem to serve any content generated by sub-processes.
For example, given the following script:
@echo off
echo Content-Type: text/html
echo.
echo Visible in browser
cmd /c echo Hidden from browser
echo End of script
All three lines of text appear in the console if the script is executed directly from a command prompt. However, the middle line ("Hidden from browser") does not appear if the script is launched from Apache.
This script is just illustrative -- I'm actually using the batch file to launch a number of separate console-based applications (not cmd.exe).
What am I doing wrong?

I've been looking at this over at:
Pipe Java output to calling script
FWIW, all of the echo output from this C:\wamp\bin\apache\apache2.2.22\cgi-bin\testbat.bat appears in both the command window and the served web page:
@echo off
rem This works in Wampserver's Apache cgi-bin...
rem http://localhost/cgi-bin/testbat.bat
echo Content-Type: text/html
echo.
echo ^<html^>^<head^>^</head^>^<body^>
echo ^<H1^>Hello world!!!^</H1^>
echo ^<PRE^>
FOR /F "usebackq delims==" %%i IN (`dir`) do echo %%i
echo ^</PRE^>
FOR /F "usebackq delims==" %%i IN (`cmd /c echo NOT hidden!`) do echo %%i
echo ^</body^>^</html^>

Many more details aside, this behavior happens when cmd is invoked via CreateProcess() with DETACHED_PROCESS, which is what Apache does in ap_cgi_build_command() through apr_proc_create() (see the Apache 2.2.25 source code).
For some reason, the child processes spawned by cmd are also detached. The same thing happens in other situations (e.g., invoking WScript.CreateObject("WScript.Shell").Run() in the same context) which may or may not involve cmd in the background.
Unrelated lesson learned: if DETACHED_PROCESS and I/O redirection (STARTF_USESTDHANDLES) are mixed together in the same CreateProcess() call, the results can be surprising.
As far as I know, there is no solution other than avoiding batch, WSH and the like. The httpd team might look into a workaround in the future.

Maybe you need to redirect the output to your STDOUT. I haven't tried it on a Windows machine, but you could try
cmd /c echo Hidden from browser >&1
or redirect it to a temp file and call type on the file.
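For example, a minimal sketch of the temp-file variant applied to the script from the question (the %TEMP%\cgi_out.tmp name is just an arbitrary choice):
@echo off
echo Content-Type: text/html
echo.
echo Visible in browser
rem Capture the sub-process output in a temporary file, then re-emit it
rem from this script's own stdout, so the batch file itself (rather than
rem the detached sub-process) writes to the CGI output.
cmd /c echo Hidden from browser > "%TEMP%\cgi_out.tmp"
type "%TEMP%\cgi_out.tmp"
del "%TEMP%\cgi_out.tmp"
echo End of script
Whether this works under Apache depends on the same detached-process behaviour described above, so treat it as something to try rather than a guaranteed fix.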

It would work from the command line as expected, but what are the applications you are trying to run in Apache's cgi-bin? I have not heard of a batch file being used as a cgi-bin script, and that could be a potential exploit. Maybe the permissions are not set for the batch file, or there is no handler available for Apache to actually execute a batch file. Think of the module handlers used for SSL (Secure Sockets Layer), for example this snippet found in Apache's config file, httpd.conf:
LoadModule ssl_module modules/mod_ssl.so
....
<IfModule mod_ssl.c>
....
</IfModule>
This would explain why you cannot run a batch file as a cgi-bin script...

Finish batch/vbs script on logoff or shutdown

Situation:
To keep it simple: I have multiple shortcuts on a network share that start a VMware client application which connects to the console of a virtual machine. The shortcut looks something like this:
\\share\vmware-vmrc.exe -h HOST -u USER -p PASS "[LocalStorage] NAME/NAME.vmx"
But I would like to know if someone is already using this console connection, so I can avoid starting this shortcut and choose another machine. So I made a script to do this and changed the shortcut to:
C:\Windows\System32\wscript.exe \\share\start.vbs NAME "[LocalStorage] NAME/NAME.vmx"
This opens a VBScript that runs a batch file silently, so it is not visible to the user.
arg = " " & WScript.Arguments.UnNamed(0) & " """ & WScript.Arguments.UnNamed(1) & """"
CreateObject("Wscript.Shell").Run "client.cmd" & arg ,0,False
This opens client.cmd, which launches the VMware client, connects to the given virtual machine, and writes a logfile to record which user has the virtual machine in use.
@echo off
pushd "%~dp0"
set TIMESTAMP=%TIME:~0,2%:%TIME:~3,2%
set CLIENT=%1
:start
IF EXIST logs\%client% goto msg
echo %username% since %TIMESTAMP% on %DATE% >logs\%client%
start /wait c:\vmwareclient\vmware-vmrc.exe -h HOST -u USER -p PASS %2
del logs\%client%
goto exit
:msg
for /f "delims=" %%x in ('type logs\%client%') do set "type=%%x"
echo wscript.quit MsgBox ("%client% is in use by %type%. Do you like to continue?", 4, "%client% is in use") > yesno.vbs
wscript //nologo yesno.vbs
set value=%errorlevel%
del yesno.vbs
if "%value%"=="6" start /wait c:\vmwareclient\vmware-vmrc.exe -h HOST -u USER -p PASS %2
if "%value%"=="7" goto exit
goto exit
:exit
popd
The variable %client% contains the first parameter given in the shortcut; it represents NAME.
This all works fine, but there is one problem: at "start /wait ..." the script waits until the client application is closed. However, when the user logs off or shuts down Windows, the script is force-closed and never gets to delete the logfile. That makes the logs unreliable, because users who logged off or force-closed the script are still present in the logfile as if the virtual machine were still in use.
Problem:
"start /wait" waits till the application is closed, but when a user shuts down windows or logs off the script is force closed and doesn't continue.
Question:
Is there a way to detect such a force close in batch? (probably not)
Do I need to switch to VBScript or another program/scripting language to accomplish this?
The best option is not to check for file existence, but to hold a lock on the file. If the file is locked, it is in use. If it is not locked, you are in the scenario you described, with a stale lock file from an old connection. Start the following code from two consoles to see how it works.
What it does is redirect the output of stream 3 (0=stdin, 1=stdout, 2=stderr, 3-9=for your use) to a lock/log file. This file will have a lock on it (as it is being written to) while the code in the parenthesized block is running. Another instance of the batch file trying to run will not be allowed to open the lock file for writing, so the block will not be executed. This case is checked via an environment variable that is assigned a value inside the block. If it has a value, the block was executed; if it has no value (not defined), the block has not been executed (the log file is locked by another process).
Stream 3 is used instead of the usual output redirection to allow the code inside the block to echo information to the console, but that is not required.
@echo off
rem Prepare environment
setlocal enableextensions
rem The file used as flag
set "file=client.log"
rem The variable used to test if we got the lock
set "started="
((
rem Mark the process as started
set "started=yes"
rem echo data to screen
echo User %username% will hold the lock
rem save data to log file. It will be on output stream 3
echo %username% >&3
rem Simulate the process to work
pause >nul
rem Redirect output stream 3 to our lock/log file
rem Send stderr to nul to not show errors (ex. file lock fail)
) 3> "%file%" ) 2>nul
rem Check if the process got the lock and has ended
rem OR it couldn't get the lock
if not defined started (
<"%file%" set /p "lockedBy="
setlocal enabledelayedexpansion
echo(Process is locked by !lockedBy!
endlocal
)
endlocal
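Applied to the client.cmd from the question, a minimal sketch of the same idea might look like this (HOST, USER, PASS and the vmware-vmrc path are the placeholders from the question; the message-box branch of the original script is left out for brevity, and the logs\ directory is assumed to exist as in the original):
@echo off
setlocal enableextensions
pushd "%~dp0"
set "file=logs\%~1"
set "started="
((
rem We got the lock: record the user, then hold the lock while the client runs
set "started=yes"
echo %username% since %TIME:~0,8% on %DATE% >&3
start /wait c:\vmwareclient\vmware-vmrc.exe -h HOST -u USER -p PASS %2
) 3> "%file%" ) 2>nul
if not defined started (
<"%file%" set /p "lockedBy="
setlocal enabledelayedexpansion
echo(%~1 is in use by !lockedBy!
endlocal
)
popd
endlocal
The logfile no longer has to be deleted at the end: when the user logs off and the script is killed, the lock is released automatically, so a stale file is not mistaken for an active session.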

Batch File... Close openfiles

I searched for 2 days. Now I am here and hope to get help.
I am trying to "kill (close)" a file automatically.
For example:
@echo off
taskkill /IM Acrord32.exe
That works fine. The only problem is that it closes every PDF file, but I only want to close one specific file.
Maybe through its path.
So, I hope someone can help me.
(It does not have to be a batch script that closes this PDF file; it can also be a tool or something else.) But it has to close the file AUTOMATICALLY.
I hope you understand me; I am not a native English speaker.
Thanks so far. M.L.
For that, you would need to "tell" Acrobat Reader to close the file. But AFAIK Acrobat Reader doesn't expose any API for that kind of remote control.
The only choice I see is to send Ctrl+F4 to Reader, but that is not possible from the command line (at least with standard commands).
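Not a standard-command solution, but if you want to experiment with the keystroke idea, one sketch is to have the batch file generate and run a small Windows Script Host helper. The window title "file.pdf - Adobe Reader" is an assumption and has to match the document you actually want to close:
@echo off
rem Generate a throw-away WSH helper that activates the Reader window
rem for the given document and sends it Ctrl+F4 to close that file.
set "vbs=%TEMP%\closepdf.vbs"
> "%vbs%" echo Set sh = CreateObject("WScript.Shell")
>> "%vbs%" echo If sh.AppActivate("file.pdf - Adobe Reader") Then sh.SendKeys "^{F4}"
cscript //nologo "%vbs%"
del "%vbs%"
This only works while the target window exists and can be brought to the foreground, so it is fragile compared to a real API.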
I do not know exactly what you mean, but you are asking for a script that closes the Acrord32 process automatically. My script does that, but only for processes that have been "suspended" (hung) and put too much demand on the processor.
There is no need for any changes to the registry, parameters or Acrobat Reader settings. The batch script is prepared by default to automatically kill the Acrord32.exe process, and it was designed for that purpose, but it can also be used to close any other process that burdens the system and has hung, by calling the script with the appropriate parameter. If this parameter contains a long name with a space, you must enclose the parameter in quotation marks. Several settings can be adjusted at the beginning of the script, for example the idle time, the re-check interval, and where the report (LOG) is to be created. The script closes all processes that meet the criteria for any user (except system processes, which cannot be closed). It can be useful on a server where many users work. The script is optimized to put as little load on the processor as possible.
Thank You
@echo off
REM Automatically close Acrobat Reader, or the process named in the first parameter, when it uses too much CPU time
REM Prepared by: Artur Zgadzaj
REM ---------------------------------------------------------------------------------------------
SET REPEAT_TIME_VERIFICATION_[seconds]=7
SET IDLE_TIME_[seconds]=5
SET LOG_FOLDER=C:\UTIL\LOG
REM # # # # CHECKING OR IS STARTED AS ADMINISTRATOR # # # # #
FSUTIL | findstr /I "volume" > nul&if not errorlevel 1 goto Administrator_OK
cls
echo ************************************
echo *** RUN AS ADMINISTRATOR ***
echo ************************************
echo.
echo.
echo Call up just as the Administrator. Abbreviation can be done to the script and set:
echo.
echo Shortcut ^> Advanced ^> Run as Administrator
echo.
echo.
echo Alternatively, a single run "Run as Administrator"
echo or in the Schedule tasks with highest privileges
pause > nul
goto:eof
:Administrator_OK
SET WD=day
if "%~1"=="" (SET Close_Process=AcroRd32.exe) else (SET "Close_Process=%~1")
MD %LOG_FOLDER% 2>NUL
Setlocal EnableDelayedExpansion
:again
cls
echo Automatically closing %Close_Process% processes that put too much load on the processor ...&echo.&echo.
FOR /F "tokens=2,7,8 delims=," %%A IN ('%SystemRoot%\System32\tasklist.exe /v /FO CSV^|find /I ^"%Close_Process%^"') DO (
SET PROC=%%C
SET PROC=!PROC:"=!
FOR /F "tokens=2,3 delims=:" %%s IN ("!PROC!") DO (SET PR=%%t
if "!PR:~0,1!"=="0" (SET /A PROC_TIME=%%s*60+!PR:~1,1!) else (SET /A PROC_TIME=%%s*60+!PR:~0,2!))
if !PROC_TIME! GTR %IDLE_TIME_[seconds]% (
SET PID=%%A
SET PID=!PID:"=!
%SystemRoot%\system32\taskkill.exe /PID !PID! /F
SET B=%%B
SET B=!B:%USERDOMAIN%\=!
SET B=!B:%COMPUTERNAME%\=!
SET Process_User=!B:"=!
if not "!DATE_WD!"=="%DATE%" ((FOR /F "tokens=1" %%W IN ('POWERSHELL GET-DATE -format dddd') DO SET WD=%%W)&&SET DATE_WD=%DATE%)
echo %TIME:~0,8% ^(Hanging: !PROC:~-5!^) !Process_User! >>"%LOG_FOLDER%\%DATE:-=.% ^(!WD:~0,3!^) Close_%Close_Process%.TXT"
)
)
TIMEOUT /T %REPEAT_TIME_VERIFICATION_[seconds]% > nul
goto again
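Usage, assuming the script above is saved as CloseHung.cmd (the file name is just an example) and started from an elevated prompt or a scheduled task with highest privileges:
rem Default target is AcroRd32.exe when no parameter is given
CloseHung.cmd
rem Any other process; quote the image name if it contains spaces
CloseHung.cmd "some long named app.exe"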

How to stop a process automatically via a batch script

How can I check if a process is running from a batch/cmd file? And if the process is running, how do I stop it automatically?
Like a cmd script thingy; can someone give me some hints or help me make that kind of script?
Example (pseudo code):
If calc.exe is running
Stop calc.exe
I want something like this:
@echo off
:x
PATH=C:\Windows\System32\calc.exe
If calc.exe is ON goto Process_Stop
:Process_Stop
net stop calc.exe
goto x
First off, you have the wrong command to stop a process like calc.exe. NET STOP will stop a service, but not a normal program process. You want TASKKILL instead.
Second, if you know you want to kill the process if it exists, then why do you think you have to check whether it is running first? You can simply attempt to kill the process regardless. If it doesn't exist, an error is generated and no harm is done. If it does exist, it is killed with a success message.
taskkill /im calc.exe
If you don't want to see the error message, then redirect stderr to nul using 2>nul. If you don't want to see the success message either, then redirect stdout to nul using >nul.
taskkill /im calc.exe >nul 2>nul
If you want to take action depending on the outcome, then you can use the conditional && and || operators for success and failure.
taskkill /im calc.exe >nul 2>nul && (
echo calc.exe was killed
) || (
echo no process was killed
)
Or you could use the ERRORLEVEL to determine what to do depending on the outcome.
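For example, taskkill sets a non-zero ERRORLEVEL when it did not kill anything:
taskkill /im calc.exe >nul 2>nul
if errorlevel 1 (
echo no process was killed
) else (
echo calc.exe was killed
)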
A simpler and more readable command is something like this:
taskkill /F /FI "IMAGENAME eq calc.exe"
This approach never returns an error if the process isn't running, whereas the /IM switch WILL return an error if the process is not running. You can also use wildcards, for example:
taskkill /F /FI "IMAGENAME eq calc*.exe"
to kill any process whose name starts with 'calc' and ends with '.exe'.
Check out the tasklist command. It gives you a list of the tasks and services that are running. Check whether the task you are interested in is running, for example with a regular expression.
Then use the taskkill command to kill the service or task.
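A minimal sketch of that approach (calc.exe is just an example image name):
@echo off
rem tasklist still succeeds and prints an INFO line when nothing matches,
rem so findstr is used to check whether the process really appears.
tasklist /FI "IMAGENAME eq calc.exe" | findstr /I /B "calc.exe" >nul
if not errorlevel 1 (
taskkill /IM calc.exe
) else (
echo calc.exe is not running
)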

XCOPY seems to drain redirected input

I am trying to write tests for some of my scripts. I am redirecting input from a file that contains input for a specific test case. A few of my scripts use xcopy. What I have noticed is that xcopy drains the redirected input even though I use the /Y option that suppresses prompting for confirmation.
Here's a script to produce this:
@ECHO OFF
SETLOCAL
SET some_info=
SET /p some_info=Please provide info:
ECHO.
ECHO Your input was:%some_info%
xcopy /Y some_existing_file.txt some_other_existing_file.txt
SET some_info=
SET /p some_info=Please provide info:
ECHO.
ECHO Your input was:%some_info%
SET some_info=
SET /p some_info=Please provide info:
ECHO.
ECHO Your input was:%some_info%
ENDLOCAL
and here's the input:
info 1
info 2
info 3
Note that files some_existing_file.txt and some_other_existing_file.txt both exist as their name suggests.
I noticed that copy does not drain redirected input, but unfortunately I am using the /EXCLUDE option of xcopy a lot, and replacing it with copy would require many changes to my scripts.
Have you noticed this behaviour of xcopy? Is there a way to avoid it without making significant changes to my scripts? If you have an alternative suggestion on how to automatically test batch scripts please let me know.
This works here:
<nul xcopy /Y some_existing_file.txt some_other_existing_file.txt
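So in scripts that also use /EXCLUDE, the same prefix should still work; exclude.txt here is only a placeholder for an existing exclude list:
<nul xcopy /Y /EXCLUDE:exclude.txt some_existing_file.txt some_other_existing_file.txt
Feeding xcopy from nul keeps it from consuming the redirected test input, so the following SET /P lines still read their intended values.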

Cron Job - Could not open input file:

I have set up a PHP file to run that just echoes hello.
<?php
echo "hello";
?>
My cron job looks like this:
/usr/local/bin/php -f “/home/username/public_html/mls/test.php”
When my script runs, I get a confirmation email that says:
Could not open input file: /home/username/public_html/mls/test.php
I don't know what is causing this. I am using GoDaddy's virtual private server with cPanel X installed. I have used SSH to set permissions 777 on the folder and file and still cannot get it to run.
Any advice would be helpful. Thanks.
For some reason PHP cannot open the file. Try replacing /usr/local/bin/php -f with "ls -la" to try to crib some more information. Remember to NOT quote the file name in the crontab: php -f filename.php, not php -f "filename.php", unless it contains spaces -- and then it's better to use single quotes.
Possibly, try "ls -la /home", "ls -la /home/username", "ls -la ~/public_html" and so on.
Also try appending
2>&1
to the command line, in case only stdout is mailed to you (I don't really think so, but being sure costs little).
One other possibility:
The crontab as it stands refers to /home/username/public_html/mls/test.php, that is, a public HTML directory inside the most common location for username's home directory.
It is possible that the cron job is either not running with the appropriate user and privileges, or that the user it "sees" is actually a virtual user - there is no "/home/username" at all - and the "home directory" is elsewhere, possibly existing only as long as the cron job runs. In this case the solution might be to refer to
~/public_html/mls/test.php
or, as described above, to first run a command such as pwd or ls -la to determine exactly where the cron job's current working directory is.
If this, too, fails, then another workaround could be to invoke the PHP HTTP handler via curl or lynx:
/usr/bin/curl http://www.thishostname.com/mls/test.php
Possibly using an environment variable, a curl header or a _GET parameter to authenticate the cron job to the script, and to keep it from being accessible from the outside.