Perl Apache script runs from browser - performs as expected, closes running perl instances, but does nothing when trying to launch a new perl instance - apache

I have a server running Perl and an Apache web server.
I wrote a script which closes running instances of perl.exe and then launches them again, using some system() commands.
When I try to run it from a browser it works as expected and closes all running perl.exe, but then it doesn't restart them with my system("start my_script.pl").
This is my script running from the browser.
#!/Perl/bin/perl
use lib "/Perl/ta/mods" ;
# http://my_domain.com/cgi-bin/myscript.pl?
use CGI::Carp qw(fatalsToBrowser);
use IPC::System::Simple qw(system capture);
use Win32::Process::List;
my $script_to_end = "start \\path_to_script\\myscript.pl" ; # backslashes must be doubled inside double quotes
system($script_to_end);
print "done" ;
exit;
This launches myscript.pl, which does the following:
#!/Perl/bin/perl
use strict;
use warnings;
use lib "/Perl/ta/mods" ;
use Win32::Process::List;
my $script = 'io.socket' ;
my @port = (4005,5004) ;
my $scriptpath_4005 = "Perl C:\\path_to_script\\$script.pl $port[0]";
my $scriptpath_5004 = "Perl C:\\path_to_script\\$script.pl $port[1]";
our $nmealogger = "C:\\nmealogger\\nmealogger.exe";
system('TASKKILL /F /IM nmealogger* /T 2>nul');
print "current running perl instance: $$\n" ;
my $P = Win32::Process::List->new(); #constructor
my %list = $P->GetProcesses(); #returns the hashes with PID and process name
foreach my $key ( keys %list ) {
    unless ($list{$key} eq 'perl.exe') { next ; }
    # $active_perl_pid{$key} = 1 ;
    print sprintf("%30s has PID %15s", $list{$key}, $key) . "\n\n";
    if ($$ == $key) {
        print "current running perl instance: $key\n";
        next;
    } else {
        print "kill: $key\n";
        system("TASKKILL /F /PID $key");
        # system('TASKKILL /F /IM powershell* /T 2>nul');
    }
}
system "start $nmealogger" ;
system "start $scriptpath_4005";
system "start $scriptpath_5004";
use common_cms; # note: use happens at compile time, so this loads before the code above runs
exit;
This works fine if I run it from the machine: it kills all perl.exe and re-launches them. But run from the browser it only kills them and never re-launches them.
I thought it could be something to do with the httpd.conf settings, but I'm not sure.
Any help would be greatly appreciated.
Thanks

Update: I couldn't get around this issue so I took a different approach.
I ended up changing the script running from the browser to write a log file on the server, and created a scheduled task that runs every minute to check if that file exists, which then kicks off my script on the server.
Quite a long way around, but hey, it works.
Thanks for the suggestions, much appreciated.
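For anyone wanting the same workaround, here's a minimal sketch of the two halves; the flag location and script names below are placeholders, not my actual paths. The browser-facing CGI only drops a flag file:

#!/Perl/bin/perl
# trigger.pl - CGI script the browser hits; it only writes a flag file
use strict;
use warnings;

my $flag = 'C:\\temp\\restart.flag';   # placeholder path
open my $fh, '>', $flag or die "cannot write $flag: $!";
print $fh scalar localtime, "\n";
close $fh;

print "Content-type: text/plain\n\n";
print "restart requested\n";

And the scheduled task runs a checker every minute, which does the actual restart in its own session rather than Apache's:

#!/Perl/bin/perl
# checker.pl - run every minute by Task Scheduler
use strict;
use warnings;

my $flag = 'C:\\temp\\restart.flag';   # same placeholder path
exit 0 unless -e $flag;
unlink $flag or die "cannot remove $flag: $!";
system('start C:\\path_to_script\\myscript.pl');   # works here: the task has a real session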

"start" runs the script in a new command window, correct? Presumably Apache is a service, please see related https://stackoverflow.com/a/36843039/2812012.

Related

Perl CGI Output Buffering Using XAMPP Apache Server

I recently picked up a CGI Programming with Perl book and while trying to test one of the example problems I ran into an issue. According to the book newer versions of Apache (since v1.3) do not buffer the output of CGI scripts by default, but when I run the script below, it waits until the entire loop completes before it prints anything:
# count.cgi
#!"C:\xampp\perl\bin\perl.exe" -wT
use strict;
#print "$ENV{SERVER_PROTOCOL} 200 OK\n";
#print "Server: $ENV{SERVER_SOFTWARE}\n";
print "Content-type: text/plain\n\n";
print "OK, starting time consuming process ... \n";
# Tell Perl not to buffer our output
$| = 1;
for ( my $loop = 1; $loop <= 30; $loop++ ) {
    print "Iteration: $loop\n";
    ## Perform some time consuming task here ##
    sleep 1;
}
print "All Done!\n";
The book said that with an older version of Apache you may need to run the script as an "nph" script so the output is not buffered, but I tried that too with no luck.
# nph-count.cgi
#!"C:\xampp\perl\bin\perl.exe" -wT
use strict;
print "$ENV{SERVER_PROTOCOL} 200 OK\n";
print "Server: $ENV{SERVER_SOFTWARE}\n";
print "Content-type: text/plain\n\n";
print "OK, starting time consuming process ... \n";
# Tell Perl not to buffer our output
$| = 1;
for ( my $loop = 1; $loop <= 30; $loop++ ) {
    print "Iteration: $loop\n";
    ## Perform some time consuming task here ##
    sleep 1;
}
print "All Done!\n";
I am running: Apache/2.4.10 (Win32) OpenSSL/1.0.1i PHP/5.5.15
Clearly this version of Apache is beyond v1.3, so what is going on here? I did a little research and found that having "mod_deflate" or "mod_gzip" enabled can cause output to be buffered, but I checked my configuration files and both "mod_deflate" and "mod_gzip" are already disabled. All of the other posts I have seen about buffering refer to PHP and say to modify "php.ini", but I am using Perl, not PHP, so that doesn't seem to be the solution.
Also I don't know if this helps at all but I am using Chrome as my web browser.
How can I stop Apache from buffering my output? Thanks!
Try disabling 'mod_deflate'.
On a Debian-style layout, simply move/delete it from your mods-enabled directory; on a Windows/XAMPP install, comment out the LoadModule deflate_module line in httpd.conf instead.
Don't forget to restart Apache after doing so.

Interrupt (SIGINT) perl script from perl script running under apache

I have a perl script running as root that monitors a serial device and sends commands to it. Under apache, I have another perl script that displays a gui for the controlling 'root' script.
I'm trying to interrupt the root perl script from the gui perl script via SIGINT and SIGUSR1, but I get "operation not permitted", probably because one is root and the other is not.
I basically want the gui to be able to tell the controlling root script to pass some command to the serial device.
If I run the gui script from the command line as root, it can successfully signal the root script.
I'm not sure where to go from here; any suggestions on methods to interrupt a root script when not running as root? I'm calling a separate "signal" script as shown:
#cmds = ("perl /var/www/signal.cgi", "$perlSerialPID", "test");
if (0 == system(#cmds)) {
# code if the command runs ok
print "system call ok: $?";
} else {
# code if the command fails
print "not ok $?";
}
# try backticks
$output = `perl /var/www/signal.cgi $perlSerialPID test`;
print "<br>:::$output";
signal.cgi:
#!/usr/bin/perl
$pid = $ARGV[0];
$type = $ARGV[1];
if($type eq "test"){
    if(kill 0, $pid) {
        print "perl-serial $pid running!";
    }else{
        print "perl-serial $pid gone $!";
    }
}elsif($type eq "USR1"){
    if(kill 'USR1', $pid) {   # quote the signal name; a bareword only works without strict
        print "perl-serial interrupted";
    }else{
        print "perl-serial interrupt failed $!";
    }
}else{
    print "FAILED";
}
Use named pipes for IPC; a sketch follows these options.
The downside is that you'll have to retrofit your main script to read from the named pipe as an event-driven process.
Similar in concept to the above (with the same downside), but use sockets for communication.
Create a separate "send a signal" script that's called via a system call from your UI, and make that new script SUID and owned by root, in which case it will execute with root's permissions.
Downside: this is ripe for security abuse, so harden it very carefully.
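A minimal sketch of the named-pipe option, assuming a POSIX system and a made-up FIFO path. The reader loop is retrofitted into the root perl-serial script:

#!/usr/bin/perl
# Reader side, inside the root script (FIFO path is hypothetical).
use strict;
use warnings;
use POSIX qw(mkfifo);

my $fifo = '/var/run/perl-serial.fifo';        # assumed location
unless (-p $fifo) {
    mkfifo($fifo, 0622) or die "mkfifo: $!";   # writable by the web user (umask permitting)
}

while (1) {
    # open blocks until a writer connects, so this acts like waiting for a signal
    open my $fh, '<', $fifo or die "open $fifo: $!";
    while (my $cmd = <$fh>) {
        chomp $cmd;
        # dispatch $cmd to the serial device here instead of a signal handler
        print "got command: $cmd\n";
    }
    close $fh;
}

The gui script then replaces the kill call with a plain write:

open my $fh, '>', '/var/run/perl-serial.fifo' or die $!;
print $fh "USR1\n";    # whatever command token the root script expects
close $fh;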

Running a process in background in Perl

Update: Edited to make the question more understandable.
I am creating a script which automatically parses an HTTP file upload and stores the uploaded file's information, such as its name and time of upload, in another data file. This information is found in the mod_security log file. mod_security has a rule which can redirect the uploaded file to a Perl script; in my case that script is upload.pl, where I scan the uploaded file with the ClamAV antivirus. But mod_security only logs the uploaded file's information after upload.pl has finished. So from upload.pl I start another script, execute.pl, which begins with a sleep(10); the intention is that execute.pl does its work only after upload.pl has completed.
My issue is that even though I launch execute.pl as a background process, the HTTP upload still waits for execute.pl to finish. I need upload.pl to complete without waiting for the output of execute.pl. The script behaves correctly from the console: if I run perl upload.pl there, it finishes without waiting for execute.pl. But when I go through Apache, i.e. when I upload a sample file, the upload hangs until both upload.pl and execute.pl have completed.
The methods I have tried so far are:
# 1. shell background operator
system("cmd &");
# 2. fork, then run the command with system() in the child
my $pid = fork();
if (defined($pid) && $pid==0) {
    # background process
    my $exit_code = system( $command );
    exit $exit_code >> 8;
}
# 3. fork, then exec the command in the child
my $pid = fork();
if (defined($pid) && $pid==0) {
    exec( $command );
}
Rephrase of the question:
How do I start a perl daemon process from a perl webscript?
Answer:
The key is to close the streams of the background job, since they are shared:
Webscript:
#!/usr/bin/perl
#print html header
print <<HTML;
Content-Type: text/html

<!doctype html public "-//W3C//DTD HTML 4.01 Transitional//EN">
<html><head><title>Webscript</title>
<meta http-equiv="refresh" content="2; url=http://<your host>/webscript.pl" />
</head><body><pre>
HTML
#get the ps headers
print `ps -ef | grep STIME | grep -v grep`;
#get the running backgroundjobs
print `ps -ef | grep background.pl | grep -v grep`;
#read the output file left by any previous or running job
$output = `cat /tmp/background.txt`;
#start the background job if there is no output yet
`<path of the background job>/background.pl &` unless $output;
#print the output file
print "\n\n---\n\n$output</pre></body></html>";
Background job:
#!/usr/bin/perl
#close the shared streams!
close STDIN;
close STDOUT;
close STDERR;
#do something useful!
for ( $i = 0; $i < 20; $i++ ) {
    open FILE, '>>/tmp/background.txt';
    printf FILE "Still running! (%2d seconds)\n", $i;
    close FILE;
    #getting tired of all the hard work...
    sleep 1;
}
#done
open FILE, '>>/tmp/background.txt';
printf FILE "Done! (%2d seconds)\n", $i;
close FILE;
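A variant I find slightly tidier is to do the detaching in the calling script, so the background job doesn't need to close anything itself. A sketch, assuming a POSIX system; the execute.pl path is a placeholder:

#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Fork, detach from Apache's session, point the shared streams at /dev/null,
# then exec the real job. The parent returns to the webscript immediately.
sub spawn_detached {
    my @cmd = @_;
    defined(my $pid = fork()) or die "fork: $!";
    return if $pid;                   # parent: back to CGI work
    setsid();                         # child: new session, no controlling terminal
    open STDIN,  '<', '/dev/null' or die $!;
    open STDOUT, '>', '/dev/null' or die $!;   # Apache waits on these handles,
    open STDERR, '>', '/dev/null' or die $!;   # so they must not stay shared
    exec @cmd or die "exec: $!";
}

spawn_detached('perl', '/path/to/execute.pl');    # placeholder path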

Install after download within a script?

I would like to download an msi and install it, all silently within a script. I wanted to try something like start iexplore.exe http://domain.com/file.msi /qn but unfortunately it will just download the MSI and not begin the install.
Anybody know how to fix this?
Here you go. I haven't tested it, but I at least had the JScript functions already made from another project. Save this with a .bat extension, modify the set url= line as appropriate, and run it.
@if (@a==@b) @end /*
:: batch portion
@echo off
setlocal
set url=http://domain.com/file.msi
set saveloc=%temp%\file.msi
cscript /nologo /e:jscript "%~f0" "%url%" "%saveloc%"
msiexec /i "%saveloc%" /qn /norestart
:installwait
ping -n 2 0.0.0.0 >NUL
wmic process where name="msiexec.exe" get name 2>NUL | find /i "msiexec" >NUL && goto installwait
del "%saveloc%"
goto :EOF
:: JScript portion */
function fetch(url) {
    var xObj = new ActiveXObject("Microsoft.XMLHTTP");
    xObj.open("GET",url,true);
    xObj.setRequestHeader('User-Agent','XMLHTTP/1.0');
    xObj.send('');
    while (xObj.readyState != 4) WSH.Sleep(50);
    return(xObj);
}
function save(xObj, file) {
    var stream = new ActiveXObject("ADODB.Stream");
    with (stream) {
        type = 1;
        open();
        write(xObj.responseBody);
        saveToFile(file, 2);
        close();
    }
}
save(fetch(WSH.Arguments(0)), WSH.Arguments(1));
You must also 'start' the downloaded file. It will download to the default download location (C:\Users\[username]\Downloads in Windows 7, unless it has been changed). However, you must have the script wait until the download completes, or else it won't be able to run the msi. As far as I know, there's no built-in way to have it check for you, so you just have to account for the longest expected download time, in seconds. The waiting can be done a few different ways, depending on which OS you're using. One example is
timeout /t [seconds] /nobreak > NUL
This accepts a wait time in seconds, and the /nobreak means that it ignores keypresses (which would normally signal it to proceed before the timer is done). Another method is to have the batch file ping an invalid IP address (1.1.1.1) for a certain number of milliseconds. For example
PING 1.1.1.1 -n 1 -w [milliseconds] >NUL
Hope this helps.

How do I run a script on VxWorks Tornado Shell?

I am trying to run a script on VxWorks Shell, which will load a module.
I use a Perl script to telnet into the system, login and get access to the shell.
I am able to run the basic commands like 'i', 'time', 'ls', 'pwd', 'h', and so on.
But I would like to run a script, say 'test.o'.
If I do <C:\Path\subfolder\test.o, the script file WILL run from the TORNADO shell.
But here I have connected with Telnet from Perl.
So I connect this way:
use Net::Telnet;
my $username = "username";
my $password = "password";
my $t = new Net::Telnet(Timeout=>10, Errmode=>'die');
$t->open('10.42.177.123');
$t->login($username,$password); # Logs in as expected.
my @lines = $t->cmd('i'); # To test
print @lines; # This works
@lines = $t->cmd('<C:\\Path\\Subfolder\\test.o'); # This is not working for me. HELP!
print @lines; # Prints the error below
I get an error saying :
Unknown directory: /C:\Path\Subfolder
can't open input 'C:\Path\Subfolder\test.o
errno = 0x1f5
How do I run my script file if it is residing at a particular folder of the host PC?
I am able to run the script manually from the TORNADO SHELL window, where the prompt looks like ->, and hence it is a working script. And as I have said, I am able to run the basic VxWorks shell commands ("built-in functions") and print their output.
Any help? [ My OS is Win7 ]
Thanks!
This issue is now resolved. There were two problems. One was that TORNADO, another VxWorks client, was logged into the system at the same time my Perl script was sending commands over Telnet; even though the embedded VxWorks system runs a telnet daemon, having two clients (Tornado and my script's Telnet session) attached at once was not tolerated.
As for the error above, it was a syntax error. I should have used
$t->cmd('<\\Path\\subfolder\\test.o');
There is no need to give the C: drive letter.
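Putting both fixes together, a minimal sketch of the working session (the IP, credentials, and path are the placeholders from the question):

use strict;
use warnings;
use Net::Telnet;

my $t = Net::Telnet->new(Timeout => 10, Errmode => 'die');
$t->open('10.42.177.123');          # make sure Tornado is disconnected first
$t->login('username', 'password');
my @lines = $t->cmd('<\\Path\\Subfolder\\test.o');   # note: no drive letter
print @lines;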