Can't run CGI script

I have a problem with my CGI script.
Here is what I do with my script:
create test.cgi in /usr/lib/cgi-bin, then chmod +x test.cgi, then ./test.cgi
It returns an Internal Server Error.
I noticed that Apache doesn't see where I created this script, so I:
changed ScriptAlias to Alias in serve-cgi-bin.conf
Now Apache can see the script, but it doesn't execute it; it just returns the contents of the script.
I am doing this on my Raspberry Pi. Does anyone see the problem and how to fix it?
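For reference, the unmodified serve-cgi-bin.conf normally enables CGI execution with something like this (the stock Raspbian/Debian layout as far as I know, so details may differ):
ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
    AllowOverride None
    Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
    Require all granted
</Directory>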
Many thanks

How to create a .htaccess file correctly?

I am trying to run a script via Apache on a shared Linux server, like this:
#!/usr/bin/perl
print "Content-type: text/html\n\n";
foreach $i (keys %ENV) {
    print "$i $ENV{$i}\n";
}
But I want to run it via a symlink created like this:
ln -s printenv.pl linkedprintenv.pl
It runs fine when accessed directly, but I get a 500 server error when executing it via the symlink from a web browser. I understand the solution may be to create a .htaccess file containing
Options +FollowSymLinks
but I tried that and it didn't work. Is there extra configuration needed for that single-line .htaccess file to take effect?
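For reference, my understanding is that a .htaccess line like that only takes effect if the server configuration allows it to be overridden, i.e. something like this in the vhost or directory configuration (paths are placeholders, and on shared hosting only the host can change this):
<Directory "/home/youruser/public_html/cgi-bin">
    Options +ExecCGI +FollowSymLinks
    # AllowOverride Options is what allows a .htaccess "Options +FollowSymLinks" line to apply
    AllowOverride Options
</Directory>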

Getting "permission denied" for executing python script from PHP

I have an Apache server installed on Red Hat Linux.
I have a PHP script that needs to call a Python wrapper script, which in turn calls another Python script with a function that executes some database queries.
My PHP script is in /var/www/html. If I execute the Python script from that location on the command line, it works fine. However, if I run the exact same command through exec() inside PHP, I get "permission denied".
Running whoami from /var/www/html on the command line gives my Linux user id.
Inside the PHP script,
exec("whoami")
returns apache.
and
echo get_current_user();
returns my Linux user id, which is understandable since get_current_user() returns the owner of the PHP script. How should I fix this problem? Is the problem that the user "apache" does not have access to the script? The user "smeeta" does have permission to execute it.
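One way to check that from the shell would be to run the same command as the apache user (wrapper.py below is just a placeholder for my actual script):
# does the command work when run as apache?
sudo -u apache /usr/bin/python /var/www/html/wrapper.py
# is every directory on the path readable/traversable by apache?
namei -l /var/www/html/wrapper.py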
Any guidance here is highly appreciated.

"End of script output before headers" error in Apache

Apache on Windows gives me the following error when I try to access my Perl script:
Server error!
The server encountered an internal error and was unable to complete your request.
Error message:
End of script output before headers: sample.pl
If you think this is a server error, please contact the webmaster.
Error 500
localhost
Apache/2.4.4 (Win32) OpenSSL/1.0.1e PHP/5.5.3
This is my sample script:
#!"C:\xampp\perl\bin\perl.exe"
print "Hello World";
but it is not working in the browser.
Check file permissions.
I had exactly the same error on a Linux machine with the wrong permissions set.
chmod 755 myfile.pl
solved the problem.
If this is a CGI script for the web, then you must output your header:
#!"C:\xampp\perl\bin\perl.exe"
print "Content-Type: text/html\n\n";
print "Hello World";
That is what the error message End of script output before headers: sample.pl is telling you.
Or even better, use the CGI module to output the header:
#!"C:\xampp\perl\bin\perl.exe"
use strict;
use warnings;
use CGI;
print CGI::header();
print "Hello World";
For future reference:
This is typically an error that occurs when you are unable to view or execute the file, the reason for which is generally a permissions error. I would start by following @Renning's suggestion and running chmod 755 test.cgi (obviously replace test.cgi with the name of your CGI script here).
If that doesn't work, there are a couple of other things you can try. I once got this error when I created test.cgi as root in another user's home directory. The fix there was to run chown user:user test.cgi, where user is the name of the user whose home directory you're in.
The last thing I can think of is making sure that your CGI script is returning the proper headers. In my Ruby script I did it by putting puts "Content-type: text/html" before I actually output anything to the page.
Happy coding!
Probably this is an SELinux block. Try this:
# setsebool -P httpd_enable_cgi 1
# chcon -R -t httpd_sys_script_exec_t cgi-bin/your_script.cgi
I had the same error on a Raspberry Pi. I fixed it by adding -w to the shebang:
#!/usr/bin/perl -w
You may be getting this error if you are executing CGI files out of a home directory using Apache's mod_userdir and the user's public_html directory is not group-owned by that user's primary GID.
I have been unable to find any documentation on this, but this was the solution I stumbled upon to some failing CGI scripts. I know it sounds really bizarre (it doesn't make any sense to me either), but it did work for me, so hopefully this will be useful to someone else as well.
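If that is the situation, a sketch of the fix (the user name is a placeholder):
# make public_html group-owned by the user's primary group
chgrp "$(id -gn someuser)" /home/someuser/public_html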
Since no answer is accepted, I would like to provide one possible solution. If your script was written on Windows and uploaded to a Linux server (through FTP), this problem often arises. The reason is that Windows uses CRLF to end each line, while Linux uses LF. So you should convert the file from CRLF to LF, for example with an editor such as Atom.
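If you prefer the command line, either of these performs the same CRLF-to-LF conversion (the file name is just an example):
dos2unix test.cgi
sed -i 's/\r$//' test.cgi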
If you are using suEXEC, ensure that the script and its directory are owned by the same user you specified in the suEXEC configuration.
In addition, ensure that the user running the CGI script has execute permission on the file AND on the program specified in the shebang.
For example, if my CGI script starts with
#! /usr/bin/cgirunner
then the user needs permission to execute /usr/bin/cgirunner.
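A quick way to check both, using the paths from this example (the script path is a placeholder):
# both entries should show the execute bit (x) for the user the CGI runs as
ls -l /usr/lib/cgi-bin/your_script.cgi /usr/bin/cgirunner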
An internal error can also be due to a HIDDEN character at the end of the shebang line,
i.e. the line #!/usr/bin/perl.
Adding - or -w at the end moves the character away from "perl", allowing the path to the Perl interpreter to be found and the script to execute.
The hidden character (often a stray carriage return) is created by the editor used to create the script.
So for everyone starting out with XAMPP CGI:
change the extension from .pl to .cgi
change the permissions to 755
mv test.pl test.cgi
chmod 755 test.cgi
It fixed mine as well.
In my case I had a similar problem, but with C++ on Windows 10. The problem was solved by adding the folder of the C++ libraries to the Windows PATH environment variable; in my case I used the Code::Blocks libraries:
C:\codeblocks\MinGW\bin
This was my case.
With only two lines in the script:
#!/usr/bin/sh
echo "Content-type: text/plain"
I got the error 500.
Adding this line after the first echo:
echo ""
makes the error go away.
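Put together, a minimal working version of the script looks like this (the last line is added only to give the response a body):
#!/usr/bin/sh
echo "Content-type: text/plain"
echo ""                 # the blank line terminates the CGI headers
echo "Hello, world"     # response body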
Building on the suggestions above from everyone: I was using XAMPP for running CGI scripts.
On Windows 8 it worked without any changes, but on CentOS 7.0 it was throwing errors like these:
AH01215: (2)No such file or directory: exec of '/opt/lampp/cgi-bin/pbsa_config.cgi' failed: /opt/lampp/cgi-bin/pbsa_config.cgi, referer: http://<>/MCB_HTML/TestBed.html
[Wed Aug 30 09:11:03.796584 2017] [cgi:error] [pid 32051] [client XX:60624] End of script output before headers: pbsa_config.cgi, referer: http://xx/MCB_HTML/TestBed.html
What I tried:
Disabled SELinux
Gave full permissions to the script, though 755 will be OK
I finally added -w as below:
#!/usr/bin/perl -w
use CGI ':standard';
{
print header(),
...
end_html();
}
-w enables all warnings. It started working; I have no idea why -w made the difference here.

ssh scripting and copying files

I am writing a Bash deployment script on RHEL 5. The script runs great and sends out an email at the end of the run. However, at the end of the script, if I detect any failure, I need to copy log files back to the local server to attach to the email.
The script can detect failures fine; the question is how to copy the log files back. I don't want to just cat the log files, as they can be huge.
Any suggestions?
Thanks
S
If I understand your problem correctly, you should use scp:
http://linux.die.net/man/1/scp
and here you can find how to automate the login so you can use it in a script
http://linuxproblem.org/art_9.html
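For example, a call of this shape (user, host, and paths are placeholders) pulls a remote log back to the local server:
scp deployuser@remotehost:/var/log/deploy.log /tmp/deploy.log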
I can't see any easy way of avoiding a second login with scp/sftp. If you're sure that it's only the log file that will be returned, you could do something like the following:
ssh -e none REMOTE SCRIPT | gzip -dc > LOGFILE
Inside SCRIPT you have something like gzip -c LOGFILE when it fails.
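A sketch of that idea, with hypothetical host and path names:
# local side: run the remote script and decompress whatever it streams back on stdout
ssh -e none deployuser@remotehost '/opt/deploy/deploy.sh' | gzip -dc > /tmp/remote.log
# remote side, inside deploy.sh, when a failure is detected:
#   gzip -c /var/log/deploy.log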

Postfix piping email to php, permissions error

I'm attempting to pipe an email to PHP with my Postfix mail server, using the technique mentioned here and have encountered the following error...
Mar 16 22:52:52 s15438530 postfix/pipe[9259]: AD1632E84C63: to=<php@[myserver].com>, relay=plesk_virtual, delay=0.61, delays=0.59/0/0/0.02, dsn=4.3.0, status=deferred (temporary failure. Command output: /bin/sh: /var/www/vhosts/[myserver].com/httpdocs/clients/emailpipe/email2php.php: Permission denied 4.2.1 Message can not be delivered at this time )
I'd really appreciate it if anyone could shed some light on this issue for me. I've tried 777'ing the emailpipe directory, to no avail. Where am I going wrong?
Many thanks.
From the postfix docs...
For security reasons, deliveries to command and file destinations are performed with the rights of the alias database owner. A default userid, default_privs, is used for deliveries to commands/files in root-owned aliases.
So you have two options. Either set default_privs in main.cf to match the ownership of the email2php file (see the sketch below).
Alternatively, there should be a way to create an alias database that is owned by the user instead of postfix/nobody. I haven't tried this before, though, so I can't advise.
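A minimal sketch of the first option, assuming (hypothetically) that email2php.php is owned by a user called vhostuser:
# /etc/postfix/main.cf
default_privs = vhostuser
# then: postfix reload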
I fixed this issue by disabling SELinux.
Make sure that you have
#!/usr/bin/php
<?php
(or whatever your path to php is - do "which php" on the server)
at the top of each of your PHP scripts, and that each of the PHP script files is executable:
chmod +x /var/.../email2php.php
Also, make sure that you can test the script from the command line:
cat some_rfc822_email.txt | /var/.../email2php.php
and get the result that you want
To fix this issue, you'll want to chown or chmod /var/www/vhosts/[myserver].com/httpdocs/clients/emailpipe/email2php.php so that it is executable by your Postfix user. Alternatively, you'll want to redefine this user so it can execute the file successfully.
Simply changing the permissions of your directory (unless you used -R) won't be sufficient.
To illustrate why this works, consider the following toy example:
<me>@harley:~$ touch test
<me>@harley:~$ ls -al test
-rw-r--r-- 1 <me> <me> 0 2012-03-26 23:44 test
<me>@harley:~$ sh test
<me>@harley:~$
<me>@harley:~$ ./test
bash: ./test: Permission denied
<me>@harley:~$ chmod 755 test
<me>@harley:~$ ./test
<me>@harley:~$
In order to execute a file directly through the running shell, it needs to be set as executable. Other invocations (for example, sh email2php.php or php email2php.php) only require read access, because they're chaining execution off a different file entirely.
For what's likely to be causing the issue in the first place, see here.