Postfix piping email to PHP, permissions error

I'm attempting to pipe an email to PHP with my Postfix mail server, using the technique mentioned here, and I have encountered the following error:
Mar 16 22:52:52 s15438530 postfix/pipe[9259]: AD1632E84C63: to=<php@[myserver].com>, relay=plesk_virtual, delay=0.61, delays=0.59/0/0/0.02, dsn=4.3.0, status=deferred (temporary failure. Command output: /bin/sh: /var/www/vhosts/[myserver].com/httpdocs/clients/emailpipe/email2php.php: Permission denied 4.2.1 Message can not be delivered at this time )
I'd really appreciate it if anyone could shed some light on this issue for me. I've tried 777'ing the emailpipe directory, to no avail. Where am I going wrong?
Many thanks.

From the postfix docs...
For security reasons, deliveries to command and file destinations are performed with the rights of the alias database owner. A default userid, default_privs, is used for deliveries to commands/files in root-owned aliases.
So you have two options. The first is to set default_privs in main.cf to match the owner of the email2php file.
Alternatively, there should be a way to create an alias database that is owned by the user instead of postfix/nobody. I haven't tried this before, though, so I can't advise on it.
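A minimal sketch of that first option, assuming the script is owned by an account called vhostuser (the name is just a placeholder):
# /etc/postfix/main.cf
default_privs = vhostuser
Then reload Postfix (postfix reload) so the change takes effect.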

I fixed this issue by disabling SELinux.
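For reference, the usual way to do that is shown below; disabling SELinux outright is a blunt instrument, so treat it as a diagnostic step rather than a permanent fix.
# temporarily, until the next reboot
setenforce 0
# permanently: edit /etc/selinux/config and set
SELINUX=disabled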

Make sure that you have
#!/usr/bin/php
<?php
(or whatever your path to php is - do "which php" on the server)
at the top of each of your PHP scripts, and that each of the PHP script files is executable:
chmod +x /var/.../email2php.php
Also, make sure that you can test the script from the command line:
cat some_rfc822_email.txt | /var/.../email2php.php
and get the result that you want
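For reference, a minimal pipe target that just captures the raw message could look like the sketch below; the /tmp path and the lack of real parsing are purely illustrative.
#!/usr/bin/php
<?php
// Read the complete raw message that Postfix pipes in on stdin.
$email = file_get_contents('php://stdin');
// Do something useful with it; here it is only dumped to a file for inspection.
file_put_contents('/tmp/last_piped_email.txt', $email);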

To fix this issue, you'll want to chown or chmod /var/www/vhosts/[myserver].com/httpdocs/clients/emailpipe/email2php.php so that it is executable by your Postfix user. Alternatively, you'll want to redefine that user so it can execute the file successfully.
Simply changing the permissions of your directory (unless you used -R) won't be sufficient.
To illustrate why this works, consider the following toy example:
<me>@harley:~$ touch test
<me>@harley:~$ ls -al test
-rw-r--r-- 1 <me> <me> 0 2012-03-26 23:44 test
<me>@harley:~$ sh test
<me>@harley:~$
<me>@harley:~$ ./test
bash: ./test: Permission denied
<me>@harley:~$ chmod 755 test
<me>@harley:~$ ./test
<me>@harley:~$
In order to execute a file directly through the running shell, it needs to be set as executable. Other invocations (for example, sh email2php.php or php email2php.php) only require read access, because they're chaining execution off a different file entirely.
For what's likely to be causing the issue in the first place, see here.

Related

Please tell me how to set permission 777 on serverfree.com without SSH

Can anyone please tell me how to set permission 777 on serverfree.com? I have looked there and there is no option to set permissions, and I am unable to set them via web-based SSH.
Please tell me how to set permissions.
Actually, everything is fine on serverfree.com, but I am unable to set up a cron job. Someone told me it's a permission issue, but I don't know how to set permissions on serverfree.com without SSH.
Usually, you are able to set the permissions via your FTP client.
For example, in FileZilla there is a "File permissions..." option where you can set the permission values for each file.
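If the host offers plain FTP, you can also script this with PHP's ftp_chmod() function; a rough sketch (host, credentials and file path are placeholders):
<?php
// Connect over FTP and change permissions without any SSH access.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'username', 'password');
if (ftp_chmod($conn, 0777, 'public_html/somefile.php') !== false) {
    echo "Permissions changed\n";
}
ftp_close($conn);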
You're on a *nix system, right?
If you only want to set permissions without calling chmod directly (as your question suggests), you can try the following on the console, if you have Perl installed:
perl -e 'chmod 0777, "Filename"'
Another approach is to use the install utility, which is a glorified copying program that can set permissions in one step (see the -m argument):
install -m 777 "File" "/Copy/Location"
It is part of GNU coreutils (if you have that installed), and isn't included in all *nix systems (BSD has its own version, for example). You can simply move the file out of the directory and call install to move it back.
But for both methods you need SSH, and I don't think there is a solution to set permissions without it (because you can never make the chmod() system call that you need).

"End of script output before headers" error in Apache

Apache on Windows gives me the following error when I try to access my Perl script:
Server error!
The server encountered an internal error and was unable to complete your request.
Error message:
End of script output before headers: sample.pl
If you think this is a server error, please contact the webmaster.
Error 500
localhost
Apache/2.4.4 (Win32) OpenSSL/1.0.1e PHP/5.5.3
This is my sample script:
#!"C:\xampp\perl\bin\perl.exe"
print "Hello World";
but it's not working in the browser.
Check file permissions.
I had exactly the same error on a Linux machine with the wrong permissions set.
chmod 755 myfile.pl
solved the problem.
If this is a CGI script for the web, then you must output your header:
#!"C:\xampp\perl\bin\perl.exe"
print "Content-Type: text/html\n\n";
print "Hello World";
That is what this error message is telling you: End of script output before headers: sample.pl
Or even better, use the CGI module to output the header:
#!"C:\xampp\perl\bin\perl.exe"
use strict;
use warnings;
use CGI;
print CGI::header();
print "Hello World";
For future reference:
This is typically an error that occurs when you are unable to view or execute the file, and the reason is generally a permissions error. I would start by following @Renning's suggestion and running chmod 755 test.cgi (obviously replace test.cgi with the name of your CGI script here).
If that doesn't work there are a couple of other things you can try. I once got this error when I created test.cgi as root in another user's home. The fix there was to run chown user:user test.cgi, where user is the name of the user whose home you're in.
The last thing I can think of is making sure that your CGI script is returning the proper headers. In my Ruby script I did it by putting puts "Content-type: text/html" before I actually output anything to the page.
Happy coding!
Probably this is an SELinux block. Try this:
# setsebool -P httpd_enable_cgi 1
# chcon -R -t httpd_sys_script_exec_t cgi-bin/your_script.cgi
Had the same error on a Raspberry Pi. I fixed it by adding -w to the shebang:
#!/usr/bin/perl -w
You may be getting this error if you are executing CGI files out of a home directory using Apache's mod_userdir and the user's public_html directory is not group-owned by that user's primary GID.
I have been unable to find any documentation on this, but this was the solution I stumbled upon to some failing CGI scripts. I know it sounds really bizarre (it doesn't make any sense to me either), but it did work for me, so hopefully this will be useful to someone else as well.
Since no answer is accepted, I would like to provide one possible solution. If your script was written on Windows and uploaded to a Linux server (through FTP), this problem often arises. The reason is that Windows uses CRLF to end each line while Linux uses LF, so you should convert the file from CRLF to LF line endings, for example with an editor such as Atom.
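If you would rather fix the line endings on the server itself, converting from the command line works too; dos2unix may not be installed everywhere, hence the sed alternative (GNU sed syntax):
dos2unix sample.pl
# or, if dos2unix isn't available:
sed -i 's/\r$//' sample.pl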
If using suEXEC, ensure that the script and its directory are owned by the same user you specified in suEXEC.
In addition, ensure that the user running the CGI script has execute permissions on the file AND on the program specified in the shebang.
For example if my cgi script starts with
#! /usr/bin/cgirunner
Then the user needs permissions to execute /usr/bin/cgirunner.
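As a quick sanity check (the script path here is only an example), you can verify both with ls:
ls -l /usr/bin/cgirunner
ls -l /path/to/your_script.cgi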
The internal error is due to a HIDDEN character at the end of the shebang line!!
i.e. the line #!/usr/bin/perl
Adding - or -w at the end moves the character away from "perl", allowing the path to the Perl interpreter to be found and the script to execute.
The HIDDEN character is created by the editor used to create the script.
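If you want to actually see such a hidden character rather than just work around it, dumping the first line of the script is a simple generic check (not part of the original answer):
head -1 sample.pl | od -c
# any stray \r or other bytes after "perl" will show up in the output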
So, for everyone starting out with CGI on XAMPP:
change the extension from pl to cgi
change the permissions to 755
mv test.pl test.cgi
chmod 755 test.cgi
It fixed mine as well.
In my case I had a similar problem, but with C++ on Windows 10. The problem was solved by adding the folder of the C++ libraries to the Windows PATH environment variable; in my case I used the Code::Blocks libraries:
C:\codeblocks\MinGW\bin
This was my case.
With only these two lines in the script:
#!/usr/bin/sh
echo "Content-type: text/plain"
I got error 500.
Adding this line after the first echo:
echo ""
made the error go away.
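Putting that together, a minimal working version of the script (the final echo line is just an illustrative body) would be:
#!/usr/bin/sh
echo "Content-type: text/plain"
echo ""
echo "Hello"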
Based on the suggestions above from everyone, I was using XAMPP for running CGI scripts.
On Windows 8 it worked without any changes, but on CentOS 7.0 it was throwing errors like these, as mentioned above:
AH01215: (2)No such file or directory: exec of '/opt/lampp/cgi-bin/pbsa_config.cgi' failed: /opt/lampp/cgi-bin/pbsa_config.cgi, referer: http://<>/MCB_HTML/TestBed.html
[Wed Aug 30 09:11:03.796584 2017] [cgi:error] [pid 32051] [client XX:60624] End of script output before headers: pbsa_config.cgi, referer: http://xx/MCB_HTML/TestBed.html
What I tried:
Disabled SELinux
Gave full permissions to the script, although 755 will be OK
I finally added -w as below:
#!/usr/bin/perl -w
use CGI ':standard';
{
print header(),
...
end_html();
}
-w indicates enabling all warnings. It started working; no idea why -w helped here.

How can I change the owner of files written by PHP

How can I change the owner of the files written by PHP from the terminal (command line)? The files are created by an upload form in a dedicated folder.
Many thanks
Use the command "chown":
chown owner filenames
To solve this problem for future uploads, you can use PHP's chown() function:
chown($path, $user_name);
http://php.net/manual/en/function.chown.php
There is also a chmod() function, if you prefer to change permissions:
http://www.php.net/manual/en/function.chmod.php
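For the upload case specifically, here is a rough sketch of how those calls could sit in an upload handler; the form field name, target directory and owner are all placeholders, and chown() generally requires the PHP process to run with sufficient privileges (typically root):
<?php
// Move the uploaded file into place, then fix its owner and permissions.
$target = '/var/www/uploads/' . basename($_FILES['upload']['name']);
if (move_uploaded_file($_FILES['upload']['tmp_name'], $target)) {
    chown($target, 'desireduser');  // placeholder owner name
    chmod($target, 0644);           // owner read/write, everyone else read-only
}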
Maybe you can change the owner within the PHP program itself (if you have the permission to do that, of course) with PHP's exec() function.
For example:
exec('whoami');
See the documentation of this php instruction here: http://php.net/manual/en/function.exec.php
This could work, but only if the server's configuration allows it.
If this doesn't work, you can also try the following, but again you need certain privileges to do it:
// File name and username to use
$file_name= "foo.php";
$path = "/home/sites/php.net/public_html/sandbox/" . $file_name ;
$user_name = "root";
// Set the user
chown($path, $user_name);
See the documentation of this php instruction here: http://php.net/manual/en/function.chown.php
PHP has several functions for modifying file attributes, such as chmod() and chown(), among others.

Cron Job - Could not open input file:

I have set up a PHP file to run that just echoes hello.
<?php
echo "hello";
?>
My cron job looks like this:
/usr/local/bin/php -f “/home/username/public_html/mls/test.php”
when my script runs i get a confirmation email that says:
Could not open input file: /home/username/public_html/mls/test.php
I don't know what is causing this. I am using GoDaddy's virtual private server with cPanel X installed. I have used SSH to set permissions 777 on the folder and file and still cannot get it to run.
Any advice would be helpful. Thanks.
For some reason PHP cannot open the file. Try replacing /usr/local/bin/php -f with "ls -la" to try to crib some more information. Remember to NOT quote the file name in the crontab: php -f filename.php, not php -f "filename.php", unless it contains spaces -- and then it's better to use single quotes.
Possibly, try "ls -la /home", "ls -la /home/username", "ls -la ~/public_html" and so on.
Also try appending
2>&1
to the command line, in case only stdout is mailed to you (I don't really think so, but being sure costs little).
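For example, a full crontab entry following that advice (the five-field schedule is only an illustration) would be:
*/5 * * * * /usr/local/bin/php -f /home/username/public_html/mls/test.php 2>&1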
One other possibility
The crontab as it is refers to /home/username/public_html/mls/test.php - that is, a public_html directory inside what is usually the user's home directory.
It is possible that the cron job is either not running with the appropriate user and privileges, or that the user it "sees" is actually a virtual user - there is no "/home/username" at all - and the "home directory" is elsewhere, possibly even existing just as long as the cron job runs. In this case the solution might be to refer to
~/public_html/mls/test.php
or, as described above, to first run a command such as pwd or ls -la to determine exactly where the cron job's current working directory is.
If this, too, fails, then another workaround could be to invoke the PHP HTTP handler via curl or lynx:
/usr/bin/curl http://www.thishostname.com/mls/test.php
You could use an environment variable, a curl header, or a _GET parameter to authenticate the request to the script as coming from the cron job, so that it isn't accessible from the outside.
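A rough sketch of that last idea, using a made-up _GET token whose name and value are placeholders: the cron job would call
/usr/bin/curl "http://www.thishostname.com/mls/test.php?cron_token=CHANGEME"
and test.php would start with a check like
<?php
// Reject any request that doesn't carry the expected token (placeholder value).
if (($_GET['cron_token'] ?? '') !== 'CHANGEME') {
    http_response_code(403);
    exit;
}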

New Rails sass user having problems with permissions

I have never used Sass before.
On my production server I have to run chmod 644 public/*, then /etc/init.d/apache2 restart to restart the server.
Once the server has rendered the CSS, I then have to come back and run chmod 755 public/* to actually load the CSS, JS and images.
If 755 grants more permissions, why does it fail?
The actual error is:
Errno::EACCES (Permission denied - /srv/www/mysite.com/myapp/public/stylesheets/custom.css):
Use this command: chmod -Rf 766 public/. It might work for you.
I know you already got your answer, but if you want to know more of the theory behind it, check this out:
http://www.thinkplexx.com/learn/article/unix/command/chmod-permissions-flags-explained-600-0600-700-777-100-etc
It's pretty straight-forward... I think it's definitely worth glancing at.
Also, in case you don't already know, the -f option only tells chmod not to display a diagnostic message if it can't modify the file, so it isn't strictly necessary (though it can be helpful). The -R switch changes the modes of the file hierarchies rooted in the files rather than just the files themselves.
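To make the difference concrete, using the paths from this question:
chmod 755 public/*    # changes only the immediate entries inside public/
chmod -R 755 public   # changes public/ itself and everything beneath it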