Plesk Cron jobs and FTP - who is the owner for file access?

I'm trying to set up a cron task that gets a file via FTP, but it seems to fail due to file permissions.
The code runs perfectly in the browser, i.e. when apache is the owner, but fails when cron runs the same page.
I'm assuming this is a directory/file permission error; if so, who should I set the directory owner to for cron jobs?

Most likely Dan's suggestion (the cron user/path answer below) is going to be your problem. However, if it works from a browser you can also call the page like this:
wget -q "http://www.domain.com/path/to/script/script.whatever" >/dev/null 2>&1
If you still get errors, you can remove the >/dev/null 2>&1 part; then, provided your email address is set correctly in the domain administrator account, the output, including errors, should get emailed to you.
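To schedule that, the crontab entry would look something like this (the every-15-minutes schedule is just a placeholder; adjust to taste):
*/15 * * * * wget -q "http://www.domain.com/path/to/script/script.whatever" >/dev/null 2>&1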
As for the correct permissions, don't change the default Plesk ones or you will get issues with normal FTP.
Defaults are:
everything under httpdocs = ftpuser.psacln
anything written by PHP/Apache = apache.apache, unless you are running PHP as a CGI on that domain, in which case those files will belong to the FTP user as well.
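If the FTP-owned files have drifted from those defaults, something along these lines should restore them (the vhost path is a guess at a typical Plesk layout; adjust for your version):
chown -R ftpuser:psacln /var/www/vhosts/example.com/httpdocs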
-sean

Cron jobs will run as the user that created them. More likely than a permissions error is a path error. If you're not specifying full absolute paths to the program/script to run, and to any files you reference, you'll likely have problems, as cron won't have the same PATH in its environment as Apache does, or as you do at your shell prompt.
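In other words, spell everything out in the crontab entry, for example (the schedule and paths here are purely illustrative):
*/10 * * * * /usr/bin/php /var/www/vhosts/domain.com/httpdocs/path/to/script.php >> /tmp/cron.log 2>&1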

Related

AUTH (crontab command not allowed) - Bitnami LAMP stack on CentOS

I'm trying to set up a crontab to execute at set intervals. The crontab job is set up as part of my PHP-Slim application running on Apache. For some reason, it just doesn't add the job to the crontab, so when I run the command:
crontab -u daemon -l
it says 'no crontab for daemon' (daemon is the default Apache account). I did manage to get the cron job manually added using another account (and it executes with no further issues), so it's most likely a permissions issue. What is the best way to troubleshoot this without resorting to things like chmod 777? (It will be a production server, so I need to be careful about setting permissions and documenting them.)
Managed to find the answer just after posting.
I looked in the log file for cron:
cat /var/log/cron
Lots of (daemon) AUTH (crontab command not allowed) error messages. Some further googling led me to look at /etc/cron.allow, which doesn't exist, but /etc/cron.deny does, and the daemon account was listed there. Problem solved.
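Concretely, the check and the fix come down to two commands (run as root; sed -i assumes the GNU sed that ships with CentOS):
grep daemon /etc/cron.deny
sed -i '/^daemon$/d' /etc/cron.deny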
By default we do not allow the user daemon to run crontab jobs. If you want that user to run crontab jobs, you would need to modify /etc/cron.deny and remove the daemon user from there.
Hope it helps.

'Invalid argument' error when using 'chown apache' on web server folder

On a Mac, in Terminal, when executing:
chown apache uploads/
I get the error:
chown: apache: Invalid argument
The folder is on a shared web server. I need to change the owner of the folder because otherwise my PHP script for creating simple text files will return a permission-denied error. Please don't suggest chmodding the folder to 777 (which does work), since almost all advice is against it.
Is it possible that the server doesn't run scripts as the user 'apache'? How can I find this out?
"Invalid argument" makes me think this directory is on an HFS+ volume with owners disabled; you won't be able to change it in that case. You may be able to switch owners on, although it's possible that requires reformatting.
(The advice to check /etc/passwd is wrong, or at least inaccurate, on OS X; you need dscl . list /Users.)
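One way to check, assuming the volume is mounted normally, is to look for the noowners flag in the mount output:
mount | grep noowners
If the volume the folder lives on is listed, ownership is being ignored there and chown will keep failing.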
There are two things you might want to check:
1) Is there a user called apache? Maybe it's httpd. You can search /etc/passwd (or whatever your platform uses to store user names; you didn't mention your operating system).
2) What user do scripts run as? You can check this by running a test script. For example:
#!/bin/bash
echo Content-Type: text/plain
echo
id -a
If you save this as test.cgi and put it in a CGI directory, you should be able to run it and get it to tell you what user it's running as.
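Remember to make it executable first, or the server will refuse to run it (path is illustrative):
chmod 755 /path/to/cgi-bin/test.cgi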

Allowing a PHP script to ssh, using sudo

I need to allow a PHP script on my local web server, to SSH to another machine to perform a specified task on some files. My httpd runs as _www with low permissions, so setting up direct passwordless SSH is difficult, not to say ill-advised.
The way I do it now is to have a minimal PHP script that sudo-exec's (as me) a shell script which is outside of the document root. The shell script in turn calls (as me) the PHP code that does the actual SSH work, and prints its output. Here's the code.
read_remote_files.php (The script I call from my browser):
exec('sudo -u me -n /home/me/run_php.sh /path/to/my_prog.php', $results);
print implode("\n", $results); // exec() fills $results with an array of output lines
/home/me/run_php.sh (Runs as me, calls whatever it's given):
php "$1" 2>&1
sudoers:
_www ALL = (me) NOPASSWD: /home/me/run_php.sh
This all works, as my_prog.php is called as me and can SSH as me. It seems it's not too insecure, since run_php.sh can't be called directly from a browser (it's outside the document root). The issue I'm having is that my_prog.php isn't called as an HTTP program, so it doesn't have access to the HTTP environment variables (DOCUMENT_ROOT etc.).
Two questions:
Am I making this too complicated?
Is there an easy way for my final script to get the HTTP variables?
Thanks!
Andy
Many systems do stuff like this using a (privileged) cron job that frequently checks for the existence of a file, a database record or some other resource, and then performs actions if it finds one (a sketch follows below).
The huge advantage of this is that there is no direct interaction between the PHP script and the privileged script at all. The PHP script leaves the instructions in a resource, the privileged script fetches it. As long as the instructions can't lead to the system getting compromised or damaged, it's definitely more secure than sudoing.
The disadvantage is that you can't push changes whenever you like; you have to wait until the cron job runs again. But maybe it's an option anyway?
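A minimal sketch of the privileged side, assuming the PHP script drops one-line instruction files into a spool directory (every name here is made up for illustration):
#!/bin/bash
# Runs from a privileged crontab, e.g. every minute.
SPOOL=/var/spool/myapp                     # assumed: writable by _www only
for job in "$SPOOL"/*.job; do
    [ -e "$job" ] || continue              # glob matched nothing
    target=$(head -n 1 "$job")             # validate this before acting on it!
    ssh me@remotehost "do_task '$target'"  # hypothetical remote command
    rm -f -- "$job"
done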
"I need to allow a PHP script on my local web server, to SSH to another machine to perform a specified task on some files."
I think that you are phrasing this in terms of a solution that you have difficulty getting to work, rather than as a requirement. Surely what you should be saying is "I want to invoke a task on machine B from a PHP script running under Apache on machine A", and then research solutions to this, of which there are many, from a simple 'roll-your-own' RPC tunnelled over HTTP(S) to using an XML-RPC or SOA framework.
Two caveats:
Do a phpinfo(); on both machines to check what extensions are available, and
also check your php.ini settings to make sure that your service provider hasn't disabled any functions that you expect to use (or do a Q&D script to echo 'disable_functions = ' . ini_get('disable_functions') . "\n"; ...)
If you browse here and the wider internet you'll find many examples. Here is one that I use for a similar purpose.
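As a footnote to Andy's second question (getting the HTTP variables through to the final script): since sudo scrubs the environment, one hedged option is to pass them explicitly as arguments, e.g. (an untested sketch based on the snippets above):
exec('sudo -u me -n /home/me/run_php.sh /path/to/my_prog.php ' . escapeshellarg($_SERVER['DOCUMENT_ROOT']), $results);
and in run_php.sh, re-export the value for the PHP process:
DOCUMENT_ROOT="$2" php "$1" 2>&1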

Cgi-bin scripts get run without a user?

I'm running a binary that requires a license key to reside in the user's home directory. I'm making a cgi script that calls upon this binary and everything is happy when I execute the script from the command line using sudo -u www-data binary. However, when I run the cgi script from the web, the binary can't find the license key.
The apache error log states:
License key "(null)/.key" not found., referer:
Does this mean that CGI scripts are executed without any user attached, for security reasons? And how can I make CGI scripts run as www-data so the binary knows to look in the appropriate home directory? Unfortunately, there is no command-line flag to specify the key location.
Take a look at suexec for Apache 2; with that, you'll be able to run CGI as a specified user.
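For example, with mod_suexec loaded, a per-vhost stanza along these lines (the user/group and surrounding config are assumptions, not your actual setup) makes Apache hand CGI execution to suexec:
<VirtualHost *:80>
    SuexecUserGroup www-data www-data
    # ... rest of the vhost config ...
</VirtualHost>
Be aware that suexec is strict: the script must live under its compiled-in document root and be owned by the target user, or it will refuse to run it.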

Python script runs via apache when permissions are 755 but gives Error 500 when 777?

I uploaded a basic python script to my shared hosting at Dreamhost, and changed the permissions to 777. It ran fine from the shell (via SSH) but would display a 'Server Error' when called from the browser.
In the error.log, the error was 'Premature end of script headers'.
I wrote to DreamHost, who (surprisingly quickly) replied by changing the permissions to 755, and the script started working properly in apache (I could see the output in the browser).
But this doesn't seem right - how can making the permissions more lenient break anything?
Allowing anyone to edit a CGI script means that it would be easy to insert a backdoor into the system. httpd is correctly refusing to run a suspect program.
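So the practical fix is simply to drop the group/world write bits; suexec-style checks on shared hosts refuse to run a script that anyone else can write to:
chmod 755 script.py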