Securing Apache and PHP-FPM

Given
ProxyPassMatch ^/(.*\.php(/.*)?)$ fcgi://127.0.0.1:9000/var/www/$1
how can one prevent malicious code execution when a fake image is uploaded to a folder and then requested as
http://www.foo.bar/uploads/malicious.jpg/fake.php
If I understand correctly, the request above will make Apache pass it to PHP-FPM, which will execute /uploads/malicious.jpg.
I know I could add an .htaccess file in the uploads folder that removes the ProxyPassMatch, but this is something my customers don't know about, and they could end up being compromised.

There's a new setting in PHP-FPM since PHP 5.3.9, 'security.limit_extensions', that limits which file extensions PHP-FPM will execute. The default is '.php', so 'malicious.jpg' would not be executed.
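For reference, a minimal sketch of the relevant pool setting (the pool file path varies by distribution, so /etc/php-fpm.d/www.conf is an assumption here):

```ini
; /etc/php-fpm.d/www.conf (path is distribution-dependent)
; Only files ending in .php will be executed by this pool.
; A request for /uploads/malicious.jpg/fake.php is refused, and
; PHP-FPM logs an "Access to the script ... has been denied" error.
security.limit_extensions = .php
```

Remember to restart PHP-FPM after changing the pool configuration.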

Related

Cannot remove Apache noindex page

CentOS 7.3.1611
Apache httpd-2.4.6-45.el7.centos.x86_64
I need to replace the default Apache noindex page ("Testing 123..") with a config page for a dev environment.
I tried deleting it but it seems to have permanently cached itself somewhere on the server and I can't get rid of it. I've deleted /etc/httpd/conf.d/welcome.conf as well as the entire /usr/share/httpd/noindex/ directory.
I've rebooted the server and verified it's not the client (same result on different client computers and browsers).
Is there some caching mechanism responsible for this? How do I clear it?
Attempting to change Apache's noindex processing is not a good idea. A better approach is to redirect requests for "/" with a LocationMatch block in httpd.conf:
<LocationMatch "^/$">
Redirect "/" "/config.php"
</LocationMatch>
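If the extra LocationMatch wrapper feels redundant, RedirectMatch from mod_alias achieves the same in a single line (a sketch, using the same /config.php target):

```apache
# Redirect only the bare site root to the config page
RedirectMatch "^/$" "/config.php"
```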

mod_wsgi and static pages (no django)

On page: http://code.google.com/p/modwsgi/wiki/FileWrapperExtension , Graham Dumpleton recommends the following:
"Do note however that for the best performance, static files should
always be served by a web server. In the case of mod_wsgi this means
by Apache itself rather than mod_wsgi or the WSGI application."
I'd like to pre-build a large number of static pages, then have a Python program (running under apache/mod_wsgi 3.3/python3.1, daemon mode, no django involved) decide which of them to serve to each user. I'd like the Python program to decide, for example, that this user needs "12345.html", and have it tell Apache "please serve static file '12345.html' to this user", rather than having to open the file in Python, read the contents into a string, and return it to mod_wsgi as the response body.
Is this possible? If so, how?
If not, what's the best way to do this?
There are numerous ways one could do it:
1. X-Sendfile, implemented by mod_xsendfile and Apache.
2. Location/mod_rewrite tricks using mod_wsgi daemon mode.
3. X-Accel-Redirect, if also using nginx as a front end to Apache.
Read up on (1) and (3), as they are the more widely used options.
Update with instructions for (2):
Have the WSGI application return a 200 response with an empty body and a 'Location' response header containing the URL path of a local resource hosted on the same Apache server. When daemon mode is being used, mod_wsgi will trigger an internal redirect to that URL.
Thus if your Apache has:
Alias /generated-files/ /some/path/
<Directory /some/path>
Order allow,deny
Allow from all
</Directory>
then generate your file as /some/path/foo.txt in file system and then have the 'Location' response header have value '/generated-files/foo.txt' and it will be served up.
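A minimal WSGI sketch of that pattern (assuming the Alias above, and that /some/path/foo.txt exists on disk; the file name is just the example from the config):

```python
def application(environ, start_response):
    # Decide which pre-built file this user should get; 'foo.txt'
    # is the hypothetical example from the Alias block above.
    target = '/generated-files/foo.txt'
    # A 200 (not 3xx) status with a 'Location' header and empty body:
    # under mod_wsgi daemon mode this triggers an internal redirect,
    # so Apache itself serves the static file.
    start_response('200 OK', [('Location', target)])
    return [b'']
```

The browser never sees the redirect; from the client's point of view, the original URL simply returned the file's contents.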
Note that anything under '/generated-files' is publicly accessible. If you didn't want this, and wanted it to be private so that it is only returned via the specific request that generated the 'Location' response header, you need to add mod_rewrite magic that blocks access to that URL except from an internally generated subrequest. From memory, that needs to be something like:
RewriteCond %{IS_SUBREQ} false
RewriteRule ^/generated-files/ - [F]

How do I get Apache mod_cache to cache?

I've gotten Apache mod_cache to work as intended on a Windows server running Apache 2.2, but I'm having trouble getting it running on a Linux cpanel server to which I have root access.
Here's what I know:
1) mod_cache and mod_disk_cache are both compiled into Apache (confirmed with "httpd -l")
2) My httpd.conf is configured like this
CacheRoot /home/accountname/apache-cacheroot
CacheEnable disk /
3) I've restarted Apache after all configuration changes
4) I know that section of the httpd.conf is being processed (I put some unrelated commands in there to debug.)
5) The file I request displays the current time via php, and it does not change on subsequent requests.
...it does not change on subsequent requests
It sounds like your caching is working. If it did change on every request, then the request is being served by PHP instead of Apache's cache.
Did you try enabling the modules with a2enmod cache and a2enmod disk_cache?
Do not forget to restart the server after doing this.
Depending on what you have your CacheRoot set to, you may need to change its permissions (e.g. to 777) so that Apache can write to it.
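One more thing worth knowing: mod_cache will normally refuse to store a dynamic response that carries no freshness information (no Expires or Last-Modified header), which is typical of PHP output. A sketch of directives that relax this, alongside the configuration from the question (the 300-second expiry is an arbitrary example value):

```apache
CacheRoot /home/accountname/apache-cacheroot
CacheEnable disk /
# Cache responses even when they lack a Last-Modified header,
# and treat responses without explicit expiry as fresh for 300 s.
CacheIgnoreNoLastMod On
CacheDefaultExpire 300
```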

PHP - a different open_basedir per each virtual host

I've come across this problem: I have a server running Apache and PHP. We have many virtual hosts, but we've noticed that a potentially malicious user could use his web space to browse other users' files (via a simple PHP script) and even system files; this can happen due to the PHP permissions.
A way to avoid it is to set the open_basedir variable in php.ini. This is very simple on a single-host system, but in the case of virtual hosts there would be one basedir per host.
How can I set this basedir per each user/host? Is there a way to let Apache inherit the PHP privileges from the owner of the PHP file that has been requested?
E.g.
/home/X_USER/index.php is owned by X_USER; when Apache reads index.php it checks its path and owner. Simply put, I'm looking for a way to have the system set the PHP basedir variable to that path.
Thanks in advance,
Lopoc
It is possible to set open_basedir on a per-directory basis using the php_admin_value Apache directive.
Example from the manual:
<Directory /docroot>
php_admin_value open_basedir /docroot
</Directory>
Re your comment: yes, external commands are not affected by open_basedir. When calling ls /, this is done with the rights of the user account PHP runs under (often named www or similar). As far as I know, it is not possible to extend open_basedir to external commands.
In that case, I don't think the kind of protection that you're looking for is possible in a normal Apache/PHP setup. The only thing that maybe comes close is running Apache in a chroot jail. I haven't done this myself so I can't say anything about it - you'd have to dig in and maybe ask a question specifically about that.
You can set many php.ini settings using the Apache configuration file.
See these related pages from the PHP manual:
- http://php.net/manual/en/configuration.changes.php
- http://www.php.net/manual/en/ini.core.php#ini.sect.path-directory
- http://www.php.net/manual/en/configuration.changes.modes.php
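Putting the pieces together, the setting can go straight into each virtual host block (a sketch; the ServerName and paths are placeholders, and note the trailing slash so that /home/X_USER2 is not also matched):

```apache
<VirtualHost *:80>
    ServerName x-user.example.com
    DocumentRoot /home/X_USER/public_html
    # php_admin_value cannot be overridden from .htaccess or
    # ini_set(); the trailing slash limits the prefix match.
    php_admin_value open_basedir /home/X_USER/public_html/
</VirtualHost>
```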
chroot is a good idea, and nowadays Docker is even more effective.
Also, open_basedir set to "/docroot" is not secure: you should end the value with a "/", or PHP can also access /docroot1.

How can I redirect requests to specific files above the site root?

I'm starting up a new web-site, and I'm having difficulties enforcing my desired file/folder organization:
For argument's sake, let's say that my website will be hosted at:
http://mywebsite.com/
I'd like (have set up) Apache's Virtual Host to map http://mywebsite.com/ to the /fileserver/mywebsite_com/www folder.
The problem arises when I've decided that I'd like to put a few files (favicon.ico and robots.txt) into a folder that is ABOVE the /www that Apache is mounting http://mywebsite.com/ into:
robots.txt + favicon.ico go into => /fileserver/files/mywebsite_com/stuff
So, when people go to http://mywebsite.com/robots.txt, Apache would serve them the file from /fileserver/files/mywebsite_com/stuff/robots.txt.
I've tried to setup a redirection via mod_rewrite, but alas:
RewriteRule ^(robots\.txt|favicon\.ico)$ ../stuff/$1 [L]
did me no good, because basically I was telling Apache to serve something that is above its mounted root.
Is it somehow possible to achieve the desired functionality by setting up Apache's (2.2.9) Virtual Hosts differently, or defining a RewriteMap of some kind that would rewrite the URLs in question not into other URLs, but into system file paths instead?
If not, what would be the preferred course of action for the desired organization (if any)?
I know that I can access the aforementioned files via PHP and then stream them, say with readfile(), but I'd like Apache to do as much of the work as possible; it's bound to be faster than doing I/O through PHP.
Thanks a lot, this has deprived me of hours of constructive work already. Not to mention poor Apache getting restarted every few minutes. Think of the poor Apache :)
It seems you are set on using a RewriteRule. However, I suggest you use an Alias:
Alias /robots.txt /fileserver/files/mywebsite_com/stuff/robots.txt
Additionally, you will have to tell Apache to allow access to that file. If you have more than one file treated this way, do it for the complete directory:
<Directory /fileserver/files/mywebsite_com/stuff>
Order allow,deny
Allow from all
</Directory>
Can you use symlinks?
ln -s /fileserver/files/mywebsite_com/stuff/robots.txt /fileserver/files/mywebsite_com/stuff/favicon.ico /fileserver/mywebsite_com/www/
(ln works like cp, but with -s it creates symlinks instead of copies.)
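One caveat with the symlink route: Apache will only follow the links if FollowSymLinks is enabled for the document root. A sketch using the paths above:

```apache
<Directory /fileserver/mywebsite_com/www>
    Options +FollowSymLinks
</Directory>
```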