I've gotten Apache mod_cache to work as intended on a Windows server running Apache 2.2, but I'm having trouble getting it running on a Linux cPanel server to which I have root access.
Here's what I know:
1) mod_cache and mod_disk_cache are both compiled into Apache (confirmed with "httpd -l")
2) My httpd.conf is configured like this:
CacheRoot /home/accountname/apache-cacheroot
CacheEnable disk /
3) I've restarted Apache after all configuration changes
4) I know that section of the httpd.conf is being processed (I put some unrelated commands in there to debug.)
5) The file I request displays the current time via PHP, and it does not change on subsequent requests.
...it does not change on subsequent requests
It sounds like your caching is working. If it changed on every request, that would mean the request was being served by PHP instead of by Apache's cache.
Did you try enabling the modules with a2enmod cache and a2enmod disk_cache?
Do not forget to restart the server after doing this.
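For reference, on a Debian/Ubuntu-style layout the full sequence would look something like this (a sketch; since your httpd -l output shows the modules compiled in statically, a cPanel build may not have the a2enmod helper at all):

a2enmod cache
a2enmod disk_cache
service apache2 restart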
Depending on what you have your CacheRoot set to, you may need to change the permissions on that directory (for example, to 777).
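A minimal sketch of that permission fix, assuming cPanel's Apache runs as the common nobody user (adjust the user/group and path to your environment):

mkdir -p /home/accountname/apache-cacheroot
chown -R nobody:nobody /home/accountname/apache-cacheroot
chmod -R 700 /home/accountname/apache-cacheroot
# or the blunt approach mentioned above:
# chmod -R 777 /home/accountname/apache-cacheroot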
Let's assume I have multiple domains pointing to one server.
I know that I can have multiple virtual hosts in Apache, for example, but every time I want to add a new website I have to change the configuration and restart the server.
I am looking to host multiple domain names without having to create a config file each time.
Why? Because after creating a config file, I then have to restart the HTTP server, which means that each domain I add will block all the other domains for a period of time.
Basically, I want a config or program that dynamically points each domain to a sub-folder of my main source code, without having to create a config file or restart the HTTP server.
Please let me know if this is doable with current HTTP servers, or if not, point me to some resources that will help me do this programmatically.
With most standard web servers you simply cannot do this without a restart, or at least a reload of the service (so it picks up the new configs).
Of course, you could build your own solution based on your requirements.
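One partial exception worth naming: Apache's mod_vhost_alias can map the Host header onto a directory without per-domain config files. A minimal sketch, assuming the module is available and that /home/admin/sites/<domain> is the layout you want (%0 expands to the full hostname of the request; the LoadModule path varies by distro):

LoadModule vhost_alias_module modules/mod_vhost_alias.so
<VirtualHost *:80>
    UseCanonicalName Off
    VirtualDocumentRoot /home/admin/sites/%0
</VirtualHost>

Any new domain pointed at the server then just needs a matching sub-folder; no config change or restart is required.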
You can reload the server without restarting it: a reload keeps active connections up, so configuration changes can be loaded without downtime.
Command for nginx: nginx -s reload
I suggest using nginx -t && nginx -s reload in order to check the configuration before reloading.
Command for Apache: apachectl -k graceful, systemctl reload httpd.service, service apache2 reload, or service httpd reload (it depends on your environment)
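To mirror the nginx -t && nginx -s reload pattern on Apache, you can chain a config test with a graceful reload:

apachectl configtest && apachectl -k graceful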
CentOS 7.3.1611
Apache httpd-2.4.6-45.el7.centos.x86_64
I need to replace the default Apache noindex page ("Testing 123..") with a config page for a dev environment.
I tried deleting it but it seems to have permanently cached itself somewhere on the server and I can't get rid of it. I've deleted /etc/httpd/conf.d/welcome.conf as well as the entire /usr/share/httpd/noindex/ directory.
I've rebooted the server and verified it's not the client (same result on different client computers and browsers).
Is there some caching mechanism responsible for this? How do I clear it?
Attempting to change Apache's noindex processing is not a good idea. A better way is to redirect requests for "/" with a LocationMatch block in httpd.conf:
<LocationMatch "^/$">
Redirect "/" "/config.php"
</LocationMatch>
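If the LocationMatch wrapper feels indirect, mod_alias's RedirectMatch should express the same thing in a single line (a sketch, not tested against your setup):

RedirectMatch "^/$" "/config.php"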
Given
ProxyPassMatch ^/(.*\.php(/.*)?)$ fcgi://127.0.0.1:9000/var/www/$1
how can one prevent malicious code execution when a fake image is uploaded to a folder and then requested via
http://www.foo.bar/uploads/malicious.jpg/fake.php
If I understand correctly, Apache will pass the request above to PHP-FPM, which (with the default cgi.fix_pathinfo behaviour) will execute /uploads/malicious.jpg as PHP.
I know I could add an .htaccess file in the uploads folder that removes the ProxyPassMatch, but my customers won't know to do that, and they could end up being compromised.
There's a setting in php-fpm since PHP 5.3.9, security.limit_extensions, that limits which file extensions php-fpm will execute. The default is .php, so malicious.jpg would not be executed.
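A sketch of where that lives, assuming a typical pool file such as /etc/php-fpm.d/www.conf (the path varies by distro):

; only execute files whose extension is .php
security.limit_extensions = .php

Restart (or reload) php-fpm afterwards so the pool picks up the new value.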
I can download 1+ GB files from my server with no problem. Today I uploaded a 2.3 GB file and discovered that I can't download it: the download gets interrupted after 1 GB, and it happens for every user. I tried Firefox 15, the latest Chrome, and IE9.
Server:
Apache 2.2.22
fcgi
eAccelerator
CPnginx
How can I resolve this?
I think I just faced the same problem; hopefully this might help others.
My server is set up to proxy requests to Apache through Nginx using the ngx_http_proxy_module. The proxy_max_temp_file_size directive of this module defaults to 1024m and is what was causing my problem.
Try adding the line proxy_max_temp_file_size 5120m; within an http, server or location block in your nginx.conf file, and reload nginx's config with /etc/init.d/nginx reload.
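For example, in nginx.conf (a sketch; 5120m is an arbitrary value comfortably above the 2.3 GB file):

http {
    # existing settings unchanged
    proxy_max_temp_file_size 5120m;
}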
Looks like the default max download size Apache sets is around 2 GB, and you may be able to override it by tweaking LimitRequestBody in your httpd.conf (note, though, that LimitRequestBody strictly limits request bodies, i.e. uploads). I'm not sure why it would stop at 1 GB, which makes me think the problem is something else. Hope this helps.
I want Apache and Nginx to serve the same directory:
nginx: root /home/admin/tv;
Apache: DocumentRoot /home/admin/tv
I set the same directory for both, but when I go to the Nginx address (ip:777) I get 403 Forbidden (nginx/0.8.54).
I finally changed the permissions and it works.
There's no reason you can't run one of each. They just have to bind to different sockets. The 403 error is because you configured permissions incorrectly.
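A sketch of what that looks like on the nginx side, assuming Apache keeps port 80 and nginx takes port 777 as in the question:

server {
    listen 777;
    root /home/admin/tv;
    index index.html;
}

Also make sure the nginx worker user can read /home/admin/tv; an unreadable root is a common cause of that 403.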
I was playing with this earlier; I have Apache and Nginx and am testing both on my server. You should be able to use the same directory with existing sites as long as you make the changes in the virtual hosts of both to reflect your chosen directory. I don't think you can run both on the same port at the same time, though.