Apache, FastCGI - Error 503

It was pointed out to me that my server was insecure, because anyone could read any file through PHP, even files outside their own DocumentRoot.
I decided to reconfigure the whole Apache setup and all the virtual host configuration files, and I installed mod FastCGI. I got it running quite nicely, and I don't have problems with permissions anymore.
But there seems to be another problem. When I try to load more than three sites (on different VirtualHosts, i.e. different users in different processes), the page loads for a while and then fails with "Error 503: Service Temporarily Unavailable".
I tried increasing the PHP_FCGI_CHILDREN variable from its default of 0 to higher values for all VirtualHosts, but without luck. I also tried setting KeepAlive to Off in my apache2.conf, as suggested in a tutorial, but nothing seems to help.
Does anyone know how to resolve this issue? (My Apache error.log and suexec.log are both empty.)

Found the answer to my own question: there were a few more settings I had overlooked.
# Upper limit on the total number of FastCGI processes
FcgidMaxProcesses 15
# Fix PATH_INFO handling for PHP (like cgi.fix_pathinfo in php.ini)
FcgidFixPathinfo 1
# 0 = idle processes are never terminated for age
FcgidProcessLifeTime 0
# How fast the process spawn score decays per second
FcgidTimeScore 3
# Scan for exited (zombie) processes every 10 seconds
FcgidZombieScanInterval 10
# 0 = a process may serve an unlimited number of requests
FcgidMaxRequestsPerProcess 0
# Maximum request body size: 32 MB
FcgidMaxRequestLen 33554432
# Wait up to 60 seconds for each read from / write to the application
FcgidIOTimeout 60
Those are my settings in apache.conf, and everything works as expected.
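For context, global settings like these usually pair with a per-VirtualHost FastCGI setup. The sketch below shows the general shape under suexec; the paths, user/group names, and wrapper script are assumptions for illustration, not taken from the original post.

<VirtualHost *:80>
    ServerName site1.example.com
    DocumentRoot /var/www/site1
    # Run this vhost's PHP processes as their own user via suexec
    SuexecUserGroup site1user site1group
    <Directory /var/www/site1>
        Options +ExecCGI
        AddHandler fcgid-script .php
        # Hypothetical wrapper script that exports PHP_FCGI_CHILDREN and execs php-cgi
        FcgidWrapper /var/www/site1/cgi-bin/php-wrapper .php
        # Apache 2.4 syntax; on 2.2 this would be "Order allow,deny" plus "Allow from all"
        Require all granted
    </Directory>
</VirtualHost>

Note that suexec additionally requires the wrapper to live under its compiled-in document root and be owned by the target user and group, which is often where permission problems resurface.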

Related

Apache2 will only serve small files

I just know this is one of those questions where the solution should have been obvious and I'm going to feel like an idiot, but I'm stuck.
I've set up a basic Apache2 web server under openSUSE Leap 15.1 on my LAN with a single virtual host (for the moment).
If I create a simple HTML file of 255 bytes or less, a browser on another workstation picks it up without problem. If the file is anything larger than 255 bytes, Apache doesn't serve it. The GET request shows up on the server, but nothing shows in Firefox; Konqueror at least gives me a "1b Object not found" error.
I should say that running a browser on the server itself shows all these files perfectly well, whether I use 127.0.0.1, localhost, or the server name in the URL.
Is this an Apache2 directive I've missed or something else entirely?
After setting LogLevel to trace1 and then hunting around on the web, I came across what appears to be the solution: set the Apache directive ...
EnableSendfile Off
Apparently Apache is compiled with this set to On, and the kernel's sendfile doesn't do the job. Not a detailed explanation, I know, but I haven't followed this all the way through; I just needed to get Apache working!
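For anyone else who hits this: the directive is accepted in the global server config as well as in virtual-host and directory context, so a minimal fix looks like either of these (the paths assume openSUSE's layout):

# Global scope, e.g. in /etc/apache2/httpd.conf
EnableSendfile Off

# Or scoped to a single directory tree
<Directory "/srv/www/htdocs">
    EnableSendfile Off
</Directory>

The Apache docs note that kernel sendfile misbehaves mainly with network-mounted DocumentRoots (e.g. NFS or SMB), which is the usual reason to turn it off.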

mod_evasive not working on Apache 2.4.6

I am trying to configure mod_evasive for Apache 2.4.6 on CentOS release 7.5.1804. I have a clean install of CentOS with a clean install of Apache that isn't serving any real pages (just an example index.html saying hello world), and I installed mod_evasive using this tutorial: https://www.digitalocean.com/community/tutorials/how-to-protect-against-dos-and-ddos-with-mod_evasive-for-apache-on-centos-7
Everything works fine until I run the testing script, which should send requests to the server and get 403 errors because of mod_evasive. Instead, I am getting 400 Bad Request.
I switched from firewalld to iptables, and I have port 80 open; in fact, the example page works fine from a browser. Also, SELinux is set to permissive mode.
Any suggestions?
You will most likely have to change the Perl test script (usually found at /usr/share/doc/libapache2-mod-evasive/examples/test.pl) to make this work, e.g.
Original line:
print $SOCKET "GET /?$_ HTTP/1.0\n\n";
Re-worked line:
print $SOCKET "GET /?$_ HTTP/1.0\r\nHost: 127.0.0.1\r\n\r\n";
From https://centosfaq.org/centos/apache-mod_evasive-problem-with-testpl/ — the stock script sends a bare-bones HTTP/1.0 request terminated with bare \n; adding proper \r\n line endings and a Host header satisfies the stricter request parsing in newer Apache versions, which otherwise answers 400 Bad Request before mod_evasive ever sees the request.
The issue was not with mod_evasive or its configuration. In my case I had to tweak the configuration of mpm_prefork_module as below to get mod_evasive to work:
# Spawn a fixed pool of 10 server processes at startup
StartServers 10
# min = max spare keeps the pool at exactly that size
MinSpareServers 10
MaxSpareServers 10
# Upper bound on simultaneous request workers
MaxRequestWorkers 80
# 0 = child processes are never recycled
MaxConnectionsPerChild 0
Basically, fix the number of server processes to a constant by setting StartServers = MinSpareServers = MaxSpareServers = {your_magic_number}, and set MaxConnectionsPerChild = 0 so that no new server processes are spawned and none are recycled, letting each child serve an unlimited number of requests over its lifetime.
Hope this saves your day!
After a few days, I found that there was an error in testing script provided with mod_evasive...
I corrected it and found that the installation was OK.
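For reference, the mod_evasive settings that tutorial ends up installing look roughly like the sketch below; the values are the commonly shipped defaults, not copied from the original poster's config:

<IfModule mod_evasive24.c>
    # Size of the hash table used to track client hits
    DOSHashTableSize 3097
    # 403 a client that asks for the same page more than 2 times per second
    DOSPageCount 2
    DOSPageInterval 1
    # 403 a client that asks for more than 50 objects site-wide per second
    DOSSiteCount 50
    DOSSiteInterval 1
    # How long, in seconds, a blocked client keeps receiving 403s
    DOSBlockingPeriod 10
</IfModule>

As far as I understand it, the prefork tuning above matters because mod_evasive keeps its hit counts per child process: if children are constantly spawned and recycled, the test script's requests are spread across fresh counters and the thresholds never trip.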

Apache returns 403 Forbidden after 1 minute

I have a Java EE application running on WildFly, and I use Apache as a proxy server in front of it.
One of my requests takes more than one minute to respond. When I call WildFly directly it returns fine, but if the request goes through Apache, I get 403 Forbidden after one minute. I have done some research but couldn't find a proper solution.
There is probably a configuration directive for this; as far as I can see, it is not in httpd.conf :).
Please help me solve this problem.
I have found the solution. The default ProxyTimeout is 60 seconds in Apache. With
ProxyTimeout 600
in the virtual-host configuration, I solved the problem.
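A minimal sketch of such a virtual host (mod_proxy and mod_proxy_http must be loaded; the hostname is hypothetical, and the backend address assumes WildFly's default HTTP port 8080):

<VirtualHost *:80>
    ServerName app.example.com
    ProxyPass / http://127.0.0.1:8080/
    ProxyPassReverse / http://127.0.0.1:8080/
    # Wait up to 10 minutes for the backend instead of the 60-second default
    ProxyTimeout 600
</VirtualHost>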

Apache - Domain not resolving: 404 errors

My CentOS 5.1 VPS (WHM) had to be hard rebooted earlier, and all the sites apart from one have come back online. The non-working domain pings with 0% packet loss, so it looks as though Apache is serving; however, I get the error:
Not Found
The requested URL / was not found on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
I checked the DNS settings and, as expected, they point to the host. I checked the permissions and chmod values on the public_html folder, and they are all correct. I restarted Apache and BIND, but I still can't get the domain to work. Incidentally, adding erroneous characters to .htaccess (in an attempt to force an error) does not make any difference. The host did a system file check, which showed no errors.
What have I missed?

How to configure server to allow large file downloads?

I can download 1+ GB files from my server with no problem. Today I uploaded a 2.3 GB file and discovered that I can't download it: the download gets interrupted after 1 GB, and it happens for every user. I tried Firefox 15, the latest Chrome, and IE9.
Server:
Apache 2.2.22
fcgi
eAccelerator
CPnginx
How can I resolve this?
I think I just faced the same problem; hopefully this might help others.
My server is set up to proxy requests to Apache through Nginx using the ngx_http_proxy_module. The proxy_max_temp_file_size directive of this module defaults to 1024m, which is what was causing my problem.
Try adding the line proxy_max_temp_file_size 5120m; within an http, server, or location block in your nginx.conf, then reload nginx's configuration with /etc/init.d/nginx reload.
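In context, the change is a one-liner in nginx.conf; in this sketch only the proxy_max_temp_file_size value comes from the answer above, the rest is scaffolding:

http {
    # Raise the cap on the temporary file nginx buffers proxied responses into;
    # the default of 1024m is what truncated downloads at about 1 GB
    proxy_max_temp_file_size 5120m;
}

Setting the directive to 0 disables buffering of responses to temporary files altogether, which is another way around the limit if disk buffering isn't wanted.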
It looks like the default maximum download size Apache allows is around 2 GB, and you can override it by tweaking LimitRequestBody in your httpd.conf. I'm not sure why it would stop at 1 GB though; that makes me think the problem is something else. Hope this helps.