Apache 2.4.x IP blacklist

I'm looking for an easy way to blacklist IP addresses in Apache 2.4.x. My web site logs the IP addresses that attempt illegal operations to a text file. I would like Apache to use this text file to deny all access, across all vhosts, to the IPs it lists. What would be the easiest and least resource-consuming way to do this?
I found this, but it only covers 2.2; I'm not sure how it applies to 2.4.
Cheers.
Edit: this is a Windows x64 box running Apache x64.

@vastlysuperiorman called it right: CSF/LFD is the best at this. Unfortunately, it only runs on Linux.
This free utility promises to provide the same functionality: dynamically monitor access attempts and auto-block IP addresses, with a command to unblock in case of false positives. Certainly worth a shot.
An alternative could be to create a VM (if your platform supports virtualization), deploy a very small-spec Linux box, and use that as a proxy. This should be easy to implement. BTW, why not just use Linux? .. :-)
(This should have been a comment on @vastlysuperiorman's post, but I don't have enough SO rep to comment on others' posts.)
Edited to suggest a possible Apache 2.4 based solution:
To translate the ACL directives between Apache 2.2 and 2.4:
2.2 Syntax
Order Allow,Deny
Allow from all
Include conf/IPList.conf
# (in the 2.2 form, conf/IPList.conf holds "Deny from x.x.x.x" lines)
2.4 Syntax
DocumentRoot /some/local/dir
<Directory /some/local/dir/>
<RequireAll>
Require all granted
Include conf/IPList.conf
</RequireAll>
</Directory>
# This will also work
<Location />
<RequireAll>
Require all granted
Include conf/IPList.conf
</RequireAll>
</Location>
# conf/IPList.conf is actually /etc/apache2/conf/IPList.conf
# (i.e., paths are relative to where Apache is installed;
# you can also use the full path to the list.)
And inside conf/IPList.conf, you will have one entry per line, like the following:
Require not ip 10.10.1.23
Require not ip 192.168.22.199
Require not ip 10.20.70.100
Using mod_rewrite and a list of IPs for banning
For a redirect-to-another-page to work, you need to keep the RewriteRule outside the base URL you are guarding.
For instance, the redirect would not work under a Directory directive on DocumentRoot or a Location directive on '/', because the ban affects the status page we want to display.
So, best to keep this outside a Directory or Location directive, or link to a status page on another unprotected web server.
# Required set of rewrite rules
RewriteEngine on
RewriteMap hosts-deny txt:/etc/apache2/banned-hosts
RewriteCond ${hosts-deny:%{REMOTE_ADDR}|NOT-FOUND} !=NOT-FOUND [OR]
RewriteCond ${hosts-deny:%{REMOTE_HOST}|NOT-FOUND} !=NOT-FOUND
RewriteRule ^ /why-am-i-banned.html
## Inside our banned hosts file, we have:
## /etc/apache2/banned-hosts (maintain the format .. it's not just a plain text file)
##
193.102.180.41 -
192.168.111.45 -
www.example.com -
www.sumwia.net -
# Inside our status page; it could be HTML as below, or a plain text file with a '.txt' extension
#/var/www/html/why-am-i-banned.html
#
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>Why is my IP banned?</title>
</head>
<body>
<h2>Why is my IP address banned?</h2>
<p>
To manage spammers and for other security needs, our server automatically blocks
suspicious IP addresses. If, however, you believe your IP address has been blocked
wrongly, please contact us.
</p>
</body>
</html>
And of course, you can parse your log files and populate conf/IPList.conf or /etc/apache2/banned-hosts as appropriate; a sketch of that last step follows.
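Assuming a Unix-like layout and that the application writes one offending IP as the first field of each line of its own log (the path /var/log/myapp/bad-ips.log is invented for illustration), something like this would rebuild the list; the Include file is only read when the configuration is loaded, so reload Apache afterwards:
# hypothetical log location; adjust to wherever your site records offending IPs
awk '{ print "Require not ip", $1 }' /var/log/myapp/bad-ips.log | sort -u > /etc/apache2/conf/IPList.conf
apachectl graceful   # pick up the new list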
As a short-term solution
An alternative that will let you keep the 2.2 syntax is to load the mod_access_compat module and continue using your deprecated 2.2-style Order/Deny/Allow directives. But this is only advisable as a short-term solution, since that module exists purely to aid the transition and will probably go away in future versions of Apache 2.4.
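On a stock 2.4 build the module ships with the server, so enabling it is usually just a matter of uncommenting (or adding) its LoadModule line in httpd.conf; the module path shown is the conventional layout, adjust to yours:
LoadModule access_compat_module modules/mod_access_compat.so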

I too have not seen a good alternative for blocking access dynamically from within Apache itself. There are "hacky" ways: you could set an environment variable that flags banned IPs and then test it in an expression together with %{REMOTE_ADDR} and the env function, but that's a stretch. Details are in the Expression Parser documentation.
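For illustration only, here is a rough sketch of that kind of hack, swapping the expression parser for mod_setenvif plus the env authorization provider (the variable name banned and the sample address are invented):
SetEnvIf Remote_Addr "^192\.168\.22\.199$" banned
<Location />
<RequireAll>
Require all granted
Require not env banned
</RequireAll>
</Location>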
However, I have used several lightweight modules that are helpful in protecting your Apache server.
ConfigServer Firewall (CSF/LFD) is a great solution for Linux systems. It provides a simple method for managing iptables, and can be set up to do brute-force detection and blocking. Info here
EDIT:
Add the following line to /etc/csf/csf.deny to include your custom IP block list:
Include /var/www/example.deny
Alternatively, update your script to append IP addresses to csf.deny, either directly:
echo $badIP >> /etc/csf/csf.deny
or using the CSF command line option (preferred):
csf -d 10.20.30.40
CSF readme here
mod_security is one of my favorite Apache/nginx modules. It detects dangerous GET and POST requests and blocks access accordingly. When set up properly, it will trigger CSF to block the IP addresses that frequently violate rules. Details here
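As a flavour of what a rule looks like in ModSecurity 2.x (a deliberately simplistic sketch, not a production rule set; the rule id is arbitrary):
SecRuleEngine On
SecRequestBodyAccess On
# deny any request whose GET/POST parameters contain "<script"
SecRule ARGS "@contains <script" "id:900100,phase:2,deny,status:403"
In practice you would normally start from a maintained rule set such as the OWASP Core Rule Set rather than hand-written rules like this.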

Related

How to block cross frame scripting in Apache for SVN

I have SVN configured through Apache 2.4.18 on Linux 6.6. Next I have to disable cross frame scripting for my SVN URL. The SVN URL is like https://servername/svn/projectA. I have compiled mod_security2.so, copied it to the /modules directory, and loaded it, and then in the VirtualHost I have the lines below.
LoadFile /usr/lib64/libxml2.so
LoadFile /usr/lib64/liblua-5.1.so
LoadModule security2_module modules/mod_security2.so
httpd-vhosts.conf
<VirtualHost *:80>
ServerAdmin email@domain.com
DocumentRoot "/var/local/apache/httpd2.4.18/htdocs"
ServerName servername.fqdn.com
# For http to https redirect
Redirect / https://servername
TraceEnable off
RewriteEngine on
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK)
RewriteRule .* - [F]
SecRuleEngine On
#SecFilterEngine On
#SecFilterForceByteRange 32 126
#SecFilterScanPOST On
#SecFilter "<( |\n)*script"
SecRequestBodyAccess On
SecResponseBodyAccess On
ErrorLog "logs/error_log"
CustomLog "logs/access_log" common
</VirtualHost>
The directives that Apache did not accept are:
SecFilterEngine
SecFilterForceByteRange
SecFilterScanPOST
SecFilter
Instead of SecFilterEngine, it accepts SecRuleEngine, but I do not know the replacements for the other directives. I am using ModSecurity 2.9.0 compiled from source. The error I see is below. [root@server extra]# /var/local/apache/httpd2.4.18/bin/apachectl configtest
AH00526: Syntax error on line 45 of /var/local/apache/httpd2.4.18/conf/extra/httpd-vhosts.conf:
Invalid command 'SecFilterForceByteRange', perhaps misspelled or defined by a module not included in the server configuration. Does anyone know the mod_security2 equivalents of SecFilterForceByteRange, SecFilterScanPOST and SecFilter? I also read the mod_security documentation but could not figure out how to solve the issue. I followed the URL below.
http://www.unixpearls.com/how-to-block-xss-vulnerability-cross-site-scripting-in-apache-2-2-x/
[EDIT]
It's solved by adding the response header (X-Frame-Options).
All those unsupported commands are ModSecurity v1 commands and have been completely rewritten for ModSecurity2.
The rule you would want would be something like this:
SecRule ARGS "<( |\n)*script" "phase:2,id:1234,deny"
This basically scans any of your arguments (as parameters or the body) for items like this:
<script
or
< script
or
<
script
That's not a bad start to trying to protect against XSS, but it is a bit basic.
OWASP has a Core Rule Set of ModSecurity rules and their XSS rules are much more complex and can be seen here: https://github.com/SpiderLabs/owasp-modsecurity-crs/blob/master/base_rules/modsecurity_crs_41_xss_attacks.conf
XSS can be exploited in a number of ways, some of which will make it to your server (and which this sort of rule might catch) and some of which might not even make it to your server at all (and which this therefore cannot protect against).
The best way to protect against XSS is to look at Content Security Policy, which allows you to say explicitly which JavaScript you want to allow on your site and which you don't, and to deny inline scripts entirely if you want. This may require some clean-up of your site to remove inline scripts, and it is not always the easiest thing to set up, particularly if you load third-party assets and widgets, but it is the most robust protection.
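For instance, with mod_headers loaded, a policy can be set along these lines (the policy string here is only an illustrative starting point, not a recommendation for any particular site):
Header always set Content-Security-Policy "default-src 'self'; script-src 'self'"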
The X-Frame-Options header is useful to stop your site being framed, where someone overlays content to make you think you are clicking the real site's buttons and fields while actually clicking theirs. It's not really a form of XSS, since the attacker is putting scripting in an invisible window on top of your site rather than directly on your site, but it can have similar effects. It's a good header to use.
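Again assuming mod_headers is loaded, setting it is a one-liner, which is presumably the response header the edit above refers to:
Header always set X-Frame-Options "SAMEORIGIN"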

How to change the LimitRequestFieldSize in Apache 2.4.2

I'm working with Apache 2.4.2 and I need to change the LimitRequestFieldSize. Supposedly (according to some Google research) that can be done in the httpd.conf file, but I can't find LimitRequestFieldSize in httpd.conf or in any other file within Apache. Any idea how I can do it?
In the end I solved it by simply adding LimitRequestFieldSize 500000 to the file httpd-default.conf.
What you just did is open the door to a DoS attack.
Take a look at the LimitRequestFieldSize directive in the Apache documentation:
Quoting from that source:
This directive specifies the number of bytes that will be allowed in an HTTP request header.
The LimitRequestFieldSize directive allows the server administrator to set the limit on the allowed size of an HTTP request header field. A server needs this value to be large enough to hold any one header field from a normal client request. The size of a normal request header field will vary greatly among different client implementations, often depending upon the extent to which a user has configured their browser to support detailed content negotiation. SPNEGO authentication headers can be up to 12392 bytes.
This directive gives the server administrator greater control over abnormal client request behavior, which may be useful for avoiding some forms of denial-of-service attacks.
The documentation also specifies that the context of that directive is server config (which means server-wide) and virtual host (you can apply this directive on a per-vhost basis).
In addition, you do not mention what your OS is. In case it's Linux (which I'm more familiar with):
The configuration file, httpd.conf, is found in /etc/httpd/conf/httpd.conf (RHEL, CentOS, Fedora, Scientific Linux).
In Debian, and derivatives like Ubuntu (I don't think that is the case here, but I am mentioning it anyway just for the record), the configuration file is apache2.conf and can be found in /etc/apache2/apache2.conf.
Hope it helps.
And last but not least, you may want to check out the Unix & Linux Q&A here on Stack Exchange for questions like this (assuming Linux or another *nix OS). You may have better luck getting an answer there.
This issue can be solved by updating the LimitRequestFieldSize directive either in the Apache httpd.conf or in the virtual hosts.
How to add the directive in a virtual host:
<VirtualHost 10.10.50.50:80>
ServerName www.mysite.com
LimitRequestFieldSize 16384
RewriteEngine On
...
...
</VirtualHost>
How to add it in httpd.conf, which is at apache2/conf/httpd.conf:
LimitRequestFieldSize 16384
But even after doing this I am still getting a bad request error.

Apache always gets 403 permission error after changing DocumentRoot

I'm just an Apache newbie. I just installed Apache 2.2 on the FreeBSD box at my home office. The instructions in the FreeBSD documentation say that I can change the DocumentRoot directive in order to serve data from a custom directory. Therefore, I replaced...
/usr/local/www/apache22/data
with
/usr/home/some_user/public_html
but something is not right. There's an index.html file inside the directory, but it seems that Apache cannot read the directory/file.
Forbidden
You don't have permission to access / on this server.
The permission of
public_html
is
drwxr-xr-x
I wonder what could be wrong here. Also, in my case I am not going to host more than one website on this FreeBSD box, so I didn't look at using VirtualHost at all. Is it good practice just to change the DocumentRoot directive?
Somewhere in the apache config is a line like:
# This should be changed to whatever you set DocumentRoot to.
#
<Directory "/usr/local/www/apache22/data">
You must change this path too, to make it work. This directive block contains, for example:
Order allow,deny
Allow from all
which gives users initial access to the directory.
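Putting it together, a minimal sketch of what the relevant part of the 2.2 httpd.conf might look like after the change (the Options/AllowOverride values are just typical defaults, adjust to taste):
DocumentRoot "/usr/home/some_user/public_html"
<Directory "/usr/home/some_user/public_html">
Options Indexes FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
Note also that the Apache user needs execute (search) permission on every parent directory of the new DocumentRoot (/usr, /usr/home, /usr/home/some_user), or you will still get a 403.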
One possibility that comes to mind is SELinux blocking the web process from accessing that folder. If this is the case, you would see it in the SELinux log. You would have to check the context of your original web root with:
ls -Zl
and then apply it to your new web folder:
chcon whatevercontextyousaw public_html
Or, instead, if it's not a production server that requires security (like a development machine behind a firewall), you might want to just turn SELinux off.
Just one idea. Could be a number of other things.

Can you disable apache logs for a single site using htaccess or in the Virtual Host settings?

I'm working on a web site where the client doesn't want ANY logging on the site for privacy reasons. The site will be hosted on the same Apache web server as a number of other websites, which is why I can't just turn logging off in Apache. Is there some way to disable logging for an individual site using .htaccess rules or by adding something to the VirtualHost settings?
The options seem to be:
Sending the logs to /dev/null on *nix, or to the NUL device on Windows (see here)
Removing the base logging directives and duplicating them in each vhost (so vhosts have no logging by default)
It seems like there should be a better way to do this, but that's what I've found; a sketch of the first option is below.
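Something like this, with an invented ServerName and DocumentRoot (on Windows the log targets would be the NUL device rather than /dev/null):
<VirtualHost *:80>
ServerName private.example.com
DocumentRoot /var/www/private
# discard both logs for this vhost only
ErrorLog /dev/null
CustomLog /dev/null combined
</VirtualHost>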
Yes, just comment out (using a '#') the ErrorLog and CustomLog entries in the httpd conf for your virtual host.
http://www.mydigitallife.info/how-to-disable-and-turn-off-apache-httpd-access-and-error-log/
I achieve this by making the logging dependent on a non-existent environment variable. So in the VirtualHost you can have:
CustomLog /var/log/httpd/my_access_log combined env=DISABLED
and as long as there is no environment variable called DISABLED, you'll get no logs.
I actually arrived here looking for a neater solution but this works without having to change the global httpd.conf.
Edit: removed the reference to .htaccess because CustomLog only applies in the global config or in the virtual host config, as pointed out by @Basj.

How can I redirect requests to specific files above the site root?

I'm starting up a new web-site, and I'm having difficulties enforcing my desired file/folder organization:
For argument's sake, let's say that my website will be hosted at:
http://mywebsite.com/
I'd like (have set up) Apache's Virtual Host to map http://mywebsite.com/ to the /fileserver/mywebsite_com/www folder.
The problem arises because I've decided that I'd like to put a few files (favicon.ico and robots.txt) into a folder that is ABOVE the /www folder that Apache maps http://mywebsite.com/ onto:
robots.txt+favicon.ico go into => /fileserver/files/mywebsite_com/stuff
So, when people go to http://mywebsite.com/robots.txt, Apache would serve them the file from /fileserver/files/mywebsite_com/stuff/robots.txt
I've tried to setup a redirection via mod_rewrite, but alas:
RewriteRule ^(robots\.txt|favicon\.ico)$ ../stuff/$1 [L]
did me no good, because basically I was telling Apache to serve something that is above its mounted root.
Is it somehow possible to achieve the desired functionality by setting up Apache's (2.2.9) Virtual Hosts differently, or defining a RewriteMap of some kind that would rewrite the URLs in question not into other URLs, but into system file paths instead?
If not, what would be the preferred course of action for the desired organization (if any)?
I know that I can access the aforementioned files via PHP and then stream them - say with readfile(..) - but I'd like to have Apache do as much of the work as possible - it's bound to be faster than doing I/O through PHP.
Thanks a lot, this has deprived me of hours of constructive work already. Not to mention poor Apache getting restarted every few minutes. Think of the poor Apache :)
It seems you are set on using a RewriteRule. However, I suggest you use an Alias:
Alias /robots.txt /fileserver/files/mywebsite_com/stuff/robots.txt
Additionally, you will have to grant Apache access to that file. If you have more than one file treated this way, do it for the whole directory:
<Directory /fileserver/files/mywebsite_com/stuff>
Order allow,deny
Allow from all
</Directory>
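If favicon.ico lives in the same place, as the question suggests, a second Alias covers it:
Alias /favicon.ico /fileserver/files/mywebsite_com/stuff/favicon.ico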
Can you use symlinks?
ln -s /fileserver/files/mywebsite_com/stuff/robots.txt /fileserver/files/mywebsite_com/stuff/favicon.ico /fileserver/mywebsite_com/www/
(ln is like cp, but with -s it creates symlinks instead of copies.)
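Note that Apache will only follow the links if FollowSymLinks (or SymLinksIfOwnerMatch) is enabled for the document root, for example:
<Directory /fileserver/mywebsite_com/www>
Options +FollowSymLinks
</Directory>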