My Apache server has been overloading and I'm trying to troubleshoot why.
Looking through the Apache access log, I see tons of entries like this:
POST /?CtrlFunc_999999AAAAAAAAAAAAAAAAAAAAAAAA HTTP/1.0
POST /?CtrlFunc_ppppqqqqqqqrrrrrrrsssssssstttt HTTP/1.0
POST /?CtrlFunc_KOUZdilsx27BGKOSXbfkpv05AGKPTX HTTP/1.0
POST /?CtrlFunc_rrsssssstttttuuuuuvvvvvwwwwwwx HTTP/1.0
All from different IP addresses. It seems strange that all these different IPs would be sending sequential alphanumeric requests. Is this some type of encoding that I'm not familiar with? I couldn't find out anything about the:
?CtrlFunc
either. There are hundreds of entries like this coming from IP addresses in China, Taiwan, India, Ecuador, and Spain, to name a few.
Is this normal behavior? I'm just trying to track down why my Apache server gets overloaded every time I turn it on. Maybe there's a more efficient way to look at the server processes, but I haven't found it.
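For a quick look at who is hitting the server hardest, a one-liner over the access log can help (the log path here is an assumption; adjust it to your distribution):

```shell
# Count requests per client IP and show the heaviest hitters first.
# Field 1 of a standard combined/common log line is the client address.
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head
```

If a handful of addresses dominate the output, that is usually a good starting point for firewall rules.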
I've seen the same attack.
The people over at the Internet Storm Center do not have a method for stopping the flow, but have some suggestions on how to drop the incoming requests:
https://isc.sans.edu/forums/diary/Defending+Against+Web+Server+Denial+of+Service+Attacks/16240
Particularly Comment #2, from the ModSecurity team member, which seems to work well for me.
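I don't have the exact rule from that comment handy, but it was along these lines: match the distinctive query string and drop the connection at the earliest ModSecurity phase (the rule id and message below are placeholders of my own, not the official rule):

```apache
# Sketch: drop any request whose URI or query string contains the flood marker
SecRule REQUEST_URI|QUERY_STRING "@contains CtrlFunc_" \
    "id:1000001,phase:1,t:none,log,drop,msg:'CtrlFunc POST flood'"
```

Dropping the connection (rather than returning a 403) keeps Apache from spending time composing a response for each bot.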
I was in the middle of tailing my Apache error log when this massive chunk came through. I've never seen anything like it before. The IP maps to the RIPE Network Coordination Centre, with a PO box address.
link here
Is this anything I should dig into further? I couldn't find much about it when googling, other than that RIPE appears to be an ISP.
[Tue Mar 15 21:34:44.775251 2016] [core:error] [pid 22280] (36)File name too long: [client 93.113.125.12:44444] AH00036: access to /we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages failed (filesystem path 
'/var/www/html/we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages_we_are_looking_for_not_found_pages')
I just received this request as well from the same IP. I don't see many others talking about it, so I guess it's new.
I assume it is a bot looking for insecure common management pages, and the actual request text is sort of a joke to the webmaster viewing the logs.
The script making these requests is doing exactly what the request URL says. It is sending requests to huge lists of IP addresses, IP address ranges, or in some cases even every IP address on the internet. It records whether the address returns a status code, indicating that there is a web server running at that address.
It is unclear what will be done with this information. Maybe if they get a response they will put you on a list to be probed further. This is actually quite common.
These sorts of things are harmless unless you are using default passwords on some admin utility (WordPress, Drupal, phpMyAdmin, etc.). If you start getting bombarded with these requests, you may want to use a more advanced hardware or software firewall, audit your publicly open ports, and maybe start restricting IP ranges.
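As an example of restricting a range, Apache 2.4's Require directives can do this per directory or in .htaccess (the range below is purely illustrative; substitute whatever networks actually show up in your logs):

```apache
# Example only: deny one probing network range while allowing everyone else
<RequireAll>
    Require all granted
    Require not ip 93.113.125.0/24
</RequireAll>
```

For anything beyond a handful of ranges, though, a firewall-level block is cheaper than making Apache evaluate the rules on every request.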
http://who.is/whois-ip/ip-address/93.113.125.12
If this persists, you could always report the incident to the source's ISP.
After an Apache upgrade on my shared server, I have been having nightmare issues with form input on all of my reseller hosting accounts. Clients get a 403 (or, in the case of a WordPress install, a 404, which really confused me) after the simplest, most innocent-looking form input. For example, "he is having a lot of trouble" in a text field results in a 403!
It took almost two weeks to figure out what was going on, as the error seemed random and hard to replicate, but after I asked the clients for the exact text they were not able to enter, we got to the mod_security issue. The answer from tech support was: "While checking the issue in detail, we found that a mod_security rule was getting triggered on the server while trying to submit the content "he is having a lot of trouble". We have whitelisted the rule for the website, which resolved the issues."
My question is: how can I deal with this proactively? Is there a list of rules for mod_security that I can check, test some input against, ask for additional whitelisting, etc.? With about 100 accounts all having problems, it's enough to make me want to get out of the hosting business altogether.
I don't understand your scenario or your question. Are you managing the host or not?
It sounds like you are hosting sites on a shared server so do not have access to the full server but are setting hosts up for clients - is that right?
Running a WAF like ModSecurity requires monitoring log files to identify false positives like this. If you do not have access to the log files, then you need to ask your hosting provider what their options are for managing this sort of thing. Or will they do nothing until you raise it?
You can also ask to turn off ModSecurity completely. Most sites get on fine without a WAF - though personally I think they do add value and security.
Finally, as to what rules are running on your instance, only your tech support can answer that. ModSecurity itself is only an engine and comes with no rules. People can write their own, buy some, or use free sets of rules like the OWASP Core Rule Set. So how you can test this depends on what you have. Most rules are fairly generic in nature, so they do produce false positives unless tweaked.
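If you do get config-level access, the usual pattern is to disable a single offending rule for a single path rather than switching ModSecurity off globally. Something along these lines (the path and rule id below are placeholders; your host's audit log shows the real id that fired):

```apache
# Sketch: whitelist one rule for one URL instead of disabling the whole WAF
<LocationMatch "^/contact-form">
    SecRuleRemoveById 981245
</LocationMatch>
```

That keeps the rest of the rule set active for the site while silencing the one false positive.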
Background: we're running Drupal 6 on an Apache server. I've scoured the internet but can't seem to find anything on exactly what I'm looking for, maybe someone here has an idea.
As a website with a decent amount of traffic, we tend to get a lot of low-level attack attempts. Any time I look through logs, there's at least a handful of "page not found" errors from script kiddies and bots trying to access pages like wp-login.php or admin.php. Obviously these attacks never get very far, but serving up all the 404s still puts a sometimes significant load on our server.
These attempts are often quite amateurish. Generally, they all come from one IP address over a period of a few minutes. So I'm wondering if there's some way to implement temporary blocks by IP address for anyone who tries something that's obviously an attack. For example, maybe there's a way to configure .htaccess to say:
If (bot_IP tries to access wp_login.php, admin.php, administer/index.php, phpmyadmin.php and so on)
Deny from bot_IP for next four hours
Has anyone ever tried anything like this? It would be wonderful if we could reduce the amount of time we spend sending 404s to attackers, and it seems to me like a lot of people could find it useful.
Thanks!
If you do this in a Drupal module, you'll spend more resources checking logs and filtering requests than you do sending 404s. If you have root access to the server, http://www.fail2ban.org/wiki/index.php/Main_Page will work very well. It scans the Apache error logs, uses regex-based rules to match log entries, and updates the OS firewall rules to handle the blocking.
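As a rough idea of what that looks like, a jail.local entry using one of fail2ban's stock Apache filters might be (paths and thresholds are assumptions for a Debian-style layout; bantime is the four hours from the question):

```ini
# Sketch: ban IPs that trigger repeated script-probe errors in the Apache log
[apache-noscript]
enabled  = true
port     = http,https
filter   = apache-noscript
logpath  = /var/log/apache2/error.log
maxretry = 3
bantime  = 14400
```

Because the ban happens in the firewall, the banned bot's packets never reach Apache at all, which is exactly the load reduction you're after.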
I am looking for a solution which would redirect the externally facing http://mycompany.com/external/* to be redirected/proxied to http://internal-host:1234/internal/*
(the asterisk is used as a wildcard)
OK, I guess the sentence above is not enough, so here are the details:
In my intranet I have several servers, (names, addresses, ports, and context paths are obviously made-up for the sake of simplicity):
HRServer running at address 10.10.10.10:1010/hr
MailServer running at address 20.20.20.20:2020/mail
My system is accessible from internet only from ip 78.78.78.78, and the constraint here is that I can use only one port (e.g. 8080). In other words - whatever the solution of my problem is - the external address should start with 78.78.78.78:8080
What I need to do is to expose both the HR and Mail services through this port.
The first thing which came to my mind was to write two simple portlets (or an HTML with two frames) and to embed them in a simple web page at 78.78.78.78:8080/
But obviously this will not work, as the portlets will redirect the browser to e.g 10.10.10.10:1010/hr which is not visible from the internet.
So my next thought was - OK, lets find a reverse proxy which has dispatching capabilities. Then I can make
78.78.78.78:8080/hr to "redirect" to the internal 10.10.10.10:1010/hr
78.78.78.78:8080/mail to "redirect" to the internal 20.20.20.20:2020/mail
I'd also expect that if, say, the mail server's unread messages are visible at 20.20.20.20:2020/mail/unread, the unread messages would also be accessible from the internet.
Roughly speaking - I'd expect
78.78.78.78:8080/mail/* to redirect to the internal 20.20.20.20:2020/mail/* (the asterisk is used as a wildcard)
I really feel I am missing the obvious here, but honestly, I've spent quite a while researching several proxies and I did not find the answer. I might be looking for the wrong words or something, but I could not find a reverse proxy that can be configured to dispatch external paths to different internal paths.
So please - if the answer is e.g. the Apache mod_proxy - please give me a hint about the parameter names that I should be looking for.
Lastly, I am going to run this on FreeBSD, but that is not a strong requirement (other *nix OSes are also fine)
Thanks!
It took quite a while, but here is the answer:
A good solution is nginx (pronounced "Engine X").
To reroute all traffic which comes to
https://mycompany.com/external/* to
http://internal-host:1234/internal/* (the asterisk is used as a wildcard) you need to have the following configuration:
location ~ ^/internal/ {
    rewrite ^/internal/(.*)$ /$1 break;
    proxy_pass http://internal-host:1234;
}
And this approach can be used for all the other addresses - e.g. HR portal, mail, etc.
Finally, to give you a heads up - the following configuration does not work:
location ~ ^/internal/(.*)$ {
    proxy_pass http://internal-host:1234/internal/$1;
}
It turns out nginx always passes the whole original URI upstream when the location uses a regex, so the rule has to be the one above (which does the URL rewrite itself).
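For reference, since the question asked about Apache mod_proxy parameter names: the same dispatching can be expressed with ProxyPass/ProxyPassReverse, assuming mod_proxy and mod_proxy_http are loaded (a sketch, not tested on this setup):

```apache
# Map external paths to the internal servers; ProxyPassReverse rewrites
# Location headers in responses so redirects stay on the external address
ProxyPass        /hr    http://10.10.10.10:1010/hr
ProxyPassReverse /hr    http://10.10.10.10:1010/hr
ProxyPass        /mail  http://20.20.20.20:2020/mail
ProxyPassReverse /mail  http://20.20.20.20:2020/mail
```

Either nginx or Apache can do the job; the key search terms are "reverse proxy" plus "path-based routing".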
Excuse the potential noobishness of the question, but I'm, well, a bit of a noob when it comes to this domain architecture lark. If "domain architecture" is even the technical term. Anyway, I digress...
So, I've googled this question, but I can't see the answer I'm looking for (maybe it doesn't exist, who knows!?). The situation is that I host a .com top-level domain which does a 301 forward to another site on the net not hosted by me. Can I set up a subdomain that then points somewhere else, whether that be on my host itself or just some other site elsewhere on the net?
Essentially, if I set up a subdomain, will it too inherit the web forwarding, and if so, can I directly affect where that subdomain points?
Any answers gratefully appreciated!
Before I try to answer your question, let me be a little fussy :)
First things first: you are confusing and mixing together two different protocols ([DNS] and [HTTP]). There is even a dedicated Wikipedia page for HTTP 301 responses: http://en.wikipedia.org/wiki/HTTP_301 (but you should read the whole shebang: [Wikipedia, search for HTTP] is always a good start, and [RFC 2616] is absolutely a must; IETF RFCs are not easy reading, but the Internet is built on them).
DNS is used to translate a name, like www.example.com, into an IP address, like 192.168.0.1, in order to locate a machine on the Internet. So DNS is involved as one of the very first steps a browser takes to resolve a URL: but once the "machine name" has been translated by the separate DNS service into an IP address, DNS's job is over and it is used/involved no more.
Then the browser, using HTTP, contacts the web server located on that machine (in this example www.example.com, which the DNS service has kindly translated to an IP address, in our example 192.168.0.1, because the operating system can only use an IP address as the argument for an [internet socket]), and only at that moment does the web server, instead of serving a page, answer with an "error" code (which is actually a "response header" with a numeric code that does not start with "2").
Except that this particular code is used to tell something else: that the browser should try the HTTP request again, this time connecting to another machine (and, as long as this redirection is "permanent" rather than "temporary" ([HTTP_307]), the new address should be remembered by the browser, its cache, and its history).
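To make that concrete, a permanent redirect on the wire is nothing more than a response like this (the hostnames are illustrative):

```http
HTTP/1.1 301 Moved Permanently
Location: http://other.example.org/new-page
```

The browser reads the Location header and issues a fresh GET request to that URL.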
So, if you can set up a [redirection response header] on the first machine, it means there is a web server on that first machine that is programmed (given a certain URL pattern) to spit out a redirection header, and as long as you can control these redirections, you can send the browser wherever you want: not merely to another machine on the Internet, but to another URL as well, even on the same website (this is actually the original intended use of code 301, as a measure against [link rot]).
Basically you are free to do whatever you want, or better, to send them wherever you want.
The pros are obvious... the cons are that you must have control over the first web server, and that visiting browsers will have to perform two GET requests in order to land at the intended page (this is not as grim as it looks, since [RFC 2616] suggests that the browser (they call it a User Agent) cache and remember the redirection, because it is permanent).
Disclaimer: I am prevented from posting hyperlinks, but they were basically all from Wikipedia, so, if you will, you can look up the words in brackets "[...]" on Wikipedia...