I run a web shop that gets a lot of traffic, but much of that traffic is apparently generated by bots: they load the pages but never the images, JavaScript files or CSS files.
I want to block these connections with mod_security, but I can't find a rule anywhere that does this.
I first tried blocking the IPs at the firewall, but they keep coming back from different addresses, so that approach is pointless.
The user agent varies as well, so you can't tell from it that the connection comes from a bot. The only thing that stands out is that they never load an image or a CSS file from the page.
The requests are not frequent or fast enough for mod_evasive to trigger, either.
Apparently the idea is to make the page so slow that normal users quickly give up :-( I suspect a competitor is running a DDoS attack ... who knows.
Does anyone have the same problem, or a mod_security rule that I could use for this?
Does anybody have any idea?
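For illustration, this is the kind of check I have in mind, just done offline against the access log instead of live in mod_security (a rough sketch only; the log path, the log format and the threshold of 10 page hits are placeholders, not my real setup):

import { readFileSync } from "fs";

// Flag client IPs that request pages but never any static assets.
// Assumes an Apache combined-format access log at a placeholder path.
const log = readFileSync("/var/log/apache2/access.log", "utf8");

const staticAsset = /\.(css|js|png|jpe?g|gif|ico|svg|woff2?)(\?|$)/i;
const pageHits = new Map<string, number>();   // HTML page requests per IP
const assetHits = new Map<string, number>();  // image/CSS/JS requests per IP

for (const line of log.split("\n")) {
  const m = line.match(/^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)/);
  if (!m) continue;
  const ip = m[1];
  const path = m[2];
  const bucket = staticAsset.test(path) ? assetHits : pageHits;
  bucket.set(ip, (bucket.get(ip) ?? 0) + 1);
}

for (const [ip, hits] of pageHits) {
  if (hits >= 10 && !assetHits.has(ip)) {
    console.log(`${ip} loaded ${hits} pages and zero assets - candidate for blocking`);
  }
}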
Best regards,
Holger
This site or app is sending too much traffic to rawgit.com. Please contact its owner and ask them to use cdn.rawgit.com instead, which has no traffic limit.
I am getting the above notification on my website. Can anyone help me resolve this issue? How can I remove it?
RawGit has two domains for serving your GitHub files:
cdn.rawgit.com is for when you request the files a lot (updates take a while to show up).
rawgit.com is for testing (updates are visible immediately).
You need to use the cdn.rawgit.com link, as you are sending too much traffic.
For example:
https://rawgit.com/Oisins/Mod/master/.project
becomes
https://cdn.rawgit.com/Oisins/Mod/master/.project
See http://rawgit.com/ for more info.
Many of our users are reporting that they are getting a blank page when using IE11 to access our website. Sometimes they don't even get a blank page, the browser just stays on the last page visited. These users can access other domains (such as google.com) without a problem.
For the browsers that are failing, if those users disable Protected Mode in IE, then they can again access our site, but this isn't a good solution because we have thousands of users and we can't go telling each of them to reconfigure their browsers, not to mention that we're completely losing the business of those potential customers who just surf by, see a blank page, and then keep on going without filing an error report.
Firefox and Chrome work fine. Also, even when using IE11 in protected mode, some users have zero problem, their computers just seem to work.
We are running the site off of IIS7. Other sites on the same server run fine, it's just the one particular site that is having the problem, and it's intermittent, affecting some computers and not others.
There must be something I can do on my side in the server or network settings to keep this problem from happening, but I'm baffled as to what it might be. When I look at the network traffic in a browser that's failing, the GET request is simply aborted with no explanation. Looking at the traffic with Wireshark, I see no errors, and no errors show up in the IIS logs. The failing browsers never even open a connection to our web server to make the GET request; it just aborts immediately.
Any advice appreciated!
(followup): Another test we did: We can reproduce the problem on our development server, with the same pattern of which computers "work" and which don't. We tried turning off the webserver, and the problem stayed consistent, we'd still get a blank page error. So it's obviously not to do with anything that's in our page content? I've run tests on 9 computers on our office network: 5 worked, 4 failed. We're all baffled. :/
(January 24 followup): We figured out what's causing the problem, but not how to fix yet. Deep in the Windows registry, a key is being set in a Zonemap folder, adding a "play.net" domain with a key value of 2. IE sees that key when someone types play.net, and quivers and dies without an error message (Firefox and Chrome handle it fine).
So next question, what is setting that key in the first place? Probably an ActiveX control somewhere, but we haven't had any luck finding it in this 15 year old site, as many of the coders who may have created ActiveX controls in the past are long gone.
Does anyone know of a way to scan an entire domain and list anything that might be twiddling a registry key?
Followup mail threads from Elonka indicated that some users had the play.net domain mapped to a specific / non-default Internet Explorer security zone, and those users were the ones having problems.
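For anyone who wants to check a particular machine for the ZoneMap entry described above, something like this should show it (a sketch that assumes the usual per-user location; the same path can also exist under HKLM, and a REG_DWORD value of 2 pins the domain to the Trusted Sites zone):

import { execFileSync } from "child_process";

// Per-user location of Internet Explorer's zone assignments for play.net.
const key =
  "HKCU\\Software\\Microsoft\\Windows\\CurrentVersion\\Internet Settings\\ZoneMap\\Domains\\play.net";

try {
  // /s also lists subkeys such as "www" under the domain entry.
  const out = execFileSync("reg", ["query", key, "/s"], { encoding: "utf8" });
  console.log(out);
} catch {
  console.log("No explicit zone mapping for play.net on this machine.");
}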
Try adding
<meta http-equiv="X-UA-Compatible" content="IE=10" />
into the header. It fixed IE11 for my website. It just forces it into IE10 mode (a complete joke of a fix, though).
I accessed your domain in Internet Explorer 11, with Protected Mode enabled. I was unable to get a fully broken experience, but I did see something perhaps related to what your customers are experiencing. Some directories on your domain would hang for me, resulting in a white screen if they've not yet been loaded, or staying on the present page while the forthcoming page is loaded.
When I navigated to /dr, I found that my browser hung for over 7 seconds while your files loaded. Note, there were roughly 60 items, totaling about 0.78 MB. This isn't much, but I'm presently on a relatively slow connection, and this resulted in a hang for me.
I would inquire what connection speeds your clients are on when experiencing this issue - it could very well be nothing more than some directories being served up more slowly than others. Regardless, you could optimize the experience considerably by compressing and concatenating your CSS files, as well as all of your JavaScript files. This results in fewer parallel requests, and quicker downloads.
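As a very rough sketch of the concatenate-and-compress suggestion (the public/js folder and output names here are made up, not anything from your site), the idea is simply to turn dozens of small script requests into one pre-compressed file:

import { readFileSync, writeFileSync, readdirSync } from "fs";
import { join } from "path";
import { gzipSync } from "zlib";

// Concatenate every .js file in a (hypothetical) scripts folder into one bundle,
// then pre-compress it so the page makes a single small request instead of dozens.
const dir = "public/js";
const bundle = readdirSync(dir)
  .filter(f => f.endsWith(".js"))
  .map(f => readFileSync(join(dir, f), "utf8"))
  .join("\n;\n");

writeFileSync("public/bundle.js", bundle);
writeFileSync("public/bundle.js.gz", gzipSync(bundle));

The same applies to your CSS files, minus the semicolon separator.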
If you have a reproducible example, please feel free to share and I'm sure myself and many others here will be more than happy to assist you.
For me, box-shadow on an element caused this behavior. I removed the box-shadow (but left -webkit-box-shadow and -moz-box-shadow) and the problem disappeared. The interesting thing is that I still have box-shadow on another element...
If you are using 'offline.js' in your project and your version is older than 0.7.19, then IE11's latest security update will treat this code as a potential security threat and block pages that use it.
Solution: Update to the latest 'offline.js' version.
I am trying to ban users that spam my service by logging their IP and blocking it.
Of course this isn't safe at all, because of dynamic IP addresses.
Is there a way to identify a user that's 100% safe?
I've heard about something called evercookie, but I was easily able to delete that, and I guess that anyone capable of changing their IP can also keep their PC clean.
Are there any other options? Or is it just not possible?
A cookie will prevent the same browser from visiting your site as long as the user doesn't delete it, or turn off cookies, or use a different browser, or reinstall their browser, or use another machine, etc.
There is no such thing as 100% safe. Spam is an ongoing problem that most websites just have to learn to deal with.
There are numerous highly secure options, mostly relying on multi-factor authentication and physical key generators like the ones RSA markets. But the real question is an economic one. The more draconian the authentication mechanism, the more quickly you kill your website as you scare off all your visitors.
More practical solutions involve CAPTCHA, forum moderation, spam-reporting affordances, etc. One particularly effective technique is to block offending content from every IP address except the one that originated it. That way, the original spammer thinks their content is still there, oblivious to the fact that no one else can see it.
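A minimal sketch of that last idea, assuming each post records the IP it was submitted from and a shadow-ban flag (the field names here are made up for illustration):

interface Post {
  body: string;
  authorIp: string;
  shadowBanned: boolean;
}

function visiblePosts(posts: Post[], viewerIp: string): Post[] {
  // Flagged content stays visible to the IP that submitted it,
  // so the spammer doesn't realise they've been blocked.
  return posts.filter(p => !p.shadowBanned || p.authorIp === viewerIp);
}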
Alright I get that it's impossible to 100% identify a unique visitor.
What are the things that I could do to:
- find out whether someone (anonymous) is using lots of different proxies to see my content (the problem here is that cookies would land on the proxy machine and not on the actual visitor's PC?)
- identify unique (anonymous) visitors with a dynamic IP
Recently a user told me to avoid subdomains when I can. I remember reading that Google considers a subdomain a separate site (is this true?). What else happens when I use a subdomain, and when should or shouldn't I use one?
I heard cookies are not shared between subdomains? I know 2 images can be DL'd simultaneously from a site. Would I be able to DL 4 if I use sub1.mysite.com and sub2.mysite.com?
What else should I know?
You can share cookies between subdomains, provided you set the cookie's Domain attribute to the parent domain. By default they won't be shared, though.
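For example, a server response along these lines (a sketch using the host names from your question; the cookie name and value are arbitrary) makes the cookie visible to every subdomain:

import { createServer } from "http";

createServer((_req, res) => {
  // Scoped to the parent domain, so it is sent to www.mysite.com,
  // sub1.mysite.com, sub2.mysite.com, and so on. Without the Domain
  // attribute it would only go back to the exact host that set it.
  res.setHeader("Set-Cookie", "session=abc123; Domain=.mysite.com; Path=/");
  res.end("cookie set");
}).listen(8080);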
Yes, you can get more simultaneous downloads if you store images on different subdomains. The other side of the scale, however, is that the user spends more time on DNS lookups, so it's not practical to have, say, 25 subdomains just to get 50 simultaneous downloads.
Another thing that happens with subdomains is that AJAX requests won't work without some effort (you CAN make them work using document.domain tricks, but it's far from straightforward).
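The trick referred to above looks roughly like this (shown for completeness; it has to be run on both pages, and modern browsers have since deprecated it):

// Run on both the www.mysite.com and sub1.mysite.com pages
// so that frames from the two hosts can script each other.
document.domain = "mysite.com";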
Can't help with the SEO part, but some people discourage having both yoursite.com and www.yoursite.com working and returning the same content, because it "dilutes your PageRank". Not sure how true that is.
You complicate quite a few things: collecting stats, controlling spiders, HTML5 storage, XSS, inter-frame communication, virtual-host setup, third-party ad serving, interaction with remote APIs like Google Maps.
That's not to say these things can't be solved, just that the rise in complexity adds more work and may not provide suitable benefits to compensate.
I should add that I went down this path once myself for a classifieds site, adding domains like porsche.site.com and ferrari.site.com, hoping to boost the ranking for those keywords. In the end I saw no noticeable improvement, and worse, Google was crawling the entire site via each subdomain, meaning that a search for Ferraris might return porsche.site.com/ferraris instead of ferrari.site.com/ferraris. In short, Google considered the sites duplicates, but it still crawled each of them on every visit.
Again, workarounds existed but I chose simplicity and I don't regret it.
If you use subdomains to serve your web site's images, JavaScript, stylesheets, etc., then your pages may load quicker. Browsers limit the number of simultaneous connections to each host name, so the more subdomains you use, the more connections can be made at the same time to fetch the page's content.
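A small sketch of that "domain sharding" idea, reusing the sub1/sub2 host names from the question (the hashing is just a simple way to keep each asset on a consistent host so caching still works):

function shardUrl(path: string): string {
  const shards = ["sub1.mysite.com", "sub2.mysite.com"];
  // Deterministic pick: the same path always maps to the same host,
  // which keeps the browser cache effective.
  const hash = [...path].reduce((sum, ch) => sum + ch.charCodeAt(0), 0);
  return `https://${shards[hash % shards.length]}${path}`;
}

console.log(shardUrl("/images/logo.png"));
console.log(shardUrl("/css/style.css"));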
Recently a user told me to avoid subdomains when I can. I remember reading that Google considers a subdomain a separate site (is this true?). What else happens when I use a subdomain, and when should or shouldn't I use one?
The last thing I heard about Google optimization is that domains count for more PageRank than subdomains. I also believe that PageRank is calculated per page, not per site (according to the algorithm, etc.). Though the only people who can really tell you are Google employees.
I heard cookies are not shared between subdomains?
You should be able to use a cookie for all subdomains. www.mysite.com, sub1.mysite.com and sub2.mysite.com can all share the same cookies, but a cookie set only for mysite.com itself (without a Domain attribute) won't be shared with them.
I know 2 images can be DL'd simultaneously from a site. Would I be able to DL 4 if I use sub1.mysite.com and sub2.mysite.com?
I'm not sure what you mean by DL'd simultaneously. Oftentimes a browser will only download a couple of items at a time from a single host name, whereas it can download more items at the same time when they come from different domains.
In order to allow for multiple policies regarding content (security, cookies, sessions, etc.), I'm considering moving some content from my sites onto their own domains, and I was wondering what kind of dividends that would pay (if any).
I understand that cookies are domain specific and are sent on every request (even for images), so if they grow too large they could start affecting performance; moving static content off in this way therefore makes sense (to me at least).
Since I expect that someone out there has already done something similar, I was wondering if you could provide some feedback of the pros and the cons.
I don't know of any of the situations you mention that can't be controlled in the settings of the HTTP server, whether it's Apache, IIS, or whatever else you might be using.
I assume you mean you want to split them up into separate hosts, i.e. www1.domain.com and www2.domain.com. And you are correct that cookies are host/domain specific. However, there aren't really any gains if www1 and www2 are still the same computer. If you are experiencing load issues and split the content between two different servers, there could be some gains there.
If you actually mean different domains (www.domain1.com & www.domain2.com) I'm not sure what kind of benefits you would be looking for...