Can a website be cached anywhere other than a browser's cache? - browser-cache

My client is seeing a different version of the website on his computers than what I am seeing on mine. He claims to be deleting the cache. I'm using Safari with the cache disabled via the Develop menu, and I see the correct version of the site.
Is it possible that the website is somehow cached by my client's ISP or something along those lines?
Update:
I think I need to describe the problem better:
My client has a web hosting package where he has his domains and email accounts. somedomain.com has its A record changed to point to Behance's ProSite hosted service.
The problem is that when he goes to somedomain.com he gets the index.html that's sitting in his web server's public_html directory, and not his ProSite. Using the same domain I see the ProSite. He has cleared his cache and tried on a computer at home with the same result. This is what led me to believe that there is some sort of caching issue somewhere along the line with his ISP(s).
Is there anything I can do about this?

Proxy servers at the ISP, or even at the client's site, might do this. So can network compressors in some misconfigurations.
Depending on the site, you might also actually be seeing a different server altogether; for example, Google directs users to different servers using DNS load balancing.

Yes, you're right. To improve performance and page-load speed for repeat requests, modern browsers are aggressive about caching. I have run into the same problem myself. To work around it, tag your project with a version whenever you deploy it to production, so that cached assets are not reused.
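In practice that usually means putting a version (or build hash) in the asset URLs, so every deploy produces new URLs. A quick sketch (the file names and version string here are made up):

    <!-- Before: the browser may keep serving a stale cached copy -->
    <link rel="stylesheet" href="/css/site.css">

    <!-- After: changing the version on each deploy forces a fresh download -->
    <link rel="stylesheet" href="/css/site.css?v=2017-03-14">
    <script src="/js/app.js?v=2017-03-14"></script>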

Based on the update, the problem was with DNS caching.
DNS can be cached at the following levels:
browser
operating system
router
DNS provider
Each of them has its own way to flush the DNS cache, except the DNS provider, where the only thing you can do is wait for the cached record to expire. You could, though, switch to another DNS provider that doesn't yet have your domain in its cache; the odds of finding one are good if your domain isn't popular.
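For reference, flushing the first three usually comes down to a command or two, and dig lets you compare what different resolvers return. A sketch (exact commands depend on the OS version; 8.8.8.8 is just one example of a public resolver):

    # Windows
    ipconfig /flushdns

    # macOS
    sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder

    # Linux with systemd-resolved
    sudo resolvectl flush-caches

    # Router: usually a reboot, or a "flush DNS" option in its admin UI

    # Compare what your current resolver and a public resolver return
    dig somedomain.com A
    dig @8.8.8.8 somedomain.com A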

Related

What would cause apache to redirect from a local IP to a remote IP address

Here's a scenario that I can't figure out; I simply can't understand why a slightly oldish web server (totally inactive and powered off for two years) is behaving this way. I MUST be overlooking something quite simple.
Specifically, when I try to access an Apache instance on CentOS 7 residing on my local network (192.168.2.XXX), the Apache test page responds just fine ("Testing 1,2,3"; yay), and the request shows up fine in the access logs. On this same machine, I have four additional paths set up. One, for example, is a locked-down phpMyAdmin that is accessible only from an internal IP. This route works fine, and the databases can be browsed, etc. Yet for the other routes, such as a WordPress installation or a Magento instance, the request comes up in the access log (no error log entry) and then just sits there. When the request finally times out, the URL in the browser changes to a new IP address (ABC.XXX.YYY.ZZZ) and the attempt terminates.
Admittedly, the machine WAS originally configured to be outward-facing, and my suspicion is that the IP to which the pages revert may have been the public IP the last time the machine was alive. That IP is no longer associated with the site, and the domain that was likely set up with that IP address is also no longer active.
Does anyone have any suggestions as to what I may look at? I have combed the httpd configurations and there is nothing resembling any such redirection address. Could there be some DNS data that needs to be flushed? A network configuration in sysconfig/ that I am overlooking?
It turned out to have nothing to do with my Apache configuration. Everything was related to the site URLs that were embedded inside the WordPress and Magento installations. After finding and replacing all instances of the old site IP address in a few configuration tables, I was able to get both applications to respond properly.
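For anyone hitting the same thing: in a stock WordPress install the site URL lives in the wp_options table, and in Magento in core_config_data, so the fix can be sketched roughly like this (table prefixes, IPs and paths below are placeholders for the defaults; take a database backup first):

    -- WordPress: point the site at the LAN address instead of the old public IP
    UPDATE wp_options
       SET option_value = 'http://192.168.2.XXX'
     WHERE option_name IN ('siteurl', 'home');

    -- Magento: same idea for the base URLs
    UPDATE core_config_data
       SET value = 'http://192.168.2.XXX/'
     WHERE path IN ('web/unsecure/base_url', 'web/secure/base_url');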

Is this malicious botting, and how to prevent it?

I recently set up a subdomain on my website with the intention to soon clone my website for testing purposes. Subdomain was "beta", so beta.example.com
It was set up and password-protected via .htaccess and is directed through Cloudflare. It's about three days old and was never announced publicly (only I know of it).
Today I noticed a long list of unfamiliar IPs connected to it on my Apache server status page.
Also, the CPU load was increasing and got very, very high. Upon refreshing, this continued, and it is actually still continuing right now. Is this some sort of botting/brute-force attack? I can't imagine how or why else so many IPs would be accessing this unlinked, private subdomain. I've since taken it down from Cloudflare DNS, yet the IPs are still connecting somehow; I assume it will take time for the change to propagate.
Is this malicious? And how can it be prevented? I assume it was (or is) attempting to brute-force the .htaccess password? Is it because it's a common subdomain name ("beta"), and would that matter? Again, it's only been about three days, so damn, they work fast.
It could be search engine robots, it could be script kiddies, it could be brute force; you can get more information from your log files or by analyzing the IP addresses.
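For a quick look at who is doing it, summarising the access log by client IP usually tells the story. A sketch (the path is the Debian/Ubuntu default; on CentOS it is typically /var/log/httpd/access_log):

    # Top 20 client IPs by request count
    awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20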
I'm not sure I really understand your problem and what you want.
If your website is online, then yes, some people/bots/robots will try to access it, like any other website.
If you don't want anybody else to access your website, you can add an IP restriction.
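For example, a minimal .htaccess sketch, assuming Apache 2.4 (203.0.113.10 is a placeholder for your own IP; note that behind Cloudflare the connecting address is Cloudflare's unless you restore the real client IP, e.g. with mod_remoteip):

    # Allow only your own address, deny everyone else
    Require ip 203.0.113.10

    # On Apache 2.2 the equivalent would be:
    #   Order deny,allow
    #   Deny from all
    #   Allow from 203.0.113.10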

Domain URL masking

I am currently hosting the contents of a site with ProviderA. I have a domain registered with ProviderB. I want users to access the contents (www.providerA.com/sub/content) by visiting www.providerB.com. A domain forward is easy enough and works as intended, however, unless I embed the site in a frame (which is a big no-no), the actual URL reads www.providerA.com/sub/content despite the user inputting www.providerB.com.
I really need a solution for this: domain masking without the use of a frame. I'm sure this has been done before. An .htaccess domain rewrite?
Your help would be hugely appreciated! I'm going nuts trying to find a solution.
For Apache
The usual way is to set up mod_proxy: the Apache on providerB becomes a client to providerA's Apache, fetching the content and sending it back to the visitor.
But it looks like you only have .htaccess, and mod_proxy requires full configuration access, so no proxy that way.
So you cannot do it with .htaccess alone; see: How to set up proxy in .htaccess
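For reference, if you did have full configuration access on providerB, the reverse-proxy setup would look roughly like this (a sketch: the hostnames and paths are placeholders, and mod_proxy plus mod_proxy_http must be enabled):

    <VirtualHost *:80>
        ServerName www.providerB.com
        # Fetch everything from providerA and serve it under providerB's name
        ProxyPass        / http://www.providerA.com/sub/content/
        ProxyPassReverse / http://www.providerA.com/sub/content/
    </VirtualHost>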
If you have PHP on providerB
Set up a proxy written in PHP. All requests to providerB are intercepted by that PHP proxy, which gets the content from providerA and sends it back, so it does the same thing as the Apache module. However, depending on the quality of the implementation, it might fail on some requests: certain content types, sizes, timeouts, ...
Search for "php proxy" on the web and you will see a couple available on GitHub and elsewhere. YMMV as to how difficult they are to set up and how reliable they are.
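To give an idea of the shape of such a script, here is a deliberately minimal sketch (not production-ready: it ignores POST bodies, cookies and most headers; the upstream URL is a placeholder, and an .htaccess rewrite is assumed to send every request to it):

    # .htaccess on providerB: route everything that is not a real file to the proxy
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ proxy.php [L]

    <?php
    // proxy.php - fetch the same path from providerA and relay it to the visitor
    $upstream = 'http://www.providerA.com/sub/content';
    $path     = $_SERVER['REQUEST_URI'];            // e.g. "/about.html"

    $ch = curl_init($upstream . $path);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects on the upstream side
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $body = curl_exec($ch);

    if ($body === false) {
        http_response_code(502);
        exit('Upstream fetch failed');
    }

    // Pass the upstream content type through so CSS, images etc. render correctly
    $type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    if ($type) {
        header('Content-Type: ' . $type);
    }
    curl_close($ch);
    echo $body;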
No PHP but some other server side language
Obviously this could be done in another language; I mentioned PHP because that is what I use the most.
The best solution would be to transfer the content to providerB :-)

How to create a friendly url in Tomcat?

I want to change my application URL from http://localhost:8080/monitor/index.html to just monitor, so that typing monitor in the browser opens my application. Is there a way to achieve this? Can someone suggest the configuration changes that would be required?
Can I map my short URL to the existing one, maybe somewhere in web.xml? I am not sure about the approach; any suggestions would be great.
Thanks and regards
Deb
You're mixing up several different protocol layers in your question.
If you enter nothing but "monitor" in the browser URL bar, the browser is going to first look up "monitor" in DNS and, finding nothing, will then probably send a query to Google or your configured search engine. In the past, browsers took other steps, such as appending ".com" and prepending "www.", but I don't think modern browsers do that any more.
So far, your server is not even remotely involved.
If you're a large ISP user (TimeWarner, Comcast) and use their DNS it's also possible the ISP will intercept your failed DNS lookup and route the request to a "helpful" search page (i.e. SPAM) of their own.
At this point the request is still nowhere near your server.
I suppose you could mess with the /etc/hosts file on your local system to resolve "monitor" to the proper hostname, but that's an extremely brittle solution that has to be hard coded on each machine you want to have this "shortcut" link (and which breaks when the hostname changes).
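If you did go that route anyway, the moving parts would look roughly like this (a sketch, assuming a Linux or macOS client and a default Tomcat layout; binding to port 80 may additionally require elevated privileges or a port-forwarding rule):

    # /etc/hosts on each client machine: make the bare name resolve
    127.0.0.1   monitor

    <!-- conf/server.xml: listen on port 80 so the URL needs no :8080 -->
    <Connector port="80" protocol="HTTP/1.1"
               connectionTimeout="20000" redirectPort="8443" />

    <!-- Finally, deploy the app as webapps/ROOT (instead of webapps/monitor)
         so it answers at http://monitor/ rather than http://monitor/monitor/ -->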
You're much better off just setting up a web shortcut in your browser that points to the right place.

Strange domains in mod_pagespeed cache folder

About a year ago I installed mod_pagespeed on my VPS, set it up and left it running. Recently I was exploring the files on my server, went into the pagespeed cache folder and discovered some strange folders.
The folders are usually named along the lines of ,2Fwww.mydomain.com, or ,2F111.111.111.111 for IP addresses. I was surprised to see some domains that do not belong to me, like:
24x7-allrequestsallowed.com
allrequestsallowed.com
m.odnoklassniki.ru
www.fbi.gov
www.securitylab.ru
It looks like something dodgy is going on. Was my server compromised, or is there a reasonable explanation?
That does look peculiar. Everything in the cache folder should be files that mod_pagespeed tried to rewrite. There are two ways that I know of that this can happen:
1) You reference some third-party resource (say an image from another domain, or a Google Analytics script) and you have explicitly enabled rewriting of that domain with ModPagespeedDomain www.example.com or ModPagespeedDomain *.
2) If your server accepts HTTP requests with invalid Host headers. Try (for example) wget --header="Host: www.fbi.gov" www.yourdomain.com/foo/bar.html. If your server accepts requests like that it may be providing mod_pagespeed with an incorrect base domain, and then subresources would be fetched from the same domain (so if www.yourdomain.com/foo/bar.html references some.jpeg, and your server accepts invalid host headers, we could fetch www.fbi.gov/foo/some.jpeg as the resource). There was a recent security release that makes sure all of these subrequests are done against localhost (not arbitrary third-party websites). Please see: https://developers.google.com/speed/docs/mod_pagespeed/CVE-2012-4001
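Independently of the mod_pagespeed fix, one way to stop arbitrary Host headers from reaching your real site is a catch-all default virtual host that simply refuses them. A rough sketch for Apache 2.4 (the names are placeholders; on 2.2 the equivalent of "Require all denied" would be "Deny from all"):

    # Listed first, so it becomes the default for any Host header that does
    # not match a later vhost's ServerName/ServerAlias
    <VirtualHost *:80>
        ServerName default.invalid
        <Location "/">
            Require all denied
        </Location>
    </VirtualHost>

    <VirtualHost *:80>
        ServerName www.yourdomain.com
        DocumentRoot /var/www/yourdomain
    </VirtualHost>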
You might want to look through these folders and see what specific resources are in there. I think that the biggest concern you should have is that someone might be trying to perform an XSS attack on your users or maybe a DDoS attack against another website (like www.fbi.gov), using your server as one vector. I do not think that these folders are indicative that your server itself is compromised.
If you would like to discuss this more, https://groups.google.com/forum/?fromgroups#!forum/mod-pagespeed-discuss is a good list to join and email.