Difference in website display depending on domain - Apache

I'm getting a slightly different display of a website depending on which URL I use to access it (two different servers, both serving the same files). One looks "thinner" than the other in Firefox 3.0 (there is no discernible difference in IE).
For example, compare:
http://www.ece.ualberta.ca/support/
and
http://www1.ece.ualberta.ca/support/
This is not a major issue, but I just noticed this and am extremely curious as to what could cause it. Is it some kind of Firefox bug? I haven't yet tried the newest version.
EDIT: My bad for assuming those URLs were actually serving the same content (it's not my server, but I do attend that school). Comparing:
http://www.ece.ualberta.ca/~ecegsa/links.html (it seems this server is down at the moment) and http://www1.ece.ualberta.ca/~ecegsa/links.html
shows the same issue, but the HTML is identical according to diff run on the saved files. I don't see the problem in anything other than FF 3.0 at my work, so I'm guessing it's some idiosyncrasy of that browser. Still curious, though.

Looking briefly at those two URLs, they're running different HTML!
For example, http://www.ece.ualberta.ca/support/ has this text:
Windows Vista/7 (volume license)
Activation
While http://www1.ece.ualberta.ca/support/ has this text:
Windows Vista (volume license)
Activation
I suspect that different HTML accounts for the difference you're seeing.
If these are actually the same servers hosting the same content, this kind of disparity could be caused by intermediate caches (e.g. proxies, CDNs, etc.) refreshing at different rates. For example, if www points to a load-balancing caching proxy and www1 points directly to the host, that could cause the difference. You might also be seeing a bug or lag in how content is propagated to the different servers in a load-balanced cluster.
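The "diff on saved files" check mentioned in the question can be scripted to show exactly where two mirrors diverge. A minimal sketch; the two HTML strings below are illustrative stand-ins for the saved pages (in practice, read the files you fetched from www and www1):

```python
import difflib
import hashlib

# Stand-ins for the two saved copies of the page; replace with the
# contents of the files saved from each host.
www_copy = "<h1>Support</h1>\n<p>Windows Vista/7 (volume license)</p>\n"
www1_copy = "<h1>Support</h1>\n<p>Windows Vista (volume license)</p>\n"

# Quick equality check: identical content yields identical digests.
print(hashlib.md5(www_copy.encode()).hexdigest()
      == hashlib.md5(www1_copy.encode()).hexdigest())  # → False

# A line-level diff pinpoints which markup actually differs.
for line in difflib.unified_diff(www_copy.splitlines(),
                                 www1_copy.splitlines(), lineterm=""):
    print(line)
```

If the digests match, the rendering difference is down to the browser or intermediaries, not the HTML itself.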

Related

How to maintain good PageSpeed Rank with external services

So I'm grappling with brutal ranking degradation due to external services used in client sites. I've pretty much done everything I feel I have control over, including resolving render-blocking problems.
But one thing that runs like a red thread through all the sites is that YSlow and PageSpeed get stuck at rankings in the D range at best, because of browser-caching and redirect advisories for external resources, including Google's own analytics.js.
Now I know that some of these resources could be moved locally, but often, especially in the case of redirect chains, it seems like a daunting task.
Here's an example for an insane redirect chain:
https://d.adroll.com/cm/b/out
https://x.bidswitch.net/sync?dsp_id=44&user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://x.bidswitch.net/ul_cb/sync?dsp_id=44&user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://t.brand-server.com/match_back?bidder_id=4&external_user_id=da20ac56-bf05-4acc-8df2-2e92ceb9f4da
https://t.brand-server.com/ul_cb/match_back?bidder_id=4&external_user_id=da20ac56-bf05-4acc-8df2-2e92ceb9f4da
https://match.adsrvr.org/track/cmf/generic?ttd_pid=centro&ttd_tpi=1
https://match.adsrvr.org/track/cmb/generic?ttd_pid=centro&ttd_tpi=1
https://t.brand-server.com/match_back?bidder_id=1&external_user_id=09d385a1-bd5e-4dc0-84fb-1afdf83f1892
https://secure.adnxs.com/getuid?https://t.brand-server.com/match_back?bidder_id=3&external_user_id=$UID
https://t.brand-server.com/match_back?bidder_id=3&external_user_id=8261031581142479988
https://pixel-a.sitescout.com/dmp/pixelSync?nid=35
https://bcp.crwdcntrl.net/map/c=1389/tp=STSC/tpid=8157edd8-d80d-432e-bf0b-47234df4942c?https%3A%2F%2Fsu.addthis.com%2Fred%2Fusync%3Fpid%3D11185%26puid%3D8157edd8-d80d-432e-bf0b-47234df4942c%26url%3Dhttps%253A%252F%252Ft.brand-server.com%252Fmatch_back%253Fbidder_id%253D5%2526external_user_id%253D8157edd8-d80d-432e-bf0b-47234df4942c
https://bcp.crwdcntrl.net/map/ct=y/c=1389/tp=STSC/tpid=8157edd8-d80d-432e-bf0b-47234df4942c?https%3A%2F%2Fsu.addthis.com%2Fred%2Fusync%3Fpid%3D11185%26puid%3D8157edd8-d80d-432e-bf0b-47234df4942c%26url%3Dhttps%253A%252F%252Ft.brand-server.com%252Fmatch_back%253Fbidder_id%253D5%2526external_user_id%253D8157edd8-d80d-432e-bf0b-47234df4942c
https://su.addthis.com/red/usync?pid=11185&puid=8157edd8-d80d-432e-bf0b-47234df4942c&url=https%3A%2F%2Ft.brand-server.com%2Fmatch_back%3Fbidder_id%3D5%26external_user_id%3D8157edd8-d80d-432e-bf0b-47234df4942c
https://t.brand-server.com/match_back?bidder_id=5&external_user_id=8157edd8-d80d-432e-bf0b-47234df4942c
Here's some caching/expires headers warnings:
https://fonts.googleapis.com/css?family=Oswald:400,700
http://cdn.searchspring.net/ajax_search/sites/742gv8/js/742gv8.js
http://cdn.searchspring.net/ajax_search/js/searchspring-catalog.min.js
http://cdn.searchspring.net/autocomplete/searchspring-autocomplete.min.js
http://www.googleadservices.com/pagead/conversion.js
https://seal-stlouis.bbb.org/seals/blue-seal-200-42-miniaturemarketllc-310240951.png
https://fonts.googleapis.com/css?family=Open+Sans:600,700,400,300
https://trustlogo.com/trustlogo/javascript/trustlogo.js
https://connect.facebook.net/en_US/fbevents.js
http://www.googletagmanager.com/gtm.js?id=GTM-5RMBM2
http://tag.perfectaudience.com/serve/51dc7c34a84900f9d3000002.js
http://a.adroll.com/j/roundtrip.js
https://www.facebook.com/tr/?id=1860590510836052&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458368216&v=2.5.0
https://www.facebook.com/tr/?id=1610476729247227&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458368220&v=2.5.0
https://trustlogo.com/trustlogo/images/popup/seal_bg.gif
https://trustlogo.com/trustlogo/images/popup/warranty_level.gif
https://www.google-analytics.com/analytics.js
https://pixel-geo.prfct.co/tagjs?check_cookie=1&a_id=3045&source=js_tag
https://www.google-analytics.com/plugins/ua/ec.js
https://s.adroll.com/pixel/P3MVZ4FMVNG67LVRKHALEV/CSFUSWFLCFBNTBB2REH2EP/V42TOE4T75HOHDQUCEXVPV.js
http://pixel-geo.prfct.co/seg/?add=842026,3277058&source=js_tag&a_id=3045
https://www.facebook.com/tr?id=1610476729247227&ev=ViewContent&cd[rtb_id]=3277058&noscript=1
https://www.facebook.com/tr?id=1610476729247227&ev=ViewContent&cd[rtb_id]=842026&noscript=1
https://www.facebook.com/tr/?id=1638890983076166&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458369206&cd[segment_eid]=%5B%22V42TOE4T75HOHDQUCEXVPV%22%5D&v=2.5.0
https://analytics.twitter.com/i/adsct?p_id=48571&p_user_id=pa_HjjM3Ntt5wLVRxjwi
https://image2.pubmatic.com/AdServer/Pug?vcode=bz0yJnR5cGU9MSZjb2RlPTMyNDMmdGw9MTI5NjAw&piggybackCookie=uid:pa_HjjM3Ntt5wLVRxjwi
https://www.facebook.com/fr/u.php?p=292157157590619&m=pa_HjjM3Ntt5wLVRxjwi
https://www.facebook.com/fr/u.php?t=2592000&p=443937282305007&m=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://analytics.twitter.com/i/adsct?p_user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&p_id=823423
https://pixel-geo.prfct.co/cb?partnerId=goo
https://d.adroll.com/cm/g/in?google_ula=1535926,0
https://pixel.rubiconproject.com/tap.php?cookie_redirect=1&v=194538&nid=3644&put=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&expires=365
https://us-u.openx.net/w/1.0/sd?cc=1&id=537114372&val=pa_HjjM3Ntt5wLVRxjwi
https://dsum-sec.casalemedia.com/rum?cm_dsp_id=105&external_user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&expiration=1511994369&C=1
https://us-u.openx.net/w/1.0/sd?cc=1&id=537103138&val=5f606fcc54e1deae156ff683b26f9f0c
https://pixel.rubiconproject.com/tap.php?cookie_redirect=1&v=189868&nid=4106&expires=30&put=pa_HjjM3Ntt5wLVRxjwi
https://idsync.rlcdn.com/377928.gif?partner_uid=5f606fcc54e1deae156ff683b26f9f0c&redirect=1
https://pixel.prfct.co/seg/?add=695885
https://pixel.prfct.co/cb?partnerId=mrin
http://ib.adnxs.com/mapuid?member=364&user=11465672070136222257
https://t.brand-server.com/match_back?bidder_id=5&external_user_id=8157edd8-d80d-432e-bf0b-47234df4942c
It would seem that being able to do something about this would drastically improve the score, as it's about the only thing left to fix. So my question is: what can be done about it?
Has anyone tried solutions like rewriting the URLs after the source is generated, through a proxy that collapses the redirect chain, or that fetches the resources and passes them through with modified headers?
Is it at all worth it, or are these page scores simply to be ignored?
What are plausible alternatives?
Apologies for a loaded question...
I've tried dealing with the same issue in the past, and although I couldn't find a good solution for the redirect chains and the caching issue, there are a few guidelines I follow:
Think carefully about every external resource - do you really need it? Do you really have to load it before any user interaction? If so, can it be served from your own server instead? In your example, I would copy the Google fonts to my server.
Never @import stuff in CSS. It's slow by definition, as it requires at least two sequential HTTP requests. Try referencing as many resources as possible in the <head> or just before </body>, or precompile using LESS or SCSS.
Ignore widely used resources such as Google Analytics. It's improbable that Google will lower your site's ranking just because you use them. In any case, always look for the "async" implementation of the services you use.
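Pulling those guidelines together as markup, a sketch of what the head of the page might look like (the local file paths are illustrative; analytics.js is the script flagged in the question):

```html
<head>
  <!-- Google font copied locally instead of the fonts.googleapis.com stylesheet -->
  <link rel="stylesheet" href="/fonts/oswald.css">
  <!-- Plain <link> instead of CSS @import: both requests can start immediately -->
  <link rel="stylesheet" href="/css/site.css">
  <!-- async: the HTML parser is not blocked while the script downloads -->
  <script async src="https://www.google-analytics.com/analytics.js"></script>
</head>
```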
Hope this helps.

Force a page to be re-downloaded, rather than fetched from browser cache - Apache Server

I've made a minor text change to our website. The change is minor in that it's only a couple of words, but the meaning is quite significant.
I want all users (both new and returning) to see the new text rather than any cached version. Is there a way I can force a user (new or returning) to re-download the page, rather than fetch it from their browser cache?
The site is a static html site hosted on a LAMP server.
This depends entirely on how your web server has caching set up, but, in short, if the page is already cached then you cannot force a download again until the cache expires. So you'll need to look at your cache headers in your browser's developer tools to see how long the expiry has been set for.
Caching gives huge performance benefits and, in my opinion, really should be used. However, it does mean you have difficulty forcing a refresh, as you've discovered.
In case you're interested in how to handle this in the future, there are various cache-busting methods, all of which basically involve changing the URL to fool the browser into thinking it's a different resource, forcing a fresh download.
For example, you can add a version number to a resource, so instead of requesting index.html the browser asks for index2.html - but that could mean renaming the file, and updating all references to it, each time.
You can also set up rewrites in Apache using regular expressions, so that index[0-9]*.html actually loads index.html. Then you don't need multiple copies of the file, but can refer to it as index2.html or index3.html or even index5274.html, and Apache will always serve the contents of index.html.
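As a sketch, the rewrite described above might look like this in an .htaccess file (requires mod_rewrite; the pattern uses [0-9]+ so that plain index.html is left alone):

```apache
# Serve index.html for index2.html, index3.html, index5274.html, ...
# A new number in the URL busts the cache without renaming the real file.
RewriteEngine On
RewriteRule ^index[0-9]+\.html$ index.html [L]
```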
These methods, though a little complicated to maintain unless you have an automated build process, work very well for resources that users don't see. For example css style sheets or JavaScript.
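A common variant, not mentioned above, versions the URL with a query string instead of renaming the file; bumping the version forces a fresh download while the file on disk stays the same (paths and version value are illustrative):

```html
<link rel="stylesheet" href="/css/site.css?v=42">
<script src="/js/app.js?v=42"></script>
```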
Cache-busting techniques work less well for the HTML pages themselves, for a number of reasons: 1) they create unfriendly URLs, 2) they cannot be used for default files where the file name itself is not specified (e.g. the home page) and 3) without changing the source page, the browser can't pick up the new URLs. For this reason some sites turn off caching for HTML pages, so they are always reloaded.
Personally I think not caching HTML pages is a lost opportunity. For example, visitors often visit a site's home page, then try a few pages, returning to the home page in between. With no caching, the home page is reloaded each time, even though it's unlikely to have changed in the meantime. So I prefer a short expiry, and just live with the fact that I can't force a refresh during that window.
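On Apache, the short-expiry approach might be configured with mod_expires along these lines (a sketch; the durations are illustrative, and the long lifetimes assume the CSS/JS URLs are cache-busted as above):

```apache
# Short-lived caching for HTML, long-lived for versioned static assets.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/html "access plus 10 minutes"
  ExpiresByType text/css "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```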

IE11 - Getting a webpage to STOP showing in Enterprise Mode

This is an odd question, in that I'm not trying to display a page in Enterprise Mode - I'm trying to prevent it from displaying in Enterprise Mode. I'm assisting the web server team, so my access is limited to changes in the page itself.
The twist is that the rest of the domain has to be displayed in EnterpriseMode, save for this one page.
I've tried utilizing an XML document and changing HKLM\software\microsoft\internet explorer\main\EnterpriseMode -- setting SiteList to my file's location on the local machine, and Enabled to blank. The page ignores this and loads itself into Enterprise Mode anyway.
Example of my Site.XML. Note: I've changed the server name to protect the innocent. Also, I'm having to use escape characters so the post quits trying to interpret my example - I could've sworn a code block should've stopped that.
<rules version="1">
<emie>
<domain exclude="false">internalportal.ExampleServer.com<path exclude="true">/OperationsRecap/</path></domain>
</emie>
</rules>
I've tried the same thing in the HKCU key, and even checked gpedit for anything that might be pushing it back to the default. No such luck. This should be a fairly simple procedure, but it's stumping me. I'm starting to wonder if the web server team has a customHeader stuck in web.config, but I don't have access, and I've been waiting for an answer from them for a few days now. And by 'waiting' I mean 'continually hounding'.
Compatibility mode doesn't seem to make a difference, whether it's on or off. I have several sites with different settings that get the same problem - and several sites with different settings that don't. There appears to be no rhyme or reason in terms of configuration on the local machines. So while it's tempting to call it an issue with the IIS7 web.config and dust my hands of the whole thing, I have to be absolutely certain.
I've dug at the source code, and literally the only difference is in the META tag. The pages that load correctly report X-UA-Compatible as IE=Edge, like they're supposed to. The ones that don't report IE=8, despite all my attempts to force them to stop. In fact, when a page fails to load I can go to Tools in IE11, de-select Enterprise Mode, and it reloads just fine - and the META tag in the source changes as well. Again: whether compatibility mode is on or off, whether there's a site list in play, it utterly ignores any changes I make to the EnterpriseMode key.
Thoughts?
Found the answer. I was looking in HKLM\software\microsoft\internet explorer\main\EnterpriseMode, when I should have been looking in HKLM\software\policies\microsoft\internet explorer\main\EnterpriseMode.
Lesson learned; stupid mistake.
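For reference, the working location as a .reg sketch - the key path is the one from the answer; the site-list file path is illustrative, and the enable flag is left blank as described in the question:

```
Windows Registry Editor Version 5.00

; Policies hive, not HKLM\software\microsoft\internet explorer
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Internet Explorer\Main\EnterpriseMode]
"SiteList"="C:\\EnterpriseMode\\Site.XML"
"Enable"=""
```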

What's different between HTML and WML/WAP?

I checked the source of a few WAP sites, but didn't find anything different from a normal HTML page.
Can you name a few specific differences?
Well, WAP/WML is very strict when it comes to markup because the page needs to be compiled before delivery to the client device.
As for specifics,
WAP "pages" can have more than one "card". (Confusing? I know...)
Although not markup related, accepted image formats are more limited
Do not forget a DOCTYPE!
Content must be served with the text/vnd.wap.wml MIME type
WAP 1 has almost nothing in common with the HTML/CSS/JS/server-side-scripting stack. The only connection it has with the larger web is that telco gateways use HTTP to request WML content from a normal web server. WML is an old-fashioned and ugly ‘card’-based hypertext system which everyone hated, largely failed in the market and is long gone (thank goodness).
The misleadingly-named “WAP 2”, on the other hand is just XHTML Mobile Profile (a somewhat limited subset of HTML); everything else about it is the same as the normal web stack. This makes it much easier to work with: it's possible to generate content for desktop and phones from the same templates. You may also see ‘i-XHTML’, which is a similar HTML-subset used by Docomo phones.
Either way, modern smartphones are happy rendering normal desktop-style [X]HTML, so you're not going to have to worry about any of this in the future. (Sure, there are compatibility issues, but that's nothing new, right?)
Here is a table with some information about the differences : http://csc.colstate.edu/summers/Research/Wireless/WAPvsWeb.html
Another difference: WAP is almost, if not totally, dead, while HTML is kicking ##S :-)

Web site migration and differences in firebug time profiles

I have a PHP web site under Apache (at enginehosting.com). I rewrote it in ASP.NET MVC and installed it at discountasp.net. I am comparing response times with Firebug.
Here is the old time profile:
Here is the new one:
Basically, I get longer response times with the new site (not obvious in the pictures I posted here, but on average yes, sometimes with a big difference - like 2 s for the old site versus 9 s for the new one) and images appear more progressively (as opposed to almost instantly with the old site). Moreover, the time profile is completely different. As you can see in the second picture, a long time is spent in DNS lookup, and this happens for images only (the raw HTML is even faster on the new site). I thought that once a URL had been resolved, the result would apply to all subsequent requests...
Also note that since I still want to keep my domain pointed at the old location while I'm testing, my new site is under a weird URL like myname.web436.discountasp.net. Could that be the reason? Otherwise, what else?
If this is more a serverfault question, feel free to move it.
Thanks
Unfortunately you're comparing apples and oranges here. The test results shown are of little use, because you're trying to compare the performance of an application written with a different technology AND hosted on a different company's shared platform.
We could speculate any number of reasons why there may be a difference:
ASP.NET MVC first-hit lag due to warm-up and compilation
The server that you're hosting on at DiscountASP may be under heavy load
The server at EngineHosting may be under-utilised
The bandwidth available at DiscountASP may be under contention
You perhaps need to profile and optimise your code
...and so on.
But until you benchmark both applications on the same machine you're not making a proper scientific comparison, and are just clutching at straws.
Finally, ignore the myname.web436.discountasp.net URL; that's just a host name/header that DiscountASP and many other hosts add so you can test your site while you're waiting for a domain to be transferred/registered, or for DNS propagation of the real domain name to complete. You usually can't use your site's IP address, because most shared hosts share a single IP address across multiple sites on the same server and rely on HTTP Host headers.