Web site migration and differences in firebug time profiles - apache

I have a PHP web site under Apache (at enginehosting.com). I rewrote it in ASP.NET MVC and installed it at discountasp.net. I am comparing response times with Firebug.
Here is the old time profile:
Here is the new one:
Basically, I get longer response times with the new site (not obvious in the pictures I posted here, but on average yes, sometimes with a big difference, like 2 s for the old site versus 9 s for the new one), and images appear more progressively (as opposed to almost instantly with the old site). Moreover, the time profile is completely different. As you can see in the second picture, a long time is spent in DNS lookup, and this happens for images only (the raw HTML is even faster on the new site). I thought that once a URL had been resolved, the result would be reused for all subsequent requests...
Also note that since I still want to keep my domain pointed on the old location while I'm testing, my new site is under a weird URL like myname.web436.discountasp.net. Could it be the reason? Otherwise, what else?
If this is more a serverfault question, feel free to move it.
Thanks

Unfortunately you're comparing apples and oranges here. The test results shown are of little use because you're trying to compare the performance of an application written using a different technology AND on a different hosting company's shared platform.
We could speculate any number of reasons why there may be a difference:
ASP.NET MVC first-hit lag due to warm-up and compilation
The server that you're hosting on at DiscountASP may be under heavy load
The server at EngineHosting may be under utilised
The bandwidth available at DiscountASP may be under contention
You perhaps need to profile and optimise your code
...and so on.
But until you benchmark both applications on the same machine, you're not making a proper scientific comparison and will just be clutching at straws.
Finally, ignore the myname.web436.discountasp.net URL; that's just a host name/header that DiscountASP and many other hosts add so you can test your site while you're waiting for a domain to be transferred/registered, or for DNS propagation of the real domain name to complete. You usually can't use the IP address of your site, because most shared hosts serve multiple sites from a single IP address on the same server and rely on HTTP Host headers to tell them apart.

Related

How to maintain good PageSpeed Rank with external services

So I'm grappling with brutal ranking degradation due to external services used in client sites. I've pretty much done everything I feel I have control over, including resolving render-blocking problems.
But one thing that runs like a red thread through all sites is that YSlow and PageSpeed get stuck at rankings in the D range at best, because of browser-caching and redirect advisories for external resources, including Google's own analytics.js.
Now I know that some of these resources might be able to be moved locally, but often, especially in the case of redirect-chains - it seems like a daunting task.
Here's an example for an insane redirect chain:
https://d.adroll.com/cm/b/out
https://x.bidswitch.net/sync?dsp_id=44&user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://x.bidswitch.net/ul_cb/sync?dsp_id=44&user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://t.brand-server.com/match_back?bidder_id=4&external_user_id=da20ac56-bf05-4acc-8df2-2e92ceb9f4da
https://t.brand-server.com/ul_cb/match_back?bidder_id=4&external_user_id=da20ac56-bf05-4acc-8df2-2e92ceb9f4da
https://match.adsrvr.org/track/cmf/generic?ttd_pid=centro&ttd_tpi=1
https://match.adsrvr.org/track/cmb/generic?ttd_pid=centro&ttd_tpi=1
https://t.brand-server.com/match_back?bidder_id=1&external_user_id=09d385a1-bd5e-4dc0-84fb-1afdf83f1892
https://secure.adnxs.com/getuid?https://t.brand-server.com/match_back?bidder_id=3&external_user_id=$UID
https://t.brand-server.com/match_back?bidder_id=3&external_user_id=8261031581142479988
https://pixel-a.sitescout.com/dmp/pixelSync?nid=35
https://bcp.crwdcntrl.net/map/c=1389/tp=STSC/tpid=8157edd8-d80d-432e-bf0b-47234df4942c?https%3A%2F%2Fsu.addthis.com%2Fred%2Fusync%3Fpid%3D11185%26puid%3D8157edd8-d80d-432e-bf0b-47234df4942c%26url%3Dhttps%253A%252F%252Ft.brand-server.com%252Fmatch_back%253Fbidder_id%253D5%2526external_user_id%253D8157edd8-d80d-432e-bf0b-47234df4942c
https://bcp.crwdcntrl.net/map/ct=y/c=1389/tp=STSC/tpid=8157edd8-d80d-432e-bf0b-47234df4942c?https%3A%2F%2Fsu.addthis.com%2Fred%2Fusync%3Fpid%3D11185%26puid%3D8157edd8-d80d-432e-bf0b-47234df4942c%26url%3Dhttps%253A%252F%252Ft.brand-server.com%252Fmatch_back%253Fbidder_id%253D5%2526external_user_id%253D8157edd8-d80d-432e-bf0b-47234df4942c
https://su.addthis.com/red/usync?pid=11185&puid=8157edd8-d80d-432e-bf0b-47234df4942c&url=https%3A%2F%2Ft.brand-server.com%2Fmatch_back%3Fbidder_id%3D5%26external_user_id%3D8157edd8-d80d-432e-bf0b-47234df4942c
https://t.brand-server.com/match_back?bidder_id=5&external_user_id=8157edd8-d80d-432e-bf0b-47234df4942c
Here's some caching/expires headers warnings:
https://fonts.googleapis.com/css?family=Oswald:400,700
http://cdn.searchspring.net/ajax_search/sites/742gv8/js/742gv8.js
http://cdn.searchspring.net/ajax_search/js/searchspring-catalog.min.js
http://cdn.searchspring.net/autocomplete/searchspring-autocomplete.min.js
http://www.googleadservices.com/pagead/conversion.js
https://seal-stlouis.bbb.org/seals/blue-seal-200-42-miniaturemarketllc-310240951.png
https://fonts.googleapis.com/css?family=Open+Sans:600,700,400,300
https://trustlogo.com/trustlogo/javascript/trustlogo.js
https://connect.facebook.net/en_US/fbevents.js
http://www.googletagmanager.com/gtm.js?id=GTM-5RMBM2
http://tag.perfectaudience.com/serve/51dc7c34a84900f9d3000002.js
http://a.adroll.com/j/roundtrip.js
https://www.facebook.com/tr/?id=1860590510836052&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458368216&v=2.5.0
https://www.facebook.com/tr/?id=1610476729247227&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458368220&v=2.5.0
https://trustlogo.com/trustlogo/images/popup/seal_bg.gif
https://trustlogo.com/trustlogo/images/popup/warranty_level.gif
https://www.google-analytics.com/analytics.js
https://pixel-geo.prfct.co/tagjs?check_cookie=1&a_id=3045&source=js_tag
https://www.google-analytics.com/plugins/ua/ec.js
https://s.adroll.com/pixel/P3MVZ4FMVNG67LVRKHALEV/CSFUSWFLCFBNTBB2REH2EP/V42TOE4T75HOHDQUCEXVPV.js
http://pixel-geo.prfct.co/seg/?add=842026,3277058&source=js_tag&a_id=3045
https://www.facebook.com/tr?id=1610476729247227&ev=ViewContent&cd[rtb_id]=3277058&noscript=1
https://www.facebook.com/tr?id=1610476729247227&ev=ViewContent&cd[rtb_id]=842026&noscript=1
https://www.facebook.com/tr/?id=1638890983076166&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458369206&cd[segment_eid]=%5B%22V42TOE4T75HOHDQUCEXVPV%22%5D&v=2.5.0
https://analytics.twitter.com/i/adsct?p_id=48571&p_user_id=pa_HjjM3Ntt5wLVRxjwi
https://image2.pubmatic.com/AdServer/Pug?vcode=bz0yJnR5cGU9MSZjb2RlPTMyNDMmdGw9MTI5NjAw&piggybackCookie=uid:pa_HjjM3Ntt5wLVRxjwi
https://www.facebook.com/fr/u.php?p=292157157590619&m=pa_HjjM3Ntt5wLVRxjwi
https://www.facebook.com/fr/u.php?t=2592000&p=443937282305007&m=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://analytics.twitter.com/i/adsct?p_user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&p_id=823423
https://pixel-geo.prfct.co/cb?partnerId=goo
https://d.adroll.com/cm/g/in?google_ula=1535926,0
https://pixel.rubiconproject.com/tap.php?cookie_redirect=1&v=194538&nid=3644&put=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&expires=365
https://us-u.openx.net/w/1.0/sd?cc=1&id=537114372&val=pa_HjjM3Ntt5wLVRxjwi
https://dsum-sec.casalemedia.com/rum?cm_dsp_id=105&external_user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&expiration=1511994369&C=1
https://us-u.openx.net/w/1.0/sd?cc=1&id=537103138&val=5f606fcc54e1deae156ff683b26f9f0c
https://pixel.rubiconproject.com/tap.php?cookie_redirect=1&v=189868&nid=4106&expires=30&put=pa_HjjM3Ntt5wLVRxjwi
https://idsync.rlcdn.com/377928.gif?partner_uid=5f606fcc54e1deae156ff683b26f9f0c&redirect=1
https://pixel.prfct.co/seg/?add=695885
https://pixel.prfct.co/cb?partnerId=mrin
http://ib.adnxs.com/mapuid?member=364&user=11465672070136222257
https://t.brand-server.com/match_back?bidder_id=5&external_user_id=8157edd8-d80d-432e-bf0b-47234df4942c
It would seem that being able to do something about this would drastically improve the score, as it's about the only thing left to fix. So my question is - what can be done about it?
Has anyone tried solutions like rewriting the URLs after the source is generated, through a proxy that either flattens the redirect chain or fetches the resources and passes them through with modified headers?
Is it at all worth it, or are these page scores just to be ignored?
What are plausible alternatives?
Apologies for a loaded question...
I've tried dealing with the same issue in the past, and although I couldn't find a good solution for the redirect chains and caching issue, there are few guidelines I follow:
Think carefully on every external resource - do you really need it? Do you really have to load it before any user interactions? If so - can it be fetched from your server instead? In your example, I would copy the Google fonts to my server.
Never @import stuff in CSS. It's slow by definition, as it requires at least two sequential HTTP requests. Try loading as many resources as possible in the <head> or just before </body>, or precompile using LESS or SCSS.
Ignore widely used resources such as Google Analytics. It's improbable that Google will lower your site's ranking just because you use them. In any case, always look for the "async" implementation of the services you are using.
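As a small illustration of the first point (self-hosting the Google fonts), after downloading the CSS and font files you would rewrite the remote url(...) references to local paths. A sketch of that rewrite; the /fonts/ directory and function name are made up:

```python
import re

def localize_font_urls(css: str, local_dir: str = "/fonts/") -> str:
    """Rewrite remote url(...) references in CSS to point at a local
    directory, keeping only the file name of each remote resource."""
    def repl(m):
        filename = m.group(1).rsplit("/", 1)[-1]
        return "url(" + local_dir + filename + ")"
    return re.sub(r"url\((https?://[^)]+)\)", repl, css)

css = "@font-face { src: url(https://fonts.gstatic.com/s/oswald/v10/abc.woff2); }"
print(localize_font_urls(css))
# -> @font-face { src: url(/fonts/abc.woff2); }
```

Serving the rewritten CSS and the font files from your own host removes both the external DNS lookup and the third-party caching advisory for those resources.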
Hope this helps.

How RESTful is using subdomains as resource identifiers?

We have a single-page app (AngularJs) which interacts with the backend using REST API. The app allows each user to see information about the company the user works at, but not any other company's data. Our current REST API looks like this:
domain.com/companies/123
domain.com/companies/123/employees
domain.com/employees/987
NOTE: All ids are GUIDs, hence the last end-point doesn't have company id, just the employee id.
We recently started enforcing the requirement that each user has access only to information related to the company where the user works. This means that on the backend we need to track who the logged-in user is (which is a simple auth problem) as well as determine which company's information is being accessed. The latter is not easy to determine from our REST API calls, because some of them do not include a company id, such as the last one shown above.
We decided that instead of tracking company ID in the UI and sending it with each request, we would put it in the subdomain. So, assuming that ACME company has id=123 our API would change as follows:
acme.domain.com
acme.domain.com/employees
acme.domain.com/employees/987
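Extracting the company identifier on the backend then becomes a small parse of the Host header. A minimal, framework-free sketch; the function name and base domain are illustrative:

```python
from typing import Optional

def company_from_host(host: str, base_domain: str = "domain.com") -> Optional[str]:
    """Extract the company slug from an HTTP Host header value,
    e.g. "acme.domain.com" -> "acme"."""
    host = host.split(":")[0].lower()   # drop any :port suffix
    suffix = "." + base_domain
    if not host.endswith(suffix):
        return None                     # bare domain or unrelated host
    return host[:-len(suffix)] or None

print(company_from_host("acme.domain.com"))      # -> acme
print(company_from_host("acme.domain.com:443"))  # -> acme
print(company_from_host("domain.com"))           # -> None
```

A real deployment would also validate the slug against known companies before trusting it, since the Host header is client-supplied.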
This makes identifying the company very easy on the backend and requires minor changes to REST calls from our single-page app. However, my concern is that it breaks the RESTfulness of our API. This may also introduce some CORS problems, but I don't have a use case for it now.
I would like to hear your thoughts on this and how you dealt with this problem in the past.
Thanks!
In a similar application, we did put the 'company id' into the path (every company-specific path), not as a subdomain.
I wouldn't care a jot about whether some terminology enthusiast thought my design was "RESTful" or not, but I can see several disadvantages to using domains, mostly stemming from the fact that the world tends to assume that the domain identifies "the server" and the path is how you find an item on that server. There will be a certain amount of extra stuff you'll have to deal with when using multiple domains which you wouldn't with paths:
HTTPS - you'd need a wildcard certificate instead of a simple one
DNS - you're either going to have wildcard DNS entries, or your application management is now going to involve DNS management
All the CORS stuff which you mention - may or may not be a headache in your specific application - anything which is making 'same domain' assumptions about security policy is going to be affected.
Of course, if you want lots of isolation between companies, and effectively you would be as happy running a separate server for each company, then it's not a bad design. I can't see it's more or less RESTful, as that's just a matter of viewpoint.
There is nothing "unrestful" in using subdomains. URIs in REST are opaque, meaning that you don't really care about what the URI is, but only about the fact that every single resource in the system can be identified and referenced independently.
Also, in a RESTful application, you never compose URLs manually; you traverse the hypermedia links you find at the API endpoint and in all the returned responses. Since you don't need to compose URIs by hand, from the REST point of view it makes no difference what they look like. Having a URI such as
//domain.com/ABGHTYT12345H
would be as RESTful as
//domain.com/companies/acme/employees/123
or
//domain.com/acme/employees/smith-charles
or
//acme.domain.com/employees/123
All of those are equally RESTful.
But... I like to think of usable APIs, and when it comes to usability, having readable, meaningful URLs is a must for me. Following conventions is also a good idea. In your particular case, there is nothing unrestful about the route, but it is unusual to find that kind of behaviour in an API, so it might not be best practice. Also, as someone pointed out, it might complicate your development (not specifically the CORS part, though; that one is easily solved by sending a few HTTP headers).
So, even though I can't see anything non-RESTful in your proposal, convention elsewhere argues against subdomains in an API.

mod_expires in apache htaccess

I am learning about Apache and its various modules; currently I am confused about mod_expires. What I have read so far is that with this module we can set a far-future expiry header for static files, so that the browser need not request them each time.
I am confused about the fact that if someone changes a CSS/JS or image file in the meantime, how will the browser come to know about it, since we have already told the browser that this file is not going to change for, say, the next year?
Thanks in advance
It may not be possible for all the content your HTTP server provides, but you can simply change the name of a file to push an update to clients: the browser treats the new name as a new resource and downloads the fresh content.
Sometimes, for websites with less traffic, it is far more practical to set the cache to a much lower value.
An expiration of 365 days should always be used with caution, and the fact that you can set an expiration of 1 year does not mean you always have to do it. In other words, do not fall prey to premature optimization.
A good example of setting cache expiration to 1 year are countries' flags, which are not likely to change. Also, be aware that with a simple browser refresh of a page, the client can discard the local cache and download the content again from the origin.
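Putting the pieces above together, a mod_expires setup in .htaccess might look like the sketch below; the content types and lifetimes are illustrative only, not a recommendation:

```apacheconf
# Requires mod_expires to be enabled (e.g. "a2enmod expires" on Debian/Ubuntu)
<IfModule mod_expires.c>
    ExpiresActive On
    # Conservative default for anything not listed below
    ExpiresDefault "access plus 1 hour"
    # Long-lived static assets; pair these with versioned file names
    # (style.v2.css instead of style.css) so updates are still picked up
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType text/css   "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```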
A good and easy way of testing all this is to use Firefox with Firebug. With this extension, you can analyze requests and responses.
Here you can find the RFC specifications.

Difference in website display depending on domain

I'm getting slightly different display of a website depending on which URL I use to access it (two different servers, both serving the same files). One looks "thinner" than the other in Firefox 3.0 (no discernible difference in IE).
For example, compare:
http://www.ece.ualberta.ca/support/
and
http://www1.ece.ualberta.ca/support/
This is not a major issue, but I just noticed this and am extremely curious as to what could cause it. Is it some kind of Firefox bug? I haven't yet tried the newest version.
EDIT: My bad for assuming those URL's were actually serving the same content (it's not my server, but I do attend that school). Comparing:
http://www.ece.ualberta.ca/~ecegsa/links.html (it seems this server is down atm) and http://www1.ece.ualberta.ca/~ecegsa/links.html
shows the same issue, but the HTML is identical according to diff run on saved files. I don't see the problem on anything other than FF 3.0 at my work, so I'm guessing it's some idiosyncrasy with that browser. Still curious though.
Looking briefly at those two URLs, they're serving different HTML!
For example, http://www.ece.ualberta.ca/support/ has this text:
Windows Vista/7 (volume license)
Activation
While http://www1.ece.ualberta.ca/support/ has this text:
Windows Vista (volume license)
Activation
I suspect that different HTML accounts for the difference you're seeing.
If these are actually the same servers hosting the same content, this kind of disparity could be caused by intermediate caches (e.g. proxies, CDN's, etc.) refreshing at different rates. For example, if www points to a load-balancing, caching proxy and www1 points directly to the host, this may cause the difference. You might also be seeing a bug or lag in how content is updated to different servers in a load-balanced cluster.

What's the best way to test a site which displays differently depending on the client location?

I am using an IP location lookup to display localised prices to customers depending on whether they are visiting from the UK, US or general EU and defaulting to the US price if the location can't be determined.
I could easily force the system to believe I'm from a specific country for testing but still there is no way of knowing for sure that it's displaying correctly when a visitor from abroad accesses my site. Is the use of some proxy the only viable way of testing a site like this? If so how would I go about tracking down one that I can use to test my site from various countries of origin?
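For context, the fallback logic I described can be isolated behind a tiny function and unit-tested independently of the IP lookup. A sketch; the function name and the EU country subset are illustrative:

```python
def price_class(country_code: str) -> str:
    """Map a detected country code to a price class: UK, EU, or US (default)."""
    if country_code == "GB":
        return "UK"
    # Illustrative subset of EU countries; a real list would be complete
    eu = {"DE", "FR", "IT", "ES", "NL", "BE", "AT", "IE", "PT", "SE", "DK", "FI", "PL"}
    if country_code in eu:
        return "EU"
    return "US"  # fall back to US when the location can't be determined

print(price_class("GB"))  # -> UK
print(price_class("FR"))  # -> EU
print(price_class(""))    # -> US
```

That still leaves the end-to-end question of whether the IP lookup itself resolves foreign visitors correctly, which is what the proxy testing below is about.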
You should be able to achieve that by using proxies. http://www.proxy4free.com/page1.html has a bunch. That site just came from a Google search; I've never used proxies like this before though, so there may be better sites out there.
This is not about how to test, but rather how you identify your visitors.
Instead of using an IP lookup to determine their geographical location, you should instead grab the locale they use from the headers the browser sends.
For instance, I'm Norwegian, and when I go to useragent.org I see that my browser sends "nb-NO" as the language my machine uses.
You can easily use that to customize currency, dates etc. on your site.
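In practice this language preference arrives in the Accept-Language request header, which can list several locales with q-value weights. A minimal, framework-free sketch of parsing it:

```python
def preferred_locales(accept_language: str):
    """Parse an Accept-Language header value into locale tags,
    ordered by their q-values (highest preference first)."""
    weighted = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            tag, _, q = piece.partition(";q=")
            try:
                weight = float(q)
            except ValueError:
                weight = 0.0  # malformed q-value: treat as lowest preference
        else:
            tag, weight = piece, 1.0  # no q-value means q=1.0 per the spec
        weighted.append((tag.strip(), weight))
    return [tag for tag, w in sorted(weighted, key=lambda tw: -tw[1])]

print(preferred_locales("nb-NO,nb;q=0.9,en;q=0.8"))  # -> ['nb-NO', 'nb', 'en']
```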
If the website is indexed in Google's cache, you can visit the appropriate country-specific Google, e.g. http://www.google.co.uk/, and see whether your site displays properly in the cache.
#Frode:
Checking the system locale in the useragent string might be misleading.
Say I go to Canada and set my system locale to French. The site might then show me EU prices as opposed to the US price. Many such cases are possible where the locale won't give accurate information about the end user's desired "price class" in this particular application.
If you want to use geo-ip location to detect a user's language, using a proxy probably is the best way to do so.
There are a lot of lists of open proxies on the web, mostly listed with the countries. Google has quite a lot of search results on this topic. Of the top results, I have used SamAir to test some stuff before.
Searching for a working open proxy with an acceptable speed in the correct country can be a tedious task. Also keep in mind that you should not use any of these proxy servers to submit sensitive data, because you never know who runs them. It could be a kinda trustworthy ISP (i.e. not from GB ;D), a honeypot to collect data, or an illegal open proxy hosted by some trojan.