How to maintain a good PageSpeed score with external services

So I'm grappling with brutal ranking degradation due to external services used on client sites. I've pretty much done everything I feel I have control over, including resolving render-blocking problems.
But one thing that runs like a red thread through all the sites is that YSlow and PageSpeed get stuck at grades in the D range at best, because of browser-caching and redirect advisories for external resources, including Google's own analytics.js.
Now I know that some of these resources could be hosted locally, but often, and especially in the case of redirect chains, it seems like a daunting task.
Here's an example of an insane redirect chain:
https://d.adroll.com/cm/b/out
https://x.bidswitch.net/sync?dsp_id=44&user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://x.bidswitch.net/ul_cb/sync?dsp_id=44&user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://t.brand-server.com/match_back?bidder_id=4&external_user_id=da20ac56-bf05-4acc-8df2-2e92ceb9f4da
https://t.brand-server.com/ul_cb/match_back?bidder_id=4&external_user_id=da20ac56-bf05-4acc-8df2-2e92ceb9f4da
https://match.adsrvr.org/track/cmf/generic?ttd_pid=centro&ttd_tpi=1
https://match.adsrvr.org/track/cmb/generic?ttd_pid=centro&ttd_tpi=1
https://t.brand-server.com/match_back?bidder_id=1&external_user_id=09d385a1-bd5e-4dc0-84fb-1afdf83f1892
https://secure.adnxs.com/getuid?https://t.brand-server.com/match_back?bidder_id=3&external_user_id=$UID
https://t.brand-server.com/match_back?bidder_id=3&external_user_id=8261031581142479988
https://pixel-a.sitescout.com/dmp/pixelSync?nid=35
https://bcp.crwdcntrl.net/map/c=1389/tp=STSC/tpid=8157edd8-d80d-432e-bf0b-47234df4942c?https%3A%2F%2Fsu.addthis.com%2Fred%2Fusync%3Fpid%3D11185%26puid%3D8157edd8-d80d-432e-bf0b-47234df4942c%26url%3Dhttps%253A%252F%252Ft.brand-server.com%252Fmatch_back%253Fbidder_id%253D5%2526external_user_id%253D8157edd8-d80d-432e-bf0b-47234df4942c
https://bcp.crwdcntrl.net/map/ct=y/c=1389/tp=STSC/tpid=8157edd8-d80d-432e-bf0b-47234df4942c?https%3A%2F%2Fsu.addthis.com%2Fred%2Fusync%3Fpid%3D11185%26puid%3D8157edd8-d80d-432e-bf0b-47234df4942c%26url%3Dhttps%253A%252F%252Ft.brand-server.com%252Fmatch_back%253Fbidder_id%253D5%2526external_user_id%253D8157edd8-d80d-432e-bf0b-47234df4942c
https://su.addthis.com/red/usync?pid=11185&puid=8157edd8-d80d-432e-bf0b-47234df4942c&url=https%3A%2F%2Ft.brand-server.com%2Fmatch_back%3Fbidder_id%3D5%26external_user_id%3D8157edd8-d80d-432e-bf0b-47234df4942c
https://t.brand-server.com/match_back?bidder_id=5&external_user_id=8157edd8-d80d-432e-bf0b-47234df4942c
Here are some caching/expires header warnings:
https://fonts.googleapis.com/css?family=Oswald:400,700
http://cdn.searchspring.net/ajax_search/sites/742gv8/js/742gv8.js
http://cdn.searchspring.net/ajax_search/js/searchspring-catalog.min.js
http://cdn.searchspring.net/autocomplete/searchspring-autocomplete.min.js
http://www.googleadservices.com/pagead/conversion.js
https://seal-stlouis.bbb.org/seals/blue-seal-200-42-miniaturemarketllc-310240951.png
https://fonts.googleapis.com/css?family=Open+Sans:600,700,400,300
https://trustlogo.com/trustlogo/javascript/trustlogo.js
https://connect.facebook.net/en_US/fbevents.js
http://www.googletagmanager.com/gtm.js?id=GTM-5RMBM2
http://tag.perfectaudience.com/serve/51dc7c34a84900f9d3000002.js
http://a.adroll.com/j/roundtrip.js
https://www.facebook.com/tr/?id=1860590510836052&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458368216&v=2.5.0
https://www.facebook.com/tr/?id=1610476729247227&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458368220&v=2.5.0
https://trustlogo.com/trustlogo/images/popup/seal_bg.gif
https://trustlogo.com/trustlogo/images/popup/warranty_level.gif
https://www.google-analytics.com/analytics.js
https://pixel-geo.prfct.co/tagjs?check_cookie=1&a_id=3045&source=js_tag
https://www.google-analytics.com/plugins/ua/ec.js
https://s.adroll.com/pixel/P3MVZ4FMVNG67LVRKHALEV/CSFUSWFLCFBNTBB2REH2EP/V42TOE4T75HOHDQUCEXVPV.js
http://pixel-geo.prfct.co/seg/?add=842026,3277058&source=js_tag&a_id=3045
https://www.facebook.com/tr?id=1610476729247227&ev=ViewContent&cd[rtb_id]=3277058&noscript=1
https://www.facebook.com/tr?id=1610476729247227&ev=ViewContent&cd[rtb_id]=842026&noscript=1
https://www.facebook.com/tr/?id=1638890983076166&ev=PageView&dl=http%3A%2F%2Fwww.miniaturemarket.com%2F&rl=&if=false&ts=1480458369206&cd[segment_eid]=%5B%22V42TOE4T75HOHDQUCEXVPV%22%5D&v=2.5.0
https://analytics.twitter.com/i/adsct?p_id=48571&p_user_id=pa_HjjM3Ntt5wLVRxjwi
https://image2.pubmatic.com/AdServer/Pug?vcode=bz0yJnR5cGU9MSZjb2RlPTMyNDMmdGw9MTI5NjAw&piggybackCookie=uid:pa_HjjM3Ntt5wLVRxjwi
https://www.facebook.com/fr/u.php?p=292157157590619&m=pa_HjjM3Ntt5wLVRxjwi
https://www.facebook.com/fr/u.php?t=2592000&p=443937282305007&m=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM
https://analytics.twitter.com/i/adsct?p_user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&p_id=823423
https://pixel-geo.prfct.co/cb?partnerId=goo
https://d.adroll.com/cm/g/in?google_ula=1535926,0
https://pixel.rubiconproject.com/tap.php?cookie_redirect=1&v=194538&nid=3644&put=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&expires=365
https://us-u.openx.net/w/1.0/sd?cc=1&id=537114372&val=pa_HjjM3Ntt5wLVRxjwi
https://dsum-sec.casalemedia.com/rum?cm_dsp_id=105&external_user_id=NWY2MDZmY2M1NGUxZGVhZTE1NmZmNjgzYjI2ZjlmMGM&expiration=1511994369&C=1
https://us-u.openx.net/w/1.0/sd?cc=1&id=537103138&val=5f606fcc54e1deae156ff683b26f9f0c
https://pixel.rubiconproject.com/tap.php?cookie_redirect=1&v=189868&nid=4106&expires=30&put=pa_HjjM3Ntt5wLVRxjwi
https://idsync.rlcdn.com/377928.gif?partner_uid=5f606fcc54e1deae156ff683b26f9f0c&redirect=1
https://pixel.prfct.co/seg/?add=695885
https://pixel.prfct.co/cb?partnerId=mrin
http://ib.adnxs.com/mapuid?member=364&user=11465672070136222257
https://t.brand-server.com/match_back?bidder_id=5&external_user_id=8157edd8-d80d-432e-bf0b-47234df4942c
It would seem that being able to do something about this would drastically improve the score, as it's about the only thing left to fix. So my question is: what can be done about it?
Has anyone tried solutions like rewriting the URLs after the source is generated, through a proxy that either covers the redirect chain or fetches the resources and passes them through with modified headers?
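To make the second idea concrete, here's roughly what I have in mind - a minimal sketch, assuming a Node.js environment; the /vendor path, the upstream map and the caching policy are all illustrative, not something I've proven out:

    // proxy.ts - fetch a third-party script server-side and re-serve it
    // first-party, with caching headers we control (sketch only).
    import * as http from "node:http";
    import * as https from "node:https";

    // Hand-maintained map of local paths to the external resources they mirror.
    const UPSTREAM: Record<string, string> = {
      "/vendor/analytics.js": "https://www.google-analytics.com/analytics.js",
    };

    http.createServer((req, res) => {
      const target = UPSTREAM[req.url ?? ""];
      if (!target) { res.writeHead(404); res.end(); return; }
      https.get(target, (upstream) => {
        res.writeHead(upstream.statusCode ?? 200, {
          "Content-Type": upstream.headers["content-type"] ?? "application/javascript",
          // The whole point: the caching policy is now ours to set.
          "Cache-Control": "public, max-age=86400",
        });
        upstream.pipe(res);
      }).on("error", () => { res.writeHead(502); res.end(); });
    }).listen(8080);

I suspect this only makes sense for static assets like scripts and fonts, though; the cookie-syncing redirect chains above presumably can't be proxied usefully, since they exist to set third-party cookies on the intermediate domains.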
Is it at all worth it, or should these page scores just be ignored?
What are plausible alternatives?
Apologies for a loaded question...

I've tried dealing with the same issue in the past, and although I couldn't find a good solution for the redirect chains and the caching issue, there are a few guidelines I follow:
Think carefully about every external resource - do you really need it? Do you really have to load it before any user interaction? If so, can it be fetched from your server instead? In your example, I would copy the Google fonts to my own server.
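For the fonts, a self-hosted setup might look like this (a sketch - the local paths are made up, and you'd download the .woff2/.woff files that the CSS at fonts.googleapis.com points to):

    /* Self-hosted copy of the Google font, served from our own domain. */
    @font-face {
      font-family: 'Oswald';
      font-style: normal;
      font-weight: 400;
      /* Local files instead of fonts.gstatic.com; paths are illustrative. */
      src: url('/fonts/oswald-regular.woff2') format('woff2'),
           url('/fonts/oswald-regular.woff') format('woff');
    }

Once the files are on your own server, the Cache-Control/Expires headers are yours to set, which is exactly what those YSlow/PageSpeed advisories complain about.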
Never @import stuff in CSS. It's slow by definition, as it requires at least two sequential HTTP requests. Try to reference as many resources as possible in the <head> or just before </body>, or precompile using LESS or SCSS.
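To illustrate (file names made up):

    /* main.css - bad: extra.css is only discovered after main.css has downloaded */
    @import url('extra.css');

    <!-- index.html - better: both stylesheets are discovered up front and fetched in parallel -->
    <link rel="stylesheet" href="/css/main.css">
    <link rel="stylesheet" href="/css/extra.css">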
Ignore widely used resources such as Google Analytics. It's improbable that Google will lower your site's ranking just because you use them. In any case, always look for the "async" implementation of the services you use.
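For analytics.js, for example, the async variant boils down to loading the script with the async attribute (alongside Google's standard queuing snippet), so it doesn't block parsing:

    <!-- async keeps analytics.js from blocking HTML parsing and rendering -->
    <script async src="https://www.google-analytics.com/analytics.js"></script>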
Hope this helps.

Related

How to properly use a CDN?

Good evening everyone! Thank you for opening this post.
I recently bought the ProCDN from MediaTemple (basically EdgeCast) and have set up a CDN; when I go to cdn-small.DOMAIN.com (or cdn-large.DOMAIN.com) it loads the normal website just fine...
However, I'm not sure which one to use. Would I use this for the complete site, or add the links one by one for each script/stylesheet based on file size? (e.g. all JS/CSS would get the cdn-small link, while anything larger, say 300 KB, would get the cdn-large link)
And if the correct way is to load the whole site as one link (e.g. everything linked normally, like js/jquery.js, instead of a full link like https://cdn-small.domain.com/js/jquery.js), would I set a redirect from DOMAIN.com to cdn-small.DOMAIN.com for the best loading, so visitors only need to type in the domain, not the full CDN subdomain?
Apologies if this isn't making sense; I'm trying to do my best. To put it in simpler terms: I'm trying to find the best way to use my cdn-small/cdn-large for my website while having the user enter the domain (https:// or http://) normally, so my content is served as fast as possible, as near to the user as possible.
I appreciate your time reading this and wish you all a positive weekend.
Here is my live site if it even matters or you want to experiment: http://bit.ly/1eGCShX
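To show what I mean by the per-asset approach (the host names are mine from above; the file names are just examples):

    <!-- the HTML page itself stays on www.DOMAIN.com; only static assets point at the CDN -->
    <link rel="stylesheet" href="https://cdn-small.DOMAIN.com/css/style.css">
    <script src="https://cdn-small.DOMAIN.com/js/jquery.js"></script>
    <img src="https://cdn-large.DOMAIN.com/img/banner-large.jpg" alt="banner">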

How RESTful is using subdomains as resource identifiers?

We have a single-page app (AngularJS) that interacts with the backend through a REST API. The app allows each user to see information about the company the user works at, but not any other company's data. Our current REST API looks like this:
domain.com/companies/123
domain.com/companies/123/employees
domain.com/employees/987
NOTE: All ids are GUIDs, hence the last endpoint doesn't have a company id, just the employee id.
We recently started working on enforcing the requirement that each user has access exclusively to information related to the company where the user works. This means that on the backend we need to track who the logged-in user is (a simple auth problem) as well as determine which company's information is being accessed. The latter is not easy to determine from our REST API calls, because some of them do not include a company id, such as the last one shown above.
We decided that instead of tracking the company ID in the UI and sending it with each request, we would put it in the subdomain. So, assuming the ACME company has id=123, our API would change as follows:
acme.domain.com
acme.domain.com/employees
acme.domain.com/employees/987
This makes identifying the company very easy on the backend and requires minor changes to REST calls from our single-page app. However, my concern is that it breaks the RESTfulness of our API. This may also introduce some CORS problems, but I don't have a use case for it now.
I would like to hear your thoughts on this and how you dealt with this problem in the past.
Thanks!
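For concreteness, the backend's company lookup would shrink to something like this (a sketch, assuming a Node/TypeScript backend; the in-memory table stands in for a real database query):

    // Resolve the tenant from the Host header: "acme.domain.com" -> "acme" -> "123".
    const companyIdBySlug: Record<string, string> = { acme: "123" }; // stand-in for a DB lookup

    function companyFromHost(host: string | undefined): string | undefined {
      const subdomain = host?.split(".")[0];
      return subdomain ? companyIdBySlug[subdomain] : undefined;
    }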
In a similar application, we put the 'company id' into the path (every company-specific path), not into a subdomain.
I wouldn't care a jot about whether some terminology enthusiast thought my design was "RESTful" or not, but I can see several disadvantages to using domains, mostly stemming from the fact that the world tends to assume that the domain identifies "the server", and the path is how you find an item on that server. There is a certain amount of extra stuff you'll have to deal with when using multiple domains that you wouldn't with paths:
HTTPS - you'd need a wildcard certificate instead of a simple one
DNS - you're either going to have wildcard DNS entries, or your application management is now going to involve DNS management
All the CORS stuff you mention - it may or may not be a headache in your specific application, but anything that makes 'same domain' assumptions about security policy is going to be affected.
Of course, if you want lots of isolation between companies, and you would effectively be as happy running a separate server for each company, then it's not a bad design. I can't see that it's more or less RESTful; that's just a matter of viewpoint.
There is nothing "unrestful" in using subdomains. URIs in REST are opaque, meaning that you don't really care what the URI looks like, only that every single resource in the system can be identified and referenced independently.
Also, in a RESTful application you never compose URLs manually; you traverse the hypermedia links you find at the API endpoint and in all the returned responses. Since you don't need to compose URIs by hand, from the REST point of view it makes no difference how they look. Having a URI such as
//domain.com/ABGHTYT12345H
would be as RESTful as
//domain.com/companies/acme/employees/123
or
//domain.com/acme/employees/smith-charles
or
//acme.domain.com/employees/123
All of those are equally RESTful.
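To illustrate the point about traversing hypermedia links: every response carries the URIs the client needs, so the client never builds them - a sketch in a HAL-like style, with invented field names:

    {
      "id": "987",
      "name": "Charles Smith",
      "_links": {
        "self":    { "href": "https://acme.domain.com/employees/987" },
        "company": { "href": "https://acme.domain.com" }
      }
    }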
But... I like to think of usable APIs, and when it comes to usability, having readable, meaningful URLs is a must for me. Following conventions is also a good idea. In your particular case there is nothing un-RESTful about the route, but it is unusual to find that kind of behaviour in an API, so it might not be the best practice. And, as someone pointed out, it might complicate your development (not specifically the CORS part, though; that one is easily solved by sending a few HTTP headers).
So, even though I can't see anything non-REST in your proposal, the conventions elsewhere argue against subdomains in an API.
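For reference, the "few HTTP headers" are the standard CORS response headers the API would send; the origin value here is illustrative:

    Access-Control-Allow-Origin: https://acme.domain.com
    Access-Control-Allow-Methods: GET, POST, PUT, DELETE
    Access-Control-Allow-Headers: Content-Type, Authorization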

SEO question, and about Server.Transfer (ASP.NET)

So, we're trying to move our application up in the search engine rankings, and one way our SEO guy told us to do that was to register similar domains... for example, we have something like
http://www.myapplication.com/parks.html
so... we acquired the domain parks.com (again, just an example).
Now when people go to http://www.parks.com ...we want it to display the content of http://www.myapplication.com/parks.html.
I could just put a forwarding page there, but from what I've been told that makes us look bad, because it's technically a permanent redirect... and we're trying to get higher in the search engine rankings, not lower.
Is this a situation where we would use the Server.Transfer method of ASP.net?
How are situations like this handled? I've definitely seen this done by many websites.
We also don't want to cheat the system; we are showing relevant content, not spam, and not tricking customers in any way, so the proper way to achieve what I'm looking for would be great.
Thanks
Use your "similar" domain names to host individual and targetted landing pages that will point to your master content.
It's easier to manage and you will get a higher conversion rate.
Having to create individual page will force you to write relevent content and will increase the popularity of the page.
I also suggest you to not only build landing pages, but mini sites (of few pages).
SEO is sa very high demanding task.
Regarding the technical aspects: Server.Transfer is what you should use. Never use Response.Redirect; Google and other search engines will drop your ranking.
I used permanent URL rewriting in the past. I changed my website, and since lots of traffic was coming from other websites linking to mine, I wanted a permanent solution.
Read more about URL rewriting: http://msdn.microsoft.com/en-us/library/ms972974.aspx
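For readers outside ASP.NET, the distinction maps roughly onto this sketch (Node/TypeScript as a neutral stand-in, not ASP.NET itself; the host names are the asker's examples): Server.Transfer hands the request to other content server-side with no extra round trip, while Response.Redirect sends the browser an explicit redirect:

    import * as http from "node:http";

    http.createServer((req, res) => {
      if (req.headers.host === "www.parks.com") {
        // "Server.Transfer" style: serve the master content directly;
        // no redirect happens and the address bar keeps parks.com.
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end("<!-- parks.html content rendered here -->");
      } else {
        // "Response.Redirect" style: an explicit redirect the browser follows.
        res.writeHead(301, { Location: "http://www.myapplication.com/parks.html" });
        res.end();
      }
    }).listen(8080);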

What's the best practice to use subdomains, achieve SEO, keep the system scalable, and isolate the applications?

We are developing a website quite similar to ebay.com, and in order to upgrade/maintain it without much effort we decided to split/isolate different parts of the website like eBay does (e.g. the item page/application will be served from cgi.domain.com, the sign-in application from signin.domain.com, the shopping cart application from offer.domain.com, the search features from search.domain.com, etc.). Each major application/function of the site will be deployed on a different server. Another reason for isolating the applications is security.
I also need to mention that one application is deployed on Google App Engine.
However, we received some "warnings" that this will affect SEO dramatically, so I have two questions :)
Is it true? Do subdomains decrease the PageRank of the website?
If it's true, how can we sort this out? Should we use a separate server acting as a routing/reverse proxy to rewrite (e.g. search.domain.com => domain.com/search, etc.)?
What's the best practice to achieve simplicity/isolation of the applications + SEO + security + scalability in a website?
Thank you in advance!
Search engines no longer see subdomains as separate sites. This changed around September 2011. Whether your link juice carries over is another thing, and it's not really explained (as of yet). Here is a reference: http://searchengineland.com/new-google-classifies-subdomains-as-internal-links-within-webmaster-tools-91401
No, multiple subdomains will not decrease the PageRank of the main website. However, they don't contribute to its PageRank either (because the search engines see them as separate sites).
However, for the sort of site you're working on, that looks like it would be OK. For example, the only thing you really want indexed is the product listings anyway - you don't need login, search results and the like to be indexed. Also, since external websites aren't going to link to your login pages or search results either (I assume they'll only link to product pages as well), you don't really care about those other subdomains contributing to your PageRank.
Personally, I think people put too much focus on making sites "SEO-friendly". As long as the site is user-friendly, SEO-friendliness will follow.

Website migration and differences in Firebug time profiles

I have a PHP website under Apache (at enginehosting.com). I rewrote it in ASP.NET MVC and installed it at discountasp.net. I am comparing response times with Firebug.
Here is the old time profile:
Here is the new one:
Basically, I get longer response times with the new site (not obvious in the pictures I posted here, but on average yes, sometimes with a big difference, like 2 s for the old site versus 9 s for the new one), and images appear more progressively (as opposed to almost instantly with the old site). Moreover, the time profile is completely different. As you can see in the second picture, a long time is spent on DNS lookup, and this happens for images only (the raw HTML is even faster on the new site). I thought that once a URL had been resolved, the result would apply to all subsequent requests...
Also note that since I still want to keep my domain pointed at the old location while I'm testing, my new site is under a weird URL like myname.web436.discountasp.net. Could that be the reason? If not, what else could it be?
If this is more a serverfault question, feel free to move it.
Thanks
Unfortunately you're comparing apples and oranges here. The test results shown are of little use because you're trying to compare the performance of an application written using a different technology AND on a different hosting company's shared platform.
We could speculate any number of reasons why there may be a difference:
ASP.NET MVC first hit and lag due to warmup and compilation
The server that you're hosting on at DiscountASP may be under heavy load
The server at EngineHosting may be under utilised
The bandwidth available at DiscountASP may be under contention
You perhaps need to profile and optimise your code
...and so on.
But until you benchmark both applications on the same machine, you're not making a proper scientific comparison and are just grasping at straws.
Finally, ignore the myname.web436.discountasp.net URL; that's just a host name/header that DiscountASP and many other hosts add so you can test your site while you're waiting for a domain to be transferred/registered, or for DNS propagation of the real domain name to complete. You usually can't use the IP address of your site, because most shared hosts share a single IP address across multiple sites on the same server and use HTTP Host headers.