New to CloudFlare and confused by a few concepts such as Page Rules.
Simply put, my main aim is to make my WP blog load faster, as it is quite slow currently. I have already signed up for a free account at CloudFlare. My question is: is creating a page rule mandatory after adding a website?
What kind of additional settings should I make?
No, it's not mandatory to create a page rule. But CloudFlare doesn't cache pages (the HTML) by default.
Your best option is to install a WP caching plugin like Comet Cache. It's intelligent enough to know when to flush cached pages. You'll get a 10x speed boost. From there CloudFlare will speed up static resources, etc.
You can have a similar and faster HTML caching mechanism at CloudFlare on the free plan, but not intelligent WP-aware caching; that's only on paid plans.
I am exploring the support and usage of CDN in a Hybris solution.
I am a Hybris newbie and am working through the wiki to understand the product better.
I am unable to find answers based on my search for CDN in conjunction with Hybris.
What are the typical CDN providers used in a Hybris solution? Any references would be helpful.
Appreciate any pointers.
PS: This is not a programming question. If this question is considered inappropriate let me know and I will delete this.
Why would there be any preferred CDN providers? You can choose any provider you want but take these questions into consideration:
How is the TTL defined? By cache headers (set by hybris) or manually set on the CDN's side? The possible cache headers are Cache-Control, Surrogate-Control and Edge-Control. Akamai, for example, uses Edge-Control, but I'm not aware of any other CDN that uses this header.
The choice of your CDN will also depend on where the content will be required: do you need to serve it worldwide, or do you want to improve your quality of service in certain areas only by adding POPs?
Does content sometimes need to be invalidated from the cache? If yes, check if there is an API to do so (will require work to make hybris communicate with the API).
The easiest solution would be a basic cache where the TTLs would be defined in the CDN’s configuration.
If you choose to go with cache headers, this is quite a simple solution to set up in hybris: you only have to define a request handler that will take care of them.
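hybris itself is Java-based, so this isn't hybris code, just a minimal Node/TypeScript sketch of the idea with made-up TTL values, showing the kind of headers such a handler would emit:

    import { createServer } from "node:http";

    // Sketch only: emit the cache headers a CDN might honour.
    createServer((_req, res) => {
      // Browsers and generic proxies read Cache-Control.
      res.setHeader("Cache-Control", "public, max-age=300");
      // Some CDNs honour Surrogate-Control (and strip it before the browser sees it).
      res.setHeader("Surrogate-Control", "max-age=86400");
      // Akamai reads Edge-Control, as mentioned above.
      res.setHeader("Edge-Control", "cache-maxage=1d");
      res.setHeader("Content-Type", "text/html; charset=utf-8");
      res.end("<html><body>cacheable page</body></html>");
    }).listen(8080);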
Hybris is usually used for some sort of ecommerce site. I have seen great results using ImageEngine.io as an image-specific CDN for Hybris. ImageEngine optimizes images on the fly, which makes your site load faster. Worth a look: http://imageengine.io
Recently a user told me to avoid subdomains when I can. I remember reading that Google considers a subdomain to be a separate site (is this true?). What else happens when I use a subdomain, and when should or shouldn't I use one?
I heard cookies are not shared between subdomains? I know two images can be downloaded simultaneously from a site. Would I be able to download four if I use sub1.mysite.com and sub2.mysite.com?
What else should I know?
You can share cookies between subdomains, provided you set the right parameters in the cookie. By default, they won't be shared, though.
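For illustration, the difference is the cookie's Domain attribute (the cookie name and domain below are made up):

    // Host-only cookie: sent back only to the exact host that set it.
    document.cookie = "session_id=abc123; path=/";

    // With Domain set to the parent domain, www.mysite.com, sub1.mysite.com
    // and sub2.mysite.com will all receive the cookie.
    document.cookie = "session_id=abc123; path=/; domain=.mysite.com";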
Yes, you can get more simultaneous downloads if you store images on different subdomains. However, the other side of the scale is that the user spends more time on DNS lookups, so it's not practical to have, say, 25 subdomains to get 50 simultaneous downloads.
Another thing that happens with subdomains is that AJAX requests won't work without some effort (you CAN make them work using document.domain tricks, but it's far from straightforward).
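A sketch of that legacy document.domain trick, for what it's worth (both sides have to opt in):

    // Run this both on the page at sub1.mysite.com and in a hidden iframe
    // loaded from sub2.mysite.com. Afterwards the two frames can script each
    // other, so the iframe can make requests on its own host and hand the
    // results back to the parent page.
    document.domain = "mysite.com";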
I can't help with the SEO part, although some people discourage having both yoursite.com and www.yoursite.com working and returning the same content, because it "dilutes your PageRank". Not sure how true that is.
You complicate quite a few things: collecting stats, controlling spiders, HTML5 storage, XSS, inter-frame communication, virtual-host setup, third-party ad serving, and interaction with remote APIs like Google Maps.
That's not to say these things can't be solved, just that the rise in complexity adds more work and may not provide suitable benefits to compensate.
I should add that I went down this path once myself for a classifieds site, adding subdomains like porsche.site.com and ferrari.site.com in the hope of boosting rank for those keywords. In the end I did not see any noticeable improvement, and, even worse, Google was walking the entire site via each subdomain, meaning that a search for Ferraris might return porsche.site.com/ferraris instead of ferrari.site.com/ferraris. In short, Google considered the subdomains to be duplicates, but it still crawled each one every time it visited.
Again, workarounds existed but I chose simplicity and I don't regret it.
If you use subdomains to store your web site's images, JavaScript, stylesheets, etc., then your pages may load quicker. Browsers limit the number of simultaneous connections to each domain name, so the more subdomains you use, the more connections can be made at the same time to fetch the page's content.
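One practical detail if you go this route: map each asset to the same subdomain every time, otherwise the browser ends up caching the same image once per host. A rough TypeScript sketch (the hostnames are made up):

    // Pick a deterministic shard host for each asset path so a given image
    // always comes from the same subdomain and stays cached.
    const shards = ["sub1.mysite.com", "sub2.mysite.com"];

    function shardUrl(path: string): string {
      let hash = 0;
      for (const ch of path) {
        hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // tiny string hash
      }
      return `https://${shards[hash % shards.length]}${path}`;
    }

    console.log(shardUrl("/images/logo.png")); // always the same host for this path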
Recently a user told me to avoid subdomains when I can. I remember reading that Google considers a subdomain to be a separate site (is this true?). What else happens when I use a subdomain, and when should or shouldn't I use one?
The last thing I heard about Google optimization is that domains count for more PageRank than subdomains. I also believe that PageRank calculations are per page, not per site (according to the algorithm, etc.), though the only person who can really tell you is a Google employee.
I heard cookies are not shared between subdomains?
You should be able to use a cookie for all subdomains: www.mysite.com, sub1.mysite.com and sub2.mysite.com can all share the same cookies if the cookie's Domain attribute is set to .mysite.com, but a cookie set only for the exact host mysite.com (with no Domain attribute) cannot be shared with them.
I know two images can be downloaded simultaneously from a site. Would I be able to download four if I use sub1.mysite.com and sub2.mysite.com?
I'm not sure what you mean by downloading simultaneously. It depends on the browser: browsers limit how many connections they will open to a single host name, so they only fetch a few images from one domain at a time, but they can download items from different domains at the same time.
I have a site which has been developed completely in Flash. The site owners do not want to move to a more text/HTML-based site, so I am planning to create an alternative HTML/text-based version that Googlebot will be redirected to (by checking the user agent). My question is: is this officially allowed by Google?
If not, then how come there are many subscription-based sites which display a different set of data to Google than to their users? Is that allowed?
Thank you very much.
I've dealt with this exact scenario for a large ecommerce site and Google essentially ignored the site. Google considers it cloaking and addresses it directly here and says:
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
Instead, create an ADA-compliant version of the website so that users with screen readers and vision aids can use your web site. As long as there is a link from your home page to your ADA-compliant pages, Google will index them.
The official advice seems to be: offer a visible link to a non-Flash version of the site. Fooling Googlebot is a surefire way to get in trouble. And remember, Google results will link to the matching page, so do not create results that are useless to visitors.
Google already indexes Flash content, so my suggestion would be to check how your site is being indexed. Maybe you don't have to do anything.
I don't think showing an alternate version of the site is good from a Google perspective.
If you serve up your page at the exact same address, then you're probably fine. For example, if you show 'http://www.somesite.com/' but direct Googlebot to 'http://www.somesite.com/alt.htm', then Google might direct search users to alt.htm. You don't want that, right?
This is called cloaking. I'm not sure what the effects of it are, but it is certainly not white hat. I am pretty sure Google is working on a way to crawl Flash now, so it might not even be a concern.
I'm assuming you're not really doing a redirect but instead a PHP include or something similar, so it shows up as the same page. If you're actually redirecting, then Google is just going to index the other page like normal.
Some sites offer a different level of content -- they LIMIT the content; they don't offer alternative or additional content. This is generally done so that Google doesn't index unrelated things.
In order to allow for multiple policies regarding content (security, cookies, sessions, etc.), I'm considering moving some content from my sites to their own domains, and I was wondering what kind of dividends it would pay off (if any).
I understand cookies are domain specific and are sent on every request (even for images) so if they grow too large they could start affecting performance, so moving static content in this way makes sense (to me at least).
Since I expect that someone out there has already done something similar, I was wondering if you could provide some feedback on the pros and the cons.
I don't know of any situation that fits your reasons that can't be controlled in the settings for the HTTP server, whether it be Apache, IIS or whatever else you might be using.
I assume you mean you want to split them up into separate hosts, i.e. www1.domain.com and www2.domain.com. And you are correct that cookies are host/domain specific. However, there aren't really any gains if www1 and www2 are still the same computer. If you are experiencing load issues and split the load between two different servers, there could be some gains there.
If you actually mean different domains (www.domain1.com and www.domain2.com), I'm not sure what kind of benefits you would be looking for...
I am developing a small intranet based web application. I have YSlow installed and it suggests I do several things but they don't seem relevant for me.
e.g. I do not need a CDN.
My application is slow, so I want to reduce the bandwidth used by requests.
What rules of YSlow should I adhere to?
Are there alternative tools for smaller sites?
What is the check list I should apply before rolling out my application?
I am using ASP.net.
Bandwidth on intranet sites shouldn't be an issue at all (unless you have VPN users, that is). If you don't and it's still crawling, it's probably something to do with the backend rather than the front-facing structure.
If you are trying to optimise for remote users, some of the same things apply to try and optimise the whole thing:
Don't use 30 stylesheets; concatenate them into one
Don't use 30 JS files; concatenate them into one (see the sketch after this list)
Consider compressing both JS and CSS using minifiers or the YUI compressor.
Consider using sprites (a single image containing multiple versions, e.g. button-up and button-down, one above the other)
Obviously, massive images are a no-no
Make sure you send Expires headers so that stylesheets/JS/images/etc. are all cached for a sensible amount of time.
Make sure your pages aren't ridiculously large. If you're in a controlled environment and you can guarantee JS availability, you might want to page data with AJAX.
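As a rough illustration of the "combine your files" points above (the paths are made up, and a real build would also run a proper minifier), a tiny Node/TypeScript build step could look like this:

    import { readFileSync, writeFileSync } from "node:fs";

    // Concatenate several stylesheets/scripts into one file each, so the page
    // makes two requests instead of dozens. The /* ... */ marker is valid in
    // both CSS and JS.
    function concat(inputs: string[], output: string): void {
      const combined = inputs
        .map((file) => `/* ${file} */\n` + readFileSync(file, "utf8"))
        .join("\n");
      writeFileSync(output, combined);
    }

    concat(["css/reset.css", "css/layout.css", "css/theme.css"], "css/site.css");
    concat(["js/utils.js", "js/widgets.js", "js/app.js"], "js/site.js");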
To begin, limit the number of HTTP requests made for images, scripts and other resources by combining them where possible. Consider minifying them too. I would recommend Fiddler for debugging HTTP.
Be mindful of the size of ViewState: set EnableViewState = false where possible. For example, for drop-down list controls that never have their list of items changed, disable ViewState and populate them in Page_Init or by overriding OnLoad. "TRULY Understanding ViewState" is a must-read article on the subject.
Oli posted an answer while I was writing this, and I have to agree that bandwidth considerations should be secondary or tertiary for an intranet application.
I've discovered Page Speed since asking this question. It's not specifically aimed at smaller sites, but it is another great Firebug plug-in.
Update: As of June 2015, the Page Speed plugins for Firefox and Chrome are no longer maintained or available; instead, Google suggests the web version.
Pingdom tools provides a quick test for any publicly accessible web page.