I've noticed that many pages with heavy JavaScript load certain files at the end of the page so the content renders faster. Can Sprockets help me do this in Rails 3? If not, what methods can I use to improve performance when using scripts?
There are a few things to consider here - first, the browser. Most modern browsers can asynchronously load resources, so the old trick of putting javascript at the end of the html isn't as necessary as it once was. Have a read through this article on async at css-tricks: http://css-tricks.com/thinking-async/
Next, the point of the asset pipeline is to concatenate, minify and compress all the javascript and css into two files, one 'application.js' and one 'application.css'.
This serves to reduce HTTP requests, and it means only the first page load takes the hit: with proper caching, the first visit takes longer while the browser downloads the combined files, but after that the user has the JS and stylesheets cached on their machine.
Have a read through the asset pipeline guide for more information about its use and benefits: http://guides.rubyonrails.org/asset_pipeline.html
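For reference, the default Rails 3.1 JavaScript manifest looks something like this (these are the stock directives a new app generates; yours may differ):

    // app/assets/javascripts/application.js
    // Sprockets reads these directives and serves one concatenated, minified file.
    //= require jquery
    //= require jquery_ujs
    //= require_tree .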
Our client's website loads really slowly on the first visit (the TTFB on the page document can be 10-20s). If I reload the page, the site loads a lot faster.
Is this because a lot of the files are then cached?
Website is here: https://www.mortels.com.au/
This happens for a lot of the pages.
I have tried merging some of the .css files, and will attempt the .js files if I cannot find anything else (I never built the original theme, so I'm finding it hard to figure out what is done where, and I don't have much experience with developing in Shopify).
I also tried adding a lazy loader, but it doesn't look like it is working.
Would anyone have any solutions to make the website load quicker? Could it be just the apps we have running on the website causing the initial response to be so slow?
One of the things that can hinder your site's load speed is having too much logic happening through Liquid tags. Shopify has to parse all of the page's Liquid code before it can serve the page, and that has a direct effect on the TTFB.
For the pages that have unacceptable TTFB ratings, some things you can try to help Shopify's servers serve your content faster include:
Reducing the number of lookups (e.g. through all_products[handle]) on the page
Avoiding nested for loops whenever possible
Replacing loops with map whenever you need to make an array of values
Rewriting logic-heavy sections to run in JavaScript instead of Liquid (using the | json filter to expose your Liquid variables in a JavaScript-friendly form - see the sketch after this list)
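As a rough sketch of that last point, assuming a collection page (the collection and property names are just examples - adjust to your theme):

    {% comment %} Dump the data once with | json, then do the heavy lifting client-side {% endcomment %}
    <script>
      var products = {{ collection.products | json }};
      // e.g. filter in JavaScript instead of a logic-heavy Liquid loop
      var onSale = products.filter(function (p) {
        return p.compare_at_price_max > p.price;
      });
    </script>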
Hope this helps!
I've made a minor text change to our website; the change is minor in that it's only a couple of words, but the change in meaning is quite significant.
I want all users (both new and returning) to see the new text rather than any cached version. Is there a way I can force a user's browser to re-download the page, rather than fetch it from its cache?
The site is a static html site hosted on a LAMP server.
This depends entirely on how your web server has caching set up but, in short, if the page is already cached then you cannot force a re-download until the cache expires. So you'll need to look at the cache headers in your browser's developer tools to see how long they're set for.
Caching gives huge performance benefits and, in my opinion, really should be used. However, that does mean forcing a refresh is difficult, as you've discovered.
In case you're interested in how to handle this in the future, there are various cache-busting methods, all of which basically involve changing the URL to fool the browser into thinking it's a different resource, forcing a fresh download.
For example, you can add a version number to a resource, so instead of requesting index.html the browser asks for index2.html - but that could mean renaming the file, and all references to it, each time.
You can also set up rewrites in Apache using regular expressions so that index[0-9]*.html actually loads index.html. That way you don't need multiple copies of the file: you can refer to it as index2.html, index3.html, or even index5274.html, and Apache will always serve the contents of index.html.
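A minimal sketch of that rewrite, assuming mod_rewrite is enabled and this lives in the relevant .htaccess or vhost config:

    # Serve index.html for index2.html, index3.html, index5274.html, etc.
    RewriteEngine On
    RewriteRule ^index[0-9]+\.html$ index.html [L]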
These methods, though a little complicated to maintain unless you have an automated build process, work very well for resources that users don't see, such as CSS stylesheets or JavaScript files.
Cache busting techniques work less well for HTML pages themselves for a number of reasons: 1) they create unfriendly urls, 2) they cannot be used for default files where the file name itself is not specified (e.g. the home page) and 3) without changing the source page, your browser can't pick up the new URLs. For this reason some sites turn off caching for the HTML pages, so they are always reloaded.
Personally I think not caching HTML pages is a lost opportunity. For example visitors often visit a site's home page, and then try a few pages, going back to the home page in between. If you have no caching then the pages will be reloaded each time despite the fact it's likely not to have changed in between. So I prefer to have a short expiry and just live with the fact I can't force a refresh during that time.
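For what it's worth, that setup might look something like this in Apache (assuming mod_expires is enabled; the times are just examples):

    ExpiresActive On
    # HTML: short expiry so text changes show up quickly
    ExpiresByType text/html "access plus 10 minutes"
    # Versioned CSS and JS: safe to cache for a long time
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"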
I want to exclude a number of items on my website when viewing it on a screen with a small resolution, so I'm using CSS media queries to hide the items. But a Facebook Like box, for example, is still loaded in the background - a number of JavaScript and CSS files - how do I prevent them from being loaded in order to decrease the page load time?
Right now I'm using PHP to check the user agent and then simply exclude the code, but I wonder if that is really the best way. Over time I will have to update the PHP when new devices are introduced, for instance. Is a script like Modernizr an option here?
Using a back-end and front-end mix would be best. You can use something like WURFL or even Apache Mobile Filter to make it easier to work on the back-end, and loading certain content only when is_tablet() or is_mobile() are true.
On the other hand, I'd recommend checking out jQuery image lazy-loading and on-demand loading of share buttons (like techcrunch.com does). Those two front-end techniques can really improve page load times (they cut the initial page load time almost in half on my site).
Depending on the situation, you could also use Modernizr (a front-end JavaScript solution) to detect something truthy (such as the viewport being wider than 320px) and then conditionally load some scripts based on that.
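A minimal plain-JavaScript sketch of the same idea, only injecting the Facebook SDK on wider screens (the breakpoint and SDK URL are illustrative):

    // Only load the Facebook widgets when the viewport is wide enough
    if (window.matchMedia && window.matchMedia('(min-width: 768px)').matches) {
      var fb = document.createElement('script');
      fb.async = true;
      fb.src = 'https://connect.facebook.net/en_US/all.js#xfbml=1';
      document.getElementsByTagName('head')[0].appendChild(fb);
    }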
We want to set expires headers for the images, CSS and JavaScript we use to improve page speed, but we are aware of the caching problem when modifying a CSS or JS file.
Is it possible to add a meta or other tag to the XHTML which tells the browser to refresh every element, no matter what caching is set on existing images, CSS or JS?
As far as I know, there's no such shortcut.
And even if there were - what would be the point? Sending such a header would defeat the purpose of the far-future expires header in the first place.
When setting expiration headers, you need to add some sort of asset versioning to your elements, such as <link rel="stylesheet" href="css/style.css?v=2">
Changing the path to the asset would achieve the same goal.
Yes this is a hassle. But cache invalidation is a hard problem, there's really no easy way around it.
What you probably want to do is version your static content. Say you have a main.css file: you can version it by renaming it to main_0.css (just an example) and then set the cache expiry to a year. If you ever need to update main.css, just increment the version number and update your references; then all clients will get the latest version.
There are several tools that can do this versioning for you, but this is the basic principle.
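In your HTML that just means the reference changes with each release, something like this (file names are illustrative):

    <!-- release 1 -->
    <link rel="stylesheet" href="/css/main_0.css">
    <!-- after editing the stylesheet, bump the version and update the reference -->
    <link rel="stylesheet" href="/css/main_1.css">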
I have always found these resources very useful for cache-related questions:
http://code.google.com/intl/no/speed/page-speed/docs/caching.html
http://code.google.com/intl/no/speed/page-speed/docs/filters.html
Hope this helps.
EDIT
Here is a versioning solution (mod_pagespeed) that does what I explained above:
http://code.google.com/intl/no/speed/page-speed/docs/filter-cache-extend.html
Is it possible to add a meta or other tag to the XHTML which tells the browser to refresh every element, no matter what caching is set on existing images, CSS or JS?
Not as far as I'm aware.
We want to set expires headers for the images, CSS and JavaScript we use to improve page speed
This is a good thing in my opinion.
but we are aware of the caching problem when modifying a CSS or JS file.
If you're making drastic changes to CSS or JS then you ought to be staging the changes anyway. Manage this by putting the new CSS and JS on a different path and changing the references in your HTML when you make the switch (see the sketch after this list). This allows you to:
instantly roll out / back
make selective rollouts by dynamically generating those references
web logs now provide a clear audit trail for following up bug reports
avoid any issues with short term caching of CSS and JS
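For example, the HTML for each release just points at the current path (the version directory name here is made up for illustration):

    <!-- current release -->
    <link rel="stylesheet" href="/static/v42/site.css">
    <script src="/static/v42/site.js"></script>
    <!-- to roll out (or roll back), change the path and redeploy the HTML -->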
OK, so I'm starting a new project using Rails 3.1 and I'm new to CoffeeScript.
Anyway, I like the idea of having asset files that correspond to controllers, but what if I only want a controller's JS to be loaded when that controller is used?
For example, I have a controller called Games. I put some code in my games.js.coffee file and it's run on every page request, even pages that have nothing to do with Games.
In Rails 3.0.7, what I would do is put yield(:js) in the application layout and then call content_for(:js) in my Games#action view. That way, only the JS that was needed for that controller was loaded.
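Roughly, what I was doing looked like this (view paths simplified):

    <%# app/views/layouts/application.html.erb %>
    <%= yield(:js) %>

    <%# app/views/games/show.html.erb %>
    <% content_for :js do %>
      <%= javascript_include_tag "games" %>
    <% end %>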
Or, am I going about this the wrong way? Is it better to have ALL js code loaded and cached for every page request to improve performance?
Thanks for any suggestions.
Is it better to have ALL js code loaded and cached for every page request to improve performance?
Basically, the Rails team decided that the answer is usually "yes." Most sites work best with just a single (minified) JS file, which you get by default in Rails 3.1. That way, once the user has accessed a single page of your site, all other pages will load quickly, since all the JS has been cached.
If your site has a huge amount of JS (think Facebook), then I'd suggest architecting your site to load rarely-used JS code asynchronously, using a library like RequireJS. Otherwise, I wouldn't bother loading different code under different controllers; it's a lot of extra work for a marginal benefit.
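A rough sketch of that approach with RequireJS (the module name, init function and data attribute are made up for illustration):

    // Only pull in the games-specific code on pages that declare it
    if (document.body.getAttribute('data-controller') === 'games') {
      require(['games'], function (games) {
        games.init(); // hypothetical entry point exposed by the module
      });
    }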
Take a look at this plugin, I think it solves your problem: https://github.com/snitko/specific_assets