What is the approach for lazy loading images for pages that conform to the Google Accelerated Mobile Pages (AMP) project?
It is my understanding that lazy loading means the image is loaded to the client some time after the above-the-fold content is rendered. There appear to be three variants of lazy loading:
viewport - image loading is triggered by the image's proximity to the viewport
set delay - image loading is triggered by a timer
manual - image loading is triggered by a client event
All of these approaches can be implemented with JavaScript. However, AMP does not allow arbitrary JavaScript, only its own custom AMP scripts.
The viewport approach is the most desirable, as it keeps content loading to a minimum. AMP has a custom script that supports the viewport approach for starting and stopping videos, but there does not appear to be similar support for amp-img.
The set delay approach appears to be supported with amp-animation, I think, but it seems rather complex. The delay approach is also undesirable because the optimum delay depends on internet bandwidth, which is variable.
The delay approach could also be implemented with PHP, but PHP is parsed on the server side, which means the page would have to be reloaded - and that entirely defeats the purpose of lazy loading.
It seems that the manual approach is all that remains. The following snippet of code creates two buttons that selectively show or hide an image.
<amp-img id="img1" ... hidden></amp-img>
<button on="tap:img1.show">Show image</button>
<button on="tap:img1.hide">Hide image</button>
For mobile sites, the show button can be embedded in the above-the-fold content, so the user triggers it while perusing the site. However, this seems like a kludge.
Does Google AMP effectively limit lazy loading to the manual approach, or is there some other way to accomplish viewport or delay lazy loading?
Thank you in advance.
All AMP elements are always lazy-loaded. There is currently no way to configure the thresholds for lazy loading (e.g. based on distance from the viewport).
According to the AMP documentation, AMP images are lazy loaded. It states that the <amp-img> tag is used by the AMP runtime to:
understand the layout of the page before assets load, crucial to support first-viewport preloading
control network requests to lazy load and prioritize resources effectively
So, in short: you don't need to lazy load the images. They already do it for you.
It makes sense for them to do so, especially since the whole point of AMP is performance, and lazy loading is one of the most basic things that can be done to improve speed.
In fact, AMP does something even more clever: prefetching lazy loaded resources:
"AMP also prefetches lazy-loaded resources. Resources are loaded as late as possible, but prefetched as early as possible. That way things load very fast but CPU is only used when resources are actually shown to users."
If you dig a little deeper into the AMP runtime, you'll see that they actually implement more advanced techniques than just lazy loading, which is why AMP is near instantaneous...
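For reference, a plain amp-img declaration is all the markup you need; the runtime decides when to actually fetch the image. (The src, dimensions and alt text below are just placeholders for illustration.)

<amp-img src="/img/hero.jpg" width="600" height="400" layout="responsive" alt="Product photo"></amp-img>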
I have a Vue/Nuxt web app where pages are dynamically generated from lots of components that have child components.
The trouble is the header and footer are rendered first, then the child components that have the actual content. This looks terrible on first load and Lighthouse doesn't like it: it's an "Avoid large layout shifts" failure. For context, it's only an issue with client-side rendering; SSR would eliminate this issue while introducing others.
What I could do is edit every single component in my project and add an event on mounted. That could then be used to decide when to show the layout. The problem is it would be a major hassle and would cause bugs when new components are added and this bit is forgotten.
I'm not able to find any general solution to this in Vue and/or Nuxt. I'd love to have a new lifecycle hook of allMounted which would only fire when child components are also mounted, but it doesn't exist. Even that would be a bit hacky. An even more general "render when all components are mounted" option would be awesome.
I'm not sure that a dynamic component can help in your case; I guess that your company's website will not really benefit from it. Indeed, the problem of the content jumping will still be present IMO.
<component :is="currentTabComponent"></component>
I still think that your content is highly static and that you could even switch to full static generation to get the best performance benefits, rather than having to wait a long time (TTFB) while the SPA loads all the content. It may be a bit more challenging to have everything look nice, of course (before/after the hydration).
Also, you should have an idea of the approximate size of your containers. In that case, you could use some skeletons and maybe even a prototyping font to visually populate the blocks.
In case you do not agree or think that this is not doable, you still have this solution at your disposal:
<child-component @hook:mounted="makeSomeStuff"></child-component>
With this you may be able to display a full-sized loader until your content is done loading. You could add a mixin with the longer mounted syntax in each component to avoid too much boilerplate, but that syntax is deprecated and does have various issues.
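As a rough sketch of that idea (the component names, the loader markup and the counter are made up, not from your project), the parent could keep a loader visible and the real layout hidden until every direct child has reported its mounted hook:

<template>
  <div>
    <div v-if="!allMounted" class="loader">Loading...</div>
    <!-- v-show keeps the children mounting (and firing hook:mounted) while hidden -->
    <div v-show="allMounted">
      <header-bar @hook:mounted="onChildMounted"></header-bar>
      <content-section @hook:mounted="onChildMounted"></content-section>
      <footer-bar @hook:mounted="onChildMounted"></footer-bar>
    </div>
  </div>
</template>
<script>
export default {
  data() {
    // expectedChildren must match the number of children listened to above
    return { mountedCount: 0, expectedChildren: 3 };
  },
  computed: {
    allMounted() {
      return this.mountedCount >= this.expectedChildren;
    }
  },
  methods: {
    onChildMounted() {
      this.mountedCount++;
    }
  }
};
</script>

Note this only covers direct children; deeply nested components would still need their own handling, which is part of why the approach feels hacky.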
But IMO, the issue is more in the way you fetch the data (the asyncData and fetch hooks are nice) and the way everything is fully dynamic when there is no specific need. If it's more important to keep the dynamic part, I guess you can be strict in code reviews, or plug in some git hooks or the like to scan the code and check that the required mounted emits are in place.
There is no ideal solution in your case, but keep in mind that Lighthouse will always prefer SSR content with the least amount of JS. Here is my personal bible for anything performance related; you could probably grasp some nice tips in this really in-depth article.
Update for Vue3
The syntax has changed for Vue3: https://v3-migration.vuejs.org/breaking-changes/vnode-lifecycle-events.html#_2-x-syntax
I have an app with Knockout-bound views. It loads all HTML up front, to minimize requests to the server.
I want to introduce Durandal to the app. Is this scenario supported? If so how?
Can you give us some more information?
This scenario is possible, as there is nothing which would prevent you from loading your own HTML and binding it with Knockout.
You will, however, NOT be able to do this in a way that meets the standards and conventions of this framework.
If you have many html views, my suggestion would be to create a small program which converts these existing views and view-models into their Durandal analogues. It will save you loads of headaches later.
See the Durandal documentation on "Creating a View".
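For illustration only (the file layout, module names and observable are assumptions, not from your app), a converted Durandal-style pair is typically an AMD view-model module plus a matching HTML view that Durandal composes and binds with Knockout:

// viewmodels/customer.js
define(['knockout'], function (ko) {
    return {
        name: ko.observable('Customer'),
        activate: function () {
            // Durandal's composition lifecycle calls this before binding the view
        }
    };
});

<!-- views/customer.html -->
<div>
    <h2 data-bind="text: name"></h2>
</div>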
I have multiple pages on a site using RequireJS, and most pages have unique functionality. All of them share a host of common modules (jQuery, Backbone, and more); all of them have their own unique modules, as well. I'm wondering what is the best way to optimize this code using r.js. I see a number of alternatives suggested by different parts of RequireJS's and Almond's documentation and examples -- so I came up with the following list of possibilities I see, and I'm asking which one is most recommended (or if there's another better way):
Optimize a single JS file for the whole site, using Almond, which would load once and then stay cached. The downside of this most simple approach is that I'd be loading onto each page code that the user doesn't need for that page (i.e. modules specific to other pages). For each page, the JS loaded would be bigger than it needs to be.
Optimize a single JS file for each page, which would include both the common and the page-specific modules. That way I could include Almond in each page's file and would only load one JS file on each page -- which would be significantly smaller than a single JS file for the whole site would be. The downside I see, though, is that the common modules wouldn't be cached in the browser, right? For every page the user goes to she'd have to re-download the bulk of jQuery, Backbone, etc. (the common modules), as those libraries would constitute large parts of each unique single-page JS file. (This seems to be the approach of the RequireJS multipage example, except that the example doesn't use Almond.)
Optimize one JS file for common modules, and then another for each specific page. That way the user would cache the common modules' file and, browsing between pages, would only have to load a small page-specific JS file. Within this option I see two ways to finish it off, to include the RequireJS functionality:
a. Load the file require.js before the common modules on all pages, using the data-main syntax or a normal <script> tag -- not using Almond at all. That means each page would have three JS files: require.js, common modules, and page-specific modules.
b. It seems that this gist is suggesting a method for plugging Almond into each optimized file, so I wouldn't have to load require.js, but would instead include Almond in both my common modules AND my page-specific modules. Is that right? Is that more efficient than loading require.js upfront?
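To make sure I'm describing it right, here is roughly what I understand one page's build profile would look like with Almond inlined (the paths and module names are my own guesses, not from the gist):

({
    baseUrl: "js",
    name: "lib/almond",             // almond is bundled in place of the full require.js loader
    include: ["pages/page1"],       // the page-specific entry module and its dependencies
    insertRequire: ["pages/page1"], // kick the app off as soon as the bundle is parsed
    out: "dist/page1.js",
    wrap: true
})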
Thanks for any advice you can offer as to the best way to carry this out.
I think you've answered your own question pretty clearly.
For production we, as well as most companies I've worked with, use option 3.
Here are advantages of solution 3, and why I think you should use it:
It utilizes caching the most: all common functionality is loaded once, generating the least traffic and the fastest loading times when surfing multiple pages. Loading times across multiple pages are important, and while the traffic on your side might not be significant compared to other resources you're loading, the clients will really appreciate the faster load times.
It's the most logical, since most pages on the site share common functionality.
Here is an interesting advantage for solution 2:
You send the least data to each page. If a lot of your visitors are one-time visitors, for example on a landing page, this is your best bet. The importance of loading times cannot be overstated in conversion-oriented scenarios.
Are your visitors repeat visitors? Some studies suggest that 40% of visitors arrive with an empty cache.
Other considerations:
If most of your visitors visit a single page - consider option 2. Option 3 is great for sites where the average users visit multiple pages, but if the user visits a single page and that's all he sees - that's your best bet.
If you have a lot of JavaScript, consider loading some of it to give the user a visual indication, and then loading the rest in a deferred way, asynchronously (with script tag injection, or directly with require if you're already using it); a sketch follows after these points. The threshold for people to notice that something is 'clunky' in the UI is normally about 100ms. An example of this is GMail's 'loading...' indicator.
Given that HTTP connections are Keep-Alive by default in HTTP/1.1, or with an additional header in HTTP/1.0, sending multiple files is less of a problem than it was 5-10 years ago. Make sure you're sending the Keep-Alive header from your server for HTTP/1.0 clients.
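Here is a rough sketch of the deferred-loading idea mentioned above (the module names and the status element are invented for the example, since you're already using RequireJS):

// Show a visual indication immediately, GMail-style
document.getElementById("status").textContent = "loading...";

// Load the core UI first, then pull in the heavier, non-critical modules asynchronously
require(["app/core-ui"], function (coreUi) {
    coreUi.render(); // give the user something to look at quickly
    require(["app/charts", "app/comments"], function () {
        document.getElementById("status").textContent = "";
    });
});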
Some general advice and reading material:
JavaScript minification is a must, r.js for example does this nicely and your thought process in using it was correct. r.js also combines JavaScript which is a step in the right direction.
As I suggested, deferring JavaScript is really important too, and can drastically improve loading times. Deferring execution will make your page feel fast, which is very important - in some scenarios a lot more important than actually loading fast.
Anything you can load from a CDN, like external resources, you should load from a CDN. Some libraries people use today, like jQuery, are pretty big (~80 KB); fetching them from a shared CDN cache could really benefit you. In your example, I would not serve Backbone, Underscore and jQuery from your own site; rather, I'd load them from a CDN.
I created an example repository to demonstrate these 3 kinds of optimization.
It can help us get a better understanding of how to use r.js.
https://github.com/cloudchen/requirejs-bundle-examples
FYI, I prefer to use option 3, following the example in https://github.com/requirejs/example-multipage-shim
I am not sure whether it is the most efficient.
However, I find it convenient because:
I only need to configure require.config (for the various libraries) in one place
During r.js optimization, I can then decide which modules to group as common
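As a rough illustration (the directory layout and module names are placeholders, loosely following that example repo), the build profile ends up with one common bundle plus one small bundle per page that excludes it:

// build.js
({
    appDir: "www",
    baseUrl: "js/lib",
    dir: "www-built",
    mainConfigFile: "www/js/common.js",
    modules: [
        // jQuery, Backbone, etc. grouped once, cached across every page
        { name: "../common", include: ["jquery", "backbone"] },
        // per-page bundles stay small because they exclude what common already has
        { name: "../app/main-page1", exclude: ["../common"] },
        { name: "../app/main-page2", exclude: ["../common"] }
    ]
})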
I prefer to use option 3, and I can tell you why:
It's the most logical.
I have also listed some further options below.
You can use any content delivery network (CDN), like MaxCDN, to ensure your JS files get served quickly to everyone. Also, I'd suggest you put your JS files at the bottom of your HTML, in the footer. Hope that helps.
I'm working on a shopping site. We display 40 images in our results. We're looking to reduce the onload time of our page, and since images block the onload event, I'm considering lazy loading them by initially setting img.src="" and then setting it after onload. Note that this is not AJAX loading of HTML fragments; the image HTML along with the alt text is present. It's just the image src that is deferred.
Does anyone have any idea as to whether this may harm SEO or lead to a google penalty box now that they are measuring sitespeed?
Images don't block anything, they are already lazy loaded. The onload event notifies you that all of the content has been downloaded, including images, but that is long after the document is ready.
It might hurt your rank because of the lost keywords and empty src attributes. You'll probably lose more than you gain - you're better off optimizing your page in other ways, including your images. Gzip + fewer requests + proper expires + a fast static server should go a long way. There is also a free CDN that might interest you.
I'm sure Google doesn't mean for the whole web to remove their images from the source code to gain a few points. And keep in mind that they consider anything under 3s to be a good loading time, so there's plenty of room to wiggle before resorting to voodoo techniques.
From a pure SEO perspective, you shouldn't be indexing search result pages. You should index your home page and your product detail pages, and have a spiderable method of getting to those pages (category pages, sitemap.xml, etc.)
Here's what Matt Cutts has to say on the topic, in a post from 2007:
In general, we’ve seen that users usually don’t want to see search results (or copies of websites via proxies) in their search results. Proxied copies of websites and search results that don’t add much value already fall under our quality guidelines (e.g. “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.” and “Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches…”), so Google does take action to reduce the impact of those pages in our index.
http://www.mattcutts.com/blog/search-results-in-search-results/
This isn't to say that you're going to be penalised for indexing the search results, just that Google will place little value on them, so lazy-loading the images (or not) won't have much of an impact.
There are some different ways to approach this question.
Images don't block load. JavaScript does; stylesheets do to an extent (it's complicated); images do not. However, they will consume HTTP connections, of which the browser will only fire off 2 per domain at a time.
So, what you can do that should be worry-free and the "Right Thing" is a poor man's CDN: just drop the images on www1, www2, www3, etc. on your own site and servers. There are a number of ways to do that without much difficulty.
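A sketch of that poor man's CDN idea (the hostnames and paths are examples only): point the result images at a couple of extra hostnames for your own servers so more of them download in parallel.

<img src="http://www1.example-shop.com/images/item-001.jpg" alt="Red canvas sneaker">
<img src="http://www2.example-shop.com/images/item-002.jpg" alt="Brown leather boot">
<img src="http://www3.example-shop.com/images/item-003.jpg" alt="Black running shoe">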
On the other hand: no, it shouldn't affect your SEO. I don't think Google even bothers to load images, actually.
We display 40 images in our results.
First question: is this page even a landing page? Is it targeted at a specific keyword? Internal search result pages are not automatically landing pages. If they are not landing pages, then do whatever you want with them (and make sure they do not get indexed by Google).
If they are landing pages (pages targeted at a specific keyword), the performance of the site is indeed important, for the conversion rate of these pages and indirectly (and to a smaller extent also directly) for Google. So some kind of lazy load logic for pages with a lot of images is a good idea.
I would go for:
Load the first two (product?) images in an SEO-optimized way (as normal HTML, with targeted alt text and a targeted filename). For the rest of the images, build a lazy load logic - but don't just set the src to blank; insert the whole img tag on load (or on scroll, or whatever) into your code, as sketched below.
Having a lot of broken img tags in the HTML for non-JavaScript users (i.e. Google, old mobile devices, text browsers) is not a good idea (you will not get a penalty as long as the lazy-loaded images are not misleading), but sloppy markup is never a good idea.
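A minimal sketch of that logic (the image paths, alt texts and container id are made up): the first couple of product images are normal HTML, and the remaining img tags are only created and appended after the page has loaded.

window.addEventListener("load", function () {
    var container = document.getElementById("results");
    var remaining = [
        { src: "/img/product-03.jpg", alt: "Blue suede shoe, side view" },
        { src: "/img/product-04.jpg", alt: "Blue suede shoe, top view" }
    ];
    for (var i = 0; i < remaining.length; i++) {
        var img = document.createElement("img");
        img.src = remaining[i].src; // the img tag only ever exists with a real src
        img.alt = remaining[i].alt; // keep the targeted alt text for image search
        container.appendChild(img);
    }
});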
For general SEO questions, please visit https://webmasters.stackexchange.com/ (Stack Overflow is more for programming-related questions).
I have to disagree with Alex. Google recently updated its algorithm to account for page load time. According to the official Google blog
...today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.
However, it is important to keep in mind that the most important aspect of SEO is original, quality content.
http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html
I added lazyload to my site (http://www.amphorashoes.ro) and I have better PageRank from Google (maybe because the content is loading faster) :)
First, don't use src=""; it may hurt your page. Use a small loading/placeholder image instead.
Second, I think it won't affect SEO. We always use alt="imgDesc.." to describe an image, and the spider may pick up that alt text even though it can't analyse what the image really is.
I found this tweet regarding Google's SEO
There are various ways to lazy-load images, it's certainly worth
thinking about how the markup could work for image search indexing
(some work fine, others don't). We're looking into making some clearer
recommendations too.
12:24 AM - 28 Feb 2018
John Mueller - Senior Webmaster Trends Analyst
From what I understand, it looks like it depends on how you implement your lazy loading. And Google is yet to recommend an approach that would be SEO friendly.
Theoretically, Google should be running the scripts on websites, so it should be OK to lazy load. However, I can't find a source (from Google) that confirms this.
So it looks like crawling lazy-loaded or deferred images may not be foolproof yet. Here's an article I wrote about lazy loading, image deferring and SEO that talks about it in detail.
Here's a working library that I authored which focuses on lazy loading or deferring images in an SEO-friendly way.
What it basically does is cancel the image loading when the DOM is ready and continue loading the images after the window load event.
...
<div>My last DOM element</div>
<script>
(function() {
  // remove all the sources! (stash each one in data-src so it can be restored later)
  var imgs = document.getElementsByTagName("img");
  for (var i = 0; i < imgs.length; i++) {
    imgs[i].setAttribute("data-src", imgs[i].getAttribute("src"));
    imgs[i].removeAttribute("src");
  }
})();
window.addEventListener("load", function() {
  // return all the sources! (put each original src back)
  var imgs = document.querySelectorAll("img[data-src]");
  for (var i = 0; i < imgs.length; i++) {
    imgs[i].setAttribute("src", imgs[i].getAttribute("data-src"));
  }
}, false);
</script>
</body>
You can cancel loading of an image by removing its src value or replacing it with a placeholder image. You can test this approach with Google Fetch.
You have to make sure that you have the correct src until the DOM is ready, so that Google Fetch will capture your imgs' original src.
I was making a website for a music band, and I was wondering the best way to play background music on the website without interrupting the flow of the music (even for a split second).
At the moment, I am considering using frames, but this is not supposed to be good practice. Please, someone tell me how I can do this. I would prefer to use HTML to code the website, as I have not yet mastered coding in Flash.
This might sound controversial, but here's an idea: Don't play music on your website. Seriously, don't. I think everyone knows how incredibly annoying that is, and asking a group of software developers to help you out with that is going to be like asking a group of sheep the best way to make a lambskin coat.
If you really have to do it, frames would be the simplest way, so I'd do that. But you're not going to do it anyway, right?
I can think of four ways:
Frames, as you said.
Make your entire website in Flash and have only one page. You need to know Flash to be able to do this, which could make this difficult.
Pop-out your music player. This is probably the easiest approach, but the downside is this could be annoying, and a lot of web browsers these days would block it.
Use AJAX and dynamically load all your site content within one page, like Gmail. Users will need to have newer browsers, and this will take quite a bit of coding on both the client and the server side.
The only way to prevent the music from stopping is to not let the page your music component is on reload. Currently the only way to do this is to use frames, unfortunately.
The only alternative is to develop the whole site in Flash or another technology that doesn't rely on changing pages as navigation.
It wouldn't be pretty, but you could do it using AJAX. Have the master page with the header/footer/navigation controls and a big empty content div, and instead of regular links you have calls to AJAX functions that return HTML to be injected into the content div.
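A bare-bones sketch of that idea (the URLs and element id are made up): the page holding the music player never reloads; clicking a nav link just fetches an HTML fragment and drops it into the content div.

// The music player markup lives outside #content and is never touched
function loadPage(url) {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById("content").innerHTML = xhr.responseText;
        }
    };
    xhr.open("GET", url, true);
    xhr.send();
}

// Nav links call loadPage instead of navigating:
// <a href="#" onclick="loadPage('/fragments/tour.html'); return false;">Tour</a>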
I tend to agree with the others who recommend frames. It may be considered "bad practice", but so is playing background music in the first place.
As was said, to do that you have to prevent your website from reloading.
An option to achieve this might be to use asynchronous requests to modify your website content without reloading the whole page, that's basically what Ajax is about.
That being said, I sort of agree with Alex here: don't play music.
This may be a topic for another post, but would you consider iframes to be good practice? You could put the content you want to change into an iframe and have the code running your music player outside it. When you load a page, it would load inside the iframe. Just a thought...
You would most likely need flash or a new window (pop-up) outside of the window.
Don't use frames. Ever.
EDIT: To all the people downmodding and commenting on this, not a single person has given a valid reason why you SHOULD use frames.
Just to clarify my position, please read ANY article on usability, the web, and frames.
For those still learning (and for those old people too dumb to update):
Frames break the unified model of the web.
Frames cause problems for search engine robots.
Frames make URLs stop working.
Frames break bookmarking.
Frames make printing more difficult.
Frames hurt accessibility.
Frames increase technical complexity.
and the #1 reason to not use frames......
USERS HATE THEM!
http://www.456bereastreet.com/archive/200411/who_framed_the_web_frames_and_usability/
Are there seriously this many people out there suggesting frames are a valid solution in 2009? How disappointing.