We have a website with potentially thousands of pages. We would like to leverage the power of static rendering. The CMS, which is hosted on a different server, will trigger a static re-render of a page via webhooks.
When a new page is created, the main nav may need to change. That means the entire site will need to be re-generated, and with so many pages that could take a very long time.
So what is the workaround for this? Can you statically render just the main nav and include it in all pages, to avoid re-rendering absolutely everything? ...so partial static rendering?
Depending on where you're hosting your code, you could use ISG: https://youtu.be/4vRn7yg85jw
There are several approaches to solving that yourself too, but they will require some work, of course.
The Nuxt team is currently working on solving this issue with something baked in: https://github.com/nuxt/framework/discussions/560
You could maybe also optimize some of those pages, or look to split them into different projects as described here: https://stackoverflow.com/a/69835750/8816585
Batching the regeneration could be an idea too, or even using the preview feature to avoid some useless builds: https://nuxtjs.org/docs/features/live-preview#preview-mode
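To make the batching idea concrete, here's a minimal sketch of a webhook receiver that collapses a burst of CMS notifications into a single regeneration. It assumes an Express server and a plain "nuxt generate" build command, neither of which is specified in the question, so treat the names as illustrative:

const express = require("express");
const { exec } = require("child_process");

const app = express();
app.use(express.json());

let rebuildTimer = null;

// CMS webhooks land here; many edits within the window cause only one rebuild.
app.post("/webhooks/cms", (req, res) => {
  if (!rebuildTimer) {
    rebuildTimer = setTimeout(() => {
      rebuildTimer = null;
      // Regenerate the whole site once the batch window closes (illustrative command).
      exec("npx nuxt generate", (err) => {
        if (err) console.error("Regeneration failed:", err);
      });
    }, 10 * 60 * 1000); // 10-minute batch window
  }
  res.status(202).send("Rebuild scheduled");
});

app.listen(3001);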
Overall, I'm not sure that there is a magic solution with a perfect balance between SSR and SSG as of today without a decent amount of work. Of course, if you're using Go + Vite or the like, you will get faster builds, but it's quite a broad/complex question overall.
I've got a weird issue. Here is my code for rendering the Vue pages. On my local machine, the rendering time for this page is about 50~80 ms, but if I access the page in parallel it can sometimes be around 120 ms (maybe 5 times out of 200 requests); most of the time it is still 50~80 ms.
However, when I deploy the code to our production Docker, these peaks get worse: sometimes they reach 1 second, and 500 ms happens a lot of the time, so the performance is bad. It makes no sense, since the request load is not heavy and we have load balancing too. For a similar page that we render with EJS, we don't see this kind of peak much. The backend logic and services used for EJS and Vue are all the same.
Client-side rendering is also the same; it shows a similar symptom.
Does anybody know what kind of reasons could lead to this issue?
First of all, do two things:
1. Do a quick test using Lighthouse if possible; it'll help pinpoint the problem.
2. Check the console for any errors AND warnings.
Without further information about your code I don't think it's possible to say what exactly is causing the problem.
However, after searching for some time I came across an article whose writer had the same performance problems.
This is the result of his Lighthouse check. As you can see, his website had shortcomings in indexing and First Contentful Paint; long story short, he had an infinite loop and v-for loops without keys.
Following are some tips on how you can better optimize your Vue app:
Apply the Vue Style Guide and ESLint
There’s a style guide in Vue docs: https://v2.vuejs.org/v2/style-guide/.
You can find there four Rule categories. We really care about three of them:
Essential rules preventing us from errors,
Recommended and strongly recommended rules for keeping best practices – to improve quality and readability of code.
You can use ESLint to take care of those rules for you. You just have to set everything properly in the ESLint configuration file.
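For example, a minimal .eslintrc.js sketch using eslint-plugin-vue (assuming Vue 2, to match the style guide linked above) could look like this; the plugin:vue/recommended preset already includes the essential and strongly recommended rule sets:

// .eslintrc.js - minimal sketch; adjust to your project setup.
module.exports = {
  root: true,
  extends: [
    "eslint:recommended",
    // Pulls in the essential and strongly-recommended Vue rules as well.
    "plugin:vue/recommended",
  ],
};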
Don’t use multiple v-if
Don’t write api call handlers in components
Simply do what is locally necessary in component logic. Every method that could be external (e.g. business logic) should be separated out and only called from components.
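As a rough sketch of that separation (the module, function, and library names here are illustrative; it assumes axios for the HTTP call), the API logic lives in its own module and the component only calls it:

// api/users.js - keep the HTTP details out of components.
import axios from "axios";

export function fetchUsers() {
  return axios.get("/api/users").then((res) => res.data);
}

// UserList.vue (script section) - the component only consumes the service.
import { fetchUsers } from "@/api/users";

export default {
  data: () => ({ users: [] }),
  async created() {
    this.users = await fetchUsers();
  },
};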
Use slots instead of large amounts of props
Lazy load routes
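For example, with vue-router a route component can be lazy loaded through a dynamic import, so its chunk is only downloaded when the route is first visited (the path and file name are illustrative):

const routes = [
  {
    path: "/profile",
    // Fetched in its own chunk the first time /profile is visited.
    component: () => import("./views/UserProfile.vue"),
  },
];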
Use a watcher with the immediate option instead of the created hook and a watcher together.
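A small sketch of that pattern (the prop and method names are illustrative): the handler runs once when the component is created thanks to immediate: true, so there is no need to duplicate the call in created():

export default {
  props: ["query"],
  watch: {
    query: {
      // Fires immediately with the initial value, and again on every change.
      immediate: true,
      handler(newQuery) {
        this.fetchResults(newQuery);
      },
    },
  },
};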
Here's another article on how to improve your Vue app.
Good Luck!
I'm interested in using FireBase as a data-store for the creation of largely traditional, occasionally updated websites and am concerned about the SEO implications of rendering content using client-side JavaScript.
I know Google has made headway into indexing some JavaScript content, but am wondering what my best course of action is. I know I have some options:
Render content using 100% client-side JS, and probably suffer some indexing trouble
Build static HTML files on the server side (using Node, most likely) and serve them instead
First, I'm not sure how bad the problem actually is doing everything client-side (am I solving something that needs solving?). And second, I just wonder if I'm missing some other obvious way to approach this.
Unfortunately, rendering data on the client-side generally makes it difficult to do SEO. Firebase is really intended for use with dynamic data, such as user account info, game data, etc, where SEO is not a goal.
That being said there are a few things you can do to optimize for SEO. First, you can render as much of your site as possible at compile time using a templating tool like mustache. This is what we did on the Firebase.com website (the entire site is static except for the tutorial and examples).
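As a minimal sketch of that idea (this is not the actual Firebase.com build script; the file paths and data are illustrative), a small Node script can render mustache templates into static HTML at build time:

// build.js - render a mustache template into a static HTML file.
const fs = require("fs");
const Mustache = require("mustache");

const template = fs.readFileSync("templates/page.mustache", "utf8");
const view = { title: "About us", body: "Content rendered at build time." };

fs.writeFileSync("public/about.html", Mustache.render(template, view));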
Second, if your app uses hash fragments in the URL for navigation (anything after the "#!"), you can provide a separate set of static or server-generated pages that correspond to your dynamic pages so that crawlers can read the data. Google has a spec for doing this, which you can see here:
https://developers.google.com/webmasters/ajax-crawling/docs/specification
I'm working on a business application built upon PHP and the Dojo toolkit. The interface is similar to what you see on the Dojo dijit theme tester.
Over the internet it takes a lot of time to load all those JS files one by one.
I want to know what technique the theme tester demo uses so that it loads much faster than the one we built.
I'm interested in knowing the best practices for optimizing its loading time.
You have rightly observed that the biggest cause of the runtime performance issue is the many, many roundtrips to the server to fetch the small JS files.
While the modularized design of Dojo is very beneficial at design time (widget extensions, namespacing, etc.), at runtime it is expected that you optimize the Dojo bits; the way to do that is a custom build.
Doing a custom build will give you a big performance boost: the hundreds of roundtrips will be reduced to one or two, and the size of the payload will also dramatically decrease. We have seen a 50x performance improvement with a custom build.
Custom build will create an optimized, minified JS file that will contain only the code you use in the app.
You can define multiple layers depending on how you want to segregate your application JS files (for example, one single compressed file versus multiple files included in different UIs)
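As a rough sketch (package and module names such as app/main and app/admin are illustrative), a Dojo 1.7-style build profile defining two layers might look like this:

// app.profile.js - illustrative build profile with two layers.
var profile = {
    basePath: "..",
    releaseDir: "release",
    packages: [
        { name: "dojo",  location: "dojo" },
        { name: "dijit", location: "dijit" },
        { name: "app",   location: "app" }
    ],
    layers: {
        // A bootstrap layer with dojo plus the modules every page needs.
        "dojo/dojo": {
            include: [ "dojo/dojo", "app/main" ],
            boot: true,
            customBase: true
        },
        // A separate layer for admin-only UI, loaded only on those pages.
        "app/admin": {
            include: [ "app/admin" ]
        }
    }
};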
Depending on the version of Dojo you are using, see:
http://dojotoolkit.org/reference-guide/1.7/build/index.html#build-index
http://dojotoolkit.org/reference-guide/1.7/build/pre17/build.html#build-pre17-build
While it looks daunting at first, stick with it and you will be able to create an optimized version and see the benefits :)
I'm working on a shopping site. We display 40 images in our results. We're looking to reduce the onload time of our page, and since images block the onload event, I'm considering lazy loading them by initially setting img.src="" and then setting it after onload. Note that this is not AJAX loading of HTML fragments: the image HTML along with the alt text is present; it's just that the image src is deferred.
Does anyone have any idea as to whether this may harm SEO or lead to a Google penalty box now that they are measuring site speed?
Images don't block anything, they are already lazy loaded. The onload event notifies you that all of the content has been downloaded, including images, but that is long after the document is ready.
It might hurt your rank because of the lost keywords and empty src attributes. You'll probably lose more than you gain - you're better off optimizing your page in other ways, including your images. Gzip + fewer requests + proper expires + a fast static server should go a long way. There is also a free CDN that might interest you.
I'm sure Google doesn't mean for the whole web to remove their images from source code to gain a few points. And keep in mind that they consider anything under 3s to be a good loading time, so there's plenty of room to wiggle before resorting to voodoo techniques.
From a pure SEO perspective, you shouldn't be indexing search result pages. You should index your home page and your product detail pages, and have a spiderable method of getting to those pages (category pages, sitemap.xml, etc.)
Here's what Matt Cutts has to say on the topic, in a post from 2007:
In general, we’ve seen that users usually don’t want to see search results (or copies of websites via proxies) in their search results. Proxied copies of websites and search results that don’t add much value already fall under our quality guidelines (e.g. “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.” and “Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches…”), so Google does take action to reduce the impact of those pages in our index.
http://www.mattcutts.com/blog/search-results-in-search-results/
This isn't to say that you're going to be penalised for indexing the search results, just that Google will place little value on them, so lazy-loading the images (or not) won't have much of an impact.
There are some different ways to approach this question.
Images don't block load. JavaScript does; stylesheets do to an extent (it's complicated); images do not. However, they will consume HTTP connections, of which the browser will only fire off 2 per domain at a time.
So, what you can do that should be worry-free and the "Right Thing" is to do a poor man's CDN and just drop them on www1, www2, www3, etc on your own site and servers. There are a number of ways to do that without much difficulty.
On the other hand: no, it shouldn't affect your SEO. I don't think Google even bothers to load images, actually.
We display 40 images in our results.
First question: is this page even a landing page? Is it targeted at a specific keyword? Internal search result pages are not automatically landing pages. If they are not landing pages, then do whatever you want with them (and make sure they do not get indexed by Google).
If they are landing pages (pages targeted at a specific keyword), the performance of the site is indeed important, for the conversion rate of these pages and indirectly (and to a smaller extent also directly) for Google. So a kind of lazy-load logic for pages with a lot of images is a good idea.
I would go for:
Load the first two (product?) images in an SEO-optimized way (as normal HTML, with targeted alt text and a targeted filename). For the rest of the images, make a lazy-load logic: don't just set the src= to blank, but insert the whole img tag onload (or onscroll, or whatever) into your code.
Having a lot of broken img tags in the HTML for non-JavaScript users (i.e. Google, old mobile devices, text viewers) is not a good idea (you will not get a penalty as long as the lazy-loaded images are not misleading), but shitty markup is never a good idea.
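A minimal sketch of that approach (the lazy-img class and data-src/data-alt attributes are made up for illustration, and it assumes a reasonably modern browser): keep plain placeholder elements in the markup for all but the first images, and only create the real img tags once the page has loaded:

window.addEventListener("load", function () {
  // Replace each placeholder element with a real <img> built from its data attributes.
  var placeholders = document.querySelectorAll(".lazy-img[data-src]");
  Array.prototype.forEach.call(placeholders, function (el) {
    var img = document.createElement("img");
    img.src = el.getAttribute("data-src");
    img.alt = el.getAttribute("data-alt") || "";
    el.parentNode.replaceChild(img, el);
  });
});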
For general SEO questions please visit https://webmasters.stackexchange.com/ (Stack Overflow is more for programming-related questions).
I have to disagree with Alex. Google recently updated its algorithm to account for page load time. According to the official Google blog:
...today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.
However, it is important to keep in mind that the most important aspect of SEO is original, quality content.
http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html
I have added lazyload to my site (http://www.amphorashoes.ro) and I have a better PageRank from Google (maybe because the content is loading faster) :)
First, don't use src=""; it may hurt your page. Use a small loading image instead.
Second, I think it won't affect SEO. We always use alt="imgDesc.." to describe the image, and the spider may catch this alt text rather than analysing what the image really is.
I found this tweet regarding Google's SEO:
"There are various ways to lazy-load images, it's certainly worth thinking about how the markup could work for image search indexing (some work fine, others don't). We're looking into making some clearer recommendations too."
— John Mueller, Senior Webmaster Trends Analyst, 12:24 AM - 28 Feb 2018
From what I understand, it looks like it depends on how you implement your lazy loading. And Google is yet to recommend an approach that would be SEO friendly.
Theoretically, Google should be running the scripts on websites, so it should be OK to lazy load. However, I can't find a source (from Google) that confirms this.
So it looks like crawling lazy-loaded or deferred images may not be foolproof yet. Here's an article I wrote about lazy loading, image deferring, and SEO that talks about it in detail.
Here's a working library that I authored which focuses on lazy loading or deferring images in an SEO-friendly way.
What it basically does is cancel the image loading when the DOM is ready and continue loading the images after the window load event.
...
<div>My last DOM element</div>
<script>
  (function() {
    // Stash each image's src in a data attribute and remove it, so the
    // images no longer delay the window load event.
    var imgs = document.getElementsByTagName("img"), i;
    for (i = 0; i < imgs.length; i++) {
      imgs[i].setAttribute("data-src", imgs[i].getAttribute("src"));
      imgs[i].removeAttribute("src");
    }
  })();
  window.addEventListener("load", function() {
    // After the load event has fired, put the original sources back.
    var imgs = document.getElementsByTagName("img"), i;
    for (i = 0; i < imgs.length; i++) {
      imgs[i].setAttribute("src", imgs[i].getAttribute("data-src"));
    }
  }, false);
</script>
</body>
You can cancel loading of an image by removing its src value or replacing it with a placeholder image. You can test this approach with Google Fetch.
You have to make sure that you have the correct src until the DOM is ready, so that Google Fetch will capture your images' original src.