Lighthouse page score - properly size images and preload - lazy-loading

"Properly size images" penalty:
I use WP Rocket with lazy load and Revolution Slider with lazy load. Why do I keep getting penalized by the Google Lighthouse audit "Properly size images"?
Preload penalty:
I have this in the header, as specified in some of the speed reports (sometimes it penalizes me, sometimes it doesn't). It gives an error that I must use the crossorigin attribute properly. What goes in the crossorigin attribute for the link below?
My Code:
<link rel="preload" href="https://apartmentskatytexas.com/wp-content/plugins/revslider/public/assets/fonts/revicons/revicons.woff?5510888" as="font" crossorigin="__________">
https://apartmentskatytexas.com

The performance score is not based on the checklist but on the speed of your site, so you are not being penalized. More in my article How to get a 100% Google Lighthouse score.
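As for the crossorigin value itself: browsers always request fonts in CORS mode, so a font preload normally carries crossorigin="anonymous" (or the bare crossorigin attribute, which means the same thing), even when the font is served from your own domain; otherwise the preloaded copy doesn't match and the font is fetched twice. A sketch using the URL from the question:
<!-- anonymous CORS request; the as and type attributes are also needed for font preloads -->
<link rel="preload" href="https://apartmentskatytexas.com/wp-content/plugins/revslider/public/assets/fonts/revicons/revicons.woff?5510888" as="font" type="font/woff" crossorigin="anonymous">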

Related

Why does the h2 get CLS > 100? The h2 is on a category page (PrestaShop)

(Mobile page) The CLS checker for Chrome and the metrics show the h2 on my category page as a bad element. Why? I know how CLS works, but in this case I have no idea why my CLS is so high. The bad result is also visible in GSC.
shop cavaricci
In a quick review, it looks like the fallback font is smaller than your loaded font "Poppins".
There are a few options here:
Use a default system font.
Change how the font gets loaded. This will require a little reading, and since it's PrestaShop it probably won't be easy. See this article on CSS-Tricks: https://css-tricks.com/the-best-font-loading-strategies-and-how-to-execute-them/
Set the font-display property in CSS (a sketch follows this list).
Add a preload link tag to the head of your page for each font file you will use; be careful not to load the same font twice in different formats (see https://3perf.com/blog/link-rels/ for more info), e.g. <link rel="preload" href="https://fonts.gstatic.com/s/poppins/v19/pxiEyp8kv8JHgFVrJJfecnFHGPc.woff2" as="font" type="font/woff2" crossorigin>
Set an explicit width and height on your title (not ideal if you are using a CMS).
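For the font-display option, a minimal sketch (the family name and URL follow the example above; the weight is an assumption):
<style>
  /* swap shows the fallback font immediately, then switches to Poppins once it loads;
     combined with the preload above this keeps the swap window short */
  @font-face {
    font-family: 'Poppins';
    src: url('https://fonts.gstatic.com/s/poppins/v19/pxiEyp8kv8JHgFVrJJfecnFHGPc.woff2') format('woff2');
    font-weight: 400;
    font-display: swap;
  }
</style>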

Do deferred images cause a drop in Google rankings?

I'm working on a large project with a lot of images. To increase its page speed, Chrome Lighthouse recommends that I defer the images. But my company gives priority to the ranking of the page, and I'm not sure how this affects Google's crawlers.
As you know, after deferring the images there is no real image URL in the src attribute. So how can Google understand and optimize my images? Can someone point me to a reliable resource to understand the problem?
In a deferred image tag the src attribute doesn't contain the actual image; the actual image URL sits in a data-src attribute and is loaded into the page by JavaScript.
I just want to know how this affects our SEO/page ranking.
I thought I had read somewhere that lazy loading is fine for SEO, but to be sure I did some googling and found the following. Spoiler alert: Googlebot renders the full page, so all images end up with populated src="" attributes.
https://yoast.com/ask-yoast-lazy-load/
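If you want the deferred markup itself to stay crawler-friendly, another option (a sketch; the file name and dimensions are placeholders) is the browser's native loading attribute, which keeps the real URL in src so no JavaScript swap is needed:
<!-- the real URL stays in src, so even crawlers that skip JavaScript see the image -->
<img src="real-image.jpg" loading="lazy" width="800" height="600" alt="product photo">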

Home page with or without slides

I am making a static website for a friend of mine, and page load time is crucial for the project since internet speed is rather slow in rural areas (near mountains and forests). My question is: does making the home page with carousel slides slow down page load time compared to a page without slides?
Yes, obviously, because you are adding more content to load. But it can still be fast if you use a CDN (content delivery network) such as Cloudinary, and if you build the slides from solid colors and CSS effects rather than heavy images, as many sites do.
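For example, a slide built from a CSS gradient instead of a photo costs almost nothing to transfer (the class name and colors are assumptions):
<style>
  /* the slide background is pure CSS, so no image request is made at all */
  .hero-slide {
    height: 60vh;
    display: flex;
    align-items: center;
    justify-content: center;
    color: #fff;
    background: linear-gradient(135deg, #2b5876, #4e4376);
  }
</style>
<div class="hero-slide">
  <h1>Welcome</h1>
</div>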

Improve loading speed without losing search ranking

I have a webpage with many areas whose visibility can be toggled by the user.
The default visibility state for those areas is hidden (CSS display: none).
I don't have control over what's going to be put inside, but it could be a lot of images.
I saw with Firefox's network monitor that all the images were loaded with the page. This is quite a waste of bandwidth, since the user might choose not to display every area.
I came up with a workaround: I put all that content inside a <script type="late-rendering"></script> and, to avoid any potential conflict (e.g. a closing </script> tag inside the content), I replace every "<" with "8691jQfdtxm" (a randomly picked string). Then, when the user wants to make an area visible, I fill the area with that content after replacing "8691jQfdtxm" back with "<".
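A sketch of that setup (the ids and the sample markup are made up; the marker string is the one above):
<!-- parked content: every "<" in the markup has been replaced with the marker string -->
<script type="late-rendering" id="area-1-src">
8691jQfdtxmimg src="photos/big-1.jpg" alt="">8691jQfdtxmimg src="photos/big-2.jpg" alt="">
</script>
<div id="area-1" style="display:none"></div>
<script>
  // when the user toggles the area: restore "<", inject the markup, then reveal the div
  function showArea(id) {
    var raw = document.getElementById(id + '-src').textContent;
    var area = document.getElementById(id);
    area.innerHTML = raw.split('8691jQfdtxm').join('<');
    area.style.display = 'block';
  }
</script>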
It works fine, but I think proceeding like this will make crawlers (e.g. Google) think my webpage is pure garbage. How can I avoid that?
Unless search engines were heavily relying on the alt text of your images, or on their file names, there is little risk you will lose search rankings. If your site loads more quickly as a result, it will provide a better user experience, which will probably be detected by Google and influence rankings positively.
Google executes a lot of JavaScript these days, and your trick of breaking the HTML with a random string seems hokey to me.
I would preload all the textual content (e.g. have it all there on first load, with the div hidden via display:none). This content will not count as much as visible content, but it does count.
Then I'd do delayed loading of the images. For example, make all your images something like:
<img src="blank.jpg" data-loadlater="realimage.jpg">
blank.jpg can be a tiny image. When the div opens, you can use JavaScript/jQuery to rewrite each src from data-loadlater.
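A sketch of that swap (the attribute and file names follow the example above; the container id is an assumption):
<div id="gallery" style="display:none">
  <img src="blank.jpg" data-loadlater="realimage.jpg" alt="">
</div>
<script>
  // run when the user opens the hidden div: replace each placeholder with the real image
  function openGallery() {
    var gallery = document.getElementById('gallery');
    gallery.querySelectorAll('img[data-loadlater]').forEach(function (img) {
      img.src = img.getAttribute('data-loadlater');
    });
    gallery.style.display = 'block';
  }
</script>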

Preventing images from being cached in the browser

I have a "Browse Pictures" feature with thumbnails that expand when a user clicks them.
Both versions of each image are stored in separate virtual directories at different sizes, the larger being 200*200 px.
Still, it only shows the smaller image when I click to enlarge it, instead of the 200*200 image.
You can add a random URL parameter to the image's src, so that the rendered HTML looks like
<img src="http://static.example.com/some/large/image.jpg?234234652346"/>
instead of
<img src="http://static.example.com/some/large/image.jpg"/>
It sounds like you don't want to prevent them from being cached as such, but you want to give them different URLs.
If they do have different URLs, then this is not a caching problem.
To prevent caching, send a Cache-Control: no-cache HTTP response header when serving the images (are you using Apache?).
But if you really do prevent caching, your data transfer will be higher than it needs to be: every time users visit your gallery, they will fetch your images again.