How to configure ImageResizer (imageresizing.net) to limit the number of scaled images

In the common responsive web page scenario, the browser requests images at a size determined by the current browser window size, so the requests for a given image look like:
image740?height=731
image740?height=911
image740?width=402
image740?width=403
image740?width=2203
To avoid caching all of those highly specific image sizes and to improve cache utilization, I would like to set some predefined sizes that are created on the server side. So for instance, all image requests with a height between 600 and 1200 would deliver an image with height 1200.
Q: Is it possible to configure ImageResizer to do this?
Q: Is enhancing the SizeLimiting plugin a good place to implement this?

The Presets plugin lets you define, well, presets, and use those exclusively.
The better solution, however, is to fix your client-side JavaScript to request size intervals instead of the exact browser size. Slimmage.js does this by dividing the pixel count by a factor, rounding up, then multiplying by the same factor, as sketched below. 160 is a good factor that generates ~13 sizes under 2048.
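For illustration, here is a minimal TypeScript sketch of that quantization step; the function name is hypothetical, and the factor of 160 is taken from the paragraph above.

```typescript
// Slimmage-style size quantization: round requested pixel sizes up to the
// nearest multiple of FACTOR so nearby viewports share one cached rendition.
const FACTOR = 160; // value suggested above; tune to your cache budget

function quantizeSize(requestedPx: number): number {
  return Math.ceil(requestedPx / FACTOR) * FACTOR;
}

// Example: three nearby viewport sizes collapse onto two cached sizes.
console.log(quantizeSize(402)); // 480
console.log(quantizeSize(403)); // 480
console.log(quantizeSize(731)); // 800
```

Requesting quantized sizes means nearby viewport widths hit the same cached rendition, which is exactly the cache-utilization win the question is after.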

Related

Slow loading of images from Amazon S3 and CloudFront

I am using Amazon S3 to host images. I have a lot of images on my website.
I am also using CloudFront distributions as a CDN.
The image URLs are fine.
But my images still load slowly compared to some top competitors' websites.
Is there any way to load images faster?
Thanks
There could be numerous other problems with images:
Loading too many images on the page. Make sure that you lazy-load images that are not visible on the initial render (as sketched below).
Using the wrong image sizes. This can be fixed by resizing images to the correct size. Also, do not forget about responsive images.
Not using next-generation formats. For instance, look at using WebP for Chrome and JPEG 2000 for Safari.
You can use the Lighthouse tool to test your website for all the problems listed above.
Also, it might be worth considering a specialized CDN for images like pixboost.com.
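As a concrete example of the lazy-loading point above, here is a minimal TypeScript sketch using the standard IntersectionObserver API; the data-src attribute convention is an assumption for illustration, not something from the original answer.

```typescript
// Minimal lazy loading: images start as <img data-src="..."> with no src,
// and only get their real URL once they scroll into view.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src ?? ""; // swap in the real URL on first view
    obs.unobserve(img);              // each image only needs to load once
  }
});

document.querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => observer.observe(img));
```

Modern browsers also support the native loading="lazy" attribute on img tags, which achieves much the same effect with no script at all.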
Using a CDN like CloudFront is the first step towards accelerating images. It addresses the challenge of global distribution: if your website is hosted in Europe but you have visitors from Australia, images will load from CloudFront's node in Australia and be obviously faster than traveling from Europe. It also helps absorb traffic peaks, for example during sales, Christmas, and so on.
To go further with image acceleration, you need to work on the images themselves and focus on two things:
resize the images to the target size (thumbnail, preview, full size, ...) and have different sizes for different screen sizes.
use image compression algorithms to "shrink" your images. You can use JPEG compression or alternative image formats like WebP, JPEG 2000, or JPEG XR. These formats usually compress better than JPEG; however, they come with a big limitation: each is only supported by specific browsers. Check caniuse.com for browser support information: https://caniuse.com/#feat=webp
Bottom line, you will end up needing 15-20 versions of the same image to get the maximum optimisation across all browsers, device screen sizes, and use cases; the sketch below shows how quickly the variants multiply.
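A quick back-of-the-envelope in TypeScript, with hypothetical widths and formats, shows how the variant count reaches that range:

```typescript
// Hypothetical resize widths (per screen size) and encodings (per browser
// family); both lists are illustrative assumptions, not a recommendation.
const widths = [320, 640, 960, 1280, 1920];
const formats = ["webp", "jp2", "jxr", "jpg"];

// One source image fans out into widths x formats renditions.
const variants = widths.flatMap((w) => formats.map((f) => `image-${w}w.${f}`));

console.log(variants.length);      // 20
console.log(variants.slice(0, 2)); // ["image-320w.webp", "image-320w.jp2"]
```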
There are multiple ways of automating this, for example by using ImageMagick. It's a great library, but it requires coding and maintenance, as it evolves quite dynamically.
Another option is to use a cloud-based image acceleration and delivery service. These services usually bundle image resizing and CDN delivery together, and can probably get you better CDN pricing, as they negotiate big contracts with multiple CDN vendors.
We use https://cloudimage.io, but there are other great tools out there. Google is your best friend :).
Good luck with accelerating your page, faster images will definitely have a great impact.

Watermarking Plugin Performance - Is FastScaling an Option?

I want to use ImageResizer to serve thumbnails that are scaled and watermarked on the fly on a high-traffic website.
My testing has shown that the Watermarking plugin results in a significant decrease in throughput compared to just scaling them with FastScaling.
Scaled: 150+ images per second
Scaled & Watermarked: 35 images per second
I dug through the Watermark Plugin code and saw that it's using GDI+ for its image manipulations. Is it possible to make it use the more performant FastScaling plugin instead?
This is something we would like to improve. Currently, if either watermarking or the DRM red dot is in use, performance reverts to GDI+ levels.
I would be happy to assist on a pull request for this, or discuss other options.

How to determine maximum scale for cytoscape.js png output?

I want users to be able to generate high quality images for publication from their cytoscape.js graphs (in fact, this is the major use case of my application). Since there's no export from cytoscape.js to a vector format, my documentation tells users to just zoom to a high level of resolution before exporting to png, and then my code passes in the current zoom level as the scale parameter to cy.png(), along with full:true. However, at sufficiently high values for scale, no image gets generated. This maximum scale value seems to be different for different graphs, so I assume it's based on some maximum physical dimensions. How can I determine what the maximum scale value should be for any given graph, so my code won't exceed it?
Also, if I have deleted or moved elements such that the graph takes up less physical space than it used to, cy.png() seems to include that blank area in the resulting image even though it no longer contains any elements -- is there any way to avoid that?
By default, cy.png() and cy.jpg() will export the current viewport to an image. To export the fitted graph, use cy.png({ full: true }) or cy.jpg({ full: true }).
The browser enforces limits on the overall size of exported images. These limits depend on the OS and the browser vendor; Cytoscape.js can neither influence nor calculate them. At large resolutions/scales, the quality difference between PNG and JPG is negligible, so JPG tends to be a better option for large exports.
Aside: while bitmap formats scale with resolution, vector formats (like SVG) scale with the number of graph elements. For large graphs, it may be impossible to export SVG, even if such a feature were implemented.
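Since the limit cannot be queried, one pragmatic workaround is to probe for it. The sketch below (TypeScript, assuming an initialized cytoscape.js instance) halves the scale until the export succeeds; the empty-data-URI check is a heuristic assumption about how an oversized canvas fails, not documented cytoscape.js behavior.

```typescript
import cytoscape from "cytoscape";

// Probe downwards from the desired scale until cy.png() yields real data.
function exportPng(cy: cytoscape.Core, desiredScale: number): string {
  let scale = desiredScale;
  while (scale >= 1) {
    try {
      const png = cy.png({ full: true, scale });
      // An oversized canvas may return an empty data URI instead of throwing.
      if (png.length > "data:,".length) return png;
    } catch {
      // Export failed outright; fall through and retry at a smaller scale.
    }
    scale /= 2;
  }
  return cy.png({ full: true }); // last resort: default-scale full export
}
```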

Optimizing image sprite size vs http requests

I am creating a website whose design has lots of background images, so I used image sprites to optimize page loading time.
The thing is, I have 3 sprite images; 2 of them are 200 KB each, and their dimensions are no more than 900x700 px.
My question is about optimizing page speed. What will produce the best result: keeping the two 200 KB sprites, or breaking them into 4 sprite images of 100 KB each?
From the many articles I have read, most developers and experts say to minimize HTTP requests, because these days the internet is fast enough that downloading doesn't take much time. I agree to some extent, but I can't make up my mind between 2 more HTTP requests with smaller images or fewer HTTP requests with larger images.
This seems like micro-optimization to me, but the most performant way is to embed the images in the HTML page itself, so they ride along in the page's single HTTP request.
You can embed an image using the data: URI scheme, including the image as a base64-encoded string.
Also check the browser support for this feature.
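As a minimal sketch (Node/TypeScript, with a placeholder file name), this is how an image file turns into an embeddable data: URI:

```typescript
import { readFileSync } from "node:fs";

// Inline an image file as a base64 data: URI, usable in an <img src> or a
// CSS background-image url(...) value. "sprite.png" is a placeholder path.
function toDataUri(path: string, mimeType: string): string {
  const base64 = readFileSync(path).toString("base64");
  return `data:${mimeType};base64,${base64}`;
}

console.log(toDataUri("sprite.png", "image/png").slice(0, 30));
// => "data:image/png;base64,iVBORw0K"
```

Note the trade-off: base64 encoding inflates the payload by roughly a third, and inlined images cannot be cached separately from the page.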

Website optimization

How can I speed up the loading of images? Especially when I open the website for the first time, it takes some time for the images to load...
Is there anything I can do to improve this (HTML, CSS)?
link
Thanks to all for your answers.
Crop http://www.ursic-ei.si/datoteke/d4.jpg down to size! It's 900 pixels wide, and most of that (half?) is empty and white. Make the image smaller and then use background-position and background-color to compensate for anything you trimmed off the edges.
You have a lot of extra newlines in your HTML source. Not hugely significant, but theoretically - since in HTML there's no practical difference between one new line and two - you might want to remove some.
For images, you should consider a content delivery network (CDN), which will cache your images and other files and serve them faster than your web server.
This is a must for any high-traffic website.
On the client, you can download in parallel; e.g., in Firefox there is a set of preferences under network.http.pipelining that can help speed up downloads.
On the server, there's little you can do (though you can gzip text-based files). The client must just know how to cache.
Since your question only asks about the images, I assume you already know that the cost of PHP processing and/or JavaScript is minor. If you want to speed up the images, you can reduce their size or increase the compression rate; also try different formats, as JPEG is not always the best one.
Try GIF and/or PNG; with these you can also reduce the number of colors. These formats are usually much better than JPEG when you have simple pictures with few colors.
Also consider whether some of your images are a simple pattern that can be repeated. For example, if you have a background image with a side banner, you may just need one line of it repeated several times.