Slow loading of images from Amazon S3 and CloudFront

I am using Amazon S3 for hosting images, and I have a lot of images on my website.
I am also using a CloudFront distribution as a CDN.
The image URLs are fine.
But my images still load slowly compared to some top competitors' websites.
Is there any way to load the images faster?
Thanks

There could be numerous other problems with the images:
Loading too many images on the page. Make sure you lazy-load the images that are not visible on the initial render (see the sketch after this list).
Using the wrong image sizes. This can be fixed by resizing images to the size at which they are actually displayed. Also, do not forget about responsive images (srcset/sizes).
Not using next-generation formats. For instance, look at serving WebP to Chrome and JPEG 2000 to Safari.
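For the lazy-loading point, here is a minimal browser-side sketch (the `data-src` attribute and the CloudFront hostname are illustrative assumptions; modern browsers also offer a native `loading="lazy"` attribute on `<img>`):

```javascript
// Images below the fold are written as
//   <img data-src="https://dxxxxxxxx.cloudfront.net/photo.jpg" alt="...">
// and only receive a real src once they scroll near the viewport.
document.addEventListener("DOMContentLoaded", () => {
  const lazyImages = document.querySelectorAll("img[data-src]");

  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target;
      img.src = img.dataset.src;      // start the real download
      img.removeAttribute("data-src");
      obs.unobserve(img);             // each image is handled once
    }
  }, { rootMargin: "200px" });        // start fetching slightly before visible

  lazyImages.forEach((img) => observer.observe(img));
});
```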
You can use the Lighthouse tool to test your website for all of the problems listed above.
Also, it might be worth considering a specialized image CDN like pixboost.com.

Using a CDN like CloudFront is the first step towards accelerating images. It addresses the challenge of global distribution: if your website is hosted in Europe but you have visitors from Australia, images will load from CloudFront's edge node in Australia, which is obviously faster than travelling from Europe. It also helps absorb traffic peaks, for example during sales or Christmas.
To go further with image acceleration, you need to work on the images themselves and focus on two things:
resize the images to the target size (thumbnail, preview, full size, ...) and have different sizes for different screen sizes.
use image compression algorithms to "shrink" your images. You can use JPEG compression or alternative image formats like WebP, JPEG 2000, JPEG XR, ... These formats usually shrink images better than JPEG; however, they come with a big limitation: each is only supported by specific browsers. Check caniuse.com for browser support information: https://caniuse.com/#feat=webp
Bottom line, you will end up needing 15-20 versions of the same image to get the maximum optimisation across all browsers, device screen sizes, use cases, ...
There are multiple ways to automate this, for example by using ImageMagick. It's a great library, but it requires coding and maintenance, as it evolves quite quickly.
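As an illustration of that kind of automation, here is a small Node.js sketch using the `sharp` library (chosen here as one convenient alternative to scripting ImageMagick; the file names, widths and quality settings are assumptions):

```javascript
// Writes resized JPEG and WebP variants of a single source image.
// Requires: npm install sharp  (and an existing "out" directory)
const sharp = require("sharp");

const widths = [320, 768, 1280];   // e.g. thumbnail, tablet, desktop

async function buildVariants(input) {
  for (const width of widths) {
    await sharp(input).resize({ width }).jpeg({ quality: 80 })
      .toFile(`out/photo-${width}.jpg`);
    await sharp(input).resize({ width }).webp({ quality: 75 })
      .toFile(`out/photo-${width}.webp`);
  }
}

buildVariants("photo.jpg").catch(console.error);
```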
Another option is to use a cloud-based image acceleration and delivery service. These services usually bundle image resizing and CDN delivery together, and can probably get you better CDN pricing since they negotiate big contracts with multiple CDN vendors.
We use https://cloudimage.io, but there are other great tools out there. Google is your best friend :).
Good luck with accelerating your page; faster images will definitely have a great impact.

Related

Watermarking Plugin Performance - Is FastScaling an Option?

I want to use ImageResizer to serve thumbnails that are scaled and watermarked on the fly on a high-traffic website.
My testing has shown that the Watermarking plugin results in a significant decrease in throughput compared to just scaling them with FastScaling.
Scaled: 150+ images per second
Scaled & Watermarked: 35 images per second
I dug through the Watermark Plugin code and saw that it's using GDI+ for its image manipulations. Is it possible to make it use the more performant FastScaling plugin instead?
This is something we would like to improve. Currently, if either the Watermarking plugin or the DRM red dot is in use, performance reverts to GDI+ levels.
I would be happy to assist on a pull request for this, or discuss other options.

Image Resolution - Important for SEO on Google?

I typically set my image resolution for the web at 72 pixels/inch. Does Google look at image resolution as a ranking signal? Should I save my images at higher resolutions, or does it not matter? Higher-resolution images mean larger file sizes in kilobytes, but I was not sure what is acceptable.
Additional links I found helpful:
http://youtu.be/h2Zaj0CAUoU (the most helpful)
http://youtu.be/Sj5Ny21q3oY
http://youtu.be/3NbuDpB_BTc
Different search engines have different criteria/algorithms for image search, but as far as Google image search is concerned, I haven't seen any evidence that resolution matters for a general image search; rather, markup techniques (proper naming, the right alt attribute) play a vital role.
But the resolution/size of your image will matter when the user filters image results by size (Large, Medium, Icon, etc.).
There is a lot of info about optimizing images for Google search; just use Google itself to find a more detailed answer. I found the links below pretty helpful. They are not specific to Google, but they can give you a fair idea of the whole process; you might want to have a look at these:
Top 7 image optimization tips for SEO and site speed
How To Optimize Images For Search Engines…
72 dpi is standard web resolution. As far as affecting the SEO of any site these images are on, image size affects how fast a page will load, and that DOES impact your SEO.

Bigger cookie-like files for local data storage (browser "caching" of complex structures)

I am developing a browser-based game, and I have a big map there. The terrain of the map is static, so I have some thousands of tiles that will not change (whether they represent a forest, a desert, whatever); only the players on top of them can change.
Hence, I want to store the whole map on the player's computer. I am working with Ruby on Rails, and that map information is passed from the server to the JavaScript that runs in the user's browser in order to render a pretty map. But it makes me pretty sad to have a 200 kB .html file containing all that map-related information.
What would be the simplest way to solve this issue? Cookies! Well, that's what I thought. A complete map's information can reach almost 200 kB (they are pretty big), while a cookie can hold at most 4 kB. I don't feel that creating tons of cookies, one for each row of the map for instance, is the right way to achieve my objective. Is there a more elegant way to keep this static information in the player's browser without creating lots of cookies? A way to cache it in the browser? I mean, I can cache a 400 kB image, so why can't I cache a 200 kB map structure?
Thanks in advance!
Fernando.
Well, HTML Local Storage gives you 5 MB (though data is stored as strings, so the actual amount of data you can fit in the container is likely a lot less than 5 MB).
This limit is oddly fluid. For one thing, it's just a recommended limit; for another, some browsers (e.g., WebKit-based ones) store strings as UTF-16, which immediately cuts that in half (2.5 MB).
Browser support for Local Storage is good: IE, Firefox, Safari 4.0+, Chrome 4.0+, and Opera 10.5+. Both iPhone and Android are supported from versions 3.0 and 2.0, respectively.
Using Local Storage to preserve game state appears to be a prototypical use case.
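For example, a minimal sketch of caching the map in Local Storage (the `/map.json` endpoint and the version field are assumptions for this illustration):

```javascript
// Fetch the static terrain once, then reuse the locally stored copy.
const MAP_KEY = "gameMap";
const MAP_VERSION = "1";            // bump this whenever the terrain changes

async function loadMap() {
  const cached = localStorage.getItem(MAP_KEY);
  if (cached) {
    const parsed = JSON.parse(cached);
    if (parsed.version === MAP_VERSION) return parsed.tiles;
  }
  // Not cached (or stale): fetch once from the server and store it as a string.
  const tiles = await fetch("/map.json").then((r) => r.json());
  try {
    localStorage.setItem(MAP_KEY, JSON.stringify({ version: MAP_VERSION, tiles }));
  } catch (e) {
    // Quota exceeded: just use the freshly fetched data for this session.
  }
  return tiles;
}
```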
Finally, Paul Kinlan published an excellent step-by-step tutorial on HTML5Rocks, which I highly recommend (though it's a little more than a year old).
Have you considered storing it in a JS file? Most browsers will cache linked JS files, so you would only have to serve it once in a while. It would be very simple to deploy.
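A sketch of that approach (the file name and tile shape are made up): put the terrain in a static script that the browser can cache with a far-future expiry, and reference it from the page with a plain script tag.

```javascript
// public/map-data.js -- served as a static asset with long cache headers,
// so returning players never re-download the ~200 kB of terrain.
window.MAP_TILES = [
  { x: 0, y: 0, terrain: "forest" },
  { x: 1, y: 0, terrain: "desert" },
  // ... the remaining static tiles
];
```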

Website optimization

How can I speed up the loading of images? Especially when I open the website for the first time, it takes some time for the images to load...
Is there anything I can do to improve this (HTML, CSS)?
link
Thank to all for your answers.
Crop http://www.ursic-ei.si/datoteke/d4.jpg! It's 900 pixels wide, and most of that (half?) is empty and white. Make the image smaller and then use background-position and background-color to compensate for anything you trimmed off the edges.
You have a lot of extra newlines in your HTML source. Not hugely significant, but theoretically - since in HTML there's no practical difference between one new line and two - you might want to remove some.
For images, you should consider a content delivery network (CDN), which will cache your images and other files and serve them faster than your web server.
This is a must for any high-traffic website.
On the client, you can enable parallel (pipelined) downloads; e.g., in Firefox there is a set of settings under network.http.pipelining that helps speed up downloads.
On the server, there's little you can do (though you can gzip text-based files). The client must just know how to cache.
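For the gzip point, a minimal sketch assuming a Node/Express server purely for illustration (Apache and nginx have equivalent mod_deflate / gzip directives):

```javascript
// Gzip text-based responses and let the browser cache static assets.
// Requires: npm install express compression
const express = require("express");
const compression = require("compression");

const app = express();
app.use(compression());                                  // gzip HTML/CSS/JS responses
app.use(express.static("public", { maxAge: "30d" }));    // cache headers for images etc.

app.listen(3000);
```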
Since in your question you only ask about the images, I guess you already know that the cost of php processing and/or javascript is minor. If you want to speed up the images you can reduce their size, increase the compression rate... also try different formats. JPG is not always the best one.
Try GIF and/or PNG; with these you can also reduce the number of colors. Usually these formats are way better than JPEG when you have simple pictures with few colors.
Also consider whether some of your images are a simple pattern that can be repeated several times. For example, if you have a background image with a side banner, you only need one strip of it and can repeat it several times.

Looking for a lossless compression api similar to smushit

Does anyone know of a lossless image compression API/service similar to Smush.it from Yahoo?
From their own FAQ:
WHAT TOOLS DOES SMUSH.IT USE TO SMUSH IMAGES?
We have found many good tools for reducing image size. Often times these tools are specific to particular image formats and work much better in certain circumstances than others. To "smush" really means to try many different image reduction algorithms and figure out which one gives the best result.
These are the algorithms currently in use:
ImageMagick: to identify the image type and to convert GIF files to PNG files.
pngcrush: to strip unneeded chunks from PNGs. We are also experimenting with other PNG reduction tools such as pngout, optipng, pngrewrite. Hopefully these tools will provide improved optimization of PNG files.
jpegtran: to strip all metadata from JPEGs (currently disabled) and try progressive JPEGs.
gifsicle: to optimize GIF animations by stripping repeating pixels in different frames.
More information about the smushing process is available at the Optimize Images section of Best Practices for High Performance Web pages.
It mentions several good tools. By the way, the very same FAQ mentions that Yahoo will make Smush.it a public API sooner or later so that you can run it on your own. Until then, you can just upload images to Smush.it separately.
Try Kraken Image Optimizer: https://kraken.io/signup
The developer's plan is free - but only returns dummy results. You must subscribe to one of the paid plans to use the API, however, the Web Interface is free and unlimited for images of up to 1MB.
Find out more in the Kraken documentation.
See this:
http://github.com/thebeansgroup/smush.py
It's a Python implementation of smushit that can be run off-line to optimise your images without uploading them to Yahoo's service.
As far as I know, the best image compression for me is TinyPNG.
They also have an API: https://tinypng.com/developers
Once you retrieve your key, you can immediately start shrinking images. Official client libraries are available for Ruby, PHP, Node.js, Python and Java. You can also use the WordPress plugin, the Magento 1 extension or improved Magento 2 extension to compress your JPEG and PNG images.
And the first 500 images per month are free.
Tip: when using their API, there is no file-size limit (unlike their online tool, which has a maximum of 5 MB per image).
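For completeness, a minimal Node.js sketch using TinyPNG's official `tinify` client (the API key and file names are placeholders):

```javascript
// Requires: npm install tinify
const tinify = require("tinify");
tinify.key = "YOUR_API_KEY";        // from https://tinypng.com/developers

tinify.fromFile("unoptimized.jpg")
  .toFile("optimized.jpg")
  .then(() => console.log("compressed image written"))
  .catch((err) => console.error(err.message));
```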