Image Resolution - Important for SEO on Google?

I typically set my image resolution for the web at 72 pixels/inch. Does Google look at image resolution as a ranking signal? Should I save my images at higher resolutions, or does it not matter? Higher-resolution images mean larger file sizes, and I'm not sure what is acceptable.
Additional links I found helpful:
http://youtu.be/h2Zaj0CAUoU (the most helpful)
http://youtu.be/Sj5Ny21q3oY
http://youtu.be/3NbuDpB_BTc

Different search engines have different criteria/algorithms for image search, but as far as Google image search is concerned,
I haven't seen any evidence that resolution matters for a general image search; rather, markup techniques (proper naming, the right alt attribute) play a vital role.
But the resolution/size of your image will matter when the user filters the image results by size: Large, Medium, Icon, etc.
There is a lot of info about optimizing images for Google search, so just use Google itself to find a more detailed answer. I found the links below pretty helpful; they are not specific to Google, but they can give you a fair idea of the whole process:
Top 7 image optimization tips for SEO and site speed
How To Optimize Images For Search Engines…

72 dpi is the standard web resolution, though browsers only care about pixel dimensions, not the dpi value stored in the file. As far as affecting the SEO of any site these images are on: image file size affects how fast a page loads, and that DOES impact your SEO.

Related

Slow loading of images from amazon s3 and cloudFront

I am using Amazon S3 for hosting images. I have a lot of images on my website.
I am also using CloudFront distributions as a CDN.
The image URLs are fine.
But my images still load slowly compared to some other top/competitor websites.
Is there any way to load the images faster?
Thanks
There could be numerous other problems with the images:
Loading too many images on the page. Make sure you lazy-load images that are not visible on the initial render (see the markup sketch after this list).
Using the wrong image sizes. This can be fixed by resizing images to the correct dimensions. Also, do not forget about responsive images (srcset lets the browser pick an appropriately sized file).
Not using next-generation formats. For instance, look at serving WebP to Chrome and JPEG 2000 to Safari.
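For illustration, here is a minimal sketch covering all three points in one PHP template. The photo-* file names are placeholders you would generate yourself; the browser lazy-loads the image, picks a width from srcset, and uses the WebP source only if it supports that format, falling back to JPEG otherwise:

    <?php $alt = 'Product photo'; // plain PHP template: the markup below is ordinary HTML ?>
    <picture>
      <source type="image/webp"
              srcset="photo-480.webp 480w, photo-960.webp 960w">
      <img src="photo-960.jpg"
           srcset="photo-480.jpg 480w, photo-960.jpg 960w"
           sizes="(max-width: 600px) 480px, 960px"
           loading="lazy" alt="<?= htmlspecialchars($alt) ?>">
    </picture>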
You can use the Lighthouse tool to test your website for all of the problems listed above.
Also, it might be worth considering a specialized image CDN like pixboost.com.
Using a CDN like CloudFront is the first step towards accelerating images. It addresses the challenge of global distribution (if your website is hosted in Europe but you have visitors from Australia, images will load from CloudFront's node in Australia and be obviously faster than travelling from Europe). It also helps absorb traffic peaks, for example during sales, Christmas, ...
To go further with image acceleration, you need to work on the images themselves and focus on 2 things:
resize the images to the target size (thumbnail, preview, full size, ...) and have different sizes for different screen sizes.
use image compression algorithms to "shrink" your images. You can use JPEG compression or alternative image formats like WebP, JPEG 2000, JPEG XR, ... These formats usually compress better than JPEG; however, they come with a big limitation: each is only supported by specific browsers. Check caniuse.com for browser support information: https://caniuse.com/#feat=webp (a server-side sketch for picking the format per browser follows this list).
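As an example of the browser-support point, here is a rough PHP sketch: browsers that understand WebP advertise it in the Accept request header, so you can choose the format per request. The file names are placeholders, and a production setup would usually do this at the CDN or web-server level instead:

    <?php
    // Naive content negotiation: serve WebP only to browsers that accept it.
    $accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';
    if (strpos($accept, 'image/webp') !== false) {
        header('Content-Type: image/webp');
        $file = 'photo.webp';   // smaller payload for supporting browsers
    } else {
        header('Content-Type: image/jpeg');
        $file = 'photo.jpg';    // universal fallback
    }
    header('Vary: Accept');     // caches must key responses on the Accept header
    readfile($file);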
Bottom line, you will end up needing 15-20 versions of the same image to get the maximum optimisation across all browsers, device screen sizes, use cases, ...
There are multiple ways to automate this, for example with ImageMagick. It's a great library, but it requires coding and maintenance, as it evolves quite quickly.
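As a rough sketch of what that automation can look like, here is a small script using Imagick, PHP's binding to ImageMagick. The file names and quality setting are arbitrary, and WebP output assumes your ImageMagick build includes the WebP delegate:

    <?php
    // Generate several sizes plus a WebP variant of one source image.
    $widths = [160, 480, 960];              // thumbnail, preview, full size
    foreach ($widths as $w) {
        $img = new Imagick('source.jpg');
        $img->thumbnailImage($w, 0);        // height 0 preserves the aspect ratio
        $img->setImageCompressionQuality(82);
        $img->writeImage("source-{$w}.jpg");
        $img->setImageFormat('webp');       // same frame, next-generation format
        $img->writeImage("source-{$w}.webp");
        $img->destroy();
    }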
Another option is to use a cloud-based image acceleration and delivery service. These services usually bundle image resizing and CDN delivery together, and can probably get you better CDN pricing since they negotiate big contracts with multiple CDN vendors.
We use https://cloudimage.io, but there are other great tools out there. Google is your best friend :).
Good luck with accelerating your page, faster images will definitely have a great impact.

Watermarking Plugin Performance - Is FastScaling an Option?

I want to use ImageResizer to serve thumbnails that are scaled and watermarked on the fly on a high-traffic website.
My testing has shown that the Watermarking plugin results in a significant decrease in throughput compared to just scaling them with FastScaling.
Scaled: 150+ images per second
Scaled & Watermarked: 35 images per second
I dug through the Watermark plugin code and saw that it uses GDI+ for its image manipulation. Is it possible to make it use the more performant FastScaling plugin instead?
This is something we would like to improve. Currently, if either watermarking or the DRM red dot is in use, performance reverts to GDI+ levels.
I would be happy to assist on a pull request for this, or discuss other options.

Get Highest Res Favicon

I'm making a website that needs to dynamically obtain the favicon of sites upon request. I've found a few APIs that can accomplish this fairly well, and so far I'm liking http://www.fvicon.com/.
The final image for my website will be 64x64px, and some websites such as Google and Wordpress have nice images of this size that are easily retrieved via this API. Of course, most websites only have a 16x16 favicon, and scaling that up to 64x64 causes very bad quality loss.
Examples:
(high res) http://a.fvicon.com/wordpress.com?format=png&width=64&height=64
(low res) http://a.fvicon.com/yahoo.com?format=png&width=64&height=64
Keeping this in mind, I'm planning on somehow determining whether a high-res image is available and, if so, the website will use it. If not, I want to use a pre-made 64x64 icon with the smaller icon layered over it. What I'm having trouble with is determining whether a high-res favicon is available or not.
Also, I'm curious whether there's a better approach to this situation. I'd rather not use smaller images (64x64 works out really well for this project). The lowest res I'm willing to drop to is 48x48, but even then there will be a significant quality loss when scaling up 16x16 favicons.
Any ideas? If you need any more information I will gladly provide it.
Thank you!
Favicons can be declared at different resolutions in the page markup, which may help you find out how to get them. You would have to search the website's code for /<link.*rel="icon".*>/ (I don't know exactly how to do this; a PHP sketch follows) and also check the root directory of the web server for favicon.ico. (I hope the regex is correct.)
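If a bare regex feels too fragile, here is a small PHP sketch of the same idea: download the page, parse it, and collect the <link> tags whose rel attribute contains "icon" (example.com is a placeholder):

    <?php
    $html = @file_get_contents('https://example.com/');
    $doc  = new DOMDocument();
    @$doc->loadHTML($html);    // suppress warnings from messy real-world markup
    foreach ($doc->getElementsByTagName('link') as $link) {
        if (stripos($link->getAttribute('rel'), 'icon') !== false) {
            echo $link->getAttribute('href'), "\n";   // may be a relative URL
        }
    }
    // If nothing turns up, fall back to https://example.com/favicon.ico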
Hope this helps.
PHP has a native function called getimagesize. If you can retrieve the favicon, you can run the function, determine whether it's high-res or not, and act accordingly.
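A rough sketch of that suggestion, assuming allow_url_fopen is enabled and using a placeholder favicon URL (getimagesize has understood ICO files since PHP 5.3):

    <?php
    $url  = 'https://wordpress.com/favicon.ico';   // the raw favicon, not a pre-scaled copy
    $info = @getimagesize($url);                   // returns false on failure
    if ($info !== false && $info[0] >= 64 && $info[1] >= 64) {
        echo "High-res favicon available: {$info[0]}x{$info[1]}\n";
    } else {
        echo "Fall back to the pre-made 64x64 icon with the small one layered over it\n";
    }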
Just a thought.
This is late to the party, but grabicon.com works very similarly to fvicon, but returns the image size you want without quality degradation. It adds a matching border around the smaller image to give you the icon size you require, so the icons on your web page all "match". If a site doesn't have an icon, grabicon creates a unique one for it.
Full disclosure, I'm grabicon's creator, but it's free and gives you what you're asking for :)

Updating already-indexed images: what's the effect on SERPs in Google image search?

I have over 144,000 images indexed in Google. However, the current watermark is big and obtrusive, and to provide a better user experience I wish to change it to a much smaller watermark. But since all these images are already indexed, I was wondering what effect the change would have on SERPs in Google and Yahoo image searches, as they are the biggest source of traffic to my website.
Unfortunately, when you change an image, Google usually treats it as a new image. I recommend changing a small percentage and doing a test run for at least a few weeks to see what the changes are in your SERPs.

Optimal image size for browser rendering

The question
Is there a known benchmark or theoretical substantiation of the optimal (rendering-speed-wise) image size?
A little background
The problem is as follows: I have a collection of very large images, thousands of pixels wide in each dimension. These should be presented to the user and manipulated somehow. To improve the performance of my web app, I need to slice them into tiles. And here is where my question arises: what should the dimensions of these slices be?
You can only find out by testing; every browser will have different performance parameters, and your user base may have anything from a mobile phone to a 16-core Xeon desktop. The bigger determining factor may actually be the network performance in loading new tiles, which depends entirely on how you are hosting and who your users are.
As the others already said, you can save a lot of research by duplicating the sizes already used by similar projects: Google Maps, Bing Maps, any other mapping system, not forgetting some of the gigapixel projects like gigapan.
It's hard to give a definitive dimension, but I successfully used 256x256 tiles.
This is also the size used by Microsoft Deep Zoom technology.
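For what it's worth, slicing into 256x256 tiles is easy to script. Here is a minimal PHP sketch using the Imagick extension (the file names are placeholders, and tiles on the right and bottom edges simply come out smaller):

    <?php
    $tile = 256;
    $src  = new Imagick('large-image.jpg');
    for ($y = 0; $y < $src->getImageHeight(); $y += $tile) {
        for ($x = 0; $x < $src->getImageWidth(); $x += $tile) {
            $t = clone $src;
            $t->cropImage($tile, $tile, $x, $y);   // width, height, x offset, y offset
            $t->writeImage("tile-{$x}-{$y}.jpg");
            $t->destroy();
        }
    }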
In the absence of any other suggestions, I'd just use whatever Google Maps is using. I'd imagine they would have done such tests.