Is there PNG compression in ImageResizer like tinypng.org? - imageresizer

I have this project that displays a big list of transparent pngs. I use Cloudfront and ImageResizer to serve up my images:
media.mysite.com/Images/imageA.png;w=170
Now here is my pickle.
imageA.png is ~220kb --> After tinypng.org --> ~87kb / ~62% reduction
I have access to png "compression" libraries like that, but the problem is that the ImageResizer bumps the size back up to full (without compression)
imageA.png;w=170 is ~90kb --> After tinypng.org --> ~20kb / ~62% reduction
So even if I run my imageA.png through tinypng.org and then apply the ImageResizer resizing, I lose the PNG compression. And in the end this makes my page heavy.
(I am thinking about a workaround to load my images asynchronously.)
So I'm wondering if ImageResizer can do this PNG compression "on the fly"?
I'm reading about http://imageresizing.net/plugins/prettygifs but I'm not sure I understand how it works.
Thanks!

This appears to be a duplicate of PNG Compression and ImageResizer.
To sum up - PNG encoding is absolutely nothing like JPEG encoding.
Fully optimizing the encoding of a PNG file takes lots of time and memory. You can't resize an image without decoding and re-encoding it. ImageResizer prefers to get the image back to you fast (~100-250ms) with a larger file size, instead of taking 5-15 seconds and using all of your server's RAM to generate a smaller PNG. Consider how long tinypng.org takes to optimize your image. Would you be OK with that kind of delay, in real time, on your website?
ImageResizer offers WebP, which can give you a better speed/size balance than PNG.
Basically, you won't get much file size benefit out of using ImageResizer on pre-optimized PNG sprites.
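The cost curve is easy to see even with a general-purpose DEFLATE compressor (PNG pixel data is DEFLATE-compressed; the dedicated optimizers tinypng-style tools use go much further, which is why they are slower still). A minimal Python sketch of the speed/size trade-off, using synthetic data rather than a real PNG:

```python
import time
import zlib

# Synthetic "image-like" data: long flat runs plus some variation,
# roughly like the filtered scanlines of a simple PNG.
data = (b"\x00" * 64 + bytes(range(192))) * 4096  # 1 MiB

for level in (1, 6, 9):
    start = time.perf_counter()
    out = zlib.compress(data, level)
    elapsed = (time.perf_counter() - start) * 1000
    print(f"level {level}: {len(out):>7} bytes in {elapsed:.1f} ms")
```

Higher levels trade CPU time for smaller output; tools like zopfli or pngout push the same trade-off much further, which is where the multi-second optimization times come from.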

This plugin works great for me
https://github.com/svenrog/ImageResizer.Plugins.PngOptimizer
Just build it (I couldn't find it on nuget) and add the following to your <plugins> section of web.config
<add name="PngOptimizer" />
Then add &optimized=1 to your URL parameters.
Make sure you are using some kind of cache - Amazon CloudFront works best because you can point it at your website and it will cache and serve any file.
Important: There's a typo in the example and it should be optimized=1 and not optimize=1
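Putting the answer's steps together, a minimal web.config sketch (assuming an otherwise standard ImageResizer setup; only the `<add>` line comes from the plugin's README):

```xml
<configuration>
  <resizer>
    <plugins>
      <!-- Built from https://github.com/svenrog/ImageResizer.Plugins.PngOptimizer -->
      <add name="PngOptimizer" />
    </plugins>
  </resizer>
</configuration>
```

A request would then look like media.mysite.com/Images/imageA.png?w=170&optimized=1.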

Slow loading of images from Amazon S3 and CloudFront

I am using Amazon S3 for hosting images. I have a lot of images on my website.
I am also using CloudFront distributions as a CDN.
The image URLs are fine.
But my images still load slowly compared to some other top, competing websites.
Is there any way to load images faster?
Thanks
There could be numerous other problems with images:
Loading too many images on the page. Make sure that you have lazy loading of your images that are not visible on initial render.
Using wrong size of images. This can be fixed by resizing images to correct size. Also, do not forget about responsive images. You can read more about them here
Not using next-generation formats. For instance, look at using WebP for Chrome and JPEG 2000 for Safari.
You can use Lighthouse tool to test your website on all problems listed above.
Also, it might be worth considering a specialized image CDN like pixboost.com.
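The lazy-loading and next-generation-format tips above can be combined in plain HTML (file names here are hypothetical; the type attribute lets browsers without WebP support fall back to the JPEG):

```html
<picture>
  <source srcset="product.webp" type="image/webp">
  <img src="product.jpg" alt="Product photo"
       width="400" height="300" loading="lazy">
</picture>
```

loading="lazy" defers off-screen images until the user scrolls near them.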
Using a CDN like CloudFront is the first step towards accelerating images. It addresses the challenge of global distribution (your website is hosted in Europe but you have visitors from Australia => images will load from CloudFront's node in Australia and be obviously faster than travelling from Europe). It also helps absorb traffic peaks, for example during sales, Christmas, ...
To go further with image acceleration, you need to work on the images themselves and focus on 2 things:
resize the images to the target size (thumbnail, preview, full size, ...) and have different sizes for different screen sizes.
use image compression algorithms to "shrink" your images. You can use JPEG compression or alternative image formats like WebP, JPEG 2000, JPEG XR, ... These formats usually compress better than JPEG; however, they come with a big limitation: they are only supported by specific browsers. Check caniuse.com for browser support information: https://caniuse.com/#feat=webp
Bottom line, you will end up needing 15-20 versions of the same image to get the maximum optimisation across all browsers, device screen sizes, use cases, ...
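Serving those multiple versions is typically done with srcset/sizes in HTML (file names hypothetical), letting the browser pick the closest match for its viewport and pixel density:

```html
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Hero">
```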
There are multiple ways for automating this, for example by using ImageMagick. It's a great lib, but it requires coding and maintenance as it evolves quite dynamically.
Another option, is to use a Cloud-based image acceleration and delivery service. These services usually bundle image resizing and CDN delivery together and probably get you better CDN pricing as they negotiate big contracts with multiple CDN vendors.
We use https://cloudimage.io, but there are other great tools out there. Google is your best friend :).
Good luck with accelerating your page, faster images will definitely have a great impact.

How to convert scanned document images to a PDF document with high compression?

I need to convert scanned document images to a PDF document with high compression. Compression ratio is very important. Can someone recommend any solution on C# for this task?
Best regards, Alexander
There is a free program called PDFBeads that can do it. It requires Ruby, ImageMagick and optionally jbig2enc.
The PDF format itself will probably add next to no overhead in your case. I mean your images will account for most of the output file size.
So, you should compress your images with highest possible compression. For black-and-white images you might get smallest output using FAX4 or JBIG2 compression schemes (both supported in PDF files).
For other images (grayscale, color) either use smallest possible size, lowest resolution and quality, or convert images to black-and-white and use FAX4/JBIG2 compression scheme.
Please note that you will most probably lose some detail of any image when converting it to black-and-white.
If you are looking for a library that can help you with recompression then have a look at Docotic.Pdf library (Disclaimer: I am one of developers of the library).
The Optimize images sample code shows how to recompress images before adding them to PDF. The sample shows how to recompress with JPEG, but for FAX4 the code will be almost the same.
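As a sketch of the black-and-white route described above (using Pillow rather than the libraries named here, and a synthetic image standing in for a real scan), CCITT Group 4 is the same bilevel scheme that PDF supports:

```python
import io

from PIL import Image, ImageDraw  # Pillow is assumed to be installed

# Synthetic stand-in for a grayscale scanned page of text.
scan = Image.new("L", (800, 1000), 245)
draw = ImageDraw.Draw(scan)
for y in range(100, 900, 40):
    draw.rectangle([80, y, 720, y + 12], fill=20)  # fake lines of text

# Threshold to bilevel (mode "1"); as noted above, some detail is lost here.
bw = scan.point(lambda p: 255 if p > 160 else 0).convert("1")

# CCITT Group 4 ("FAX4") compression, here written into a TIFF container.
g4 = io.BytesIO()
bw.save(g4, format="TIFF", compression="group4")

print(f"Group 4: {g4.tell()} bytes vs raw bilevel: {len(bw.tobytes())} bytes")
```

A PDF library can then embed the Group 4-compressed bilevel data; the savings come almost entirely from the compression step, not the container.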

Possibilities to compress an image (size)

I am implementing an application in which I need to find a way to compress images (in size), because that will help a lot in keeping database (server) space manageable. Please help me with this.
Thanks in advance,
Sekhar Behalam.
Your options are to reduce the dimension of the images and/or reduce the quality by increasing the compression. Are the images photographic in nature (JPG is best) or simple solid colour graphics (use PNGs)?
If the images are JPG (lossy compression) you can simply load and re-save them with a higher compression setting. This can result in a large space saving.
The image quality will of course decline, but you can get away with quite a lot of compression in JPG before it is noticeable. What is acceptable of course is determined by the use of the images (which you have not stated).
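A minimal sketch of the "load and re-save with more compression" idea, using Pillow and a generated gradient standing in for a photo:

```python
import io

from PIL import Image  # Pillow is assumed to be installed

# Generated gradient standing in for a photographic image.
img = Image.new("RGB", (640, 480))
img.putdata([(x % 256, y % 256, (x + y) % 256)
             for y in range(480) for x in range(640)])

high = io.BytesIO()
img.save(high, format="JPEG", quality=90)

low = io.BytesIO()
img.save(low, format="JPEG", quality=50, optimize=True)  # more compression

print(f"quality=90: {high.tell()} bytes, quality=50: {low.tell()} bytes")
```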
Hope this helps.
Also consider pngcrush, which is a utility that is included with the SDK. In the Project Settings in Xcode, there's an option to "Compress PNG Images." Make sure to check that. Note that this only works for resource images (as far as I know)—but you haven't stated if these images will be user-created/instantiated, or brought into the app bundle directly.

Website optimization

How can I speed up the loading of images - especially when I open the website for the first time, it takes some time for the images to load...
Is there anything I can do to improve this (HTML, CSS)?
link
Thank to all for your answers.
Crop http://www.ursic-ei.si/datoteke/d4.jpg! It's 900 pixels wide, and most of that (half?) is empty and white. Make the image smaller and then use background-position and background-color to compensate for anything you trimmed off the edges.
You have a lot of extra newlines in your HTML source. Not hugely significant, but theoretically - since in HTML there's no practical difference between one new line and two - you might want to remove some.
For images, you should consider a content delivery network (CDN), which will cache your images and other files and serve them faster than your web server.
This is a must for any high-traffic website.
On the client, you can enable parallel downloads; e.g. in Firefox there's a bunch of settings under network.http.pipelining that help speed up downloads.
On the server, there's little you can do (though you can gzip text-based files). The client must just know how to cache.
Since in your question you only ask about the images, I guess you already know that the cost of php processing and/or javascript is minor. If you want to speed up the images you can reduce their size, increase the compression rate... also try different formats. JPG is not always the best one.
Try GIF and/or PNG, and with these you can also reduce the number of colors. Usually these formats are way better than JPG when you have simple pictures with few colors.
Also consider whether some of your images are a simple pattern that can be repeated several times. For example, if you have a background image with a side banner, you just need one line of it and can repeat it.
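Reducing the number of colours can be sketched with Pillow's quantize (a hypothetical three-stripe graphic stands in for a real image): palettised pixels take 1 byte each instead of 3 before PNG's compression even runs.

```python
import io

from PIL import Image  # Pillow is assumed to be installed

# Simple flat-colour graphic - the kind of picture that suits PNG/GIF.
img = Image.new("RGB", (300, 300), (255, 255, 255))
for i, colour in enumerate([(200, 0, 0), (0, 120, 0), (0, 0, 180)]):
    img.paste(colour, (i * 100, 0, (i + 1) * 100, 300))

truecolour = io.BytesIO()
img.save(truecolour, format="PNG")

pal = img.quantize(colors=8)  # reduce to an 8-colour palette
palettised = io.BytesIO()
pal.save(palettised, format="PNG")

print(f"24-bit: {truecolour.tell()} bytes, "
      f"8-colour palette: {palettised.tell()} bytes")
```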

Looking for a lossless compression api similar to smushit

Anyone know of a lossless image compression API/service similar to Smush.it from Yahoo?
From their own FAQ:
WHAT TOOLS DOES SMUSH.IT USE TO SMUSH IMAGES?
We have found many good tools for reducing image size. Often times these tools are specific to particular image formats and work much better in certain circumstances than others. To "smush" really means to try many different image reduction algorithms and figure out which one gives the best result.
These are the algorithms currently in use:
ImageMagick: to identify the image type and to convert GIF files to PNG files.
pngcrush: to strip unneeded chunks from PNGs. We are also experimenting with other PNG reduction tools such as pngout, optipng, pngrewrite. Hopefully these tools will provide improved optimization of PNG files.
jpegtran: to strip all metadata from JPEGs (currently disabled) and try progressive JPEGs.
gifsicle: to optimize GIF animations by stripping repeating pixels in different frames.
More information about the smushing process is available at the Optimize Images section of Best Practices for High Performance Web pages.
It mentions several good tools. By the way, the very same FAQ mentions that Yahoo will make Smush.it a public API sooner or later so that you can run it on your own. Until then you can just upload images separately to Smush.it here.
Try Kraken Image Optimizer: https://kraken.io/signup
The developer's plan is free - but only returns dummy results. You must subscribe to one of the paid plans to use the API, however, the Web Interface is free and unlimited for images of up to 1MB.
Find out more in the Kraken documentation.
See this:
http://github.com/thebeansgroup/smush.py
It's a Python implementation of smushit that can be run off-line to optimise your images without uploading them to Yahoo's service.
As far as I know, the best image compression for me is TinyPNG.
They also have an API: https://tinypng.com/developers
Once you retrieve your key, you can immediately start shrinking
images. Official client libraries are available for Ruby, PHP,
Node.js, Python and Java. You can also use the WordPress plugin, the
Magento 1 extension or improved Magento 2 extension to compress your
JPEG and PNG images.
And the first 500 images per month are free.
Tip: when using their API, there is no file-size limit (unlike their online tool's 5 MB maximum per image).