How to get exact size of image in bytes? - objective-c

I have calculated the image size in bytes by converting the image into NSData, but its data length gives what looks like a wrong value.
NSData *data = UIImageJPEGRepresentation(image, 0.5);
NSLog(@"image size in bytes %lu", (unsigned long)data.length);

Actually, the length property here is not returning a wrong value; it's just the result of the lossy conversion from the original JPEG data to a UIImage and back to NSData.
Setting the compressionQuality in UIImageJPEGRepresentation to 1.0, the least compression it allows, will not give you back the original file. Although the image metadata is stripped in this process, the function can and usually will produce an object larger than the original, and that increase in file size does NOT bring back any of the quality lost when the original was compressed. JPEGs are highly compressed to begin with, which is why they are used so often, and the function is decompressing the image and then recompressing it. It's kind of like getting Botox after age has stretched your body out: it might look similar to the original, but the insides are just not as good as they used to be.
You could conditionally use a lower compressionQuality on larger files, keeping it close to 1.0, since the quality drops off quickly below that. Other than that, depending on the final purpose of your images, the only other option is to resize the image or reduce its resolution, perhaps in addition to adjusting the compression ratio; that cuts data usage far more sharply than tweaking the quality alone. Web and mobile usage typically doesn't need the same resolution as images meant for digital print.
You can write some code that adjusts each image and its NSData representation only as much as needed to fit its individual size constraint.
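For example, a rough sketch of that idea (my own illustration, not code from the question) might look like the following; it assumes UIKit, and the maxBytes budget, quality steps, and minimum width are arbitrary:
#import <UIKit/UIKit.h>

// Shrink the JPEG representation of a UIImage until it fits under a byte budget:
// first lower the compression quality, then fall back to scaling the image down.
static NSData *JPEGDataFittingBudget(UIImage *image, NSUInteger maxBytes)
{
    CGFloat quality = 0.9;
    NSData *data = UIImageJPEGRepresentation(image, quality);

    // Try progressively stronger compression first.
    while (data.length > maxBytes && quality > 0.3) {
        quality -= 0.1;
        data = UIImageJPEGRepresentation(image, quality);
    }

    // If that is not enough, scale the pixel dimensions down and re-encode.
    while (data.length > maxBytes && image.size.width > 200.0) {
        CGSize smaller = CGSizeMake(image.size.width * 0.8, image.size.height * 0.8);
        UIGraphicsBeginImageContextWithOptions(smaller, NO, 1.0);
        [image drawInRect:CGRectMake(0, 0, smaller.width, smaller.height)];
        image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        data = UIImageJPEGRepresentation(image, quality);
    }
    return data;
}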

Related

TFRecord larger than the original data

Actually, I am dealing with many pictures that come from different videos, so I use tf.SequenceExample() to save them as separate sequences, with their labels attached, in a TFRecord.
But after running my code to generate the TFRecord, the resulting file is 29GB, while my original pictures are only about 3GB.
Is it normal for a TFRecord to be that much larger than the original data?
You may be storing the decoded images instead of the jpeg encoded ones. TFRecord has no concept of image formats so you can use any encoding you want. To keep the size the same, convert the original image file contents to a BytesList and store that without calling decode_image or using any image libraries or anything that understands image formats.
Another possibility is you might be storing the image as an Int64List full of bytes which would be 8x the size. Instead, store it as a BytesList containing a single Bytes.
Check the type of data you load. I guess you load the images as pixel data. Every pixel is uint8 (8 bit) and is likely converted to float (32 bit), so you should expect roughly 4 times the original size (3 GB -> 12 GB).
Also, the original format might have (better) compression than TFRecords. (I'm not sure whether TFRecords can use compression.)

Size in MemoryRequirements not what I'm expecting

I'm creating a texture, querying the memory requirements, and it's not what I was expecting. Here's the ImageCreateInfo structure:
ImageCreateInfo()
.X2D(1024, 1024)
.Format(Format::R8G8B8_UNORM)
.InitialLayout(ImageLayout::PREINITIALIZED)
.Tiling(ImageTiling::LINEAR)
.Usage(ImageUsageFlagBits::TRANSFER_SRC);
Now, I was expecting one byte for each of R,G,B, at width and height of 1024 to give memory requirements of 3 * 1024 * 1024 = 3,145,728. But instead, it returns 1,048,576, which is perfectly 1024 * 1024. It seems to not care about the one byte for each channel of RGB. What am I missing here?
You're right in that this should return 3,145,728 bytes, but is the R8G8B8_UNORM format actually available on your implementation? If not, you won't get a correct allocation size because you actually are not going to be able to use that image anyway.
If you enable validation layers this should throw an error from the image validation layers btw.
At least on the GPU I'm on right now, it's not supported for any of the tiling modes or as a buffer format. But e.g. R8G8B8A8 or R8G8 are available and return the correct allocation size.
If R8G8B8 is actually available on your GPU could you post your complete VkImageCreateInfo structure, including number of mips and layers?
So a good idea would be to check if the image format you request (and want to allocate for) is actually supported for your use case (linear, optimal, buffer).

Efficiently writing NSDictionary with an NSImage to a file

I have an NSMutableDictionary containing two NSStrings and one NSImage, which I am trying to write to a file. However, the output file is huge compared to the input image file (88 kB in, 1.1 MB out). The image is loaded via an Image Well drag and drop.
My first try was:
NSImage * image = _imageView.image;
[dictionary setObject:image forKey:@"Image"];
[NSKeyedArchiver archiveRootObject:dictionary toFile:filePath];
Result: 1.1MB
Second try:
NSImage * image = _imageView.image;
NSData * imageData = [image TIFFRepresentation];
NSBitmapImageRep * encodedImage = [NSBitmapImageRep imageRepWithData:imageData];
[dictionary setObject:encodedImage forKey:@"Image"];
[NSKeyedArchiver archiveRootObject:dictionary toFile:filePath];
Result: still 1.1MB
At first I thought that NSBitmapImageRep would keep the image in an (at least relatively) compressed format, but the file size does not differ by a single bit.
How should I write the NSMutableDictionary to a file so that it does not exceed the initial image file size by so much? Or how should I store the image efficiently in the dictionary in the first place?
I don't know if it matters, but the original format is png.
When you drop an image into an Image View, it gets loaded into an NSImage as a bitmap (color data for every pixel). It is no longer compressed the way the original image you dropped into it was (PNG in your case). PNG uses lossless compression, and its data footprint will in most cases be smaller than an uncompressed bitmap or TIFF representation of the image.
Then, when you request the TIFF representation of that image, it no longer has anything to do with the original PNG you loaded, apart from the pixel data itself.
If you want to store back the image in a compressed format or keep the original image, you can try one of the following:
Ask for a compressed TIFF representation using TIFFRepresentationUsingCompression:factor:. The result will still be a different format and you will lose all metadata.
Re-compress the image in your desired format, e.g. PNG or JPEG (for instance with the Quartz API), and store the compressed data in your dictionary; see the sketch after this list. You will still need to preserve the metadata yourself if you need it.
Keep a reference to the original image file, or store its data: override the Image View's drag/drop methods and keep track of the original image's URL, or load the file and store its raw data in the original format, then save that into your dictionary. You get an exact copy of the original image and don't have to deal with format conversions or metadata copying.
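For the second option, a minimal sketch (assuming AppKit, the variable names from the question, and NSBitmapImageFileTypePNG, which is NSPNGFileType on older SDKs; any metadata from the original file is dropped) could look like this:
NSImage *image = _imageView.image;
NSData *tiffData = [image TIFFRepresentation];                          // uncompressed pixel data
NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:tiffData];
NSData *pngData = [rep representationUsingType:NSBitmapImageFileTypePNG // losslessly re-compressed
                                    properties:@{}];
[dictionary setObject:pngData forKey:@"Image"];                         // store NSData, not NSImage
[NSKeyedArchiver archiveRootObject:dictionary toFile:filePath];
Since the dictionary now holds plain NSData, the archive should stay close to the size of the PNG data rather than the size of the uncompressed bitmap.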

How to quickly estimate file sizes of resized images on iOS?

In Mail, when I add an image and try to send it, it quickly asks me which size I want to send the images as. See screenshot:
I want to do something similar in an app where I will be uploading an image and want to enable the user to resize the image before it is uploaded. What is the best way to estimate the file size as Apple does here?
It seems that it would take too long to actually create each of the resized images only to check them for sizes. Is there a better way?
I did find this Apple sample code which helps a little bit but to be honest is a bit overwhelming. :)
The single biggest factor in determining the final compressed image size is not the image dimensions or the JPEG compression quality, but the image complexity (i.e. its entropy). If you know that you're always going to be dealing with highly detailed photos (as opposed to solid color fields or gradients), that somewhat reduces the variance along that dimension, but...
I spent a fair amount of time doing numerical analysis on this problem. I sampled the compressed image size of a detailed, high-resolution image that was scaled down in 10-percentage-point increments, at 9 different JPEG quality levels. This produced a 3-dimensional data set describing an implicit function z = f(x, y), where x is the scaled image size in pixels (w*h), y is the JPEG compression quality, and z is the size of the resulting image in bytes.
The resulting surface is hard to estimate. Counterintuitively, it has oscillations and multiple inflection points, meaning that a function of degree 2 in both x and y is insufficient to fit it, and increasing the polynomial degrees and creating custom fitting functions didn't yield significantly better results. Not only is it not a linear relation, it isn't even a monotonic relation. It's just complex.
Let's get practical. Notice when Apple prompts you for the image size: when you hit "Send", not when the image first appears in the mail composition view. That gives them as long as it takes you to compose your message to get the estimated image sizes ready. So my suspicion is this: they do it the hard way. Scaling the image to the different sizes can be parallelized and performed in the background, and even though it takes several seconds on iPhone 4-caliber hardware, all of that work can be hidden from the user. If you're concerned about memory usage, you can write the images to temporary files and render them sequentially instead of in parallel, which will use no more than about twice the memory of the uncompressed image.
In summary: unless you know a lot about the expected entropy of the images you're compressing, any estimation function will be wildly inaccurate for some class of images. If you can handle that, then it's fairly easy to do a linear or quadratic fit on some sample data and produce a function for estimation purposes. However, if you want to get as close as Apple does, you probably need to do the actual resizing work in the background, since there are simply too many factors to construct a heuristic that gets it right all of the time.
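To make the "do it the hard way" approach concrete, here is one possible sketch (my own, assuming UIKit and GCD; the scale factors, JPEG quality, and completion block are made up for illustration):
// Render each preset size off the main thread, measure the JPEG data,
// and hand the byte counts back on the main queue.
- (void)estimateSizesForImage:(UIImage *)image
                   completion:(void (^)(NSArray<NSNumber *> *byteCounts))completion
{
    NSArray<NSNumber *> *scales = @[@0.25, @0.5, @0.75];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSMutableArray<NSNumber *> *sizes = [NSMutableArray array];
        for (NSNumber *scale in scales) {
            CGSize target = CGSizeMake(image.size.width * scale.doubleValue,
                                       image.size.height * scale.doubleValue);
            UIGraphicsBeginImageContextWithOptions(target, NO, 1.0);
            [image drawInRect:CGRectMake(0, 0, target.width, target.height)];
            UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [sizes addObject:@(UIImageJPEGRepresentation(resized, 0.75).length)];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            completion(sizes);
        });
    });
}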
I have built a method that would resize the image, like so:
-(UIImage *)resizeImage:(UIImage *)image width:(CGFloat)resizedWidth height:(CGFloat)resizedHeight
{
    CGImageRef imageRef = [image CGImage];
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // 8 bits per component, 4 bytes per pixel (premultiplied ARGB)
    CGContextRef bitmap = CGBitmapContextCreate(NULL, (size_t)resizedWidth, (size_t)resizedHeight, 8,
                                                4 * (size_t)resizedWidth, colorSpace,
                                                kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorSpace); // the context holds its own reference
    CGContextDrawImage(bitmap, CGRectMake(0, 0, resizedWidth, resizedHeight), imageRef);
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap);
    CGImageRelease(ref); // UIImage retains the CGImage, so this is safe
    return result;
}
And to get the size of the image, you would have to convert it into NSData, and ask for the length:
UIImage *actualImage = [UIImage imageNamed:@"image"];
NSData *actualImageData = UIImagePNGRepresentation(actualImage);
NSLog(@"Actual %f KB", (CGFloat)actualImageData.length / (CGFloat)1024);
UIImage *largeImage = [self resizeImage:actualImage width:actualImage.size.width * 0.8 height:actualImage.size.height * 0.8];
NSData *largeImageData = UIImagePNGRepresentation(largeImage);
NSLog(@"Large %f KB", (CGFloat)largeImageData.length / (CGFloat)1024);
UIImage *mediumImage = [self resizeImage:actualImage width:actualImage.size.width * 0.5 height:actualImage.size.height * 0.5];
NSData *mediumImageData = UIImagePNGRepresentation(mediumImage);
NSLog(@"Medium %f KB", (CGFloat)mediumImageData.length / (CGFloat)1024);
UIImage *smallImage = [self resizeImage:actualImage width:actualImage.size.width * 0.3 height:actualImage.size.height * 0.3];
NSData *smallImageData = UIImagePNGRepresentation(smallImage);
NSLog(@"Small %f KB", (CGFloat)smallImageData.length / (CGFloat)1024);
You can always use UIImageJPEGRepresentation to compress an image. The four options could correspond to compression quality values of 0.25, 0.5, 0.75 and 1.0, and the resulting size of each can be found easily by applying the same method to the image at each value.
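A quick sketch of that suggestion (my own; `image` stands for whatever you are about to upload, and the quality values mirror the ones above):
NSArray<NSNumber *> *qualities = @[@0.25, @0.5, @0.75, @1.0];
for (NSNumber *quality in qualities) {
    // Encode at each quality and log the resulting size; for large images
    // these encodes could be moved to a background queue.
    NSData *data = UIImageJPEGRepresentation(image, quality.floatValue);
    NSLog(@"quality %.2f -> %.1f KB", quality.floatValue, data.length / 1024.0);
}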
The image sizes offered in the Mail app are only estimates; the actual file size of the sent image will differ. It would also be far too slow to convert a full-size image (3264 x 2448 on the iPhone 4S) to each of the various sizes just to measure them.
[edit]
The compressed file sizes aren't linear in pixel count, so you can't just scale the original file size by the pixel-count ratio to accurately estimate the file size of the smaller images.
So that this answer isn't totally useless, here are the image sizes Mail.app exports at:
Small: 320x240
Medium: 640x480
Large: 1224x1632
If you store it in an NSData, you can call [NSData length] to get the number of bytes it contains and then divide to convert that to kB or MB.

UIImagePicker and UIImage size

I have a UIImagePickerController, and I'm saving my image in my app with just UIImagePickerControllerOriginalImage and:
[self.fileManager createFileAtPath:aPath contents:UIImageJPEGRepresentation(image, 1.f) attributes:nil];
Result:
original image from Photo.app -> 2.2 MB
new saved image from my app -> 5.3 MB with the JPEG representation & 10.8 MB with the PNG representation!
So my question is quite simple: why? And how can I get down to the Photo.app size?
Thanks for your help :)
As you probably know, the second parameter passed to UIImageJPEGRepresentation defines the compression quality (1 being the highest). Because the image is decoded back to raw pixel data and then re-compressed (JPEG is a compressed image format), the result may compress worse (a larger file), and of course the image quality will not get any better. Try lowering the parameter to somewhere between 0.0 and 1.0 and see where you get the best match in file size (it will be different for each image processed, so experiment to find a good value in the middle).