I would like to know how to retrieve the dimensions of an image in Objective-C. I have searched Google and the documentation without success, unfortunately.
NSImage has a size method, returning an NSSize struct containing the width and height of the image, or {0, 0} if the size couldn't be determined.
NSImage *image = [NSImage imageNamed:@"myimage.png"];
NSSize dimensions = [image size];
NSLog(@"Width: %lf, height: %lf", (double)dimensions.width, (double)dimensions.height);
Edit:
I have placed explicit casts to double because the width and height members are of type CGFloat, which on 64-bit systems is double rather than float, and we would otherwise risk passing the wrong types to NSLog. A double can represent every float value, so casting up is safe.
In iOS, UIImage has a size property, which returns a CGSize.
In OS X, NSImage has a size method which returns an NSSize. It also has a representations property, which returns an NSArray of NSImageRep objects. NSImageRep has a size method.
The size method is correct for the size in points. Sometimes you need the pixel dimensions rather than the nominal size, in which case you must select an image rep (an NSImage can have several) and query its pixelsWide and pixelsHigh properties. (Alternatively, load the image with CGImage and use CGImageGetWidth() and CGImageGetHeight().)
You could use the NSString+FastImageSize category from the Additions section in https://github.com/danielgindi/drunken-danger-zone
I've only recently written it, and it seems to work fine.
This code actually opens the file and checks the headers for the dimensions (which makes it really fast), and supports PNG, JPG, GIF, and BMP. Although I really hope you don't use BMP on a mobile device...
Enjoy!
I'm using the following code to scale down my image:
NSImage *smallImage = [[NSImage alloc] initWithSize:CGSizeMake(width, height)];
[smallImage lockFocus];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
[image drawInRect:CGRectMake(0, 0, width, height)
         fromRect:NSZeroRect
        operation:NSCompositeCopy
         fraction:1.0];
[smallImage unlockFocus];
Basically this works fine, but if I set the width and height exactly to the original's and compare the images pixel by pixel, some pixels have still changed.
And since my app is pixel-sensitive, I need to make sure every pixel is correct, so I'm wondering how I can keep the pixels as they are during such a scale-down. Is that possible?
Yes, NSImage will change the image data in various ways. It attempts to optimize the "payload" image data according to the size needed for its graphical representation on the UI.
Scaling it down and up again is generally not a good idea.
AFAIK you can only avoid that by keeping the original image data somewhere else (e.g. on disk or in a separate NSData container).
If you need to apply calculations or manipulations to the image data that must be 100% accurate down to each pixel, then work with NSData or raw byte arrays only. Avoid NSImage unless
a) the result is only for presentation on the device, or
b) you really need functionality that comes with NSImage objects.
I'll explain the problems in principle rather than scientifically.
Pixels have a fixed size, for technical reasons.
No, you can't keep your pixels when scaling down.
An example to explain: suppose a pixel is 0.25 inch square, and you want to fill a square that is 1.1 inch wide. It's impossible: how many pixels should be used? Four are too few, five too many. Somewhere in the Cocoa libraries a decision is made: either more pixels (enlarging the square) or fewer (shrinking it). That is out of your control.
Another problem, also out of your control, is the way measurements are computed.
An example: 1 inch is nearly 2.54 cm, so 1.27 cm is 0.5 inch, but what is 1.25 cm? Internally, values are computed using one unit of measure: I think it's inches (as a double with a fixed number of digits after the decimal point). When you work in cm, the value is converted internally to inches, some mathematical operations are done (e.g. how many pixels are necessary for the square?), and the result is sent back, possibly converted back to cm. The same happens with integers, which are computed internally as doubles and returned as integers. Funny things, i.e. unexpected values, come out of this, especially after divisions, which are exactly what scaling down uses!
By the way: when an image is scaled, new pixels are often created for the scaled image. For example, if you have four pixels, two red and two blue, the one resulting pixel has a mixed color, somewhere around purple. There is no way back, so always work on copies of an image!
In Mail, when I add an image and try to send it, it quickly asks me which size I want to send the images as. See screenshot:
I want to do something similar in an app where I will be uploading an image and want to enable the user to resize the image before it is uploaded. What is the best way to estimate the file size as Apple does here?
It seems that it would take too long to actually create each of the resized images only to check them for sizes. Is there a better way?
I did find this Apple sample code which helps a little bit but to be honest is a bit overwhelming. :)
The single biggest factor in determining the final compressed image size is not image size or JPEG compression quality, but image complexity (its entropy). If you know you'll always be dealing with highly detailed photos (as opposed to solid color fields or gradients), that somewhat reduces the variance along that dimension, but...
I spent a fair amount of time doing numerical analysis on this problem. I sampled the compressed image size of a detailed, high-resolution image that was scaled down in 10 percentage point increments, at 9 different JPEG quality levels. This produced a 3-dimensional data set describing an implicit function z = f(x, y), where x is the scaled image size in pixels (w*h), y is the JPEG compression quality, and z is the size of the resulting image in bytes.
The resulting surface is hard to estimate. Counterintuitively, it has oscillations and multiple inflection points, meaning that a function of degree 2 in both x and y is insufficient to fit it, and increasing the polynomial degrees and creating custom fitting functions didn't yield significantly better results. Not only is it not a linear relation, it isn't even a monotonic relation. It's just complex.
Let's get practical. Notice when Apple prompts you for the image size: when you hit "Send", not when the image first appears in the mail composition view. This gives them as long as it takes to compose your message before they have to have the estimated image sizes ready. So my suspicion is this: they do it the hard way. Scaling the image to the different sizes can be parallelized and performed in the background, and even though it takes several seconds on iPhone 4-caliber hardware, all of that work can be hidden from the user. If you're concerned about memory usage, you can write the images to temporary files and render them sequentially instead of in parallel, which will use no more than ~2x the size of the uncompressed image in memory.
In summary: unless you know a lot about the expected entropy of the images you're compressing, any estimation function will be wildly inaccurate for some class of images. If you can handle that, then it's fairly easy to do a linear or quadratic fit on some sample data and produce a function for estimation purposes. However, if you want to get as close as Apple does, you probably need to do the actual resizing work in the background, since there are simply too many factors to construct a heuristic that gets it right all of the time.
I have built a method that would resize the image, like so:
-(UIImage *)resizeImage:(UIImage *)image width:(CGFloat)resizedWidth height:(CGFloat)resizedHeight
{
    CGImageRef imageRef = [image CGImage];
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bitmap = CGBitmapContextCreate(NULL, (size_t)resizedWidth, (size_t)resizedHeight,
                                                8, 4 * (size_t)resizedWidth, colorSpace,
                                                kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorSpace); // the context retains the color space; avoid leaking it
    CGContextDrawImage(bitmap, CGRectMake(0, 0, resizedWidth, resizedHeight), imageRef);
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap);
    CGImageRelease(ref);
    return result;
}
And to get the size of the image, you would have to convert it into NSData, and ask for the length:
UIImage *actualImage = [UIImage imageNamed:@"image"];
NSData *actualImageData = UIImagePNGRepresentation(actualImage);
NSLog(@"Actual %f KB", (CGFloat)actualImageData.length / (CGFloat)1024);
UIImage *largeImage = [self resizeImage:actualImage width:actualImage.size.width * 0.8 height:actualImage.size.height * 0.8];
NSData *largeImageData = UIImagePNGRepresentation(largeImage);
NSLog(@"Large %f KB", (CGFloat)largeImageData.length / (CGFloat)1024);
UIImage *mediumImage = [self resizeImage:actualImage width:actualImage.size.width * 0.5 height:actualImage.size.height * 0.5];
NSData *mediumImageData = UIImagePNGRepresentation(mediumImage);
NSLog(@"Medium %f KB", (CGFloat)mediumImageData.length / (CGFloat)1024);
UIImage *smallImage = [self resizeImage:actualImage width:actualImage.size.width * 0.3 height:actualImage.size.height * 0.3];
NSData *smallImageData = UIImagePNGRepresentation(smallImage);
NSLog(@"Small %f KB", (CGFloat)smallImageData.length / (CGFloat)1024);
You can always use UIImageJPEGRepresentation to compress an image. The four options could correspond to compression qualities of 0.25, 0.5, 0.75, and 1.0, and the resulting sizes are easy to find by applying the same method to the image and measuring the data length.
The image sizes provided in the Mail app are only estimates; the actual file size of the sent image is different. It would also be far too slow to convert a full-size image (3264 x 2448 on the iPhone 4S) to the various sizes just to get the file size.
[edit]
Compressed file sizes aren't linear in pixel count, so you can't just scale by numPixels/filesize to accurately estimate the file size of smaller images.
So that this answer isn't totally useless, here are the image sizes Mail.app exports at:
Small: 320x240
Medium: 640x480
Large: 1224x1632
If you store it in an NSData, you can call [NSData length] to get the number of bytes it contains and then divide to get the size in KB or MB.
I'm creating a CIImage from a UIImage as follows:
CIImage* someCIImage = [CIImage imageWithCGImage:someUIImage.CGImage];
Comparing someCIImage.extent.size.width/height with someUIImage.size.width/height, I find that sometimes the CIImage is double the size in dimensions, and sometimes it's the same size as the UIImage.
It seems like if the UIImage is slightly larger, the CIImage is double the size, whereas if the UIImage is slightly smaller this isn't the case.
Has anyone seen this before/know why this is? This is causing me real trouble as I'm trying to draw a CIImage from a loaded UIImage.
size is measured in points, which relate to distance on the display; on a Retina display there are two pixels per point in each dimension. size does not represent the number of pixels, as you assumed, while the CIImage's extent is in pixels, so for a @2x image the extent is double the point size.
I think I'm missing something really basic here. If I do this with a valid URL/path which I know exists:
NSImage* img = [[NSImage alloc] initWithContentsOfFile:[[selectedItem url] path]];
NSLog(@"Image width: %d height: %d", [img size].width, [img size].height);
then I get reported to the console that the width is -2080177216 and the height 0. Although I know that the width is actually 50 and the height 50. I tried calling isValid and it returns YES, and I also tried checking the size of the first representation and it returned the same messed up values. How come the image is not loading properly?
The size method returns an NSSize, a struct whose width and height members are floating-point (CGFloat). You're treating them as int with %d, so NSLog reads garbage. Use %f and everything should be fine.
Does this help?
setSize:
Sets the width and height of the image.
- (void)setSize:(NSSize)aSize
Discussion:
The size of an NSImage object must be set before it can be used. If the size of the image hasn’t already been set when an image representation is added, the size is taken from the image representation's data. For EPS images, the size is taken from the image's bounding box. For TIFF images, the size is taken from the ImageLength and ImageWidth attributes.
Changing the size of an NSImage after it has been used effectively resizes the image. Changing the size invalidates all its caches and frees them. When the image is next composited, the selected representation will draw itself in an offscreen window to recreate the cache.
Availability
Available in Mac OS X v10.0 and later.
See Also
See NSImage size not real size with some pictures?
You need to iterate through the NSImageReps and set the size from the largest one found.
Objective-C / Cocoa:
I need to load the image from a JPG file into a two dimensional array so that I can access each pixel. I am trying (unsuccessfully) to load the image into a NSBitmapImageRep. I have tried several variations on the following two lines of code:
NSString *filePath = [NSString stringWithFormat:@"%@%@", @"/Users/adam/Documents/phoneimages/", [outLabel stringValue]]; //this coming from a window control
NSImageRep *controlBitmap = [[NSImageRep alloc] imageRepWithContentsOfFile:filePath];
With the code shown, I get a runtime error: -[NSImageRep imageRepWithContentsOfFile:]: unrecognized selector sent to instance 0x100147070.
I have tried replacing the second line of code with:
NSImage *controlImage = [[NSImage alloc] initWithContentsOfFile:filePath];
NSBitmapImageRep *controlBitmap = [[NSBitmapImageRep alloc] initWithData:controlImage];
But this yields a compiler error ('incompatible type') saying that initWithData: wants an NSData, not an NSImage.
I have also tried various other ways to get this done, but all are unsuccessful either due to compiler or runtime error. Can someone help me with this? I will eventually need to load some PNG files in the same way (so it would be nice to have a consistent technique for both).
And if you know of an easier / simpler way to accomplish what I am trying to do (i.e., get the images into a two-dimensional array), rather than using NSBitmapImageRep, then please let me know! And by the way, I know the path is valid (confirmed with fileExistsAtPath) -- and the filename in outLabel is a file with .jpg extension.
Thanks for any help!
Easy!
NSImage *controlImage = [[NSImage alloc] initWithContentsOfFile:filePath];
NSBitmapImageRep *imageRep = (NSBitmapImageRep *)[[controlImage representations] objectAtIndex:0];
Then to get the actual bitmap pixel data:
unsigned char *pixelData = [imageRep bitmapData];
If your image has multiple representations (it probably doesn't), you can get them out of that same array. The same code will work for your .png images.
Edit:
Carl has a better answer. This is only good if you also want to manipulate the image in some way, like scaling or changing color mode.
Original:
I would probably use Core Graphics for this. NSImage and most NSImageReps were not designed to keep a pixel array sitting around. They convert source image data into pixels only when drawn.
When you create a CGBitmapContext, you can pass it a buffer of pixels to use. That is your two dimensional array, with whatever row bytes, color depth, pixel format and other properties you specify.
You can initialize a CGImage with JPG or PNG data using CGImageCreateWithJPEGDataProvider or CGImageCreateWithPNGDataProvider respectively.
Once you draw the image into the context with CGContextDrawImage the original buffer you passed to CGBitmapContextCreate is now filled with the pixel data of the image.
Create a CGDataProvider with the image data.
Create a CGImageRef with the data provider.
Create a buffer large enough for the image pixels.
Create a CGBitmapContext that is the size of the image with the pixel buffer.
Draw the image into the bitmap context.
Access the pixels in the buffer.
If you want to use NSImage instead of CGImage, you can create an NSGraphicsContext from a CGContext with graphicsContextWithGraphicsPort:flipped: and set it as the current context. That basically replaces steps 1 and 2 above with whatever code you want to use to make an NSImage.
imageRepWithContentsOfFile: is a class method, not an instance method.
Correct way to use it:
[NSBitmapImageRep imageRepWithContentsOfFile:filePath];