How can I extract image information dumped in NSData - objective-c

I have, as input, an image dumped into an NSData object. Now I want to extract relevant information about the image from this object, such as the number of pixels, the number of bits per pixel, etc.
Can anyone tell me how to extract this info from the NSData object dump?
P.S.: I have gone through this documentation of the NSData class, but could not isolate the relevant methods.

The easiest way is to actually build a UIImage object from the NSData and then extract the info from the UIImage.
UIImage* image = [UIImage imageWithData:yourData];
NSLog(#"Image is %dx%d",image.size.width, image.size.height);
If you are only interested in the properties of the image and don't want to actually build its bitmap representation just to get at them, take a look at CGImageSource:
#import <ImageIO/ImageIO.h>
CGImageSourceRef imgSrc = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
size_t nbImages = CGImageSourceGetCount(imgSrc);
for (size_t idx = 0; idx < nbImages; ++idx)
{
    // __bridge_transfer hands ownership of the Copy'd dictionary over to ARC
    NSDictionary *props = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSrc, idx, NULL);
    NSLog(@"properties for image %lu in imageSource: %@", (unsigned long)idx, props);
}
CFRelease(imgSrc);
[EDIT] For this to work, remember to add ImageIO.framework under "Link Binary With Libraries" in your target's build phases.
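If you only care about specific values (the question mentions the number of pixels and bits per pixel), you can read individual keys out of that properties dictionary. A minimal sketch, assuming data is the NSData object from the question:
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
if (src != NULL)
{
    NSDictionary *props = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(src, 0, NULL);
    NSNumber *pixelWidth    = [props objectForKey:(NSString *)kCGImagePropertyPixelWidth];
    NSNumber *pixelHeight   = [props objectForKey:(NSString *)kCGImagePropertyPixelHeight];
    NSNumber *bitsPerSample = [props objectForKey:(NSString *)kCGImagePropertyDepth]; // bits per sample, not per pixel
    NSLog(@"%@ x %@ pixels, %@ bits per sample", pixelWidth, pixelHeight, bitsPerSample);
    CFRelease(src);
}
Note that kCGImagePropertyDepth is the bit depth per sample; multiply by the number of components if you need a per-pixel figure.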

Convert the data to UIImage, then take a look at this post.
UIImage *image = [UIImage imageWithData:data];

Related

Load an image file to core data

I'm a new developer working with Objective-C.
I have images stored in a folder on my computer and I want to insert them into my Core Data store. Can someone please tell me how I can do this? I can't find the right way to do it.
By the way, it's my first question, so don't get mad at me :)
Thank you
You need an entity (e.g. Photo), and this entity needs an attribute of type Binary Data (e.g. imageData).
Here is an example of how to save and how to show the image:
// Save the image.
UIImage *imageToSave; // the image you want to store
NSManagedObject *newPhoto = [NSEntityDescription insertNewObjectForEntityForName:@"Photo" inManagedObjectContext:yourManagedObjectContext];
NSData *imageData = UIImageJPEGRepresentation(imageToSave, 0.7);
[newPhoto setValue:imageData forKey:@"imageData"];
NSError *error = nil;
[yourManagedObjectContext save:&error];
// Show the image.
UIImage *image = [UIImage imageWithData:[newPhoto valueForKey:@"imageData"]];
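If you later need to load the saved photos somewhere else, a rough sketch of fetching them back (assuming the entity is named "Photo" and the attribute "imageData", as above) could be:
NSFetchRequest *fetchRequest = [NSFetchRequest fetchRequestWithEntityName:@"Photo"];
NSError *fetchError = nil;
NSArray *results = [yourManagedObjectContext executeFetchRequest:fetchRequest error:&fetchError];
for (NSManagedObject *photo in results)
{
    UIImage *storedImage = [UIImage imageWithData:[photo valueForKey:@"imageData"]];
    // use storedImage, e.g. assign it to a UIImageView
}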

Converting a UIImage to jpeg/png file for sharing using AddThis Plug-in

I want to convert a UIImage into a format such as a jpeg or png so that I can then share that file using the IOS plug-in called "AddThis".
I tried to share it using just the UIImage but the plug-in doesn't support it so I need to find a way to convert the UIImage to a jpeg first, then add it into this code:
[AddThisSDK shareImage:[UIImage imageNamed:@"test.jpg"] withService:@"twitter" title:@"I'm sharing something" description:@"Random description of image"];
The code has to use shareImage:[UIImage imageNamed:@""], otherwise an error occurs.
So far I've tried to convert it using UIImageJPEGRepresentation but I don't think I've done it properly. To be honest I tried to do it similarly to how you'd convert it straight from taking an image:
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"photo_boom.jpg"];
[UIImageJPEGRepresentation(shareImage, 1.0) writeToFile:jpgPath atomically:YES];
NSError *error;
NSFileManager *fileMgr = [NSFileManager defaultManager];
NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"photo_boom.jpg"];
NSLog(@"Documents directory: %@", [fileMgr contentsOfDirectoryAtPath:documentsDirectory error:&error]);
Something tells me this isn't the correct way... mainly because I haven't been able to get it to work.
I'd really appreciate any kind of help!
Basically, I've made a UIImage by converting a UIView:
UIGraphicsBeginImageContext(firstPage.bounds.size);
[firstPage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
I then want to give this a JPEG format, because when I tried to simply pass it in as
[AddThisSDK shareImage:image withService:@"twitter" title:@"I'm sharing something" description:@"Random description of image"];
It gives me an error
UIImage has its own internal representation of an image, so it's irrelevant whether you load it with jpeg or png data.
The API call you're interested in takes a UIImage as its first parameter, so something along the lines of
[AddThisSDK shareImage:[UIImage imageNamed:@"photo_boom.jpg"]
           withService:@"twitter"
                 title:@"I'm sharing something"
           description:@"Random description of image"];
should work, provided photo_boom.jpg is included in your bundle. If you're loading a previously saved image from a folder, you'll need something like this:
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"photo_boom.jpg"];
UIImage *myImage = [UIImage imageWithContentsOfFile:jpgPath];
[AddThisSDK shareImage:myImage
           withService:@"twitter"
                 title:@"I'm sharing something"
           description:@"Random description of image"];
If that doesn't work, have you tried putting a breakpoint on the AddThisSDK line, and checking the value of image? Type po image on the console.
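Putting the pieces together, a rough sketch might be: render the view, write a JPEG into the Documents directory, and reload it before sharing. The file name and the AddThisSDK call are copied from the question and not verified against the SDK itself:
UIGraphicsBeginImageContext(firstPage.bounds.size);
[firstPage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *renderedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *jpgPath = [documentsDir stringByAppendingPathComponent:@"photo_boom.jpg"];
[UIImageJPEGRepresentation(renderedImage, 0.8) writeToFile:jpgPath atomically:YES];

UIImage *savedImage = [UIImage imageWithContentsOfFile:jpgPath];
[AddThisSDK shareImage:savedImage
           withService:@"twitter"
                 title:@"I'm sharing something"
           description:@"Random description of image"];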

rotate CGImage on disk using CGImageSource/CGImageDestination?

I'm working on an application that needs to take a picture using UIImagePickerController, display a thumbnail of it in the app, and also submit that picture to a server using ASIFormDataRequest (from ASIHTTPRequest).
I want to use setFile:withFileName:andContentType:forKey: from ASIFormDataRequest since in my experience it's faster than trying to submit an image using UIImageJPEGRepresentation and submitting raw NSData. To that end I'm using CGImageDestination and creating an image destination with a url, saving that image to disk, and then uploading that file on disk.
In order to create the thumbnail I'm using CGImageSourceCreateThumbnailAtIndex (see docs) and creating an image source with the path of the file I just saved.
My problem is that no matter what options I pass into the image destination or the thumbnail creation call, my thumbnail always comes out rotated 90 degrees counterclockwise. The uploaded image is also rotated. I've tried explicitly setting the orientation in the options of the image using CGImageDestinationSetProperties but it doesn't seem to take. The only solution I've found is to rotate the image in memory, but I really want to avoid that since doing so doubles the time it takes for the thumbnail+saving operation to finish. Ideally I'd like to be able to rotate the image on disk.
Am I missing something in how I'm using CGImageDestination or CGImageSource? I'll post some code below.
Saving the image:
NSURL *filePath = [NSURL fileURLWithPath:self.imagePath];
CGImageRef imageRef = [self.image CGImage];
CGImageDestinationRef ref = CGImageDestinationCreateWithURL((CFURLRef)filePath, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(ref, imageRef, NULL);
NSDictionary *props = [[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat:1.0], kCGImageDestinationLossyCompressionQuality,
nil] retain];
//Note that setting kCGImagePropertyOrientation didn't work here for me
CGImageDestinationSetProperties(ref, (CFDictionaryRef) props);
CGImageDestinationFinalize(ref);
CFRelease(ref);
Generating the thumbnail:
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)filePath, NULL);
if (!imageSource)
return;
CFDictionaryRef options = (CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:
(id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
(id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
(id)[NSNumber numberWithFloat:THUMBNAIL_SIDE_LENGTH], (id)kCGImageSourceThumbnailMaxPixelSize, nil];
CGImageRef imgRef = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
UIImage *thumb = [UIImage imageWithCGImage:imgRef];
CGImageRelease(imgRef);
CFRelease(imageSource);
And then to upload the image I just use
[request setFile:path withFileName:fileName andContentType:contentType forKey:@"photo"];
where path is the path to the file saved with the code above.
As far as I know and after trying lots of different things, this cannot be done with current public APIs and has to be done in memory.
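If you do end up doing it in memory, one generic sketch (not the original poster's code) is to redraw the UIImage so its orientation gets baked into the pixels before you save it:
- (UIImage *)normalizedImage:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp)
        return image; // pixels already match the orientation, nothing to do

    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
This is exactly the kind of extra in-memory pass the poster was hoping to avoid, but as far as I know there is no way around it with the public ImageIO API.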

Get underlying NSData from UIImage

I can create UIImage from NSData using [UIImage imageWithData:] or [UIImage initWithData:] methods.
I wonder if I can get the NSData back from an existing UIImage?
Something on the line of NSData *myData = [myImage getData];
NSData *imageData = UIImageJPEGRepresentation(image, 0.7); // 0.7 is JPG quality
or
NSData *imageData = UIImagePNGRepresentation(image);
Depending if you want your data in PNG format or JPG format.
When initialising a UIImage object with init(data: originalData), that originalData is converted into raw data in some kind of internal format. This data can be retrieved later with
let rawData = myImage.cgImage?.dataProvider?.data as Data?
However, because rawData is uncompressed, it will be even larger than what UIImagePNGRepresentation produces.
Swift 4.2
let dataPng = image.pngData() // return image as PNG. May return nil if image has no CGImageRef or invalid bitmap format
let dataJpg = image.jpegData(compressionQuality: 1) // return image as JPEG. May return nil if image has no CGImageRef or invalid bitmap format. compression is 0(most)..1(least)
Just because I stumbled upon this and I like Swift :)
Here is the Swift translation of Caroiline's post.
var imageData = UIImagePNGRepresentation(image)
Or
var imageData = UIImageJPEGRepresentation(image, 0.7)
You can expect that a UIImage is an object formatted for display and so won't be using the original data (which is probably in PNG or JPEG format) but more likely a pixel array or some other internal format. In other words, UIImage(data: foo) will not retain foo.
If you just want to use it elsewhere in your program, the original UIImage will do fine (I presume that's not actually the case here)
If you want to serialise, UIImagePNGRepresentation(...) will work but will be oversized if the original was a JPEG; UIImageJPEGRepresentation(...) will often result in slightly oversize data and is slightly lossy if your original was PNG. It should be okay to pick one based on the way the image will be displayed and the format you expect to be provided. If you happen to be using PNG in and want PNG out, you should get a good file size and almost identical data, special PNG chunks aside.
If you want to get an exact copy of the original data (perhaps to save a file after thumbnailing, or to SHA1 it), then you need to retain it separately. You might do something like:
var image: UIImage?
var imageData: NSData {
    didSet {
        image = UIImage(data: imageData as Data)
    }
}
The only solution I found to serialize/unserialize a UIImage (via Data) is by using this solution.
You can then serialize/unserialize regardless of how the UIImage was created by using the extension method on UIImage:
let originalImage: UIImage = ...
let cgData = originalImage.cgImage!.png! // png is the extension method from the linked solution
let image = UIImage(data: cgData)!
Things have changed since the above answer was given; for those still looking because they share CipherCom's concern: iOS 5 added the CIImage property.
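Bear in mind, though, that the CIImage property is only populated when the UIImage was built from a CIImage; a quick sketch (someUIImage is a hypothetical image):
#import <CoreImage/CoreImage.h>
// Returns nil if the image was created from data or a CGImage rather than a CIImage.
CIImage *backingCIImage = someUIImage.CIImage;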

Error saving NSImage as NSData

I am using the following code to save a frame of a movie to my desktop:
NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]] autorelease];
[image addRepresentation:imageRep];
CVBufferRelease(imageBuffer);
NSArray *representations = [image representations];
NSData *bitmapData = [NSBitmapImageRep representationOfImageRepsInArray:representations usingType:NSJPEGFileType properties:nil];
[bitmapData writeToFile:@"/Users/ricky/Desktop/MyImage.jpeg" atomically:YES];
At the second last line of code, I receive the following messages in the console, with no result being saved to the desktop:
<Error>: CGImageDestinationFinalize image destination does not have enough images
CGImageDestinationFinalize failed for output type 'public.jpeg'
The NSImage is still an allocated object for the entire method call, so I'm not sure why I am receiving complaints about an insufficient number of images.
I'd appreciate any help.
Thanks in advance,
Ricky.
I think the source of the problem is that you're passing an array of NSCIImageRep objects to representationOfImageRepsInArray:usingType:properties:, which I believe expects an array of NSBitmapImageRep objects.
What you want to do is create an NSBitmapImageRep from your CIImage. Then you can use that to write to disk. That would be roughly:
CIImage *myImage = [CIImage imageWithCVImageBuffer:imageBuffer];
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCIImage:myImage];
NSData *jpegData = [bitmapRep representationUsingType:NSJPEGFileType properties:nil];
[jpegData writeToFile:@"/Users/ricky/Desktop/MyImage.jpeg" atomically:YES];
Of course, you'd want to handle any error cases and probably pass a properties dictionary to fine-tune the JPEG creation.
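For example, the properties dictionary could carry a JPEG compression factor (0.8 here is an arbitrary value):
NSDictionary *jpegProps = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.8]
                                                      forKey:NSImageCompressionFactor];
NSData *jpegData = [bitmapRep representationUsingType:NSJPEGFileType properties:jpegProps];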
I'm sorry, I don't really know why your code doesn't work, but here's a different way of approaching it (and, I think, more efficient than your CVImageBuffer to CIImage to NSCIImageRep to NSImage to NSData chain, albeit at a slightly lower level):
CVImageBuffer to CGImage
CGImage to jpg file
I don't have ready-made code to do this, but extracting the right pieces from those examples should be straightforward.
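To give a rough idea, here is a sketch of those two steps, assuming the CVImageBuffer holds 32-bit BGRA pixels and using plain C casts and manual memory management as in the question's code:
// CVImageBuffer -> CGImage (assumes a 32-bit BGRA pixel buffer)
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                             colorSpace,
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef cgImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

// CGImage -> jpg file; CGImageDestination comes from ImageIO
// ("public.jpeg" is the same UTI as kUTTypeJPEG)
NSURL *url = [NSURL fileURLWithPath:@"/Users/ricky/Desktop/MyImage.jpeg"];
CGImageDestinationRef dest = CGImageDestinationCreateWithURL((CFURLRef)url, CFSTR("public.jpeg"), 1, NULL);
CGImageDestinationAddImage(dest, cgImage, NULL);
CGImageDestinationFinalize(dest);
CFRelease(dest);
CGImageRelease(cgImage);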