Get underlying NSData from UIImage - cocoa-touch

I can create a UIImage from NSData using the [UIImage imageWithData:] or [[UIImage alloc] initWithData:] methods.
I wonder if I can get the NSData back from an existing UIImage?
Something along the lines of NSData *myData = [myImage getData];

NSData *imageData = UIImageJPEGRepresentation(image, 0.7); // 0.7 is JPG quality
or
NSData *imageData = UIImagePNGRepresentation(image);
Choose whichever you need, depending on whether you want your data in PNG or JPEG format.

When you initialise a UIImage object with init(data: originalData), that originalData is decoded into an internal bitmap representation. The decoded data can be retrieved later with
let rawData = myImage.cgImage?.dataProvider?.data as Data?
However, because rawData is uncompressed pixel data, it will be even larger than the output of UIImagePNGRepresentation.
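The size difference is easy to observe directly. A minimal sketch (my illustration, not part of the answer; "example" is a hypothetical asset name):

```swift
import UIKit

// Compare the decoded bitmap backing a UIImage with its PNG re-encoding.
if let image = UIImage(named: "example") {
    // Raw decoded pixels: roughly width * height * 4 bytes for 32-bit RGBA.
    let rawBytes = (image.cgImage?.dataProvider?.data as Data?)?.count ?? 0
    // Compressed PNG re-encoding, usually much smaller than the raw bitmap.
    let pngBytes = UIImagePNGRepresentation(image)?.count ?? 0
    print("raw: \(rawBytes) bytes, png: \(pngBytes) bytes")
}
```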

Swift 4.2
let dataPng = image.pngData() // returns the image as PNG data; may return nil if the image has no CGImage or an invalid bitmap format
let dataJpg = image.jpegData(compressionQuality: 1) // returns the image as JPEG data; may return nil for the same reasons. compressionQuality ranges from 0 (most compression) to 1 (least)

Just because I stumbled upon this and I like Swift :)
Here is the Swift translation of Caroline's post.
var imageData = UIImagePNGRepresentation(image)
Or
var imageData = UIImageJPEGRepresentation(image, 0.7)

You can expect that a UIImage is an object formatted for display and so won't be using the original data (which is probably in PNG or JPEG format) but more likely a pixel array or some other internal format. In other words, UIImage(data: foo) will not retain foo.
If you just want to use it elsewhere in your program, the original UIImage will do fine (I presume that's not actually the case here)
If you want to serialise, UIImagePNGRepresentation(...) will work but will be oversized if the original was a JPEG; UIImageJPEGRepresentation(...) will often result in slightly oversize data and is slightly lossy if your original was PNG. It should be okay to pick one based on the way the image will be displayed and the format you expect to be provided. If you happen to be using PNG in and want PNG out, you should get a good file size and almost identical data, special PNG chunks aside.
If you want to get an exact copy of the original data (perhaps to save a file after thumbnailing, or to SHA1 it), then you need to retain it separately. You might do something like:
var image: UIImage?
var imageData: Data {
    didSet {
        // UIImage(data:) is failable, so `image` is declared optional
        image = UIImage(data: imageData)
    }
}

The only way I found to serialize/deserialize a UIImage (via Data) is by using this solution.
You can then serialize/deserialize regardless of how the UIImage was created, using the extension method on UIImage:
let originalImage: UIImage = ...
let cgData = originalImage.cgImage!.png! // `png` comes from the linked extension
let image = UIImage(data: cgData)!

Things have changed since the above answer was given, for those still looking because they share CipherCom's concern: iOS 5 has added the CIImage property.
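One caveat worth noting (my addition, not part of that answer): UIImage's `ciImage` property is only non-nil when the image was created from a CIImage; for an image decoded from data or a file it returns nil, so you may need to wrap the underlying CGImage yourself:

```swift
import UIKit
import CoreImage

// Sketch: fall back to wrapping the CGImage when `ciImage` is nil.
func makeCIImage(from uiImage: UIImage) -> CIImage? {
    if let ci = uiImage.ciImage {
        return ci                             // image was created from a CIImage
    }
    return uiImage.cgImage.map(CIImage.init)  // wrap the decoded bitmap instead
}
```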

Related

Objective-C/Cocoa to JXA

I need to convert a PNG file to Base64 data so that I can add it to a JSON object, using JXA (JavaScript for Automation).
JXA is limited compared to regular JavaScript so I can't immediately use functions from FileReader, etc.
From what I've read, there is no way to do this that I know of without using Objective-C/Cocoa (which I only started reading about today for this task).
I found the following code in another post:
NSArray *keys = [NSArray arrayWithObject:@"NSImageCompressionFactor"];
NSArray *objects = [NSArray arrayWithObject:@"1.0"];
NSDictionary *dictionary = [NSDictionary dictionaryWithObjects:objects forKeys:keys];
NSImage *image = [[NSImage alloc] initWithContentsOfFile:[imageField stringValue]];
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithData:[image TIFFRepresentation]];
NSData *tiff_data = [imageRep representationUsingType:NSPNGFileType properties:dictionary];
NSString *base64 = [tiff_data encodeBase64WithNewlines:NO];
I believe it is pertinent to what I am trying to do. Does anybody know how I can bridge this method to use it in JXA?
I have been reading over the JXA Cookbook's section on Syntax for calling ObjC functions, but I am having difficulty understanding it... this is all that I have come up with so far:
var desktopString = app.pathTo("desktop").toString()
var file = `${desktopString}/test.png`
ObjC.import("Cocoa");
var image = $.NSImage.alloc.initWithContentsOfFile(file)
var imageRep = $.NSBitmapImageRep.alloc.initWithData(image)
But I don't know how to proceed. I am thrown off by:
The whole initial NSArray/NSDictionary part
TIFFRepresentation (Do I need it? Where do I put it?)
NSData *tiff_data = [imageRep representationUsingType:NSPNGFileType properties:dictionary]; (There's no alloc! Why is dictionary needed?)
NSString *base64 = [tiff_data encodeBase64WithNewlines:NO]; (Again, no alloc.)
I would be very appreciative if somebody could point me in the right direction / give me a few pointers on how I can accomplish what I am trying to do.
Thank you in advance!
Converting an image file into an NSImage representation and then onto a base-64 string is a lot of work, and was only applicable to the answer you sourced because the OP there was coming from a starting point of having NSImage class data. As you stated that you have a .png file, the route is much simpler:
ObjC.import('Foundation');
function fileToBase64(filepath) {
    const standardizedPath = $.NSString.stringWithString(filepath)
        .stringByStandardizingPath;
    const base64String = $.NSData.dataWithContentsOfFile(standardizedPath)
        .base64EncodedStringWithOptions(0);
    return ObjC.unwrap(base64String);
}

(() => {
    return fileToBase64('~/Desktop/test.png');
})();
For reference, this returns identical output to the following bash shell command:
base64 --input ~/Desktop/test.png
PS. For the benefit of learning, despite what the JXA Cookbook teaches, try not to import the entire Cocoa framework into your scripts; import only the frameworks specific to the Objective-C classes you're using.

How can I extract image information dumped in NSData

I have as input the dump of an image in an NSData object. Now, I want to extract relevant information about the image from this object, like the number of pixels, the number of bits per pixel, etc.
Can anyone tell me how to extract this info from the NSData object dump?
P.S.: I have gone through this documentation of the NSData class, but could not isolate out the relevant methods.
So the easiest way is to actually build a UIImage object from the NSData, then extract the info from the UIImage:
UIImage* image = [UIImage imageWithData:yourData];
NSLog(@"Image is %.0fx%.0f", image.size.width, image.size.height); // size components are CGFloat, so use a float format
If you are only interested in the properties of the image and don't want to actually build its full representation, take a look at CGImageSource:
#import <ImageIO/ImageIO.h>
CGImageSourceRef imgSrc = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
size_t nbImages = CGImageSourceGetCount(imgSrc);
for (size_t idx = 0; idx < nbImages; ++idx)
{
    // __bridge_transfer hands ownership of the copied dictionary to ARC (it's a Copy function)
    NSDictionary* props = (__bridge_transfer NSDictionary*)CGImageSourceCopyPropertiesAtIndex(imgSrc, idx, NULL);
    NSLog(@"properties for image %zu in imageSource: %@", idx, props);
}
CFRelease(imgSrc);
[EDIT] For this to work, obviously add ImageIO.framework to your "Link Binary With Libraries" build phase.
Convert the data to UIImage, then take a look at this post.
UIImage *image = [UIImage imageWithData:data];

How do I create a valid CGImageSourceRef from an ALAssetRepresentation?

I'm trying to use CGImageSourceCreateThumbnailAtIndex to efficiently create a resized version of an image. I have some existing code that does this with images from disk, and now I'm trying to use an image that comes from ALAssetsLibrary.
Here's my code:
ALAsset *asset;
ALAssetRepresentation *representation = [asset defaultRepresentation];
CGImageRef imageRef = [representation fullResolutionImage];
CGDataProviderRef provider = CGImageGetDataProvider(imageRef);
CGImageSourceRef sourceRef = CGImageSourceCreateWithDataProvider(provider, NULL);
NSDictionary *resizeOptions = @{
    (id)kCGImageSourceCreateThumbnailWithTransform : @YES,
    (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
    (id)kCGImageSourceThumbnailMaxPixelSize : @(2100)
};
CGImageRef resizedImage = CGImageSourceCreateThumbnailAtIndex(sourceRef, 0, (__bridge CFDictionaryRef)resizeOptions);
The problem is that resizedImage is null, and CGImageSourceGetCount(sourceRef) returns 0. The data provider does have quite a bit of data in it, though, and the data does appear to be valid image data. The ALAsset comes from an iPhone 4S camera roll.
What am I missing? Why does CGImageSourceCreateWithDataProvider() create an image source with 0 images?
CGImageSource is for deserializing serialized images, such as JPEGs, PNGs, and whatnot.
CGImageGetDataProvider returns (the provider of) the raw pixel data of the image. It does not return serialized bytes in some external format. CGImageSource has no way to know what pixel format (color space, bits-per-component, alpha layout, etc.) any given raw pixel data is in.
You could try getting the URL of the asset rep and giving that to CGImageSourceCreateWithURL. If that doesn't work (e.g., not a file URL), you'll have to run the image through a CGImageDestination and create a CGImageSource with wherever you put the output.
(The one other thing to try would be to see whether the rep's filename is actually a full path, the way Cocoa often misuses the term. But you probably shouldn't count on that.)
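The "run the image through a CGImageDestination" route might be sketched like this (my sketch, untested; kUTTypeJPEG assumes the MobileCoreServices constants of that era, and `cgImage` stands in for the result of fullResolutionImage):

```swift
import UIKit
import ImageIO
import MobileCoreServices

// Serialize a raw CGImage to JPEG bytes with CGImageDestination, then
// reopen those bytes with CGImageSource so the thumbnail APIs can work.
func imageSource(from cgImage: CGImage) -> CGImageSource? {
    let data = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(data as CFMutableData,
                                                      kUTTypeJPEG, 1, nil)
    else { return nil }
    CGImageDestinationAddImage(dest, cgImage, nil)
    guard CGImageDestinationFinalize(dest) else { return nil }
    return CGImageSourceCreateWithData(data as CFData, nil)
}
```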
One thing you might try is the asset rep's CGImageWithOptions: method.
The documentation claims that:
This method returns the biggest, best representation available, unadjusted in any way.
But it says that about fullResolutionImage, too, and I'm not sure why this class would have both methods if they both do the same thing. I wonder if it's a copy-and-paste error.
Try CGImageWithOptions: with a bunch of thumbnail-creating options and see what happens.
Option #3 would be the rep's fullScreenImage. Depending on what sort of “thumbnail” you need, it may be cheaper and/or simpler to just use this, which will be no bigger than (approximately) the size of the device's screen.
This can also help...
ALAssetRepresentation* rep = [asset defaultRepresentation];
NSDictionary* options = [[NSDictionary alloc] initWithObjectsAndKeys:
                         (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                         (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
                         (id)[NSNumber numberWithDouble:400], (id)kCGImageSourceThumbnailMaxPixelSize,
                         nil];
CGImageRef image = [rep CGImageWithOptions:options];

iOS: Load PNG from disk, add PixelPerMeter properties to it, and save it to the Photo Album

Apparently with CGImages you can add kCGImagePropertyPNGXPixelsPerMeter and kCGImagePropertyPNGYPixelsPerMeter properties to the image and they will get added to the pHYs chunk of the png. But UIImageWriteToSavedPhotosAlbum doesn't directly accept a CGImage.
I've been trying to load the image as a UIImage, then create a CGImage from that, add the properties, and then create a new UIImage from that data and saving that to the photo album.
An image is getting written to the photo album, but without the PPM settings. I can verify this in the preview app on MacOS, or using ImageMagick's identify.
ObjC isn't my domain, so here's what I've cobbled together from similar stackoverflow questions.
UIImage *uiImage = [UIImage imageWithContentsOfFile:path];
NSDictionary* properties = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:5905], (NSString *)kCGImagePropertyPNGXPixelsPerMeter, /* this doesn't work */
        [NSNumber numberWithInt:5905], (NSString *)kCGImagePropertyPNGYPixelsPerMeter, /* this doesn't work */
        @"This works when saved to disk.", (NSString *)kCGImagePropertyPNGDescription,
        nil], (NSString *)kCGImagePropertyPNGDictionary,
    nil];
NSMutableData* imageData = [NSMutableData data];
CGImageDestinationRef imageDest = CGImageDestinationCreateWithData((CFMutableDataRef) imageData, kUTTypePNG, 1, NULL);
CGImageDestinationAddImage(imageDest, uiImage.CGImage, (CFDictionaryRef) properties);
CGImageDestinationSetProperties(imageDest, (CFDictionaryRef) properties); /* I'm setting the properties twice because I was going crazy */
CGImageDestinationFinalize(imageDest);
UIImageWriteToSavedPhotosAlbum( [UIImage imageWithData:imageData], nil, NULL, NULL );
// This is a test to see if the data is getting stored at all.
CGImageDestinationRef diskDest = CGImageDestinationCreateWithURL(
    (CFURLRef)[NSURL fileURLWithPath:[NSString stringWithFormat:@"%@.dpi.png", path]], kUTTypePNG, 1, NULL);
CGImageDestinationAddImage(diskDest, uiImage.CGImage, (CFDictionaryRef) properties);
CGImageDestinationSetProperties(diskDest, (CFDictionaryRef) properties);
CGImageDestinationFinalize(diskDest);
Updated: Revised and simplified the code a bit. Still saves image to photo album and disk, but the pixels per meter value is not saving. The description block gets saved to disk, but appears to get stripped when written to the photo album.
Update 2: By saving the file as a jpeg and adding TIFF X/Y Resolution tags to it I've been able to get the version saved to disk to store a PPI value other than 72. This information is still stripped off when saving to the photo album.
Comparing a photo taken via iOS Camera app, one taken via Instagram, and one saved by my code, I noticed that the Camera app version has a ton of TIFF/EXIF tags, and the Instagram and my one have the same limited set of tags. This leads me to the conclusion that iOS strips these tags intentionally. A definitive answer would help me sleep at night though.
The answer is don't use UIImageWriteToSavedPhotosAlbum.
There's an ALAssetsLibrary method
[writeImageDataToSavedPhotosAlbum:metadata:completionBlock:]
The metadata is a dictionary in the same format as the one passed to CGImageDestinationSetProperties. PNG and JFIF density settings may be broken, I'm using kCGImagePropertyTIFFXResolution.
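A sketch of that call as bridged to Swift (the ALAssetsLibrary API is long deprecated; `imageData` is a placeholder for your already-encoded bytes, and 150 dpi is just an example value):

```swift
import AssetsLibrary
import ImageIO

let imageData = Data() // placeholder: your encoded PNG/JPEG bytes

// TIFF X/Y resolution entries, per the answer's workaround for the
// possibly-broken PNG/JFIF density settings.
let metadata: [AnyHashable: Any] = [
    kCGImagePropertyTIFFDictionary as String: [
        kCGImagePropertyTIFFXResolution as String: 150, // dpi, example value
        kCGImagePropertyTIFFYResolution as String: 150,
    ]
]

ALAssetsLibrary().writeImageData(toSavedPhotosAlbum: imageData,
                                 metadata: metadata) { assetURL, error in
    // assetURL is non-nil on success; error describes any failure
}
```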

rotate CGImage on disk using CGImageSource/CGImageDestination?

I'm working on an application that needs to take a picture using UIImagePickerController, display a thumbnail of it in the app, and also submit that picture to a server using ASIFormDataRequest (from ASIHTTPRequest).
I want to use setFile:withFileName:andContentType:forKey: from ASIFormDataRequest since in my experience it's faster than trying to submit an image using UIImageJPEGRepresentation and submitting raw NSData. To that end I'm using CGImageDestination and creating an image destination with a url, saving that image to disk, and then uploading that file on disk.
In order to create the thumbnail I'm using CGImageSourceCreateThumbnailAtIndex (see docs) and creating an image source with the path of the file I just saved.
My problem is that no matter what options I pass into the image destination or the thumbnail creation call, my thumbnail always comes out rotated 90 degrees counterclockwise. The uploaded image is also rotated. I've tried explicitly setting the orientation in the options of the image using CGImageDestinationSetProperties but it doesn't seem to take. The only solution I've found is to rotate the image in memory, but I really want to avoid that since doing so doubles the time it takes for the thumbnail+saving operation to finish. Ideally I'd like to be able to rotate the image on disk.
Am I missing something in how I'm using CGImageDestination or CGImageSource? I'll post some code below.
Saving the image:
NSURL *filePath = [NSURL fileURLWithPath:self.imagePath];
CGImageRef imageRef = [self.image CGImage];
CGImageDestinationRef ref = CGImageDestinationCreateWithURL((CFURLRef)filePath, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(ref, imageRef, NULL);
NSDictionary *props = [[NSDictionary dictionaryWithObjectsAndKeys:
                       [NSNumber numberWithFloat:1.0], kCGImageDestinationLossyCompressionQuality,
                       nil] retain];
//Note that setting kCGImagePropertyOrientation didn't work here for me
CGImageDestinationSetProperties(ref, (CFDictionaryRef) props);
CGImageDestinationFinalize(ref);
CFRelease(ref);
Generating the thumbnail
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)filePath, NULL);
if (!imageSource)
return;
CFDictionaryRef options = (CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:
                          (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                          (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                          (id)[NSNumber numberWithFloat:THUMBNAIL_SIDE_LENGTH], (id)kCGImageSourceThumbnailMaxPixelSize, nil];
CGImageRef imgRef = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
UIImage *thumb = [UIImage imageWithCGImage:imgRef];
CGImageRelease(imgRef);
CFRelease(imageSource);
And then to upload the image I just use
[request setFile:path withFileName:fileName andContentType:contentType forKey:@"photo"];
where path is the path to the file saved with the code above.
As far as I know, and after trying lots of different things, this cannot be done with the current public APIs; it has to be done in memory.
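For completeness, the usual in-memory fallback (a common pattern, not taken from the answer above) is to redraw the image so the orientation is baked into the pixels before saving or uploading:

```swift
import UIKit

// Redraw the image honoring its imageOrientation, which bakes the
// rotation into the pixel data; a no-op when it is already upright.
func normalizedImage(_ image: UIImage) -> UIImage {
    guard image.imageOrientation != .up else { return image }
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: image.size))
    return UIGraphicsGetImageFromCurrentImageContext() ?? image
}
```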