CGImageRef from NSData is null, but UIImage is not? - objective-c

I'm trying to move some image loading into the background. Currently I'm loading UIImages on a background thread, but from what I have read, this is not the suggested way to do it; instead, I should create the CGImageRef in the background and then build the UIImage from it on the main thread.
The problem is that when I try to create a CGImageRef, it's coming back as NULL. Sample code:
NSData *imageData = [NSData dataWithContentsOfFile:coverPath];
if (nil != imageData)
{
    UIImage *uiImage = [UIImage imageWithData:imageData];
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)imageData);
    CGImageRef imageRef = CGImageCreateWithPNGDataProvider(provider, NULL, true, kCGRenderingIntentDefault);
    NSLog(@"Test: %p, %p", imageRef, uiImage);
    CGDataProviderRelease(provider);
}
This logs Test: 0x0, 0x1c566bb0, meaning the imageRef is NULL but the uiImage is not. Any ideas what I'm doing wrong here? It seems like this should be quite simple.

CGImageCreateWithPNGDataProvider()
returns NULL if the provided data is not in PNG format (for example, JPEG or TIFF).
[UIImage imageWithData: imageData]
returns an image for all supported image file formats (PNG, JPEG, TIFF, etc.).
This explains why the first call can fail while the second succeeds.
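If the files are not guaranteed to be PNGs, a format-agnostic route is ImageIO, which decodes any format the system supports. A minimal sketch (the helper name is mine):

#import <ImageIO/ImageIO.h>

// Creates a CGImageRef from data in any supported format (PNG, JPEG, TIFF, ...).
// The caller owns the returned image and must CGImageRelease() it.
static CGImageRef CreateCGImageFromData(NSData *imageData)
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    if (source == NULL) {
        return NULL;
    }
    CGImageRef imageRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
    CFRelease(source);
    return imageRef;
}

You could call this on the background thread and wrap the result in a UIImage on the main thread, which matches the pattern described in the question.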

Related

I am trying to convert UIImage into base64 string Objective c

I tried the solution given here using:
- (NSString *)encodeToBase64String:(UIImage *)image {
    return [UIImagePNGRepresentation(image) base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
}
The encoded string I get is different from the one produced by uploading the same image to a web-based encoding tool. In short, my string does not match the web tool's output, and the web tool cannot decode it back into an image.
My other implementation is:
- (NSString *)base64StringForImage {
    UIImage *originalImage = _image_PreviewImage.image;
    UIGraphicsBeginImageContext(originalImage.size);
    // Note: this draws into a fixed 1000x1000 rect inside a context sized to
    // originalImage.size, so the drawing is clipped or stretched unless the
    // image happens to be 1000x1000.
    [originalImage drawInRect:CGRectMake(0, 0, 1000, 1000)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *imgData = UIImagePNGRepresentation(newImage);
    NSString *base64String = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    return base64String;
}
Both originalImage and newImage contain valid images (I have also been experimenting with reducing the image size). With the NSDataBase64EncodingEndLineWithLineFeed option, the encoded string decodes to only half of the image.
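A hedged guess at the cause: both NSDataBase64Encoding64CharacterLineLength and NSDataBase64EncodingEndLineWithLineFeed insert line breaks into the output, and many web decoders do not tolerate them. Encoding with no options produces a single unbroken string:

// Sketch: base64 with no line-break options, matching what most web tools
// emit and accept.
- (NSString *)encodeToBase64String:(UIImage *)image {
    return [UIImagePNGRepresentation(image) base64EncodedStringWithOptions:0];
}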

OS X - How to save NSImage or NSBitmapImageRep to PNG file without alpha channel?

I'm building an OS X app that needs to save an image file to disk.
I'm currently using NSBitmapImageRep to represent the image in my code, and while saving the image to disk with the representationUsingType:properties: method, I want to control the image's alpha channel, but the properties dictionary does not seem to support this.
So, I've tried to create a no-alpha bitmap representation, but according to many SO questions, the 3-channel/24-bit combination is not supported. Well, what should I do then?
Big thanks!
First off, I would try just making sure you create your NSBitmapImageRep with
-initWithBitmapDataPlanes:... hasAlpha:NO ...
then write it out and see whether the result lacks alpha; one would kind of hope so.
If you're trying to write out an image that has alpha without writing the alpha, just copy it into a non-alpha image first, and write that out.
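A minimal sketch of that first suggestion, assuming an 8-bit RGB source (width and height are placeholders):

NSInteger width = 256, height = 256; // placeholder dimensions
// Three samples per pixel, packed RGB, no alpha. Passing NULL for the
// planes lets the rep allocate and own its pixel buffer.
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes:NULL
                  pixelsWide:width
                  pixelsHigh:height
               bitsPerSample:8
             samplesPerPixel:3
                    hasAlpha:NO
                    isPlanar:NO
              colorSpaceName:NSDeviceRGBColorSpace
                 bytesPerRow:0
                bitsPerPixel:0];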
Alternatively, here is a variant that redraws the image into a no-alpha bitmap context and saves the result as PNG:

NSURL *url = [NSURL fileURLWithPath:name];
NSImage *srcImage = [[NSImage alloc] initWithContentsOfURL:url];
NSLog(@"URL: %@", url);
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)[srcImage TIFFRepresentation], NULL);
CGImageRef imageRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
CGRect rect = CGRectMake(0.f, 0.f, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef));
// kCGImageAlphaNoneSkipLast drops the alpha channel when drawing.
CGContextRef bitmapContext = CGBitmapContextCreate(NULL,
                                                   rect.size.width,
                                                   rect.size.height,
                                                   CGImageGetBitsPerComponent(imageRef),
                                                   CGImageGetBytesPerRow(imageRef),
                                                   CGImageGetColorSpace(imageRef),
                                                   kCGImageAlphaNoneSkipLast | kCGBitmapByteOrder32Little);
CGContextDrawImage(bitmapContext, rect, imageRef);
CGImageRef decompressedImageRef = CGBitmapContextCreateImage(bitmapContext);
NSImage *finalImage = [[NSImage alloc] initWithCGImage:decompressedImageRef size:NSZeroSize];
NSData *imageData = [finalImage TIFFRepresentation];
NSBitmapImageRep *imageRep = [NSBitmapImageRep imageRepWithData:imageData];
NSDictionary *imageProps = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.9] forKey:NSImageCompressionFactor];
imageData = [imageRep representationUsingType:NSPNGFileType properties:imageProps];
[imageData writeToFile:name atomically:NO];
CGImageRelease(decompressedImageRef);
CGImageRelease(imageRef);
CGContextRelease(bitmapContext);
CFRelease(source);
ref:
https://github.com/bpolat/Alpha-Channel-Remover

Image size anomaly

I have an image in the form of an NSURL as input. I convert this URL to an NSImage and then to NSData, from which I can get a CGImageRef. This imageRef lets me extract raw information from the image, such as the height, width, bytesPerRow, etc.
Here's the code that I used:
NSString *urlName = [url path];
NSImage *image = [[NSImage alloc] initWithContentsOfFile:urlName];
NSData *imageData = [image TIFFRepresentation];
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL); // CFRelease(source) when done
CGImageRef imageRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
NSUInteger numberOfBitsPerPixel = CGImageGetBitsPerPixel(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
...
...
Now I checked the size of the image using:
int sz = [imageData length];
which is different from int sz' = bytesPerRow * height.
I cannot understand why there is such a difference; sz is actually half of sz'.
Am I making a mistake while extracting this information? My guess is that some decompression happens during the conversion of the image to NSData. If so, what should I use to get reliable data?
I am new to the world of image processing in Objective-C, so please bear with me!
P.S. I checked the size of the input file behind the NSURL, and it is the same as sz.
Try This:
Instead of
NSData *imageData = [image TIFFRepresentation];
use this:
NSData *imageData = [image TIFFRepresentationUsingCompression:NSTIFFCompressionLZW factor:0];
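A hedged note on why the numbers can differ: [imageData length] measures the encoded TIFF container, while bytesPerRow * height measures the decompressed pixel buffer, so the two need not match. If what you want is the raw pixels themselves, copying them out of the CGImage is one reliable route (a sketch):

// Copy the decoded pixel buffer out of the image; its length should equal
// bytesPerRow * height.
CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(imageRef));
NSLog(@"raw pixel bytes: %ld, encoded container bytes: %lu",
      (long)CFDataGetLength(pixelData), (unsigned long)[imageData length]);
CFRelease(pixelData);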

Loading an image from a URL but displaying it progressively

I have a screen that will load around 5 images, but they are huge. Right now I use an NSURLRequest and a connectionDidFinishLoading: callback to tell me when each image has loaded.
The problem is that the images pop up one by one, each appearing only once fully downloaded. Is there a way to display each image progressively while it loads?
Thanks
The guts of what you need are available as the CGImageSource functions.
First, use an asynchronous NSURLConnection to fetch the data, appending each chunk to an NSMutableData object as it arrives, so the data object grows until the download finishes.
You also create an incremental (progressive) image source:
CGImageSourceRef imageSourceRef = CGImageSourceCreateIncremental(dict);
You will find lots of examples here and on Google of how to set up the required options dictionary (it can also be NULL).
Then, as the data arrives, you pass the TOTAL accumulated data object into this function:
CGImageSourceUpdateData(imageSourceRef, (__bridge CFDataRef)data, NO); // NO means not finished
You can then ask the image source for an image, which will be partial while the image is downloading. From the CGImage you can create a UIImage.
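That step might look like this (a sketch; imageSourceRef is the incremental source created above):

// Ask the incremental source for whatever it can decode so far.
CGImageRef partialImage = CGImageSourceCreateImageAtIndex(imageSourceRef, 0, NULL);
if (partialImage) {
    UIImage *image = [UIImage imageWithCGImage:partialImage];
    CGImageRelease(partialImage);
    // Hand `image` to your UIImageView on the main thread.
}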
When you get the final data, you update the image source one last time:
CGImageSourceUpdateData(imageSourceRef, (__bridge CFDataRef)data, YES);
You then use the image source to get the final image, and you're done.
As for displaying it while loading: I don't think UIImageView can display a UIImage built from incomplete data. I would go for AsyncImageView, which takes on all the burden of loading images asynchronously. A UIActivityIndicatorView is already added to it, so it is more user friendly.
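A hypothetical usage sketch; the imageURL property is an assumption on my part, so check the library's README for the actual API:

#import "AsyncImageView.h"

// Assumed API: setting imageURL starts the asynchronous load and shows
// the built-in activity indicator until the image arrives.
AsyncImageView *imageView = [[AsyncImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 240)];
imageView.imageURL = [NSURL URLWithString:@"http://example.com/huge-image.jpg"];
[self.view addSubview:imageView];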
Use blocks and GCD's dispatch_async method.
Look at this example:
//communityDetailViewController.h
@interface communityDetailViewController : UIViewController {
    UIImageView *imgDisplay;
    UIActivityIndicatorView *activity;
    // the dispatch queue used to load images
    dispatch_queue_t queue;
}
@end

//communityDetailViewController.m
- (void)loadImage
{
    [activity startAnimating];
    NSString *url = @"URL of the image";
    if (!queue) {
        queue = dispatch_queue_create("image_queue", NULL);
    }
    dispatch_async(queue, ^{
        // dataWithContentsOfURL: blocks, which is why it runs on the background queue
        NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:url]];
        UIImage *anImage = [UIImage imageWithData:data];
        dispatch_async(dispatch_get_main_queue(), ^{
            [activity stopAnimating];
            activity.hidden = YES;
            if (anImage != nil) {
                [imgDisplay setImage:anImage];
            } else {
                [imgDisplay setImage:[UIImage imageNamed:@"no_image_available.png"]];
            }
        });
    });
}
You can subclass UIImageView and use this.
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    imageData = [NSMutableData data];
    imageSize = [response expectedContentLength];
    imageSource = CGImageSourceCreateIncremental(NULL);
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    [imageData appendData:data];
    // The final flag is true only once the full payload has arrived.
    CGImageSourceUpdateData(imageSource, (__bridge CFDataRef)imageData, [imageData length] == imageSize);
    CGImageRef cgImage = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
    if (cgImage) {
        UIImage *img = [[UIImage alloc] initWithCGImage:cgImage scale:1.0f orientation:UIImageOrientationUp];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.image = img;
        });
        CGImageRelease(cgImage);
    }
}
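One loose end, assuming the ivars used above (imageData, imageSize, imageSource): the incremental image source should be released once the connection finishes. A minimal sketch:

- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    // Release the incremental source created in didReceiveResponse:.
    if (imageSource) {
        CFRelease(imageSource);
        imageSource = NULL;
    }
}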

How to compose an image from GUI elements on iOS?

I need to form an image by composing some visual elements and then save it to disk. The question is: how do I "screenshot" a certain area of a view? Possibly a view that is not visible, so the procedure can run unnoticed?
This code snippet shows how to render a view to a UIImage:
UIGraphicsBeginImageContext(myView.bounds.size);
[myView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
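One caveat worth adding: plain UIGraphicsBeginImageContext renders at 1x, so the capture can look soft on Retina screens. A variant sketch using the options call, where a scale of 0.0 means the main screen's scale:

UIGraphicsBeginImageContextWithOptions(myView.bounds.size, NO, 0.0); // 0.0 = current screen scale
[myView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();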
And this snippet shows how to save a UIImage as a JPEG or a PNG:
NSString *pngPath = [NSHomeDirectory()
    stringByAppendingPathComponent:@"Documents/Test.png"];
NSString *jpgPath = [NSHomeDirectory()
    stringByAppendingPathComponent:@"Documents/Test.jpg"];
[UIImageJPEGRepresentation(viewImage, 1.0) writeToFile:jpgPath atomically:YES];
[UIImagePNGRepresentation(viewImage) writeToFile:pngPath atomically:YES];