Objective-C imageWithCGImage memory leak

I want to save all my photos from the assets library to a folder. I'm doing this in a loop:
ALAssetRepresentation *representation = [asset defaultRepresentation];
CGImageRef imageRef = [representation fullResolutionImage];
ALAssetOrientation orientation = [representation orientation];
UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:(UIImageOrientation)orientation];
CGFloat compressionQuality = 1.0;
NSData *imageData = [NSData dataWithData:UIImageJPEGRepresentation(image, compressionQuality)];
[imageData writeToFile:path atomically:YES];
CGImageRelease(imageRef);
I have Automatic Reference Counting enabled, and this code runs inside an autorelease pool. It leaks CGImageRef objects. But if I call
CGImageRelease(imageRef);
CGImageRelease(imageRef);
twice like that, there is no memory leak. Why? Can anybody help me?

This is one unbelievable bug in iOS. Apparently, when you create a UIImage using the imageWithCGImage: method, it retains the original CGImageRef, which never gets released even if you release the UIImage itself (e.g. set it to nil under ARC)! So you have to release it explicitly, like this:
UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:(UIImageOrientation)orientation];
CGImageRelease(imageRef);
...
CGImageRelease(image.CGImage);
image = nil; // once you are done with it
This cost me a full day of digging around until I came across this question, which actually contained the answer. Whom at Apple can I send the bill to for all the time I spent debugging this unspeakable bug?
CORRECTION: This was not an iOS bug; this was my own stupid mistake. At some point I had "hijacked" the dealloc method of UIImage through a private category to do some debugging and forgot about it. This is a WRONG thing to do, since the class's own dealloc never gets called in that case. So the end result is expected: UIImage didn't have a chance to do all the housekeeping it's supposed to do when being deallocated. NEVER EVER override dealloc through a private category.
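For anyone curious what that mistake looks like, here is a sketch of the pattern (the category name is hypothetical). A category method with the same selector silently replaces the class's own implementation, so UIImage's real dealloc body, where it does its cleanup, never runs:
@implementation UIImage (LeakDebugging) // hypothetical debugging category
- (void)dealloc {
    NSLog(@"UIImage %p deallocated", self);
    // Under ARC, only the superclass dealloc is chained from here;
    // UIImage's original dealloc implementation is bypassed entirely.
}
@end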

Related

CGImageRef not init/alloc correctly

I am currently having problems with CGImageRef.
Whenever I create a CGImageRef and look at it in the Xcode debugger view, it is nil.
Here's the code:
- (void)mouseMoved:(NSEvent *)theEvent {
    if (self.shoulddrag) {
        NSPoint event_location = [theEvent locationInWindow]; // direct from the docs
        NSPoint local_point = [self convertPoint:event_location fromView:nil]; // direct from the docs
        CGImageRef theImage = (__bridge CGImageRef)(self.image);
        CGImageRef theClippedImage = CGImageCreateWithImageInRect(theImage, CGRectMake(local_point.x, local_point.y, 1, 1));
        NSImage *image = [[NSImage alloc] initWithCGImage:theClippedImage size:NSZeroSize];
        self.pixelView.image = image;
        CGImageRelease(theClippedImage);
    }
}
Everything else seems to be working, though. I can't understand it. Any help would be appreciated.
Note: self.pixelView is an NSImageView instance that has not been overridden in any way.
Very likely local_point is not inside of the image. You've converted the point from the window to the view coordinates, but that may not be equivalent to the image coordinates. Test this to see if the lower-left corner of your image results in local_point being (0,0).
It's not clear how your view is laid out, but I suspect that what you want to do is subtract the origin of whatever region (possibly a subview) the user is interacting with relative to self.
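As a sketch of that adjustment, assuming the view exposes the image's drawing rectangle through a hypothetical imageFrame property:
NSPoint local_point = [self convertPoint:event_location fromView:nil];
NSRect imageFrame = self.imageFrame; // hypothetical: wherever the image is drawn
NSPoint image_point = NSMakePoint(local_point.x - NSMinX(imageFrame),
                                  local_point.y - NSMinY(imageFrame));
// CGImageCreateWithImageInRect returns NULL when the rect falls outside the
// image's bounds, so bounds-check image_point before building the 1x1 rect.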
Alright, I figured it out.
What I was using to create the CGImageRef was:
CGImageRef theImage = (__bridge CGImageRef)(self.image);
Apparently what I should have used is:
CGImageSourceRef theImage = CGImageSourceCreateWithData((__bridge CFDataRef)[self.image TIFFRepresentation], NULL);
I guess my problem was that for some reason I thought NSImage and CGImageRef were toll-free bridged.
Apparently, I was wrong.
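Note that a CGImageSourceRef is still not a CGImageRef: you would next call CGImageSourceCreateImageAtIndex and CFRelease both objects when done. On OS X 10.6 and later, NSImage can also hand you a CGImage directly, skipping the TIFF round-trip; a minimal sketch:
NSRect proposedRect = NSMakeRect(0, 0, self.image.size.width, self.image.size.height);
CGImageRef theImage = [self.image CGImageForProposedRect:&proposedRect
                                                 context:nil
                                                   hints:nil];
// Per the get rule, the caller does not own this CGImageRef, so it must
// not be passed to CGImageRelease.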

ALAssetRepresentation fullResolutionImage never returns

I am writing a multithreaded application which needs to upload photos from the ALAssetsLibrary en masse in the background. So I have an NSOperation subclass which finds the appropriate ALAsset via the asset's URL and adds the image to an upload queue.
In the upload queue for the current ALAsset, I need to get the metadata from the image, but I've encountered a problem: neither the -metadata nor the -fullResolutionImage method ever returns when called on the ALAssetRepresentation of the ALAsset. They simply hang there indefinitely. I tried printing the value of each of these methods in LLDB, but it hung up the debugger, and I ended up killing Xcode, signal 9 style. These methods are being called on a background queue.
I am testing these on an iPad 2. This is the method in which the ALAsset is added to the upload queue when it is found in the success block of -assetForURL:resultBlock:failureBlock:
- (void)addMediaToUploadQueue:(ALAsset *)media {
    @autoreleasepool {
        ALAssetRepresentation *defaultRepresentation = [media defaultRepresentation];
        CGImageRef fullResolutionImage = [defaultRepresentation fullResolutionImage];
        // Return if the user is trying to upload an image which has already been uploaded
        CGFloat scale = [defaultRepresentation scale];
        UIImageOrientation orientation = [defaultRepresentation orientation];
        UIImage *i = [UIImage imageWithCGImage:fullResolutionImage scale:scale orientation:orientation];
        if (![self isImageUnique:i]) return;
        NSDictionary *imageDictionary = [self dictionaryForAsset:media withImage:i];
        dispatch_async(self.background_queue, ^{
            NSManagedObjectContext *ctx = [APPDELEGATE createManagedObjectContextForThread];
            [ctx setUndoManager:nil];
            [ctx performBlock:^{
                ImageEntity *newImage = [NSEntityDescription insertNewObjectForEntityForName:@"ImageEntity"
                                                                      inManagedObjectContext:ctx];
                [newImage updateWithDictionary:imageDictionary
                        inManagedObjectContext:ctx];
                [ctx save:nil];
                [APPDELEGATE saveContext];
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.fetchedResultsController performFetch:nil];
                });
                if (!currentlyUploading) {
                    currentlyUploading = YES;
                    [self uploadImage:newImage];
                }
            }];
        });
    }
}
I had a similar problem and was tearing my hair out trying to figure it out.
It turns out that while I thought I had set up a singleton for ALAssetsLibrary, my code was not calling it properly, and some ALAssets were returning an empty fullResolutionImage.
In all of my NSLogs, I must have missed the most important message from Xcode:
"invalid attempt to access ALAssetPrivate past the lifetime of its owning ALAssetsLibrary"
Follow this link:
http://www.daveoncode.com/2011/10/15/solve-xcode-error-invalid-attempt-to-access-alassetprivate-past-the-lifetime-of-its-owning-alassetslibrary/
I hope that helps.
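For reference, a minimal sketch of such a singleton (the names are illustrative), which keeps the library alive for the life of the app so that the ALAssets it vends stay valid:
// e.g. a class method on a shared manager; requires <AssetsLibrary/AssetsLibrary.h>
+ (ALAssetsLibrary *)sharedAssetsLibrary {
    static ALAssetsLibrary *library = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}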

Applying core image filters - app crashes

I use the following code to apply a few types of image filters. (There are three more editImage functions, for brightness, saturation, and contrast, all sharing a common completeEditingUsingOutputImage: method.) I use a slider to vary their values.
If I work with any one of them individually, it works fine. As soon as I make two function calls on two different filters, the app crashes.
EDIT: didReceiveMemoryWarning is called. Watching the allocations with the Leaks instrument, memory allocation increases by around 15 MB after each edit.
The crash happens during:
CGImageRef cgimg = [context createCGImage:outputImage fromRect:outputImage.extent];
Moreover, if the body of the completeEditingUsingOutputImage: method is pasted into the individual functions, I am able to work with two types of filters without crashing. As soon as I call the third one, the app crashes.
(filters and context have been declared as instance variables and initialized in the init method)
- (UIImage *)editImage:(UIImage *)imageToBeEdited tintValue:(float)tint
{
    CIImage *image = [[CIImage alloc] initWithImage:imageToBeEdited];
    NSLog(@"in edit Image:\ncheck image: %@\ncheck value: %f", image, tint);
    [tintFilter setValue:image forKey:kCIInputImageKey];
    [tintFilter setValue:[NSNumber numberWithFloat:tint] forKey:@"inputAngle"];
    CIImage *outputImage = [tintFilter outputImage];
    NSLog(@"check output image: %@", outputImage);
    return [self completeEditingUsingOutputImage:outputImage];
}
- (UIImage *)completeEditingUsingOutputImage:(CIImage *)outputImage
{
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:outputImage.extent];
    NSLog(@"check cgimg: %@", cgimg);
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    NSLog(@"check newImage: %@", newImage);
    CGImageRelease(cgimg);
    return newImage;
}
EDIT: Using these filters on a reduced-size image works now, but it would still be good to know why some memory was not being released before.
Add this line at the top of the completeEditingUsingOutputImage: method:
CIContext *context = [CIContext contextWithOptions:nil];
Also, this is how to get the CIImage:
CIImage *outputImage = [tintFilter valueForKey:@"outputImage"];
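Put together, the suggestion would look something like this sketch (filter setup as in the question; creating the CIContext per call trades rendering speed for memory, since a long-lived context holds onto rendering resources):
- (UIImage *)completeEditingUsingOutputImage:(CIImage *)outputImage
{
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:outputImage.extent];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg); // release the +1 image returned by createCGImage:fromRect:
    return newImage;
}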

Image from URL for Retina Display

I have an application that pulls images from an NSURL. Is it possible to inform the application that they are retina ('@2x') versions (the images are of retina resolution)? I currently have the following, but the images appear pixelated on the higher-resolution displays:
NSURL *url = [NSURL URLWithString:self.imageURL];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:data];
self.pictureImageView.image = image;
You need to rescale the UIImage before adding it to the image view.
NSURL *url = [NSURL URLWithString:self.imageURL];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:data];
CGFloat screenScale = [UIScreen mainScreen].scale;
if (image.scale != screenScale)
image = [UIImage imageWithCGImage:image.CGImage scale:screenScale orientation:image.imageOrientation];
self.pictureImageView.image = image;
It's best to avoid hard-coding the scale value, thus the UIScreen call. See Apple’s documentation on UIImage’s scale property for more information about why this is necessary.
It’s also best to avoid using NSData’s -dataWithContentsOfURL: method (unless your code is running on a background thread), as it uses a synchronous network call which cannot be monitored or cancelled. You can read more about the pains of synchronous networking and the ways to avoid it in this Apple Technical Q&A.
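For example, a minimal asynchronous version of the snippet above (a sketch assuming iOS 5+ for sendAsynchronousRequest:queue:completionHandler:, with only bare-bones error handling):
NSURL *url = [NSURL URLWithString:self.imageURL];
NSURLRequest *request = [NSURLRequest requestWithURL:url];
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (!data) return; // network error: handle as appropriate
    UIImage *image = [UIImage imageWithData:data];
    CGFloat screenScale = [UIScreen mainScreen].scale;
    if (image && image.scale != screenScale) {
        image = [UIImage imageWithCGImage:image.CGImage scale:screenScale orientation:image.imageOrientation];
    }
    self.pictureImageView.image = image;
}];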
Try using imageWithData:scale: (iOS 6 and later)
NSData *imageData = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:imageData scale:[[UIScreen mainScreen] scale]];
You need to set the scale on the UIImage.
UIImage* img = [[UIImage alloc] initWithData:data];
CGFloat screenScale = [UIScreen mainScreen].scale;
if (screenScale != img.scale) {
img = [UIImage imageWithCGImage:img.CGImage scale:screenScale orientation:img.imageOrientation];
}
The documentation says to be careful to construct all your UIImages at the same scale; otherwise you might get weird display issues where things show at half size, double size, half resolution, et cetera. To avoid all that, load all UIImages at retina resolution: bundle resources are loaded at the correct scale automatically, but for UIImages constructed from URL data you need to set the scale yourself.
Just to add to this: what I did specifically, in the same situation, was the following, and it works like a charm.
double scaleFactor = [UIScreen mainScreen].scale;
NSLog(@"Scale Factor is %f", scaleFactor);
if (scaleFactor == 1.0) {
    [cell.videoImageView setImageWithURL:[NSURL URLWithString:regularThumbnailURLString]];
} else if (scaleFactor == 2.0) {
    [cell.videoImageView setImageWithURL:[NSURL URLWithString:retinaThumbnailURLString]];
}
The @2x convention is just a convenient way of loading images from the application bundle.
If you want to show an image on a retina display, you have to make it 2x bigger:
Image size: 100x100
View size: 50x50
Edit: I think if you're loading images from a server, the best solution would be to add an additional parameter (e.g. scale) and return an image of the appropriate size:
www.myserver.com/get_image.php?image_name=img.png&scale=2
You can obtain the scale using [[UIScreen mainScreen] scale].
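A sketch of building that request (the endpoint and its parameters are the hypothetical example above, not a real service):
NSInteger scale = (NSInteger)[[UIScreen mainScreen] scale];
NSString *urlString = [NSString stringWithFormat:@"http://www.myserver.com/get_image.php?image_name=img.png&scale=%ld", (long)scale];
NSURL *imageURL = [NSURL URLWithString:urlString];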
To tell iOS programmatically that a particular image is retina, you can do something like this:
UIImage *img = [self getImageFromDocumentDirectory];
img = [UIImage imageWithCGImage:img.CGImage scale:2 orientation:img.imageOrientation];
In my case, the tab bar item's image was dynamic, i.e. downloaded from a server, so iOS could not identify it as retina. The above code snippet worked for me like a charm.

CGImageRef Memory leak

I'm having a memory leak when using this custom method, which returns a CGImageRef. I can't release cgImage properly because I have to return it. What should I do?
- (CGImageRef)rectRoundedImageRef:(CGRect)rect radius:(int)radius
{
    CGSize contextSize = CGSizeMake(rect.size.width, rect.size.height);
    CGFloat imageScale = (CGFloat)1.0;
    CGFloat width = contextSize.width;
    CGFloat height = contextSize.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width * imageScale, height * imageScale, 8, 0, colorSpace, kCGImageAlphaPremultipliedLast);
    // Draw ...
    // Get your image
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(context);
    //CGImageRelease(cgImage); // If I release cgImage here, the app crashes.
    return cgImage;
}
cgImage is owned by your method; you need to return it and make the caller responsible for releasing it through CGImageRelease (or CFRelease).
You can also return the CGImage wrapped inside a UIImage instance, like this:
UIImage *image = [UIImage imageWithCGImage:cgImage];
CFRelease(cgImage); //cgImage is retained by the UIImage above
return image;
This is a general problem with Core Foundation objects, because there is no autorelease pool in CF. As I see it, you have two options to solve the problem:
1. Rename the method to something like -newRectRoundedImageRef:radius: to tell the caller that they take ownership of the returned object and are responsible for releasing it.
2. Wrap the CGImageRef in an autoreleased UIImage object and return that ([UIImage imageWithCGImage:]). That's probably what I would do.
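For option 1, the caller side would look something like this sketch (the rename documents ownership for human readers; adding CF_RETURNS_RETAINED to the declaration tells the static analyzer as well):
CGImageRef roundedImage = [self newRectRoundedImageRef:rect radius:8];
if (roundedImage) {
    // ... draw with it, assign it to a layer, etc. ...
    CGImageRelease(roundedImage); // the caller owns it, so the caller releases it
}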
You can autorelease a Core Foundation-compatible object. It just looks a bit wonky. :)
The GC-safe way is like so:
CGImageRef image = ...;
if (image) {
    image = (CGImageRef)[[(id)image retain] autorelease];
    CGImageRelease(image);
}
The shortcut, which is safe on iOS but no longer safe on the Mac, is this:
CGImageRef image = ...;
if (image) {
    image = (CGImageRef)[(id)image autorelease];
}
Either one will place the image in an autorelease pool and prevent a leak.
As suggested, we used:
CGImageRelease(imageRef);
but we still got a memory leak. Our solution was to wrap the code in an
@autoreleasepool {}
block, and that solved our problem.
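Applied to the loop from the original question, that looks something like this sketch (assets stands in for whatever collection is being iterated, and path for the destination file):
for (ALAsset *asset in assets) {
    @autoreleasepool {
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        CGImageRef imageRef = [representation fullResolutionImage];
        UIImage *image = [UIImage imageWithCGImage:imageRef
                                             scale:1.0
                                       orientation:(UIImageOrientation)[representation orientation]];
        NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
        [imageData writeToFile:path atomically:YES];
        // fullResolutionImage has no Create/Copy in its name, so per Core
        // Foundation conventions this code does not own imageRef; the
        // autoreleased UIImage and NSData drain here, at the end of each pass.
    }
}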