NSImage drawInRect and NSView cacheDisplayInRect memory retained - objective-c

The application I am working on processes images. The user drops at most 4 images and the app lays them out based on the user's selected template; one image might appear 2-3 times in the final view.
Each image in the layout is drawn into an NSView (in the view's drawRect: method, using NSImage's drawInRect:). The final image (the composite of all the laid-out images) is then created by rendering that NSView into an image, and this all works very well.
The problem I am facing is that memory is retained by the app after all processing is done. I have used the Instruments Allocations tool and I don't see any leaks, but the "Persistent Bytes" figure keeps increasing with each session of the app, and one user reported usage in the GBs. Please see the screenshot.
When I investigated further in Instruments, I saw the code paths below causing the retention; all of them are related to ImageIO and Core Image (see the Instruments screenshots).
However, this seems to be a problem only on 10.10 and later. I tested the same version of the app on 10.9.x and memory usage stays within 60MB; during a session it rises to about 200MB, but once processing is done it drops back to 50-60MB, which is usual for this kind of app.
[_photoImage drawInRect: self.bounds fromRect: NSZeroRect operation: NSCompositeSourceOver fraction: 1.0 respectFlipped: YES hints: nil];
_photoImage = nil;
The code above is what I use to draw the image in the NSView's drawRect: method, and the code shown in the screenshot is what I use to render the NSView as an image.
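For reference, that view-to-image step follows the usual cacheDisplayInRect: pattern; a minimal sketch (with an illustrative view name, since my exact code is only in the screenshot) looks like this:
// Render the layout view into an NSImage via its cached bitmap representation.
NSBitmapImageRep *rep = [layoutView bitmapImageRepForCachingDisplayInRect:[layoutView bounds]];
[layoutView cacheDisplayInRect:[layoutView bounds] toBitmapImageRep:rep];
NSImage *finalImage = [[NSImage alloc] initWithSize:[layoutView bounds].size];
[finalImage addRepresentation:rep];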
Update: After further investigation I found that it is CGImageSourceCreateWithData that is caching the TIFF data of the NSImage. Moreover, I am using the code below to crop the image, and if I comment it out, memory consumption behaves just fine.
// Crop via an ImageIO source built from the NSImage's TIFF representation.
NSData *imgData = [imageToCrop TIFFRepresentation];
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imgData, NULL);
CGImageRef maskRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
CGImageRef imageRef = CGImageCreateWithImageInRect(maskRef, rect);
NSImage *cropped = [[NSImage alloc] initWithCGImage:imageRef size:rect.size];
CGImageRelease(maskRef);
CGImageRelease(imageRef);
CFRelease(source);
//CFRelease( options );
imgData = nil;
I have also tried explicitly setting kCGImageSourceShouldCache to false (it is supposedly false by default), but I get the same results.
Please help to solve the memory retention issue.

Finally, after lots of debugging, it turned out that CGImageSourceCreateWithData was holding on to the TIFF data of the NSImage somewhere. When I changed this line:
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imgData, NULL);
to
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:path], NULL);
everything just started working fine: the app's memory usage dropped from 300MB (for 6 images) to 50-60MB, and the behaviour is consistent now.
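For context, the crop routine from above rewritten around the URL-based source looks roughly like this (assuming the original file path is still available as path):
// Crop by reading the image straight from disk instead of from an in-memory TIFF.
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:path], NULL);
CGImageRef maskRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
CGImageRef imageRef = CGImageCreateWithImageInRect(maskRef, rect);
NSImage *cropped = [[NSImage alloc] initWithCGImage:imageRef size:rect.size];
CGImageRelease(imageRef);
CGImageRelease(maskRef);
CFRelease(source);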
Apart from the change above, something was still retaining memory, so to get rid of that I now set the image of each layer to nil once all processing is done, and that works like a charm. I was under the impression that setting the parent to nil would release the images as well, but that was not the case.
Anyway, if anyone sees this issue with drawInRect: or cacheDisplayInRect:, make sure to clear out the images if they are not needed later on.
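For what it's worth, the cleanup itself is trivial; something like this (the layer view class and property names here are illustrative):
// After the final composite has been saved, drop each layer's image explicitly.
for (PhotoLayerView *layer in self.photoLayers) {
    layer.photoImage = nil; // hypothetical layer view class and property
}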
Update 2nd July 2016
I found that kCGImageSourceShouldCache defaults to false on 32-bit and true on 64-bit. I was able to get the memory released by explicitly setting it to false with the code below.
// Explicitly disable ImageIO's decoded-image caching when creating the source.
const void *keys[] = { kCGImageSourceShouldCache };
const void *values[] = { kCFBooleanFalse };
CFDictionaryRef optionsDictionary = CFDictionaryCreate(NULL, keys, values, 1,
                                                       &kCFTypeDictionaryKeyCallBacks,
                                                       &kCFTypeDictionaryValueCallBacks);
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)[image TIFFRepresentation], optionsDictionary);
CFRelease(optionsDictionary); // the source keeps what it needs
Hope it helps someone.

Related

Getting message in console: "CreateWrappedSurface() failed for a dataprovider-backed CGImageRef."

Updated to Xcode 7 and getting this (warning?) message while an image was being rendered in an operation:
CreateWrappedSurface() failed for a dataprovider-backed CGImageRef.
There was no message like this under Xcode 6.4.
I tracked down which part of the code triggers the message:
if (!self.originalImage) // @property (nonatomic, strong) UIImage *originalImage;
    return;
CGImageRef originalCGImage = self.originalImage.CGImage;
NSAssert(originalCGImage, @"Cannot get CGImage from original image");
CIImage *inputCoreImage = [CIImage imageWithCGImage:originalCGImage]; // this line produces the console message
I replaced my CIImage creation to get it directly from the UIImage:
CIImage *originalCIImage = self.originalImage.CIImage;
NSAssert(originalCIImage, @"Cannot build CIImage from original image");
In this case I didn't get any console message, but had an assert: originalCIImage was nil.
The class reference of UIImage says:
@property(nonatomic, readonly) CIImage *CIImage
If the UIImage object was initialized using a CGImageRef, the value of the property is nil.
So I'm using the original code as a fallback:
CIImage *originalCIImage = self.originalImage.CIImage;
if (!originalCIImage) {
    CGImageRef originalCGImageRef = self.originalImage.CGImage;
    NSAssert(originalCGImageRef, @"Unable to get CGImageRef of originalImage");
    originalCIImage = [CIImage imageWithCGImage:originalCGImageRef];
}
NSAssert(originalCIImage, @"Cannot build CIImage from original image");
The problem is, I'm still getting the warning messages in console.
Has anybody got this message before? What's the solution to nuke that warning(?) message?
Thanks,
Adam
Finally figured out the answer. Curious about the error, I studied up on how CIImage works (https://uncorkedstudios.com/blog/image-filters-with-core-graphics).
I noticed that the CGImageRef is dataprovider-backed with premultiplied values (RGB and A).
It occurred to me that the CGImage I was loading into a CIImage (using [CIImage imageWithCGImage:originalCGImage]) was only RGB, not RGBA. Sure enough, I was creating this image by taking a snapshot of a view with the standard UIGraphicsBeginImageContextWithOptions, and I had the opaque parameter set to YES.
I simply changed:
UIGraphicsBeginImageContextWithOptions(bounds, YES, 1.0f);
to
UIGraphicsBeginImageContextWithOptions(bounds, NO, 1.0f);
so that I am now creating an RGBA image, not an RGB one.
Now when I convert my CGImage to a CIImage, the CIImage has proper data-provider backing and the error goes away.
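Roughly, the snapshot-to-CIImage pipeline now looks like this (the view and variable names are illustrative):
// Take the snapshot with alpha (opaque == NO) so the resulting CGImage is RGBA.
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 1.0f);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// An RGBA-backed CGImage converts cleanly, without the CreateWrappedSurface() warning.
CIImage *ciImage = [CIImage imageWithCGImage:snapshot.CGImage];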
NOTE:
I was using a CIClamp filter for Gaussian blur purposes, and with opaque set to NO the clamp doesn't work as effectively. I decided to just keep opaque at YES and ignore the log warnings; they don't seem to actually cause any harm.

Draw Image to PDF

I'd like to draw a full-page image into a PDF, but I've always had a hard time wrapping my head around CGContextRefs, so I don't know what to make of the error.
Note, this is NOT iOS. I'm making a desktop application.
So far, I have this:
-(void) addImage:(NSURL*) url toPage:(size_t) page toPDF:(CGPDFDocumentRef) pdf
{
    NSImage *image = [[NSImage alloc] initWithContentsOfURL:url];
    image = [image imageScaledToFitSize:pageSize.size]; //From Matt Gemmell's Crop extensions category
    [image lockFocus];
    CGContextRef context = [[NSGraphicsContext currentContext] graphicsPort];
    [image drawInRect:pageSize];
    [image unlockFocus];
    CGPDFPageRef pageRef = CGPDFDocumentGetPage(pdf, page);
    CGPDFContextBeginPage(context, pageInformation);
    CGContextDrawPDFPage(context, pageRef);
    CGPDFContextEndPage(context);
}
However, I'm greeted with the error:
CGPDFContextEndPage: invalid context 0x61000017b540. This is a serious error. This application, or a library it uses, is using an invalid context and is thereby contributing to an overall degradation of system stability and reliability. This notice is a courtesy: please fix this problem. It will become a fatal error in an upcoming update.
What is wrong with my contexts please?
You need to create and use a CGPDFContext. Different contexts are tied to different rendering destinations, so you need to choose the correct one; look at CGPDFContextCreateWithURL to create a PDF context that writes its data to a file.
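A minimal sketch of that approach, assuming one full-page image per page (the output URL, image, and page size here are illustrative):
// Create a PDF context that writes to a file, then draw the NSImage as one page.
CGRect mediaBox = CGRectMake(0, 0, 612, 792); // US Letter, in points
CGContextRef pdfContext = CGPDFContextCreateWithURL((__bridge CFURLRef)outputURL, &mediaBox, NULL);

CGPDFContextBeginPage(pdfContext, NULL);
NSGraphicsContext *gc = [NSGraphicsContext graphicsContextWithGraphicsPort:pdfContext flipped:NO];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:gc];
[image drawInRect:NSRectFromCGRect(mediaBox)
         fromRect:NSZeroRect
        operation:NSCompositeSourceOver
         fraction:1.0];
[NSGraphicsContext restoreGraphicsState];
CGPDFContextEndPage(pdfContext);

CGPDFContextClose(pdfContext);
CGContextRelease(pdfContext);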

CGImageRef MemoryLeak again

I have a memory leak problem with CGImageRef (at least I think that's what it is) in my Cocoa desktop application. I've read lots of questions here and elsewhere, and the Core Graphics memory management FAQ on developer.apple.com. This question is probably the most similar to mine, though its solution didn't help.
My task is to scale a 15x15 area from a saved CGImage and return an NSImage* as the result; this is done on every mouse movement.
-(NSImage*)getScaledAreaInX:(int)x andY:(int)y
{
    // Catching image from the screen
    CGImageRef fullscreen = CGImageRetain(_magniHack);
    // Cropping
    int screenHeight = CGImageGetHeight(fullscreen);
    CGRect fixedRect = CGRectMake(x-7, screenHeight-y-8, 15, 15);
    CGImageRef cropped = CGImageCreateWithImageInRect(fullscreen, fixedRect);
    // Scaling
    int width = CGImageGetWidth(cropped)*8; // New width;
    int height = CGImageGetHeight(cropped)*8; // New height;
    CGColorSpaceRef colorspace = CGImageGetColorSpace(cropped);
    CGContextRef context = CGBitmapContextCreate(NULL, width, height,
                                                 CGImageGetBitsPerComponent(cropped),
                                                 CGImageGetBytesPerRow(cropped),
                                                 colorspace,
                                                 CGImageGetAlphaInfo(cropped));
    CGContextSetInterpolationQuality(context, kCGInterpolationNone);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cropped);
    CGImageRef scaled = CGBitmapContextCreateImage(context);
    // Casting to NSImage
    NSImage *image = [[NSImage alloc] initWithCGImage:scaled size:NSZeroSize];
    // Releasing memory
    CGImageRelease(fullscreen);
    CGColorSpaceRelease(colorspace);
    CGContextRelease(context);
    //CGImageRelease(cropped); // Can't do: will crash; In what situations can free?
    cropped = NULL;
    CGImageRelease(scaled);
    scaled = NULL;
    return image;
}
If I uncomment the CGImageRelease(cropped) line, the app crashes around the 6th movement of the cursor, either while retaining _magniHack or while cropping the image (it differs every time), with EXC_BAD_ACCESS. If I don't, there is a memory leak on every call (with frequent movements the leak is dozens of MB). I get the same result if I release cropped but do not release the scaled image (though the leak is much bigger).
_magniHack is a CGImageRef; it is a private instance variable and is set only once, in this code:
-(void)storeFullScreen
{
    if (_magniHack) {
        CGImageRelease(_magniHack);
    }
    _magniHack = CGDisplayCreateImage(displays[0]);
}
I use ARC in the project, if that helps, though it doesn't get rid of these leaks.
I guess that _magniHack is being released somewhere, but I can't find where, because I always retain it at the start and release it at the end.
This is pretty old, but I had the same problem in my case.
The issue was releasing the color space without actually owning a copy of it. CGImageGetColorSpace(cropped) gives you a pointer to the existing color space, and you should not release it; it's not a copy created for you.
That was my case too (I had also used code from the internet to scale the image). Once I noticed that and stopped releasing the color space, CGDisplayCreateImage(displays[0]) was not crashing anymore.
Actually, I got rid of this leak by using Core Graphics only where necessary: after grabbing the screen I wrap it in an NSImage* and from then on work only with that. But it is still interesting what was wrong with the code above.
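For completeness, a corrected sketch of the original method based on the answer above (leave the borrowed color space alone, which then also makes it safe to release cropped) might look like this; it additionally passes 0 for bytes-per-row so Core Graphics computes it for the scaled size, which is an assumption beyond what the answer describes:
-(NSImage*)getScaledAreaInX:(int)x andY:(int)y
{
    CGImageRef fullscreen = CGImageRetain(_magniHack);

    // Cropping
    int screenHeight = (int)CGImageGetHeight(fullscreen);
    CGRect fixedRect = CGRectMake(x-7, screenHeight-y-8, 15, 15);
    CGImageRef cropped = CGImageCreateWithImageInRect(fullscreen, fixedRect);

    // Scaling
    size_t width  = CGImageGetWidth(cropped) * 8;
    size_t height = CGImageGetHeight(cropped) * 8;
    // Borrowed reference: do NOT release a color space obtained from CGImageGetColorSpace().
    CGColorSpaceRef colorspace = CGImageGetColorSpace(cropped);
    CGContextRef context = CGBitmapContextCreate(NULL, width, height,
                                                 CGImageGetBitsPerComponent(cropped),
                                                 0, // let Core Graphics pick bytes-per-row for the new width
                                                 colorspace,
                                                 CGImageGetAlphaInfo(cropped));
    CGContextSetInterpolationQuality(context, kCGInterpolationNone);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cropped);
    CGImageRef scaled = CGBitmapContextCreateImage(context);

    NSImage *image = [[NSImage alloc] initWithCGImage:scaled size:NSZeroSize];

    // Releasing memory
    CGImageRelease(fullscreen);
    CGImageRelease(cropped);   // safe now that the color space is left alone
    CGContextRelease(context);
    CGImageRelease(scaled);
    return image;
}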

Another IKImageView Question: copying a region

I'm trying to use the select-and-copy feature of IKImageView. If all you want is an app with an image where you select a portion and copy it to the clipboard, it's easy: you connect the Copy menu item to the first responder's copy: method and magically everything works.
However, if you want something more complicated, like copying as part of some other operation, I can't seem to find the method to do it.
IKImageView doesn't seem to have a copy method; it doesn't even seem to have a method that will tell you the selected rectangle!
I have gone through Hillegass' book, so I understand how the clipboard works, just not how to get the portion of the image out of the view...
Now, I'm starting to think that I made a mistake in basing my project on IKImageView, but it's what Preview is built on (or so I've read), so I figured it had to be stable... and anyway, now it's too late, I'm too deep in this to start over...
So, other than not using IKImageView, any suggestions on how to copy the select region to the clipboard manually?
EDIT: actually, I have found the copy:(id) method, but when I call it, I get
<Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 16 bits/pixel; 1-component color space; kCGImageAlphaPremultipliedLast; 2624 bytes/row.
Which obviously doesn't happen when I do a normal copy through the first-responder... I understand the error message, but I'm not sure where it's getting those parameters from...
Is there any way to trace through this and see how this is happening? A debugger won't help for obvious reasons, as well as the fact that I'm doing this in Mozilla, so a debugger isn't an option anyway...
EDIT 2 It occurs to me that the copy:(id) method I found may be copying the VIEW rather than copying a chunk of the image to the clipboard, which is what I need.
The reason I thought it was the clipboard copy is that in another project, where I'm copying from an IKImageView to the clipboard straight from the Edit menu, it just sends copy: to the first responder, but I'm not actually sure what the first responder does with it...
EDIT 3 It appears that the CGBitmapContextCreate error is coming from [imageView image] which, oddly enough, IS a documented method.
It's possible that this is happening because I'm putting the image in there with a setImage:(id) method, passing it an NSImage*... Is there some other, more clever way of getting an NSImage into an IKImageView?
The -copy: method in IKImageView does what every other -copy: method does: it copies the current selection to the clipboard. It is, however, implemented as a private method in IKImageView for some reason.
You can just call it directly:
[imageView copy:nil];
This will copy whatever is currently selected to the clipboard.
I don't think there's a way to directly access the image content of the current selection in IKImageView using public methods; this is a good candidate for a bug report/feature request.
You can, however, use the private method -selectionRect to get a CGRect of the current selection and use that to extract the selected portion of the image:
//stop the compiler from complaining when we call a private method
@interface IKImageView (CompilerSTFU)
- (CGRect)selectionRect;
@end

@implementation YourController

//imageView is an IBOutlet connected to your IKImageView
- (NSImage*)selectedImage
{
    //get the current selection
    CGRect selection = [imageView selectionRect];
    //get the portion of the image that the selection defines
    CGImageRef selectedImage = CGImageCreateWithImageInRect([imageView image], selection);
    //convert it to an NSBitmapImageRep
    NSBitmapImageRep* bitmap = [[[NSBitmapImageRep alloc] initWithCGImage:selectedImage] autorelease];
    CGImageRelease(selectedImage);
    //create an image from the bitmap data
    NSImage* image = [[[NSImage alloc] initWithData:[bitmap TIFFRepresentation]] autorelease];
    //in 10.6 you can skip converting to an NSBitmapImageRep by doing this:
    //NSImage* image = [[NSImage alloc] initWithCGImage:selectedImage size:NSZeroSize];
    return image;
}

@end
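If the goal is to put that image on the clipboard as part of another operation, something along these lines should work with the method above (a sketch using NSPasteboard's 10.6 writeObjects: API):
// Copy the selected portion of the image to the general pasteboard.
NSImage *selected = [self selectedImage];
NSPasteboard *pasteboard = [NSPasteboard generalPasteboard];
[pasteboard clearContents];
[pasteboard writeObjects:[NSArray arrayWithObject:selected]];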
OK, so the copy:nil fails, and [imageView image] fails, but it turns out that I have another copy of the NSImage from when I added it to the view in the first place, so I could use that. Also, CGImageCreateWithImageInRect expects a CGImageRef, not an NSImage*, so I had to do some conversions.
In addition, for some reason the selection rectangle is flipped: either it is bottom-origined and the image is top-origined, or the other way around, so I had to flip it.
And for some reason the compiler suddenly started complaining that NSRect isn't the same type as CGRect (which implies that it suddenly went from 32-bit to 64-bit or something... not sure why).
Anyway, here is my version of selectedImage:
- (NSImage*)selectedImage
{
    //get the current selection
    CGRect selection = flipCGRect(imageView, [imageView selectionRect]);
    //get the portion of the image that the selection defines
    CGImageRef full = [[doc currentImage] CGImageForProposedRect:NULL context:NULL hints:NULL];
    CGImageRef selectedImage = CGImageCreateWithImageInRect(full, selection);
    //convert it to an NSBitmapImageRep
    NSBitmapImageRep* bitmap = [[[NSBitmapImageRep alloc] initWithCGImage:selectedImage] autorelease];
    CGImageRelease(selectedImage);
    //create an image from the bitmap data
    NSImage* image = [[[NSImage alloc] initWithData:[bitmap TIFFRepresentation]] autorelease];
    //in 10.6 you can skip converting to an NSBitmapImageRep by doing this:
    //NSImage* image = [[NSImage alloc] initWithCGImage:selectedImage size:NSZeroSize];
    return image;
}
I wrote flipCGRect, and [doc currentImage] returns an NSImage*...
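flipCGRect isn't shown above; one possible implementation, assuming the selection's y-origin just needs to be flipped within the image's height (a guess at what the original helper does), would be:
// Hypothetical reconstruction of flipCGRect: flip a selection rect vertically
// within the coordinate space of the image displayed in the IKImageView.
static CGRect flipCGRect(IKImageView *view, CGRect rect)
{
    CGFloat imageHeight = (CGFloat)CGImageGetHeight([view image]);
    rect.origin.y = imageHeight - rect.origin.y - rect.size.height;
    return rect;
}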

imageWithCGImage and memory

If I use [UIImage imageWithCGImage:], passing in a CGImageRef, do I then release the CGImageRef or does UIImage take care of this itself when it is deallocated?
The documentation isn't entirely clear. It says "This method does not cache the image object."
Originally I called CGImageRelease on the CGImageRef after passing it to imageWithCGImage:, but that caused a malloc_error_break warning in the Simulator claiming a double-free was occurring.
According to the fundamental rule of Cocoa memory management, an owning object should release the owned object when it no longer needs it. Other objects are responsible for taking and releasing ownership on their own. If a UIImage needs an object to persist but doesn't retain or copy it, it's a bug in UIImage's implementation and should be reported as such.
I have the same problem in Xcode 3.2.1 on Snow Leopard, with pretty much the same code as Jason.
I have looked at the iPhone sample code that uses imageWithCGImage:, and it always releases the CGImageRef with CGImageRelease after the call to imageWithCGImage:.
So, is this a bug in the simulator? I always get the malloc_error_break warning on the console when I use CGImageRelease.
The ownership semantics of the CGImage inside UIImage are unclear. It seems that UIImage does not copy the CGImage, but that's not guaranteed by the documentation, so we have to handle it ourselves.
I used a new subclass of UIImage to handle this problem. It simply retains the CGImage passed in and releases it in dealloc.
Here is a sample.
@interface EonilImage : UIImage
{
    @private
    CGImageRef sourceImage;
}
- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation;
@end

@implementation EonilImage

- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
{
    self = [super initWithCGImage:imageRef scale:scale orientation:orientation];
    if (self)
    {
        sourceImage = imageRef;
        CGImageRetain(imageRef);
    }
    return self;
}

- (void)dealloc
{
    CGImageRelease(sourceImage);
    [super dealloc];
}

@end
Because the CGImage returned by the -[UIImage CGImage] property is not guaranteed to be the same CGImage that was passed into the init method, the class stores the CGImage separately.
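Usage then follows the normal create/release pattern; a quick sketch (MRC, since the subclass calls [super dealloc]; the bitmap context here is just an assumed source of the CGImage):
// The subclass keeps its own retain, so the caller can release the CGImage immediately.
CGImageRef cgImage = CGBitmapContextCreateImage(bitmapContext);
EonilImage *image = [[EonilImage alloc] initWithCGImage:cgImage
                                                  scale:1.0f
                                            orientation:UIImageOrientationUp];
CGImageRelease(cgImage);
// ... use the image, then release it when done ...
[image release];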
I agree with you -- the documentation is muddled at best for this API. Based on your experiences, then, I would conclude that you are responsible for the lifetime of both objects - the UIImage and the CGImageRef. On top of that you have to make sure the lifetime of the CGImageRef is at least as long as the UIImage (for obvious reasons).
Are you using Xcode 3.1 on Snow Leopard? I am experiencing the same issue:
CGContextRef ctx = ...;
CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
UIImage * newImage = [[UIImage imageWithCGImage:cgImage] retain];
CGImageRelease(cgImage); // this should be OK, but it now crashes in Simulator on SL
I'm guessing that upgrading to Xcode 3.2 will fix the issue.
Not sure if this helps, but I had a similar problem.
I read the answers and then did the following which appears to have fixed it:
CGImageRef cgImage = [asset thumbnail];
UIImage *thumbImage = [[UIImage imageWithCGImage:cgImage] retain];
UIImageView *thumbImageView = [[UIImageView alloc] initWithImage:thumbImage];
CGImageRelease(cgImage);
I used the autorelease as suggested but it was not enough.
Once I had added the image to the UIImageView, I then released the cgImage.
It didn't crash and deallocated nicely. Why, I have no idea, but it worked.
The asset thumbnail part is from the ALAsset Library by the way, you might need something else there.
Not in my opinion. I had the same problem. In my experience,
UIImage *image = [[UIImage alloc] initWithCGImage:imageRef];
keeps all references correct by itself. If you are using a CGDataProvider, please have a look at
CGDataProviderReleaseDataCallback
and set a breakpoint in your callback. You can see that it is correctly called after you release your image, and you can free() your image data buffer there.
"This method does not cache the image object."
So the UIImage takes ownership, and thus you should not release the CGImageRef.