I updated to Xcode 7 and am getting this (warning?) message while an image is being rendered in an operation:
CreateWrappedSurface() failed for a dataprovider-backed CGImageRef.
There was no message like this under Xcode 6.4.
I tracked down which part of the code triggers the message:
if (!self.originalImage) // @property (nonatomic, strong) UIImage *originalImage;
    return;

CGImageRef originalCGImage = self.originalImage.CGImage;
NSAssert(originalCGImage, @"Cannot get CGImage from original image");

CIImage *inputCoreImage = [CIImage imageWithCGImage:originalCGImage]; // this line produces the console message
I replaced my CIImage creation so that it gets the CIImage directly from the UIImage:
CIImage *originalCIImage = self.originalImage.CIImage;
NSAssert(originalCIImage, @"Cannot build CIImage from original image");
In this case I didn't get any console message, but the assert fired: originalCIImage was nil.
The class reference of UIImage says:
@property(nonatomic, readonly) CIImage *CIImage
If the UIImage object was initialized using a CGImageRef, the value of the property is nil.
So I'm using the original code as a fallback:
CIImage *originalCIImage = self.originalImage.CIImage;
if (!originalCIImage) {
    CGImageRef originalCGImageRef = self.originalImage.CGImage;
    NSAssert(originalCGImageRef, @"Unable to get CGImageRef of originalImage");
    originalCIImage = [CIImage imageWithCGImage:originalCGImageRef];
}
NSAssert(originalCIImage, @"Cannot build CIImage from original image");
The problem is, I'm still getting the warning messages in the console.
Has anybody got this message before? What's the solution to nuke that warning(?) message?
Thanks,
Adam
Finally figured out the answer. Curious about the error, I studied up on how CIImage works (https://uncorkedstudios.com/blog/image-filters-with-core-graphics).
I noticed that the CGImageRef is dataprovider-backed with premultiplied values (RGB and A).
I realized that the CGImage I was loading into a CIImage (using [CIImage imageWithCGImage:originalCGImage]) was only RGB, not RGBA. Sure enough, I was creating this image by taking a snapshot of a view with the standard UIGraphicsBeginImageContextWithOptions, and I had the opaque parameter set to YES.
I simply changed:
UIGraphicsBeginImageContextWithOptions(bounds, YES, 1.0f);
to
UIGraphicsBeginImageContextWithOptions(bounds, NO, 1.0f);
So that I am now creating an RGBA image, not an RGB image.
Now when I convert my CGImage to a CIImage, the CIImage has proper dataprovider backing and the error goes away.
NOTE:
I was using a CIClamp filter for Gaussian blur purposes, and with opaque set to NO the clamp doesn't work as effectively. I decided to just keep opaque at YES and ignore the log warnings; they don't seem to actually do anything.
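For reference, a minimal sketch of the snapshot-then-filter flow described above (someView is a placeholder for the view being captured; everything else follows the question's code):
// Snapshot a view into an RGBA-backed image (opaque = NO), then wrap it in a CIImage.
CGRect bounds = someView.bounds;                                // someView is an assumed UIView
UIGraphicsBeginImageContextWithOptions(bounds.size, NO, 1.0f);  // NO => alpha channel is kept
[someView drawViewHierarchyInRect:bounds afterScreenUpdates:YES];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

CIImage *inputCoreImage = [CIImage imageWithCGImage:snapshot.CGImage]; // no CreateWrappedSurface warning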
Related
The application I am working on processes images: the user drops at most 4 images, and the app lays them out based on the user's selected template. One image might be added 2-3 times in the final view.
Each image in the layout is drawn in an NSView (in the view's drawRect: method, using drawInRect:). The final image (combining all laid-out images) is then created by saving the NSView as an image, and it all works very well.
The problem I am facing is that memory is retained by the app once all processing is done. I have used the Instruments Allocations tool and I don't see memory leaks, but I do see "Persistent Bytes" increasing continuously with each session of the app, and one user reported the issue growing into GBs. Please see the screenshot.
When I investigated further in Instruments, I saw the parts of the app's code that are causing the memory retention. They are all related to ImageIO and Core Image. See the Instruments captures below:
However, this only seems to be a problem on 10.10 and above. I tested the same version of the app on 10.9.x and memory usage remains within 60 MB. During a session it goes up to 200 MB, but once the session is done it comes back to 50-60 MB, which is usual for this kind of app.
[_photoImage drawInRect: self.bounds fromRect: NSZeroRect operation: NSCompositeSourceOver fraction: 1.0 respectFlipped: YES hints: nil];
_photoImage = nil;
I use the code above to draw the image in the NSView's drawRect: method, and the code shown in the image is used to render the NSView as an image.
Update: After further investigation I found that it's CGImageSourceCreateWithData that is caching the TIFF data of the NSImage. Moreover, I am using the code below to crop the image, and if I comment it out, memory consumption is fine.
NSData *imgData = [imageToCrop TIFFRepresentation];
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imgData, NULL);
CGImageRef maskRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
CGImageRef imageRef = CGImageCreateWithImageInRect(maskRef, rect);
NSImage *cropped = [[NSImage alloc] initWithCGImage:imageRef size:rect.size];
CGImageRelease(maskRef);
CGImageRelease(imageRef);
CFRelease(source);
//CFRelease( options );
imgData = nil;
I have also tried explicitly setting kCGImageSourceShouldCache to false (though it's false by default), but with the same results.
Please help me solve the memory retention issue.
Finally, after lots of debugging, it turned out that CGImageSourceCreateWithData was retaining the TIFF data of the NSImage somewhere. When I changed this line:
CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)imgData, NULL);
with
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:path], NULL);
everything just started working fine, and the app's memory usage dropped from 300 MB (for 6 images) to 50-60 MB; the behaviour is consistent now.
Apart from the above change, something was still retaining memory somewhere, so to get rid of that I set each layer's image to nil once all processing is done, and that works like a charm. I was under the impression that setting the parent to nil would release the images as well, but that was not happening.
Anyway, if anyone sees this issue with drawInRect: or cacheDisplayInRect:, make sure to clear out the image if it isn't needed later on.
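For illustration, a hedged sketch of the crop routine from the question rewritten around CGImageSourceCreateWithURL; it assumes a path variable holding the image's file path and the same rect as before:
// Create the image source from the file URL instead of from TIFF data held in memory.
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:path], NULL);
CGImageRef fullImage = CGImageSourceCreateImageAtIndex(source, 0, NULL);
CGImageRef croppedRef = CGImageCreateWithImageInRect(fullImage, rect);
NSImage *cropped = [[NSImage alloc] initWithCGImage:croppedRef size:rect.size];
CGImageRelease(fullImage);
CGImageRelease(croppedRef);
CFRelease(source);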
Update 2nd July 2016
I found that kCGImageSourceShouldCache is false by default on 32-bit and true by default on 64-bit. I was able to release the memory by setting it to false with the code below.
const void *keys[] = { kCGImageSourceShouldCache };
const void *values[] = { kCFBooleanFalse };
CFDictionaryRef optionsDictionary = CFDictionaryCreate(NULL, keys, values, 1, NULL, NULL);
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)[image TIFFRepresentation], optionsDictionary);
// ... use the source, then balance the create:
CFRelease(optionsDictionary);
Hope it helps someone.
I get the error BSXPCMessage received error for message: Connection interrupted whenever I create a CIContext for an image I am going to let the user edit. This error comes up on other Stack Overflow questions, like this one: BSXPCMessage received error for message: Connection interrupted on CIContext with iOS 8.
However, while the accepted answer (https://stackoverflow.com/a/26268910/2939977) does eliminate the error message, it spikes CPU usage to an unusable level and makes the UI very choppy (CPU usage goes from 50% to about 150% once the kCIContextUseSoftwareRenderer flag is set to YES). My filters are attached to sliders, so performance is paramount.
I tried following Apple's documentation to create an EAGLContext to back my CIContext, with the working color space set to null:
EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };
imageContext = [CIContext contextWithEAGLContext:myEAGLContext options:options];
This creates the context without the error message, and performance is much, much better. Hooray! However, as soon as the user engages the editing options to adjust the image, I get BSXPCMessage received error for message: Connection interrupted on the first adjustment. I never see the error message again, and it does not seem to affect the output at all.
[filter setValue:@(contrast) forKey:@"inputContrast"];
[filter setValue:@(color) forKey:@"inputSaturation"];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [imageContext createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImage = [UIImage imageWithCGImage:cgimg scale:1 orientation:UIImageOrientationRight];
transitionImage.image = newImage;
CGImageRelease(cgimg);
In case you are wondering, I did start with GPUImage because of its better performance when driving filters from sliders, but it had some other issues with image scaling that caused me to remove it and go down the CIFilter route.
Any advice or help would be greatly appreciated :)
If you are going to create a CIContext with:
[CIContext contextWithEAGLContext:options:]
then you should draw directly into that context using:
[cictx drawImage:inRect:fromRect:]
This will be much faster than creating a CGImageRef and passing that to a UIImageView.
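For illustration, a rough sketch of that pattern, assuming a GLKView named glkView that shares the same EAGLContext (the view and filter names are placeholders, not from the question):
// Render the filter output straight into a GLKView instead of round-tripping through a CGImageRef.
EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
CIContext *imageContext = [CIContext contextWithEAGLContext:myEAGLContext
                                                    options:@{ kCIContextWorkingColorSpace : [NSNull null] }];
glkView.context = myEAGLContext;    // glkView is an assumed GLKView in the view hierarchy
[glkView bindDrawable];

CIImage *outputImage = [filter outputImage];
CGRect destRect = CGRectMake(0, 0, glkView.drawableWidth, glkView.drawableHeight); // in pixels
[imageContext drawImage:outputImage inRect:destRect fromRect:[outputImage extent]];
[glkView display];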
I have a program that fetches an image from the library. I'm using code I found online to resize that image so that it fits on the screen (basically making it 640x960), but that would still be too big to display, so I copy the first resized image into another UIImage and resize it again to about 1/4 of the screen (160x240). The code is this:
for ViewController.h:
UIImage *img;
UIImage *thumb;
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize;
(this of course, is only the code related to my problem)
for ViewController.m
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
and in the same .m file, in another method, scaleImage is called when a button is pressed, with these lines:
[self scaleImage:img toSize:CGSizeMake(640, 960)];
thumb = img;
[self scaleImage:thumb toSize:CGSizeMake(160, 240)];
In the project I've previously been able to successfully provide an image for img using [info objectForKey:UIImagePickerControllerOriginalImage];, which is the image chosen from the library. I've already hooked everything up to File's Owner so that this method runs (and it does, because a UIAlert I create within it shows, and an NSLog prints when scaleImage starts, twice), but the image is never resized!! Does anyone know why?? Please let me know; thank you to anyone who comments with help or suggestions!!
Your scaleImage method returns the scaled image rather than resizing the image in place, so you have to assign the result. For example:
thumb = [self scaleImage:img toSize:CGSizeMake(640, 960)];
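Applied to the button handler from the question, the calls would look something like this (each result must be assigned back):
img = [self scaleImage:img toSize:CGSizeMake(640, 960)];     // keep the screen-sized image
thumb = [self scaleImage:img toSize:CGSizeMake(160, 240)];   // derive the thumbnail from it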
Stuck again. :(
I have the following code crammed into a procedure invoked when I click a button on my application's main window. I'm just trying to tweak a CIImage and then display the result. At this point I'm not even worried about exactly where / how to display it. I'm just trying to slam it up on the window to make sure my transform worked. This code seems to work down through the drawAtPoint message, but I never see anything on the screen. What's wrong? Thanks.
Also, as far as displaying it in a particular location on the window ... is the best technique to put a frame of some sort on the window, then get the coordinates of that frame and "draw into" that rectangle? Or use a specific control from IB? Or what? Thanks again.
// earlier I initialize an NSImage from a JPG file on disk,
// then create an NSBitmapImageRep from the NSImage. This all works fine.
// then ...
CIImage *inputCIimage = [[CIImage alloc] initWithBitmapImageRep:inputBitmap];
if (inputCIimage == nil)
    NSLog(@"could not create CI Image");
else {
    NSLog(@"CI Image created. working on transform");
    CIFilter *transform = [CIFilter filterWithName:@"CIAffineTransform"];
    [transform setDefaults];
    [transform setValue:inputCIimage forKey:@"inputImage"];
    NSAffineTransform *affineTransform = [NSAffineTransform transform];
    [affineTransform rotateByDegrees:3];
    [transform setValue:affineTransform forKey:@"inputTransform"];
    CIImage *myResult = [transform valueForKey:@"outputImage"];
    if (myResult == nil)
        NSLog(@"Transformation failed");
    else {
        NSLog(@"Created transformation successfully ... now render it");
        [myResult drawAtPoint:NSMakePoint(0, 0)
                     fromRect:NSMakeRect(0, 0, 128, 128)
                    operation:NSCompositeSourceOver
                     fraction:1.0]; // 100% opaque
        [inputCIimage release];
    }
}
Edit #1:
snip - removed the prior code sample mentioned below (in the comments about drawRect), which did not work
Edit #2: adding some code that DOES work, for anyone else in the future who might be stuck on this same thing. Not sure if this is the BEST way to do it ... but it does work for my quick and dirty purposes. So this new code (below) replaces the entire [myResult drawAtPoint ...] message from above / in my initial question. This code takes the image created by the CIImage transform and displays it in the NSImageView control.
NSImage *outputImage;
NSCIImageRep *ir;

ir = [NSCIImageRep imageRepWithCIImage:myResult];
outputImage = [[[NSImage alloc] initWithSize:NSMakeSize(inputImage.size.width, inputImage.size.height)] autorelease];
[outputImage addRepresentation:ir];
[outputImageView setImage:outputImage]; // outputImageView is an NSImageView control on my application's main window
Drawing on screen in Cocoa normally takes place inside an -[NSView drawRect:] override. I take it you're not doing that, so you don't have a correctly set up graphics context.
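For illustration, a minimal sketch of that approach: a custom NSView subclass that keeps the transformed CIImage in an ivar (here named transformedImage, an assumed name) and draws it from drawRect:, where the graphics context is valid:
// Inside a custom NSView subclass; transformedImage is an assumed CIImage ivar set after
// running the CIAffineTransform filter, followed by [self setNeedsDisplay:YES].
- (void)drawRect:(NSRect)dirtyRect
{
    [super drawRect:dirtyRect];
    [transformedImage drawAtPoint:NSMakePoint(0, 0)
                         fromRect:NSMakeRect(0, 0, 128, 128)
                        operation:NSCompositeSourceOver
                         fraction:1.0];
}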
So one solution to this problem is to create an NSCIImageRep from the CIImage, add that representation to a new NSImage, and then display the NSImage, which is easy to do in a variety of ways. I've added the code I used above (see "Edit #2"), where I display the output image within an NSImageView control. Man ... what a PITA this was!
If I use [UIImage imageWithCGImage:], passing in a CGImageRef, do I then release the CGImageRef or does UIImage take care of this itself when it is deallocated?
The documentation isn't entirely clear. It says "This method does not cache the image object."
Originally I called CGImageRelease on the CGImageRef after passing it to imageWithCGImage:, but that caused a malloc_error_break warning in the Simulator claiming a double-free was occurring.
According to the fundamental rule of Cocoa memory management, an owning object should release the owned object when it no longer needs it. Other objects are responsible for taking and releasing ownership on their own. If a UIImage needs an object to persist but doesn't retain or copy it, it's a bug in UIImage's implementation and should be reported as such.
I have the same problem in Xcode 3.2.1 on Snow Leopard. I have pretty much the same code as Jason.
I have looked at the iPhone sample code which uses imageWithCGImage and they always release the CGImageRef using CGImageRelease after a call to imageWithCGImage.
So, is this a bug in the simulator? I always get the malloc_error_break warning on the console when I use CGImageRelease.
UIImage's ownership of the CGImage is unclear. It seems not to copy the CGImage, but that's not guaranteed by the documentation, so we have to handle it ourselves.
I used a new subclass of UIImage to handle this problem: it just retains the passed-in CGImage and releases it in dealloc.
Here is a sample:
@interface EonilImage : UIImage
{
@private
    CGImageRef sourceImage;
}
- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation;
@end

@implementation EonilImage

- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
{
    self = [super initWithCGImage:imageRef scale:scale orientation:orientation];
    if (self)
    {
        sourceImage = imageRef;
        CGImageRetain(imageRef);
    }
    return self;
}

- (void)dealloc
{
    CGImageRelease(sourceImage);
    [super dealloc];
}

@end
Because the CGImage returned by the -[UIImage CGImage] property is not guaranteed to be the same CGImage passed into the init method, the class stores the CGImage separately.
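Usage would look something like this; the caller balances its own create/release as usual, since EonilImage holds its own retain until dealloc (ctx is an assumed existing CGContextRef):
CGImageRef cgRef = CGBitmapContextCreateImage(ctx);   // ctx is an existing CGContextRef
EonilImage *image = [[EonilImage alloc] initWithCGImage:cgRef
                                                  scale:1.0f
                                            orientation:UIImageOrientationUp];
CGImageRelease(cgRef);   // safe: EonilImage retained the CGImage in its init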
I agree with you -- the documentation is muddled at best for this API. Based on your experiences, then, I would conclude that you are responsible for the lifetime of both objects - the UIImage and the CGImageRef. On top of that you have to make sure the lifetime of the CGImageRef is at least as long as the UIImage (for obvious reasons).
Are you using Xcode 3.1 on Snow Leopard? I am experiencing the same issue:
CGContextRef ctx = ...;
CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
UIImage * newImage = [[UIImage imageWithCGImage:cgImage] retain];
CGImageRelease(cgImage); // this should be OK, but it now crashes in Simulator on SL
I'm guessing that upgrading to Xcode 3.2 will fix the issue.
Not sure if this helps, but I had a similar problem.
I read the answers and then did the following which appears to have fixed it:
CGImageRef cgImage = [asset thumbnail];
UIImage *thumbImage = [[UIImage imageWithCGImage:cgImage ]retain];
UIImageView *thumbImageView = [[UIImageView alloc] initWithImage:thumbImage];
CGImageRelease(cgImage);
I used the autorelease as suggested but it was not enough.
Once I had added the image to the UIImageView, I then released the cgImage.
It didn't crash and deallocated nicely. Why -- I have no idea but it worked.
The asset thumbnail part is from the ALAsset Library by the way, you might need something else there.
Not in my opinion. I had the same problem. According to the documentation,
UIImage *image = [[UIImage alloc] initWithCGImage:imageRef];
imho this keeps all references correct. If you are using a CGDataProvider, please have a look at
CGDataProviderReleaseDataCallback
and set a breakpoint in your callback. You can see that it is correctly called after you release your image, and you can free() your image data buffer there.
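For example, a minimal sketch of a data provider with a release callback that frees the backing buffer (the buffer and size names are placeholders, not from the answer):
// Core Graphics calls this when the image no longer needs the pixel buffer.
static void ReleasePixelBuffer(void *info, const void *data, size_t size)
{
    free((void *)data);
}

// ... when building the image; pixelBuffer and bufferSize are assumed to exist:
CGDataProviderRef provider =
    CGDataProviderCreateWithData(NULL, pixelBuffer, bufferSize, ReleasePixelBuffer);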
"This method does not cache the image object."
So the UIImage takes ownership, and thus you should not release the CGImageRef.