I have a custom class, Frame, that gets image data from multiple sources and can generate a UIImage from it. The problem is that when a generated UIImage is drawn on screen, the app crashes with EXC_BAD_ACCESS.
The call stack is essentially empty; it ends at start -> main -> UIApplicationMain.
I think it has something to do with CGImageCreate and the pointer not being retained somehow, but I have a hard time figuring out why. The Xcode debugger shows that the UIImage exists right before it's added as a subview through a UIImageView, but right after that it crashes. I've also tried drawing it directly into a custom UIView's drawRect:, but then it crashes with EXC_BAD_ACCESS inside drawRect:.
Any thoughts would be greatly appreciated!
Here's the code:
UIImage *image = [UIImage imageNamed:@"test.png"];
// To NSData
CGImageRef imageRef = image.CGImage;
CFDataRef dataRef = CGDataProviderCopyData(CGImageGetDataProvider(imageRef));
const unsigned char *pixels = CFDataGetBytePtr(dataRef);
const signed long length = CFDataGetLength(dataRef);
NSData *data = [NSData dataWithBytes:pixels length:length];
CGFloat width = CGImageGetWidth(imageRef);
CGFloat height = CGImageGetHeight(imageRef);
CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
// NSData to UIImage
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGDataProviderRef dataProviderRef =
CGDataProviderCreateWithData(NULL, data.bytes, data.length, NULL);
CGImageRef imageRef2 =
CGImageCreate(width, height, 8, 32, 4 * width, colorSpaceRef, bitmapInfo,
dataProviderRef, NULL, NO, kCGRenderingIntentDefault);
UIImage *image2 = [UIImage imageWithCGImage:imageRef2];
CGColorSpaceRelease(colorSpaceRef);
CGDataProviderRelease(dataProviderRef);
CGImageRelease(imageRef);
//Show UIImage
UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.view.frame];
imageView.image = image2;
//Breakpoint here shows that `image2` is equal to `image`
[self.view addSubview:imageView];
//EXC_BAD_ACCESS
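For reference, a sketch of what I suspect is going wrong (a guess, not a confirmed diagnosis): CGDataProviderCreateWithData does not copy or retain the bytes it is handed, so if data is deallocated before the image is drawn, drawing reads freed memory; also, CGImageRelease(imageRef) releases a CGImage that is owned by image rather than one created here. A minimal sketch using CGDataProviderCreateWithCFData, which does retain its backing data (the __bridge cast assumes ARC):
// CGDataProviderCreateWithCFData retains the CFData, so the pixel bytes
// stay alive for as long as the provider (and the CGImage) needs them.
CGDataProviderRef dataProviderRef =
    CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGImageRef imageRef2 =
    CGImageCreate(width, height, 8, 32, 4 * width, colorSpaceRef, bitmapInfo,
                  dataProviderRef, NULL, NO, kCGRenderingIntentDefault);
UIImage *image2 = [UIImage imageWithCGImage:imageRef2];
CGColorSpaceRelease(colorSpaceRef);
CGDataProviderRelease(dataProviderRef);
CGImageRelease(imageRef2);   // release the image created here...
// ...but do not CGImageRelease(imageRef): that CGImage belongs to `image`.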
Related question:
Instruments shows that CGDataProviderCopyData is using too much memory. How can I fix this?
-(UIImage*)imageNamed:(NSString*)name {
UIImage *uiimage = [UIImage imageNamed:name];
CGImageRef originalImage = [uiimage CGImage];
CFDataRef imageData = CGDataProviderCopyData(
CGImageGetDataProvider(originalImage));
CGDataProviderRef imageDataProvider = CGDataProviderCreateWithCFData(imageData);
CFRelease(imageData);
CGImageRef image = CGImageCreate(
CGImageGetWidth(originalImage),
CGImageGetHeight(originalImage),
CGImageGetBitsPerComponent(originalImage),
CGImageGetBitsPerPixel(originalImage),
CGImageGetBytesPerRow(originalImage),
CGImageGetColorSpace(originalImage),
CGImageGetBitmapInfo(originalImage),
imageDataProvider,
CGImageGetDecode(originalImage),
CGImageGetShouldInterpolate(originalImage),
CGImageGetRenderingIntent(originalImage));
CGDataProviderRelease(imageDataProvider);
return [UIImage imageWithCGImage:image];
}
Finally, I was able to solve this. There were actually a few extra, unnecessary steps that were causing memory leaks. Here is the updated function:
-(UIImage*)imageNamed:(NSString*)name {
UIImage *uiimage = [UIImage imageNamed:name];
CGImageRef originalImage = [uiimage CGImage];
CGImageRef image = CGImageCreate(
CGImageGetWidth(originalImage),
CGImageGetHeight(originalImage),
CGImageGetBitsPerComponent(originalImage),
CGImageGetBitsPerPixel(originalImage),
CGImageGetBytesPerRow(originalImage),
CGImageGetColorSpace(originalImage),
CGImageGetBitmapInfo(originalImage),
CGImageGetDataProvider(originalImage),
CGImageGetDecode(originalImage),
CGImageGetShouldInterpolate(originalImage),
CGImageGetRenderingIntent(originalImage));
return [UIImage imageWithCGImage:image];
}
CGImageGetDataProvider(originalImage) returns a CGDataProviderRef, which is exactly what CGImageCreate needs as its 8th parameter. The earlier steps of copying the image data and creating a new CGDataProviderRef were unnecessary.
The problem is that you never release image.
Update your code as follows:
-(UIImage*)imageNamed:(NSString*)name {
UIImage *uiimage = [UIImage imageNamed:name];
CGImageRef originalImage = [uiimage CGImage];
CFDataRef imageData = CGDataProviderCopyData(
CGImageGetDataProvider(originalImage));
CGDataProviderRef imageDataProvider = CGDataProviderCreateWithCFData(imageData);
CFRelease(imageData);
CGImageRef image = CGImageCreate(
CGImageGetWidth(originalImage),
CGImageGetHeight(originalImage),
CGImageGetBitsPerComponent(originalImage),
CGImageGetBitsPerPixel(originalImage),
CGImageGetBytesPerRow(originalImage),
CGImageGetColorSpace(originalImage),
CGImageGetBitmapInfo(originalImage),
imageDataProvider,
CGImageGetDecode(originalImage),
CGImageGetShouldInterpolate(originalImage),
CGImageGetRenderingIntent(originalImage));
CGDataProviderRelease(imageDataProvider);
UIImage *result = [UIImage imageWithCGImage:image];
CGImageRelease(image);
return result;
}
I have a CGImageRef (the variable quartzImage in my code). How do I convert it to PNG data for the web, in this format:
"data:image/png;base64," + base64-encoded image data
My code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
NSLog(@"%@", quartzImage);
}
If you already have a CGImageRef (named quartzImage in your code), then you do not need to create an NSImage. Create an NSBitmapImageRep directly. And you should in no case use the lockFocus method: it is meant for images that will be drawn to the screen, so it usually creates images with a resolution of 72 dpi (144 dpi on Retina screens). Or do you really want to create images for the web with the properties of your screen? Try this:
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage:quartzImage];
NSData *repData = [bitmapRep representationUsingType:NSPNGFileType properties:nil];
NSString *base64String = [repData base64EncodedStringWithOptions:0];
The base64EncodedStringWithOptions: method is not available before OS X 10.9; in that case you should use base64Encoding instead.
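To produce the data: URI from the question, the base64 string just needs the scheme prefix prepended, for example:
// Assemble the "data:image/png;base64,..." string the question asks for.
NSString *dataURI = [@"data:image/png;base64," stringByAppendingString:base64String];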
For comparison, the lockFocus-based approach looks like this:
NSImage *image = [NSImage imageWithCGImage:imageRef];
[image lockFocus];
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, image.size.width, image.size.height)];
[image unlockFocus];
NSData *imageData = [bitmapRep representationUsingType:NSPNGFileType properties:nil];
NSString *base64String = [imageData base64EncodedStringWithOptions:0];
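For an iOS (UIKit) project like the capture code above, a similar conversion can be sketched like this (assuming the CGImageRef is still named quartzImage):
// Wrap the CGImageRef, encode it as PNG, then base64-encode the bytes.
UIImage *uiImage = [UIImage imageWithCGImage:quartzImage];
NSData *pngData = UIImagePNGRepresentation(uiImage);
NSString *base64 = [pngData base64EncodedStringWithOptions:0];   // iOS 7+
NSString *dataURI = [@"data:image/png;base64," stringByAppendingString:base64];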
I'm using an AVCaptureVideoDataOutput along with its delegate method to manipulate video frames. In the delegate method, I use the sampleBuffer to create a CIImage; from there I crop the CIImage, convert it to a UIImage, and display it. Unfortunately, I need to determine the file size of this new UIImage, but it's returning 0. The code works, the image is cropped beautifully, everything. I just don't see why it has no data!
Why might this be? Relevant code follows:
//In delegate method, given sampleBuffer...
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
sampleBuffer, kCMAttachmentMode_ShouldPropagate);
CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
options:(NSDictionary *)attachments];
...
dispatch_async(dispatch_get_main_queue(), ^(void) {
CGRect rect = [self drawFaceBoxesForFeatures:features forVideoBox:clap
orientation:curDeviceOrientation];
CIImage *cropped = [ciImage imageByCroppingToRect:rect];
UIImage *image = [[UIImage alloc] initWithCIImage:cropped];
NSData *data = UIImageJPEGRepresentation(image, 1);
NSLog(@"Image size is %lu", (unsigned long)data.length); //returns 0???
[imageView setImage:image];
[image release];
});
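One common explanation, offered here as an assumption rather than something the answer below states: a UIImage created with initWithCIImage: has no CGImage backing, so UIImageJPEGRepresentation can produce empty data. Rendering the CIImage through a CIContext first gives the UIImage real bitmap data, roughly like this:
// Render the cropped CIImage into a CGImage so the UIImage is bitmap-backed.
CIContext *ciContext = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [ciContext createCGImage:cropped fromRect:cropped.extent];
UIImage *backedImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
NSData *jpegData = UIImageJPEGRepresentation(backedImage, 1); // now non-empty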
I had the same problem, but with simple filtered images.
I stumbled upon the following and it solved the issue; after this, I was able to save my image.
CGSize size = self.originalImage.size;
CGRect rect;
rect.origin = CGPointZero;
rect.size = size;
UIGraphicsBeginImageContext(size);
[[UIImage imageWithCIImage:self.filteredImage] drawInRect:rect];
UIImage * image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData * jpegData = UIImageJPEGRepresentation(image, 1.0);
But I only needed these two lines inside the "ImageContext".
I have some well-working code that reads the screen or an offscreen buffer and saves the result to the iPad photo album as a PNG with transparency. The images appear perfectly when viewed in the iPad photo viewer or any other image viewer. However, within the iPad's native Photos app, the thumbnails show portions of other images from the album in the transparent sections of the thumbnail.
Has anyone else experienced this problem, and if so, found a fix for it? Here's my offscreen (partial) code for generating the images:
EAGLContext *myContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
[EAGLContext setCurrentContext:myContext];
[... set up render buffer code removed for display ...]
[EAGLContext setCurrentContext:myContext];
ImageTextureManager *imageManager = [[ImageTextureManager alloc] init];
[imageManager loadImageTexture:gAppModel.currentImageRef];
[imageManager release];
glBindRenderbufferOES(GL_RENDERBUFFER_OES, offscreenColorRenderbuffer);
[self renderTransformedImage]; // render the image to the buffer
[myContext presentRenderbuffer:GL_RENDERBUFFER_OES];
// grab image from frameBuffer and return it as UIImage
NSInteger x = 0, y = 0;
NSInteger dataLength = width * height * 4;
GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte));
glPixelStorei(GL_PACK_ALIGNMENT, 4);
glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);
CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace,
kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
ref, NULL, true, kCGRenderingIntentDefault);
UIGraphicsBeginImageContext(CGSizeMake(width, height));
CGContextRef cgcontext = UIGraphicsGetCurrentContext();
CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, width, height), iref);
UIImage *image = UIGraphicsGetImageFromCurrentImageContext(); // this call creates an AutoRelease UIImage
NSData* imdata = UIImagePNGRepresentation(image); // get PNG representation
UIImage* myImagePNG = [UIImage imageWithData:imdata]; // wrap UIImage around PNG representation
UIImageWriteToSavedPhotosAlbum(myImagePNG, nil, nil, nil);
UIGraphicsEndImageContext();
Thanks to medvedNick for his offscreen rendering code: Drawing into OpenGL ES framebuffer and getting UIImage from it on iPhone
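A side note about the snippet above (an observation, not a fix for the thumbnail problem): the intermediate Core Graphics objects and the pixel buffer are never released, which adds up if the capture runs more than once. A sketch of the cleanup, assuming it is placed after UIGraphicsEndImageContext():
// Balance the create/malloc calls above; the UIImage produced by
// UIGraphicsGetImageFromCurrentImageContext no longer depends on these.
CGImageRelease(iref);
CGDataProviderRelease(ref);
CGColorSpaceRelease(colorspace);
free(data);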
I am using this function to create screenshots of my iPad app. I am using the Sparrow Framework in my project; SPDisplayObject uses OpenGL ES-based rendering.
@implementation SPDisplayObject (ScreenshotFromSPDisplayObject)
- (UIImage *)getImageScreenshot{
int WIDTH = 1024;
int HEIGHT = 768;
CGSize size = CGSizeMake(WIDTH,HEIGHT);
//Create a buffer for pixels
GLuint bufferLength = size.width*size.height*4;
GLubyte *buffer = (GLubyte *) malloc(bufferLength);
//Read Pixels from OpenGL
glReadPixels(0,0,size.width,size.height,GL_RGBA,GL_UNSIGNED_BYTE,buffer);
//Make data provider with data.
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, bufferLength, NULL);
//Configure image
int bitsPerComponent = 8;
int bitsPerPixel = 32;
int bytesPerRow = 4 * size.width;
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
CGImageRef iref = CGImageCreate(size.width,size.height,bitsPerComponent,bitsPerPixel,bytesPerRow,colorSpaceRef,bitmapInfo,provider,NULL,NO,renderingIntent);
uint32_t *pixels = (uint32_t *)malloc(bufferLength);
CGContextRef context = CGBitmapContextCreate(pixels, WIDTH, HEIGHT, 8, WIDTH*4, CGImageGetColorSpace(iref), kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGContextTranslateCTM(context,0, size.height);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextDrawImage(context, CGRectMake(0.0, 0.0, size.width, size.height), iref);
UIImage* screenshot = [UIImage imageWithCGImage:CGBitmapContextCreateImage(context)];
UIGraphicsEndImageContext();
//free memory
CGColorSpaceRelease(colorSpaceRef);
CGDataProviderRelease(provider);
CGImageRelease(iref);
CGContextRelease(context);
free(buffer);
free(pixels);
return screenshot;
}
@end
I use it like this from a UIViewController:
@interface
UIImageView *screenShot;
UIImage *tempImage;
-(void) deactivePage
{
// attach screenshot
tempImage = [self.stage getImageScreenshot];
screenShot = [[UIImageView alloc] initWithFrame:CGRectMake(0,0,1024,768)];
screenShot.image = tempImage;
[self.view addSubview:screenShot];
}
- (void)dealloc
{
screenShot.image = nil;
[screenShot removeFromSuperview];
[screenShot release];
[super dealloc];
}
The UIViewController is released and deallocated approximately 5 seconds after the deactivePage method is called.
The Screenshot is used for a View Transition.
Taking screenshots works like a charm, but with every screenshot my app grows by around 10 MB, so I can do this around 15 times before the app crashes.
So where is the leak? I am stuck. :-(
In the getImageScreenshot function you do this:
UIImage* screenshot = [UIImage imageWithCGImage:CGBitmapContextCreateImage(context)];
which creates a CGImageRef and then creates an autoreleased UIImage from it.
What happens here is that the CGImageRef remains alive and is never released, so it leaks.
What you should do instead is this:
CGImageRef myCGImage = CGBitmapContextCreateImage(context);
UIImage* screenshot = [UIImage imageWithCGImage:myCGImage];
CGImageRelease(myCGImage);
Have you tried looking at it with Instruments (Leaks or Heapshots)? You should see these CGImageRef objects still alive.
I also don't see where you release tempImage in the UIViewController when it is torn down.