Memory leak with Core Graphics - Objective-C

I've been tasked with fixing a memory leak in a custom Objective-C class for a legacy app that uses garbage collection.
The class takes in NSData from a JPEG file and can resize the image to a thumbnail. There is a method that returns a new NSData object containing the newly resized image.
ImageThumber * imgt = [ImageThumber withNSData:dataObjectFromJpeg];
[imgt thumbImage:1024];
NSData * smallImage = [imgt imageData];
[imgt thumbImage:256];
NSData * extraSmallImage = [imgt imageData];
It does what it's supposed to do, but it's been discovered in Instruments that every ImageThumber that's created allocates an ImageIO_jpeg_Data object that's never deallocated.
When created with +withNSData:(NSData *), the class creates a CGImage and stores it in a private CGImageRef instance variable.
I found that if thumbImage: isn't called, the ImageIO_jpeg_Data is deallocated when the ImageThumber is deallocated, which leads me to believe the problem lies somewhere within the thumbImage: method. Calling thumbImage: multiple times doesn't create extra ImageIO_jpeg_Data objects.
I have little experience with Core Graphics and garbage collection.
+(id)SAMImageDataWithNSData:(NSData *)data
{
    SAMImageData * new = [[[self alloc] init] autorelease];
    new.imageData = [NSMutableData dataWithData:data];
    CFDataRef imgData = (CFDataRef)data;
    CGDataProviderRef imgDataProvider = CGDataProviderCreateWithCFData(imgData);
    new->_cgImageRef = CGImageCreateWithJPEGDataProvider(imgDataProvider, NULL, true, kCGRenderingIntentDefault);
    CGDataProviderRelease(imgDataProvider);
    int width = (int)CGImageGetWidth(new->_cgImageRef);
    int height = (int)CGImageGetHeight(new->_cgImageRef);
    new.originalSize = NSMakeSize(width, height);
    return new;
}
-(void)thumbImage:(int)length
{
    /* simple logic to calculate new width and height */

    //If the next line is commented out the problem doesn't exist.
    //You just don't get the image resize.
    [self resizeCGImageToWidth:newSize.width andHeight:newSize.height];

    CFMutableDataRef workingData = (CFMutableDataRef)[[NSMutableData alloc] initWithCapacity:0];
    CGImageDestinationRef dest = CGImageDestinationCreateWithData(workingData, kUTTypeJPEG, 1, NULL);
    CGImageDestinationAddImage(dest, _cgImageRef, NULL);
    CGImageDestinationFinalize(dest);
    CFRelease(dest);
    self.imageData = (NSMutableData *)workingData;
}
This is where I believe the problem exists:
- (void)resizeCGImageToWidth:(int)width andHeight:(int)height {
    CGColorSpaceRef colorspace = CGImageGetColorSpace(_cgImageRef);
    CGContextRef context = CGBitmapContextCreate(NULL, width, height,
                                                 CGImageGetBitsPerComponent(_cgImageRef),
                                                 CGImageGetBytesPerRow(_cgImageRef),
                                                 colorspace,
                                                 kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorspace);
    if (context == NULL)
        return;
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), _cgImageRef);
    _cgImageRef = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
}
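A likely culprit, reading that method: `_cgImageRef` is overwritten without releasing the image it previously pointed at, and under Objective-C garbage collection CF/CG objects are not collected automatically, so the old CGImage (and the ImageIO_jpeg_Data backing it) lives forever. Two smaller issues: `CGImageGetColorSpace` follows the Get rule, so releasing it is an over-release, and the old image's bytes-per-row is wrong once the width changes. A sketch of a corrected version, assuming the same `_cgImageRef` ivar:

```objc
- (void)resizeCGImageToWidth:(int)width andHeight:(int)height {
    // Get rule: we don't own this colorspace, so we must not release it.
    CGColorSpaceRef colorspace = CGImageGetColorSpace(_cgImageRef);
    CGContextRef context = CGBitmapContextCreate(NULL, width, height,
                                                 CGImageGetBitsPerComponent(_cgImageRef),
                                                 0, // 0 = let CG compute bytes-per-row for the NEW width
                                                 colorspace,
                                                 kCGImageAlphaPremultipliedFirst);
    if (context == NULL)
        return;
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), _cgImageRef);
    // Release the old image before overwriting the ivar; GC does not
    // collect CF/CG objects, so without this the decoded JPEG data leaks.
    CGImageRelease(_cgImageRef);
    _cgImageRef = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
}
```

With this change, each call replaces the previous CGImage instead of abandoning it, which matches the observation that the leak only appears once thumbImage: (and therefore this method) runs.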

Related

CGImageRef gets corrupted after returned from a method

My code creates a TIFFRepresentation of an image and I want to re-encode it to something different. This is not problematic.
My ImgUtils function is:
+ (CGImageRef)processImageData:(NSData *)rep {
    NSBitmapImageRep *bitmapRep = [NSBitmapImageRep imageRepWithData:rep];
    int width = bitmapRep.size.width;
    int height = bitmapRep.size.height;
    size_t pixels_size = width * height;
    Byte raw_bytes[pixels_size * 3];
    //
    // processing, creates and stores raw byte stream
    //
    int bitsPerComponent = 8;
    int bytesPerPixel = 3;
    int bitsPerPixel = bytesPerPixel * bitsPerComponent;
    int bytesPerRow = bytesPerPixel * width;
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL,
                                                              raw_bytes,
                                                              pixels_size * bytesPerPixel,
                                                              NULL);
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
    CGImageRef imageRef = CGImageCreate(width,
                                        height,
                                        bitsPerComponent,
                                        bitsPerPixel,
                                        bytesPerRow,
                                        colorSpaceRef,
                                        bitmapInfo,
                                        provider,
                                        NULL,
                                        NO,
                                        renderingIntent);
    [ImgUtils saveToPng:imageRef withSuffix:@"-ok"];
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);
    return imageRef;
}
There is another method that saves a CGImageRef to the filesystem.
+ (BOOL)saveToPng:(CGImageRef)imageRef withSuffix:(NSString *)suffix {
    CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:[NSString stringWithFormat:@"~/Downloads/pic%@.png", suffix]];
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypePNG, 1, NULL);
    CGImageDestinationAddImage(destination, imageRef, nil);
    CGImageDestinationFinalize(destination);
    CFRelease(destination);
    return YES;
}
As you can see, immediately after processing the image, I save it to disk as pic-ok.png.
Here is the code, that calls the processing function:
CGImageRef cgImage = [ImgUtils processImageData:imageRep];
[ImgUtils saveToPng:cgImage withSuffix:@"-bad"];
The problem is that the two images differ. The second one, with the -bad suffix, is corrupted.
See examples below. Seems like the memory area the CGImageRef pointer is pointing to is released and overwritten immediately after returning from the method.
I tried also return CGImageCreateCopy(imageRef); but it changed nothing.
What am I missing?
CGDataProviderCreateWithData() does not copy the buffer you provide. Its purpose is to allow creation of a data provider that accesses that buffer directly.
Your buffer is created on the stack. It goes invalid after +processImageData: returns. However, the CGImage still refers to the provider and the provider still refers to the now-invalid buffer.
One solution would be to create the buffer on the heap and provide a callback via the releaseData parameter that frees it. Another would be to create a CFData from the buffer (which copies it) and then create the data provider using CGDataProviderCreateWithCFData(). Probably the best would be to create a CFMutableData of the desired capacity, set its length to match, and use its storage (CFDataGetMutableBytePtr()) as your buffer from the beginning. That's heap-allocated, memory-managed, and doesn't require any copying.
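A sketch of that last suggestion, using the same `pixels_size` and `bytesPerPixel` values as the method above (the variable names are taken from the question's code):

```objc
// Heap-allocated, CF-managed pixel storage; nothing needs to be copied.
CFMutableDataRef pixelData = CFDataCreateMutable(kCFAllocatorDefault,
                                                 pixels_size * bytesPerPixel);
CFDataSetLength(pixelData, pixels_size * bytesPerPixel);
UInt8 *raw_bytes = CFDataGetMutableBytePtr(pixelData);

// ... fill raw_bytes with the processed pixel stream, as before ...

// The provider retains pixelData, so the buffer safely outlives this method.
CGDataProviderRef provider = CGDataProviderCreateWithCFData(pixelData);
CFRelease(pixelData); // the provider now holds the only reference we need
```

The returned CGImage then refers to heap storage owned by the provider rather than a stack array that dies when the method returns.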

Open CV memory stacked (not released properly)

I am using a 3rd party library for image processing. This method seems to be the cause of large memory usage (+30MB) every time it executes, and the memory is not released properly. Repeated use ends up crashing the app (memory overload). The image used comes directly from the camera of my iPhone 6.
+ (UIImage *)UIImageFromCVMat:(cv::Mat)cvMat
{
    NSData *data = [NSData dataWithBytes:cvMat.data length:cvMat.elemSize() * cvMat.total()];
    CGColorSpaceRef colorSpace;
    if (cvMat.elemSize() == 1) {
        colorSpace = CGColorSpaceCreateDeviceGray();
    } else {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
    CGImageRef imageRef = CGImageCreate(cvMat.cols,                 // Width
                                        cvMat.rows,                 // Height
                                        8,                          // Bits per component
                                        8 * cvMat.elemSize(),       // Bits per pixel
                                        cvMat.step[0],              // Bytes per row
                                        colorSpace,                 // Colorspace
                                        kCGImageAlphaNone | kCGBitmapByteOrderDefault, // Bitmap info flags
                                        provider,                   // CGDataProviderRef
                                        NULL,                       // Decode
                                        false,                      // Should interpolate
                                        kCGRenderingIntentDefault); // Intent
    // UIImage *image = [[UIImage alloc] initWithCGImage:imageRef];
    UIImage *image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return image;
}
I suspect the problem is here: (__bridge CFDataRef)data. I can't use CFRelease on it because that makes the app crash. The project uses ARC.
EDIT:
It seems the same code is also on the official OpenCV website:
http://docs.opencv.org/2.4/doc/tutorials/ios/image_manipulation/image_manipulation.html
Gah!
EDIT 2
Here is how I use it (the code below is actually also part of the 3rd party lib, but I added some lines).
cv::Mat undistorted = cv::Mat(cvSize(maxWidth,maxHeight), CV_8UC4); // here nothing
cv::Mat original = [MMOpenCVHelper cvMatFromUIImage:_adjustedImage]; // here +30MB
//NSLog(@"%f %f %f %f",ptBottomLeft.x,ptBottomRight.x,ptTopRight.x,ptTopLeft.x);
cv::warpPerspective(original, undistorted,
                    cv::getPerspectiveTransform(src, dst), cvSize(maxWidth, maxHeight)); // here +16MB
_cropRect.hidden = YES;
@autoreleasepool {
    _sourceImageView.image = [MMOpenCVHelper UIImageFromCVMat:undistorted]; // here +15MB (PROBLEM)
}
original.release(); // here -30MB (THIS IS OK)
undistorted.release(); // here -16MB (ok)
I guess it is a hard subject, since not many people know OpenCV that well. What I found is that most answers to similar problems involve wrapping the call in @autoreleasepool, but that doesn't seem to release the memory either.
As a temporary solution I resize the image fed to this method by half. At least the app lasts longer before it finally crashes. It just works.
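One way to shrink the peak in that helper (a sketch of an alternative, not the library's code; `UIImageFromCVMatNoCopy:` and `releaseMat` are hypothetical names): drop the intermediate NSData and hand the provider a heap copy of the Mat with a release callback, so the pixel buffer is freed deterministically when the CGImage is released instead of lingering until an autorelease pool drains.

```objc
// Called by the CGDataProvider once nothing references the image data.
static void releaseMat(void *info, const void *data, size_t size) {
    delete static_cast<cv::Mat *>(info); // deleting the Mat frees its pixel buffer
}

+ (UIImage *)UIImageFromCVMatNoCopy:(const cv::Mat &)cvMat {
    // One heap clone keeps the pixels alive exactly as long as CG needs them.
    cv::Mat *heapMat = new cv::Mat(cvMat.clone());
    CGDataProviderRef provider =
        CGDataProviderCreateWithData(heapMat, heapMat->data,
                                     heapMat->elemSize() * heapMat->total(),
                                     releaseMat);
    // ... build the colorspace, CGImageCreate, and UIImage exactly as in
    //     UIImageFromCVMat above, then release provider/imageRef/colorSpace ...
}
```

This removes one full-size buffer from the peak and ties the remaining one to the image's lifetime rather than the autorelease machinery.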

I need help optimizing BGR888 blitting to NSView

This is the best I've come up with for blitting a 24-bit BGR image out to an NSView.
I did trim a significant amount of CPU time by ensuring that the NSWindow host also had the same colorSpace.
I think there are 4 or 5 pixel copies going on here:
in the vImage conversion (required)
calling CGDataProviderCreateWithData
calling CGImageCreate
creating the NSBitmapImageRep bitmap
in the final blit with drawInRect (required)
Anyone want to chime in on improving it?
Any help would be much appreciated.
{
    // one-time setup code
    CGColorSpaceRef useColorSpace = nil;
    int w = 1920;
    int h = 1080;
    [theWindow setColorSpace:[NSColorSpace genericRGBColorSpace]];
    // setup vImage buffers (not listed here)
    // srcBuffer is my 24-bit BGR image (malloc-ed to be w*h*3)
    // dstBuffer is for the resulting 32-bit RGBA image (malloc-ed to be w*h*4)
    ...
    // this is called at 30-60fps
    if (!useColorSpace)
        useColorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    vImage_Error err = vImageConvert_BGR888toRGBA8888(srcBuffer, NULL, 0xff, dstBuffer, NO, 0);
    CGDataProviderRef newProvider = CGDataProviderCreateWithData(NULL, dstBuffer->data, w*h*4, myReleaseProvider);
    CGImageRef myImageRGBA = CGImageCreate(w, h, 8, 32, w*4, useColorSpace, kCGBitmapByteOrderDefault | kCGImageAlphaLast, newProvider, NULL, false, kCGRenderingIntentDefault);
    CGDataProviderRelease(newProvider);
    // store myImageRGBA in an array of frames (using NSObject wrappers) for later access (setNeedsDisplay:)
    ...
}
- (void)drawRect:(NSRect)dirtyRect
{
    // this is called at 30-60fps
    CGImageRef storedImage = ...; // retrieve from array
    NSBitmapImageRep *repImg = [[NSBitmapImageRep alloc] initWithCGImage:storedImage];
    CGRect myFrame = CGRectMake(0, 0, CGImageGetWidth(storedImage), CGImageGetHeight(storedImage));
    [repImg drawInRect:myFrame fromRect:myFrame operation:NSCompositeCopy fraction:1.0 respectFlipped:TRUE hints:nil];
    // free image from array (not listed here)
}
// this is called when the CGDataProvider is ready to release its data
void myReleaseProvider(void *info, const void *data, size_t size)
{
    if (data) {
        free((void *)data);
        data = nil;
    }
}
Use CGColorSpaceCreateDeviceRGB instead of genericRGB to avoid a colorspace conversion inside CG. Use kCGImageAlphaNoneSkipLast instead of kCGImageAlphaLast, since we know the alpha is opaque, to allow a straight copy instead of a blend.
After you make those changes, it would be useful to run an Instruments time profile on it to show where the time is going.
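Applied to the per-frame code above, those two suggestions amount to (a sketch, keeping the question's variable names):

```objc
// Device RGB avoids a per-frame colorspace conversion inside Core Graphics.
useColorSpace = CGColorSpaceCreateDeviceRGB();
// Declaring the alpha channel opaque lets CG blit with a copy, not a blend.
CGImageRef myImageRGBA = CGImageCreate(w, h, 8, 32, w*4, useColorSpace,
                                       kCGBitmapByteOrderDefault | kCGImageAlphaNoneSkipLast,
                                       newProvider, NULL, false,
                                       kCGRenderingIntentDefault);
```

The vImage output is still RGBA8888 with the alpha byte last; only the interpretation of that byte changes, so no extra conversion is introduced.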

glGrab for screen capture on Mac OS 10.7.3 with XCode 4.3.2

I am trying to integrate the glGrab code for screen capture on Mac OS under the above-mentioned configuration, and I am currently stuck at an all-blue screen being rendered inside my window. I believe there is some issue with how the image texture is created, but I can't tell what. I am just a couple of weeks into OpenGL, so please go easy on me if I missed something obvious.
I am using the glGrab code as it is except CGLSetFullScreen method (and not even CGLSetFullScreenOnDisplay) because these methods are now deprecated. So this one line of code has been commented out for the time being.
I have been doing some research on this topic for some time now and found another thread on Stack Overflow which, while not the complete answer, helped much nonetheless: Convert UIImage to CVImageBufferRef
A direct reference to the glGrab code is http://code.google.com/p/captureme/source/browse/trunk/glGrab.c
The answer to my question above is present below. So no more OpenGL or glGrab; use what's best optimized for Mac OS X. This doesn't include the code for capturing the mouse pointer, but I am sure that if you have landed on this page you're smart enough to figure that out by yourself. Or if someone reading this knows the solution, it's your chance to help the fraternity :) Also, this code returns a CVPixelBufferRef. You may choose to send back either a CGImageRef or even the bytestream as it is; just tweak it to your liking:
void swizzleBitmap(void *data, int rowBytes, int height) {
    int top, bottom;
    void *buffer;
    void *topP;
    void *bottomP;
    void *base;

    top = 0;
    bottom = height - 1;
    base = data;
    buffer = malloc(rowBytes);
    while (top < bottom) {
        topP = (void *)((top * rowBytes) + (intptr_t)base);
        bottomP = (void *)((bottom * rowBytes) + (intptr_t)base);
        bcopy(topP, buffer, rowBytes);
        bcopy(bottomP, topP, rowBytes);
        bcopy(buffer, bottomP, rowBytes);
        ++top;
        --bottom;
    }
    free(buffer);
}
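swizzleBitmap just flips the bitmap vertically, swapping rows from the outside in. A minimal self-contained C demo of the same logic (the tiny 3x3 "image" and the name `flip_rows` are made up for illustration):

```c
#include <stdlib.h>
#include <string.h>

/* Same row-swap logic as swizzleBitmap above, using memcpy instead of bcopy. */
void flip_rows(void *data, int rowBytes, int height) {
    char *base = data;
    char *buffer = malloc(rowBytes); /* scratch space for one row */
    for (int top = 0, bottom = height - 1; top < bottom; ++top, --bottom) {
        memcpy(buffer, base + top * rowBytes, rowBytes);
        memcpy(base + top * rowBytes, base + bottom * rowBytes, rowBytes);
        memcpy(base + bottom * rowBytes, buffer, rowBytes);
    }
    free(buffer);
}
```

Running it on three one-byte-per-pixel rows {1,1,1},{2,2,2},{3,3,3} leaves them in the order {3,3,3},{2,2,2},{1,1,1}.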
CVImageBufferRef grabViaOpenGL() {
    int bytewidth;
    CGImageRef image = CGDisplayCreateImage(kCGDirectMainDisplay); // Main screenshot capture call
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image)); // Get screenshot bounds

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:NO], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:NO], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef)options,
                                          &pxbuffer);
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
                                                 frameSize.height, 8, 4*frameSize.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipLast);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);

    bytewidth = frameSize.width * 4;  // Assume 4 bytes/pixel for now
    bytewidth = (bytewidth + 3) & ~3; // Align to 4 bytes
    swizzleBitmap(pxdata, bytewidth, frameSize.height); // Solution for ARGB madness

    CGColorSpaceRelease(rgbColorSpace);
    CGImageRelease(image);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

Large Image Processing (ARC) Major memory leaks

The iPad app that I am creating has to be able to create the tiles for a 4096x2992 image that is generated earlier in my app.
The 4096x2992 image isn't very complex (what I'm testing with) and when written to file in PNG format is approximately 600KB.
On the simulator this code seems to work fine, however when I run the app in tighter memory conditions (on my iPad) the process quits because it ran out of memory.
I've been using the same code in the app previously and it was working fine (it was only creating tiles for 3072x2244 images, however).
Either I must be doing something stupidly wrong or my @autoreleasepool's aren't working as they should (I think I mentioned that I'm using ARC)... When running in Instruments I can just watch the memory climb up to ~500MB, where it then crashes!
I've analysed the code and it hasn't found a single memory leak related to this part of my app, so I'm really confused about why this is crashing on me.
Just a little history on how my function gets called so you know what's happening... The app uses CoreGraphics to render a UIView (4096x2992) with some UIImageView's inside it, then it sends that UIImage reference into my function buildFromImage: (below) where it begins cutting up/resizing the image to create my file.
Here is the buildFromImage: code... the memory issues build up from within the main loop under NSLog(@"LOG ------------> Begin tile loop ");
-(void)buildFromImage:(UIImage *)__image {
    NSLog(@"LOG ------------> Begin Build ");

    //if the __image is over 4096 wide or 2992 high then we must resize it! (stop crashes etc)
    if (__image.size.width > __image.size.height) {
        if (__image.size.width > 4096) {
            __image = [self resizeImage:__image toSize:CGSizeMake(4096, (__image.size.height * 4096 / __image.size.width))];
        }
    } else {
        if (__image.size.height > 2992) {
            __image = [self resizeImage:__image toSize:CGSizeMake((__image.size.width * 2992 / __image.size.height), 2992)];
        }
    }

    //create preview image (if landscape, no more than 748 high... if portrait, no more than 1004 high) must keep scale
    NSString *temp_archive_store = [[NSString alloc] initWithFormat:@"%@/%i-temp_imgdat.zip", NSTemporaryDirectory(), arc4random()];
    NSString *temp_tile_store = [[NSString alloc] initWithFormat:@"%@/%i-temp_tilestore/", NSTemporaryDirectory(), arc4random()];

    //create the temp dir for the tile store
    [[NSFileManager defaultManager] createDirectoryAtPath:temp_tile_store withIntermediateDirectories:YES attributes:nil error:nil];

    //create each tile and add it to the compressor once it's made
    //size of tile
    CGSize tile_size = CGSizeMake(256, 256);

    //the scales that we will be generating the tiles for
    NSMutableArray *scales = [[NSMutableArray alloc] initWithObjects:[NSNumber numberWithInt:1000], [NSNumber numberWithInt:500], [NSNumber numberWithInt:250], [NSNumber numberWithInt:125], nil]; //scales to loop round over

    NSLog(@"LOG ------------> Begin tile loop ");
    @autoreleasepool {
        //loop through the scales
        for (NSNumber *scale in scales) {
            //scale the image
            UIImage *imageForScale = [self resizedImage:__image scale:[scale intValue]];
            //calculate number of rows...
            float rows = ceil(imageForScale.size.height/tile_size.height);
            //calculate number of columns
            float cols = ceil(imageForScale.size.width/tile_size.width);
            //loop through rows and cols
            for (int row = 0; row < rows; row++) {
                for (int col = 0; col < cols; col++) {
                    NSLog(@"LOG ------> Creating Tile (%i,%i,%i)", col, row, [scale intValue]);
                    //build name for tile...
                    NSString *tile_name = [NSString stringWithFormat:@"%@_%i_%i_%i.png", @"image", [scale intValue], col, row];
                    @autoreleasepool {
                        //build tile for this coordinate
                        UIImage *tile = [self tileForRow:row column:col size:tile_size image:imageForScale];
                        //convert image to png data
                        NSData *tile_data = UIImagePNGRepresentation(tile);
                        [tile_data writeToFile:[NSString stringWithFormat:@"%@%@", temp_tile_store, tile_name] atomically:YES];
                    }
                }
            }
        }
    }
}
Here are my resizing/cropping functions too, as these could also be causing the issue:
-(UIImage *)resizeImage:(UIImage *)inImage toSize:(CGSize)scale {
    @autoreleasepool {
        CGImageRef inImageRef = [inImage CGImage];
        CGColorSpaceRef clrRf = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(NULL, ceil(scale.width), ceil(scale.height),
                                                 CGImageGetBitsPerComponent(inImageRef),
                                                 CGImageGetBitsPerPixel(inImageRef)*ceil(scale.width),
                                                 clrRf, kCGImageAlphaPremultipliedFirst);
        CGColorSpaceRelease(clrRf);
        CGContextDrawImage(ctx, CGRectMake(0, 0, scale.width, scale.height), inImageRef);
        CGImageRef img = CGBitmapContextCreateImage(ctx);
        UIImage *image = [[UIImage alloc] initWithCGImage:img scale:1 orientation:UIImageOrientationUp];
        CGImageRelease(img);
        CGContextRelease(ctx);
        return image;
    }
}
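One thing worth checking in resizeImage:toSize: (an observation from reading the code, not a confirmed diagnosis): the bytes-per-row argument is computed from CGImageGetBitsPerPixel, i.e. bits rather than bytes, so each bitmap context is allocated roughly 8x larger than needed. At 32 bits per pixel and 4096x2992 that is hundreds of MB per context, which alone could explain the ~500MB peak. A sketch of the corrected context creation:

```objc
// bytes per row = (bits per pixel / 8) * width; or pass 0 and let CG choose.
size_t bytesPerRow = (CGImageGetBitsPerPixel(inImageRef) / 8) * ceil(scale.width);
CGContextRef ctx = CGBitmapContextCreate(NULL,
                                         ceil(scale.width), ceil(scale.height),
                                         CGImageGetBitsPerComponent(inImageRef),
                                         bytesPerRow,
                                         clrRf,
                                         kCGImageAlphaPremultipliedFirst);
```

This isn't a leak in the retain/release sense, which would also explain why Analyze finds nothing while Instruments shows the allocation spike.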
- (UIImage *)tileForRow:(int)row column:(int)col size:(CGSize)tileSize image:(UIImage *)inImage
{
    @autoreleasepool {
        //get the selected tile
        CGRect subRect = CGRectMake(col*tileSize.width, row*tileSize.height, tileSize.width, tileSize.height);
        CGImageRef inImageRef = [inImage CGImage];
        CGImageRef tiledImage = CGImageCreateWithImageInRect(inImageRef, subRect);
        UIImage *tileImage = [[UIImage alloc] initWithCGImage:tiledImage scale:1 orientation:UIImageOrientationUp];
        CGImageRelease(tiledImage);
        return tileImage;
    }
}
Now, I never used to be that good with memory management, so I took the time to read up on it and also converted my project to ARC to see if that could address my issues (that was a while ago), but from the results I get after profiling in Instruments I must be doing something STUPIDLY wrong for the memory to leak as badly as it does, and I just can't see what I'm doing wrong.
If anybody can point out anything I may be doing wrong it would be great!
Thanks
Liam
(let me know if you need more info)
I use "UIImage+Resize". Inside @autoreleasepool {} it works fine with ARC in a loop.
https://github.com/AliSoftware/UIImage-Resize
-(void)compress:(NSString *)fullPathToFile {
    @autoreleasepool {
        UIImage *fullImage = [[UIImage alloc] initWithContentsOfFile:fullPathToFile];
        UIImage *compressedImage = [fullImage resizedImageByHeight:1024];
        NSData *compressedData = UIImageJPEGRepresentation(compressedImage, 75.0);
        [compressedData writeToFile:fullPathToFile atomically:NO];
    }
}