Objective-C: Memory leak with CGDataProviderCopyData

With the help of Instruments, I can see that CGDataProviderCopyData is using too much memory. How can I fix this?
-(UIImage *)imageNamed:(NSString *)name {
    UIImage *uiimage = [UIImage imageNamed:name];
    CGImageRef originalImage = [uiimage CGImage];
    CFDataRef imageData = CGDataProviderCopyData(
        CGImageGetDataProvider(originalImage));
    CGDataProviderRef imageDataProvider = CGDataProviderCreateWithCFData(imageData);
    CFRelease(imageData);
    CGImageRef image = CGImageCreate(
        CGImageGetWidth(originalImage),
        CGImageGetHeight(originalImage),
        CGImageGetBitsPerComponent(originalImage),
        CGImageGetBitsPerPixel(originalImage),
        CGImageGetBytesPerRow(originalImage),
        CGImageGetColorSpace(originalImage),
        CGImageGetBitmapInfo(originalImage),
        imageDataProvider,
        CGImageGetDecode(originalImage),
        CGImageGetShouldInterpolate(originalImage),
        CGImageGetRenderingIntent(originalImage));
    CGDataProviderRelease(imageDataProvider);
    return [UIImage imageWithCGImage:image];
}

I was finally able to solve this. There were actually a few unnecessary extra steps that were causing the memory leak. Here is the updated function:
-(UIImage *)imageNamed:(NSString *)name {
    UIImage *uiimage = [UIImage imageNamed:name];
    CGImageRef originalImage = [uiimage CGImage];
    CGImageRef image = CGImageCreate(
        CGImageGetWidth(originalImage),
        CGImageGetHeight(originalImage),
        CGImageGetBitsPerComponent(originalImage),
        CGImageGetBitsPerPixel(originalImage),
        CGImageGetBytesPerRow(originalImage),
        CGImageGetColorSpace(originalImage),
        CGImageGetBitmapInfo(originalImage),
        CGImageGetDataProvider(originalImage),
        CGImageGetDecode(originalImage),
        CGImageGetShouldInterpolate(originalImage),
        CGImageGetRenderingIntent(originalImage));
    UIImage *result = [UIImage imageWithCGImage:image];
    CGImageRelease(image); // balance CGImageCreate so the copy is not leaked
    return result;
}
CGImageGetDataProvider(originalImage) returns a CGDataProviderRef, which is exactly what CGImageCreate expects as its eighth parameter. The earlier steps that copied the image data and created a new CGDataProviderRef were unnecessary.

The problem is that you never release image.
Update your code as follows:
-(UIImage *)imageNamed:(NSString *)name {
    UIImage *uiimage = [UIImage imageNamed:name];
    CGImageRef originalImage = [uiimage CGImage];
    CFDataRef imageData = CGDataProviderCopyData(
        CGImageGetDataProvider(originalImage));
    CGDataProviderRef imageDataProvider = CGDataProviderCreateWithCFData(imageData);
    CFRelease(imageData);
    CGImageRef image = CGImageCreate(
        CGImageGetWidth(originalImage),
        CGImageGetHeight(originalImage),
        CGImageGetBitsPerComponent(originalImage),
        CGImageGetBitsPerPixel(originalImage),
        CGImageGetBytesPerRow(originalImage),
        CGImageGetColorSpace(originalImage),
        CGImageGetBitmapInfo(originalImage),
        imageDataProvider,
        CGImageGetDecode(originalImage),
        CGImageGetShouldInterpolate(originalImage),
        CGImageGetRenderingIntent(originalImage));
    CGDataProviderRelease(imageDataProvider);
    UIImage *result = [UIImage imageWithCGImage:image];
    CGImageRelease(image);
    return result;
}
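As a rule of thumb, Core Foundation's naming convention decides who releases what: references returned by functions with Create or Copy in their names (CGDataProviderCopyData, CGDataProviderCreateWithCFData, CGImageCreate) are owned by the caller and must be released, while Get functions (CGImageGetDataProvider) return borrowed references that must not be released. A minimal sketch of that pattern, using a hypothetical helper name:
#import <UIKit/UIKit.h>

// Hypothetical helper: wraps a caller-owned CGImageRef in a UIImage and then
// releases the caller's reference, because UIImage keeps its own retain.
static UIImage *UIImageFromOwnedCGImage(CGImageRef ownedImage) {
    UIImage *wrapped = [UIImage imageWithCGImage:ownedImage];
    CGImageRelease(ownedImage); // balance the Create/Copy call that produced ownedImage
    return wrapped;
}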

Related

UIImage from imageWithCGImage doesn't retain pointer (EXC_BAD_ACCESS)

I have a custom class Frame that gets image data from multiple sources. The class can generate a UIImage. The problem is that when a generated UIImage is drawn on the screen, the app crashes with EXC_BAD_ACCESS.
The call stack is empty; it ends at start -> main -> UIApplicationMain.
I think it has something to do with CGImageCreate and the pointer not being retained somehow, but I'm having a hard time figuring out why. The Xcode debugger shows that the UIImage exists right before it's added as a subview through a UIImageView, but right after that it crashes. I've also tried drawing it directly into a custom UIView with drawRect:, but it crashes with EXC_BAD_ACCESS in drawRect: as well.
Any thoughts would be greatly appreciated!
Here's the code:
UIImage *image = [UIImage imageNamed:@"test.png"];
// To NSData
CGImageRef imageRef = image.CGImage;
CFDataRef dataRef = CGDataProviderCopyData(CGImageGetDataProvider(imageRef));
const unsigned char *pixels = CFDataGetBytePtr(dataRef);
const signed long length = CFDataGetLength(dataRef);
NSData *data = [NSData dataWithBytes:pixels length:length];
CGFloat width = CGImageGetWidth(imageRef);
CGFloat height = CGImageGetHeight(imageRef);
CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
// NSData to UIImage
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGDataProviderRef dataProviderRef =
    CGDataProviderCreateWithData(NULL, data.bytes, data.length, NULL);
CGImageRef imageRef2 =
    CGImageCreate(width, height, 8, 32, 4 * width, colorSpaceRef, bitmapInfo,
                  dataProviderRef, NULL, NO, kCGRenderingIntentDefault);
UIImage *image2 = [UIImage imageWithCGImage:imageRef2];
CGColorSpaceRelease(colorSpaceRef);
CGDataProviderRelease(dataProviderRef);
CGImageRelease(imageRef);
// Show UIImage
UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.view.frame];
imageView.image = image2;
// Breakpoint here shows that `image2` is equal to `image`
[self.view addSubview:imageView];
// EXC_BAD_ACCESS
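Two ownership details in the snippet above are likely culprits: imageRef comes from image.CGImage, so it is borrowed and CGImageRelease(imageRef) over-releases it; and CGDataProviderCreateWithData(NULL, data.bytes, ...) does not retain the NSData, so the provider may point at freed bytes by the time the image is drawn. A minimal sketch of one way to address both (not the poster's code, just an illustration using the variables above):
// image.CGImage is borrowed, so do NOT call CGImageRelease(imageRef) on it.
// Copy the pixel bytes into a CFData so the provider keeps them alive on its own:
CFDataRef copiedData = CFDataCreate(kCFAllocatorDefault, (const UInt8 *)data.bytes, (CFIndex)data.length);
CGDataProviderRef dataProviderRef = CGDataProviderCreateWithCFData(copiedData);
CFRelease(copiedData);
// ...then build imageRef2 with CGImageCreate exactly as above, wrap it in a UIImage,
// and release dataProviderRef and imageRef2, which the caller does own.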

CGImageRef Memory leak after release and autoreleasepool

I'm trying to show an image from local content, but for some reason the memory is not freed.
@autoreleasepool {
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    UIImage *largeimage = [UIImage imageWithCGImage:iref scale:[UIScreen mainScreen].scale orientation:(UIImageOrientation)rep.orientation];
    CFRelease(iref);
    self.imageView.image = largeimage;
    largeimage = nil;
}
As suggested, I used
CGImageRelease(imageRef);
but I still got a memory leak. After that I wrapped the code in an
@autoreleasepool {}
block, but that did not solve my problem either.
What should I do?
I think the issue arises when you assign the image to your image view. Can you try resizing the image before assigning it to the image view?
Use this method
- (UIImage *)resizeImage:(UIImage *)sourceImage toSize:(CGSize)newSize
{
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [sourceImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
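Usage, with the imageView and largeimage from the question (the target size here is only an illustration), might look like:
UIImage *smaller = [self resizeImage:largeimage toSize:self.imageView.bounds.size];
self.imageView.image = smaller;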

UIImage from CIImage - Data length is zero?

I'm using an AVCaptureVideoDataOutput along with its delegate method to manipulate video frames. In the delegate method, I use the sampleBuffer to create a CIImage, and from there I crop the CIImage, convert it to a UIImage, and display it. Unfortunately, I need to determine the file size of this new UIImage, but the JPEG data comes back with a length of 0. The code works, the image is cropped beautifully, everything. I just don't see why it has no data!
Why might this be? Relevant code follows:
// In the delegate method, given sampleBuffer...
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
    sampleBuffer, kCMAttachmentMode_ShouldPropagate);
CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
    options:(NSDictionary *)attachments];
...
dispatch_async(dispatch_get_main_queue(), ^(void) {
    CGRect rect = [self drawFaceBoxesForFeatures:features forVideoBox:clap
        orientation:curDeviceOrientation];
    CIImage *cropped = [ciImage imageByCroppingToRect:rect];
    UIImage *image = [[UIImage alloc] initWithCIImage:cropped];
    NSData *data = UIImageJPEGRepresentation(image, 1);
    NSLog(@"Image size is %d", data.length); // returns 0???
    [imageView setImage:image];
    [image release];
});
I had the same problem, but with simple filtered images.
I stumbled upon this and it solved the issue. After this, I was able to save my image.
CGSize size = self.originalImage.size;
CGRect rect;
rect.origin = CGPointZero;
rect.size = size;
UIGraphicsBeginImageContext(size);
[[UIImage imageWithCIImage:self.filteredImage] drawInRect:rect];
UIImage * image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData * jpegData = UIImageJPEGRepresentation(image, 1.0);
But I only needed these two lines inside the image context.
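For the capture code in the question, a similar approach, sketched under the assumption that rendering through a CIContext is acceptable there, is to turn the cropped CIImage into a CGImage before building the UIImage, so UIImageJPEGRepresentation has real bitmap data to encode:
CIContext *ciContext = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [ciContext createCGImage:cropped fromRect:cropped.extent];
UIImage *renderedImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage:fromRect: follows the Create rule
NSData *jpegData = UIImageJPEGRepresentation(renderedImage, 1.0);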

Scaled and merged image not saving properly (iOS)

I've taken a screenshot of a certain part of my window, then scaled it down and merged it with an image that was also scaled down. The problem is that when I go to its path, there's a PNG file of 0 KB, so the image isn't actually being saved. Any ideas? Thanks!
Here's my current code:
// Save images
- (void)saveimages {
    // Save small one (mini)
    NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    UIGraphicsBeginImageContextWithOptions((myImageView.bounds.size), NO, 0.5);
    [myImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *mini = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    float miniVW = VView.image.size.width / 2;
    float miniVH = VView.image.size.height / 2;
    UIImage *miniV = [self imageWithImage:certain.image convertToSize:CGSizeMake(miniVW, miniVH)];
    UIImage *miniTotal = [self mergeIMG:miniV :mini];
    NSString *pngFilePath = [NSString stringWithFormat:@"%@/mini%i.png", docDir, number];
    NSData *data = [NSData dataWithData:UIImagePNGRepresentation(miniTotal)];
    [data writeToFile:pngFilePath atomically:YES];
}
// Merge two images into one
- (UIImage *)mergeIMG:(UIImage *)VextImg :(UIImage *)VintImg {
    // Create a new image from the two
    CGSize newSize = CGSizeMake(VintImg.size.width, VintImg.size.width);
    [VintImg drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    [VextImg drawInRect:CGRectMake(0, 0, newSize.width, newSize.height) blendMode:kCGBlendModeNormal alpha:1.0];
    UIImage *finalIMG = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return finalIMG;
}
// Scale image
- (UIImage *)imageWithImage:(UIImage *)image convertToSize:(CGSize)size {
    // Change size
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *destImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return destImage;
}
mergeIMG never calls UIGraphicsBeginImageContextWithOptions, so there is no current image context when it draws, and UIGraphicsGetImageFromCurrentImageContext returns nil.
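A minimal sketch of the fix, assuming the merged image should match VintImg's size (the original also passes VintImg.size.width for the height, which looks like a typo):
- (UIImage *)mergeIMG:(UIImage *)VextImg :(UIImage *)VintImg {
    CGSize newSize = CGSizeMake(VintImg.size.width, VintImg.size.height);
    // Open an image context before drawing; without it the draw calls go nowhere.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [VintImg drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    [VextImg drawInRect:CGRectMake(0, 0, newSize.width, newSize.height) blendMode:kCGBlendModeNormal alpha:1.0];
    UIImage *finalIMG = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return finalIMG;
}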

CGImage of UIImage returns NULL

I created a function for splitting an image into multiple images, but when I take the CGImage of the UIImage, it returns NULL:
NSArray* splitImage(UIImage *image, NSUInteger pieces) {
    NSLog(@"width: %f, %zu", image.size.width, CGImageGetWidth(image.CGImage));
    NSLog(@"%@", image.CGImage); // returns NULL
    NSMutableArray *tempArray = [[NSMutableArray alloc] initWithCapacity:pieces];
    CGFloat piecesSize = image.size.height / pieces;
    for (NSUInteger i = 0; i < pieces; i++) {
        // take retina displays into account
        CGRect subFrame = CGRectMake(0, i * piecesSize * image.scale, image.size.width * image.scale, piecesSize * image.scale);
        CGImageRef newImage = CGImageCreateWithImageInRect(image.CGImage, subFrame);
        UIImage *finalImage = [UIImage imageWithCGImage:newImage];
        CGImageRelease(newImage);
        [tempArray addObject:finalImage];
    }
    NSArray *finalArray = [NSArray arrayWithArray:tempArray];
    [tempArray release];
    return finalArray;
}
I created the UIImage from a CGImage rendered through a CIContext instead:
CIImage *ciImage = image.CIImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef ref = [context createCGImage:ciImage fromRect:ciImage.extent];
UIImage *newImage = [UIImage imageWithCGImage:ref];
CGImageRelease(ref); // createCGImage:fromRect: follows the Create rule, so release the reference here
And now newImage.CGImage is not nil
The CGImage property will return nil if the UIImage was created from another source such as an IOSurface or a CIImage. To get around this in that particular case, I can create a CGImage from an IOSurface using the C function below and then convert that to a UIImage.
UICreateCGImageFromIOSurface(IOSurfaceRef surface);
Convert the CIImage to a UIImage with this, and its cgImage will not be null:
func convert(cmage: CIImage) -> UIImage {
    let context: CIContext = CIContext(options: nil)
    let cgImage: CGImage = context.createCGImage(cmage, from: cmage.extent)!
    let image: UIImage = UIImage(cgImage: cgImage)
    return image
}
What you have now just creates pieces with a width of 0; you should use two for loops to define both the width and the height of your pieces. A code sample would look like this (not tested yet, but it should work):
for (int height = 0; height < image.size.height; height += piecesSize) {
    for (int width = 0; width < image.size.width; width += piecesSize) {
        CGRect subFrame = CGRectMake(width, height, piecesSize, piecesSize);
        CGImageRef newImage = CGImageCreateWithImageInRect(image.CGImage, subFrame);
        UIImage *finalImage = [UIImage imageWithCGImage:newImage];
        CGImageRelease(newImage);
        [tempArray addObject:finalImage];
    }
}
This happens in some cases when cropping an image. I found a solution like the following; maybe it can help you:
NSData *imageData = UIImageJPEGRepresentation(yourImage, 0.9);
newImage = [UIImage imageWithData:imageData];