Clear memory used by UIImage -- UIImageView.image = nil doesn't do it - objective-c

I have created a UIScrollView that contains photos, like in the Photos app.
The photos are downloaded, and each image is set with:
imageView.image = [UIImage imageWithData:downloadedImageData];
To save memory, when the user scrolls far away from a photo, I set:
imageView.image = nil;
This doesn't clear the memory, though, and the Leaks instrument doesn't report any leaks.
Can anyone recommend how to reduce the memory used by these images?

[UIImage imageWithData:] returns an autoreleased object, so it is only freed when the autorelease pool drains later in the run loop. To release it deterministically, use the alloc/init form instead.
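A minimal sketch of that suggestion, assuming a manual-reference-counting project (this answer predates ARC): with alloc/init you own the reference and can balance it yourself, so the bitmap goes away as soon as the image view lets go of it.
UIImage *image = [[UIImage alloc] initWithData:downloadedImageData];
imageView.image = image;   // the image view retains the image while it displays it
[image release];           // balance our alloc; we no longer own the image

// later, when the user scrolls far away from this photo:
imageView.image = nil;     // the image view releases its reference and the bitmap is freed
Under ARC the equivalent trick is to wrap the decoding in an explicit @autoreleasepool { ... } block, so the temporary object is drained promptly instead of at the end of the run loop iteration.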

Related

NSImage drawInRect and NSView cacheDisplayInRect memory retained

The application I am working on processes images. The user drops at most 4 images, and the app lays them out based on the template the user selects. One image might be used 2-3 times in the final view.
Each image in the layout is drawn into an NSView (in the view's drawRect: method, using drawInRect:). The final image (the composite of all the laid-out images) is created by rendering the NSView as an image, and that all works very well.
The problem I am facing is that memory is retained by the app once all processing is done. Using the Allocations instrument I don't see any leaks, but "Persistent bytes" keeps growing with each session in the app, and one user reported usage in the GB range. Please see the screenshot.
When I investigated further in Instruments, I saw the code paths below causing the retained memory; all of them relate to ImageIO and Core Image. See the Instruments output below:
However, this only seems to be a problem on 10.10 and later. I tested the same version of the app on 10.9.x and memory usage stays within 60 MB; during a session it rises to about 200 MB, but once the session is done it drops back to 50-60 MB, which is usual for this kind of app.
[_photoImage drawInRect: self.bounds fromRect: NSZeroRect operation: NSCompositeSourceOver fraction: 1.0 respectFlipped: YES hints: nil];
_photoImage = nil;
The code above is what I use to draw the image in the NSView's drawRect: method; the code shown in the screenshot is used to render the NSView as an image.
Update: After further investigation I found that it is CGImageSourceCreateWithData that is caching the TIFF data of the NSImage. Moreover, I am using the code below to crop the image, and if I take it out, memory consumption behaves fine.
NSData *imgData = [imageToCrop TIFFRepresentation];
CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)imgData, NULL);
CGImageRef maskRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
CGImageRef imageRef = CGImageCreateWithImageInRect(maskRef, rect);
NSImage *cropped = [[NSImage alloc] initWithCGImage: imageRef size:rect.size];
CGImageRelease(maskRef);
CGImageRelease(imageRef);
CFRelease(source);
//CFRelease( options );
imgData = nil;
I have also tried explicitly setting kCGImageSourceShouldCache to false (even though it is supposed to be false by default), but with the same results.
Please help to solve the memory retention issue.
Finally, after lots of debugging, it turned out that CGImageSourceCreateWithData was retaining the NSImage's TIFF data somewhere. When I changed this line:
CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)imgData, NULL);
to
CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)[NSURL fileURLWithPath:path], NULL);
everything started working fine, and the app's memory usage dropped from 300 MB (for 6 images) to 50-60 MB, and that behaviour is now consistent.
Apart from the above change, something was still retaining memory, so to get rid of that I set each layer's image to nil once all processing is done, and that works like a charm. I was under the impression that setting the parent to nil would release the images as well, but that was not happening.
Anyway, if anyone sees this issue with drawInRect: or cacheDisplayInRect:, make sure to clear out the image if it is not needed later on.
Update 2nd July 2016
I found that kCGImageSourceShouldCache defaults to false on 32-bit and true on 64-bit. I was able to release the memory by setting it to false with the code below.
const void *keys[] = { kCGImageSourceShouldCache};
const void *values[] = { kCFBooleanFalse};
CFDictionaryRef optionsDictionary = CFDictionaryCreate(NULL, keys, values, 1, NULL, NULL);
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)[image TIFFRepresentation], optionsDictionary);
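Putting the fragments above together, a sketch of a complete crop helper with caching disabled and every Core Foundation object balanced with a release. The method name is mine, and I have used the standard CFType dictionary callbacks instead of NULL:
- (NSImage *)croppedImage:(NSImage *)imageToCrop toRect:(CGRect)rect
{
    NSData *imgData = [imageToCrop TIFFRepresentation];

    const void *keys[]   = { kCGImageSourceShouldCache };
    const void *values[] = { kCFBooleanFalse };
    CFDictionaryRef options = CFDictionaryCreate(NULL, keys, values, 1,
                                                 &kCFTypeDictionaryKeyCallBacks,
                                                 &kCFTypeDictionaryValueCallBacks);

    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imgData, options);
    CGImageRef fullImage  = CGImageSourceCreateImageAtIndex(source, 0, NULL);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(fullImage, rect);

    NSImage *cropped = [[NSImage alloc] initWithCGImage:croppedRef size:rect.size];

    // balance every Create/Copy with a release
    CGImageRelease(croppedRef);
    CGImageRelease(fullImage);
    CFRelease(source);
    CFRelease(options);

    return cropped;
}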
Hope it helps someone.

Xcode: Array of images fails to instantiate on device

I've got an app that works perfectly in the simulator but fails on my iPod Touch (4th gen), and I'd like to know why. The failing part is a simple interactive menu: the root of a UINavigationController shows six pictures, then pushes a view controller that instantiates an array of food images, creates a view holding all the images side by side, and scrolls the viewing area to the image corresponding to the one tapped in the root view. When I run it on the device, the array only ends up with pointers to two images, and an exception is thrown when the array is used to lay the images out side by side.
//code from the pushed view controller
- (void)setupScrollView:(UIScrollView*)scrMain {
// we have 6 images here.
// we will add all images into a scrollView & set the appropriate size.
NSArray *array = [NSArray arrayWithObjects:
[UIImage imageNamed:@"shrimpquesadilla.jpg"],
[UIImage imageNamed:@"pulledpork.jpg"],
[UIImage imageNamed:@"filetMignon.jpg"],
[UIImage imageNamed:@"Reuben.jpg"],
[UIImage imageNamed:@"cajunshrimp.jpg"],
[UIImage imageNamed:@"primerib.jpg"], nil];
NSLog(@"stuff: %@", array);
for (int i=1; i<=6; i++) {
UIImage *image = [array objectAtIndex:(i-1)];
UIImageView *imgV = [[UIImageView alloc]
initWithFrame:CGRectMake((i-1)*scrMain.frame.size.width,
0, scrMain.frame.size.width, (scrMain.frame.size.height - 90))];
imgV.contentMode=UIViewContentModeScaleToFill;
[imgV setImage:image];
imgV.tag=i+1;
[scrMain addSubview:imgV];
}
[scrMain setContentSize:CGSizeMake(scrMain.frame.size.width*6,
scrMain.frame.size.height)];
[scrMain scrollRectToVisible:CGRectMake(self.count*scrMain.frame.size.width,
0, scrMain.frame.size.width, scrMain.frame.size.height) animated:YES];
}
The output of the NSLog when run through the simulator:
2012-08-20 09:51:23.812 DemoTabbed[1545:11603] stuff: (
"<UIImage: 0x7931150>",
"<UIImage: 0x6e63270>",
"<UIImage: 0x6e67700>",
"<UIImage: 0x6e68040>",
"<UIImage: 0x6e5c700>",
"<UIImage: 0x6e64210>"
)
And the output when run on the device:
2012-08-20 10:26:50.211 DemoTabbed[2128:707] stuff: (
"<UIImage: 0x197e20>",
"<UIImage: 0x181270>"
)
And then there's the standard error for the index being out of bounds. I don't know if it's relevant, but two of my icons aren't loading on the device either, though they work on the simulator. Let me know if you need any more code or have questions about the app or its behavior; I'd be happy to add more.
EDIT: I have tried rearranging the order in which the images are put into the array, and nothing changed. The output still showed that the array only pointed to two images.
iOS has a case-sensitive file system. You have a case problem with the @"filetMignon.jpg" file, which makes it resolve to a nil image and ends the array early (arrayWithObjects: stops at the first nil).
To fix that, make sure the image files are named with the same case you use to load them (a better idea is to always use lowercase image names).
It's not an issue in the Simulator, because OS X uses a case-insensitive file system (in 99% of cases), meaning @"filetMignon.jpg" and @"filetmignon.jpg" resolve to the same file.
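If you want the mismatch to surface immediately on the device instead of as an out-of-bounds crash later, here is a small defensive sketch (the lowercased file names and loop are purely illustrative):
NSArray *names = @[@"shrimpquesadilla.jpg", @"pulledpork.jpg", @"filetmignon.jpg",
                   @"reuben.jpg", @"cajunshrimp.jpg", @"primerib.jpg"];
NSMutableArray *images = [NSMutableArray arrayWithCapacity:names.count];
for (NSString *name in names) {
    UIImage *image = [UIImage imageNamed:name];
    if (image) {
        [images addObject:image];
    } else {
        // a case mismatch (or missing resource) shows up here instead of
        // silently truncating an arrayWithObjects: list at the first nil
        NSLog(@"Could not load image named %@", name);
    }
}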

Memory management with image saved from UIImagePickerController

I'm writing an app in which the user takes a photo of themselves and then goes through a series of views to adjust the image using a navigation controller. This works perfectly fine if the user takes the photo with the front camera (the default on devices that support it), but when I repeat the process with the back camera I get about halfway through and it crashes after throwing a memory warning.
After profiling in Instruments I see that my app's memory footprint holds at about 20-25 MB when using the lower-resolution front camera image, but when using the back camera every view change adds another 33 MB or so until it crashes at about 350 MB (on a 4S).
Below is the code I'm using to save the photo to the documents directory and then read that file back to set the image on a UIImageView. The "read" portion of this code is reused in several view controllers (in viewDidLoad) to set the saved image as the background image of each view as I go.
I have removed all my image modification code to strip this down to the bare minimum in an attempt to isolate the problem, and I still can't find it. As it stands right now, all the app does is take a photo in the first view and then use that photo as the background image for about 10 more views, allocating them as the user navigates through the view stack.
Now obviously the higher-resolution photos use more memory, but what I don't understand is why the low-resolution photos don't seem to use more and more memory as I go, whereas the high-resolution photos continuously use more and more until the crash.
How I am saving and reading the image:
- (void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
jpgData = UIImageJPEGRepresentation(image, 1);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0];
filePath = [documentsPath stringByAppendingPathComponent:@"image.jpeg"];
[jpgData writeToFile:filePath atomically:YES];
[self dismissModalViewControllerAnimated:YES];
[disableNextButton setEnabled:YES];
jpgData = [NSData dataWithContentsOfFile:filePath];
UIImage *image2 = [UIImage imageWithData:jpgData];
[imageView setImage:image2];
}
Now I know that I could try scaling the image before I save it, which I plan on looking into next, but I don't see why this doesn't work as is. Maybe I was falsely under the impression that ARC automatically deallocated views and their subviews when they leave the top of the stack.
Can anyone shed some light on why I'm stockpiling my device's memory? (Hopefully it's something simple I'm completely overlooking.) Did I somehow manage to throw ARC out the window?
EDIT: How I load the image in my other views:
- (void)loadBackground
{
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0];
NSString *filePath = [documentsPath stringByAppendingPathComponent:@"image.jpeg"];
UIImage *image = [UIImage imageWithContentsOfFile:filePath];
[backgroundImageView setImage:image];
}
How navigation between my view controllers is established:
EDIT 2:
What my basic declarations look like:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface PhotoPickerViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
{
IBOutlet UIImageView *imageView;
NSData *jpgData;
NSString *filePath;
UIImagePickerController *imagePicker;
IBOutlet UIBarButtonItem *disableNextButton;
}
@end
If relevant, how I call up my image picker:
- (void)callCameraPicker
{
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == YES)
{
NSLog(@"Camera is available and ready");
imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
imagePicker.delegate = self;
imagePicker.allowsEditing = NO;
imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices)
{
if([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2.0)
{
imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
}
}
imagePicker.modalTransitionStyle = UIModalTransitionStyleCoverVertical;
[self presentModalViewController:imagePicker animated:YES];
}
else
{
NSLog(@"Camera is not available");
UIAlertView *cameraAlert = [[UIAlertView alloc] initWithTitle:@"Error"
message:@"Your device doesn't seem to have a camera!"
delegate:self cancelButtonTitle:@"Dismiss"
otherButtonTitles:nil];
[cameraAlert show];
}
}
EDIT 3: I logged viewDidUnload, and it was in fact not being called, so I'm now calling loadBackground in viewWillAppear and setting my backgroundImageView to nil in viewDidDisappear. I expected this to help, but it made no difference.
- (void)viewWillAppear:(BOOL)animated
{
[self loadBackground];
}
- (void)viewDidDisappear:(BOOL)animated
{
NSLog(@"ViewDidDisappear");
backgroundImageView = nil;
}
The relationship between UIImage and UIImageView is not necessarily intuitive for everyone.
UIImage is a high level representation of your image data - alone, it does nothing in terms of displaying the data.
UIImageView works with UIImage to allow you to display an image.
There is no reason why multiple instances of UIImageView cannot display the same UIImage. This is nice and efficient, because there is only one in-memory representation of the image data, being shared by multiple views.
What you seem to be doing is creating a new UIImage for each one of your views, by loading it from disk. So this is a poor general design in two respects: instantiating what is effectively the same UIImage over and over again, and re-loading the same image data from disk repeatedly.
Your memory problem is really a separate issue, where you are not properly releasing the image data you keep loading into UIImage objects and UIImageViews.
In theory, you should be able to take the very first UIImage you're getting from UIImagePickerController and simply pass that reference around to your views, without reloading from disk.
If you need to be saving and reloading from disk because of higher-level functional requirements (e.g. because the image is being changed by the user and you want to keep saving it), you'll need to be sure you are fully tearing down the previous UIView by removing it from its view hierarchy. It is helpful to set a breakpoint in the view's dealloc method to confirm it is being removed and dealloced, and make sure any ivar references to subviews (it appears your backgroundImageView is an ivar) are set to nil. If you are not properly tearing down that backgroundImageView, it continues to hold a reference to the UIImage you set on its image property.
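As a sketch of what passing that single reference around might look like (the AdjustmentViewController class and its sourceImage property are hypothetical names, not taken from the question):

// Hypothetical next screen in the adjustment flow
@interface AdjustmentViewController : UIViewController
@property (nonatomic, strong) UIImage *sourceImage;   // the one shared image
@end

// In the picker delegate, instead of writing to disk and re-reading:
AdjustmentViewController *next = [[AdjustmentViewController alloc] init];
next.sourceImage = image;   // same UIImage instance, no second decode
[self.navigationController pushViewController:next animated:YES];

// In AdjustmentViewController, assign the shared image to the background view:
// backgroundImageView.image = self.sourceImage;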
There are a couple of things that are curious about the code you posted:
None of your view-callback implementations call super. That’s bad! Make extra sure that you are calling super in viewDidUnload and (if you implemented it) didReceiveMemoryWarning.
Make sure you implement didReceiveMemoryWarning in a meaningful way!
You really should not be re-creating that image over and over again! I assume you are not editing the actual image because you use JPEG compression on it which — even at 100% quality — will deteriorate your image with every save…
Check your implementation of viewDidUnload and make sure to set every one of your IBOutlets to nil.
ARC is not Pixie Dust™! It just saves you a bit of typing, it does not free you from designing and maintaining your object graphs!
From your question, I see at the very least these graphs that refer to your image:
image 1 <- image-view 1 <- view-controller 1 <- navigation-controller <- key window <- application
image 1 <- image-view 1 <- view 1 <- view-controller 1 <- navigation-controller <- key window <- application
This is repeated for every view controller, with an index shift on the view controller, view, image view, and image. While you have to have separate views and image views for your view controllers, I cannot think of a reason why you would want several copies of the same image.
So the first axe on your memory consumption clearly is to no longer create all those copies of the same image data — I’d estimate that this will get you half of the low-hanging memory savings.
The next thing is that ARC can only free the memory consumed by your objects once they are no longer referenced.
Memory wise, views are not exactly lightweight objects and when you build up a deep navigation stack you end up with gobs of them.
So you need to axe any unneeded strong references to those views, as well.
The level, at which this has to happen is the view-controller. The latest time at which this should happen is in the view-controller’s implementation of viewDidUnload.
Why the view-controller?
From what you described, the image itself is only referenced by the UIImageView — this is a bad choice, IMHO, but I digress…
UIViewController is designed to "know" when its view is needed and when it is safe to dispose of it; that's why it implements didReceiveMemoryWarning and viewDidUnload:
If memory pressure gets too high and the view controller's view is not "on screen", the root implementation of didReceiveMemoryWarning will let go of its view and then call viewDidUnload on itself.
This is why you must call through to super in your implementations of both of those methods.
In addition, this is why if you have strong IBOutlets that refer to subviews of the view-controller’s view, you must nil them in viewDidUnload or the system cannot reclaim the memory they occupy.
At its heart UIViewController is a big-ass finite state-machine. All of those “something-will/did-whatever” callbacks are used to transition between those states and most of the default implementations do some very important book-keeping to keep all that state in order.
If you are not invoking them in your overrides, you'll end up in inconsistent states, and bad things (like this out-of-memory crash) happen.
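A sketch of that bookkeeping using the outlets declared in the question (pre-iOS 6 style, since the discussion relies on viewDidUnload):
- (void)viewDidUnload
{
    [super viewDidUnload];          // keep UIViewController's state machine consistent
    imageView = nil;                // release outlet references so the view can be reclaimed
    disableNextButton = nil;
    backgroundImageView = nil;      // in the controllers that own the background view
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning]; // lets UIKit drop off-screen views
    // release any caches or other data you can recreate later
}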
Just create a separate folder and save your captured images in it. After your operation is done, clear that folder's contents (or delete the folder) using NSFileManager.

How to refresh view in objective-c

I want to create a gallery that loads different images based on the category the user selects. I populate the images into UIImageViews.
My problem when selecting different categories is that it does not clear the previously shown images. This is my code for populating the images:
-(void)refresh{
categoryLabel.text = treatment.treatmentName;
NSMutableArray *galleryImages = [[PatientImage alloc] find:treatment.treatmentId];
int imgCount = [galleryImages count];
for(int i=0;i<imgCount;i++){
PatientImage *apatientImage = [galleryImages objectAtIndex:i];
UIImage *img1 = [UIImage imageNamed:apatientImage.imageBefore];
UIImageView *myImageView = [[UIImageView alloc] initWithImage:img1];
myImageView.contentMode = UIViewContentModeTopRight;
myImageView.frame = CGRectMake(120+i*240,120.0,100.0, 100.0);
[self.view addSubview:myImageView];
UIImage *img2 = [UIImage imageNamed:apatientImage.imageAfter];
UIImageView *myImageView2 = [[UIImageView alloc] initWithImage:img2];
myImageView2.contentMode = UIViewContentModeTopRight;
myImageView2.frame = CGRectMake(120+i*240+300,120.0,100.0, 100.0);
[self.view addSubview:myImageView2];
}
}
First things first: you have some memory leaks there. You are allocating UIImageViews but are not releasing them anywhere after adding them to your view (I don't know whether that applies under ARC, though). The same goes for your mutable array, but I suppose you are releasing it somewhere after the for loop, since it seems you omitted part of the code you posted.
As far as your actual question is concerned, I wouldn't do it this way. I would make the mutable array an instance variable and fill it with my image views. When refresh is called again, I would first call removeFromSuperview on each image view, then empty the array, then repopulate it and add the new subviews to my view. That is the simple way; see the sketch after this answer.
I don't know if you are using ARC, but you should be careful about memory management when using dynamically loaded views. Each time you add a view to another one, you increase its retain count. You must then call release to relinquish your ownership and let the iOS runtime handle the rest.
Also note that view operations like this are expensive in terms of memory. Another way of repopulating the gallery is to simply change the image each existing image view holds; that saves both memory and time. If the number of images to display isn't constant, you can refine your algorithm to swap the images on the already-created image views, then add more image views if necessary or remove the leftover ones.
I hope I helped.
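For reference, a sketch of the first approach, assuming ARC (the galleryImageViews property and its lazy initialisation are my additions; the rest follows the question's code):

// declared on the view controller
@property (nonatomic, strong) NSMutableArray *galleryImageViews;

- (void)refresh
{
    if (self.galleryImageViews == nil) {
        self.galleryImageViews = [NSMutableArray array];
    }

    // tear down the previous gallery before building the new one
    [self.galleryImageViews makeObjectsPerformSelector:@selector(removeFromSuperview)];
    [self.galleryImageViews removeAllObjects];

    categoryLabel.text = treatment.treatmentName;
    NSArray *galleryImages = [[PatientImage alloc] find:treatment.treatmentId];
    for (NSUInteger i = 0; i < galleryImages.count; i++) {
        PatientImage *patientImage = [galleryImages objectAtIndex:i];

        UIImageView *beforeView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:patientImage.imageBefore]];
        beforeView.contentMode = UIViewContentModeTopRight;
        beforeView.frame = CGRectMake(120 + i * 240, 120.0, 100.0, 100.0);
        [self.view addSubview:beforeView];
        [self.galleryImageViews addObject:beforeView];

        UIImageView *afterView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:patientImage.imageAfter]];
        afterView.contentMode = UIViewContentModeTopRight;
        afterView.frame = CGRectMake(120 + i * 240 + 300, 120.0, 100.0, 100.0);
        [self.view addSubview:afterView];
        [self.galleryImageViews addObject:afterView];
    }
}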
Try this at the start of refresh:
[[self.view subviews] makeObjectsPerformSelector:@selector(removeFromSuperview)];
or
for (id subview in self.view.subviews) {
    if ([subview isKindOfClass:[UIImageView class]]) [subview removeFromSuperview];
}
Call [tableView reloadData] if you are using a table view to show your gallery images,
or call the view's
[self.view setNeedsDisplay] method to refresh the view.

imageWithCGImage and memory

If I use [UIImage imageWithCGImage:], passing in a CGImageRef, do I then release the CGImageRef or does UIImage take care of this itself when it is deallocated?
The documentation isn't entirely clear. It says "This method does not cache the image object."
Originally I called CGImageRelease on the CGImageRef after passing it to imageWithCGImage:, but that caused a malloc_error_break warning in the Simulator claiming a double-free was occurring.
According to the fundamental rule of Cocoa memory management, an owning object should release the owned object when it no longer needs it. Other objects are responsible for taking and releasing ownership on their own. If a UIImage needs an object to persist but doesn't retain or copy it, it's a bug in UIImage's implementation and should be reported as such.
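In concrete terms, and consistent with the sample code later in this thread, that ownership rule looks like this (a sketch assuming an existing CGContextRef named context; it is not a ruling on the Simulator behaviour discussed here):
CGImageRef cgImage = CGBitmapContextCreateImage(context); // we own this reference (Create Rule)
UIImage *uiImage = [UIImage imageWithCGImage:cgImage];     // UIImage manages whatever it needs itself
CGImageRelease(cgImage);                                   // balance our Create; we no longer own it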
I have the same problem in Xcode 3.2.1 on Snow Leopard. I have pretty much the same code as Jason.
I have looked at the iPhone sample code which uses imageWithCGImage and they always release the CGImageRef using CGImageRelease after a call to imageWithCGImage.
So, is this a bug in the simulator? I always get the malloc_error_break warning on the console when I use CGImageRelease.
UIImage's ownership of the CGImage it is created from is unclear. It seems that it does not copy the CGImage, but that is not guaranteed by the documentation, so we have to handle it ourselves.
I used a new subclass of UIImage to handle this problem: it simply retains the CGImage that is passed in and releases it in dealloc.
Here is a sample:
@interface EonilImage : UIImage
{
@private
CGImageRef sourceImage;
}
- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation;
@end
@implementation EonilImage
- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
{
self = [super initWithCGImage:imageRef scale:scale orientation:orientation];
if (self)
{
sourceImage = imageRef;
CGImageRetain(imageRef);
}
return self;
}
- (void)dealloc
{
CGImageRelease(sourceImage);
[super dealloc];
}
@end
Because the CGImage returned by the -[UIImage CGImage] property is not guaranteed to be the same CGImage that was passed into the init method, the class stores the CGImage separately.
I agree with you -- the documentation is muddled at best for this API. Based on your experiences, then, I would conclude that you are responsible for the lifetime of both objects - the UIImage and the CGImageRef. On top of that you have to make sure the lifetime of the CGImageRef is at least as long as the UIImage (for obvious reasons).
Are you using Xcode 3.1 on Snow Leopard? I am experiencing the same issue:
CGContextRef ctx = ...;
CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
UIImage * newImage = [[UIImage imageWithCGImage:cgImage] retain];
CGImageRelease(cgImage); // this should be OK, but it now crashes in Simulator on SL
I'm guessing that upgrading to Xcode 3.2 will fix the issue.
Not sure if this helps, but I had a similar problem.
I read the answers and then did the following which appears to have fixed it:
CGImageRef cgImage = [asset thumbnail];
UIImage *thumbImage = [[UIImage imageWithCGImage:cgImage ]retain];
UIImageView *thumbImageView = [[UIImageView alloc] initWithImage:thumbImage];
CGImageRelease(cgImage);
I used the autorelease as suggested but it was not enough.
Once I had added the image to the UIImageView, I then released the cgImage.
It didn't crash and deallocated nicely. Why -- I have no idea but it worked.
The asset thumbnail part is from the ALAsset Library by the way, you might need something else there.
Not in my opinion. I had the same problem. According to the documentation:
UIImage *image = [[UIImage alloc] initWithCGImage:imageRef];
imho keeps all references correct. If you are using a CGDataProvider please have a look at
CGDataProviderReleaseDataCallback
and set a breakpoint in your callback. You can see that it is correctly called after you release your image, and you can free() your image data buffer there.
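A sketch of such a callback (the pixelBuffer and pixelBufferLength names are assumed): Quartz invokes it once nothing references the provider's data any more, which is the right place to free the backing buffer.
static void ReleaseImageBuffer(void *info, const void *data, size_t size)
{
    free((void *)data);   // invoked after the last image using this provider is released
}

// pixelBuffer / pixelBufferLength are assumed to exist elsewhere
CGDataProviderRef provider =
    CGDataProviderCreateWithData(NULL, pixelBuffer, pixelBufferLength, ReleaseImageBuffer);
// ...create the CGImage from the provider, then CGDataProviderRelease(provider)...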
"This method does not cache the image object."
So UIImage takes ownership, and thus you should not release the CGImageRef.