Constantly growing memory allocation while fetching images over HTTP in iOS - objective-c

I am implementing an iOS app that needs to fetch a huge number of images over HTTP. I've tried several approaches, but regardless of what I do, Instruments shows constantly increasing memory allocations and the app crashes sooner or later when I run it on a device. Instruments shows no leaks.
So far I have tried the following approaches:
Fetch the images using a synchronous NSURLConnection within an NSOperation
Fetch the images using an asynchronous NSURLConnection within an NSOperation
Fetch the images using [NSData dataWithContentsOfURL:url] on the main thread
Fetch the images using a synchronous ASIHTTPRequest within an NSOperation
Fetch the images using an asynchronous ASIHTTPRequest added to an NSOperationQueue
Fetch the images using an asynchronous ASIHTTPRequest with a completionBlock
The Call Tree in Instruments shows that the memory is consumed while processing the HTTP response. In the case of the asynchronous NSURLConnection, this happens in
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    [receivedData appendData:data];
}
In the case of the synchronous NSURLConnection, Instruments shows a growing CFData (store) entry.
The problem with ASIHTTPRequest seems to be the same as with the asynchronous NSURLConnection, at an analogous position in the code. The [NSData dataWithContentsOfURL:url] approach shows an increasing amount of total memory allocated at exactly that statement.
I am using an NSAutoreleasePool when the request runs in a separate thread, and I have tried to free memory with [[NSURLCache sharedURLCache] removeAllCachedResponses] - no success.
Any ideas/hints to solve the problem? Thanks.
Edit:
The behaviour only shows up if I persist the images using Core Data. Here is the code I run as an NSInvocationOperation:
-(void)_fetchAndSave:(NSString *)imageId {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSString *url = [NSString stringWithFormat:@"%@%@", kImageUrl, imageId];
    HTTPResponse *response = [SimpleHTTPClient GET:url headerOrNil:nil];
    NSData *data = [response payload];
    if (data && [data length] > 0) {
        UIImage *thumbnailImage = [UIImage imageWithData:data];
        NSData *thumbnailData = UIImageJPEGRepresentation([thumbnailImage scaleToSize:CGSizeMake(55, 53)], 0.5); // UIImagePNGRepresentation(thumbnail);
        [self performSelectorOnMainThread:@selector(_save:) withObject:[NSArray arrayWithObjects:imageId, data, thumbnailData, nil] waitUntilDone:NO];
    }
    [pool release];
}
All Core Data related work is done on the main thread here, so there should be no Core Data multithreading issue. However, if I persist the images, Instruments shows constantly increasing memory allocations at the positions described above.
Edit II:
CoreData related code:
-(void)_save:(NSArray *)args {
    NSString *imageId = [args objectAtIndex:0];
    NSData *data = [args objectAtIndex:1];
    NSData *thumbnailData = [args objectAtIndex:2];
    Image *image = (Image *)[[CoreDataHelper sharedSingleton] createObject:@"Image"];
    image.timestamp = [NSNumber numberWithDouble:[[NSDate date] timeIntervalSince1970]];
    image.data = data;
    Thumbnail *thumbnail = (Thumbnail *)[[CoreDataHelper sharedSingleton] createObject:@"Thumbnail"];
    thumbnail.data = thumbnailData;
    thumbnail.timestamp = image.timestamp;
    [[CoreDataHelper sharedSingleton] save];
}
From CoreDataHelper (self.managedObjectContext picks the NSManagedObjectContext usable on the current thread):
-(NSManagedObject *)createObject:(NSString *)entityName {
    return [NSEntityDescription insertNewObjectForEntityForName:entityName inManagedObjectContext:self.managedObjectContext];
}

We had a similar problem. While fetching lots of images over HTTP, we saw huge growth and a sawtooth pattern in memory allocation. The system would clean up as it went, more or less, but slowly and unpredictably. Meanwhile the downloads kept streaming in, piling up on whatever was holding onto the memory. Memory allocation would crest around 200 MB and then we'd die.
The problem was an NSURLCache issue. You stated that you tried [[NSURLCache sharedURLCache] removeAllCachedResponses]. We tried that, too, but then tried something a little different.
Our downloads are done in groups of N images/movies, where N was typically 50 to 500. It was important that we get all of N as an atomic operation.
Before we started our group of http downloads, we did this:
NSURLCache *sharedCache = [[NSURLCache alloc] initWithMemoryCapacity:0 diskCapacity:0 diskPath:nil];
[NSURLCache setSharedURLCache:sharedCache];
We then fetch each of the N images over HTTP with a synchronous call. We do this group download in an NSOperation, so we're not blocking the UI.
NSData *movieReferenceData = [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&error];
Finally, after each individual image download, and after we're done with our NSData object for that image, we call:
[sharedCache removeAllCachedResponses];
Our memory allocation peak behavior dropped to a very comfortable handful of megabytes, and stopped growing.
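Putting those pieces together, a minimal sketch of the whole pattern (the group loop and URL array are illustrative, not our production code):

    // Install a zero-capacity shared cache once, before the group starts.
    NSURLCache *sharedCache = [[NSURLCache alloc] initWithMemoryCapacity:0 diskCapacity:0 diskPath:nil];
    [NSURLCache setSharedURLCache:sharedCache];

    // Inside the NSOperation that downloads one group of N items:
    for (NSURL *url in groupURLs) { // hypothetical array of N URLs
        NSURLRequest *request = [NSURLRequest requestWithURL:url];
        NSURLResponse *response = nil;
        NSError *error = nil;
        NSData *data = [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&error];
        if (data) {
            // ... write data to disk or hand it off ...
        }
        // Purge after each item so nothing accumulates behind the scenes.
        [sharedCache removeAllCachedResponses];
    }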

In this case, you're seeing exactly what you're supposed to see. -[NSMutableData appendData:] increases the size of its internal buffer to hold the new data. Since an NSMutableData is always located in memory, this causes a corresponding increase in memory usage. What were you expecting?
If the ultimate destination for these images is on disk, try using an NSOutputStream instead of NSMutableData. If you then want to display the image, you can create a UIImage pointing to the file when you're done.
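A minimal sketch of that idea, assuming the asynchronous NSURLConnection delegate from the question (fileStream and imagePath are illustrative ivars, not names from the original code):

    // Before starting the connection:
    // fileStream = [NSOutputStream outputStreamToFileAtPath:imagePath append:NO];
    // [fileStream open];

    - (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
        // Stream each chunk to disk instead of growing an in-memory buffer.
        const uint8_t *bytes = [data bytes];
        NSUInteger remaining = [data length];
        while (remaining > 0) {
            NSInteger written = [fileStream write:bytes maxLength:remaining];
            if (written <= 0) break; // handle the write error in real code
            bytes += written;
            remaining -= written;
        }
    }

    - (void)connectionDidFinishLoading:(NSURLConnection *)connection {
        [fileStream close];
        // Create the UIImage lazily from the file, only when it is displayed:
        // UIImage *image = [UIImage imageWithContentsOfFile:imagePath];
    }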

Related

Memory issues with ARC on iOS and Mac

I am trying to mirror the screen of my Mac to an iPhone. I have this method in the Mac app delegate to capture the screen into a base64 string.
-(NSString *)baseString {
    CGImageRef screen = CGDisplayCreateImage(displays[0]);
    CGFloat w = CGImageGetWidth(screen);
    CGFloat h = CGImageGetHeight(screen);
    NSImage *image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w, h}];
    [image lockFocus];
    NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, w, h)];
    [bitmapRep setCompression:NSTIFFCompressionJPEG factor:.3];
    [image unlockFocus];
    NSData *imageData = [bitmapRep representationUsingType:NSJPEGFileType properties:_options];
    NSString *base64String = [imageData base64EncodedStringWithOptions:0];
    image = nil;
    bitmapRep = nil;
    imageData = nil;
    return base64String;
}
After that I send it to the iPhone and present it in a UIImageView.
The delay between screenshots is 40 milliseconds. Everything works as expected until memory runs out. After a minute of streaming the Mac starts swapping and uses 6 GB of RAM. The iOS app's memory usage also grows linearly; by the time it reaches 90 MB, the Mac is at 6 GB.
Even if I stop streaming, the memory is not released.
I'm using ARC in both projects. Would it make any difference if I migrated to manual reference counting?
I also tried an @autoreleasepool {...} block, but it didn't help.
Any ideas?
EDIT
My iOS code is here
NSString *message = [NSString stringWithFormat:@"data:image/png;base64,%@", base64];
NSURL *url = [NSURL URLWithString:message];
NSData *imageData = [NSData dataWithContentsOfURL:url];
UIImage *ret = [UIImage imageWithData:imageData];
self.image.image = ret;
You have a serious memory leak. The docs for CGDisplayCreateImage clearly state:
The caller is responsible for releasing the image created by calling CGImageRelease.
Update your code with a call to:
CGImageRelease(screen);
I'd add that just after creating the NSImage.
We can't help with your iOS memory leaks since you didn't post your iOS code, but I see a big memory leak in your Mac code.
You are calling a Core Foundation function, CGDisplayCreateImage. Core Foundation objects are not managed by ARC. If a Core Foundation function has "Create" (or "Copy") in its name, it follows the "create rule" and you are responsible for releasing the returned CF object when you are done with it.
Some CF objects have special release calls. For those that don't, just call CFRelease. CGImageRef has a special release call, CGImageRelease().
You need a corresponding call to CGImageRelease(screen), probably after the call to initWithCGImage.
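For concreteness, a sketch of the capture method with the release added (otherwise unchanged from the question):

    -(NSString *)baseString {
        CGImageRef screen = CGDisplayCreateImage(displays[0]);
        CGFloat w = CGImageGetWidth(screen);
        CGFloat h = CGImageGetHeight(screen);
        NSImage *image = [[NSImage alloc] initWithCGImage:screen size:(NSSize){w, h}];
        CGImageRelease(screen); // balances the Create call; ARC does not manage CF objects
        [image lockFocus];
        NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:NSMakeRect(0, 0, w, h)];
        [bitmapRep setCompression:NSTIFFCompressionJPEG factor:.3];
        [image unlockFocus];
        NSData *imageData = [bitmapRep representationUsingType:NSJPEGFileType properties:_options];
        return [imageData base64EncodedStringWithOptions:0];
    }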

How to free memory after using NSJSONSerialization?

I'm reading a large JSON file on my iPhone (the file is saved on the iPhone) using this code:
dispatch_async(kBgQueue, ^{
    // do something that takes a long time with receivedData here
    NSError *error;
    NSString *filePath = [self getFilePathWithName:LOCAL_JSON];
    NSData *jsonData = [NSData dataWithContentsOfFile:filePath];
    NSDictionary *dict = (NSDictionary *)[NSJSONSerialization JSONObjectWithData:jsonData options:kNilOptions error:&error];
    dispatch_async(dispatch_get_main_queue(), ^{
        HideHUD;
        DLog(@"Reading done");
        ShowMemoryUsageWithTitle(@"after reading json");
        return;
    });
    dict = nil;
});
After I'm done reading, memory usage has increased by 30 MB, so for example total memory usage is 60 MB. After reading again it is at 90 MB, and after that 120 MB.
So the conclusion is that the memory is not being released; every time I read, memory usage increases. I'm using ARC.
How can I free the memory after I'm done reading the JSON file?
UPDATE 1:
Tried putting the "hard" work inside @autoreleasepool {}, but there was no change in memory usage.
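For reference, this is the shape of that attempt (a sketch using the names from the code above; the pool wraps the parse itself so its temporaries are drained when the block exits):

    dispatch_async(kBgQueue, ^{
        @autoreleasepool {
            NSError *error;
            NSString *filePath = [self getFilePathWithName:LOCAL_JSON];
            NSData *jsonData = [NSData dataWithContentsOfFile:filePath];
            NSDictionary *dict = (NSDictionary *)[NSJSONSerialization JSONObjectWithData:jsonData options:kNilOptions error:&error];
            // ... use dict ...
        } // autoreleased temporaries from the parse are released here
    });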
UPDATE 2:
Tried without dispatch_async
UPDATE 3:
I've tried a third-party library, https://github.com/johnezang/JSONKit, but I hit the same problem. When I'm testing, I run only this one thing, nothing else.

Save multiple images quickly in iOS 6 (custom album)

I'm writing an application that will take several images from URLs, turn them into UIImages and then add them to the photo library and then to a custom album. I don't believe it's possible to add them to a custom album without also having them in the Camera Roll, so I'm accepting that as impossible (though it would be ideal if it weren't).
My problem is that I'm using the code from this site and it does work, but once it's dealing with larger photos it returns a few as 'Write Busy'. I did get them all to save by copying the function inside its own completion block, and then again inside the next one, and so on up to six levels deep (the most I saw it take was 3-4 retries, but I don't know the size of the images and could get some really big ones). This led to another problem: not all of the images ended up in the custom album, because that stage errored too and there was no block in place to repeat it.
I understand that the actual image saving is moved to a background thread (although I don't specifically request this), since my code reports all done before the errors start appearing. Ideally, though, I need to queue up images to be saved on a single background thread so they happen one after another without freezing the UI.
My code looks like this:
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:singleImage]]];
[self.library saveImage:image toAlbum:@"Test Album" withCompletionBlock:^(NSError *error) {
    if (error != nil) {
        NSLog(@"Error");
    }
}];
I've removed the repetition of the code, otherwise the sample would be very long! It was previously where the NSLog call is.
For my test sample I am dealing with 25 images, but this could easily be 200 or so, and they could be very high resolution, so I need something that can reliably do this over and over again without missing several images.
Thanks,
Rob
I've managed to make it work by stripping out the save-image code and moving it into its own function, which calls itself recursively over an array of objects; if a save fails, it feeds the same image back into the function until it succeeds, and it logs 'Done' when complete. Because I'm using the completion block from the save function to drive the loop, only one file save runs at a time.
This is the code I used recursively:
- (void)saveImage {
    if (self.thisImage)
    {
        [self.library saveImage:self.thisImage toAlbum:@"Test Album" withCompletionBlock:^(NSError *error) {
            if (error != nil) {
                [self saveImage];
            }
            else
            {
                [self.imageData removeObject:self.singleImageData];
                NSLog(@"Success!");
                self.singleImageData = [self.imageData lastObject];
                self.thisImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:self.singleImageData]]];
                [self saveImage];
            }
        }];
    }
    else
    {
        self.singleImageData = nil;
        self.thisImage = nil;
        self.imageData = nil;
        self.images = nil;
        NSLog(@"Done!");
    }
}
To set this up, I originally used an array of UIImages, but this used a lot of memory and was very slow (I was testing with up to 400 photos). I found a much better way was to store an NSMutableArray of URLs as NSStrings and then perform the NSData GET inside the function.
The following code sets up the NSMutableArray with data and then calls the function. It also loads the first UIImage into memory and stores it in self.thisImage:
NSEnumerator *e = [allDataArray objectEnumerator];
NSDictionary *object;
while (object = [e nextObject]) {
    NSArray *imagesArray = [object objectForKey:@"images"];
    NSString *singleImage = [[imagesArray objectAtIndex:0] objectForKey:@"source"];
    [self.imageData addObject:singleImage];
}
self.singleImageData = [self.imageData lastObject];
self.thisImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:self.singleImageData]]];
[self saveImage];
This means the rest of the UIImage handling can be contained in the function and the single UIImage instance can be monitored. I also keep the raw URL in self.singleImageData so that I can remove the correct element from the array and avoid duplication.
These are the variables I used:
self.images = [[NSMutableArray alloc] init];
self.thisImage = [[UIImage alloc] init];
self.imageData = [[NSMutableArray alloc] init];
self.singleImageData = [[NSString alloc] init];
This answer should work for anyone using http://www.touch-code-magazine.com/ios5-saving-photos-in-custom-photo-album-category-for-download/ for iOS 6 (tested on iOS 6.1) and should result in all pictures being saved correctly and without errors.
If saveImage:toAlbum:withCompletionBlock: uses dispatch_async, I fear that too many threads are spawned for the I/O operations: each write task you trigger is blocked by the previous one (because it is still doing I/O on the same queue), so GCD will create a new thread (dispatch_async on the global queue is usually optimized by GCD to use a sensible number of threads).
You should either use semaphores to limit the writes to a fixed number at a time, or use the dispatch_io_* functions that have been available since iOS 5, if I'm not mistaken.
There are plenty of examples of how to do this with both approaches.
Some on-the-fly code to give you the idea:
dispatch_semaphore_t aSemaphore = dispatch_semaphore_create(4);
dispatch_queue_t ioQueue = dispatch_queue_create("com.customqueue", NULL);
// Dispatch the loop over all images to ioQueue, so the semaphore
// waits below never block the main thread.
dispatch_async(ioQueue, ^{
    for (UIImage *image in images) {
        dispatch_semaphore_wait(aSemaphore, DISPATCH_TIME_FOREVER);
        [self.library saveImage:image
                        toAlbum:@"Test Album"
            withCompletionBlock:^(NSError *error) {
                dispatch_semaphore_signal(aSemaphore);
            }];
    }
});
So at any time you will have at most four saveImage:toAlbum: calls in flight; as soon as one completes, another one starts.
You have to create a custom queue like the ioQueue above and dispatch the loop over the images onto it, so that when the semaphore waits, the main thread is not blocked.

Multiple GLKView snapshots cause crash. Memory issue?

My goal is to make a video out of a short sequence of OpenGL frames (around 200). To do this, I use the following code to create an array of images:
NSMutableArray *images = [NSMutableArray array];
KTEngine *engine = [KTEngine sharedInstance]; // OpenGL-based engine
for (unsigned int i = engine.animationContext.unitStart; i < engine.animationContext.unitEnd; ++i)
{
    NSLog(@"Render Image %d", i);
    [engine.animationContext update:i];
    [self.view setNeedsDisplay];
    [images addObject:[view snapshot]];
}
NSLog(@"Total images rendered %d", [images count]);
[self createVideoFileFromArray:images];
This works perfectly fine in the simulator, but not on the device (a Retina iPad). My guess is that the device cannot hold that many UIImages in memory (especially at 2048x1536). The crash always happens after 38 frames or so.
Now as for a solution, I thought of creating a video for every 10 frames and then attaching them all together, but how can I know whether I have enough space (is the autorelease pool drained)?
Maybe I should use a thread that processes 10 images and fires again for the next 10 frames once it's done?
Any idea?
It seems quite likely that you're running out of memory.
To reduce memory usage, you could store the images as NSData in PNG or JPEG format instead. PNG and JPEG data are quite compact, but decoding them into UIImage objects consumes a lot of memory.
I would advise you to do something like the code below in your loop. The autorelease pool is needed to drain the returned snapshot on each iteration.
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
UIImage *image = [view snapshot];
NSData *imageData = UIImagePNGRepresentation(image);
[images addObject:imageData];
[pool release];
This of course requires your createVideoFileFromArray: method to handle pure image data instead of UIImage objects, but that should probably be feasible to implement.
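A sketch of what the consuming side could look like under that assumption (createVideoFileFromArray: is the method from the question; appendFrameToVideo: is a hypothetical helper around your video writer):

    - (void)createVideoFileFromArray:(NSArray *)imageDataArray {
        for (NSData *imageData in imageDataArray) {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            // Inflate one frame at a time, so only one UIImage is alive per iteration.
            UIImage *frame = [UIImage imageWithData:imageData];
            [self appendFrameToVideo:frame]; // hypothetical AVAssetWriter-backed helper
            [pool release];
        }
    }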

Correct way to multithread in Objective-C?

I have a UITableView which displays images. Every cell has an image, and every time a cell loads I call a selector in the background (from cellForRowAtIndexPath:) like this:
[self performSelectorInBackground:@selector(lazyLoad:) withObject:aArrayOfData];
The only problem is that sometimes I get a crash, because I am changing data in the background while it is being read elsewhere. Here's the error:
*** Terminating app due to uncaught exception 'NSGenericException', reason: '*** Collection <CALayerArray: 0xce1a920> was mutated while being enumerated.'
When updating the data in the background, should I move the work to the main thread to change it? Or should I call the @selector() differently?
Thanks!
If you can leave the operation on the main thread with no lagginess and no problems, you are done.
However, let's assume you've already done that and encountered problems. The answer is: don't modify the array in the lazy load. Switch to the main thread to modify the array. See Brad's answer here:
https://stackoverflow.com/a/8186206/8047
for a way to do it with blocks, so you can send your objects over to the main queue (you should probably also use GCD for the call to the lazy load in the first place, but it's not necessary).
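A sketch of that pattern with GCD (the load call and array name are placeholders, since we haven't seen the inside of lazyLoad:):

    // Do the slow work off the main thread, but mutate the shared
    // array backing the table view only on the main queue.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Hypothetical slow load; stands in for the body of lazyLoad:
        UIImage *image = [self loadImageForIndexPath:indexPath];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.loadedImages addObject:image]; // mutate only on the main thread
            [self.tableView reloadRowsAtIndexPaths:[NSArray arrayWithObject:indexPath]
                                  withRowAnimation:UITableViewRowAnimationNone];
        });
    });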
You can use @synchronized blocks to keep the threads from walking over each other. If you do
@synchronized(array)
{
    id item = [array objectAtIndex:row];
}
on the main thread and
@synchronized(array)
{
    [array addObject:item];
}
in the background, you're guaranteed they won't happen at the same time. (Hopefully you can extrapolate from that to your code; I'm not sure what all you're doing with the array there.)
It seems, though, like you'd have to notify the main thread anyway that you've loaded the data for a cell (via performSelectorOnMainThread:withObject:waitUntilDone:, say), so why not pass the data along, too?
Given the term 'lazy load', I assume you are pulling your images down from a server. (If the images are local, there is really no need for multithreading.)
If you are downloading images from a server, I would suggest something along these lines (using ASIHTTPRequest):
static NSCache *cellCache; // create a static cache
if (!cellCache) // if the cache is not initialized, initialize it
{
    cellCache = [[NSCache alloc] init];
}
NSString *key = imageURL;
// Look in the cache for an image matching this URL
NSData *imageData = [cellCache objectForKey:key];
if (!imageData)
{
    // Set a default image while it's loading
    cell.icon.image = [UIImage imageNamed:@"defaultImage.png"];
    // Create an async request to the server to get the image
    __unsafe_unretained ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:[NSURL URLWithString:imageURL]];
    // This code will run when the request finishes
    [request setCompletionBlock:^{
        // Put the downloaded image into the cache
        [cellCache setObject:[request responseData] forKey:key];
        // Display the image
        cell.icon.image = [UIImage imageWithData:[request responseData]];
    }];
    [request startAsynchronous];
}
else
{
    // Image was found in the cache, no need to redownload
    cell.icon.image = [UIImage imageWithData:imageData];
}