Problems creating lots of thumbnails using AVAssetImageGenerator with ARC - objective-c

I've looked through every list I can find for ideas on this but have been stuck for a couple of days now so here's hoping...
I'm trying to create a lot of video thumbnails (100 or so) at once using AVAssetImageGenerator.
While testing I'm always using the same movie files, but the process (seemingly) randomly succeeds or fails, plus I now get all the movies loaded into physical memory at once which I don't want. I'm obviously doing something wrong but cannot figure out what - and under ARC can't manually manage the memory.
Background;
It's an AVFoundation video player using ARC (Xcode 4.2.1, OSX 10.7.2, MBP 2.4GHz i5, 4GB RAM).
It has a playlist style interface allowing you to drag folders of movie files to it. It recurses through the paths it's given and plucks out the valid movies and adds them to the list.
Movies are represented by a Movie class which has an AVAsset member variable (movieAsset) initialised with the file path. So far so good - everything works, movies can be played, only the movie being played is in physical memory.
I've added a -createThumbnail method to the Movie class which is called in the Movie's init method (code snippet below).
With the addition of this code I'm getting a few behaviours I can't eradicate - none of which occur if I don't call the -createThumbnail code below. Any ideas where I'm going wrong?
Every movie added to the playlist is now loaded into physical memory immediately, so the app's memory footprint has gone way up (without the thumbnail code: 40MB for 100 movies; with thumbnails (NSImages at 32x18 pixels): 750MB for the same 100 movies).
Looking at Activity Monitor->Open Files and Ports, all the movie files are listed even after thumbnail creation has finished. This didn't occur before - only the movie being played was listed.
Thumbnail creation completely locks up my machine until it's complete - even though I'm calling AVAssetImageGenerator within an asynchronous block - (CPU usage never gets above 35%). Could this be a disk access problem trying to read 100 movies at once?
Thumbnail creation is very erratic. Sometimes all thumbnails are created, but often a random 30-70% are not. Maybe also a disk access problem?
I'm very new to OOP and Obj-C so have probably made a newbies mistake - but I just can't track it down...
Also worth noting that the "Error creating thumbnail" and "Error finding duration" NSLogs are never called...
-(void)createThumbnail{
    NSArray *keys = [NSArray arrayWithObject:@"duration"];
    [movieAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
        NSError *error = nil;
        AVKeyValueStatus valueStatus = [movieAsset statusOfValueForKey:@"duration" error:&error];
        switch (valueStatus) {
            case AVKeyValueStatusLoaded:
                // tracksWithMediaType: is the correct call here; AVMediaTypeVideo
                // is a media type, not a media characteristic.
                if ([[movieAsset tracksWithMediaType:AVMediaTypeVideo] count] > 0) {
                    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:movieAsset];
                    Float64 movieDuration = CMTimeGetSeconds([movieAsset duration]);
                    CMTime middleFrame = CMTimeMakeWithSeconds(movieDuration/2.0, 600);
                    CGImageRef imageForThumbnail = [imageGenerator copyCGImageAtTime:middleFrame actualTime:NULL error:NULL];
                    if (imageForThumbnail != NULL) {
                        NSSize sizeOption = NSMakeSize(32, 18);
                        self.thumbnail = [[NSImage alloc] initWithCGImage:imageForThumbnail size:sizeOption];
                        NSLog(@"Thumbnail created for %@", [self filenameString]);
                    }
                    else {
                        NSLog(@"-----Error creating thumbnail for %@", [self filenameString]);
                    }
                    CGImageRelease(imageForThumbnail);
                }
                break;
            case AVKeyValueStatusFailed:
                NSLog(@"Error finding duration");
                break;
            case AVKeyValueStatusCancelled:
                NSLog(@"Cancelled finding duration");
                break;
        }
    }];
}
(Note: I've been using the same few folders of movie files to develop the app for the last month or so. They're all local, valid files that play successfully in the app. Some of these dropped folders contain over a hundred movie files nested within various subfolders.)
Many thanks if anyone can help.
Chas.

I had some weird issues when using AVAssetImageGenerator this way as well (on iOS, at least). It seems to be somewhat broken when used in combination with the blocks/GCD API. Rather than loading everything asynchronously, try making a single queue/loop that operates on a single background thread, and walk through the movies that way. This should also help keep your memory usage down.
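That serial approach might look something like this (a sketch, not the asker's code: `movies` is an assumed array of the question's Movie objects, and `movieAsset`/`thumbnail` are the members the question describes):

```objc
// Generate thumbnails one movie at a time on a private serial queue.
dispatch_queue_t thumbnailQueue = dispatch_queue_create("com.example.thumbnails", DISPATCH_QUEUE_SERIAL);
dispatch_async(thumbnailQueue, ^{
    for (Movie *movie in movies) {
        @autoreleasepool {  // drain temporaries after each movie
            AVAsset *asset = movie.movieAsset;
            AVAssetImageGenerator *generator =
                [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
            CMTime middleFrame =
                CMTimeMakeWithSeconds(CMTimeGetSeconds([asset duration]) / 2.0, 600);
            CGImageRef cgImage =
                [generator copyCGImageAtTime:middleFrame actualTime:NULL error:NULL];
            if (cgImage != NULL) {
                NSImage *thumb = [[NSImage alloc] initWithCGImage:cgImage
                                                             size:NSMakeSize(32, 18)];
                CGImageRelease(cgImage);
                dispatch_async(dispatch_get_main_queue(), ^{
                    movie.thumbnail = thumb;  // update model/UI state on the main thread
                });
            }
        }
    }
});
```

Only one asset is read at a time, so only one movie's data needs to be resident, and the @autoreleasepool lets each movie's temporaries be reclaimed before the next one starts.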

Related

Performance problems with [UIImage imageNamed] in iOS7/iOS8

I have an app that I'm in the process of updating to iOS8 from iOS6. After making a few small changes here and there to get things to compile, I ran the app on my iPad mini and found that the performance had dropped off significantly. In a couple places things that used to have no noticeable delay now frequently take 2-3 seconds to complete. Thinking that perhaps something was wrong with my iPad, I removed my current build and re-downloaded the app from the AppStore. Performance went back to the previous/good state.
After profiling the code, one of the things that sticks out is a significant amount of time in [UIImage imageNamed]. My app makes many calls to this because it has an indoor and an outdoor mode which requires most of the images on a given view to be loaded dynamically. Has something happened to this call in the latest SDKs?
When I profiled the app, here is the method that is eating up >70% of the time
+ (UIImage*)imageNamed:(NSString*)rootName extension:(NSString*)extension
{
    NSString *typeName = _isIndoor ? @"inside" : @"outside";
    NSString *fullName = [NSString stringWithFormat:@"%@-%@.%@", rootName, typeName, extension];
    UIImage *result = [UIImage imageNamed:fullName];
    if (result == nil) {
        NSLog(@"Missing image [name: %@]", fullName);
    }
    return result;
}
Some other notes:
The original app works fine on my 1st generation iPad Mini running iOS8
The app does not currently use ARC, but I did test switching to ARC w/ no improvement (not surprising)
The app does not use Asset Catalogs, but I tried converting to use Asset Catalogs w/ no improvement (also not surprising)
I did some googling, but wasn't able to find anyone else having the same problem
I assume [UIImage imageNamed:] is what is taking up the most time. Since things were working great before, a reasonable guess is that the "searching" part is going awry. Where are the resources stored? The main bundle, or elsewhere?
Also imageNamed: should cache the image, making subsequent calls fast. Is that the case? Or are subsequent calls for the same resource slow as well?
Try using the UIImage method that takes a bundle parameter (imageNamed:inBundle:compatibleWithTraitCollection:) and pass the correct bundle into it. That way, if the search for the file is the problem, this should be faster.
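A sketch of that suggestion (imageNamed:inBundle:compatibleWithTraitCollection: is available from iOS 8, which the asker is targeting; the helper mirrors the method from the question):

```objc
+ (UIImage*)imageNamed:(NSString*)rootName extension:(NSString*)extension
{
    NSString *typeName = _isIndoor ? @"inside" : @"outside";
    NSString *fullName = [NSString stringWithFormat:@"%@-%@.%@", rootName, typeName, extension];
    // Restrict the search to the main bundle explicitly (iOS 8+).
    UIImage *result = [UIImage imageNamed:fullName
                                 inBundle:[NSBundle mainBundle]
            compatibleWithTraitCollection:nil];
    if (result == nil) {
        NSLog(@"Missing image [name: %@]", fullName);
    }
    return result;
}
```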

iPhone iOS6 how to return Image IO type "resident dirty memory" to the OS?

I'm looking at the WWDC 2010 video which deals with advanced memory analysis (session 311):
At around 45:00 into the video, the performance engineer discusses what to do with "Resident Dirty memory" that your app has loaded in RAM. The engineer suggests that in response to memory warnings, your app should clear this. The engineer pastes in his custom class "flush" method into didReceiveMemoryWarning and everything is fine, but the code does not really offer any examples of HOW the memory is to be freed.
The question that I have is: how do I flush large chunks of dirty memory used by "Image IO"?
Here's around 74 MB of memory just sitting around dirty (for close to 6 minutes now), waiting for someone to return it to iOS 6. Nothing is happening to it. Since it does not go away on its own, I need to know how to return it to iOS.
These blocks appear to originate from code like this and (maybe other image related operations).
UIImage *screenshot = nil;
@autoreleasepool {
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
        UIGraphicsBeginImageContextWithOptions(iPhoneRetinaIconSize, NO, [UIScreen mainScreen].scale);
    else
        UIGraphicsBeginImageContext(iPhoneRetinaIconSize);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
The issue is that there's a lot of memory sitting around, loaded in RAM, unable to be returned to the operating system until the app crashes.
For webview-related dirty memory, I found that this may work:
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    [[NSURLCache sharedURLCache] removeAllCachedResponses];
    [[NSURLCache sharedURLCache] setDiskCapacity:0];
    [[NSURLCache sharedURLCache] setMemoryCapacity:0];
    // Dispose of any resources that can be recreated.
}
Is there an equivalent for UIImage, CALayer or UIGraphics ?
I am far from an expert in these issues, but based on the tests I conducted with the code you provided, I'd say you just have to release the UIImages created in these blocks of code.
As far as I understand, the Image IO or GC raster data labeled chunks of memory are really just the underlying data of your images (UIImage being a UIKit wrapper on top of these). So to release the memory, release the image.
I tested this by creating a bunch of UIImages using your code, simulating a memory warning which released all of the created images:
The images speak for themselves. Releasing my UIImages (at ~00:08) removed the big GC raster data chunk from resident memory.
Because completely removing an image from your UI may not be the best solution for the user experience, you could instead try downsizing your largest images when receiving a memory warning; a lower resolution results in a smaller memory footprint. Another idea (again, this depends on what your images are used for) could be to dump the images to disk and load them later, when needed.
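The downsizing idea might be sketched like this (assuming a hypothetical `largeImage` property; halving each dimension and the choice of where to hook in are both arbitrary):

```objc
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    if (self.largeImage) {
        // Redraw the image at half its dimensions; once the property is
        // reassigned, the original (much larger) raster data can be reclaimed.
        CGSize smallSize = CGSizeMake(self.largeImage.size.width / 2.0,
                                      self.largeImage.size.height / 2.0);
        UIGraphicsBeginImageContextWithOptions(smallSize, NO, self.largeImage.scale);
        [self.largeImage drawInRect:(CGRect){CGPointZero, smallSize}];
        self.largeImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
}
```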
Hope that helps.

Memory warning and crash (ARC) - how to identify why it's happening?

I've started to use ARC recently and since then I blame it for every single memory problem. :) Perhaps you could help me better understand what I'm doing wrong.
My current project relies heavily on CoreGraphics: chart drawing, views filled with thumbnails, and so on. I believe there would be no problem using manual memory management, except maybe a few zombies... But as of now, the application simply crashes every time I try to either create a lot of thumbnails or redraw a slightly more complicated chart.
While profiling with Instruments I can see an awfully high value in resident memory as well as in the dirty one. Heap analysis shows rather alarming irregular grow...
When drawing just a few thumbnails, resident memory grows by about 200 MB. When everything is drawn, memory drops back to almost the same value as before drawing. However, with a lot of thumbnails, resident memory climbs above 400 MB, and that obviously crashes the app. I've tried to limit the number of thumbnails drawn at the same time (NSOperationQueue and its maxConcurrentOperationCount), but since releasing so much memory seems to take a bit more time, it didn't really solve the issue.
Right now my app basically doesn't work as the real data works with a lot of complicated charts = lot of thumbnails.
Every thumbnail is created with this code I got from around here: (category on UIImage)
+ (void)beginImageContextWithSize:(CGSize)size
{
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        if ([[UIScreen mainScreen] scale] == 2.0) {
            UIGraphicsBeginImageContextWithOptions(size, YES, 2.0);
        } else {
            UIGraphicsBeginImageContext(size);
        }
    } else {
        UIGraphicsBeginImageContext(size);
    }
}

+ (void)endImageContext
{
    UIGraphicsEndImageContext();
}

+ (UIImage*)imageFromView:(UIView*)view
{
    [self beginImageContextWithSize:[view bounds].size];
    BOOL hidden = [view isHidden];
    [view setHidden:NO];
    [[view layer] renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    [self endImageContext];
    [view setHidden:hidden];
    return image;
}

+ (UIImage*)imageFromView:(UIView*)view scaledToSize:(CGSize)newSize
{
    UIImage *image = [self imageFromView:view];
    if ([view bounds].size.width != newSize.width ||
        [view bounds].size.height != newSize.height) {
        image = [self imageWithImage:image scaledToSize:newSize];
    }
    return image;
}

+ (UIImage*)imageWithImage:(UIImage*)image scaledToSize:(CGSize)newSize
{
    [self beginImageContextWithSize:newSize];
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    [self endImageContext];
    return newImage;
}
Is there some other way which wouldn't eat so much memory or is something really wrong with the code when using ARC?
The other place where memory warning + crash is happening is when there is too much redrawing of any view. It doesn't need to be quick, just many times. Memory stacks up until it crashes and I'm not able to find anything really responsible for it. (I can see a growing resident/dirty memory in VM Tracker and a heap growth in Allocations instrument)
My question basically is: how to find why it is even happening? My understanding is when there is no owner for given object, it's released ASAP. My inspection of code suggests a lot of objects are not released at all even though I don't see any reason for it to happen. I don't know about any retain cycles...
I've read through the Transitioning to ARC Release Notes, bbum's article about heap analysis, and probably a dozen others. Does heap analysis somehow differ with and without ARC? I can't seem to do anything useful with its output.
Thank you for any ideas.
UPDATE: (to not force everybody read all the comments and to hold my promise)
By carefully going through my code and adding @autoreleasepool where it made sense, memory consumption got lower. The biggest problem was calling UIGraphicsBeginImageContext from a background thread. After fixing it (see @Tammo Freese's answer for details), deallocation occurred soon enough not to crash the app.
My second crash (caused by repeatedly redrawing the same chart) was completely solved by adding CGContextFlush(context) at the end of my drawing method. Shame on me.
A small warning for anyone trying to do something similar: use OpenGL. CoreGraphics is not quick enough for animating big drawings, especially not on an iPad 3. (first one with retina)
To answer your question: Identifying problems with memory warnings and crashes with ARC basically works like before with manual retain-release (MRR). ARC uses retain, release and autorelease just like MRR, it only inserts the calls for you, and has some optimizations in place that should even lower the memory consumption in some cases.
Regarding your problem:
In the screenshot of Instruments you posted, there are allocation spikes visible. In most cases I encountered so far, these spikes were caused by autoreleased objects hanging around too long.
You mentioned that you use NSOperationQueue. If you override -[NSOperationQueue main], make sure that you wrap the whole content of the method in @autoreleasepool { ... }. An autorelease pool may already be in place, but it is not guaranteed (and even if there is one, it may be around for longer than you think).
If 1. has not helped and you have a loop that processes the images, wrap the inner part of the loop in @autoreleasepool { ... } so that temporary objects are cleaned up immediately.
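The loop-wrapping advice might look like this (a sketch; imageFromView:scaledToSize: is the category method from the question, while `views` and the thumbnail size are stand-ins):

```objc
NSMutableArray *thumbnails = [NSMutableArray array];
for (UIView *view in views) {
    @autoreleasepool {
        // Temporaries created while rendering this thumbnail are released
        // at the end of each iteration instead of piling up until the loop ends.
        UIImage *thumb = [UIImage imageFromView:view
                                   scaledToSize:CGSizeMake(64.0, 64.0)];
        [thumbnails addObject:thumb];
    }
}
```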
You mentioned that you use NSOperationQueue. Since iOS 4, drawing to a graphics context in UIKit is thread-safe, but if the documentation is right, UIGraphicsBeginImageContext should still only be called on the main thread! Update: The docs now state that since iOS 4, the function can be called from any thread, so the following is actually unnecessary. To be on the safe side, create the context with CGBitmapContextCreate and retrieve the image with CGBitmapContextCreateImage. Something along these lines:
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                             kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);
// draw to the context here
CGImageRef newCGImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
UIImage *result = [UIImage imageWithCGImage:newCGImage scale:scale orientation:UIImageOrientationUp];
CGImageRelease(newCGImage);
return result;
So, nothing you are doing relative to memory management (there is none!) looks improper. However, you mention using NSOperationQueue. Those UIGraphics... calls are marked as not thread-safe, but others have stated they are as of iOS 4 (I cannot find a definitive answer to this, but I recall that it's true).
In any case, you should not be calling these class methods from multiple threads. You could create a serial dispatch queue and feed all the work through that to insure single threaded usage.
What's missing here of course is what you do with the images after using them. Its possible you are retaining them in some way that is not obvious. Here are some tricks:
in any of your classes that use lots of images, add a -dealloc method that just logs its name and some identifier.
you can try to add a dealloc to UIImage to do the same.
try to drive your app using the simplest possible setup - fewest images etc - so you can verify that in fact that the images and their owners are getting dealloc'd.
when you want to make sure something is released, set the ivar or property to nil
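The dealloc-logging trick from the first point might look like this (a sketch; ChartThumbnailView is a hypothetical class name, and under ARC you may override -dealloc as long as you don't call [super dealloc]):

```objc
@implementation ChartThumbnailView

- (void)dealloc
{
    // Under ARC, do NOT call [super dealloc]; just log and let ARC clean up.
    NSLog(@"dealloc %@ <%p>", NSStringFromClass([self class]), self);
}

@end
```

If these messages never appear while thumbnails are being discarded, something is still holding a strong reference to the views or their images.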
I converted a 100 file project to ARC last summer and it worked perfectly out of the box. I have converted several open source projects to ARC as well with only one problem when I improperly used bridging. The technology is rock solid.
This is not an answer to your question but I was trying to solve similar problems long before ARC was introduced.
Recently I was working on an application that was caching images in memory and releasing them all after receiving memory warning. This worked fine as long as I was using the application at a normal speed (no crazy tapping). But when I started to generate a lot of events and many images started to load, the application did not manage to get the memory warning and it was crashing.
I once wrote a test application that was creating many autoreleased objects after tapping a button. I was able to tap faster (and create objects) than the OS managed to release the memory. The memory was slowly increasing so after a significant time or simply using bigger objects I would surely crash the application and cause device to reboot (looks really effective ;)). I checked that using Instruments which unfortunately affected the test and made everything slower but I suppose this is true also when not using Instruments.
On the other occasion I was working on a bigger project that is quite complex and has a lot of UI created from code. It also has a lot of string processing and nobody cared to use release - there were few thousands of autorelease calls when I checked last time. So after 5 minutes of slightly extensive usage of this application, it was crashing and rebooting the device.
If I'm correct, the OS logic responsible for actually deallocating memory is either not fast enough or does not have a high enough priority to save an application from crashing when a lot of memory operations are performed. I never confirmed these suspicions, and I don't know how to solve this problem other than by simply reducing allocated memory.

How to speed up saving a UIImagePickerController image from the camera to the filesystem via UIImagePNGRepresentation()?

I'm making an applications that let users take a photo and show them both in thumbnail and photo viewer.
I have an NSManagedObject class called Photo, which has a method that takes a UIImage, converts it to PNG using UIImagePNGRepresentation(), and saves it to the filesystem.
After this operation, I resize the image to thumbnail size and save that as well.
The problem is that UIImagePNGRepresentation() and the image resizing seem to be really slow, and I don't know if this is the right way to do it.
Tell me if anyone know the best way to accomplish what I want to do.
Thank you in advance.
Depending on the image resolution, UIImagePNGRepresentation can indeed be quite slow, as can any writing to the file system.
You should always execute these types of operations on an asynchronous queue. Even if the performance seems good enough for your application when testing, you should still do it on an async queue -- you never know what other processes the device might have going on which might slow the save down once your app is in the hands of users.
Newer versions of iOS make saving asynchronously really, really easy using Grand Central Dispatch (GCD). The steps are:
Create an NSBlockOperation which saves the image
In the block operation's completion block, read the image from disk & display it. The only caveat here is that you must use the main queue to display the image: all UI operations must occur on the main thread.
Add the block operation to an operation queue and watch it go!
That's it. And here's the code:
// Create a block operation with our saves
NSBlockOperation *saveOp = [NSBlockOperation blockOperationWithBlock:^{
    [UIImagePNGRepresentation(image) writeToFile:file atomically:YES];
    [UIImagePNGRepresentation(thumbImage) writeToFile:thumbfile atomically:YES];
}];
// Use the completion block to update our UI from the main queue
[saveOp setCompletionBlock:^{
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        UIImage *image = [UIImage imageWithContentsOfFile:thumbfile];
        // TODO: Assign image to imageview
    }];
}];
// Kick off the operation, sit back, and relax. Go answer some stackoverflow
// questions or something.
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[queue addOperation:saveOp];
Once you are comfortable with this code pattern, you will find yourself using it a lot. It's incredibly useful when generating large datasets, long operations on load, etc. Essentially, any operation that makes your UI laggy in the least is a good candidate for this code. Just remember, you can't do anything to the UI while you aren't in the main queue and everything else is cake.
Yes, it does take time on iPhone 4, where the image size is around 6 MB. The solution is to execute UIImagePNGRepresentation() in a background thread, using performSelectorInBackground:withObject:, so that your UI thread does not freeze.
It will probably be much faster to do the resizing before converting to PNG.
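A sketch of resize-then-encode (the thumbnail size is a hypothetical value; drawing first means the PNG encoder only has to process the small image):

```objc
CGSize thumbSize = CGSizeMake(100.0, 100.0);  // hypothetical thumbnail size
UIGraphicsBeginImageContextWithOptions(thumbSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, thumbSize.width, thumbSize.height)];
UIImage *thumbImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Now encode only the small image:
NSData *thumbData = UIImagePNGRepresentation(thumbImage);
```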
Try UIImageJPEGRepresentation with a medium compression quality. If the bottleneck is IO then this may prove faster as the filesize will generally be smaller than a png.
Use Instruments to check whether UIImagePNGRepresentation is the slow part or whether it is writing the data out to the filesystem which is slow.
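For a quick check without firing up Instruments, you could time the two steps separately (a sketch; `image` and `file` are the variables from the accepted answer's snippet):

```objc
CFAbsoluteTime t0 = CFAbsoluteTimeGetCurrent();
NSData *png = UIImagePNGRepresentation(image);   // encoding step
CFAbsoluteTime t1 = CFAbsoluteTimeGetCurrent();
[png writeToFile:file atomically:YES];           // filesystem step
CFAbsoluteTime t2 = CFAbsoluteTimeGetCurrent();
NSLog(@"encode: %.3fs  write: %.3fs", t1 - t0, t2 - t1);
```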

Data Formatters temporarily unavailable

I have a problem with my iPad application: I need to download more than 10,000 images from a server at a time. I successfully downloaded more than 8,000 images, but after that I got an exception like "Program received signal: "0".
Data Formatters temporarily unavailable, will re-try after a 'continue'. (Unknown error loading shared library "/Developer/usr/lib/libXcodeDebuggerSupport.dylib")
(gdb)" in the debugger. I checked the memory management and there are no issues, but I still get the exception. Please help me.
Thanks in advance,
Sekhar Bethalam.
Data formatters are used so that the debugger can represent an object as more than just a pointer or integer value (for example, data formatters allow the debugger to show you the underlying value of NSNumber, or the elements inside an NSArray, etc.). Sometimes—and I'm not sure why—they just stop working. In any case, the fact that the data formatters aren't working is not the root of the issue.
Without seeing any of your code (such as how you download and/or store the images) it is not very easy to diagnose your problem. After some digging around on Google, it appears that programs on iPhone OS receive signal 0 when they are using too many resources (not necessarily just memory). There's nothing you can do to stop iPhone OS from killing your program when it wants to.
A few questions:
Are you trying to download 10,000 images simultaneously or sequentially? Hopefully sequentially!
Even though there are no detected leaks in your code, your program may still be using too much memory. Are you keeping references to these images in an array or something similar?
Are you in a tight loop? Your code may not keep references to any of the data/URLs/images, but the autorelease pool might be. In tight-loop situations, it is sometimes advisable to create a new autorelease pool at the beginning of the loop, and releasing it at the bottom. For example:
for (NSUInteger i = 0; i < 10000; i++)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // do the work to get the image. Objects are automatically added to the
    // most recently created autorelease pool in the current thread.
    [pool release];
}
This is my code which I mentioned above,
for (int i = 0; i < 10000; i++)
{
    printf("\n generating images..............:%d", i);
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, spotItem.imageWidth, spotItem.imageHight)];
    NSData *receivedData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://sererpath/prudently/iphone/image_s/e545afbf-4e3e-442e-92f9-a7891fc3ea9f/test.png"]];
    // imageWithData: returns an autoreleased image, so no UIImage is leaked here
    imageView.image = [UIImage imageWithData:receivedData];
    //imageView.image = [UIImage imageNamed:@"1.png"];
    [spotItem.subView addSubview:imageView];
    [imageView release];
}
I was downloading them directly.