I have a panning gesture that pans across a series of images. I'm not sure how to correctly manage the memory for this: after panning for a certain length of time, I'm getting crashes.
animImage is a UIImageView.
Here is how it works:
- (IBAction)panAnim:(UIPanGestureRecognizer *)sender {
    CGPoint translate = [sender translationInView:sliderView];

    if (translate.x >= lastPoint) {
        difference = translate.x - lastPoint;
        [self forwardAnim:difference];
    } else {
        difference = lastPoint - translate.x;
        [self backwardAnim:difference];
    }
    lastPoint = translate.x;
}
- (void)forwardAnim:(CGFloat)speed {
    NSInteger newFrame = currentFrame + speed;
    currentFrame = newFrame;
    if (currentFrame >= totalFrames) {
        currentFrame = 0;
    }
    NSString *newImagePath = [NSString stringWithFormat:@"%@", [currentAnimation objectAtIndex:currentFrame]];
    animImage.image = [UIImage imageNamed:newImagePath];
}
- (void)backwardAnim:(CGFloat)speed {
    NSInteger newFrame = currentFrame - speed;
    currentFrame = newFrame;
    if (currentFrame < 0) {
        currentFrame = (totalFrames - 1);
    }
    NSString *newImagePath = [NSString stringWithFormat:@"%@", [currentAnimation objectAtIndex:currentFrame]];
    animImage.image = [UIImage imageNamed:newImagePath];
}
The code uses the position of the translation to calculate which 'frame' the animation should be at, and then swaps out the image.
I'm getting a very smooth animation from this, but I'm obviously causing a crash because I'm not managing the memory correctly. I get memory warnings and then crashes, but only after I've been scrolling through the images for a while.
I need to figure out a way to pre-load 100 images at a time, so that I only ever keep 100 images in memory. It's strange, because the images appear to be opened and closed properly in the I/O output in Instruments.
Thanks for your help!
Cheers, D
+[UIImage imageNamed:] caches the images it loads, so it's not surprising that your memory usage keeps increasing. You might try changing those +imageNamed: calls to +imageWithContentsOfFile: instead. That may hurt your animation performance, but I suspect you'll see memory usage drop significantly.
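For example, assuming the entries in currentAnimation are file names (with extensions) of images in your app bundle — an assumption on my part — the forward method might look roughly like this:

// Sketch only: load each frame directly from disk so nothing is cached.
- (void)forwardAnim:(CGFloat)speed {
    currentFrame += speed;
    if (currentFrame >= totalFrames) {
        currentFrame = 0;
    }
    NSString *name = [currentAnimation objectAtIndex:currentFrame];
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:nil];
    // Unlike imageNamed:, imageWithContentsOfFile: does not add the image
    // to the system cache, so the previous frame can be freed as soon as
    // animImage stops referencing it.
    animImage.image = [UIImage imageWithContentsOfFile:path];
}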
A memory warning doesn't mean that a crash is imminent. It'd help if you'd post the details of the crash. Looking at the crash log, you should be able to identify a line in your code that leads to the crash -- it may be a few frames from the top of the stack. Find that line, put a breakpoint a line or two before it, and step through the code imagining what could happen on each line that might cause a problem -- failed allocation, bad value, etc.
BTW, your 25kB image files likely expand to something much larger in memory.
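For a sense of scale (using a hypothetical 1024×768 frame): once decoded for display it occupies about 1024 × 768 × 4 bytes ≈ 3 MB, regardless of how small the compressed file was, so a few hundred cached frames can consume hundreds of megabytes.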
Related
I am using 150 images in a sequence for an animation.
Here is my code.
NSMutableArray *arrImages = [[NSMutableArray alloc] initWithCapacity:0];
for (int i = 0; i <= 158; i++)
{
    UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"baby%05d.jpg", i]];
    [arrImages addObject:image];
}
babyimage.animationImages = arrImages;
[arrImages release];
babyimage.animationDuration = 6.15;
[babyimage startAnimating];
but it is taking too much memory. After playing it for about a minute it shows memory warnings in the console and then crashes. I have already reduced the image resolution, and I can't use fewer than 150 images without losing quality.
Is there a better way to do this animation without the memory issue?
Thanks a lot.
Please help...
Instead of
[UIImage imageNamed:[NSString stringWithFormat:@"baby%05d.jpg", i]]
use
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"baby%05d.jpg", i] ofType:nil]]
The reason is that imageNamed: caches the image and does not release it until there is a memory warning.
Edit
Also, don't store the entire images in the array; just save the image names if you need to, or don't save anything at all. Keeping the images themselves also takes a lot of memory.
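A rough sketch of that idea, assuming frameNames (an array of file names) and currentFrame (an NSInteger) are ivars, and that a timer drives the frames instead of animationImages — all of these names are hypothetical:

// Build the list of names once (e.g. in viewDidLoad); cheap compared to holding 150+ UIImages.
- (void)buildFrameNames {
    frameNames = [[NSMutableArray alloc] init];
    for (int i = 0; i <= 158; i++) {
        [frameNames addObject:[NSString stringWithFormat:@"baby%05d.jpg", i]];
    }
}

// Timer callback: load exactly one frame at a time, bypassing the imageNamed: cache.
- (void)showNextFrame:(NSTimer *)timer {
    babyimage.image = [UIImage imageWithContentsOfFile:
        [[NSBundle mainBundle] pathForResource:[frameNames objectAtIndex:currentFrame] ofType:nil]];
    currentFrame = (currentFrame + 1) % [frameNames count];
}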
I'm not sure what your animation is, since you haven't specified it. But if I were writing the code, I would follow the algorithm below.
This assumes there's only one image shown at a particular time, but that during a transition two of them briefly overlap.
procedureAnimate:
    photo_1 = allocate(file_address_array[i++])
    photo_2 = allocate(file_address_array[i++])
    while i < MAX_PHOTOS
        photo_3 = allocate(file_address_array[i++])   // prefetch the next frame
        perform_animation(photo_1, photo_2)
        release(photo_1)
        photo_1 = photo_2
        photo_2 = photo_3
    perform_animation(photo_1, photo_2)
    release(photo_1)
    release(photo_2)
I'm not sure if this is the perfect way of doing such a thing, but it will be far more memory-efficient.
(There may be something wrong with the alloc/release logic; this is consistent with my working at 5:14 AM.) Let me know if this doesn't work.
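Translated into Objective-C, that sliding window could look roughly like this — a sketch assuming ARC, with filePaths as an array of full image paths and animateFrom:to: standing in for whatever transition you use (both names are hypothetical):

// Keep at most two decoded images alive at any time.
UIImage *current = [UIImage imageWithContentsOfFile:[filePaths objectAtIndex:0]];
for (NSUInteger i = 1; i < [filePaths count]; i++) {
    @autoreleasepool {
        UIImage *next = [UIImage imageWithContentsOfFile:[filePaths objectAtIndex:i]];
        [self animateFrom:current to:next]; // hypothetical cross-fade / transition
        current = next;                     // under ARC, the previous frame is released here
    }
}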
I have come across an issue when repeatedly loading different NSImages in an application written using Automatic Reference Counting. It seems as though ARC is not releasing the image objects correctly; instead, memory usage increases as the list is iterated, and it is only freed once the iteration is complete.
I am seeing up to 2GB of memory used by the process in some cases. There's a good discussion on SO of a very similar issue, which ends up wrapping the work in an NSAutoreleasePool, releasing the image, and draining the pool. This works for me as well if I don't use ARC, but those calls are not allowed under ARC.
Is there any way to make this work with ARC as well? It seems as though ARC should be figuring this out by itself, which makes me think that the fact that these objects are not getting released must be a bug in OS X. (I'm running Lion with Xcode 4.2.1.)
The kind of code which is causing the issue looks like this:
+ (BOOL)checkImage:(NSURL *)imageURL
{
    NSImage *img = [[NSImage alloc] initWithContentsOfURL:imageURL];
    if (!img)
        return NO;
    // Do some processing
    return YES;
}
This method is called repeatedly in a loop (for example, 300 times). Profiling the app, memory usage continues to increase, with 7.5MB allocated for each image. If ARC is not used, the following can be done (as suggested in this topic):
+ (BOOL)checkImage:(NSURL *)imageURL
{
    NSAutoreleasePool *apool = [[NSAutoreleasePool alloc] init];
    NSImage *img = [[NSImage alloc] initWithContentsOfURL:imageURL];
    if (!img) {
        [apool drain];
        return NO;
    }
    // Do some processing
    [img release];
    [apool drain];
    return YES;
}
Does anyone know how to force ARC to do the memory cleaning? For the time being, I have put the function in a file which is compiled with -fno-objc-arc passed in as a compiler flag. This works OK, but it would be nice to have ARC do it for me.
use @autoreleasepool, like so:
+ (BOOL)checkImage:(NSURL *)imageURL
{
    @autoreleasepool { // << push a new pool on the autorelease pool stack
        NSImage *img = [[NSImage alloc] initWithContentsOfURL:imageURL];
        if (!img) {
            return NO;
        }
        // Do some processing
    } // << pushed pool popped at scope exit
    return YES;
}
My goal is to make a video out of a short sequence of OpenGL frames (around 200 frames). In order to do this, I use the following code to create an array of images:
NSMutableArray *images = [NSMutableArray array];
KTEngine *engine = [KTEngine sharedInstance]; // OpenGL-based engine

for (unsigned int i = engine.animationContext.unitStart; i < engine.animationContext.unitEnd; ++i)
{
    NSLog(@"Render Image %d", i);
    [engine.animationContext update:i];
    [self.view setNeedsDisplay];
    [images addObject:[view snapshot]];
}

NSLog(@"Total images rendered %d", [images count]);
[self createVideoFileFromArray:images];
So this works perfectly fine on the simulator, but not on the device (Retina iPad). My guess is that the device cannot hold that many UIImages in memory (especially at 2048×1536). The crash always happens after 38 frames or so.
Now as for the solution, I thought of creating a video for every 10 frames and then stitching them all together, but how can I know when I have enough memory (is the autorelease pool drained)?
Maybe I should use a thread, process 10 images, and fire it again for the next 10 frames once it's done?
Any ideas?
It seems quite likely that you're running out of memory.
To reduce memory usage, you could store the images as NSData in PNG or JPG format instead. Both PNGs and JPGs are quite small when represented as data, but loading them into UIImage objects can be very memory-consuming.
I would advise you to do something like the code below in your loop. The autorelease pool is needed to drain the returned snapshot on each iteration.
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
UIImage *image = [view snapshot];
NSData *imageData = UIImagePNGRepresentation(image);
[images addObject:imageData];
[pool release];
This of course requires your createVideoFileFromArray: method to handle pure image data instead of UIImage objects, but that should probably be feasible to implement.
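When you later turn the array back into frames, you could decode one image at a time inside its own pool, so only a single UIImage is alive at once. A sketch (appendFrameToVideo: is a hypothetical stand-in for whatever your existing video-writing code does):

for (NSData *imageData in images) {
    NSAutoreleasePool *framePool = [[NSAutoreleasePool alloc] init];
    UIImage *frame = [UIImage imageWithData:imageData]; // decoded only for this iteration
    [self appendFrameToVideo:frame];                    // hypothetical hand-off to your writer
    [framePool release];                                // frees the decoded frame right away
}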
I've been developing a music player recently, and I'm writing my own pickers.
I'm trying to test my code to its limits, so I have around 1600 albums on my iPhone.
I'm using AQGridView for the albums view, and since MPMediaItemArtwork is a subclass of NSObject, you need to call a method on it to get an image out of it, and that method scales images.
Scaling for each cell uses too much CPU, as you can guess, so my grid album view is laggy despite all my effort manually managing what each cell loads.
So I thought I'd do the scaling with GCD at app launch, save the results to a file, and read from that file for each cell.
But this code
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    MPMediaQuery *allAlbumsQuery = [MPMediaQuery albumsQuery];
    NSArray *albumsArray = allAlbumsQuery.collections;

    for (MPMediaItemCollection *collection in albumsArray) {
        @autoreleasepool {
            MPMediaItem *currentItem = [collection representativeItem];
            MPMediaItemArtwork *artwork = [currentItem valueForProperty:MPMediaItemPropertyArtwork];
            UIImage *artworkImage = [artwork imageWithSize:CGSizeMake(90, 90)];
            if (artworkImage) [toBeCached addObject:artworkImage];
            else              [toBeCached addObject:blackImage];
            NSLog(@"%@", [currentItem valueForProperty:MPMediaItemPropertyAlbumTitle]);
            artworkImage = nil;
        }
    }

    dispatch_async(dispatch_get_main_queue(), ^{
        [[NSUserDefaults standardUserDefaults] setObject:[NSKeyedArchiver archivedDataWithRootObject:albumsArray] forKey:@"covers"];
    });
    NSLog(@"finished saving, sir");
});
in AppDelegate's application:didFinishLaunchingWithOptions: method makes my app crash, without any console log.
This seems to be a memory problem: so many images are kept in an NSArray in RAM that, before the save completes, iOS force-closes my app.
Do you have any suggestions on what to do?
Cheers
Take a look at the recently-released SYCache, which combines NSCache and on-disk caching. It's probably a bad idea to get to a memory-warning state as soon as you launch the app, but that's better than force closing.
As the commenter above suggested, mapped data is a technique (using mmap or its equivalent) to load data from disk as if it were all in memory at once, which could help with UIImage loading later on down the road. The inverse (with NSMutableData) is also true: a file can be written to as if it were directly in RAM. As a technique, it could be useful.
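A minimal sketch of that mapped-data idea (coverPath is a hypothetical path to an image file you've already written to disk):

// Map the file instead of copying it all into RAM; pages are read in lazily.
NSError *error = nil;
NSData *coverData = [NSData dataWithContentsOfFile:coverPath
                                           options:NSDataReadingMappedIfSafe
                                             error:&error];
UIImage *cover = coverData ? [UIImage imageWithData:coverData] : nil;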
EDIT: Updating my own question with the answer I figured out months later. Short answer is no, MPMusicPlayerController forwards all calls to the main thread. But it uses a CPDistributedMessagingCenter to actually handle all operations, so one can very easily write a replacement controller that makes asynchronous calls (but it won't work for a sandboxed App Store app, as far as I know - and if it did, Apple would promptly reject it).
I'm making a simple app to control iPod playback, so I've been using an MPMusicPlayerController, which Apple states can only be used in the main thread. However, I've been experiencing some frustrating performance issues in the UI. Switching to the next or previous song is triggered by a swipe, which moves the entire display (of the song info) with it, and then updates the display for the next song when the switch is triggered. The trouble is that once the song has been changed, the UI hangs for up to a second while the song info is retrieved from the MPMusicPlayerController. I've tried most everything I can think of to optimize the code, but it seems to me that the only way to fix it is to move the MPMusicPlayerController code on to a background thread, despite Apple's instructions not to.
The code to update the display, for reference:
// Called when MPMusicPlayerControllerNowPlayingItemDidChangeNotification is received
- (void)nowPlayingDidChange {
    if ([iPodMusicPlayer nowPlayingItem]) {
        // Temp variables (necessary when updating the subview values in the background)
        title   = [[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyTitle];
        artist  = [[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyArtist];
        album   = [[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyAlbumTitle];
        artwork = [[[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyArtwork] imageWithSize:CGSizeMake(VIEW_HEIGHT - (2 * MARGINS), VIEW_HEIGHT - (2 * MARGINS))];
        length  = [[[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyPlaybackDuration] doubleValue];

        if (updateViewInBackground)
            [self performSelectorInBackground:@selector(updateSongInfo) withObject:nil];
        else
            [self updateSongInfo];
    }
    else
        [self setSongInfoAsDefault];
}
- (void)updateSongInfo {
    // Subviews of the UIScrollView that has performance issues
    songTitle.text  = title;
    songArtist.text = artist;
    songAlbum.text  = album;
    songLength.text = [self formatSongLength:length];

    if (!artwork) {
        if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2.00)
            songArtwork.image = [UIImage imageWithContentsOfFile:
                @"/System/Library/Frameworks/MediaPlayer.framework/noartplaceholder@2x.png"];
        else
            songArtwork.image = [UIImage imageWithContentsOfFile:
                @"/System/Library/Frameworks/MediaPlayer.framework/noartplaceholder.png"];
    }
    else
        songArtwork.image = artwork;

    title   = nil;
    artist  = nil;
    album   = nil;
    artwork = nil;
    length  = 0.0;
}
Is there anything I'm missing here (i.e. performance optimization when updating the UIScrollView subviews)? And if not, would it be such a bad idea to just use the MPMusicPlayerController on a background thread? I know that can lead to issues if something else accesses iPodMusicPlayer (the shared iPod instance of MPMusicPlayerController), but are there any ways I could work around that?
Also, this is a jailbreak tweak (a Notification Center widget), so I can make use of Apple's private frameworks if they would work better than, say, the MPMusicPlayerController class (which is fairly limited anyways) for my purposes. That also means, though, that my app will be running as a part of the SpringBoard process, so I want to be sure that my code is as safe and stable as possible (I experience 2 minute hangs whenever my code does something wrong, which I don't want happening when I release this). So if you have any suggestions, I'd really appreciate it! I can provide more code / info if necessary. Thanks!
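If you do experiment with moving the property reads off the main thread, the usual GCD shape would be something like the following (a sketch only; it deliberately ignores Apple's main-thread guidance for MPMusicPlayerController, which is exactly the risk discussed above, and only a couple of the properties are shown):

- (void)nowPlayingDidChange {
    MPMediaItem *item = [iPodMusicPlayer nowPlayingItem];
    if (!item) {
        [self setSongInfoAsDefault];
        return;
    }
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Property reads happen off the main thread (against the documented guidance).
        NSString *newTitle  = [item valueForProperty:MPMediaItemPropertyTitle];
        NSString *newArtist = [item valueForProperty:MPMediaItemPropertyArtist];
        dispatch_async(dispatch_get_main_queue(), ^{
            // UIKit work stays on the main thread.
            songTitle.text  = newTitle;
            songArtist.text = newArtist;
        });
    });
}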