I've been developing a music player recently, and I'm writing my own pickers.
I'm trying to test my code to its limits, so I have around 1600 albums on my iPhone.
I'm using AQGridView for the albums view, and since MPMediaItemArtwork is a subclass of NSObject, you need to call a method on it to get an image out of it, and that method scales the image.
As you can guess, scaling for each cell uses too much CPU, so my grid album view is laggy despite all my effort at manually drawing each cell's contents.
So I thought I would start the scaling with GCD at app launch, save the results to a file, and read that file for each cell.
But my code
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    MPMediaQuery *allAlbumsQuery = [MPMediaQuery albumsQuery];
    NSArray *albumsArray = allAlbumsQuery.collections;
    for (MPMediaItemCollection *collection in albumsArray) {
        @autoreleasepool {
            MPMediaItem *currentItem = [collection representativeItem];
            MPMediaItemArtwork *artwork = [currentItem valueForProperty:MPMediaItemPropertyArtwork];
            UIImage *artworkImage = [artwork imageWithSize:CGSizeMake(90, 90)];
            if (artworkImage) [toBeCached addObject:artworkImage];
            else [toBeCached addObject:blackImage];
            NSLog(@"%@", [currentItem valueForProperty:MPMediaItemPropertyAlbumTitle]);
            artworkImage = nil;
        }
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        [[NSUserDefaults standardUserDefaults] setObject:[NSKeyedArchiver archivedDataWithRootObject:albumsArray] forKey:@"covers"];
    });
    NSLog(@"finished saving, sir");
});
in my AppDelegate's application:didFinishLaunchingWithOptions: method makes my app crash, without any console log or the like.
This seems to be a memory problem: so many images are kept in an NSArray in RAM until the save that iOS force-closes my app.
Do you have any suggestions on what to do?
Cheers
Take a look at the recently released SYCache, which combines NSCache and on-disk caching. It's probably a bad idea to reach a memory-warning state as soon as you launch the app, but that's better than being force-closed.
As the commenter above suggested, mapped data is a technique (using mmap or its equivalent) for loading data from disk as if it were all in memory at once, which could help with UIImage loading later on down the road. The inverse (with NSMutableData) is also true: a file can be written to as if it were directly in RAM. As a technique, it could be useful.
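To make that concrete, here's a minimal sketch of the save-then-read approach, assuming one PNG per album in the Caches directory (the method names and file naming are made up for illustration; this is not the asker's actual code). Each scaled cover is written to disk once on the background queue, then read back per cell through a memory-mapped NSData so the bytes are paged in on demand instead of sitting in an array:

// Background pass: write each scaled cover to the Caches directory once.
- (void)writeScaledArtwork:(UIImage *)image atIndex:(NSUInteger)index {
    NSString *dir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) lastObject];
    NSString *path = [dir stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"cover-%lu.png", (unsigned long)index]];
    [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
}

// Per cell: a mapped read keeps the bytes on disk until they are actually needed.
- (UIImage *)scaledArtworkAtIndex:(NSUInteger)index {
    NSString *dir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) lastObject];
    NSString *path = [dir stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"cover-%lu.png", (unsigned long)index]];
    NSData *mapped = [NSData dataWithContentsOfFile:path options:NSDataReadingMappedIfSafe error:NULL];
    return mapped ? [UIImage imageWithData:mapped] : nil;
}

Writing one small file per album, rather than archiving one big array into NSUserDefaults, also avoids holding all 1600 images in memory at once.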
I know very little about using background threads, but this seems to play my sound the way I need it to:
1) I need this very short sound effect to play repeatedly even if the sound overlaps.
2) I need the sound to be played perfectly on time.
3) I need the loading of the sound not to make the on-screen graphics stutter.
I am currently just trying out this method with one sound, but if successful, I will roll it out to other sound effects that need the same treatment. My question is this: am I using the background thread properly? Will there be any sort of memory leak?
Here's the code:
- (void)playAudio {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        NSString *path = [NSString stringWithFormat:@"%@/metronome.mp3", [[NSBundle mainBundle] resourcePath]];
        NSURL *metronomeSound = [NSURL fileURLWithPath:path];
        _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:metronomeSound error:nil];
        [_audioPlayer prepareToPlay];
        [_audioPlayer play];
    });
}

// Handles collision detection
- (void)didBeginContact:(SKPhysicsContact *)contact {
    uint32_t categoryA = contact.bodyA.categoryBitMask;
    uint32_t categoryB = contact.bodyB.categoryBitMask;
    if (categoryA == kLineCategory || categoryB == kLineCategory) {
        NSLog(@"line contact");
        [self playAudio];
    }
}
I use AVAudioPlayer asynchronously and on background threads without any problems, and with no leaks as far as I can tell. However, I have implemented a singleton class that handles all the allocations and keeps an array of AVAudioPlayer instances that also play asynchronously as needed. If you need to play a sound repeatedly, you should allocate an AVAudioPlayer instance for each time you want to play it. In that case, latency will be negligible and you can even play the same sound simultaneously.
Concerning your strategy, I think it needs some refinement, in particular if you want to prevent any delays. The main problem is always reading from disk, which is the slowest operation of all and is your limiting step.
Thus, I would also implement an array of AVAudioPlayers, each already initialized to play a specific sound, in particular for sounds that are played often and repeatedly. If memory starts to grow, you could remove from the array those players that are used less often, and reload them a few seconds before they are needed, if you can tell which ones will be needed.
And one more thing... Don't forget to lock and unlock the array if you are going to access it from multiple threads, or better yet, create a GCD queue to handle all accesses to the array.
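As a rough illustration of that suggestion (the class name, queue label, and pool size are all made up, and ARC is assumed), a singleton could preload a small pool of players for a sound and serialize every access to the pool on its own GCD queue:

#import <AVFoundation/AVFoundation.h>

@interface SoundManager : NSObject
+ (SoundManager *)sharedManager;
- (void)playMetronomeTick;
@end

@implementation SoundManager {
    NSMutableArray *_players;     // pool of preloaded AVAudioPlayers
    NSUInteger _nextIndex;        // round-robin cursor into the pool
    dispatch_queue_t _soundQueue; // serializes all access to the pool
}

+ (SoundManager *)sharedManager {
    static SoundManager *shared = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ shared = [[SoundManager alloc] init]; });
    return shared;
}

- (id)init {
    if ((self = [super init])) {
        _soundQueue = dispatch_queue_create("com.example.soundmanager", DISPATCH_QUEUE_SERIAL);
        _players = [[NSMutableArray alloc] init];
        NSURL *url = [[NSBundle mainBundle] URLForResource:@"metronome" withExtension:@"mp3"];
        // Preload several players so overlapping plays never wait on the disk.
        NSUInteger i;
        for (i = 0; i < 4; i++) {
            AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
            [player prepareToPlay];
            if (player) [_players addObject:player];
        }
    }
    return self;
}

- (void)playMetronomeTick {
    dispatch_async(_soundQueue, ^{
        if ([_players count] == 0) return;
        AVAudioPlayer *player = [_players objectAtIndex:_nextIndex];
        _nextIndex = (_nextIndex + 1) % [_players count];
        player.currentTime = 0; // rewind in case this player is being reused
        [player play];
    });
}
@end

The didBeginContact: handler would then just call [[SoundManager sharedManager] playMetronomeTick], and the disk read happens only once, at initialization.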
I am attempting to create an application that goes through various images from the net, and I aim to cache them on the iPhone for offline use. The code I am currently working with is:
NSMutableDictionary *Cache;

- (UIImage *)CachedImage:(NSString *)url {
    UIImage *image = [Cache objectForKey:url];
    if (image == nil) {
        image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:url]]];
        [Cache setObject:image forKey:url];
        //NSLog(@"Stored");
        return image;
    } else {
        //NSLog(@"Taken");
        return image;
    }
}
I call the method and place the image into an image view using the line of code below.
[self.imageView setImage:[self CachedImage:url]]; // Change url to desired URL.
Using the NSLogs, I found that the code doesn't actually store the image, because the lookup always reads nil. Why is that, and are there other ways of storing images for offline use?
Thanks in advance.
-Gon
Use NSCache to cache UIImages. You can also save the images locally (if you reuse them a lot and across multiple launches), so that whenever your app closes or you flush your cache, you can get the images immediately from your local directory.
https://developer.apple.com/library/mac/#documentation/Cocoa/Reference/NSCache_Class/Reference/Reference.html
You are looking for NSCache. Check it out here: http://nshipster.com/nscache/
Poor NSCache, always being overshadowed by NSMutableDictionary in the most inappropriate circumstances. It's like no one knows it's there, ready to provide all of that garbage collection behavior that developers take great pains to re-implement themselves.
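For illustration, here is a hedged sketch of the method from the question reworked around NSCache with a simple disk fallback (the method name and file-naming scheme are assumptions). One thing worth flagging: the question's NSMutableDictionary is never allocated, so setObject:forKey: silently messages nil, which is why the lookup always reads nil; NSCache must actually be created before use:

// Assumes a property declared in your class extension:
// @property (nonatomic, strong) NSCache *imageCache;

- (UIImage *)cachedImageForURL:(NSString *)url {
    if (!self.imageCache) self.imageCache = [[NSCache alloc] init];

    // 1. In-memory cache (purged automatically under memory pressure).
    UIImage *image = [self.imageCache objectForKey:url];
    if (image) return image;

    // 2. On-disk copy for offline use (file name derived from the URL's hash).
    NSString *dir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) lastObject];
    NSString *path = [dir stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"%lu.img", (unsigned long)[url hash]]];
    NSData *data = [NSData dataWithContentsOfFile:path];

    // 3. Fall back to the network, then populate both caches.
    if (!data) {
        data = [NSData dataWithContentsOfURL:[NSURL URLWithString:url]];
        [data writeToFile:path atomically:YES];
    }
    image = data ? [UIImage imageWithData:data] : nil;
    if (image) [self.imageCache setObject:image forKey:url];
    return image;
}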
EDIT: Updating my own question with the answer I figured out months later. Short answer is no, MPMusicPlayerController forwards all calls to the main thread. But it uses a CPDistributedMessagingCenter to actually handle all operations, so one can very easily write a replacement controller that makes asynchronous calls (but it won't work for a sandboxed App Store app, as far as I know - and if it did, Apple would promptly reject it).
I'm making a simple app to control iPod playback, so I've been using an MPMusicPlayerController, which Apple states can only be used on the main thread. However, I've been experiencing some frustrating performance issues in the UI. Switching to the next or previous song is triggered by a swipe, which moves the entire display (of the song info) with it, and then updates the display for the next song when the switch is triggered. The trouble is that once the song has changed, the UI hangs for up to a second while the song info is retrieved from the MPMusicPlayerController. I've tried most everything I can think of to optimize the code, but it seems to me that the only way to fix it is to move the MPMusicPlayerController code onto a background thread, despite Apple's instructions not to.
The code to update the display, for reference:
// Called when MPMusicPlayerControllerNowPlayingItemDidChangeNotification is received
- (void)nowPlayingDidChange {
    if ([iPodMusicPlayer nowPlayingItem]) {
        // Temp variables (necessary when updating the subview values in the background)
        title = [[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyTitle];
        artist = [[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyArtist];
        album = [[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyAlbumTitle];
        artwork = [[[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyArtwork]
                   imageWithSize:CGSizeMake(VIEW_HEIGHT - (2*MARGINS), VIEW_HEIGHT - (2*MARGINS))];
        length = [[[iPodMusicPlayer nowPlayingItem] valueForProperty:MPMediaItemPropertyPlaybackDuration] doubleValue];
        if (updateViewInBackground)
            [self performSelectorInBackground:@selector(updateSongInfo) withObject:nil];
        else
            [self updateSongInfo];
    }
    else
        [self setSongInfoAsDefault];
}

- (void)updateSongInfo {
    // Subviews of the UIScrollView that has performance issues
    songTitle.text = title;
    songArtist.text = artist;
    songAlbum.text = album;
    songLength.text = [self formatSongLength:length];
    if (!artwork) {
        if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2.00)
            songArtwork.image = [UIImage imageWithContentsOfFile:
                @"/System/Library/Frameworks/MediaPlayer.framework/noartplaceholder@2x.png"];
        else
            songArtwork.image = [UIImage imageWithContentsOfFile:
                @"/System/Library/Frameworks/MediaPlayer.framework/noartplaceholder.png"];
    }
    else
        songArtwork.image = artwork;
    title = nil;
    artist = nil;
    album = nil;
    artwork = nil;
    length = 0.0;
}
Is there anything I'm missing here (i.e., performance optimizations when updating the UIScrollView subviews)? And if not, would it be such a bad idea to just use the MPMusicPlayerController on a background thread? I know that can lead to issues if something else is accessing iPodMusicPlayer (the shared instance of the iPod in MPMusicPlayerController), but are there any ways I could potentially work around that?
Also, this is a jailbreak tweak (a Notification Center widget), so I can make use of Apple's private frameworks if they would work better than, say, the MPMusicPlayerController class (which is fairly limited anyway) for my purposes. That also means, though, that my app will be running as part of the SpringBoard process, so I want to be sure that my code is as safe and stable as possible (I experience two-minute hangs whenever my code does something wrong, which I don't want happening when I release this). So if you have any suggestions, I'd really appreciate it! I can provide more code / info if necessary. Thanks!
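One detail worth flagging in the code above: updateSongInfo touches UIKit views, and performSelectorInBackground: runs it off the main thread, which UIKit does not support. A minimal sketch of the usual GCD pattern follows; it gathers the values first and then hops to the main queue for the view updates (whether the value reads themselves can safely leave the main thread is exactly the open question here, and per the EDIT above MPMusicPlayerController forwards its calls to the main thread anyway):

- (void)nowPlayingDidChange {
    MPMediaItem *item = [iPodMusicPlayer nowPlayingItem];
    if (!item) {
        [self setSongInfoAsDefault];
        return;
    }
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Gather the slow values off the main thread (see the caveat above).
        NSString *newTitle = [item valueForProperty:MPMediaItemPropertyTitle];
        NSString *newArtist = [item valueForProperty:MPMediaItemPropertyArtist];
        dispatch_async(dispatch_get_main_queue(), ^{
            // UIKit work must stay on the main thread.
            songTitle.text = newTitle;
            songArtist.text = newArtist;
        });
    });
}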
I'm recording a video from the iSight camera using QTCaptureSession.
I would like to add an image at the end of the video, so I've implemented the didFinishRecordingToOutputFileAtURL delegate method. Here's what I've done so far:
- (void)captureOutput:(QTCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL forConnections:(NSArray *)connections dueToError:(NSError *)error
{
    // Prepare final video
    QTMovie *originalMovie = [QTMovie movieWithURL:outputFileURL error:nil];
    [originalMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];
    NSImage *splashScreen = [NSImage imageNamed:@"video-ending.jpg"];
    NSImage *tiffImage = [[NSImage alloc] initWithData:[splashScreen TIFFRepresentation]];
    id attr = [NSDictionary dictionaryWithObjectsAndKeys:@"tiff", QTAddImageCodecType,
               [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality,
               nil];
    [originalMovie addImage:tiffImage forDuration:QTMakeTime(2, 1) withAttributes:attr];
    [tiffImage release];
    [originalMovie updateMovieFile];
}
The problem with this code is that while QuickTime plays the result fine, other players don't. I'm sure I'm missing something basic here.
It would also be cool to add the image to the video before it gets saved (to avoid writing it two times). Here's how I stop recording right now:
- (void)stopRecording
{
    // It would be cool to add an image here
    [mCaptureMovieFileOutput recordToOutputFileURL:nil];
}
While I used Cocoa Touch, this might still apply. I have two tips based on my experience writing images to movies. First, while I'll bet that addImage:forDuration: takes care of a lot of things that AVAssetExportSession does not, I had to make sure that images were added more regularly than a couple of times a second, or they would not work well with all players. Second, if there is a network-streaming option, such as AVAssetExportSession's shouldOptimizeForNetworkUse, which moves as much of the metadata and headers as possible to the front of the movie, I found that it made the video compatible with more players as well.
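As a rough sketch of that second tip (the output path is made up, ARC is assumed, and this assumes AVFoundation is available alongside your QTKit code, i.e. OS X 10.7 or later), you could re-export the finished movie once recording completes:

// Re-export the recorded movie so its headers/metadata land at the front,
// which tends to make the file play in more players.
AVAsset *asset = [AVURLAsset URLAssetWithURL:outputFileURL options:nil];
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                presetName:AVAssetExportPresetPassthrough];
export.outputURL = [NSURL fileURLWithPath:@"/tmp/final.mov"]; // hypothetical path
export.outputFileType = AVFileTypeQuickTimeMovie;
export.shouldOptimizeForNetworkUse = YES; // pull metadata/headers forward
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted)
        NSLog(@"Optimized export finished");
}];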
Long-time reader, first-time asker...
I am making a music app which uses AVAssetReader to read MP3 data from the iTunes library. I need precise timing, so when I create an AVURLAsset, I use the AVURLAssetPreferPreciseDurationAndTimingKey option to extract timing data. This has some overhead (and I have no problems when I don't use it, but I need it!).
Everything works fine on an iPhone 4 and an iPad 1. I would like it to work on my iPod touch (2nd gen), but it doesn't: if the sound file is too long (> ~7 minutes), then the AVAssetReader cannot start reading and throws an error (AVFoundationErrorDomain error -11800).
It appears that I am hitting a wall in terms of the scarcer resources of the iPod touch. Any ideas what is happening, or how to manage the overhead of creating the AVURLAsset so that it can handle long files?
(I tried running this with the performance tools, and I don't see a major spike in memory.)
Thanks, Dan
Maybe you're starting to read too soon? As far as I understand, for MP3 it will need to go through the entire file in order to enable precise timing. So, try delaying the reading.
You can also try registering as an observer for some of the AVAsset properties. iOS 4.3 has a 'readable' property. I've never tried it, but my guess would be that it's initially set to NO, and as soon as the AVAsset has finished loading it gets set to YES.
EDIT:
Actually, I just looked into the docs. You're supposed to use the AVAsynchronousKeyValueLoading protocol for that, and Apple provides an example:
NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = [NSArray arrayWithObject:@"duration"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
    NSError *error = nil;
    AVKeyValueStatus durationStatus = [asset statusOfValueForKey:@"duration" error:&error];
    switch (durationStatus) {
        case AVKeyValueStatusLoaded:
            [self updateUserInterfaceForDuration];
            break;
        case AVKeyValueStatusFailed:
            [self reportError:error forAsset:asset];
            break;
        case AVKeyValueStatusCancelled:
            // Do whatever is appropriate for cancelation.
            break;
    }
}];
If 'duration' doesn't help, try 'readable' (but as I mentioned before, 'readable' requires 4.3). Maybe this will solve your issue.
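Tying this back to the question, a minimal sketch of deferring the reader until the asset has finished loading might look like the following; the AVURLAssetPreferPreciseDurationAndTimingKey option is from the question, while the variable names and the track-output step are illustrative:

// Create the asset with precise timing requested, but only build the
// AVAssetReader once the 'duration' value has actually finished loading.
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                    forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:songURL options:options]; // songURL: hypothetical
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"duration"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"duration" error:&error] != AVKeyValueStatusLoaded) {
        NSLog(@"Asset not ready: %@", error);
        return;
    }
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    // ...configure an AVAssetReaderTrackOutput and call startReading here...
}];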