MPNowPlayingInfoCenter doesn't play on lock screen

I'm using MPMoviePlayerController to play audio/video and would like playback to continue when the app goes to the background and the lock screen comes on. I'm setting MPNowPlayingInfoCenter's nowPlayingInfo in viewDidLoad and again on the UIApplicationWillResignActiveNotification notification with the following:
- (void)appWillResignActive {
    NSMutableDictionary *nowPlayingInfo = [[NSMutableDictionary alloc] init];
    [nowPlayingInfo setObject:_session[@"session"][@"title"] forKey:MPMediaItemPropertyTitle];
    [nowPlayingInfo setObject:_session[@"session"][@"author"] forKey:MPMediaItemPropertyArtist];
    [nowPlayingInfo setObject:_session[@"session"][@"album"] forKey:MPMediaItemPropertyAlbumTitle];
    [nowPlayingInfo setObject:[NSNumber numberWithDouble:_videoController.playableDuration] forKey:MPMediaItemPropertyPlaybackDuration];
    [nowPlayingInfo setObject:@(1.0f) forKey:MPNowPlayingInfoPropertyPlaybackRate];
    [nowPlayingInfo setObject:[NSNumber numberWithDouble:_videoController.currentPlaybackTime] forKey:MPNowPlayingInfoPropertyElapsedPlaybackTime];
    if (_session[@"session"][@"album_art"] != nil) {
        UIImage *albumArtImage = [UIImage imageNamed:_session[@"session"][@"album_art"]];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:albumArtImage];
        [nowPlayingInfo setObject:albumArt forKey:MPMediaItemPropertyArtwork];
    }
    [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:nowPlayingInfo];
}
The lock screen shows the current info, but playback doesn't continue at all. The elapsed time seems about right, but the total playable time is incorrect. Pressing play doesn't start it either. I'm not sure what's wrong. Any thoughts?
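Two things are worth checking here, neither of which nowPlayingInfo controls. First, nowPlayingInfo only drives the lock-screen display; for audio to keep running and the play button to respond, the app also needs background audio enabled and remote-control events handled. Second, playableDuration is the buffered amount, which would explain the wrong total; the full length is the moviePlayer's duration property. A minimal sketch of the missing pieces, assuming a view controller that owns _videoController and that the Info.plist UIBackgroundModes array contains "audio":

// Requires AVFoundation in addition to MediaPlayer
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    NSError *sessionError = nil;
    // The Playback category keeps audio alive in the background / under the lock screen
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:&sessionError];
    // Remote-control events are what make the lock-screen transport buttons work
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl &&
        event.subtype == UIEventSubtypeRemoteControlTogglePlayPause) {
        // start or pause _videoController here
    }
}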

Freeing memory in a simple AVAudioEngine program?

I'm wondering how to free memory in this simple program that plays a file through a buffer and then stops it.
-(void)setupAudioOne
{
    NSError *error;
    BOOL success = NO;
    _player = [[AVAudioPlayerNode alloc] init];
    NSURL *hiphopOneURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Hip Hop 1" ofType:@"caf"]];
    AVAudioFile *hiphopOneFile = [[AVAudioFile alloc] initForReading:hiphopOneURL error:&error];
    _playerLoopBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:[hiphopOneFile processingFormat] frameCapacity:(AVAudioFrameCount)[hiphopOneFile length]];
    success = [hiphopOneFile readIntoBuffer:_playerLoopBuffer error:&error];
    _engine = [[AVAudioEngine alloc] init];
    [_engine attachNode:_player];
    AVAudioMixerNode *mainMixer = [_engine mainMixerNode];
    AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
    [_engine connect:_player to:mainMixer fromBus:0 toBus:0 format:stereoFormat];
    [self startEngine];
}
Above is the general setup of the engine and the player node.
Then we implement the player with a play button:
- (IBAction)play:(id)sender {
    if (!self.playerIsPlaying)
    {
        [self setupAudioOne];
        [_player scheduleBuffer:_playerLoopBuffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:nil];
        [_player play];
    }
}
And finally we stop the player with a stop button:
- (IBAction)stopHipHopOne:(id)sender {
    if (self.playerIsPlaying) {
        [_player stop];
    }
}
playerIsPlaying is just a simple BOOL that reflects whether _player is playing.
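As an aside, a flag like this can simply forward to the node, since AVAudioPlayerNode tracks its own state; a minimal sketch, assuming a readonly property:

- (BOOL)playerIsPlaying {
    return _player.isPlaying; // AVAudioPlayerNode exposes its playing state
}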
So basically my question is: as this program is written now, no memory is freed when you hit the stop button.
Surely there is a simple line of code I can add to the stop button that frees up the memory the engine and player are using?
Any thoughts?
Yes, there is. After stopping the player node, you can call:
[_engine disconnectNodeInput:_player];
[_engine detachNode:_player];
I saw you're also keeping a reference to the audio buffer, so you might want to nil that one as well. Let me know if that doesn't work for you; something else could be leaking.
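Putting that together, a sketch of a stop action that tears down everything setupAudioOne created (assuming ARC, where dropping the strong references is what lets the objects deallocate):

- (IBAction)stopHipHopOne:(id)sender {
    if (self.playerIsPlaying) {
        [_player stop];
    }
    [_engine stop];
    [_engine disconnectNodeInput:_player];
    [_engine detachNode:_player];
    // Under ARC, nilling the ivars releases the engine, node, and buffer
    _playerLoopBuffer = nil;
    _player = nil;
    _engine = nil;
}

Since play: calls setupAudioOne each time, everything is rebuilt on the next press.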

iPad capturing 16:9 photos

I am building a prototype app on iOS, and I’m cannibalizing some Apple sample code to do it (thin ice, I know—this code uses goto statements :\ ). I am using the AVCam project from Session 520 - What's New in Camera Capture. I don’t need video capture capability, just still photos.
The device inputs and outputs are set up thusly:
// Init the device inputs
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
AVCaptureDeviceInput *newAudioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];

// Setup the still image file output
AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
[newStillImageOutput setOutputSettings:outputSettings];

// Create session (use default AVCaptureSessionPresetHigh)
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

// Add inputs and output to the capture session
if ([newCaptureSession canAddInput:newVideoInput]) {
    [newCaptureSession addInput:newVideoInput];
}
if ([newCaptureSession canAddInput:newAudioInput]) {
    [newCaptureSession addInput:newAudioInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}

[self setStillImageOutput:newStillImageOutput];
[self setVideoInput:newVideoInput];
[self setAudioInput:newAudioInput];
[self setSession:newCaptureSession];
And here is the method that’s called when I tap the shutter button:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:orientation];

    [[self stillImageOutput]
        captureStillImageAsynchronouslyFromConnection:stillImageConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
                if (error)
                {
                    if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)])
                    {
                        [[self delegate] captureManager:self didFailWithError:error];
                    }
                }
            };
            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                if ([self.delegate respondsToSelector:@selector(captureManagerCapturedImage:)])
                {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self.delegate captureManagerCapturedImage:image];
                    });
                }
                [library writeImageToSavedPhotosAlbum:[image CGImage]
                                          orientation:(ALAssetOrientation)[image imageOrientation]
                                      completionBlock:completionBlock];
            }
            else
            {
                completionBlock(nil, error);
            }
            if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)])
            {
                [[self delegate] captureManagerStillImageCaptured:self];
            }
        }];
}
This code successfully captures an image and saves it to the library. However, at some point while I was working on it, it changed from capturing 5-megapixel 4:3 images to capturing 1920x1080 16:9 images. I can’t find anywhere that the aspect ratio is specified, and I didn’t change any of the code relating to the configuration of the camera, capture sessions, or capture connection. Why did my camera start taking 16:9 photos?
Update: I just re-ran Apple’s original sample code, and it appears that it is also saving 16:9 images captured directly from the video. It is quite possible that I was insane before, or I took a test shot with Camera.app and was looking at that. So my real question is: how do I show a live feed from the camera on the screen while I’m shooting, yet capture a full-resolution photo? I can’t use UIImagePickerController, because I need to be able to overlay things on top of the live camera feed.
Update 2: I was able to solve this by throwing out the AVCapture code I was using. It turns out that UIImagePickerController does what I needed. I didn’t realize you could overlay custom controls - I thought it took over the whole screen until you were done taking a picture.
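For anyone else taking this route, a minimal sketch of the overlay approach (the overlay contents are placeholders; the delegate must adopt both UINavigationControllerDelegate and UIImagePickerControllerDelegate):

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO; // hide the stock shutter UI
UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
// ... add custom buttons, reticles, etc. to overlay here ...
picker.cameraOverlayView = overlay; // drawn on top of the live feed
picker.delegate = self;
[self presentViewController:picker animated:YES completion:nil];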
If you're capturing frames from a video source, you'll end up with a 16:9 aspect ratio. Capturing frames from a video source and taking photos are different things.
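If you stay with AVFoundation instead, the usual fix (a sketch based on the standard API, not something from the original answer) is to switch the session to the photo preset, which makes stills come out at the sensor's full 4:3 resolution, while an AVCaptureVideoPreviewLayer provides the live feed you can overlay views on:

// Assuming 'session' is the capture session from the setup code above
if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    session.sessionPreset = AVCaptureSessionPresetPhoto; // full-resolution 4:3 stills
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
previewLayer.frame = self.view.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:previewLayer]; // live feed; add overlay views above it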

AVAudioRecorder doesn't record while screen is locked

I've been trying to overcome this for a while. I'm trying to record sound, but AVAudioRecorder doesn't record while the screen is locked. It does continue to record once the screen is unlocked, but the audio from while the screen was locked is lost forever. I can't find anything wrong with what I'm doing:
-(void) startRecording
{
    // Begin the recording session.
    _session = [AVAudioSession sharedInstance];
    NSError *setCategoryError = nil;
    NSError *startRecordError;
    [_session setActive:YES error:&startRecordError];
    [self GKLog:[NSString stringWithFormat:@"recorder session error? :%@", startRecordError]];
    [_session setCategory:AVAudioSessionCategoryRecord error:&setCategoryError];
    if (setCategoryError) { NSLog(@"some error"); }

    // Set me as delegate
    _session.delegate = (id <AVAudioSessionDelegate>)self;

    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithInt:8] forKey:AVEncoderBitRateKey];
    [recordSetting setValue:[NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];

    if (!self.currentPath)
    {
        NSLog(@"can't record, no path set!");
        return;
    }

    NSError *error;
    NSURL *url = [NSURL fileURLWithPath:self.currentPath];

    // Setup the recorder to use this file and record to it.
    _recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&error];
    [self GKLog:[NSString stringWithFormat:@" recorder:%@", _recorder]];
    _recorder.delegate = (id <AVAudioRecorderDelegate>)self;
    [_recorder prepareToRecord];

    // Start the actual recording
    [_recorder record];
}
Any ideas, please?
Ok, so the answer to my own question, which took me a long time to find, is the following: the code I posted is good, but for it to actually work the app needs to keep running in the background after the screen is locked. For this, one needs to add a UIBackgroundModes array to the app's Info.plist and add 'audio' as one of its objects. This tells the system to let the app work with audio in the background.
Here's the not-so-easy-to-find documentation. Unfortunately, Apple doesn't mention this in the documentation of the audio session categories, where they claim certain categories work in the background. Anyway, hopefully this answer will be available for others who have a similar problem...
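For reference, the raw-XML form of that Info.plist entry looks like this:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>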
You may want to consider setting the audio session category to AVAudioSessionCategoryRecord.
How about disabling the screen lock until you are done recording?
[UIApplication sharedApplication].idleTimerDisabled = YES;
// Do recording here
[UIApplication sharedApplication].idleTimerDisabled = NO;
Just don't forget to re-enable the screen lock when you're done! Note that this only prevents the automatic lock; the user can still lock the device manually.

Help on Objective-C? My app keeps crashing?

I am creating an app for the iPhone/iPod touch, but I keep running into a couple of major leaks that just crash the app. In my game, right when I press play (from the home screen), it goes to another page that has the game on it. But right after it appears, in the console I get a warning: Memory level=1. What could be happening? Here is my viewDidLoad method:
-(void)viewDidLoad {
    [super viewDidLoad];
    array = [[NSMutableArray alloc] initWithCapacity:100];
    bulletArray = [[NSMutableArray alloc] initWithCapacity:100];
    pos = CGPointMake(0.0, -5.0);
    NSString *path = [[NSBundle mainBundle] pathForResource:@"GloriousMorning" ofType:@"mp3"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    //float effects_Volume = [[NSUserDefaults standardUserDefaults] floatForKey:@"effectsVolume"];
    //theAudio.volume = effects_Volume;
    [theAudio play];
}
And also, a second question: since my game is a shooting game, the user presses a button titled "Fire". But every time I test my app on a device, it crashes when I press the fire button. Here is my code for the fire button.
-(IBAction)Fire {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"gunShot" ofType:@"mp3"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    //float effects_Volume = [[NSUserDefaults standardUserDefaults] floatForKey:@"effectsVolume"];
    //theAudio.volume = effects_Volume;
    [theAudio play];
    //IBOutlet UIImageView *newBullet = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Bullet.png"]];
    UIImageView *newBullet = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Bullet.png"]];
    newBullet.frame = CGRectMake(0, 0, 10.0, 10.0);
    newBullet.center = CGPointMake(239.0, 236.0);
    [bulletArray addObject:newBullet];
    [self.view addSubview:newBullet];
}
First, I create a sound. Then I place a bullet right where the gun is currently located and add it to an array, so that every 0.01 seconds, in another bit of code, I can run through the array and check every bullet for collisions.
Please tell me what I am doing wrong. Thanks!!!
The error that makes the app crash when I click the Fire button is this:
GDB: Data Formatters temporarily unavailable, will retry after a 'continue'(unknown error loading shared library "
And also, I think I am creating a huge leak when I try to play the audio; at least that's what someone told me. (If that is the case, please tell me how to fix it.)
You alloc'd theAudio, so you need to release it when you are done. It might be better to make it an ivar so you don't have to set up and tear down the audio every time the fire button is pressed.
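A minimal sketch of that suggestion, assuming manual reference counting (this code predates ARC) and an AVAudioPlayer ivar named theAudio declared in the class interface:

// In the class interface: AVAudioPlayer *theAudio;

-(IBAction)Fire {
    if (!theAudio) {
        NSString *path = [[NSBundle mainBundle] pathForResource:@"gunShot" ofType:@"mp3"];
        theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
        [theAudio prepareToPlay];
    }
    theAudio.currentTime = 0; // restart the sound for each shot
    [theAudio play];          // reuse one player instead of leaking one per press
    // ... bullet creation as before ...
}

- (void)dealloc {
    [theAudio release]; // balance the alloc
    [super dealloc];
}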

iPad video loading issue

I am developing a swipe-based application. On each swipe, a page appears containing a video and some other assets. I am using the following code to do this on each swipe.
NSArray *file = [videoFile componentsSeparatedByCharactersInSet:[NSCharacterSet characterSetWithCharactersInString:@"."]];
NSString *moviePath = [[NSBundle mainBundle] pathForResource:[file objectAtIndex:0] ofType:[file objectAtIndex:1]];
if (moviePath != nil)
{
    //self.videoTimer = [NSTimer scheduledTimerWithTimeInterval:delayTime target:self selector:@selector(playVideo:) userInfo:nil repeats:NO];
    self.theMovie = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviePath]];
    self.theMovie.view.frame = self.bounds;
    self.theMovie.moviePlayer.scalingMode = MPMovieScalingModeAspectFit;
    self.theMovie.moviePlayer.controlStyle = MPMovieControlStyleNone;
    self.theMovie.moviePlayer.movieSourceType = MPMovieSourceTypeFile;
    [self addSubview:self.theMovie.view];
    [self.theMovie release];
}
else
{
    [AssetValidator alertMissingFileInfo:videoFile];
}
This works fine.
My problem is that each time I swipe, the video starts with a delay and a black screen.
Please guide me on how to solve this problem.
Regards.
How big are the movies? If they are large, the delay is expected. There is a solution, but it's quite irrational: load all the movies into memory during app startup and keep them there at runtime. Otherwise (if the movies are quite big), the delay will happen anyway.
We can use a background image to avoid the black screen.
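Combining both suggestions, a sketch (prepareToPlay comes from the MPMediaPlayback protocol; the poster image name is a placeholder):

// Show a poster frame so the user never sees black while the movie spins up
UIImageView *poster = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"poster.png"]];
poster.frame = self.bounds;
[self addSubview:poster];

// Create the player as early as possible (e.g., when the neighboring page is built)
// and start prebuffering before the page becomes visible
self.theMovie = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviePath]];
self.theMovie.moviePlayer.shouldAutoplay = NO;
[self.theMovie.moviePlayer prepareToPlay];
// When the swipe lands, add theMovie.view above the poster and call -play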