iPhone MPMoviePlayerController lost sound while playing video on a real device - objective-c

I am having an issue where the video's sound is lost while it is playing.
I followed all the standard practices, but the audio sometimes gets muted or simply cuts out at the end of the video file, and I have no idea why. Does anyone know what might be the issue?
This only happens when running the app on a real device; I can't reproduce it on the simulator.
Could a "didReceiveMemoryWarning" cause this? I sometimes receive that warning when the problem occurs.
The video file is streamed from a URL; the format is MPEG-4 Movie and the size is under 6.2 MB.
I have the following code:
- (IBAction)playMovie:(NSString *)theUrl
         setMovieType:(NSString *)theType
        setPlayOption:(int)theOption
    setSenderUIButton:(UIButton *)thisSender
{
    NSString *getvdoUrl = [[NSString alloc] initWithString:theUrl];
    NSURL *thisVdoURL = [NSURL URLWithString:getvdoUrl];
    [getvdoUrl release];
    getvdoUrl = nil;

    MPMoviePlayerController *movieplayer =
        [[MPMoviePlayerController alloc] initWithContentURL:thisVdoURL];
    thisVdoURL = nil;

    if (movieplayer)
    {
        self.vdoPlayer = movieplayer;
        [movieplayer release];

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(moviePlayBackDidFinish:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:self.vdoPlayer];
        [self.vdoPlayer play];
    }
}
- (void)moviePlayBackDidFinish:(NSNotification *)theNotification
{
    MPMoviePlayerController *movieplayer = [theNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:movieplayer];
    movieplayer.initialPlaybackTime = 0.0;
    [movieplayer stop];
}
Below are the warning messages I received before the video file started to play:
warning: Unable to read symbols for
"/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.1.3
(7E18)/Symbols/System/Library/VideoDecoders/VCH263.videodecoder"
(file not found).
warning: Unable to read symbols for
"/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.1.3
(7E18)/Symbols/System/Library/VideoDecoders/H264H1.videodecoder"
(file not found).
warning: Unable to read symbols for
"/Developer/Platforms/iPhoneOS.platform/DeviceSupport/3.1.3
(7E18)/Symbols/System/Library/VideoDecoders/MP4VH1.videodecoder"
(file not found).
2010-03-29 16:57:25.830 ....
v2[4663:207] setting
file:///private/var/mobile/Applications/7DCB1FCC-7268-4551-B737-8B418CA4A07E/tmp/MediaCache/[html]

You should try a different MP4 file; they are not all equal. They should be optimized for streaming if you are creating them from QuickTime, or "hinted" if you create them with a tool such as mp4box. There could also be an issue with your file's audio timestamps; try playing the file in the Safari browser and/or QuickTime to see whether the same problems appear. I assume you are already following the maximum bitrate/profile/level settings for H.264 and AAC.
If you really are running out of memory on the phone, this can happen: the media player runs in a separate process, not directly in your application (I guess to let it use the GPU decoder and to sandbox it).
Are you playing this video just once in the app? Are you removing the notification observer afterwards? It is usually enough to register it just once, even before playing any video (see the sketch below).
Some of this depends on which firmware you are compiling against and running on; bugs tend to be fixed in later versions, but as of 4.0 the whole API changed and the code needs updating as well.
The warning messages are harmless and common for the simulator; they just come from the simulator trying to load debugging symbols for modules outside of the SDK. Since the simulator's QuickTime runs through your desktop QuickTime, you may see your sound-card driver and additional codecs there depending on your setup.
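Here is a minimal sketch of that one-time registration under manual reference counting (matching the question's use of release); placing it in viewDidLoad and dealloc is my assumption, not something from the thread:
// Register once for the lifetime of the controller instead of around every playback.
- (void)viewDidLoad
{
    [super viewDidLoad];
    // object:nil observes the notification from any MPMoviePlayerController instance.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(moviePlayBackDidFinish:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:nil];
}

- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [vdoPlayer release];
    [super dealloc];
}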

Related

Detecting AUv3 invalidations in a host app (iOS)

I'm working on an AUv3 host app, and I want users to be notified when a loaded audio unit crashes (gets invalidated) for some reason. I looked at Apple's documentation for AUAudioUnit, but couldn't find any information about how to detect the invalidation of an audio unit (AUv3) from the host app.
Does anyone know how to do this?
Thanks
When setting up your audio engine, you can observe kAudioComponentInstanceInvalidationNotification to find out whether an audio component instance is invalidated later on.
[[NSNotificationCenter defaultCenter] addObserverForName:(__bridge NSNotificationName)kAudioComponentInstanceInvalidationNotification
                                                  object:nil
                                                   queue:nil
                                              usingBlock:^(NSNotification *note) {
    // The notification object is the AUAudioUnit that was invalidated.
    AUAudioUnit *auUnit = (AUAudioUnit *)note.object;
    // userInfo carries the C-level AudioUnit address wrapped in an NSValue.
    NSValue *val = note.userInfo[@"audioUnit"];
    AudioUnit auAddress = (AudioUnit)val.pointerValue;
    NSLog(@"AudioComponent Invalidation of Unit:%@ with Address:%p", auUnit, auAddress);
}];
Keep in mind that you may have received the notification while the component ended up in a deadlock, so when trying to recover you may have to tear down the whole host app.
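As a hedged sketch of the teardown path, assuming an AVAudioEngine-based host where the invalidated unit lives in a hypothetical avUnitNode property (these names are illustrative, not from this thread):
// Hypothetical helper, called from the invalidation observer above.
// self.engine (AVAudioEngine) and self.avUnitNode (AVAudioUnit) are assumed properties.
- (void)handleInvalidationOfUnit:(AUAudioUnit *)auUnit
{
    if (auUnit != self.avUnitNode.AUAudioUnit) {
        return; // Not the unit this host instantiated.
    }
    // Detach the dead node so the engine's render graph stays consistent.
    [self.engine disconnectNodeOutput:self.avUnitNode];
    [self.engine detachNode:self.avUnitNode];
    self.avUnitNode = nil;
    // Re-instantiate the unit here, or surface the failure to the user
    // if the extension process is wedged.
}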

iOS MPMoviePlayerController does not play audio after the app was relaunched on iPad

I'm trying to use MPMoviePlayerController to create an audio player without having to implement my own scrubbing and play/pause button.
I have code that records audio into NSData and saves it to disk. The method below tests audio by playing it with the MPMoviePlayerController.
The code below works (audio plays and is heard) if I execute the method immediately after recording is done. It also works if I press the home button and then return to the app.
However, when I kill the app and restart it, or hit "Run" from Xcode, I do not hear any audio. Here are the symptoms:
The code below confirms that the NSData exists on disk and has nonzero length
The path to NSData is the same both times
NSData is kAudioFormatMPEG4AAC format
The media player displays correct duration
Media player's scrubber moves from start to finish
Speaker volume is set to maximum in both cases.
No audio is heard after the app was killed and restarted.
What could be causing my MPMoviePlayerController to not provide any audio upon app relaunch? I'm writing the audio length into the file's extended attributes, could this be messing with the "Playability" of the file?
- (void)testPlayback:(AudioNote *)note
{
    NSString *path = [note filepath];
    if (note == nil || path.length == 0)
    {
        return;
    }
    NSURL *url = [NSURL fileURLWithPath:path];

    // File exists, and data exists on disk in both cases
    NSString *exists = ([note fileExists] ? @"YES" : @"NO");
    NSUInteger length = note.fileData.length;
    DLog(@"Playing note (exists: %@, data length: %lu), duration: %.2f",
         exists, (unsigned long)length, note.durationSeconds);

    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
    self.moviePlayer.controlStyle = MPMovieControlStyleDefault;
    [self.view addSubview:self.moviePlayer.view];
    [self.moviePlayer prepareToPlay];
    [self.moviePlayer play];
}
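One thing worth ruling out (an assumption on my part, not something confirmed in this thread): because the app also records, the shared audio session may come back in a record-oriented state after a cold launch, which can leave playback silent. A minimal sketch of forcing a playback category before starting the player:
// Speculative check: recording code may have left the session in
// AVAudioSessionCategoryRecord, which produces silent playback.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryPlayback error:&sessionError] ||
    ![session setActive:YES error:&sessionError])
{
    NSLog(@"Audio session setup failed: %@", sessionError);
}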

iOS: Deprecation of AudioSessionInitialize and AudioSessionSetProperty

I'm very new to Objective-C, and am trying to update some code that's about 3 years old to work with iOS 7. There are a couple of instances of AudioSessionSetProperty and AudioSessionInitialize appearing in the code:
1:
- (void)applicationDidFinishLaunching:(UIApplication *)application {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    [[SCListener sharedListener] listen];
    timer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                             target:self
                                           selector:@selector(tick:)
                                           userInfo:nil
                                            repeats:YES];
    // Override point for customization after app launch
    [window addSubview:viewController.view];
    [window makeKeyAndVisible];
}
And 2:
- (id)init {
    if ((self = [super init]) == nil) {
        return nil;
    }
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    Float64 rate = kSAMPLERATE;
    UInt32 size = sizeof(rate);
    AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareSampleRate, size, &rate);
    return self;
}
For some reason this code works on iOS 7 in the simulator but not on a device running iOS 7, and I suspect that these deprecations are the cause. I've been reading through the docs and related questions on this website, and it appears that I need to use AVAudioSession instead. I've been trying to update the code for a long time now, and I'm unsure how to properly switch over to AVAudioSession. Does anyone know how the two methods above need to look?
Side note: I've managed to hunt down an article that outlines the transition:
https://github.com/software-mariodiana/AudioBufferPlayer/wiki/Replacing-C-functions-deprecated-in-iOS-7
But I can't seem to apply this to the code above.
The code I'm trying to update is a small frequency detection app from git:
https://github.com/jkells/sc_listener
Alternatively, if someone could point me to a sample demo app that can detect frequencies on iOS devices, that would be awesome.
As you have observed, pretty much all of the old Core Audio AudioSession functions have been deprecated in favour of AVAudioSession.
AVAudioSession is a singleton object which gets initialised the first time you call it:
[AVAudioSession sharedInstance]
There is no separate initialize method. But you will want to activate the audio session:
BOOL activated = [[AVAudioSession sharedInstance] setActive:YES error:&error];
As regards setting the hardware sample rate using AVAudioSession, please refer to my answer here:
How can I obtain the native (hardware-supported) audio sampling rates in order to avoid internal sample rate conversion?
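Putting those pieces together, here is a hedged sketch of what the second snippet above might become; the PlayAndRecord category is an assumption based on the app listening to the microphone via SCListener, and kSAMPLERATE is the constant from the original code:
- (id)init {
    if ((self = [super init]) == nil) {
        return nil;
    }
    // AVAudioSession replaces AudioSessionInitialize/AudioSessionSetProperty.
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    if (![session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error] ||
        ![session setPreferredSampleRate:kSAMPLERATE error:&error] || // preferred, not guaranteed
        ![session setActive:YES error:&error]) {
        NSLog(@"Audio session setup failed: %@", error);
    }
    return self;
}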
For other comparisons between the Core Audio audio session functions and AVFoundation's AVAudioSession, here are some of my other answers on the same topic:
How Do I Route Audio to Speaker without using AudioSessionSetProperty?
use rear microphone of iphone 5
Play audio through upper (phone call) speaker
How to control hardware mic input gain/level on iPhone?
I wrote a short tutorial that discusses how to update to the new AVAudioSession objects. I posted it on GitHub: "Replacing C functions deprecated in iOS 7."

Sound only plays in iPhone Simulator, Not on Device [duplicate]

I created a simple audio app which plays an mp3 file. It works with no problem on the Simulator (iOS 5 and 6) and on an iPod 3GS (iOS 5.1). But when I try it on my iPhone 4S (iOS 6), it seems to play yet no sound is heard.
"[audioPlayer duration]" gives me the right value, and even the "audioPlayerDidFinishPlaying" delegate message fires correctly after the duration has elapsed.
My iPhone 4S plays sound with no problem in other apps like Music, Video, or Podcasts.
Only my own custom app is silent.
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"banana" ofType:@"mp3"]];
NSError *error = nil;
// The player presumably gets created here; the original snippet omitted this line.
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
if (error) {
    NSLog(@"error in audioPlayer: %@", [error localizedDescription]);
}
else {
    self.audioPlayer.delegate = self;
    [self.audioPlayer prepareToPlay];
}
Is it simply a device problem, or is there any clue about what is going on?
Has anyone had a similar experience?
I reset the settings on my phone and rebooted it, but the problem remains.
I tested with mp3, wav, and aiff files.
Turn the silencer (the ring/silent switch) off.
And double-check that you have both volumes turned up.
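If the app should stay audible even with the ring/silent switch on, a common fix (my addition, not part of the original answer) is to opt into the Playback category, which the switch does not mute, unlike the default SoloAmbient category:
// Hedged sketch: AVAudioSessionCategoryPlayback keeps audio audible even
// when the ring/silent switch is set to silent.
NSError *error = nil;
if (![[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                            error:&error]) {
    NSLog(@"Could not set playback category: %@", error);
}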

UIImageWriteToSavedPhotosAlbum and ALAssetsLibrary not saving an image, no error either

I am trying to save an image to the camera roll. This actually used to work wonderfully, but I had to work on other stuff, and now that I'm returning to the project to update it for iOS 6, poof: this feature no longer works at all on iOS 6.
I have tried two approaches, both are failing silently without NSError objects. First, UIImageWriteToSavedPhotosAlbum:
UIImageWriteToSavedPhotosAlbum(img, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);

// Callback
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    // error == nil
}
... and the ALAssetsLibrary approach:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[img CGImage]
                          orientation:(ALAssetOrientation)[img imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error)
{
    // assetURL == nil
    // error == nil
}];
Also, [ALAssetsLibrary authorizationStatus] == ALAuthorizationStatusAuthorized evaluates to true
On the Simulator, the app never shows up in the Settings > Privacy > Photos section; however, on an actual iPad it does show that the app has permission to access photos. (Also, just to add: the first approach above was what I previously used; it worked on real devices and simulators alike, no problem.)
I have also tried running this on the main thread to see if that changed anything; it made no difference. I was previously running it in the background, and it used to work fine (on both simulator and device).
Can anyone shed some light?
Figured it out... I was doing something stupid. UIImage cannot take raw pixel data; you first have to massage it into a form it can accept, with the proper metadata.
Part of the problem was that I was using Cocos2D to get a UIImage from a CCRenderTexture (getUIImageFromBuffer()), and when I switched to Cocos2D-x that function was no longer available. I was simply ignorant of the fact that UIImage objects cannot be constructed from raw pixel data; I figured UIImage handled header information and formatting automatically.
This answer helped: iPhone - UIImage imageWithData returning nil
And this example was also helpful:
http://www.wmdeveloper.com/2010/09/create-bitmap-graphics-context-on.html?m=1
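For illustration, a minimal sketch of that massaging, assuming tightly packed 32-bit RGBA pixels; the function name and parameters are hypothetical:
// Wrap raw RGBA8888 pixels in a CGBitmapContext so UIImage receives a
// properly described image instead of a bare byte buffer.
UIImage *ImageFromRGBAPixels(void *pixels, size_t width, size_t height)
{
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height,
                                             8,         // bits per component
                                             width * 4, // bytes per row
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    if (ctx == NULL) {
        return nil;
    }
    CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}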