I created a simple audio app that plays an mp3 file. It works with no problem on the Simulator (iOS 5 and 6) and on my iPod 3GS (iOS 5.1). But on my iPhone 4S (iOS 6) it seems to work, yet produces no sound.
"[audioPlayer duration]" gives me the right value, and the delegate's "audioPlayerDidFinishPlaying" is even called at the right time, after the full duration.
My iPhone 4S plays sound with no problem in other apps such as Music, Videos, and Podcasts.
It is only my own app that produces no sound.
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"banana" ofType:@"mp3"]];
NSError *error = nil;
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error]; // player creation implied by the question but missing from the original snippet
if (error) {
    NSLog(@"error in audioPlayer: %@", [error localizedDescription]);
}
else {
    self.audioPlayer.delegate = self;
    [self.audioPlayer prepareToPlay];
}
Is it simply a device problem, or is there any clue about what is going on?
Has anyone had a similar experience?
I reset the settings on my phone and rebooted it, but the problem remains.
I tested with mp3, wav, and aiff files.
Turn the silencer (the Ring/Silent switch) off.
And double-check that you have both the ringer volume and the playback volume turned up.
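If the app needs to keep playing even when the Ring/Silent switch is set to silent, one option (a minimal sketch, assuming AVFoundation is linked and imported) is to move the audio session out of the default category before calling play:
NSError *sessionError = nil;
// The Playback category is not silenced by the Ring/Silent switch,
// unlike the default SoloAmbient category.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                       error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];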
I'm using SpriteKit for my Mac OS project with Objective-C, and I'm trying to play a certain sound over and over again when contact between two nodes occurs. I don't want the player to wait for the sound to complete before playing it again. The sound is only about 1 second long, but it can repeat as often as every 0.5 seconds. I've tried two different methods and they both have issues; I'm probably not setting something up correctly.
Method #1 - SKAction
I tried getting one of the sprites to play the sound using the following code:
[playBarNode runAction:[SKAction playSoundFileNamed:@"metronome" waitForCompletion:NO]];
The sound plays perfectly on time, but it comes out modified: it sounds as if reverb (echo) has been applied, and it has lost a lot of volume as well.
Method #2 - AVAudioPlayer
Here's the code I used to set this up:
-(void)initAudio {
    NSString *path = [NSString stringWithFormat:@"%@/metronome.mp3", [[NSBundle mainBundle] resourcePath]];
    NSURL *metronomeSound = [NSURL fileURLWithPath:path];
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:metronomeSound error:nil];
    [_audioPlayer prepareToPlay];
}
Later on in the code it is called like this:
[_audioPlayer play];
The issue with this one is that it seems to wait until the first playback has finished before it will play the sound again; basically, it fails to play many of the times it is triggered.
Am I setting this up incorrectly? How can I fix this? Thanks in advance.
Retry method #1, but instead of having the sprite play the sound, have the scene play it via
[self runAction:[SKAction playSoundFileNamed:@"metronome" waitForCompletion:NO]];
inside your SKScene subclass.
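A minimal sketch of that suggestion (the GameScene name and the contact handler are my assumptions, not from the original question):
#import <SpriteKit/SpriteKit.h>

@interface GameScene : SKScene <SKPhysicsContactDelegate>
@end

@implementation GameScene

- (void)didMoveToView:(SKView *)view {
    self.physicsWorld.contactDelegate = self;
}

- (void)didBeginContact:(SKPhysicsContact *)contact {
    // The scene itself, not one of the sprites, runs the sound action.
    // "metronome.mp3" is the file the question says is in the bundle.
    [self runAction:[SKAction playSoundFileNamed:@"metronome.mp3" waitForCompletion:NO]];
}

@end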
I'm trying to use MPMoviePlayerController to create an audio player without having to implement my own scrubbing and play/pause button.
I have code that records audio into NSData and saves it to disk. The method below tests audio by playing it with the MPMoviePlayerController.
The code below works (plays audio that can be heard) if I execute the method immediately after recording is done. It also works if I press the home button and then return to the app.
However, when I kill the app and restart it, or hit Run from Xcode, I do not hear any audio. Here are the symptoms:
The code below shows that the NSData exists on disk and has nonzero length
The path to the NSData is the same both times
The data is in kAudioFormatMPEG4AAC format
The media player displays correct duration
Media player's scrubber moves from start to finish
Speaker volume is set to maximum in both cases.
No audio is heard after the app was killed and restarted.
What could be causing my MPMoviePlayerController to produce no audio when the app is relaunched? I'm writing the audio length into the file's extended attributes; could this be interfering with the "playability" of the file?
-(void)testPlayback:(AudioNote *)note
{
    NSString *path = [note filepath];
    if (note == nil || path.length == 0) {
        return;
    }

    NSURL *url = [NSURL fileURLWithPath:path];

    // file exists, and data exists on disk in both cases
    NSString *exists = ([note fileExists] ? @"YES" : @"NO");
    NSUInteger length = note.fileData.length;
    DLog(@"Playing note (exists: %@, data length: %lu), duration: %.2f", exists, (unsigned long)length, note.durationSeconds);

    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
    self.moviePlayer.controlStyle = MPMovieControlStyleDefault;
    [self.view addSubview:self.moviePlayer.view];
    [self.moviePlayer prepareToPlay];
    [self.moviePlayer play];
}
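One thing worth ruling out (my own guess, not something from the original post): since the app also records, the audio session may still be configured for recording after a cold launch. A minimal sketch that forces a playback category before creating the player:
// Hypothetical diagnostic: make sure the session is set up for playback,
// not still in a record-oriented category from the recording code path.
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];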
I'm working on a little game and I've got a problem with the background music. I use AVAudioPlayer to play a looping track. It looks like this:
NSString *path = [[NSBundle mainBundle] pathForResource:@"background" ofType:@"ogg"];
NSError *error;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL URLWithString:path] error:&error];
NSLog(@"%@", [error description]);
[player setNumberOfLoops:-1];
player.delegate = self;
[player play];
In Supporting Files I have background.ogg, background.mp3, and background.wav, and none of them plays. What is wrong?
When I use NSLog to print the error description I get:
Error Domain=NSOSStatusErrorDomain Code=1954115647 "The operation couldn’t be completed. (OSStatus error 1954115647.)"
Please help.
Sometimes it's simply that the audio file's characteristics don't match what the player needs, such as the sampling rate. Check whether your audio's sampling rate is lower than 24 kHz.
I have tested some audio files and found that when the sampling rate is higher than 44 kHz everything works great; when it is lower than 44 kHz but higher than 22 kHz, AudioToolbox can't play the sounds while AVAudioPlayer works only some of the time; and below 22 kHz neither of them can play the files. (The file formats I tested were m4a, mp3, and wav.)
By the way, if you try to use iTunes to convert a low-sampling-rate file to a 44 kHz m4a, that doesn't work either, so don't waste your time on it.
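If you want to check a file's sampling rate in code rather than in iTunes, here is a small sketch using AudioToolbox (the function name is mine; pass it the same file URL you give AVAudioPlayer):
#import <AudioToolbox/AudioToolbox.h>

// Sketch: open an audio file and log the sample rate of its data format.
static void LogSampleRate(NSURL *url) {
    AudioFileID fileID = NULL;
    if (AudioFileOpenURL((__bridge CFURLRef)url, kAudioFileReadPermission, 0, &fileID) == noErr) {
        AudioStreamBasicDescription format = {0};
        UInt32 size = sizeof(format);
        if (AudioFileGetProperty(fileID, kAudioFilePropertyDataFormat, &size, &format) == noErr) {
            NSLog(@"Sample rate: %.0f Hz", format.mSampleRate);
        }
        AudioFileClose(fileID);
    }
}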
That was so easy. There is one more thing: just as you must invalidate an NSTimer somewhere when you use one, when you use AVAudioPlayer you have to stop it somewhere.
[player stop];
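For example, a hypothetical placement (assuming the player is held in a property on your view controller):
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self.player stop]; // stop the looping background music when leaving this screen
}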
I have a short method that plays an audio notification using the AudioToolbox/AudioServices framework (I've removed most of the functional code for brevity):
- (IBAction)thatWasEasy:(id)sender {
    NSBundle *mainBundle = [NSBundle mainBundle];
    NSString *filePath = [mainBundle pathForResource:@"easy" ofType:@"mp3"];
    NSURL *aFileURL = [NSURL fileURLWithPath:filePath isDirectory:NO];

    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID((__bridge_retained CFURLRef)aFileURL, &soundID);
    AudioServicesPlaySystemSound(soundID);
}
The audio plays correctly in the simulator as well as on iPad 1 and iPad 2 devices; however, on the iPad 3 (Verizon LTE version) the sound is not heard. I've checked the device's sound settings thoroughly, and also converted the audio file to wav and caf with no difference in behavior.
Any suggestions on troubleshooting this a bit deeper? Unfortunately I don't have another iPad 3 lying around for comparison. Appreciate the help!
Figured it out. When you go to [Settings] and select [Sounds] under the [General] tab, you can verify that the ringer and alerts volume is above zero, and when you move the slider the device chimes at the newly set level. You can also see that all of the specific tones in the list below are enabled. That is what threw me off when checking the device settings: I launched YouTube and other applications and could play sound just fine.
What I didn't realize until I spoke to another guy (who knows the product a little better) is that there is a second way to mute system sounds. Pressing the Home button twice reveals an icon bar at the bottom of the screen; if you drag the icons to the right with your finger, a speaker icon appears. That happened to be muted for some reason, presumably after a system restore I performed recently. Tap it to un-mute and you're back in business. I'm not sure why Apple designed it this way, but it would have been helpful (and would have saved me a few hours of tinkering) if it had been accessible under the device settings.
I don't think you can play an mp3 file on the device using AudioServicesPlaySystemSound; the documentation restricts system sounds to short files in linear PCM or IMA4 format packaged in .caf, .aif, or .wav files.
See the AudioServicesPlaySystemSound Apple documentation.
-(void)playAif:(NSString *)filename {
    // NSLog(@"play: %@", filename);
    SystemSoundID soundID;
    NSString *path = [[NSBundle mainBundle] pathForResource:filename ofType:@"aif"];
    if (path) { // test for path, to guard against crashes
        UInt32 sessionCategory = kAudioSessionCategory_AmbientSound;  // 1
        AudioSessionSetProperty (
            kAudioSessionProperty_AudioCategory,                      // 2
            sizeof (sessionCategory),                                 // 3
            &sessionCategory                                          // 4
        );
        AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:path], &soundID);
        AudioServicesPlaySystemSound (soundID);
    }
}
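Usage would look like this, assuming the mp3 has been converted to an easy.aif file in the bundle:
[self playAif:@"easy"];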
Long-time reader, first-time asker...
I am making a music app that uses AVAssetReader to read mp3 data from the iTunes library. I need precise timing, so when I create an AVURLAsset I pass the "AVURLAssetPreferPreciseDurationAndTimingKey" option to get accurate timing data. This has some overhead (and I have no problems when I don't use it, but I need it!).
Everything works fine on an iPhone 4 and an iPad 1. I would like it to work on my iPod touch (2nd gen), but it doesn't: if the sound file is too long (more than about 7 minutes), the AVAssetReader cannot start reading and throws an error (AVFoundationErrorDomain error -11800).
It appears that I am hitting a wall in terms of the more limited resources of the iPod touch. Any ideas what is happening, or how to manage the overhead of creating the AVURLAsset so that it can handle long files?
(I tried running this with the performance tools, and I don't see a major spike in memory.)
Thanks, Dan
Maybe you're starting to read too soon? As far as I understand, for mp3 the asset needs to go through the entire file in order to enable precise timing, so try delaying the reading.
You can also try registering as an observer for some of the AVAsset properties. iOS 4.3 added a 'readable' property. I've never tried it, but my guess is that it's initially set to NO and gets set to YES as soon as the AVAsset has finished loading.
EDIT:
Actually, I just looked into the docs. You're supposed to use the AVAsynchronousKeyValueLoading protocol for that, and Apple provides an example:
NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = [NSArray arrayWithObject:@"duration"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
    NSError *error = nil;
    AVKeyValueStatus durationStatus = [asset statusOfValueForKey:@"duration" error:&error];
    switch (durationStatus) {
        case AVKeyValueStatusLoaded:
            [self updateUserInterfaceForDuration];
            break;
        case AVKeyValueStatusFailed:
            [self reportError:error forAsset:asset];
            break;
        case AVKeyValueStatusCancelled:
            // Do whatever is appropriate for cancellation.
            break;
    }
}];
If 'duration' doesn't help, try 'readable' (but, as I mentioned before, 'readable' requires iOS 4.3). Maybe this will solve your issue.
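A sketch of how this could feed into the reader setup; the 'tracks' key and the reader code below are my assumption, not part of the original answer:
AVURLAsset *preciseAsset = [[AVURLAsset alloc] initWithURL:url
    options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                        forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];
NSArray *keysToLoad = [NSArray arrayWithObjects:@"duration", @"tracks", nil];
[preciseAsset loadValuesAsynchronouslyForKeys:keysToLoad completionHandler:^() {
    NSError *error = nil;
    if ([preciseAsset statusOfValueForKey:@"tracks" error:&error] != AVKeyValueStatusLoaded) {
        NSLog(@"Asset not ready for reading: %@", error);
        return;
    }
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:preciseAsset error:&error];
    if (reader == nil) {
        NSLog(@"Could not create AVAssetReader: %@", error);
        return;
    }
    // ...add an AVAssetReaderTrackOutput for the audio track and call startReading here...
}];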