New to iOS dev, I'm testing AVAudioPlayer to play a sound on an iPad 2 (Xcode 4.2 project, ARC/storyboards enabled). The sound plays fine in the simulator with no errors. There are no errors on the device either, but also no sound.
I've been browsing this fine resource, but nothing I've tried based on feedback here has produced anything but deafening iPad silence. Could someone help? My .h:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface ViewController : UIViewController
<AVAudioPlayerDelegate>
{
AVAudioPlayer *audioPlayer;
UISlider *volumeControl;
UILabel *timerLabel;
NSTimer *playbackTimer;
}
@property (nonatomic, retain) IBOutlet UISlider *volumeControl;
@property (nonatomic, retain) IBOutlet UILabel *timerLabel;
@property (nonatomic, retain) NSTimer *playbackTimer;
@property (nonatomic, strong) AVAudioPlayer *audioPlayer;
-(IBAction) playAudio;
-(IBAction) stopAudio;
-(IBAction) adjustVolume;
@end
My .m:
#import "ViewController.h"
@implementation ViewController
@synthesize volumeControl, timerLabel, playbackTimer, audioPlayer;
-(void)playAudio
{
playbackTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
target:self
selector:@selector(updateTime)
userInfo:nil
repeats:YES];
[audioPlayer play];
}
-(void)stopAudio
{
[playbackTimer invalidate];
[audioPlayer stop];
}
-(void)adjustVolume
{
if (audioPlayer != nil)
{
audioPlayer.volume = volumeControl.value;
}
}
-(void)updateTime
{
float minutes = floor(audioPlayer.currentTime/60);
float seconds = audioPlayer.currentTime - (minutes * 60);
float duration_minutes = floor(audioPlayer.duration/60);
float duration_seconds =
audioPlayer.duration - (duration_minutes * 60);
NSString *timeInfoString = [[NSString alloc]
initWithFormat:@"%0.0f.%0.0f / %0.0f.%0.0f",
minutes, seconds,
duration_minutes, duration_seconds];
timerLabel.text = timeInfoString;
}
-(void)audioPlayerDidFinishPlaying:
(AVAudioPlayer *)player successfully:(BOOL)flag
{
}
-(void)audioPlayerDecodeErrorDidOccur:
(AVAudioPlayer *)player error:(NSError *)error
{
}
-(void)audioPlayerBeginInterruption:(AVAudioPlayer *)player
{
}
-(void)audioPlayerEndInterruption:(AVAudioPlayer *)player
{
}
My viewDidLoad:
- (void)viewDidLoad {
[super viewDidLoad];
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:@"song"
ofType:@"mp3"]];
NSError *error;
audioPlayer = [[AVAudioPlayer alloc]
initWithContentsOfURL:url
error:&error];
if (error)
{
NSLog(@"Error in audioPlayer: %@",
[error localizedDescription]);
} else {
audioPlayer.delegate = self;
[audioPlayer prepareToPlay];
}
}
Make sure that the file is indeed in MP3 format. Make sure that you are copying the file into the bundle, and not playing it off a local path on your desktop. Check the device volume. Check the BOOL returned by the play call. All of these are possible explanations.
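For that last check, a minimal sketch (reusing the audioPlayer variable from the question):
BOOL started = [audioPlayer play];
if (!started) {
NSLog(@"play returned NO; the player could not start");
}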
Is it not playing sounds at all? No sound even with headphones plugged in? If it's just no sound through the built-in speaker, but sound through headphones, make sure that your device's ring/sounds volume isn't muted. Check the toggle switch on the side (if you have it set to mute rather than orientation lock); the bell shouldn't be crossed out. Just because you press the volume-up button doesn't mean the speaker is unmuted. Have you tested YouTube videos or music files to ensure your iPad isn't having hardware issues?
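If the ring/silent switch turns out to be the culprit and you want the sound to play regardless, one option is to opt out of it in code: the default session category respects the switch, while AVAudioSessionCategoryPlayback does not. A minimal sketch, placed before creating the player:
NSError *sessionError = nil;
// The Playback category keeps audio audible even when the ring/silent switch is on
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
if (sessionError != nil) {
NSLog(@"Audio session error: %@", sessionError);
}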
I have been experimenting with AudioKit and have made a sample app to try to plot the audio waveform while recording and during playback. I am seeing an issue, though: when I record or play back audio, the rolling waveform doesn't show up in the view on a device. It shows up perfectly fine in the simulator (11.4), however. I've provided the recording view controller code below for context on how I'm trying to implement this while recording audio.
Any help or being pointed in the general direction would be greatly appreciated.
RecordingVC.m code:
#import "FirstViewController.h"
@interface FirstViewController ()
@end
@implementation FirstViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
[self setupConfig];
[self setupUI];
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (void) setupUI
{
//Configure waveform view
self.recordingPlotView.gain = 2;
self.recordingPlotView.backgroundColor = [UIColor colorWithRed: .10 green: .10 blue: .10 alpha: 1];
self.recordingPlotView.color = [UIColor colorWithRed: .44 green: .44 blue: .44 alpha: 1];
self.recordingPlotView.plotType = EZPlotTypeRolling;
self.recordingPlotView.shouldFill = YES;
self.recordingPlotView.shouldMirror = YES;
[self.view addSubview: self.recordingPlotView];
}
- (void) setupConfig
{
self.isRecording = NO;
[AKSettings setAudioInputEnabled: true];
[AKSettings setPlaybackWhileMuted: true];
[AVAudioSession.sharedInstance setCategory: AVAudioSessionCategoryAmbient withOptions: kAudioSessionProperty_OverrideCategoryDefaultToSpeaker error: nil];
self.mic = [[EZMicrophone alloc] initWithMicrophoneDelegate: self];
}
#pragma mark - EZMicrophone Delegate methods
- (void) microphone:(EZMicrophone *)microphone
hasAudioReceived:(float **)buffer
withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels
{
__weak typeof (self) weakSelf = self;
dispatch_async(dispatch_get_main_queue(), ^{
[weakSelf.recordingPlotView updateBuffer:buffer[0]
withBufferSize:bufferSize];
});
}
- (void) microphone:(EZMicrophone *)microphone
hasBufferList:(AudioBufferList *)bufferList
withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels
{
if (self.isRecording)
{
[self.recorder appendDataFromBufferList:bufferList
withBufferSize:bufferSize];
}
}
#pragma mark - EZRecorder Delegate methods
- (void)recorderDidClose:(EZRecorder *)recorder
{
self.recorder.delegate = nil;
}
#pragma mark - Utils
- (NSArray *)applicationDocuments
{
return NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
}
- (NSString *)applicationDocumentsDirectory
{
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
return basePath;
}
- (NSURL *)testFilePathURL
{
return [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%@",
[self applicationDocumentsDirectory],
@"test2.m4a"]];
}
#pragma mark - user Interaction
- (IBAction)playButtonTapped:(id)sender {
if (self.isRecording)
{
self.isRecording = NO;
self.playButton.titleLabel.text = @"Record";
[self.mic stopFetchingAudio];
}
else
{
self.isRecording = YES;
self.playButton.titleLabel.text = @"Pause";
[self.mic startFetchingAudio];
self.recorder = [EZRecorder recorderWithURL: [self testFilePathURL] clientFormat: [self.mic audioStreamBasicDescription] fileType: EZRecorderFileTypeM4A delegate: self];
}
}
- (IBAction)stopButtonTapped:(id)sender {
if (self.isRecording)
{
self.isRecording = NO;
self.playButton.titleLabel.text = @"Record";
[self.mic stopFetchingAudio];
[self.recorder closeAudioFile];
}
[self.recordingPlotView clear];
self.recorder = nil;
}
@end
RecordingVC.h code:
#import <UIKit/UIKit.h>
@import AudioKit;
@import AudioKitUI;
@interface FirstViewController : UIViewController <EZMicrophoneDelegate, EZRecorderDelegate>
@property (strong, nonatomic) IBOutlet EZAudioPlot *recordingPlotView;
@property (nonatomic, strong) EZMicrophone* mic;
@property (nonatomic, strong) EZRecorder* recorder;
@property (nonatomic, assign) BOOL isRecording;
@property (strong, nonatomic) IBOutlet UIButton *playButton;
@end
Small Update:
I've managed to get the playback waveform displaying on device by setting the gain in Interface Builder, even though I was already setting it in code during viewDidLoad().
I've tried doing the same (setting the gain for the plot in Interface Builder) for the recording VC (the code above), but that did not solve it the way it did for the playback VC.
I ran your project and it works on the device the same as in the simulator, except that the simulator's microphone is the computer's and seems much more sensitive than the device's, so I had to set the gain higher:
self.recordingPlotView.gain = 20;
before I noticed the waveform.
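If you want to keep a lower gain in the simulator, one option (just a sketch; the specific values are assumptions) is to branch on the simulator target at compile time:
#if TARGET_IPHONE_SIMULATOR
self.recordingPlotView.gain = 2; // the computer's microphone is more sensitive
#else
self.recordingPlotView.gain = 20; // the device microphone needs more amplification
#endif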
I am trying to create an app where I can send information from an Apple Watch to my iOS parent app. I have written the code for it, but when I run the WatchConnectivity app, the information does not transfer between the Apple Watch and the parent iOS app. This may be a problem with my code, or it may be because, for some reason, the watch app does not start together with the iOS app; I have to go to the simulator and tap the app to get it started. Is this why my code is not working?
InterfaceController.m
#import "InterfaceController.h"
#import <WatchConnectivity/WatchConnectivity.h>
@interface InterfaceController() <WCSessionDelegate>
@property (strong, nonatomic) WCSession *session;
@end
@implementation InterfaceController
-(instancetype)init {
self = [super init];
if (self) {
if ([WCSession isSupported]) {
self.session = [WCSession defaultSession];
self.session.delegate = self;
[self.session activateSession];
}
}
return self;
}
- (IBAction)catPressed {
[self sendText:@"cat"];
}
- (IBAction)dogPressed {
[self sendText:@"dog"];
}
- (IBAction)pandaPressed {
[self sendText:@"panda"];
}
- (IBAction)bunnyPressed {
[self sendText:@"bunny"];
}
-(void)sendText:(NSString *)text {
NSDictionary *applicationDict = @{@"emoji": text};
[self.session updateApplicationContext:applicationDict error:nil];
}
@end
ViewController.m
#import "ViewController.h"
#import <WatchConnectivity/WatchConnectivity.h>
@interface ViewController () <WCSessionDelegate>
@property (weak, nonatomic) IBOutlet UILabel *textLabel;
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
if ([WCSession isSupported]) {
WCSession *session = [WCSession defaultSession];
session.delegate = self;
[session activateSession];
NSLog(@"HIIII");
}
}
- (void)session:(nonnull WCSession *)session didReceiveApplicationContext:(nonnull NSDictionary<NSString *,id> *)applicationContext {
NSString *text = [applicationContext objectForKey:@"text"];
dispatch_async(dispatch_get_main_queue(), ^{
[self.textLabel setText:[NSString stringWithFormat:@"Text: %@", text]];
});
}
@end
It turns out that I needed to open the parent app on the iPhone first to start sharing information between the iPhone and the Watch. Thanks to MSU_Bulldog for suggesting this idea.
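One more thing worth flagging in the code above: the watch side sends the dictionary under the key @"emoji", while the phone side reads the key @"text", so the label would stay empty even once the session is active. A minimal sketch of the receiving method with the keys aligned:
- (void)session:(nonnull WCSession *)session didReceiveApplicationContext:(nonnull NSDictionary<NSString *,id> *)applicationContext {
// Use the same key the watch app sent ("emoji"), not "text"
NSString *text = applicationContext[@"emoji"];
dispatch_async(dispatch_get_main_queue(), ^{
self.textLabel.text = [NSString stringWithFormat:@"Text: %@", text];
});
}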
So I've done the tutorial where you code a button so that when you press it, a sound plays. I'm trying to modify it so that when the button is pressed, a random sound plays.
Here is the code:
viewcontroller.h
#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>
@interface STViewController : UIViewController
- (IBAction)playAudio:(id)sender;
@property (nonatomic, strong) NSArray *sounds;
@end
viewcontroller.m
#import <AVFoundation/AVFoundation.h>
#import "STViewController.h"
@interface STViewController ()
@property (weak, nonatomic) IBOutlet UIButton *playAudio;
@end
@implementation STViewController
- (IBAction)playAudio:(id)sender {
AVAudioPlayer *audioPlayer;
NSString *audioPath = [[NSBundle mainBundle] pathForResource:@"Woof" ofType:@"mp3"];
NSURL *audioURL = [NSURL fileURLWithPath:audioPath];
NSError *audioError = nil;
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioURL error:&audioError];
if (!audioError) {
[audioPlayer play];
NSLog(@"Woof!");
}
else {
NSLog(@"Error!");
}
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end
I've been dabbling with something along these lines:
- (NSArray *)sounds
{
NSArray *sounds = [NSArray arrayWithObjects:
#"Woof.mp3",
#"Meow.mp3",
#"tweet.mp3",
#"Squeak.mp3",
#"Moo.mp3",
#"Croak.mp3",
#"Toot.mp3",
#"Quack.mp3",
#"Blub.mp3",
#"OWOwOw.mp3",
#"Fox.mp3",
nil];
return sounds;
}
but I'm not really sure how to make it random or even implement it in the code that I have going right now. Anyone have any ideas?
To make it random, try the below:
NSMutableArray *array=[NSMutableArray
arrayWithObjects:
#"Woof.mp3",
#"Meow.mp3",
#"tweet.mp3",
#"Squeak.mp3",
#"Moo.mp3",
#"Croak.mp3",
#"Toot.mp3",
#"Quack.mp3",
#"Blub.mp3",
#"OWOwOw.mp3",
#"Fox.mp3",
nil];
// now shuffle in place with exchangeObjectAtIndex:withObjectAtIndex:
int i = 0;
for (i = 0; i < [array count]; i++)
{
NSInteger rand = (arc4random() % [array count]);
[array exchangeObjectAtIndex:i
withObjectAtIndex:rand];
}
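If you don't need to shuffle the whole list, a simpler approach is to pick one random index and play that file directly. A minimal sketch, assuming the sounds array from the question and a strong property (@property (nonatomic, strong) AVAudioPlayer *audioPlayer;) so ARC keeps the player alive while it plays:
NSArray *sounds = self.sounds;
NSUInteger index = arc4random_uniform((uint32_t)[sounds count]); // unbiased index in 0..count-1
NSString *fileName = [sounds objectAtIndex:index];
NSString *audioPath = [[NSBundle mainBundle] pathForResource:[fileName stringByDeletingPathExtension] ofType:[fileName pathExtension]];
NSError *audioError = nil;
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:audioPath] error:&audioError];
[self.audioPlayer play];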
When calling the numberOfLoops setter like so:
[_player setNumberOfLoops:-1];
I get the following error:
-[AVPlayer setNumberOfLoops:]: unrecognized selector sent to instance 0x7d52d30
How can this be fixed?
Code:
Header:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface ViewController : UIViewController {
}
@property (strong, nonatomic) AVAudioPlayer *player;
- (IBAction)playMusic:(id)sender;
@end
Implementation:
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
@interface ViewController ()
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)playMusic:(id)sender {
_player = [AVPlayer playerWithURL:[NSURL URLWithString:@"http://urlpath.wav"]];
[_player setNumberOfLoops:-1];
[_player prepareToPlay];
[_player play];
}
@end
Thank you for your time,
Yoni201.
You've created an instance of AVPlayer, not an instance of AVAudioPlayer. It looks like you want to be creating an AVAudioPlayer instead (as is indicated by your choice of that class for the player property on your class). AVAudioPlayer actually has the numberOfLoops property, while AVPlayer does not. For more information, see the documentation for AVAudioPlayer and AVPlayer.
AVPlayer doesn't have a numberOfLoops property. That is a property of AVAudioPlayer. Don't ignore compiler warnings when you build your app.
Also, you defined _player to be an AVAudioPlayer but you alloc/init an AVPlayer.
Change your code to:
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL URLWithString:@"http://urlpath.wav"] error:&error];
if (player) {
[player setNumberOfLoops:-1];
[player prepareToPlay];
[player play];
self.player = player;
} else {
NSLog(@"Error creating audio player: %@", error);
}
Note that AVAudioPlayer is designed for local files; initializing it with a remote http URL like the one above will generally fail, so for genuine streaming you would stay with AVPlayer.
I think that the method you are calling doesn't exist (according to your error).
Try: _player.numberOfLoops = -1
I've googled the problem but couldn't find a similar case.
I've used this tutorial to play the testSound.mp3 - File when hitting a button on the iPad-Simulator:
http://mobileorchard.com/easy-audio-playback-with-avaudioplayer/
It works that way (it plays the sound) if the playSound method is in my ViewController.m, but not in my Sound.m (which has an identical method).
The code gets executed (NSLog says: "Sound.m playSound executed"), but there is no sound at all.
I'd really appreciate some help here, guess I'm totally stuck... :(
Best regards,
- Teapot
// ViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import "Sound.h"
@interface ViewController : UIViewController {
AVAudioPlayer *audioPlayer;
}
- (IBAction)pressButton:(id)sender;
- (void)playSound: (NSString*) soundFile volume : (NSInteger) volume repeats : (NSInteger) repeats;
@end
// ViewController.m
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)pressButton:(id)sender {
NSLog (#"Method: pressButton");
[self playSound: @"testSound.mp3" volume: 2 repeats: 2 url : url]; //It works!
Sound *tempSound = [[Sound alloc] init];
[tempSound playSound: @"testSound.mp3" volume: 2 repeats: 2]; // Doesn't work. -> Says "Sound.m playSound executed", but there is no sound.
}
- (void)playSound: (NSString*) soundFile volume : (NSInteger) volume repeats : (NSInteger) repeats {
NSLog(#"ViewControler playSound");
NSError *error;
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
audioPlayer.numberOfLoops = -1;
if (audioPlayer == nil){
NSLog([error description]);
NSLog(#"ViewController.m playSound NOT executed");
}
else{
[audioPlayer play];
NSLog(#"ViewController.m playSound executed");
}
}
@end
// Sound.h
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@interface Sound : NSObject {
AVAudioPlayer *audioPlayer;
}
- (void) playSound: (NSString*) soundFile volume : (NSInteger) volume repeats : (NSInteger) repeats;
@end
// Sound.m
#import "Sound.h"
@implementation Sound
- (void)playSound: (NSString*) soundFile volume : (NSInteger) volume repeats : (NSInteger) repeats {
NSLog(#"Sound playSound");
NSError *error;
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
audioPlayer.numberOfLoops = -1;
if (audioPlayer == nil){
NSLog([error description]);
NSLog(#"Sound.m playSound NOT executed");
}
else{
[audioPlayer play];
NSLog(#"Sound.m playSound executed");
}
}
@end
There are some inconsistencies in your code: playSound: has an NSString parameter, but the AVAudioPlayer inside that method uses an NSURL. You also set numberOfLoops = -1 (which means infinite repetition) instead of numberOfLoops = repeats.
But the main problem is that here (assuming that you compile with "Automatic Reference Counting")
Sound *tempSound = [[Sound alloc] init];
[tempSound playSound: #"testSound.mp3" volume: 2 repeats: 2];
the tempSound object is deallocated when pressButton: returns, because no strong references to that object exist anymore.
If you add an instance variable (or property) sound to the view controller class, and assign the object to that:
sound = [[Sound alloc] init];
[sound playSound: #"testSound.mp3" volume: 2 repeats: 2];
then it should work as expected.
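For concreteness, that property could be declared in ViewController.h like this (the name sound is just an assumption):
@property (strong, nonatomic) Sound *sound;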
Alternatively, you could prevent the Sound object from being deallocated too early by maintaining a "self reference" inside the object, which is removed only when the sound has finished playing:
@interface Sound () <AVAudioPlayerDelegate>
@property(strong, nonatomic) AVAudioPlayer *audioPlayer;
@property(strong, nonatomic) Sound *selfRef;
@end
@implementation Sound
- (void)playSound:(NSString *)soundFile volume:(NSInteger)volume repeats:(NSInteger)repeats
{
NSLog(#"Sound playSound");
NSURL *soundURL = [[NSBundle mainBundle] URLForResource:soundFile withExtension:nil];
NSError *error;
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
if (self.audioPlayer == nil) {
NSLog(#"%#", [error description]);
NSLog(#"Sound.m playSound NOT executed");
} else{
self.audioPlayer.numberOfLoops = repeats;
self.audioPlayer.delegate = self;
[self.audioPlayer play];
self.selfRef = self; // self reference to avoid deallocation
NSLog(#"Sound.m playSound executed");
}
}
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
self.selfRef = nil; // remove self reference
}
@end
Of course, you shouldn't do this with "infinite repetition"!