Text to Speech and Speech to Text Recognition - self-recognition is occurring - Objective-C

I want to develop an app that supports both speech to text and text to speech.
i) Speech to Text: I have used the Speech framework. Whenever I open the app and start speaking, it recognizes the voice and converts the speech into text. This is working.
ii) Text to Speech: I have used AVFoundation and the MediaPlayer framework. When the user presses the play button, it converts the text currently shown on the screen into speech. This is also working.
Here is the problem I am facing:
While text to speech is playing, the speech recognizer picks up the synthesized voice and prints the words into the text box again.
Example: if I say "Hello Good Morning", it is printed in the text box. If I then press the play button, the voice "Hello Good Morning" is played, but at that moment the speech-to-text recognizer recognizes this voice (self-recognition) and the text box ends up showing "Hello Good Morning Hello Good Morning".
I want to stop the speech-to-text process while text to speech is running.
To do this, I stop the speech recognition request while the speech is playing.
Here is the code:
@implementation ViewController
{
    SFSpeechAudioBufferRecognitionRequest *recognitionRequest;
    SFSpeechRecognitionTask *recognitionTask;
    AVAudioEngine *audioEngine;
    NSMutableArray *speechStringsArray;
    BOOL SpeechToText;
    NSString *resultString;
    NSString *str;
    NSString *searchString;
    NSString *textToSpeak;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    //Speech To Text ****
    speechStringsArray = [[NSMutableArray alloc] init];
    // Initialize background audio session
    NSError *error = NULL;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error) {
        NSLog(@"Error: %@", error);
    }
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"Error: %@", error);
    }
    // Enable remote controls
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    // Voice setup
    self.voicePicker.delegate = self;
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-us"];
    self.voices = [NSMutableArray arrayWithObjects:
                   @{@"voice" : @"en-us", @"label" : @"American English (Female)"},
                   @{@"voice" : @"en-au", @"label" : @"Australian English (Female)"},
                   @{@"voice" : @"en-gb", @"label" : @"British English (Male)"},
                   @{@"voice" : @"en-ie", @"label" : @"Irish English (Female)"},
                   @{@"voice" : @"en-za", @"label" : @"South African English (Female)"},
                   nil];
    // Synthesizer setup
    self.synthesizer = [[AVSpeechSynthesizer alloc] init];
    self.synthesizer.delegate = self;
    // UITextView delegate
    self.textView.delegate = self;
    // This notification is posted from the AppDelegate applicationDidBecomeActive method so that if the play or pause state changes while in the background, the toolbar button gets updated.
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateToolbar) name:@"updateToolbar" object:nil];
}
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // Use a single, valid locale identifier for the recognizer
    self.speechRecognizer = [[SFSpeechRecognizer alloc] initWithLocale:[NSLocale localeWithLocaleIdentifier:@"en-US"]];
    self.speechRecognizer.delegate = self;
    audioEngine = [[AVAudioEngine alloc] init];
    [SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
        switch (authStatus) {
            case SFSpeechRecognizerAuthorizationStatusAuthorized:
                // User gave access to speech recognition
                NSLog(@"Authorized");
                [self start_record];
                break;
            case SFSpeechRecognizerAuthorizationStatusDenied:
                // User denied access to speech recognition
                NSLog(@"AuthorizationStatusDenied");
                break;
            case SFSpeechRecognizerAuthorizationStatusRestricted:
                // Speech recognition restricted on this device
                NSLog(@"AuthorizationStatusRestricted");
                break;
            case SFSpeechRecognizerAuthorizationStatusNotDetermined:
                // Speech recognition not yet authorized
                break;
            default:
                NSLog(@"Default");
                break;
        }
    }];
    //MARK: Interface Builder Actions
}
// Controls for adjusting the speech rate and pitch
- (IBAction)handleSpeedStepper:(UIStepper *)sender
{
    double speedValue = self.speedStepper.value;
    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speedValue]];
}
- (IBAction)handlePitchStepper:(UIStepper *)sender
{
    double pitchValue = self.pitchStepper.value;
    [self.pitchValueLabel setText:[NSString stringWithFormat:@"%.1f", pitchValue]];
}
// Play button for text to speech
- (IBAction)handlePlayPauseButton:(UIBarButtonItem *)sender
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {
        if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
            // Stop immediately
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        }
        else {
            // Stop at end of current word
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
        }
        [self updateToolbarWithButton:@"play"];
    }
    else if (self.synthesizer.paused) {
        [self.synthesizer continueSpeaking];
        [self updateToolbarWithButton:@"pause"];
    }
    else {
        [self speakUtterance];
        [self updateToolbarWithButton:@"pause"];
    }
}
//method for speech to text
-(void)start_record{
    NSError *outError;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&outError];
    [audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
    [audioSession setActive:YES withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&outError];
    recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
    AVAudioInputNode *inputNode = audioEngine.inputNode;
    if (recognitionRequest == nil) {
        NSLog(@"Unable to create a SFSpeechAudioBufferRecognitionRequest object");
    }
    if (inputNode == nil) {
        NSLog(@"Audio engine has no input node");
    }
    // Configure the request so that results are returned before audio recording is finished
    [recognitionRequest setShouldReportPartialResults:YES];
    // A recognition task represents a speech recognition session.
    // We keep a reference to the task so that it can be cancelled.
    recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:recognitionRequest resultHandler:^(SFSpeechRecognitionResult *result, NSError *error1) {
        BOOL isFinal = false;
        if (result != nil) {
            NSString *speech = result.bestTranscription.formattedString;
            NSLog(@"the speech: %@", speech);
            // Remove any part of the transcription that has already been appended
            for (int i = 0; i < speechStringsArray.count; i++)
            {
                str = [speechStringsArray objectAtIndex:i];
                NSRange range = [speech rangeOfString:str options:NSCaseInsensitiveSearch];
                NSLog(@"found: %@", (range.location != NSNotFound) ? @"Yes" : @"No");
                if (range.location != NSNotFound) {
                    resultString = [speech stringByReplacingCharactersInRange:range withString:@""];
                    speech = resultString;
                    NSLog(@"the result is: %@", resultString);
                }
            }
            // Append the new words after the existing text
            if (resultString.length > 0) {
                self.textView.text = [NSString stringWithFormat:@"%@%@", self.textView.text, resultString];
                [speechStringsArray addObject:resultString];
            }
            // First word - working fine
            else
            {
                [speechStringsArray addObject:speech];
                self.textView.text = speech;
            }
            NSLog(@"array %@", speechStringsArray);
            isFinal = result.isFinal;
        }
        if (error1 != nil || isFinal) {
            [audioEngine stop];
            [inputNode removeTapOnBus:0];
            recognitionRequest = nil;
            recognitionTask = nil;
            [self start_record];
        }
    }];
    AVAudioFormat *recordingFormat = [inputNode outputFormatForBus:0];
    [inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
        [recognitionRequest appendAudioPCMBuffer:buffer];
    }];
    NSError *error1;
    [audioEngine prepare];
    [audioEngine startAndReturnError:&error1];
}
- (void)speakUtterance
{
    NSLog(@"speakUtterance");
    didStartSpeaking = NO;
    textToSpeak = [NSString stringWithFormat:@"%@", self.textView.text];
    AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:textToSpeak];
    utterance.rate = self.speedStepper.value;
    utterance.pitchMultiplier = self.pitchStepper.value;
    utterance.voice = self.voice;
    [self.synthesizer speakUtterance:utterance];
    [self displayBackgroundMediaFields];
}
- (void)displayBackgroundMediaFields
{
    MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"Play"]];
    NSDictionary *info = @{ MPMediaItemPropertyTitle: self.textView.text,
                            MPMediaItemPropertyAlbumTitle: @"TextToSpeech App",
                            MPMediaItemPropertyArtwork: artwork };
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;
}
- (void)updateToolbar
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {
        [self updateToolbarWithButton:@"pause"];
    }
    else {
        [self updateToolbarWithButton:@"play"];
    }
}
- (void)updateToolbarWithButton:(NSString *)buttonType
{
    // Stopping the speech-to-text process
    if (audioEngine.isRunning) {
        [audioEngine stop];
        [recognitionRequest endAudio];
    }
    NSLog(@"updateToolbarWithButton: %@", buttonType);
    UIBarButtonItem *audioControl;
    if ([buttonType isEqualToString:@"play"]) {
        // Play
        audioControl = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemPlay target:self action:@selector(handlePlayPauseButton:)];
    }
    else {
        // Pause
        audioControl = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemPause target:self action:@selector(handlePlayPauseButton:)];
    }
    UIBarButtonItem *flexibleItem = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemFlexibleSpace target:nil action:nil];
    [self.toolbar setItems:@[flexibleItem, audioControl, flexibleItem]];
}
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent
{
    NSLog(@"receivedEvent: %@", receivedEvent);
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlPlay:
                NSLog(@"UIEventSubtypeRemoteControlPlay");
                if (self.synthesizer.speaking) {
                    [self.synthesizer continueSpeaking];
                }
                else {
                    [self speakUtterance];
                }
                break;
            case UIEventSubtypeRemoteControlPause:
                NSLog(@"pause - UIEventSubtypeRemoteControlPause");
                if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                    // Pause immediately
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                }
                else {
                    // Pause at end of current word
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                }
                break;
            case UIEventSubtypeRemoteControlTogglePlayPause:
                NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
                if (self.synthesizer.paused) {
                    [self.synthesizer continueSpeaking];
                }
                else {
                    if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                        // Pause immediately
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                    }
                    else {
                        // Pause at end of current word
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                    }
                }
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                NSLog(@"UIEventSubtypeRemoteControlNextTrack - appropriate for playlists");
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                NSLog(@"UIEventSubtypeRemoteControlPreviousTrack - appropriate for playlists");
                break;
            default:
                break;
        }
    }
}
#pragma mark UIPickerViewDelegate Methods
- (NSInteger)numberOfComponentsInPickerView:(UIPickerView *)pickerView
{
    return 1;
}
- (NSInteger)pickerView:(UIPickerView *)pickerView numberOfRowsInComponent:(NSInteger)component
{
    return self.voices.count;
}
- (UIView *)pickerView:(UIPickerView *)pickerView viewForRow:(NSInteger)row forComponent:(NSInteger)component reusingView:(UIView *)view
{
    UILabel *rowLabel = [[UILabel alloc] init];
    NSDictionary *voice = [self.voices objectAtIndex:row];
    rowLabel.text = [voice objectForKey:@"label"];
    return rowLabel;
}
- (void)pickerView:(UIPickerView *)pickerView didSelectRow:(NSInteger)row inComponent:(NSInteger)component
{
    NSDictionary *voice = [self.voices objectAtIndex:row];
    NSLog(@"new picker voice selected with label: %@", [voice objectForKey:@"label"]);
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:[voice objectForKey:@"voice"]];
}
#pragma mark SpeechSynthesizerDelegate methods
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
    // Workaround for a bug: the first time the voice is changed, speaking the utterance can fail silently. We set didStartSpeaking to YES in willSpeakRangeOfSpeechString; if that method is never called (silent failure), we simply request to speak again.
    if (!didStartSpeaking) {
        [self speakUtterance];
    }
    else {
        [self updateToolbarWithButton:@"play"];
        NSLog(@"the text is: %@", self.textView.text);
    }
}
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer willSpeakRangeOfSpeechString:(NSRange)characterRange utterance:(AVSpeechUtterance *)utterance
{
    didStartSpeaking = YES;
    //[self setTextViewTextWithColoredCharacterRange:characterRange];
}
#pragma mark UITextViewDelegate Methods
- (BOOL)textView:(UITextView *)textView shouldChangeTextInRange:(NSRange)range replacementText:(NSString *)text {
    if ([text isEqualToString:@"\n"]) {
        [textView resignFirstResponder];
        return NO;
    }
    return YES;
}

Don't initialize everything in viewDidLoad. When you tap the button to convert text to speech, set the speech-to-text (recognition) objects to nil and set their delegates to nil at that moment; do the same in the other direction when you switch back to recognition.
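A minimal sketch of that idea, reusing the names from the question's own code (audioEngine, recognitionRequest, recognitionTask, start_record, speakUtterance); stopRecognition and playButtonTapped are placeholder names, not part of the original code. Stop and clear the recognition objects before speaking, then restart recognition from the synthesizer's didFinish delegate callback:
// Sketch only: tear down speech-to-text before speaking, bring it back afterwards.
- (void)stopRecognition
{
    if (audioEngine.isRunning) {
        [audioEngine stop];
        [audioEngine.inputNode removeTapOnBus:0];
    }
    [recognitionRequest endAudio];
    [recognitionTask cancel];
    recognitionRequest = nil;
    recognitionTask = nil;
}
- (void)playButtonTapped
{
    [self stopRecognition];   // nothing left to pick up the synthesized voice
    [self speakUtterance];
}
// AVSpeechSynthesizerDelegate
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
    [self start_record];      // resume speech-to-text once playback has finished
}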

Related

"This item cannot be shared. Please select a different item" when sharing on WhatsApp

The first time after the app is installed, sharing an image is not working; it shows the alert:
This item cannot be shared. Please select a different item.
When I try a second time, the image shares successfully. What is the issue?
-(void)TwitterAndmanyMore
{
    UIImage *image = _imageView.image;
    NSArray *items = @[image];
    UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:items applicationActivities:nil];
    // and present it
    [self presentActivityController:controller];
}
- (void)presentActivityController:(UIActivityViewController *)controller {
    // for iPad: make the presentation a Popover
    controller.modalPresentationStyle = UIModalPresentationPopover;
    [self presentViewController:controller animated:YES completion:nil];
    UIPopoverPresentationController *popController = [controller popoverPresentationController];
    popController.permittedArrowDirections = UIPopoverArrowDirectionAny;
    popController.barButtonItem = self.navigationItem.leftBarButtonItem;
    // access the completion handler
    controller.completionWithItemsHandler = ^(NSString *activityType,
                                              BOOL completed,
                                              NSArray *returnedItems,
                                              NSError *error)
    {
        // react to the completion
        NSLog(@"----retu------%@", returnedItems);
        if (completed)
        {
            [self showContine];
            //[self viewWillAppear:YES];
            // user shared an item
            NSLog(@"We used activity type %@", activityType);
        } else {
            // user cancelled
            NSLog(@"We didn't want to share anything after all.");
        }
        if (error) {
            NSLog(@"An error occurred: %@, %@", error.localizedDescription, error.localizedFailureReason);
        }
    };
}
Use the code below for this issue; it works:
-(void)TwitterAndmanyMore
{
    // UIImage *image = _imageView.image;
    // NSLog(@"Image Data %@", image);
    // NSArray *items = @[@"", image];
    tempImage = [UIImage imageNamed:@"Sample.jpg"];
    tempImage = _imageView.image;
    NSString *theMessage = @"";
    NSArray *items;
    CGImageRef cgref = [tempImage CGImage];
    CIImage *cim = [tempImage CIImage];
    if (cim != nil || cgref != NULL)
    {
        items = @[theMessage, tempImage];
    }
    else
    {
        items = @[theMessage];
    }
    NSLog(@"Image Data %@", items);
    // build an activity view controller
    UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:items applicationActivities:nil];
    // and present it
    [self presentActivityController:controller];
}

Error "Timeout" UserInfoNSLocalizedDescription=Timeout,Error Domain=SiriSpeechErrorDomain Code=100(null)

I am developing an app that converts speech into text using Apple's Speech framework. Here is my code:
- (void)viewDidLoad {
[super viewDidLoad];
[self.start_btn setEnabled:NO];
}
-(void)viewDidAppear:(BOOL)animated
{
self.speechRecognizer = [[SFSpeechRecognizer alloc]initWithLocale:[NSLocale localeWithLocaleIdentifier:#"en-US"]];
self.speechRecognizer.delegate = self;
[SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
switch (authStatus) {
case SFSpeechRecognizerAuthorizationStatusAuthorized:
//User gave access to speech recognition
NSLog(#"Authorized");
[self.start_btn setEnabled:YES];
break;
case SFSpeechRecognizerAuthorizationStatusDenied:
//User denied access to speech recognition
NSLog(#"AuthorizationStatusDenied");
[self.start_btn setEnabled:NO];
break;
case SFSpeechRecognizerAuthorizationStatusRestricted:
//Speech recognition restricted on this device
NSLog(#"AuthorizationStatusRestricted");
[self.start_btn setEnabled:NO];
break;
case SFSpeechRecognizerAuthorizationStatusNotDetermined:
//Speech recognition not yet authorized
[self.start_btn setEnabled:NO];
break;
default:
NSLog(#"Default");
break;
}
}];
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
-(void)start_record{
// Cancel the previous task if it's running
NSError * outError;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:&outError];
[audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
[audioSession setActive:YES withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&outError];
SFSpeechAudioBufferRecognitionRequest *request2 = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
self.audioEngine = [[AVAudioEngine alloc]init];
AVAudioInputNode *inputNode = self.audioEngine.inputNode;
if (request2 == nil) {
NSLog(#"Unable to created a SFSpeechAudioBufferRecognitionRequest object");
}
if (inputNode == nil) {
NSLog(#"Audio engine has no input node ");
}
//configure request so that results are returned before audio recording is finished
request2.shouldReportPartialResults = YES;
self.recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:request2 resultHandler:^(SFSpeechRecognitionResult * result, NSError * error1) {
BOOL isFinal = false;
if(result != nil)
{
self.speech_txt.text = result.bestTranscription.formattedString;
// NSLog(#" the result:%#",result.bestTranscription.formattedString);
NSLog(#"%#",self.speech_txt.text);
isFinal = result.isFinal;
}
if (error1 != nil || isFinal) {
[self.audioEngine stop];
[inputNode removeTapOnBus:0];
// [self.audioEngine stop];
// [self.recognitionRequest endAudio];
self.recognitionRequest = nil;
self.recognitionTask = nil;
[self.start_btn setEnabled:YES];
[self.start_btn setTitle:#"Start Recording" forState:UIControlStateNormal];
}
}];
AVAudioFormat *recordingFormat = [inputNode outputFormatForBus:0];
[inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when){
[self.recognitionRequest appendAudioPCMBuffer:buffer];
}];
NSError *error1;
[self.audioEngine prepare];
[self.audioEngine startAndReturnError:&error1];
self.speech_txt.text = #"(Go ahead , I'm listening)";
}
//MARK: SFSpeechRecognizerDelegate
-(void)speechRecognizer:(SFSpeechRecognizer *)speechRecognizer availabilityDidChange:(BOOL)available
{
if (available) {
[self.start_btn setEnabled:YES];
[self.start_btn setTitle:#"Start Recording" forState:UIControlStateNormal];
}
else{
[self.start_btn setEnabled:NO];
[self.start_btn setTitle:#"Recognition not available" forState:UIControlStateDisabled];
}}
- (IBAction)start_btn_action:(id)sender {
if (self.audioEngine.isRunning) {
[self.audioEngine stop];
[self.recognitionRequest endAudio];
[self.start_btn setEnabled:NO];
[self.start_btn setTitle:#"Stopping" forState:UIControlStateDisabled];
}
else{
[self start_record];
[self.start_btn setTitle:#"Stop Recording" forState:#""];
}}
I have implemented this code, and while running it shows an error like:
[Utility] +[AFAggregator logDictationFailedWithError:] Error Domain=kAFAssistantErrorDomain Code=203 "Timeout" UserInfo={NSLocalizedDescription=Timeout, NSUnderlyingError=0x170250140 {Error Domain=SiriSpeechErrorDomain Code=100 "(null)"}}
How can I resolve this?
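One thing worth checking in the posted code (an observation from the code above, not a confirmed diagnosis): the recognition task is created with the local request2, but the tap block appends buffers to self.recognitionRequest, which is never assigned, so the recognizer may never receive any audio and the service eventually gives up with this timeout. A sketch of the corrected wiring inside start_record:
// Use one request object: assign it to the property, start the task with it,
// and append the tapped buffers to that same object.
self.recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
self.recognitionRequest.shouldReportPartialResults = YES;
self.recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:self.recognitionRequest
                                                           resultHandler:^(SFSpeechRecognitionResult *result, NSError *error) {
    // ... same result handling as above ...
}];
[inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    [self.recognitionRequest appendAudioPCMBuffer:buffer];
}];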

Coding Multiple Leaderboards

I am making a game in which the player can achieve a positive high score or a negative low score depending on the choices they make. The high score has been working fine, but I'm having trouble with the low score.
-(void)authenticateLocalPlayer{
GKLocalPlayer *localPlayer = [GKLocalPlayer localPlayer];
localPlayer.authenticateHandler = ^(UIViewController *viewController, NSError *error){
if (viewController != nil) {
[self presentViewController:viewController animated:YES completion:nil];
}
else{
if ([GKLocalPlayer localPlayer].authenticated) {
_gameCenterEnabled = YES;
// Get the default leaderboard identifier.
[[GKLocalPlayer localPlayer] loadDefaultLeaderboardIdentifierWithCompletionHandler:^(NSString *leaderboardIdentifier, NSError *error) {
if (error != nil) {
NSLog(#"%#", [error localizedDescription]);
}
else{
_leaderboardIdentifier = leaderboardIdentifier;
}
}];
}
else{
_gameCenterEnabled = NO;
}
}
};
}
-(void)reportScore{
GKScore *highscore = [[GKScore alloc] initWithLeaderboardIdentifier:_leaderboardIdentifier];
highscore.value = HighScoreNumber;
[GKScore reportScores:@[highscore] withCompletionHandler:^(NSError *error) {
if (error != nil) {
NSLog(@"%@", [error localizedDescription]);
}
}];
}
-(void)showLeaderboardAndAchievements:(BOOL)shouldShowLeaderboard{
GKGameCenterViewController *gcViewController = [[GKGameCenterViewController alloc] init];
gcViewController.gameCenterDelegate = self;
if (shouldShowLeaderboard) {
gcViewController.viewState = GKGameCenterViewControllerStateLeaderboards;
gcViewController.leaderboardIdentifier = _leaderboardIdentifier;
}
else{
gcViewController.viewState = GKGameCenterViewControllerStateAchievements;
}
[self presentViewController:gcViewController animated:YES completion:nil];
}
You'll notice _leaderboardIdentifier; it is useful for reporting scores to the default leaderboard, but when I try to get it to work for two different leaderboards, the code crashes.
I've tried adding this to "Report Score":
GKScore *lowscore = [[GKScore alloc] initWithLeaderboardIdentifier:_leaderboardIdentifier];
lowscore.value = LowScoreNumber;
[GKScore reportScores:@[lowscore] withCompletionHandler:^(NSError *error) {
if (error != nil) {
NSLog(@"%@", [error localizedDescription]);
}
}];
}
Then I change the leaderboard identifiers to match iTunes Connect, but I'm not sure how I need to change authenticateLocalPlayer and showLeaderboardAndAchievements.
I have some experience with Game Center. I have two games on the App Store that have multiple leaderboards (Squared!! and Primes!).
Instead of having one method for each leaderboard I made one method for submitting scores. Here is that method:
-(void)submitScore:(int64_t)score Leaderboard:(NSString *)leaderboard
{
//1: Check if Game Center features are enabled
if (!_gameCenterFeaturesEnabled) {
return;
}
//2: Create a GKScore object
GKScore *gkScore = [[GKScore alloc] initWithLeaderboardIdentifier:leaderboard];
//3: Set the score value
gkScore.value = score;
//4: Send the score to Game Center
[gkScore reportScoreWithCompletionHandler:^(NSError *error) {
[self setLastError:error];
BOOL success = (error == nil);
if ([_delegate respondsToSelector:@selector(onScoresSubmitted:)]) {
[_delegate onScoresSubmitted:success];
}
}];
}
When you want to submit your high scores all you have to do is add something like:
[[GCHelper sharedGameKitHelper] submitScore:myLowScore Leaderboard:TELS];
[[GCHelper sharedGameKitHelper] submitScore:myHighScore Leaderboard:TEHS];
GCHelper is the class that contains my submitScore:Leaderboard: method.
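Applied to the question, a sketch of a combined reporting method (the identifier strings and the constant names below are placeholders; use the IDs configured in iTunes Connect, and HighScoreNumber/LowScoreNumber come from the question's code):
// Hypothetical identifiers - replace with the leaderboard IDs from iTunes Connect.
static NSString * const kHighScoreLeaderboardID = @"com.example.mygame.highscore";
static NSString * const kLowScoreLeaderboardID  = @"com.example.mygame.lowscore";

-(void)reportScores{
GKScore *highscore = [[GKScore alloc] initWithLeaderboardIdentifier:kHighScoreLeaderboardID];
highscore.value = HighScoreNumber;
GKScore *lowscore = [[GKScore alloc] initWithLeaderboardIdentifier:kLowScoreLeaderboardID];
lowscore.value = LowScoreNumber;
// Both scores can be submitted in a single call.
[GKScore reportScores:@[highscore, lowscore] withCompletionHandler:^(NSError *error) {
if (error != nil) {
NSLog(@"%@", [error localizedDescription]);
}
}];
}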
To view your leaderboards or achievements within your app try this:
- (void) presentLeaderboards {
GKGameCenterViewController* gameCenterController = [[GKGameCenterViewController alloc] init];
gameCenterController.viewState = GKGameCenterViewControllerStateLeaderboards;
gameCenterController.gameCenterDelegate = self;
[self presentViewController:gameCenterController animated:YES completion:nil];
}
- (void) gameCenterViewControllerDidFinish:(GKGameCenterViewController*) gameCenterViewController {
[self dismissViewControllerAnimated:YES completion:nil];
}
- (void) presentAchievements {
GKGameCenterViewController* gameCenterController = [[GKGameCenterViewController alloc] init];
gameCenterController.viewState = GKGameCenterViewControllerStateAchievements;
gameCenterController.gameCenterDelegate = self;
[self presentViewController:gameCenterController animated:YES completion:nil];
}
I hope this answers your question!

Take a picture in iOS without UIImagePicker and without preview it

Do you know any way to take a photo on iOS and save it to the Camera Roll with a single button press, without showing any preview?
I already know how to show the camera view, but it shows a preview of the image and the user has to tap the take-photo button to take the photo.
In a few words: the user taps the button, the picture is taken, with no preview and no double check to take or save the photo.
I already found the takePicture method of UIIMagePickerController Class http://developer.apple.com/library/ios/documentation/uikit/reference/UIImagePickerController_Class/UIImagePickerController/UIImagePickerController.html#//apple_ref/occ/instm/UIImagePickerController/takePicture
Set the showsCameraControls property to NO.
poc = [[UIImagePickerController alloc] init];
[poc setTitle:@"Take a photo."];
[poc setDelegate:self];
[poc setSourceType:UIImagePickerControllerSourceTypeCamera];
poc.showsCameraControls = NO;
You also have to add your own controls as a custom view on top of poc.view. That is very simple, and you can apply your own UI style that way.
You receive the image data as usual in imagePickerController:didFinishPickingMediaWithInfo:.
To take the photo, you call
[poc takePicture];
from your custom button.
Hope that works for you.
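A minimal sketch of that setup, assuming a poc property for the picker and a view controller that adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate (the method names here are placeholders, not from the answer above):
// Present the camera without the standard controls and add one custom shutter button.
- (void)presentHiddenControlsCamera {
    self.poc = [[UIImagePickerController alloc] init];
    self.poc.delegate = self;
    self.poc.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.poc.showsCameraControls = NO;

    UIButton *shutter = [UIButton buttonWithType:UIButtonTypeSystem];
    shutter.frame = CGRectMake(20, 20, 120, 44);
    [shutter setTitle:@"Take photo" forState:UIControlStateNormal];
    [shutter addTarget:self action:@selector(shutterTapped) forControlEvents:UIControlEventTouchUpInside];

    UIView *overlay = [[UIView alloc] initWithFrame:self.poc.view.bounds];
    [overlay addSubview:shutter];
    self.poc.cameraOverlayView = overlay;

    [self presentViewController:self.poc animated:YES completion:nil];
}
- (void)shutterTapped {
    [self.poc takePicture];
}
// Delegate callback: save straight to the Camera Roll, no preview shown.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *photo = info[UIImagePickerControllerOriginalImage];
    UIImageWriteToSavedPhotosAlbum(photo, nil, nil, nil);
    [picker dismissViewControllerAnimated:YES completion:nil];
}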
Assuming you want a point-and-shoot method, you can create an AVCaptureSession and just call the UIImageWriteToSavedPhotosAlbum function.
Here is a link that goes into that exact process: http://www.musicalgeometry.com/?p=1297
It's also worth noting that your users need to have given the app access to their camera roll or you may experience issues saving the images.
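A rough sketch of that approach, assuming stillImageOutput has already been added to a running AVCaptureSession (this uses the older AVCaptureStillImageOutput API, matching the era of this answer; error handling is omitted):
// Capture one still frame from an already-running AVCaptureSession and save it.
- (void)captureAndSaveStillImage {
    AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                        completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (sampleBuffer == NULL) { return; }
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:jpegData];
        // Requires the user to have granted photo library access.
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }];
}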
You need to design your own custom preview at whatever size you need; when the capture button is pressed, call the buttonPressed method and do whatever you want:
- (void)buttonPressed:(UIButton *)sender {
    NSLog(@"Capture Clicked");
    [self.imagePicker takePicture];
    //[NSTimer scheduledTimerWithTimeInterval:3.0f target:self selector:@selector(timerFired:) userInfo:nil repeats:NO];
}
The following is code that will take a photo without showing a preview screen. When I tried the accepted answer, which uses UIImagePickerController, the preview screen showed and then auto-disappeared. With the code below, the user taps the 'takePhoto' button and the device takes the photo with zero change to the UI (in my app, I add a green check mark next to the take-photo button). The code below is from Apple, https://developer.apple.com/LIBRARY/IOS/samplecode/AVCam/Introduction/Intro.html, with the 'extra functions' (the ones that do not relate to taking a still photo) commented out. Thank you incmiko for suggesting this code in your answer iOS take photo from camera without modalViewController.
Updating code, 26 Mar 2015:
To trigger taking the photo:
[self snapStillImage:sender];
in .h file:
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
// include code below in header file, after #import and before @interface
// avfoundation copy paste code
static void * CapturingStillImageContext = &CapturingStillImageContext;
static void * RecordingContext = &RecordingContext;
static void * SessionRunningAndDeviceAuthorizedContext = &SessionRunningAndDeviceAuthorizedContext;
// avfoundation, include code below after @interface
// avf - Session management.
@property (nonatomic) dispatch_queue_t sessionQueue; // Communicate with the session and other session objects on this queue.
@property (nonatomic) AVCaptureSession *session;
@property (nonatomic) AVCaptureDeviceInput *videoDeviceInput;
@property (nonatomic) AVCaptureMovieFileOutput *movieFileOutput;
@property (nonatomic) AVCaptureStillImageOutput *stillImageOutput;
// avf - Utilities.
@property (nonatomic) UIBackgroundTaskIdentifier backgroundRecordingID;
@property (nonatomic, getter = isDeviceAuthorized) BOOL deviceAuthorized;
@property (nonatomic, readonly, getter = isSessionRunningAndDeviceAuthorized) BOOL sessionRunningAndDeviceAuthorized;
@property (nonatomic) BOOL lockInterfaceRotation;
@property (nonatomic) id runtimeErrorHandlingObserver;
in .m file:
#pragma mark - AV Foundation
- (BOOL)isSessionRunningAndDeviceAuthorized
{
return [[self session] isRunning] && [self isDeviceAuthorized];
}
+ (NSSet *)keyPathsForValuesAffectingSessionRunningAndDeviceAuthorized
{
return [NSSet setWithObjects:#"session.running", #"deviceAuthorized", nil];
}
// call following method from viewDidLoad
- (void)CreateAVCaptureSession
{
// Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];
// Check for device authorization
[self checkDeviceAuthorizationStatus];
// In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
// Why not do all of this on the main queue?
// -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
dispatch_async(sessionQueue, ^{
[self setBackgroundRecordingID:UIBackgroundTaskInvalid];
NSError *error = nil;
AVCaptureDevice *videoDevice = [ViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionFront];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error)
{
NSLog(#"%#", error);
}
if ([session canAddInput:videoDeviceInput])
{
[session addInput:videoDeviceInput];
[self setVideoDeviceInput:videoDeviceInput];
dispatch_async(dispatch_get_main_queue(), ^{
// Why are we dispatching this to the main queue?
// Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
// Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
});
}
/* AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (error)
{
NSLog(#"%#", error);
}
if ([session canAddInput:audioDeviceInput])
{
[session addInput:audioDeviceInput];
}
*/
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput])
{
[session addOutput:movieFileOutput];
AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoStabilizationSupported])
[connection setEnablesVideoStabilizationWhenAvailable:YES];
[self setMovieFileOutput:movieFileOutput];
}
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:stillImageOutput])
{
[stillImageOutput setOutputSettings:#{AVVideoCodecKey : AVVideoCodecJPEG}];
[session addOutput:stillImageOutput];
[self setStillImageOutput:stillImageOutput];
}
});
}
// call method below from viewWilAppear
- (void)AVFoundationStartSession
{
dispatch_async([self sessionQueue], ^{
[self addObserver:self forKeyPath:#"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
[self addObserver:self forKeyPath:#"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
[self addObserver:self forKeyPath:#"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
[[NSNotificationCenter defaultCenter] addObserver:self selector:#selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
__weak ViewController *weakSelf = self;
[self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
ViewController *strongSelf = weakSelf;
dispatch_async([strongSelf sessionQueue], ^{
// Manually restarting the session since it must have been stopped due to an error.
[[strongSelf session] startRunning];
});
}]];
[[self session] startRunning];
});
}
// call method below from viewDidDisappear
- (void)AVFoundationStopSession
{
dispatch_async([self sessionQueue], ^{
[[self session] stopRunning];
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
[[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];
[self removeObserver:self forKeyPath:#"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
[self removeObserver:self forKeyPath:#"stillImageOutput.capturingStillImage" context:CapturingStillImageContext];
[self removeObserver:self forKeyPath:#"movieFileOutput.recording" context:RecordingContext];
});
}
- (BOOL)prefersStatusBarHidden
{
return YES;
}
- (BOOL)shouldAutorotate
{
// Disable autorotation of the interface when recording is in progress.
return ![self lockInterfaceRotation];
}
- (NSUInteger)supportedInterfaceOrientations
{
return UIInterfaceOrientationMaskAll;
}
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
// [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)toInterfaceOrientation];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if (context == CapturingStillImageContext)
{
BOOL isCapturingStillImage = [change[NSKeyValueChangeNewKey] boolValue];
if (isCapturingStillImage)
{
[self runStillImageCaptureAnimation];
}
}
else if (context == RecordingContext)
{
BOOL isRecording = [change[NSKeyValueChangeNewKey] boolValue];
dispatch_async(dispatch_get_main_queue(), ^{
if (isRecording)
{
// [[self cameraButton] setEnabled:NO];
// [[self recordButton] setTitle:NSLocalizedString(#"Stop", #"Recording button stop title") forState:UIControlStateNormal];
// [[self recordButton] setEnabled:YES];
}
else
{
// [[self cameraButton] setEnabled:YES];
// [[self recordButton] setTitle:NSLocalizedString(#"Record", #"Recording button record title") forState:UIControlStateNormal];
// [[self recordButton] setEnabled:YES];
}
});
}
else if (context == SessionRunningAndDeviceAuthorizedContext)
{
BOOL isRunning = [change[NSKeyValueChangeNewKey] boolValue];
dispatch_async(dispatch_get_main_queue(), ^{
if (isRunning)
{
// [[self cameraButton] setEnabled:YES];
// [[self recordButton] setEnabled:YES];
// [[self stillButton] setEnabled:YES];
}
else
{
// [[self cameraButton] setEnabled:NO];
// [[self recordButton] setEnabled:NO];
// [[self stillButton] setEnabled:NO];
}
});
}
else
{
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
}
#pragma mark Actions
- (IBAction)toggleMovieRecording:(id)sender
{
// [[self recordButton] setEnabled:NO];
dispatch_async([self sessionQueue], ^{
if (![[self movieFileOutput] isRecording])
{
[self setLockInterfaceRotation:YES];
if ([[UIDevice currentDevice] isMultitaskingSupported])
{
// Setup background task. This is needed because the captureOutput:didFinishRecordingToOutputFileAtURL: callback is not received until AVCam returns to the foreground unless you request background execution time. This also ensures that there will be time to write the file to the assets library when AVCam is backgrounded. To conclude this background execution, -endBackgroundTask is called in -recorder:recordingDidFinishToOutputFileURL:error: after the recorded file has been saved.
[self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
}
// Update the orientation on the movie file output video connection before starting recording.
// [[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
// Turning OFF flash for video recording
[ViewController setFlashMode:AVCaptureFlashModeOff forDevice:[[self videoDeviceInput] device]];
// Start recording to a temporary file.
NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[#"movie" stringByAppendingPathExtension:#"mov"]];
[[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
}
else
{
[[self movieFileOutput] stopRecording];
}
});
}
- (IBAction)changeCamera:(id)sender
{
// [[self cameraButton] setEnabled:NO];
// [[self recordButton] setEnabled:NO];
// [[self stillButton] setEnabled:NO];
dispatch_async([self sessionQueue], ^{
AVCaptureDevice *currentVideoDevice = [[self videoDeviceInput] device];
AVCaptureDevicePosition preferredPosition = AVCaptureDevicePositionUnspecified;
AVCaptureDevicePosition currentPosition = [currentVideoDevice position];
switch (currentPosition)
{
case AVCaptureDevicePositionUnspecified:
preferredPosition = AVCaptureDevicePositionBack;
break;
case AVCaptureDevicePositionBack:
preferredPosition = AVCaptureDevicePositionFront;
break;
case AVCaptureDevicePositionFront:
preferredPosition = AVCaptureDevicePositionBack;
break;
}
AVCaptureDevice *videoDevice = [ViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:preferredPosition];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
[[self session] beginConfiguration];
[[self session] removeInput:[self videoDeviceInput]];
if ([[self session] canAddInput:videoDeviceInput])
{
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentVideoDevice];
[ViewController setFlashMode:AVCaptureFlashModeAuto forDevice:videoDevice];
[[NSNotificationCenter defaultCenter] addObserver:self selector:#selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
[[self session] addInput:videoDeviceInput];
[self setVideoDeviceInput:videoDeviceInput];
}
else
{
[[self session] addInput:[self videoDeviceInput]];
}
[[self session] commitConfiguration];
dispatch_async(dispatch_get_main_queue(), ^{
// [[self cameraButton] setEnabled:YES];
// [[self recordButton] setEnabled:YES];
// [[self stillButton] setEnabled:YES];
});
});
}
- (IBAction)snapStillImage:(id)sender
{
dispatch_async([self sessionQueue], ^{
// Update the orientation on the still image output video connection before capturing.
// [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
// Flash set to Auto for Still Capture
[ViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];
// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
// do someting good with saved image
[self saveImageToParse:image];
}
}];
});
}
- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer
{
// CGPoint devicePoint = [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] captureDevicePointOfInterestForPoint:[gestureRecognizer locationInView:[gestureRecognizer view]]];
// [self focusWithMode:AVCaptureFocusModeAutoFocus exposeWithMode:AVCaptureExposureModeAutoExpose atDevicePoint:devicePoint monitorSubjectAreaChange:YES];
}
- (void)subjectAreaDidChange:(NSNotification *)notification
{
CGPoint devicePoint = CGPointMake(.5, .5);
[self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
#pragma mark File Output Delegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
if (error)
NSLog(#"%#", error);
[self setLockInterfaceRotation:NO];
// Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO — which happens sometime after this method returns.
UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
[self setBackgroundRecordingID:UIBackgroundTaskInvalid];
[[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
if (error)
NSLog(#"%#", error);
[[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];
if (backgroundRecordingID != UIBackgroundTaskInvalid)
[[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
}];
}
#pragma mark Device Configuration
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
dispatch_async([self sessionQueue], ^{
AVCaptureDevice *device = [[self videoDeviceInput] device];
NSError *error = nil;
if ([device lockForConfiguration:&error])
{
if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode])
{
[device setFocusMode:focusMode];
[device setFocusPointOfInterest:point];
}
if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode])
{
[device setExposureMode:exposureMode];
[device setExposurePointOfInterest:point];
}
[device setSubjectAreaChangeMonitoringEnabled:monitorSubjectAreaChange];
[device unlockForConfiguration];
}
else
{
NSLog(#"%#", error);
}
});
}
+ (void)setFlashMode:(AVCaptureFlashMode)flashMode forDevice:(AVCaptureDevice *)device
{
if ([device hasFlash] && [device isFlashModeSupported:flashMode])
{
NSError *error = nil;
if ([device lockForConfiguration:&error])
{
[device setFlashMode:flashMode];
[device unlockForConfiguration];
}
else
{
NSLog(#"%#", error);
}
}
}
+ (AVCaptureDevice *)deviceWithMediaType:(NSString *)mediaType preferringPosition:(AVCaptureDevicePosition)position
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:mediaType];
AVCaptureDevice *captureDevice = [devices firstObject];
for (AVCaptureDevice *device in devices)
{
if ([device position] == position)
{
captureDevice = device;
break;
}
}
return captureDevice;
}
#pragma mark UI
- (void)runStillImageCaptureAnimation
{
/*
dispatch_async(dispatch_get_main_queue(), ^{
[[[self previewView] layer] setOpacity:0.0];
[UIView animateWithDuration:.25 animations:^{
[[[self previewView] layer] setOpacity:1.0];
}];
});
*/
}
- (void)checkDeviceAuthorizationStatus
{
NSString *mediaType = AVMediaTypeVideo;
[AVCaptureDevice requestAccessForMediaType:mediaType completionHandler:^(BOOL granted) {
if (granted)
{
//Granted access to mediaType
[self setDeviceAuthorized:YES];
}
else
{
//Not granted access to mediaType
dispatch_async(dispatch_get_main_queue(), ^{
[[[UIAlertView alloc] initWithTitle:#"AVCam!"
message:#"AVCam doesn't have permission to use Camera, please change privacy settings"
delegate:self
cancelButtonTitle:#"OK"
otherButtonTitles:nil] show];
[self setDeviceAuthorized:NO];
});
}
}];
}

Can't play system sounds after capturing audio / video

I'm recording audio/video using AVFoundation, and I need to play a sound, using system sounds, before I start capturing video/audio. This works correctly the first time, but when I try to do it the second time, the system audio doesn't play. My guess is that something in AVFoundation is not being released correctly.
In my application delegate, I have this code in the applicationDidFinishLaunching method:
VKRSAppSoundPlayer *aPlayer = [[VKRSAppSoundPlayer alloc] init];
[aPlayer addSoundWithFilename:@"sound1" andExtension:@"caf"];
self.appSoundPlayer = aPlayer;
[aPlayer release];
and also this method
- (void)playSound:(NSString *)sound
{
[appSoundPlayer playSound:sound];
}
As you can see I'm using VKRSAppSoundPlayer, which works great!
In a view, I have this code:
- (void) startSession
{
self.session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
if([session canSetSessionPreset:AVCaptureSessionPreset640x480])
session.sessionPreset = AVCaptureSessionPresetMedium;
[session commitConfiguration];
CALayer *viewLayer = [videoPreviewView layer];
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = viewLayer.bounds;
[viewLayer addSublayer:captureVideoPreviewLayer];
self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCameraIfAvailable] error:nil];
self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[self audioDevice] error:nil];
if(videoInput){
self.videoOutput = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:videoOutput];
//[videoOutput release];
if([session canAddInput:videoInput]){
//[session beginConfiguration];
[session addInput:videoInput];
}
//[videoInput release];
[session removeInput:[self audioInput]];
if([session canAddInput:audioInput]){
[session addInput:audioInput];
}
//[audioInput release];
if([session canAddInput:audioInput])
[session addInput:audioInput];
NSLog(#"startRunning!");
[session startRunning];
[self startRecording];
if(![self recordsVideo])
[self showAlertWithTitle:#"Video Recording Unavailable" msg:#"This device can't record video."];
}
}
- (void) stopSession
{
[session stopRunning];
[session release];
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
Boolean cameraFound = false;
for (AVCaptureDevice *device in videoDevices)
{
NSLog(#"1 frontFacingCameraIfAvailable %d", device.position);
if (device.position == AVCaptureDevicePositionBack){
NSLog(#"1 frontFacingCameraIfAvailable FOUND");
captureDevice = device;
cameraFound = true;
break;
}
}
if(cameraFound == false){
for (AVCaptureDevice *device in videoDevices)
{
NSLog(#"2 frontFacingCameraIfAvailable %d", device.position);
if (device.position == AVCaptureDevicePositionFront){
NSLog(#"2 frontFacingCameraIfAvailable FOUND");
captureDevice = device;
break;
}
}
}
return captureDevice;
}
- (AVCaptureDevice *) audioDevice
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
if ([devices count] > 0) {
return [devices objectAtIndex:0];
}
return nil;
}
- (void) startRecording
{
#if _Multitasking_
if ([[UIDevice currentDevice] isMultitaskingSupported]) {
[self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
}
#endif
[videoOutput startRecordingToOutputFileURL:[self generatenewVideoPath]
recordingDelegate:self];
}
- (void) stopRecording
{
[videoOutput stopRecording];
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections error:(NSError *)error
{
NSFileManager *man = [[NSFileManager alloc] init];
NSDictionary *attrs = [man attributesOfItemAtPath: [outputFileURL path] error: NULL];
NSString *fileSize = [NSString stringWithFormat:#"%llu", [attrs fileSize]];
// close this screen
[self exitScreen];
}
-(BOOL)recordsVideo
{
AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
fromConnections:[videoOutput connections]];
return [videoConnection isActive];
}
-(BOOL)recordsAudio
{
AVCaptureConnection *audioConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeAudio
fromConnections:[videoOutput connections]];
return [audioConnection isActive];
}
If I do [videoInput release]; and [audioInput release]; I get a bad access error; that's why they are commented out. This may be part of the issue.
If I try to play the system sound n times, it works, but if I go to the recording screen first, it won't work after that.
Any ideas?
The proper way to release AVCaptureSession is the following:
- (void) destroySession {
// Notify the view that the session will end
if ([delegate respondsToSelector:@selector(captureManagerSessionWillEnd:)]) {
[delegate captureManagerSessionWillEnd:self];
}
// remove the device inputs
[session removeInput:[self videoInput]];
[session removeInput:[self audioInput]];
// release
[session release];
// remove AVCamRecorder
[recorder release];
// Notify the view that the session has ended
if ([delegate respondsToSelector:@selector(captureManagerSessionEnded:)]) {
[delegate captureManagerSessionEnded:self];
}
}
If you're having some sort of release problem (bad access), I recommend taking your code out of your current "messy" project into a fresh project and debugging the problem there.
When I had a similar problem, that's what I did. I shared it on GitHub; you might find this project useful: AVCam-CameraReleaseTest
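As a usage sketch only (it assumes the question's stopRecording and playSound: methods and the destroySession above are reachable from the same place, and AppDelegate is a placeholder class name): tear the capture session down completely before asking the system sound to play again.
- (void)finishCaptureAndPlayConfirmation
{
    [self stopRecording];     // stop the movie file output first
    [self destroySession];    // remove the inputs and release the AVCaptureSession
    // With the capture session fully torn down, the system sound should play again.
    [(AppDelegate *)[[UIApplication sharedApplication] delegate] playSound:@"sound1"];
}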