Can't play system sounds after capturing audio / video - objective-c

I'm recording audio/video using AVFoundation, and I need to play a sound, using system sounds, before I start capturing video/audio. This works correctly the first time, but when I try it a second time, the system audio doesn't play. My guess is that something in AVFoundation is not being released correctly.
In my application delegate, I have this code in the applicationDidFinishLaunching method:
VKRSAppSoundPlayer *aPlayer = [[VKRSAppSoundPlayer alloc] init];
[aPlayer addSoundWithFilename:@"sound1" andExtension:@"caf"];
self.appSoundPlayer = aPlayer;
[aPlayer release];
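For that release to be safe, appSoundPlayer has to be a property that retains its value; under manual reference counting the declaration would presumably look like this (a sketch, not taken from the question):

@property (nonatomic, retain) VKRSAppSoundPlayer *appSoundPlayer;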
and also this method
- (void)playSound:(NSString *)sound
{
    [appSoundPlayer playSound:sound];
}
As you can see I'm using VKRSAppSoundPlayer, which works great!
In a view, I have this code:
- (void) startSession
{
    self.session = [[AVCaptureSession alloc] init];

    [session beginConfiguration];
    if([session canSetSessionPreset:AVCaptureSessionPreset640x480])
        session.sessionPreset = AVCaptureSessionPresetMedium;
    [session commitConfiguration];

    CALayer *viewLayer = [videoPreviewView layer];
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = viewLayer.bounds;
    [viewLayer addSublayer:captureVideoPreviewLayer];

    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCameraIfAvailable] error:nil];
    self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[self audioDevice] error:nil];

    if(videoInput){
        self.videoOutput = [[AVCaptureMovieFileOutput alloc] init];
        [session addOutput:videoOutput];
        //[videoOutput release];

        if([session canAddInput:videoInput]){
            //[session beginConfiguration];
            [session addInput:videoInput];
        }
        //[videoInput release];

        [session removeInput:[self audioInput]];
        if([session canAddInput:audioInput]){
            [session addInput:audioInput];
        }
        //[audioInput release];

        if([session canAddInput:audioInput])
            [session addInput:audioInput];

        NSLog(@"startRunning!");
        [session startRunning];

        [self startRecording];
        if(![self recordsVideo])
            [self showAlertWithTitle:@"Video Recording Unavailable" msg:@"This device can't record video."];
    }
}
- (void) stopSession
{
    [session stopRunning];
    [session release];
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    BOOL cameraFound = NO;

    for (AVCaptureDevice *device in videoDevices)
    {
        NSLog(@"1 frontFacingCameraIfAvailable %d", device.position);
        if (device.position == AVCaptureDevicePositionBack){
            NSLog(@"1 frontFacingCameraIfAvailable FOUND");
            captureDevice = device;
            cameraFound = YES;
            break;
        }
    }

    if(cameraFound == NO){
        for (AVCaptureDevice *device in videoDevices)
        {
            NSLog(@"2 frontFacingCameraIfAvailable %d", device.position);
            if (device.position == AVCaptureDevicePositionFront){
                NSLog(@"2 frontFacingCameraIfAvailable FOUND");
                captureDevice = device;
                break;
            }
        }
    }
    return captureDevice;
}
- (AVCaptureDevice *) audioDevice
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
    if ([devices count] > 0) {
        return [devices objectAtIndex:0];
    }
    return nil;
}
- (void) startRecording
{
#if _Multitasking_
    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
    }
#endif
    [videoOutput startRecordingToOutputFileURL:[self generatenewVideoPath]
                             recordingDelegate:self];
}

- (void) stopRecording
{
    [videoOutput stopRecording];
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSFileManager *man = [[NSFileManager alloc] init];
    NSDictionary *attrs = [man attributesOfItemAtPath:[outputFileURL path] error:NULL];
    NSString *fileSize = [NSString stringWithFormat:@"%llu", [attrs fileSize]];
    // close this screen
    [self exitScreen];
}
-(BOOL)recordsVideo
{
    AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
                                                                   fromConnections:[videoOutput connections]];
    return [videoConnection isActive];
}

-(BOOL)recordsAudio
{
    AVCaptureConnection *audioConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeAudio
                                                                   fromConnections:[videoOutput connections]];
    return [audioConnection isActive];
}
If I call [videoInput release] and [audioInput release], I get a bad access crash, which is why they are commented out. (deviceInputWithDevice:error: returns an autoreleased object, so an explicit release over-releases it, which would explain the crash.) This may be part of the issue.
If I play the system sound n times in a row it works, but once I go through the recording screen first, it no longer plays afterwards.
Any ideas?

The proper way to release AVCaptureSession is the following:
- (void) destroySession {
    // Notify the view that the session will end
    if ([delegate respondsToSelector:@selector(captureManagerSessionWillEnd:)]) {
        [delegate captureManagerSessionWillEnd:self];
    }

    // remove the device inputs
    [session removeInput:[self videoInput]];
    [session removeInput:[self audioInput]];

    // release
    [session release];

    // remove AVCamRecorder
    [recorder release];

    // Notify the view that the session has ended
    if ([delegate respondsToSelector:@selector(captureManagerSessionEnded:)]) {
        [delegate captureManagerSessionEnded:self];
    }
}
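The same point applies to the stopSession method above: under manual reference counting, releasing the ivar without clearing it leaves a dangling pointer that a later release (for example, from a retain-property setter) will over-release. A slightly safer variant, assuming session is a retained property (a sketch, not from the original code):

- (void) stopSession
{
    [session stopRunning];
    self.session = nil; // the setter releases the old session and clears the pointer
}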
If you're having release problems (bad access) like these, I recommend moving the code out of your current "messy" project into a new project and debugging the problem there.
When I had a similar problem, that's exactly what I did. I shared it on GitHub; you might find this project useful: AVCam-CameraReleaseTest

Related

Text To Speech And Speech To Text Recognition: self-recognition is occurring

I'm developing an app which should support both speech to text and text to speech.
i) Speech to text: I used the Speech framework. Whenever I open the app and start speaking, it should recognize the voice and convert the speech into text. This is working.
ii) Text to speech: I used AVFoundation and the MediaPlayer library. If the user presses the play button, it should convert the text currently on screen into speech. This also works.
Here is the problem I'm facing: while processing text to speech, the speech recognizer picks up the synthesized voice and prints the words into the text box again.
For example, if I say "Hello Good Morning", it is printed in the text box; if I then press the play button, a voice says "Hello Good Morning", but the speech-to-text recognizer hears that voice (self-recognition) and the text box ends up with "Hello Good Morning Hello Good Morning".
I want to stop the speech-to-text process while text to speech is running. To do that, I stop the speech recognition request while the speech is playing.
Here is the code:
@implementation ViewController
{
    SFSpeechAudioBufferRecognitionRequest *recognitionRequest;
    SFSpeechRecognitionTask *recognitionTask;
    AVAudioEngine *audioEngine;
    NSMutableArray *speechStringsArray;
    BOOL SpeechToText;
    NSString *resultString;
    NSString *str;
    NSString *searchString;
    NSString *textToSpeak;
}
- (void)viewDidLoad {
    [super viewDidLoad];

    // Speech to text setup
    speechStringsArray = [[NSMutableArray alloc] init];

    // Initialize background audio session
    NSError *error = NULL;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error) {
        NSLog(@"error: %@", error);
    }
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"error: %@", error);
    }

    // Enable remote controls
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];

    // Voice setup
    self.voicePicker.delegate = self;
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-us"];
    self.voices = [NSMutableArray arrayWithObjects:
                   @{@"voice" : @"en-us", @"label" : @"American English (Female)"},
                   @{@"voice" : @"en-au", @"label" : @"Australian English (Female)"},
                   @{@"voice" : @"en-gb", @"label" : @"British English (Male)"},
                   @{@"voice" : @"en-ie", @"label" : @"Irish English (Female)"},
                   @{@"voice" : @"en-za", @"label" : @"South African English (Female)"},
                   nil];

    // Synthesizer setup
    self.synthesizer = [[AVSpeechSynthesizer alloc] init];
    self.synthesizer.delegate = self;

    // UITextView delegate
    self.textView.delegate = self;

    // This notification is posted from the AppDelegate's applicationDidBecomeActive method to make sure that if the play or pause state changed in the background, the button in the toolbar is updated
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateToolbar) name:@"updateToolbar" object:nil];
}
- (void)viewDidAppear:(BOOL)animated
{
    self.speechRecognizer = [[SFSpeechRecognizer alloc] initWithLocale:[NSLocale localeWithLocaleIdentifier:@"en-US en-UK"]];
    self.speechRecognizer.delegate = self;
    audioEngine = [[AVAudioEngine alloc] init];

    [SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
        switch (authStatus) {
            case SFSpeechRecognizerAuthorizationStatusAuthorized:
                // User gave access to speech recognition
                NSLog(@"Authorized");
                [self start_record];
                break;
            case SFSpeechRecognizerAuthorizationStatusDenied:
                // User denied access to speech recognition
                NSLog(@"AuthorizationStatusDenied");
                break;
            case SFSpeechRecognizerAuthorizationStatusRestricted:
                // Speech recognition restricted on this device
                NSLog(@"AuthorizationStatusRestricted");
                break;
            case SFSpeechRecognizerAuthorizationStatusNotDetermined:
                // Speech recognition not yet authorized
                break;
            default:
                NSLog(@"Default");
                break;
        }
    }];
    // MARK: Interface Builder Actions
}
Code for increasing the speed and pitch:
- (IBAction)handleSpeedStepper:(UIStepper *)sender
{
    double speedValue = self.speedStepper.value;
    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speedValue]];
}

- (IBAction)handlePitchStepper:(UIStepper *)sender
{
    double pitchValue = self.pitchStepper.value;
    [self.pitchValueLabel setText:[NSString stringWithFormat:@"%.1f", pitchValue]];
}
// Play button for text to speech
- (IBAction)handlePlayPauseButton:(UIBarButtonItem *)sender
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {
        if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
            // Stop immediately
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        }
        else {
            // Stop at end of current word
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
        }
        [self updateToolbarWithButton:@"play"];
    }
    else if (self.synthesizer.paused) {
        [self.synthesizer continueSpeaking];
        [self updateToolbarWithButton:@"pause"];
    }
    else {
        [self speakUtterance];
        [self updateToolbarWithButton:@"pause"];
    }
}
// Method for speech to text
- (void)start_record {
    NSError *outError;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&outError];
    [audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
    [audioSession setActive:YES withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&outError];

    recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
    AVAudioInputNode *inputNode = audioEngine.inputNode;
    if (recognitionRequest == nil) {
        NSLog(@"Unable to create a SFSpeechAudioBufferRecognitionRequest object");
    }
    if (inputNode == nil) {
        NSLog(@"Audio engine has no input node");
    }

    // Configure the request so that results are returned before audio recording is finished
    [recognitionRequest setShouldReportPartialResults:YES];

    // A recognition task represents a speech recognition session.
    // We keep a reference to the task so that it can be cancelled.
    recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:recognitionRequest resultHandler:^(SFSpeechRecognitionResult *result, NSError *error1) {
        BOOL isFinal = NO;
        if (result != nil) {
            NSString *speech = result.bestTranscription.formattedString;
            NSLog(@"the speech: %@", speech);

            // Code for fixing the appended-string issue
            for (int i = 0; i < speechStringsArray.count; i++)
            {
                str = [speechStringsArray objectAtIndex:i];
                NSRange range = [speech rangeOfString:str options:NSCaseInsensitiveSearch];
                NSLog(@"found: %@", (range.location != NSNotFound) ? @"Yes" : @"No");
                if (range.location != NSNotFound) {
                    resultString = [speech stringByReplacingCharactersInRange:range withString:@""];
                    speech = resultString;
                    NSLog(@"the result is: %@", resultString);
                }
            }

            // Specific function - space for the second word
            if (resultString.length > 0) {
                self.textView.text = [NSString stringWithFormat:@"%@%@", self.textView.text, resultString];
                [speechStringsArray addObject:resultString];
            }
            // Specific function - space for the first word (working fine)
            else
            {
                [speechStringsArray addObject:speech];
                self.textView.text = speech;
            }
            NSLog(@"array %@", speechStringsArray);
            isFinal = result.isFinal;
        }
        if (error1 != nil || isFinal) {
            [audioEngine stop];
            [inputNode removeTapOnBus:0];
            recognitionRequest = nil;
            recognitionTask = nil;
            [self start_record];
        }
    }];

    AVAudioFormat *recordingFormat = [inputNode outputFormatForBus:0];
    [inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
        [recognitionRequest appendAudioPCMBuffer:buffer];
    }];

    NSError *error1;
    [audioEngine prepare];
    [audioEngine startAndReturnError:&error1];
}
- (void)speakUtterance
{
    NSLog(@"speakUtterance");
    didStartSpeaking = NO;
    textToSpeak = [NSString stringWithFormat:@"%@", self.textView.text];
    AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:textToSpeak];
    utterance.rate = self.speedStepper.value;
    utterance.pitchMultiplier = self.pitchStepper.value;
    utterance.voice = self.voice;
    [self.synthesizer speakUtterance:utterance];
    [self displayBackgroundMediaFields];
}
- (void)displayBackgroundMediaFields
{
    MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"Play"]];
    NSDictionary *info = @{ MPMediaItemPropertyTitle: self.textView.text,
                            MPMediaItemPropertyAlbumTitle: @"TextToSpeech App",
                            MPMediaItemPropertyArtwork: artwork };
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;
}
- (void)updateToolbar
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {
        [self updateToolbarWithButton:@"pause"];
    }
    else {
        [self updateToolbarWithButton:@"play"];
    }
}

- (void)updateToolbarWithButton:(NSString *)buttonType
{
    // Stop the speech-to-text process
    if (audioEngine.isRunning) {
        [audioEngine stop];
        [recognitionRequest endAudio];
    }
    NSLog(@"updateToolbarWithButton: %@", buttonType);

    UIBarButtonItem *audioControl;
    if ([buttonType isEqualToString:@"play"]) {
        // Play
        audioControl = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemPlay target:self action:@selector(handlePlayPauseButton:)];
    }
    else {
        // Pause
        audioControl = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemPause target:self action:@selector(handlePlayPauseButton:)];
    }
    UIBarButtonItem *flexibleItem = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemFlexibleSpace target:nil action:nil];
    [self.toolbar setItems:@[flexibleItem, audioControl, flexibleItem]];
}
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent
{
    NSLog(@"receivedEvent: %@", receivedEvent);
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlPlay:
                NSLog(@"UIEventSubtypeRemoteControlPlay");
                if (self.synthesizer.speaking) {
                    [self.synthesizer continueSpeaking];
                }
                else {
                    [self speakUtterance];
                }
                break;
            case UIEventSubtypeRemoteControlPause:
                NSLog(@"pause - UIEventSubtypeRemoteControlPause");
                if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                    // Pause immediately
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                }
                else {
                    // Pause at end of current word
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                }
                break;
            case UIEventSubtypeRemoteControlTogglePlayPause:
                NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
                if (self.synthesizer.paused) {
                    [self.synthesizer continueSpeaking];
                }
                else {
                    if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                        // Pause immediately
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                    }
                    else {
                        // Pause at end of current word
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                    }
                }
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                NSLog(@"UIEventSubtypeRemoteControlNextTrack - appropriate for playlists");
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                NSLog(@"UIEventSubtypeRemoteControlPreviousTrack - appropriate for playlists");
                break;
            default:
                break;
        }
    }
}
#pragma mark UIPickerViewDelegate Methods

- (NSInteger)numberOfComponentsInPickerView:(UIPickerView *)pickerView
{
    return 1;
}

- (NSInteger)pickerView:(UIPickerView *)pickerView numberOfRowsInComponent:(NSInteger)component
{
    return self.voices.count;
}

- (UIView *)pickerView:(UIPickerView *)pickerView viewForRow:(NSInteger)row forComponent:(NSInteger)component reusingView:(UIView *)view
{
    UILabel *rowLabel = [[UILabel alloc] init];
    NSDictionary *voice = [self.voices objectAtIndex:row];
    rowLabel.text = [voice objectForKey:@"label"];
    return rowLabel;
}

- (void)pickerView:(UIPickerView *)pickerView didSelectRow:(NSInteger)row inComponent:(NSInteger)component
{
    NSDictionary *voice = [self.voices objectAtIndex:row];
    NSLog(@"new picker voice selected with label: %@", [voice objectForKey:@"label"]);
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:[voice objectForKey:@"voice"]];
}
#pragma mark AVSpeechSynthesizerDelegate Methods

- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
    // This is a workaround for a bug: the first time the voice is changed, speaking the utterance
    // fails silently. willSpeakRangeOfSpeechString: sets didStartSpeaking to YES; if it was never
    // called (a silent failure), we simply request to speak again.
    if (!didStartSpeaking) {
        [self speakUtterance];
    }
    else {
        [self updateToolbarWithButton:@"play"];
        NSLog(@"the text is: %@", self.textView.text);
    }
}

- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer willSpeakRangeOfSpeechString:(NSRange)characterRange utterance:(AVSpeechUtterance *)utterance
{
    didStartSpeaking = YES;
    //[self setTextViewTextWithColoredCharacterRange:characterRange];
}
#pragma mark UITextViewDelegate Methods

- (BOOL)textView:(UITextView *)textView shouldChangeTextInRange:(NSRange)range replacementText:(NSString *)text {
    if ([text isEqualToString:@"\n"]) {
        [textView resignFirstResponder];
        return NO;
    }
    return YES;
}
Don't initialize everything in viewDidLoad. When you tap the button to start text to speech, tear down the speech-to-text objects: set them to nil and clear their delegates. Do the same in the other direction when you switch back.
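A minimal sketch of that teardown, using the names from the question's code (call it before speakUtterance, and call start_record again once speaking finishes; the method name stopSpeechToText is assumed):

- (void)stopSpeechToText
{
    if (audioEngine.isRunning) {
        [audioEngine stop];
        [audioEngine.inputNode removeTapOnBus:0]; // remove the tap installed in start_record
    }
    [recognitionRequest endAudio];
    [recognitionTask cancel];
    recognitionRequest = nil;
    recognitionTask = nil;
}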

Record video via AVFoundation in iOS

I'm trying to create an application that records video and takes pictures when it receives a message via sockets.
What I can't get working is the video recording with AVFoundation.
Is that because I use a single session?
Here is my code:
#import "ViewController.h"
#import "Parser.h"
#import "DeviceConfig.h"
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// z-index
_vImage.layer.zPosition = 1;
_toggle.layer.zPosition = 1;
recording = NO;
[self initNetworkCommunication];
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (void) initNetworkCommunication {
    CFReadStreamRef readStream;
    CFWriteStreamRef writeStream;
    CFStreamCreatePairWithSocketToHost(NULL, (__bridge CFStringRef)@"192.168.0.103", 8080, &readStream, &writeStream);
    input = (__bridge NSInputStream *)readStream;
    output = (__bridge NSOutputStream *)writeStream;
    [input setDelegate:self];
    [output setDelegate:self];
    [input scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [output scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [input open];
    [output open];
}

- (void) sendMessage:(NSString *)message {
    NSString *response = [NSString stringWithFormat:@"iam:%@", message];
    NSData *data = [[NSData alloc] initWithData:[response dataUsingEncoding:NSASCIIStringEncoding]];
    [output write:[data bytes] maxLength:[data length]];
}
- (void) stream:(NSStream *)theStream handleEvent:(NSStreamEvent)streamEvent {
    switch (streamEvent) {
        case NSStreamEventOpenCompleted:
            [self openCamera];
            break;
        case NSStreamEventHasBytesAvailable:
            NSLog(@"kayn");
            if (theStream == input) {
                uint8_t buffer[1024];
                int length;
                while ([input hasBytesAvailable]) {
                    length = [input read:buffer maxLength:sizeof(buffer)];
                    if (length > 0) {
                        NSString *outputString = [[NSString alloc] initWithBytes:buffer length:length encoding:NSASCIIStringEncoding];
                        // Split the different messages sent in one socket buffer
                        NSArray *options = [[outputString substringToIndex:[outputString length] - 1] componentsSeparatedByString:@":"];
                        NSLog(@"%@", options);
                        if (outputString != nil) {
                            if ([options[1] isEqualToString:@"TAKE_PIC"]) {
                                [self takePic];
                            } else if ([options[1] isEqualToString:@"PARAMS"]) {
                                Parser *parser = [[Parser alloc] initWithXMLData:options[2]];
                                [self changeCameraConfig:[parser getCameraOptions]];
                            } else if ([options[1] isEqualToString:@"REC_VIDEO"]) {
                                [self recordVideo];
                            } else {
                                [self showMessage:@"Unknown order, please contact the administrator."];
                            }
                        }
                    }
                }
            }
            break;
        default:
            NSLog(@"Unknown event");
    }
}
- (void) showMessage:(NSString *)msg {
    UIAlertView *helloWorldAlert = [[UIAlertView alloc] initWithTitle:@"My First App"
                                                              message:msg
                                                             delegate:nil
                                                    cancelButtonTitle:@"OK"
                                                    otherButtonTitles:nil];
    // Display the message
    [helloWorldAlert show];
}
- (void) openCamera {
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.frame = self.liveCameraView.bounds;
    [self.liveCameraView.layer addSublayer:previewLayer];

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    [session addInput:deviceInput];

    _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [_stillImageOutput setOutputSettings:outputSettings];

    _videoOutput = [[AVCaptureMovieFileOutput alloc] init];
    [session addOutput:_videoOutput];
    [session addOutput:_stillImageOutput];

    [session startRunning];
}
-(void) changeCameraConfig:(NSMutableDictionary *)config {
    [device lockForConfiguration:nil];
    DeviceConfig *conf;
    for (id key in config) {
        conf = [config objectForKey:key];
        //[self showMessage:@"ok"];
        // Config Torch
        if ([key isEqualToString:@"torch"]) {
            if ([[conf curValue] isEqualToString:@"on"]) {
                [device setTorchMode:AVCaptureTorchModeOn];
            } else {
                [device setTorchMode:AVCaptureTorchModeOff];
            }
        }
        // Config Focus
        if ([key isEqualToString:@"focus_mode"]) {
            if ([[conf curValue] isEqualToString:@"auto"]) {
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
            } else if ([[conf curValue] isEqualToString:@"continous_auto"]) {
                [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            } else {
                [device setFocusMode:AVCaptureFocusModeLocked];
            }
        }
        // Config Flash
        if ([key isEqualToString:@"flash"]) {
            if ([[conf curValue] isEqualToString:@"on"]) {
                [device setFlashMode:AVCaptureFlashModeOn];
            } else {
                [device setFlashMode:AVCaptureFlashModeOff];
            }
        }
        // Config White balance
        if ([key isEqualToString:@"white_balance"]) {
            if ([[conf curValue] isEqualToString:@"auto"]) {
                [device setExposureMode:AVCaptureExposureModeAutoExpose];
            } else if ([[conf curValue] isEqualToString:@"continous_auto"]) {
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            } else if ([[conf curValue] isEqualToString:@"custom"]) {
                [device setExposureMode:AVCaptureExposureModeCustom];
            } else {
                [device setExposureMode:AVCaptureExposureModeLocked];
            }
        }
        // Config Exposure
        if ([key isEqualToString:@"exposure_mode"]) {
            if ([[conf curValue] isEqualToString:@"auto"]) {
                [device setExposureMode:AVCaptureExposureModeAutoExpose];
            } else if ([[conf curValue] isEqualToString:@"continous_auto"]) {
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            } else if ([[conf curValue] isEqualToString:@"custom"]) {
                [device setExposureMode:AVCaptureExposureModeCustom];
            } else {
                [device setExposureMode:AVCaptureExposureModeLocked];
            }
        }
    }
    [device unlockForConfiguration];
}
-(void) takePic {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in _stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    NSLog(@"about to request a capture from: %@", _stillImageOutput);
    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        self.vImage.image = image;
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }];
}
- (AVCaptureDevice *) CameraWithPosition:(AVCaptureDevicePosition)Position
{
    NSArray *Devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *Device in Devices)
    {
        if ([Device position] == Position)
        {
            return Device;
        }
    }
    return nil;
}
/*
#pragma mark - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
// Get the new view controller using [segue destinationViewController].
// Pass the selected object to the new view controller.
}
*/
- (IBAction)toggleCam:(id)sender {
    AVCaptureDeviceInput *newDeviceInput;
    NSError *error;
    AVCaptureDevicePosition position = [[deviceInput device] position];
    if (position == AVCaptureDevicePositionBack) {
        newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self CameraWithPosition:AVCaptureDevicePositionFront] error:&error];
    } else {
        newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self CameraWithPosition:AVCaptureDevicePositionBack] error:&error];
    }
    if (error == nil) {
        [session beginConfiguration]; // We can now change the input and output configuration. Use commitConfiguration to end.
        [session removeInput:deviceInput];
        if ([session canAddInput:newDeviceInput])
        {
            [session addInput:newDeviceInput];
            deviceInput = newDeviceInput;
        }
        else
        {
            [session addInput:deviceInput];
        }
        // Set the connection properties again
        //[self CameraSetOutputProperties];
        [session commitConfiguration];
    }
}
-(void) recordVideo {
    [self showMessage:@"recording .."];
    // If we are not recording
    if (!recording) {
        recording = YES;
        // AVCaptureMovieFileOutput writes QuickTime movies, so use a .mov extension
        NSString *path = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output_video.mov"];
        NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
        NSFileManager *fileManager = [NSFileManager defaultManager];
        if ([fileManager fileExistsAtPath:path]) {
            NSError *errorMsg;
            if ([fileManager removeItemAtPath:path error:&errorMsg] == NO)
            {
                // Error - handle if required
            }
        }
        [_videoOutput startRecordingToOutputFileURL:url recordingDelegate:self];
    } else {
        recording = NO;
        [_videoOutput stopRecording];
    }
}
-(void) captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    BOOL recordedSuccessfully = YES;
    if ([error code] != noErr) {
        // A problem occurred: find out if the recording was successful.
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value) {
            recordedSuccessfully = [value boolValue];
        }
    }
    if (recordedSuccessfully) {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([ALAssetsLibrary authorizationStatus]) {
            if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL]) {
                [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                            completionBlock:^(NSURL *assetURL, NSError *error) {
                    if (error) {
                    }
                }];
            }
        } else {
            [self showMessage:@"Error"];
        }
    }
}

@end

Coding Multiple Leaderboards

I am making a game in which the player can achieve a positive high score or a negative low score depending on the choices they make. The high score has been working fine, but I'm having trouble with the low score.
-(void)authenticateLocalPlayer{
    GKLocalPlayer *localPlayer = [GKLocalPlayer localPlayer];
    localPlayer.authenticateHandler = ^(UIViewController *viewController, NSError *error){
        if (viewController != nil) {
            [self presentViewController:viewController animated:YES completion:nil];
        }
        else{
            if ([GKLocalPlayer localPlayer].authenticated) {
                _gameCenterEnabled = YES;
                // Get the default leaderboard identifier.
                [[GKLocalPlayer localPlayer] loadDefaultLeaderboardIdentifierWithCompletionHandler:^(NSString *leaderboardIdentifier, NSError *error) {
                    if (error != nil) {
                        NSLog(@"%@", [error localizedDescription]);
                    }
                    else{
                        _leaderboardIdentifier = leaderboardIdentifier;
                    }
                }];
            }
            else{
                _gameCenterEnabled = NO;
            }
        }
    };
}
-(void)reportScore{
    GKScore *highscore = [[GKScore alloc] initWithLeaderboardIdentifier:_leaderboardIdentifier];
    highscore.value = HighScoreNumber;
    [GKScore reportScores:@[highscore] withCompletionHandler:^(NSError *error) {
        if (error != nil) {
            NSLog(@"%@", [error localizedDescription]);
        }
    }];
}

-(void)showLeaderboardAndAchievements:(BOOL)shouldShowLeaderboard{
    GKGameCenterViewController *gcViewController = [[GKGameCenterViewController alloc] init];
    gcViewController.gameCenterDelegate = self;
    if (shouldShowLeaderboard) {
        gcViewController.viewState = GKGameCenterViewControllerStateLeaderboards;
        gcViewController.leaderboardIdentifier = _leaderboardIdentifier;
    }
    else{
        gcViewController.viewState = GKGameCenterViewControllerStateAchievements;
    }
    [self presentViewController:gcViewController animated:YES completion:nil];
}
You'll notice _leaderboardIdentifier: it is useful for reporting scores to the default leaderboard, but when I try to make it work for two different leaderboards, the app crashes.
I've tried adding this to reportScore:
GKScore *lowscore = [[GKScore alloc] initWithLeaderboardIdentifier:_leaderboardIdentifier];
lowscore.value = LowScoreNumber;
[GKScore reportScores:@[lowscore] withCompletionHandler:^(NSError *error) {
    if (error != nil) {
        NSLog(@"%@", [error localizedDescription]);
    }
}];
Then I change the leaderboard identifiers to match iTunes Connect, but I'm not sure how I need to change authenticateLocalPlayer and showLeaderboardAndAchievements:.
I have some experience with Game Center: I have two games on the App Store with multiple leaderboards (Squared!! and Primes!).
Instead of having one method for each leaderboard, I made one method for submitting scores. Here is that method:
-(void) submitScore:(int64_t)score Leaderboard:(NSString *)leaderboard
{
    //1: Check if Game Center features are enabled
    if (!_gameCenterFeaturesEnabled) {
        return;
    }
    //2: Create a GKScore object
    GKScore *gkScore = [[GKScore alloc] initWithLeaderboardIdentifier:leaderboard];
    //3: Set the score value
    gkScore.value = score;
    //4: Send the score to Game Center
    [gkScore reportScoreWithCompletionHandler:^(NSError *error) {
        [self setLastError:error];
        BOOL success = (error == nil);
        if ([_delegate respondsToSelector:@selector(onScoresSubmitted:)]) {
            [_delegate onScoresSubmitted:success];
        }
    }];
}
When you want to submit your high scores, all you have to do is add something like:
[[GCHelper sharedGameKitHelper] submitScore:myLowScore Leaderboard:TELS];
[[GCHelper sharedGameKitHelper] submitScore:myHighScore Leaderboard:TEHS];
GCHelper is the class that contains my submitScore:Leaderboard: method.
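Note that the class method reportScores:withCompletionHandler: takes an array, so both scores can also be reported in a single call. A sketch, with @"TEHS" and @"TELS" standing in for your real iTunes Connect identifiers:

GKScore *high = [[GKScore alloc] initWithLeaderboardIdentifier:@"TEHS"]; // placeholder identifier
high.value = HighScoreNumber;
GKScore *low = [[GKScore alloc] initWithLeaderboardIdentifier:@"TELS"]; // placeholder identifier
low.value = LowScoreNumber;
[GKScore reportScores:@[high, low] withCompletionHandler:^(NSError *error) {
    if (error != nil) {
        NSLog(@"%@", [error localizedDescription]);
    }
}];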
To view your leaderboards or achievements within your app try this:
- (void) presentLeaderboards {
    GKGameCenterViewController *gameCenterController = [[GKGameCenterViewController alloc] init];
    gameCenterController.viewState = GKGameCenterViewControllerStateLeaderboards;
    gameCenterController.gameCenterDelegate = self;
    [self presentViewController:gameCenterController animated:YES completion:nil];
}

- (void) gameCenterViewControllerDidFinish:(GKGameCenterViewController *)gameCenterViewController {
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void) presentAchievements {
    GKGameCenterViewController *gameCenterController = [[GKGameCenterViewController alloc] init];
    gameCenterController.viewState = GKGameCenterViewControllerStateAchievements;
    gameCenterController.gameCenterDelegate = self;
    [self presentViewController:gameCenterController animated:YES completion:nil];
}
I hope this answers your question!

Take a picture in iOS without UIImagePicker and without preview it

Do you know any way to take a photo in iOS and save it to the Camera Roll with a single button press, without showing any preview?
I already know how to show the camera view, but it shows a preview of the image and the user needs to tap the take-photo button to capture it.
In a few words: the user taps the button, the picture is taken, with no previews or double checks to take/save the photo.
I already found the takePicture method of the UIImagePickerController class: http://developer.apple.com/library/ios/documentation/uikit/reference/UIImagePickerController_Class/UIImagePickerController/UIImagePickerController.html#//apple_ref/occ/instm/UIImagePickerController/takePicture
Set the showsCameraControls property to NO:

poc = [[UIImagePickerController alloc] init];
[poc setTitle:@"Take a photo."];
[poc setDelegate:self];
[poc setSourceType:UIImagePickerControllerSourceTypeCamera];
poc.showsCameraControls = NO;

You also have to add your own controls as a custom view on top of poc.view. That is very simple, and you can apply your own UI style that way.
You receive the image data as usual within imagePickerController:didFinishPickingMediaWithInfo:.
To take the photo, you call

[poc takePicture];

from your custom button.
Hope that works for you.
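For those custom controls, UIImagePickerController's cameraOverlayView property is the usual hook. A minimal sketch (the overlay layout and the shutterTapped: action are assumed, not part of the answer above):

UIView *overlay = [[UIView alloc] initWithFrame:poc.view.bounds];
UIButton *shutter = [UIButton buttonWithType:UIButtonTypeSystem];
shutter.frame = CGRectMake(0, overlay.bounds.size.height - 80, overlay.bounds.size.width, 80);
[shutter setTitle:@"Take photo" forState:UIControlStateNormal];
[shutter addTarget:self action:@selector(shutterTapped:) forControlEvents:UIControlEventTouchUpInside];
[overlay addSubview:shutter];
poc.cameraOverlayView = overlay;
// in shutterTapped: simply call [poc takePicture];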
Assuming you want a point-and-shoot method, you can create an AVCaptureSession and just call UIImageWriteToSavedPhotosAlbum with the captured image.
Here is a link that goes into that exact process: http://www.musicalgeometry.com/?p=1297
It's also worth noting that your users need to have given the app access to their camera roll, or you may experience issues saving the images.
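Condensed, that approach looks roughly like this (a sketch assuming a running AVCaptureSession with an AVCaptureStillImageOutput named stillImageOutput already attached):

AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                               completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (sampleBuffer) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:jpegData];
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); // straight to the Camera Roll, no preview
    }
}];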
You need to design your own custom preview according to your size; when the capture button is pressed, call a buttonPressed method and do whatever you want:

- (void)buttonPressed:(UIButton *)sender {
    NSLog(@"Capture Clicked");
    [self.imagePicker takePicture];
    //[NSTimer scheduledTimerWithTimeInterval:3.0f target:self selector:@selector(timerFired:) userInfo:nil repeats:NO];
}
The following is code that will take a photo without showing a preview screen. When I tried the accepted answer, which uses UIImagePickerController, the preview screen showed and then auto-disappeared. With the code below, the user taps the 'take photo' button and the device takes the photo with zero change to the UI (in my app, I add a green check mark next to the take-photo button). The code below is from Apple's AVCam sample (https://developer.apple.com/LIBRARY/IOS/samplecode/AVCam/Introduction/Intro.html) with the 'extra functions' (that do not relate to taking a still photo) commented out. Thank you incmiko for suggesting this code in your answer iOS take photo from camera without modalViewController.
Updated code, 26 Mar 2015.
To trigger the photo:
[self snapStillImage:sender];
In the .h file:

#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

// include the code below in the header file, after the #imports and before @interface
// avfoundation copy-paste code
static void * CapturingStillImageContext = &CapturingStillImageContext;
static void * RecordingContext = &RecordingContext;
static void * SessionRunningAndDeviceAuthorizedContext = &SessionRunningAndDeviceAuthorizedContext;

// avfoundation, include the code below after @interface

// avf - Session management.
@property (nonatomic) dispatch_queue_t sessionQueue; // Communicate with the session and other session objects on this queue.
@property (nonatomic) AVCaptureSession *session;
@property (nonatomic) AVCaptureDeviceInput *videoDeviceInput;
@property (nonatomic) AVCaptureMovieFileOutput *movieFileOutput;
@property (nonatomic) AVCaptureStillImageOutput *stillImageOutput;

// avf - Utilities.
@property (nonatomic) UIBackgroundTaskIdentifier backgroundRecordingID;
@property (nonatomic, getter = isDeviceAuthorized) BOOL deviceAuthorized;
@property (nonatomic, readonly, getter = isSessionRunningAndDeviceAuthorized) BOOL sessionRunningAndDeviceAuthorized;
@property (nonatomic) BOOL lockInterfaceRotation;
@property (nonatomic) id runtimeErrorHandlingObserver;
In the .m file:

#pragma mark - AV Foundation

- (BOOL)isSessionRunningAndDeviceAuthorized
{
    return [[self session] isRunning] && [self isDeviceAuthorized];
}

+ (NSSet *)keyPathsForValuesAffectingSessionRunningAndDeviceAuthorized
{
    return [NSSet setWithObjects:@"session.running", @"deviceAuthorized", nil];
}
// call the following method from viewDidLoad
- (void)CreateAVCaptureSession
{
    // Create the AVCaptureSession
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];
        NSError *error = nil;
        AVCaptureDevice *videoDevice = [ViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionFront];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];
            dispatch_async(dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on the main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer's connection with other session manipulation.
            });
        }
        /* AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }
        */
        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
                [connection setEnablesVideoStabilizationWhenAvailable:YES];
            [self setMovieFileOutput:movieFileOutput];
        }
        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ([session canAddOutput:stillImageOutput])
        {
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            [session addOutput:stillImageOutput];
            [self setStillImageOutput:stillImageOutput];
        }
    });
}
// call the method below from viewWillAppear
- (void)AVFoundationStartSession
{
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
        [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak ViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            ViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restart the session, since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
            });
        }]];
        [[self session] startRunning];
    });
}
// call the method below from viewDidDisappear
- (void)AVFoundationStopSession
{
    dispatch_async([self sessionQueue], ^{
        [[self session] stopRunning];
        [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
        [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];
        [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
        [self removeObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" context:CapturingStillImageContext];
        [self removeObserver:self forKeyPath:@"movieFileOutput.recording" context:RecordingContext];
    });
}
- (BOOL)prefersStatusBarHidden
{
    return YES;
}

- (BOOL)shouldAutorotate
{
    // Disable autorotation of the interface when recording is in progress.
    return ![self lockInterfaceRotation];
}

- (NSUInteger)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskAll;
}

- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    // [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)toInterfaceOrientation];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == CapturingStillImageContext)
    {
        BOOL isCapturingStillImage = [change[NSKeyValueChangeNewKey] boolValue];
        if (isCapturingStillImage)
        {
            [self runStillImageCaptureAnimation];
        }
    }
    else if (context == RecordingContext)
    {
        BOOL isRecording = [change[NSKeyValueChangeNewKey] boolValue];
        dispatch_async(dispatch_get_main_queue(), ^{
            if (isRecording)
            {
                // [[self cameraButton] setEnabled:NO];
                // [[self recordButton] setTitle:NSLocalizedString(@"Stop", @"Recording button stop title") forState:UIControlStateNormal];
                // [[self recordButton] setEnabled:YES];
            }
            else
            {
                // [[self cameraButton] setEnabled:YES];
                // [[self recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
                // [[self recordButton] setEnabled:YES];
            }
        });
    }
    else if (context == SessionRunningAndDeviceAuthorizedContext)
    {
        BOOL isRunning = [change[NSKeyValueChangeNewKey] boolValue];
        dispatch_async(dispatch_get_main_queue(), ^{
            if (isRunning)
            {
                // [[self cameraButton] setEnabled:YES];
                // [[self recordButton] setEnabled:YES];
                // [[self stillButton] setEnabled:YES];
            }
            else
            {
                // [[self cameraButton] setEnabled:NO];
                // [[self recordButton] setEnabled:NO];
                // [[self stillButton] setEnabled:NO];
            }
        });
    }
    else
    {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
#pragma mark Actions

- (IBAction)toggleMovieRecording:(id)sender
{
    // [[self recordButton] setEnabled:NO];
    dispatch_async([self sessionQueue], ^{
        if (![[self movieFileOutput] isRecording])
        {
            [self setLockInterfaceRotation:YES];
            if ([[UIDevice currentDevice] isMultitaskingSupported])
            {
                // Set up a background task. This is needed because the captureOutput:didFinishRecordingToOutputFileAtURL: callback is not received until AVCam returns to the foreground unless you request background execution time. This also ensures that there will be time to write the file to the assets library when AVCam is backgrounded. To conclude this background execution, -endBackgroundTask is called in -recorder:recordingDidFinishToOutputFileURL:error: after the recorded file has been saved.
                [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
            }
            // Update the orientation on the movie file output video connection before starting recording.
            // [[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

            // Turn OFF flash for video recording
            [ViewController setFlashMode:AVCaptureFlashModeOff forDevice:[[self videoDeviceInput] device]];

            // Start recording to a temporary file.
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]];
            [[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
        }
        else
        {
            [[self movieFileOutput] stopRecording];
        }
    });
}
- (IBAction)changeCamera:(id)sender
{
    // [[self cameraButton] setEnabled:NO];
    // [[self recordButton] setEnabled:NO];
    // [[self stillButton] setEnabled:NO];
    dispatch_async([self sessionQueue], ^{
        AVCaptureDevice *currentVideoDevice = [[self videoDeviceInput] device];
        AVCaptureDevicePosition preferredPosition = AVCaptureDevicePositionUnspecified;
        AVCaptureDevicePosition currentPosition = [currentVideoDevice position];
        switch (currentPosition)
        {
            case AVCaptureDevicePositionUnspecified:
                preferredPosition = AVCaptureDevicePositionBack;
                break;
            case AVCaptureDevicePositionBack:
                preferredPosition = AVCaptureDevicePositionFront;
                break;
            case AVCaptureDevicePositionFront:
                preferredPosition = AVCaptureDevicePositionBack;
                break;
        }
        AVCaptureDevice *videoDevice = [ViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:preferredPosition];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];

        [[self session] beginConfiguration];
        [[self session] removeInput:[self videoDeviceInput]];
        if ([[self session] canAddInput:videoDeviceInput])
        {
            [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentVideoDevice];
            [ViewController setFlashMode:AVCaptureFlashModeAuto forDevice:videoDevice];
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
            [[self session] addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];
        }
        else
        {
            [[self session] addInput:[self videoDeviceInput]];
        }
        [[self session] commitConfiguration];

        dispatch_async(dispatch_get_main_queue(), ^{
            // [[self cameraButton] setEnabled:YES];
            // [[self recordButton] setEnabled:YES];
            // [[self stillButton] setEnabled:YES];
        });
    });
}
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        // [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for still capture
        [ViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                // do something good with the saved image
                [self saveImageToParse:image];
            }
        }];
    });
}

- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer
{
    // CGPoint devicePoint = [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] captureDevicePointOfInterestForPoint:[gestureRecognizer locationInView:[gestureRecognizer view]]];
    // [self focusWithMode:AVCaptureFocusModeAutoFocus exposeWithMode:AVCaptureExposureModeAutoExpose atDevicePoint:devicePoint monitorSubjectAreaChange:YES];
}

- (void)subjectAreaDidChange:(NSNotification *)notification
{
    CGPoint devicePoint = CGPointMake(.5, .5);
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
#pragma mark File Output Delegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error)
        NSLog(@"%@", error);

    [self setLockInterfaceRotation:NO];

    // Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO (which happens sometime after this method returns).
    UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
    [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error)
            NSLog(@"%@", error);
        [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];
        if (backgroundRecordingID != UIBackgroundTaskInvalid)
            [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
    }];
}
#pragma mark Device Configuration

- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
    dispatch_async([self sessionQueue], ^{
        AVCaptureDevice *device = [[self videoDeviceInput] device];
        NSError *error = nil;
        if ([device lockForConfiguration:&error])
        {
            if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode])
            {
                [device setFocusMode:focusMode];
                [device setFocusPointOfInterest:point];
            }
            if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode])
            {
                [device setExposureMode:exposureMode];
                [device setExposurePointOfInterest:point];
            }
            [device setSubjectAreaChangeMonitoringEnabled:monitorSubjectAreaChange];
            [device unlockForConfiguration];
        }
        else
        {
            NSLog(@"%@", error);
        }
    });
}
+ (void)setFlashMode:(AVCaptureFlashMode)flashMode forDevice:(AVCaptureDevice *)device
{
    if ([device hasFlash] && [device isFlashModeSupported:flashMode])
    {
        NSError *error = nil;
        if ([device lockForConfiguration:&error])
        {
            [device setFlashMode:flashMode];
            [device unlockForConfiguration];
        }
        else
        {
            NSLog(@"%@", error);
        }
    }
}

+ (AVCaptureDevice *)deviceWithMediaType:(NSString *)mediaType preferringPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:mediaType];
    AVCaptureDevice *captureDevice = [devices firstObject];
    for (AVCaptureDevice *device in devices)
    {
        if ([device position] == position)
        {
            captureDevice = device;
            break;
        }
    }
    return captureDevice;
}
#pragma mark UI

- (void)runStillImageCaptureAnimation
{
    /*
    dispatch_async(dispatch_get_main_queue(), ^{
        [[[self previewView] layer] setOpacity:0.0];
        [UIView animateWithDuration:.25 animations:^{
            [[[self previewView] layer] setOpacity:1.0];
        }];
    });
    */
}

- (void)checkDeviceAuthorizationStatus
{
    NSString *mediaType = AVMediaTypeVideo;
    [AVCaptureDevice requestAccessForMediaType:mediaType completionHandler:^(BOOL granted) {
        if (granted)
        {
            // Granted access to mediaType
            [self setDeviceAuthorized:YES];
        }
        else
        {
            // Not granted access to mediaType
            dispatch_async(dispatch_get_main_queue(), ^{
                [[[UIAlertView alloc] initWithTitle:@"AVCam!"
                                            message:@"AVCam doesn't have permission to use the camera, please change your privacy settings"
                                           delegate:self
                                  cancelButtonTitle:@"OK"
                                  otherButtonTitles:nil] show];
                [self setDeviceAuthorized:NO];
            });
        }
    }];
}

AVCaptureOutput Delegate

I'm creating an application that uses the
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { }
method, but it is never called. To explain further, the application uses code from this tutorial to create a video recording app. When I ran the tutorial's code in Xcode, it called the function above; but when I copied it over into my application, without modifying it in any way, it was never called.
Here's the code used:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSError *error = nil;

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone){
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    } else {
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    }

    // Select a video device, make an input
    AVCaptureDevice *device;
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;

    // find the front-facing camera
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            device = d;
            isUsingFrontFacingCamera = YES;
            break;
        }
    }
    // fall back to the default camera.
    if (nil == device)
    {
        isUsingFrontFacingCamera = NO;
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    // get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!error) {
        // add the input to the session
        if ([session canAddInput:deviceInput]){
            [session addInput:deviceInput];
        }

        previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        CALayer *rootLayer = [previewView layer];
        [rootLayer setMasksToBounds:YES];
        [previewLayer setFrame:[rootLayer bounds]];
        [rootLayer addSublayer:previewLayer];
        [session startRunning];
    }
    session = nil;
    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:
                                  [NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil
                                                  cancelButtonTitle:@"Dismiss"
                                                  otherButtonTitles:nil];
        [alertView show];
        [self teardownAVCapture];
    }

    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyLow, CIDetectorAccuracy, nil];
    faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

    // Make a video data output
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

    // we want BGRA; both CoreGraphics and OpenGL work well with 'BGRA'
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                       [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [videoDataOutput setVideoSettings:rgbOutputSettings];
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked

    // create a serial dispatch queue used for the sample buffer delegate
    // a serial dispatch queue must be used to guarantee that video frames will be delivered in order
    // see the header doc for setSampleBufferDelegate:queue: for more information
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];

    if ([session canAddOutput:videoDataOutput]){
        [session addOutput:videoDataOutput];
    }
    // get the output for doing face detection.
    [[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
    //[self setupCaptureSession];
}
Okay, I think I know what the problem is. You call [session startRunning] before you have set up your videoDataOutput (and the local session variable is even set to nil right after, so the output is never added at all). A session with no video data output will not call the AVCaptureOutput delegate.
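In other words, finish configuring the session before starting it, and keep the session alive. A sketch using the names from the question (self.session is an assumed strong property, not part of the original code):

// add the input first
if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}

// then configure the video data output and its delegate
videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}

// only now start the session, and keep a strong reference instead of nil-ing it
self.session = session;
[session startRunning];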