AVCaptureOutput Delegate - objective-c

I'm creating an application that uses the -(void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection delegate method, but that method is never called. To explain further, the application uses code from this tutorial to create a video recording app. When I ran the tutorial's code in Xcode, the method above was called, but when I copied it into my application, without modifying it in any way, it was never called.
Here's the code used:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    } else {
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    }

    // Select a video device, make an input
    AVCaptureDevice *device;
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;

    // find the front facing camera
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            device = d;
            isUsingFrontFacingCamera = YES;
            break;
        }
    }

    // fall back to the default camera.
    if (nil == device) {
        isUsingFrontFacingCamera = NO;
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    // get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!error) {
        // add the input to the session
        if ([session canAddInput:deviceInput]) {
            [session addInput:deviceInput];
        }

        previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

        CALayer *rootLayer = [previewView layer];
        [rootLayer setMasksToBounds:YES];
        [previewLayer setFrame:[rootLayer bounds]];
        [rootLayer addSublayer:previewLayer];
        [session startRunning];
    }
    session = nil;

    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:
                                  [NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil
                                                  cancelButtonTitle:@"Dismiss"
                                                  otherButtonTitles:nil];
        [alertView show];
        [self teardownAVCapture];
    }

    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyLow, CIDetectorAccuracy, nil];
    faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

    // Make a video data output
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

    // we want BGRA, both CoreGraphics and OpenGL work well with 'BGRA'
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                       [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [videoDataOutput setVideoSettings:rgbOutputSettings];
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked

    // create a serial dispatch queue used for the sample buffer delegate
    // a serial dispatch queue must be used to guarantee that video frames will be delivered in order
    // see the header doc for setSampleBufferDelegate:queue: for more information
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];

    if ([session canAddOutput:videoDataOutput]) {
        [session addOutput:videoDataOutput];
    }

    // get the output for doing face detection.
    [[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];

    //[self setupCaptureSession];
}

Okay, I think I know what the problem is. You call [session startRunning] before you set up your videoDataOutput (and you also set session to nil right after the input block, so the output is never actually added to the running session). A session with no video data output will not call the AVCaptureOutput delegate.
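Just as a sketch, not a drop-in fix: the tail of viewDidLoad could be reordered so the output exists before the session starts, assuming the same session, videoDataOutput, and videoDataOutputQueue variables used in the question:

// Sketch only: configure and attach the output first, then start the session.
videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoDataOutput setVideoSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey :
                                         [NSNumber numberWithInt:kCMPixelFormat_32BGRA] }];
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];

videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];

if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}

// Only start the session once inputs and outputs are attached,
// and keep a strong reference to it (don't set it to nil afterwards).
[session startRunning];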

Related

How to record video from iPhone camera in iOS 9 objective c?

Hello, I need to build an app that records video from the iPhone camera.
I searched the net and found nothing that works in iOS 9.
I got this code from this git project:
https://github.com/versluis/Video-Recorder
This code opens the camera but does not allow me to take a video.
- (IBAction)recordButton:(id)sender {
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.delegate = self;
        picker.allowsEditing = NO;
        NSArray *mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
        picker.mediaTypes = mediaTypes;
        [self presentViewController:picker animated:YES completion:nil];
    } else {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:nil message:@"I'm afraid there's no camera on this device!" delegate:nil cancelButtonTitle:@"Dang!" otherButtonTitles:nil, nil];
        [alertView show];
    }
}

- (IBAction)playbackButton:(id)sender {
    // pick a video from the documents directory
    NSURL *video = [self grabFileURL:@"video.mov"];
    // create a movie player view controller
    MPMoviePlayerViewController *controller = [[MPMoviePlayerViewController alloc] initWithContentURL:video];
    [controller.moviePlayer prepareToPlay];
    [controller.moviePlayer play];
    // and present it
    [self presentMoviePlayerViewControllerAnimated:controller];
}

#pragma mark - Delegate Methods

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    // user hit cancel
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // grab our movie URL
    NSURL *chosenMovie = [info objectForKey:UIImagePickerControllerMediaURL];
    // save it to the documents directory
    NSURL *fileURL = [self grabFileURL:@"video.mov"];
    NSData *movieData = [NSData dataWithContentsOfURL:chosenMovie];
    [movieData writeToURL:fileURL atomically:YES];
    // save it to the Camera Roll
    UISaveVideoAtPathToSavedPhotosAlbum([chosenMovie path], nil, nil, nil);
    // and dismiss the picker
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (NSURL *)grabFileURL:(NSString *)fileName {
    // find Documents directory
    NSURL *documentsURL = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
    // append a file name to it
    documentsURL = [documentsURL URLByAppendingPathComponent:fileName];
    return documentsURL;
}
Ray gave a pretty good tutorial about this. Hope it helps.
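If the picker opens but recording still isn't offered, one extra thing worth checking (a hedged suggestion, not part of the original answer) is the capture mode and quality on the picker in recordButton:, right before presenting it. The property names below are standard UIImagePickerController API; the values are only examples:

// Optional tweaks before [self presentViewController:picker ...] (example values).
picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo; // make sure video, not photo, is active
picker.videoQuality = UIImagePickerControllerQualityTypeHigh;             // pick whatever quality you need
picker.videoMaximumDuration = 60.0;                                       // optional cap, in seconds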

Monitor current dB level in Objective-C

I'm trying to display the decibel level of incoming sound from the iPhone microphone. Everything works in the simulator, but it does not work on an actual iOS device; it's as if the iPhone is not detecting any sound. Here is my code:
- (void)viewDidLoad {
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        if (granted) {
            [self setup];
            NSLog(@"granted");
        } else {
            NSLog(@"denied");
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Microphone Access Denied"
                                                            message:@"You must allow microphone access in Settings > Privacy > Microphone"
                                                           delegate:nil
                                                  cancelButtonTitle:@"OK"
                                                  otherButtonTitles:nil];
            [alert show];
        }
    }];
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (void)setup {
    // record audio to /dev/null
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    // some settings
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt:AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    // create an AVAudioRecorder
    NSError *error;
    self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
    [self.recorder setMeteringEnabled:YES];
    if (self.recorder) {
        [self.recorder prepareToRecord];
        self.recorder.meteringEnabled = YES;
        [self.recorder record];
        self.levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.01f target:self selector:@selector(levelTimerCallback:) userInfo:nil repeats:YES];
        [self.recorder updateMeters];
    }
}

- (void)levelTimerCallback:(NSTimer *)timer {
    [self.recorder updateMeters];
    // here is the dB!
    float peakDecibels = [self.recorder peakPowerForChannel:1];
    NSLog(@"peak: %f", peakDecibels);
    float averagePower = [self.recorder averagePowerForChannel:1];
    NSLog(@"averagePower: %f", averagePower);
    [self.dbLabel setText:[NSString stringWithFormat:@"db %f", peakDecibels]];
}
You need to set your audio session appropriately:
NSError *error = nil;
BOOL success = [[AVAudioSession sharedInstance]
                setCategory:AVAudioSessionCategoryRecord
                      error:&error];
if (!success && error) {
    NSLog(@"error %@", error);
}
This should be done before you start recording.
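If setting the category alone isn't enough on the device, it may also help to activate the session explicitly before [self.recorder record] runs. This is a hedged sketch, not part of the original answer:

// Assumption: some devices/OS versions also need the session activated explicitly.
NSError *activationError = nil;
if (![[AVAudioSession sharedInstance] setActive:YES error:&activationError]) {
    NSLog(@"could not activate audio session: %@", activationError);
}

Also note that peakPowerForChannel: and averagePowerForChannel: report dBFS values (roughly -160 for silence up to 0 for full scale), so a reading near 0 means loud, not silent.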

Coding Multiple Leaderboards

I am making a game in which the player can achieve a positive high score or a negative low score depending on the choices they make. The high score has been working fine, but I'm having trouble with the low score.
-(void)authenticateLocalPlayer {
    GKLocalPlayer *localPlayer = [GKLocalPlayer localPlayer];
    localPlayer.authenticateHandler = ^(UIViewController *viewController, NSError *error) {
        if (viewController != nil) {
            [self presentViewController:viewController animated:YES completion:nil];
        }
        else {
            if ([GKLocalPlayer localPlayer].authenticated) {
                _gameCenterEnabled = YES;
                // Get the default leaderboard identifier.
                [[GKLocalPlayer localPlayer] loadDefaultLeaderboardIdentifierWithCompletionHandler:^(NSString *leaderboardIdentifier, NSError *error) {
                    if (error != nil) {
                        NSLog(@"%@", [error localizedDescription]);
                    }
                    else {
                        _leaderboardIdentifier = leaderboardIdentifier;
                    }
                }];
            }
            else {
                _gameCenterEnabled = NO;
            }
        }
    };
}

-(void)reportScore {
    GKScore *highscore = [[GKScore alloc] initWithLeaderboardIdentifier:_leaderboardIdentifier];
    highscore.value = HighScoreNumber;
    [GKScore reportScores:@[highscore] withCompletionHandler:^(NSError *error) {
        if (error != nil) {
            NSLog(@"%@", [error localizedDescription]);
        }
    }];

-(void)showLeaderboardAndAchievements:(BOOL)shouldShowLeaderboard {
    GKGameCenterViewController *gcViewController = [[GKGameCenterViewController alloc] init];
    gcViewController.gameCenterDelegate = self;
    if (shouldShowLeaderboard) {
        gcViewController.viewState = GKGameCenterViewControllerStateLeaderboards;
        gcViewController.leaderboardIdentifier = _leaderboardIdentifier;
    }
    else {
        gcViewController.viewState = GKGameCenterViewControllerStateAchievements;
    }
    [self presentViewController:gcViewController animated:YES completion:nil];
}
You'll notice _leaderboardIdentifier; it works for reporting scores to the default leaderboard, but when I try to make it work for two different leaderboards, the app shuts down.
I've tried adding this to "Report Score":
GKScore *lowscore = [[GKScore alloc] initWithLeaderboardIdentifier:_leaderboardIdentifier];
lowscore.value = LowScoreNumber;
[GKScore reportScores:@[lowscore] withCompletionHandler:^(NSError *error) {
    if (error != nil) {
        NSLog(@"%@", [error localizedDescription]);
    }
}];
}
Then I change the leaderboard identifiers to match iTunes Connect, but I'm not sure how I need to change authenticateLocalPlayer and showLeaderboardAndAchievements:.
I have some experience with Game Center. I have two games on the App Store that have multiple leaderboards (Squared!! and Primes!).
Instead of having one method for each leaderboard, I made one method for submitting scores. Here is that method:
-(void)submitScore:(int64_t)score Leaderboard:(NSString *)leaderboard
{
    //1: Check if Game Center features are enabled
    if (!_gameCenterFeaturesEnabled) {
        return;
    }
    //2: Create a GKScore object
    GKScore *gkScore = [[GKScore alloc] initWithLeaderboardIdentifier:leaderboard];
    //3: Set the score value
    gkScore.value = score;
    //4: Send the score to Game Center
    [gkScore reportScoreWithCompletionHandler:^(NSError *error) {
        [self setLastError:error];
        BOOL success = (error == nil);
        if ([_delegate respondsToSelector:@selector(onScoresSubmitted:)]) {
            [_delegate onScoresSubmitted:success];
        }
    }];
}
When you want to submit your high scores all you have to do is add something like:
[[GCHelper sharedGameKitHelper] submitScore:myLowScore Leaderboard:TELS];
[[GCHelper sharedGameKitHelper] submitScore:myHighScore Leaderboard:TEHS];
GCHelper is the class that contains my submitScore:Leaderboard: method.
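TELS and TEHS are simply the leaderboard identifier strings configured in iTunes Connect. A minimal sketch of how they might be defined (the identifier values here are hypothetical and must match your own setup):

// Hypothetical identifiers - replace with the IDs you created in iTunes Connect.
static NSString * const TEHS = @"com.yourcompany.yourgame.highscores";
static NSString * const TELS = @"com.yourcompany.yourgame.lowscores";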
To view your leaderboards or achievements within your app try this:
- (void)presentLeaderboards {
    GKGameCenterViewController *gameCenterController = [[GKGameCenterViewController alloc] init];
    gameCenterController.viewState = GKGameCenterViewControllerStateLeaderboards;
    gameCenterController.gameCenterDelegate = self;
    [self presentViewController:gameCenterController animated:YES completion:nil];
}

- (void)gameCenterViewControllerDidFinish:(GKGameCenterViewController *)gameCenterViewController {
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)presentAchievements {
    GKGameCenterViewController *gameCenterController = [[GKGameCenterViewController alloc] init];
    gameCenterController.viewState = GKGameCenterViewControllerStateAchievements;
    gameCenterController.gameCenterDelegate = self;
    [self presentViewController:gameCenterController animated:YES completion:nil];
}
I hope this answers your question!

Can't play system sounds after capturing audio / video

I'm recording audio/video using AVFoundation, and I need to play a sound, using system sounds, before I start capturing video/audio. This works correctly the first time, but when I try to do it a second time, the system audio doesn't play. My guess is that something in AVFoundation is not being released correctly.
In my application delegate, I have this code in the applicationDidFinishLaunching method:
VKRSAppSoundPlayer *aPlayer = [[VKRSAppSoundPlayer alloc] init];
[aPlayer addSoundWithFilename:@"sound1" andExtension:@"caf"];
self.appSoundPlayer = aPlayer;
[aPlayer release];
and also this method
- (void)playSound:(NSString *)sound
{
    [appSoundPlayer playSound:sound];
}
As you can see I'm using VKRSAppSoundPlayer, which works great!
In a view, I have this code:
- (void)startSession
{
    self.session = [[AVCaptureSession alloc] init];

    [session beginConfiguration];
    if ([session canSetSessionPreset:AVCaptureSessionPreset640x480])
        session.sessionPreset = AVCaptureSessionPresetMedium;
    [session commitConfiguration];

    CALayer *viewLayer = [videoPreviewView layer];
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = viewLayer.bounds;
    [viewLayer addSublayer:captureVideoPreviewLayer];

    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCameraIfAvailable] error:nil];
    self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[self audioDevice] error:nil];

    if (videoInput) {
        self.videoOutput = [[AVCaptureMovieFileOutput alloc] init];
        [session addOutput:videoOutput];
        //[videoOutput release];

        if ([session canAddInput:videoInput]) {
            //[session beginConfiguration];
            [session addInput:videoInput];
        }
        //[videoInput release];

        [session removeInput:[self audioInput]];
        if ([session canAddInput:audioInput]) {
            [session addInput:audioInput];
        }
        //[audioInput release];

        if ([session canAddInput:audioInput])
            [session addInput:audioInput];

        NSLog(@"startRunning!");
        [session startRunning];

        [self startRecording];
        if (![self recordsVideo])
            [self showAlertWithTitle:@"Video Recording Unavailable" msg:@"This device can't record video."];
    }
}

- (void)stopSession
{
    [session stopRunning];
    [session release];
}

- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    Boolean cameraFound = false;

    for (AVCaptureDevice *device in videoDevices)
    {
        NSLog(@"1 frontFacingCameraIfAvailable %d", device.position);
        if (device.position == AVCaptureDevicePositionBack) {
            NSLog(@"1 frontFacingCameraIfAvailable FOUND");
            captureDevice = device;
            cameraFound = true;
            break;
        }
    }

    if (cameraFound == false) {
        for (AVCaptureDevice *device in videoDevices)
        {
            NSLog(@"2 frontFacingCameraIfAvailable %d", device.position);
            if (device.position == AVCaptureDevicePositionFront) {
                NSLog(@"2 frontFacingCameraIfAvailable FOUND");
                captureDevice = device;
                break;
            }
        }
    }
    return captureDevice;
}

- (AVCaptureDevice *)audioDevice
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
    if ([devices count] > 0) {
        return [devices objectAtIndex:0];
    }
    return nil;
}

- (void)startRecording
{
#if _Multitasking_
    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
    }
#endif
    [videoOutput startRecordingToOutputFileURL:[self generatenewVideoPath]
                             recordingDelegate:self];
}

- (void)stopRecording
{
    [videoOutput stopRecording];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSFileManager *man = [[NSFileManager alloc] init];
    NSDictionary *attrs = [man attributesOfItemAtPath:[outputFileURL path] error:NULL];
    NSString *fileSize = [NSString stringWithFormat:@"%llu", [attrs fileSize]];

    // close this screen
    [self exitScreen];
}

-(BOOL)recordsVideo
{
    AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
                                                                    fromConnections:[videoOutput connections]];
    return [videoConnection isActive];
}

-(BOOL)recordsAudio
{
    AVCaptureConnection *audioConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeAudio
                                                                    fromConnections:[videoOutput connections]];
    return [audioConnection isActive];
}
If I call [videoInput release]; and [audioInput release]; I get a bad access error, which is why they are commented out. This may be part of the issue.
If I just play the system sound n times, it works, but once I go through the recording code first, it won't play after that.
Any ideas?
The proper way to release AVCaptureSession is the following:
- (void)destroySession {
    // Notify the view that the session will end
    if ([delegate respondsToSelector:@selector(captureManagerSessionWillEnd:)]) {
        [delegate captureManagerSessionWillEnd:self];
    }

    // remove the device inputs
    [session removeInput:[self videoInput]];
    [session removeInput:[self audioInput]];

    // release
    [session release];

    // remove AVCamRecorder
    [recorder release];

    // Notify the view that the session has ended
    if ([delegate respondsToSelector:@selector(captureManagerSessionEnded:)]) {
        [delegate captureManagerSessionEnded:self];
    }
}
If you're having some sort of release problem (bad access), I recommend taking your code out of your current "messy" project, putting it into a fresh project, and debugging the problem there.
When I had a similar problem, that's what I did. I shared it on GitHub; you might find this project useful: AVCam-CameraReleaseTest
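One more thing worth checking, offered only as a guess since it isn't covered above: a capture session with an audio input can leave the app's audio session configured for capture, so system sounds may stay silent until a playback-friendly category is restored after teardown. A minimal sketch, assuming it runs after stopSession/destroySession:

// Assumption: restore a playback-capable audio session once capture is torn down.
NSError *audioError = nil;
if (![[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&audioError]) {
    NSLog(@"could not set audio category: %@", audioError);
}
if (![[AVAudioSession sharedInstance] setActive:YES error:&audioError]) {
    NSLog(@"could not reactivate audio session: %@", audioError);
}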

GameCenter Invitation Handler

I'm trying to implement multiplayer, using the sample from Game Center - Sending and receiving data.
Everything seems okay, but the Apple documentation also mentions an invitation handler.
[GKMatchmaker sharedMatchmaker].inviteHandler = ^(GKInvite *acceptedInvite, NSArray *playersToInvite) {
    // Insert application-specific code here to clean up any games in progress.
    if (acceptedInvite) {
        GKMatchmakerViewController *mmvc = [[[GKMatchmakerViewController alloc] initWithInvite:acceptedInvite] autorelease];
        mmvc.matchmakerDelegate = self;
        [self presentModalViewController:mmvc animated:YES];
    } else if (playersToInvite) {
        GKMatchRequest *request = [[[GKMatchRequest alloc] init] autorelease];
        request.minPlayers = 2;
        request.maxPlayers = 4;
        request.playersToInvite = playersToInvite;
        GKMatchmakerViewController *mmvc = [[[GKMatchmakerViewController alloc] initWithMatchRequest:request] autorelease];
        mmvc.matchmakerDelegate = self;
        [self presentModalViewController:mmvc animated:YES];
    }
};
The problem is quite simple: I do not know where to add this code.
As stated in the docs:

Your application should set the invitation handler as early as possible after your application is launched; an appropriate place to set the handler is in the completion block you provided that executes after the local player is authenticated.
Somewhere in your code, you should have authenticated the local player with something like this
[[GKLocalPlayer localPlayer] authenticateWithCompletionHandler:^(NSError *error) {
    if (error == nil) {
        // Insert your piece of code here
    } else {
        // Handle the error
    }
}];
Hope that helps
My code is below, and it works very well. In authenticateLocalUser, add the code below:
[[GKLocalPlayer localPlayer] authenticateWithCompletionHandler:^(NSError *error) {
    [GKMatchmaker sharedMatchmaker].inviteHandler = ^(GKInvite *acceptedInvite, NSArray *playersToInvite) { // Add for invite handler
        // Insert application-specific code here to clean up any games in progress.
        if (acceptedInvite) {
            GKMatchmakerViewController *mmvc = [[GKMatchmakerViewController alloc] initWithInvite:acceptedInvite];
            mmvc.matchmakerDelegate = self;
            // [self presentModalViewController:mmvc animated:YES];
            [_delegate matchStart];
        } else if (playersToInvite) {
            GKMatchRequest *request = [[GKMatchRequest alloc] init];
            request.minPlayers = 2;
            request.maxPlayers = 2;
            request.playersToInvite = playersToInvite;
            GKMatchmakerViewController *mmvc = [[GKMatchmakerViewController alloc] initWithMatchRequest:request];
            mmvc.matchmakerDelegate = self;
            // [self presentModalViewController:mmvc animated:YES];
            [_delegate matchStart];
        }
    };

    [self callDelegateOnMainThread:@selector(processGameCenterAuth:) withArg:NULL error:error];
}];