In the iOS Speech framework, SFSpeechRecognizer and AVAudioEngine work together to perform speech recognition, but the app sometimes crashes when the microphone is in use by another process.
self.audioEngine = [[AVAudioEngine alloc] init];
AVAudioInputNode *inputNode = self.audioEngine.inputNode;
AVAudioFormat *nativeAudioFormat = [inputNode outputFormatForBus:0];
__weak typeof(self) weakSelf = self;
[inputNode installTapOnBus:0 bufferSize:1024 format:nativeAudioFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    [weakSelf.recognitionRequest appendAudioPCMBuffer:buffer];
}];
[self.audioEngine prepare];
NSError *error = nil;
[self.audioEngine startAndReturnError:&error];
Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'
How are you setting your AVAudioSession? This error is usually caused when that isn't set properly.
That is, you need to run code like the below (or similar, according to your use case) before each time you use the mic, to ensure the audio session is set up properly. If it isn't (for example, you are using the .playback category and then try to use the microphone), you'll get the IsFormatSampleRateAndChannelCountValid(format) crash.
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord, options: .defaultToSpeaker)
    try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
} catch let error as NSError {
    print("ERROR:", error)
}
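If the project is in Objective-C like the code in the question, the equivalent setup looks like the sketch below. The extra format check is my own addition (an assumption, not part of the original answer): when another process owns the microphone, the input node can report a zero sample rate or channel count, which is exactly the condition the IsFormatSampleRateAndChannelCountValid assertion trips on, so it is worth bailing out instead of installing the tap.
NSError *sessionError = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
              withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                    error:&sessionError];
[audioSession setActive:YES
            withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                  error:&sessionError];
if (sessionError != nil) {
    NSLog(@"ERROR: %@", sessionError);
}
// Assumed guard: if another process holds the mic, the input format can come back
// with a zero sample rate / channel count, which is what the crash asserts on.
AVAudioFormat *nativeAudioFormat = [self.audioEngine.inputNode outputFormatForBus:0];
if (nativeAudioFormat.sampleRate == 0 || nativeAudioFormat.channelCount == 0) {
    return; // bail out instead of crashing in installTapOnBus:
}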
I have an iOS app with a simple UIView placed in the view controller. I am trying to show the camera feed of the front-facing camera in the UIView. I am not trying to take a picture or record a video; I simply want to show the live feed in a UIView.
I have tried to implement AVCaptureVideoPreviewLayer; however, the feed I get is blank. Nothing seems to happen. Here is my code:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input;
@try {
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
} @catch (NSException *exception) {
    NSLog(@"Error; %@", error);
} @finally {
    if (error == nil) {
        if ([session canAddInput:input]) {
            [session addInput:input];
            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            if ([session canAddOutput:stillImageOutput]) {
                [session setSessionPreset:AVCaptureSessionPresetHigh];
                [session addOutput:stillImageOutput];
                AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
                [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
                [backgroundStreamView.layer addSublayer:previewLayer];
                [session startRunning];
                NSLog(@"session running");
            } else {
                NSLog(@"cannot add output");
            }
        } else {
            NSLog(@"cannot add input");
        }
    } else {
        NSLog(@"general error: %@", error);
    }
}
The session runs perfectly fine; however, no video feed is shown. What am I doing wrong?
Managed to fix it myself; it turned out to be a fairly simple issue: I didn't specify the frame of the AVCaptureVideoPreviewLayer, and as a result it was not appearing (presumably because the frame defaults to zero size).
To fix this I set the frame to match the frame of my custom UIView:
[previewLayer setFrame:backgroundStreamView.bounds];
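One caveat worth adding (an assumption on my part, not something from the original fix): if the hosting view can change size, for example on rotation or under Auto Layout, the layer's frame has to be kept in sync as well. Assuming previewLayer and backgroundStreamView are stored as properties, a minimal sketch would be:
// Hypothetical layout hook; previewLayer and backgroundStreamView are assumed properties.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    self.previewLayer.frame = self.backgroundStreamView.bounds;
}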
Deprecation code fix
AVCaptureStillImageOutput is also deprecated, so to fix that, I replaced it with the AVCapturePhotoOutput class. Thus the code changed from:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
to the following:
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
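Note that AVCapturePhotoOutput takes its settings per capture rather than up front. As a rough sketch (the delegate wiring below is my own illustration, not part of the original answer), triggering and receiving a photo would look something like:
// Trigger a capture; settings are created fresh for each shot.
AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
[stillImageOutput capturePhotoWithSettings:settings delegate:self];

// AVCapturePhotoCaptureDelegate callback (iOS 11+).
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    if (error == nil) {
        NSData *imageData = [photo fileDataRepresentation];
        // Use imageData (e.g. create a UIImage) as needed.
    }
}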
I have used EKSpanThisEvent, and when I try to remove a single event from the device calendar with it, the app crashes. If I use EKSpanFutureEvents instead, it does not crash, but it removes all future events with the same eventIdentifier, even though I am only trying to remove a single event from the device calendar.
The crash detail is as follows:
Assertion failure in -[EKEvent _deleteThisOccurrence]
Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'I screwed up somewhere in date calculation'
I tried every solution I could find, but unfortunately none of them helped. Any help will be highly appreciated.
CODE
if([identifier length] > 0)
{
    __block EKEventStore* store = [[EKEventStore alloc] init];
    [store requestAccessToEntityType:EKEntityTypeEvent completion:^(BOOL granted, NSError *error) {
        if (!granted) return;
        NSDate *eventdate = [commonUnit getSystemTimeZoneDateFromDate:strdate fromFormate:@"yyyy-MM-dd"];
        NSPredicate *predicate = [store predicateForEventsWithStartDate:eventdate
                                                                 endDate:[eventdate dateByAddingTimeInterval:86400]
                                                               calendars:nil];
        NSMutableArray *events = [NSMutableArray arrayWithArray:[store eventsMatchingPredicate:predicate]];
        NSPredicate *predicateidentifier = [NSPredicate predicateWithFormat:[NSString stringWithFormat:@"eventIdentifier = '%@'", identifier]];
        [events filterUsingPredicate:predicateidentifier];
        if (events.count > 0) {
            NSError* error = nil;
            [store removeEvent:events[0] span:status ? EKSpanFutureEvents : EKSpanThisEvent commit:YES error:&error];
            if(error == nil)
            {
                callbackBlock(TRUE);
            }
            else
            {
                callbackBlock(FALSE);
            }
        }
    }];
}
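One thing that might be worth trying (purely a sketch of an alternative, not a confirmed fix): fetch the occurrence directly by its identifier instead of filtering a predicate result, and remove only that occurrence. Note that for a recurring event, eventWithIdentifier: returns the first occurrence, so this may not behave as intended in every case.
// Hypothetical alternative lookup by identifier.
EKEvent *event = [store eventWithIdentifier:identifier];
if (event != nil) {
    NSError *removeError = nil;
    BOOL removed = [store removeEvent:event span:EKSpanThisEvent commit:YES error:&removeError];
    callbackBlock(removed && removeError == nil);
}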
So, I have a QTCaptureSession set up thusly:
//Setup Camera
NSError *error = nil;
cameraSession = [[QTCaptureSession alloc] init];
QTCaptureDevice *camera = [QTCaptureDevice deviceWithUniqueID: cameraID];
BOOL success = [camera open: &error];
if (!success || error)
{
    NSLog(@"Could not open device %@.", cameraID);
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}
//Setup Input Session
QTCaptureDeviceInput *cameraInput = [[QTCaptureDeviceInput alloc] initWithDevice: camera];
success = [cameraSession addInput: cameraInput error: &error];
if (!success || error)
{
    NSLog(@"Could not initialize input session.");
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}
//Setup Output
QTCaptureDecompressedVideoOutput *cameraOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
[cameraOutput setDelegate: self];
success = [cameraSession addOutput: cameraOutput error: &error];
if (!success || error)
{
    NSLog(@"Could not initialize output session.");
    NSLog(@"Error: %@", [error localizedDescription]);
    return nil;
}
And the QTCaptureDecompressedVideoOutput delegate's captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: is implemented thusly:
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
NSLog(#"starting convert\n");
}
I then start the capture processing using:
[cameraSession startRunning];
All of the variables initialize fine, and the session starts fine, but captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: never gets called.
Context
This is a command-line app, compiled with GCC. It's linked against the following frameworks:
Foundation
Cocoa
QTKit
QuartzCore
Relevant Miscellany
Frames are likely not being dropped, because captureOutput:didDropVideoFrameWithSampleBuffer:fromConnection: is also not getting called.
So, with some help from Mike Ash, I managed to figure out that my program was terminating immediately and not waiting for the delegate callback (which, according to Apple's QTKit docs, might occur on a separate thread).
My solution was to add a BOOL property named captureIsFinished to my object, then add this to the main() function:
//Wait Until Capture is Finished
while (![snap captureIsFinished])
{
[[NSRunLoop currentRunLoop] runUntilDate: [NSDate dateWithTimeIntervalSinceNow: 1]];
}
This effectively runs the app's run loop for 1 second, checks whether the capture is finished, and then runs for another second.
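For completeness, here is a minimal sketch of the flag itself; the property name is the one mentioned above, but setting it inside the frame callback is my assumption about where the capture should be considered finished:
// In the interface of the capture object ("snap" in main()):
@property (assign) BOOL captureIsFinished;

// In the delegate callback, flip the flag once a frame has been handled:
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
    // ... process the frame ...
    self.captureIsFinished = YES;
}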
I am trying to create a method that will return an ALAsset for a given asset URL. (I need to upload the asset later and want to do it outside the result block, using the result.)
+ (ALAsset*) assetForPhoto:(Photo*)photo
{
    ALAssetsLibrary* library = [[[ALAssetsLibrary alloc] init] autorelease];
    __block ALAsset* assetToReturn = nil;
    NSURL* url = [NSURL URLWithString:photo.assetUrl];
    NSLog(@"assetForPhoto: %@[", url);
    [library assetForURL:url resultBlock:^(ALAsset *asset)
    {
        NSLog(@"asset: %@", asset);
        assetToReturn = asset;
        NSLog(@"asset: %@ %d", assetToReturn, [assetToReturn retainCount]);
    } failureBlock:^(NSError *error)
    {
        assetToReturn = nil;
    }];
    NSLog(@"assetForPhoto: %@]", url);
    NSLog(@"assetToReturn: %@", assetToReturn); // Invalid access exception coming here.
    return assetToReturn;
}
The problem is assetToReturn gives an EXC_BAD_ACCESS.
Is there some problem if I try to assign pointers from inside the block? I saw some examples of blocks but they are always with simple types like integers etc.
A few things:
You must keep the ALAssetsLibrary instance around that created the ALAsset for as long as you use the asset.
You must register an observer for ALAssetsLibraryChangedNotification; when that notification is received, any ALAssets you have, and any other AssetsLibrary objects, will need to be refetched, as they will no longer be valid. This can happen at any time. (A registration sketch follows this list.)
You shouldn't expect the -assetForURL:resultBlock:failureBlock:, or any of the AssetsLibrary methods with a failureBlock: to be synchronous. They may need to prompt the user for access to the library and will not always have their blocks executed immediately. It's better to put actions that need to happen on success in the success block itself.
Only if you absolutely must make this method synchronous in your app (which I'd advise against) will you need to wait on a semaphore after calling assetForURL:resultBlock:failureBlock:, and optionally spin the run loop if you end up blocking the main thread.
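As a rough sketch of the change-notification point above (assetsLibraryDidChange: is an illustrative handler name of your own, not an API):
// Hypothetical observer registration against the library instance you keep around.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(assetsLibraryDidChange:)
                                             name:ALAssetsLibraryChangedNotification
                                           object:library];

- (void)assetsLibraryDidChange:(NSNotification *)note {
    // Drop any cached ALAsset / ALAssetsGroup objects and refetch them here.
}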
The following implementation should work as a synchronous call in all situations, but really, you should try very hard to make your code asynchronous instead.
- (ALAsset *)assetForURL:(NSURL *)url {
__block ALAsset *result = nil;
__block NSError *assetError = nil;
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[[self assetsLibrary] assetForURL:url resultBlock:^(ALAsset *asset) {
result = [asset retain];
dispatch_semaphore_signal(sema);
} failureBlock:^(NSError *error) {
assetError = [error retain];
dispatch_semaphore_signal(sema);
}];
if ([NSThread isMainThread]) {
while (!result && !assetError) {
[[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
}
}
else {
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
}
dispatch_release(sema);
[assetError release];
return [result autorelease];
}
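A call site for this is then straightforward; the snippet below is only illustrative and reuses the photo.assetUrl property from the question:
NSURL *assetURL = [NSURL URLWithString:photo.assetUrl];
ALAsset *asset = [self assetForURL:assetURL];
if (asset != nil) {
    // Upload or otherwise use the asset here, outside the result block.
}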
You should retain and autorelease the asset:
// ...
assetToReturn = [asset retain];
// ...
return [assetToReturn autorelease];
I have the following code:
BOOL success;
NSError *e = nil;
QTCaptureSession *session = [[QTCaptureSession alloc] init];
QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType: QTMediaTypeVideo];
success = [device open: &e];
if ( !success )
{
    NSLog(@"error opening input device: %@", e);
    return;
}
QTCaptureDeviceInput *input = [QTCaptureDeviceInput deviceInputWithDevice: device];
success = [session addInput: input error: &e];
if ( !success )
{
    NSLog(@"error adding input device to session: %@", e);
    return;
}
QTCaptureDecompressedVideoOutput *output = [[QTCaptureDecompressedVideoOutput alloc] init];
[output setDelegate: self];
success = [session addOutput: output error: &e];
if ( !success )
{
    NSLog(@"error adding output device to session: %@", e);
    return;
}
[session startRunning];
This is located in a bundle loaded at runtime and is part of a method that is the selector of an NSThread (i.e., it runs on a background thread, not the main thread).
My problem is that the call to addInput:error: never returns. What am I missing here?
The problem was that the main thread of the application was not using the run loop normally provided by NSApplicationMain() in Cocoa applications. Instead, I was handling events within my own while loop.
The solution was calling:
CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, YES);
See this thread on Apple's QuickTime mailing list for a more detailed explanation of the problem.
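A rough sketch of what that looks like inside a hand-rolled event loop (the loop structure and the keepRunning flag are assumptions about the original code):
// Pumping the run loop lets QTKit's internal callbacks and the capture session make progress.
while (keepRunning) {
    CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, YES);
    // ... handle your own events here ...
}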