My app reads audio and plays it back in a producer / consumer setup. The consumer thread requests new samples to render to hardware. The producer thread reads audio data from disk into its buffer using AVAssetReader. The producer thread runs in a loop, checking if more samples need to be read. The producer's buffer size is equal to 4 seconds of audio.
When I instruct my app to buffer audio, samples are read successfully without error. When I trigger my producer thread to begin rendering audio, the 4 second buffer is played back perfectly, then silence. Further investigation reveals that my asset reader fails at the start of playback, so no further samples were read after the initial buffering:
CMSampleBufferRef ref = [readaudiofile copyNextSampleBuffer];
if (ref == NULL && filereader.status == AVAssetReaderStatusFailed) {
NSLog(#"reader failed: %#", filereader.error);
}
// this produces:
// {NSLocalizedFailureReason=An unknown error occurred (-12785),
// NSUnderlyingError=0x161f60 "The operation couldn’t be completed.
// (OSStatus error -12785.)", NSLocalizedDescription=The operation
// could not be completed}
My producer thread code is identical to the functional code example MusicLibraryRemoteIODemo. My consumer thread code is different, but I've compared the two line by line for a couple of days and can't seem to find the hitch. Any idea what might be causing the AVAssetReader to fail?
This post describes a similar problem and concludes that the Audio Session needed to be set up properly (although it doesn't say how). Is there any connection between AVAudioSession configuration and the behavior of AVAssetReader?
I was getting this same error; I posted this answer on the other thread as well:
- (void)setupAudio {
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil];
    NSError *activationError = nil;
    [[AVAudioSession sharedInstance] setActive:YES error:&activationError];
    NSLog(@"setupAudio ACTIVATION ERROR IS %@", activationError);
    NSError *bufferDurationError = nil;
    [[AVAudioSession sharedInstance] setPreferredIOBufferDuration:0.1 error:&bufferDurationError];
    NSLog(@"setupAudio BUFFER DURATION ERROR IS %@", bufferDurationError);
}
Related
I have run into a bit of a conundrum with a service I am working on in objective-c. The purpose of the service is to parse through a list of core-data entities and download a corresponding image file for each object. The original design of the service was choking my web-server with too many simultaneous download requests. To get around that, I moved the code responsible for executing the download request into a recursive method. The completion handler for each download request will call the method again, thus ensuring that each download will wait for the previous one to complete before dispatching.
Where things get tricky is the code responsible for actually updating my core-data model and the progress indicator view. In the completion handler for the download, before the method recurses, I make an asynchronous call to a block that is responsible for updating the core data and then updating the view to show the progress. That block needs a variable to track how many times it has been executed. In the original code, I could simply have a method-level variable with block scope that would get incremented inside the block. Since the method is recursive now, that strategy no longer works: the method-level variable would simply get reset on each recursion. I can't simply pass the variable to the next level either, thanks to the async nature of the block calls.
I'm at a total loss here. Can anyone suggest an approach for dealing with this?
Update:
As matt pointed out below, the core issue here is how to control the timing of the requests. After doing some more research, I found out why my original code was not working. As it turns out, the timeout interval starts running as soon as the first task is initiated, and once the time is up, any additional requests fail. If you know exactly how much time all your requests will take, you can simply increase the timeout on your requests. The better approach, however, is to use an NSOperationQueue to control when the requests are dispatched. For a great example of how to do this, see: https://code-examples.net/en/q/19c5248
If you take this approach, keep in mind that you will have to call the completeOperation() method of each operation you create in the completion handler of the downloadTask, as sketched below.
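For context, completeOperation() comes from the linked example; a minimal sketch of the standard asynchronous NSOperation pattern it relies on might look like this (the names here are illustrative, not the original code):

@interface DownloadOperation : NSOperation
- (void)completeOperation;
@end

@implementation DownloadOperation {
    BOOL _executing;
    BOOL _finished;
}

// An asynchronous operation stays "executing" until its work signals completion.
- (BOOL)isAsynchronous { return YES; }
- (BOOL)isExecuting { return _executing; }
- (BOOL)isFinished { return _finished; }

- (void)start {
    [self willChangeValueForKey:@"isExecuting"];
    _executing = YES;
    [self didChangeValueForKey:@"isExecuting"];
    // Create and resume the NSURLSessionDownloadTask here; its completion
    // handler must call [self completeOperation], or the queue will stall.
}

- (void)completeOperation {
    [self willChangeValueForKey:@"isExecuting"];
    [self willChangeValueForKey:@"isFinished"];
    _executing = NO;
    _finished = YES;
    [self didChangeValueForKey:@"isExecuting"];
    [self didChangeValueForKey:@"isFinished"];
}
@end

Enqueuing these operations on an NSOperationQueue with maxConcurrentOperationCount = 1 serializes the downloads without any recursion.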
Some sample code:
- (void)downloadSkuImages:(NSArray *)imagesToDownload onComplete:(void (^)(BOOL update, NSError *error))onComplete
{
// weakProgress and weakSelf are weak references created elsewhere in the class (omitted in the original post)
[self runSerializedRequests:imagesToDownload progress:weakProgress downloaded:0 index:0 onComplete:onComplete];
}
-(void)runSerializedRequests:(NSArray *) skuImages progress:(NSProgress *) progress downloaded:(int) totalDownloaded index:(NSUInteger) index onComplete:(void (^)(BOOL update,NSError *error))onComplete
{
__block int downloaded = totalDownloaded;
TotalDownloadProgressBlock totalDownloadProgressBlock = ^BOOL (SkuImageID *skuImageId, NSString *imageFilePath, NSError *error) {
if(error==nil) {
downloaded++;
weakProgress.completedUnitCount = downloaded;
//save change to core-data here
}
else {
downloaded++;
weakProgress.completedUnitCount = downloaded;
[weakSelf setSyncOperationDetail:[NSString stringWithFormat:@"Problem downloading sku image %@", error.localizedDescription]];
}
if(weakProgress.totalUnitCount==weakProgress.completedUnitCount) {
[weakSelf setSyncOperationIndicator:SYNC_INDICATOR_WORKING];
[weakSelf setSyncOperationDetail:@"All product images up to date"];
[weakSelf setSyncOperationStatus:SYNC_STATUS_SUCCESS];
weakProgress.totalUnitCount = 1;
weakProgress.completedUnitCount = 1;
onComplete(false,nil);
return true;
}
return false;
};
// skuImage, skuImageId, and request are derived from the current element of skuImages (their setup was omitted in the original post)
NSURLSessionDownloadTask *downloadTask = [manager downloadTaskWithRequest:request progress:nil destination:nil
completionHandler:^(NSURLResponse * _Nonnull response, NSURL * _Nullable filePath, NSError * _Nullable error) {
NSLog(#"finished download %u of %lu", index +1, (unsigned long)skuImages.count);
if(error != nil)
{
NSLog(#"Download failed for URL: %# with error: %#",skuImage.url, error.localizedDescription);
}
else
{
NSLog(#"Download succeeded for URL: %#", skuImage.url);
}
dispatch_async(dispatch_get_main_queue(), ^(void){
totalDownloadProgressBlock(skuImageId, filePath.path, error); // filePath comes from the completion handler
});
[self runSerializedRequests:skuImages progress:progress downloaded:downloaded index:index + 1 onComplete:onComplete];
}];
NSLog(#"Starting download %u of %lu", index +1, (unsigned long)skuImages.count);
[downloadTask resume];
}
The original design of the service was choking my web-server with too many simultaneous download requests. To get around that, I moved the code responsible for executing the download request into a recursive method.
But that was never the right way to solve the problem. Use a single persistent custom NSURLSession with your own configuration, and set the configuration's httpMaximumConnectionsPerHost.
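For instance, a minimal sketch (in Objective-C the property is spelled HTTPMaximumConnectionsPerHost; the limit of 2 is an arbitrary choice):

// One shared session whose configuration caps concurrent connections,
// so every download task can be created up front and simply resumed.
NSURLSessionConfiguration *config = [NSURLSessionConfiguration defaultSessionConfiguration];
config.HTTPMaximumConnectionsPerHost = 2; // throttle simultaneous requests per host
NSURLSession *session = [NSURLSession sessionWithConfiguration:config];

for (NSURLRequest *request in requests) { // 'requests' built from the sku image URLs (hypothetical)
    [[session downloadTaskWithRequest:request completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
        // Move the file from 'location' and update progress here.
    }] resume];
}

The session queues everything beyond the per-host limit internally, so no recursion or manual serialization is needed.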
I've written a Quick Look plugin that attempts to play music like this:
OSStatus GeneratePreviewForURL(void *thisInterface, QLPreviewRequestRef preview, CFURLRef url, CFStringRef contentTypeUTI, CFDictionaryRef options)
{
NSURL *fileURL = (__bridge NSURL*)url;
AudioPlayer *player = // load player with fileURL
// Create a semaphore
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
// Start playback and signal the semaphore once finished
[player play:^{
dispatch_semaphore_signal(sema);
}];
// Wait here until the player completion block signals the semaphore to stop waiting
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
NSLog(#"%#", #"done!");
return kQLReturnNoError;
}
For various reasons, it's not practical for me to transcode these audio files into a format that macOS knows; otherwise I could just hand the OS an MP3 file and get the system's plugin to play it for me. So instead I'm using a dirty hack with semaphores to halt execution and keep my player object around, or else playback would stop abruptly right after it starts.
The problem with that is that the file just continues playing after the Quick Look panel stops previewing it, because the quicklookd process is still running.
Is there a way to stop playback the way the system plugins do when they're dismissed?
Have you tried using the following delegate methods?
According to Apple's documentation:
func previewControllerWillDismiss(QLPreviewController)
Called before the preview controller is closed.
func previewControllerDidDismiss(QLPreviewController)
Called after the preview controller is closed.
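A hedged sketch of how that could look, assuming the preview is hosted in a QLPreviewController whose delegate you control, and that the player and semaphore from the question are exposed as properties (both assumptions; the generator code above doesn't show this):

// Stop audio and release the semaphore wait when the panel closes.
- (void)previewControllerWillDismiss:(QLPreviewController *)controller {
    [self.player stop];                   // 'player' property is hypothetical
    dispatch_semaphore_signal(self.sema); // unblocks the wait in GeneratePreviewForURL
}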
I'm creating a watch application which records audio and plays that audio back. The recorded audio is saved in the App Groups container, and I used the following code for playing the audio.
- (void)playAudio
{
// Playing the audio from the url using the default controller
[self presentMediaPlayerControllerWithURL:self.audioFileURL options:@{WKMediaPlayerControllerOptionsAutoplayKey : @YES} completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * _Nullable error) {
NSLog(@"Error = %@", error);
}];
}
Unfortunately I am getting the following error, and the player controller gets dismissed immediately.
Error Domain=com.apple.watchkit.errors Code=4 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (1), NSUnderlyingError=0x16daa810 {Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"}, NSLocalizedDescription=The operation could not be completed}.
Then I downloaded a sample project (the WatchKitAudioRecorder sample code) from the Apple watchOS Developer Library, but it has the same issue. I don't know why this is not working, even in Apple's sample code :(
Finally I found that the data was not being written to the URL path. It should be written to the following path: Library/Caches/filename.extension
Here is a sample code, which may help someone.
- (void)writeAudioToAppGroupsWithData:(NSData *)audioData
{
    // Writing the audio data to the App Groups container
    NSURL *containerURL = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
    containerURL = [containerURL URLByAppendingPathComponent:DIRECTORY_PATH];
    [audioData writeToURL:containerURL atomically:YES];
}
In this case make sure that your DIRECTORY_PATH is Library/Caches/filename.extension. Eg: Library/Caches/Audio.mp3
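For completeness, a sketch of the matching playback side, assuming the same APP_GROUP_IDENTIFIER and file name as above (audioFileURL is the property used by the playAudio method earlier):

// Rebuild the same App Group URL before handing it to the media player.
NSURL *containerURL = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_GROUP_IDENTIFIER];
self.audioFileURL = [containerURL URLByAppendingPathComponent:@"Library/Caches/Audio.mp3"];
[self playAudio];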
Ok so I have been at this bug all day, and I think I've got it narrowed down to the fundamental problem.
Background:
I am working on an app that has required me to write my own versions of NSNetService and NSNetServiceBrowser to allow for Bonjour over Bluetooth in iOS 5. It has been a great adventure, as I knew nothing of network programming before I started this project. I have learned a lot from various example projects and from the classic Unix Network Programming textbook. My implementation is based largely on Apple's DNSSDObjects sample project. I have added code to actually make the connection between devices once a service has been resolved. An NSInputStream and an NSOutputStream are obtained with CFStreamCreatePairWithSocketToHost(...).
Problem:
I am trying to send some data over this connection. The data consists of an integer, a few NSStrings and an NSData object archived with NSKeyedArchiver. The size of the NSData is around 150kb so the size of the whole message is around 160kb. After sending the data over the connection I am getting the following exception when I try to unarchive...
Terminating app due to uncaught exception 'NSInvalidArgumentException',
reason: '*** -[NSKeyedUnarchiver initForReadingWithData:]: incomprehensible archive
After further exploration I have noticed that the received data is only about 2 KB. The message is being truncated, thus rendering the archive "incomprehensible."
Potentially relevant code:
The method that sends the data to all connected devices
- (void) sendMessageToPeers:(MyMessage *)msg
{
NSEnumerator *e = [self.peers objectEnumerator];
//MyMessage conforms to NSCoding, messageAsData getter calls encodeWithCoder:
NSData *data = msg.messageAsData;
Peer *peer;
while (peer = [e nextObject]) {
if (![peer sendData:data]) {
NSLog(#"Could not send data to peer..");
}
}
}
The method in the Peer class that actually writes data to the NSOutputStream
- (BOOL) sendData:(NSData *)data
{
if (self.outputStream.hasSpaceAvailable) {
[self.outputStream write:data.bytes maxLength:data.length];
return YES;
}
else {
NSLog(#"PEER DIDN'T HAVE SPACE!!!");
return NO;
}
}
NSStreamDelegate method for handling stream events ("receiving" the data)
The buffer size in this code is 32768 because that's what was in the example code I learned from. Is it arbitrary? I tried changing it to 200000, thinking that the problem was just that the buffer was too small, but it didn't change anything. I don't think I fully understand what's happening.
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
switch (eventCode) {
case NSStreamEventHasBytesAvailable: {
NSInteger bytesRead;
uint8_t buffer[32768]; // is this the issue?
// uint8_t buffer[200000]; //this didn't change anything
bytesRead = [self.inputStream read:buffer maxLength:sizeof(buffer)];
if (bytesRead == -1) ...;
else if (bytesRead == 0) ...;
else {
NSData *data = [NSData dataWithBytes:buffer length:bytesRead];
[self didReceiveData:data];
}
} break;
/*omitted code for other events*/
}
}
NSStream over a network like that will be using a TCP connection. It can vary, but the maximum packet size is often around 2k. As the message you’re sending is actually 160k, it will be split up into multiple packets.
TCP abstracts this away to just be a stream of data, so you can be sure all these packets will arrive in the correct order.
However, the stream:handleEvent: delegate method is probably being called when only the first 2k of data has arrived – there’s no way for it to know that there’s more coming until it does.
Note that the read:maxLength: method doesn't guarantee you'll always get that max length; in this case it seems to be giving you only up to 2k at a time.
You should count up the actual bytesReceived, and concatenate all the data together until you receive the total amount you’re waiting for.
How does the receiver know how much data to expect? You might want to design your protocol so that before sending the data, you send an integer of defined size indicating the length of the coming data. Alternatively, if you're only ever sending one message over the socket, you could simply close it when finished, and have the receiver unarchive only after the socket is closed.
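A hedged sketch of the length-prefix approach, assuming a 4-byte big-endian header; the receiveBuffer property and didReceiveChunk: method are illustrative names, not from the question:

#import <arpa/inet.h> // for htonl/ntohl

// Sender: prepend the payload size so the receiver knows when it has everything.
- (BOOL)sendData:(NSData *)data
{
    uint32_t header = htonl((uint32_t)data.length);
    NSMutableData *framed = [NSMutableData dataWithBytes:&header length:sizeof(header)];
    [framed appendData:data];
    NSUInteger written = 0;
    while (written < framed.length) { // write: may accept fewer bytes than requested
        NSInteger n = [self.outputStream write:(const uint8_t *)framed.bytes + written
                                     maxLength:framed.length - written];
        if (n <= 0) return NO;
        written += n;
    }
    return YES;
}

// Receiver: accumulate chunks until the promised length has arrived.
- (void)didReceiveChunk:(NSData *)chunk
{
    [self.receiveBuffer appendData:chunk]; // NSMutableData property (hypothetical)
    if (self.receiveBuffer.length < sizeof(uint32_t)) return;
    uint32_t header;
    [self.receiveBuffer getBytes:&header length:sizeof(header)];
    NSUInteger expected = ntohl(header);
    if (self.receiveBuffer.length - sizeof(header) >= expected) {
        NSRange messageRange = NSMakeRange(sizeof(header), expected);
        [self didReceiveData:[self.receiveBuffer subdataWithRange:messageRange]]; // complete archive
        [self.receiveBuffer replaceBytesInRange:NSMakeRange(0, sizeof(header) + expected)
                                      withBytes:NULL length:0]; // drop the consumed frame
    }
}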
You seem to be reading from self.inputStream, but the stream passed into your stream:handleEvent: method is called aStream. Are they referencing the same object? Otherwise I'm not sure you're reading the stream that actually has bytes available.
I'm writing an iOS app using an AudioQueue for recording. I create an input queue configured to get linear PCM, start this queue, and everything works as expected.
To manage interruptions, I implemented the delegate methods of AVAudioSession to catch the beginning and the end of an interruption. The endInterruption method looks like the following:
- (void)endInterruptionWithFlags:(NSUInteger)flags;
{
if (flags == AVAudioSessionInterruptionFlags_ShouldResume && audioQueue != 0) {
NSLog(#"Current audio session - category: '%#' mode: '%#'",
[[AVAudioSession sharedInstance] category],
[[AVAudioSession sharedInstance] mode]);
NSError *error = nil;
OSStatus errorStatus;
if ((errorStatus = AudioSessionSetActive(true)) != noErr) {
error = [self errorForAudioSessionServiceWithOSStatus:errorStatus];
NSLog(#"Could not reactivate the audio session: %#",
[error localizedDescription]);
} else {
if ((errorStatus = AudioQueueStart(audioQueue, NULL)) != noErr) {
error = [self errorForAudioQueueServiceWithOSStatus:errorStatus];
NSLog(#"Could not restart the audio queue: %#",
[error localizedDescription]);
}
}
}
// ...
}
If the app gets interrupted while it is in the foreground, everything works correctly. The problem appears if the interruption happens in the background. Activating the audio session results in the error !cat:
The specified audio session category cannot be used for the attempted audio operation. For example, you attempted to play or record audio with the audio session category set to kAudioSessionCategory_AudioProcessing.
Starting the queue without activating the session results in the error code -12985.
At that point the category is set to AVAudioSessionCategoryPlayAndRecord and the mode is AVAudioSessionModeDefault.
I couldn't find any documentation for this error message, nor whether it is possible to restart an input audio queue in the background.
Yes, it is possible, but to reactivate the session in the background, the app either has to set the audio session property kAudioSessionProperty_OverrideCategoryMixWithOthers:
OSStatus propertySetError = 0;
UInt32 allowMixing = true;
propertySetError = AudioSessionSetProperty (
kAudioSessionProperty_OverrideCategoryMixWithOthers,
sizeof (allowMixing),
&allowMixing
);
or the app has to receive remote control command events:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
At present there is no way to reactivate if you are in the background.
Have you made your app support backgrounding in the Info.plist? I'm not sure if recording is possible in the background, but you probably need to add "Required Background Modes" and then a value in that array of "App plays audio" (see the snippet below).
Update: I just checked, and recording in the background is possible.
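For reference, the raw Info.plist entry behind those Xcode display names ("Required Background Modes" is the UIBackgroundModes key; "App plays audio" corresponds to the audio value):

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>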