I create an AudioQueue with the following steps:
Create a new output with AudioQueueNewOutput
Add a property listener for the kAudioQueueProperty_IsRunning property
Allocate my buffers with AudioQueueAllocateBuffer
Call AudioQueuePrime
Call AudioQueueStart
The problem is that when I call AudioQueuePrime, it prints the following error to the console:
AudioConverterNew returned -50
Prime failed (-50); will stop (11025/0 frames)
What's wrong here?
PS:
I got this error on iOS (Device & Simulator)
The output callback installed when calling AudioQueueNewOutput is never called!
The file is valid and the AudioStreamBasicDescription does match the format (AAC)
I tested the file with Mat's AudioStreamer and it seems to work there
Sample Init Code:
// Get the stream description from the first sample buffer
OSStatus err = noErr;
EDSampleBuffer *firstBuf = [sampleBufs objectAtIndex:0];
AudioStreamBasicDescription asbd = firstBuf.streamDescription;

// TODO: remove temporary format setup, just to ensure the format for now
asbd.mSampleRate       = 44100.00;
asbd.mFramesPerPacket  = 1024; // AAC default
asbd.mChannelsPerFrame = 2;
pfcc(asbd.mFormatID); // log the format ID as a FourCC
// -----------------------------------

// Create a new output
err = AudioQueueNewOutput(&asbd, _audioQueueOutputCallback, self, NULL, NULL, 0, &audioQueue);
if (err) {
    [self _reportError:kASAudioQueueInitializationError];
    goto bail;
}

// Add a property listener for the queue's running state
err = AudioQueueAddPropertyListener(audioQueue, kAudioQueueProperty_IsRunning, _audioQueueIsRunningCallback, self);
if (err) {
    [self _reportError:kASAudioQueuePropertyListenerError];
    goto bail;
}

// Allocate the queue buffers
for (int i = 0; i < kAQNumBufs; i++) {
    err = AudioQueueAllocateBuffer(audioQueue, kAQDefaultBufSize, &queueBuffer[i]);
    if (err) {
        [self _reportError:kASAudioQueueBufferAllocationError];
        goto bail;
    }
}

// Prime and start
err = AudioQueuePrime(audioQueue, 0, NULL);
if (err) {
    printf("failed to prime audio queue %ld\n", (long)err);
    goto bail;
}

err = AudioQueueStart(audioQueue, NULL);
if (err) {
    printf("failed to start audio queue %ld\n", (long)err);
    goto bail;
}
This is the format description from the audio file stream:
rate: 44100.000000
framesPerPacket: 1024
format: aac
bitsPerChannel: 0
reserved: 0
channelsPerFrame: 2
bytesPerFrame: 0
bytesPerPacket: 0
formatFlags: 0
cookieSize 39
AudioConverterNew returned -50
Prime failed (-50); will stop (11025/0 frames)
You did it wrong.
No, really. That's what that error means, and that's ALL that error means.
That's why paramErr (-50) is such an annoying error code: It doesn't say a damn thing about what you (or anyone else) did wrong.
The first step to formulating guesses as to what it's complaining about is to find out what function returned that error. Change your _reportError: method to enable you to log the name of the function that returned the error. Then, log the parameters you're passing to that function and figure out why it's of the opinion that those parameters to that function don't make sense.
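For example, a minimal sketch of that kind of logging as a C macro (the macro name is hypothetical, not part of the asker's code):

#include <stdio.h>
#include <AudioToolbox/AudioToolbox.h>

// Hypothetical helper: wraps a Core Audio call so that a failure
// reports which function returned the error and with what status.
#define CheckStatus(call)                                      \
    do {                                                       \
        OSStatus __err = (call);                               \
        if (__err != noErr)                                    \
            fprintf(stderr, "%s failed with OSStatus %d\n",    \
                    #call, (int)__err);                        \
    } while (0)

// Usage: CheckStatus(AudioQueuePrime(audioQueue, 0, NULL));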
My own wild guess is that it's because the values you forced into the ASBD don't match the characteristics of the sample buffer. The log output you included in your question says “11025/0 frames”; 11025 is a common sample rate (but different from 44100). I assume you'd know what the 0 refers to.
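In that spirit, here is a sketch of a fully specified AAC ASBD, plus handing the file's magic cookie to the queue before priming (the question reports cookieSize 39, and AudioConverterNew typically can't build an AAC converter without it). The cookieData/cookieSize variables are assumptions about where the parser stores the cookie, not the asker's code:

#include <AudioToolbox/AudioToolbox.h>

// For a compressed, VBR format the bytes-per-* fields stay 0, but
// mFormatID, mSampleRate, mChannelsPerFrame and mFramesPerPacket
// must all agree with the actual file.
AudioStreamBasicDescription asbd = {0};
asbd.mFormatID         = kAudioFormatMPEG4AAC;
asbd.mSampleRate       = 44100.0;
asbd.mChannelsPerFrame = 2;
asbd.mFramesPerPacket  = 1024;

// Give the queue the magic cookie before AudioQueuePrime;
// cookieData/cookieSize are assumed to come from the stream parser.
AudioQueueSetProperty(audioQueue, kAudioQueueProperty_MagicCookie,
                      cookieData, cookieSize);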
I'm a beginner with RTOS and I'm using Xenomai v2.6.3.
I'm trying to receive some data over serial communication.
I did my best to follow Xenomai's guide and some open source examples, but it doesn't work well.
Here is the guide: https://xenomai.org//serial-16550a-driver/
I followed the sequence for using the xeno_16550A module (with port io = 0x2f8 and irq = 3),
and based my code on this open source example: http://www.acadis.org/pages/captain.at/serial-port-example
The write task works well, but the read task doesn't.
It gives me the error message "error while RTSER_RTIOC_WAIT_EVENT, code -110" (meaning the wait timed out).
Moreover, I checked IRQ number 3 with the command 'cat /proc/xenomai/irq', but its interrupt count doesn't increase.
In my case I don't need to write data, so I removed the write task code.
The read task procedure follows:
void read_task_proc(void *arg)
{
    int ret;
    ssize_t red = 0;
    struct rtser_event rx_event;

    while (1) {
        /* wait for an RX event on the serial device */
        ret = rt_dev_ioctl(my_fd, RTSER_RTIOC_WAIT_EVENT, &rx_event);
        if (ret) {
            printf(RTASK_PREFIX "error while RTSER_RTIOC_WAIT_EVENT, code %d\n", ret);
            if (ret == -ETIMEDOUT)
                continue;
            break;
        }

        /* an event arrived: read one byte */
        unsigned char buf[1];
        red = rt_dev_read(my_fd, buf, 1);
        if (red < 0) {
            printf(RTASK_PREFIX "error while rt_dev_read, code %zd\n", red);
        } else {
            printf(RTASK_PREFIX "only %zd byte received, char: %c\n", red, buf[0]);
        }
    }

exit_read_task:
    if (my_state & STATE_FILE_OPENED) {
        if (!close_file(my_fd, READ_FILE " (rtser)")) {
            my_state &= ~STATE_FILE_OPENED;
        }
    }
    printf(RTASK_PREFIX "exit\n");
}
I can guess at two possible causes of the problem:
the buffer is too small, or is already full when new data is received;
the RX interrupt doesn't fire.
I want to check whether either of these is the case, but how can I check?
Furthermore, does anybody know the cause of the problem? Please give me comments.
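One concrete thing to verify (a sketch, not a confirmed diagnosis): RTSER_RTIOC_WAIT_EVENT only reports events that were previously enabled through the device's event mask, so if event_mask was never set to RTSER_EVENT_RXPEND via RTSER_RTIOC_SET_CONFIG, the wait times out exactly as described. The helper below assumes the standard rtserial API from <rtdm/rtserial.h> and a descriptor opened with rt_dev_open:

#include <rtdm/rtserial.h>

/* Sketch: subscribe to the RX-pending event before entering the read
 * loop. With event_mask left at 0, RTSER_RTIOC_WAIT_EVENT has nothing
 * to wait for and returns -ETIMEDOUT. */
static int enable_rx_event(int fd)
{
    struct rtser_config config = { 0 };

    config.config_mask = RTSER_SET_EVENT_MASK | RTSER_SET_TIMEOUT_RX;
    config.event_mask  = RTSER_EVENT_RXPEND;      /* wake on pending RX data */
    config.rx_timeout  = RTSER_TIMEOUT_INFINITE;  /* rt_dev_read blocks until data */

    return rt_dev_ioctl(fd, RTSER_RTIOC_SET_CONFIG, &config);
}

If the IRQ count in /proc/xenomai/irq still doesn't increase after this, the interrupt line itself is suspect, e.g. the port is actually wired to a different IRQ, or the regular Linux 8250 driver still owns the port; releasing the port from the standard driver before loading xeno_16550A is a common prerequisite.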
I'm parsing a very large CSV file using GCD functions (please see code below).
If I encounter an error I'd like to cancel dispatch_io_read. Is there a way to do that?
dispatch_io_read(channel,
                 0,
                 Int.max,
                 dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0))
{ (done, data, error) in
    guard error == 0 else {
        print("Read Error: \(error)")
        return
    }

    if done {
        lineBuffer.dealloc(bufferSize)
    }

    dispatch_data_apply(data)
    { (region, offset, buffer, size) -> Bool in
        print(size)
        let bytes = UnsafePointer<UInt8>(buffer)
        for var i = 0; i < size; i++ {
            switch bytes[i] {
            case self.cr: // ignore \r
                break
            case self.lf: // newline
                lineBuffer[bufferLength] = 0x00 // null-terminate
                line(line: String(UTF8String: lineBuffer)!)
                bufferLength = 0
            case _ where bufferLength < (bufferSize - 1): // leave space for null termination
                lineBuffer[bufferLength++] = CChar(bytes[i])
            default:
                return false // Overflow! I would like to stop reading the file here.
            }
        }
        return true
    }
}
Calling dispatch_io_close(channel, DISPATCH_IO_STOP) will cause running dispatch_io_read operations to be interrupted and their handlers to be invoked with an ECANCELED error (along with any partial results); see the dispatch_io_close(3) manpage.
Note that this does not interrupt the actual I/O system calls, it just prevents additional I/O system calls from being entered, so you may have to set an I/O channel high watermark to ensure the appropriate level of I/O granularity for your application.
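As a sketch of how that fits together (written against the C dispatch API; the channel argument and the parse-failure condition are assumptions, not code from the question):

#include <dispatch/dispatch.h>
#include <errno.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Sketch: stopping an in-flight dispatch_io_read by closing the channel
 * with DISPATCH_IO_STOP. The handler then fires once more with
 * error == ECANCELED (plus any partial results). */
static void read_with_cancel(dispatch_io_t channel)
{
    dispatch_io_read(channel, 0, SIZE_MAX,
                     dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0),
                     ^(bool done, dispatch_data_t data, int error) {
        if (error == ECANCELED) {
            fprintf(stderr, "read cancelled\n"); /* partial data may still be in `data` */
            return;
        }
        if (error != 0) {
            fprintf(stderr, "read error: %d\n", error);
            return;
        }
        /* ...apply/parse `data` as in the question; on an unrecoverable
         * parse error, stop any further reads: */
        bool parseFailed = false; /* placeholder for real parse state */
        if (parseFailed)
            dispatch_io_close(channel, DISPATCH_IO_STOP);
        if (done) {
            /* reached EOF (or cancellation): clean up buffers here */
        }
    });
}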
I am using an audio queue for playback and recording on OS X / Mac, and I have a use case
where the user may change the audio device (input or output) while the audio queue is running, either for playback or for recording.
This is what I have done so far:
OSStatus result = noErr;
// property address; the selector is set below before each query
AudioObjectPropertyAddress thePropertyAddress = { kAudioHardwarePropertyDefaultOutputDevice,
                                                  kAudioObjectPropertyScopeGlobal,
                                                  kAudioObjectPropertyElementMaster };
UInt32 thePropSize;
CFStringRef theDeviceName;

// get the name of the device
thePropSize = sizeof(CFStringRef);
thePropertyAddress.mSelector = kAudioObjectPropertyName;
thePropertyAddress.mScope    = kAudioObjectPropertyScopeGlobal;
thePropertyAddress.mElement  = kAudioObjectPropertyElementMaster;
result = AudioObjectGetPropertyData((AudioObjectID)input,
                                    &thePropertyAddress, 0, NULL, &thePropSize, &theDeviceName);
if (result != noErr) {
    log("Error while getting property");
    return;
}

// get the UID of the device
CFStringRef theDeviceUID;
thePropertyAddress.mSelector = kAudioDevicePropertyDeviceUID;
result = AudioObjectGetPropertyData((AudioObjectID)input,
                                    &thePropertyAddress, 0, NULL, &thePropSize, &theDeviceUID);

// point the queue at the new device
try {
    XThrowIfError(AudioQueueSetProperty(mQueue, kAudioQueueProperty_CurrentDevice,
                                        &theDeviceUID, sizeof(CFStringRef)),
                  "set input device");
}
catch (CAXException e) {
    char buf[256];
    fprintf(stderr, "Error: %s (%s)\n", e.mOperation, e.FormatError(buf));
}
but it throws kAudioQueueErr_InvalidRunState; the reference says this can't be done while the queue is running.
Is there any other way to achieve this?
Unfortunately there isn't any other way to achieve this (the AudioQueue API is quite minimal). You need to stop the audio queue first, then change its property, and finally start it again.
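A minimal sketch of that stop/retarget/restart sequence, assuming queue is your running AudioQueueRef and deviceUID is the UID you already fetched via kAudioDevicePropertyDeviceUID:

#include <AudioToolbox/AudioToolbox.h>

OSStatus RetargetQueue(AudioQueueRef queue, CFStringRef deviceUID)
{
    // stop synchronously so the queue is no longer running
    OSStatus err = AudioQueueStop(queue, true);
    if (err != noErr) return err;

    // now the property can be changed without kAudioQueueErr_InvalidRunState
    err = AudioQueueSetProperty(queue, kAudioQueueProperty_CurrentDevice,
                                &deviceUID, sizeof(deviceUID));
    if (err != noErr) return err;

    // resume on the new device
    return AudioQueueStart(queue, NULL);
}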
OSStatus SetupBuffers(BG_FileInfo *inFileInfo)
{
    int numBuffersToQueue = kNumberBuffers;
    UInt32 maxPacketSize;
    UInt32 size = sizeof(maxPacketSize);
    // we need to calculate how many packets we read at a time, and how big a buffer we need
    // we base this on the size of the packets in the file and an approximate duration for each buffer
    // first check to see what the max size of a packet is - if it is bigger
    // than our allocation default size, that needs to become larger
    OSStatus result = AudioFileGetProperty(inFileInfo->mAFID, kAudioFilePropertyPacketSizeUpperBound, &size, &maxPacketSize);
    AssertNoError("Error getting packet upper bound size", end);
    bool isFormatVBR = (inFileInfo->mFileFormat.mBytesPerPacket == 0 || inFileInfo->mFileFormat.mFramesPerPacket == 0);
    CalculateBytesForTime(inFileInfo->mFileFormat, maxPacketSize, 0.5/*seconds*/, &mBufferByteSize, &mNumPacketsToRead);
    // if the file is smaller than the capacity of all the buffer queues, always load it at once
    if ((mBufferByteSize * numBuffersToQueue) > inFileInfo->mFileDataSize)
        inFileInfo->mLoadAtOnce = true;
    if (inFileInfo->mLoadAtOnce)
    {
        UInt64 theFileNumPackets;
        size = sizeof(UInt64);
        result = AudioFileGetProperty(inFileInfo->mAFID, kAudioFilePropertyAudioDataPacketCount, &size, &theFileNumPackets);
        AssertNoError("Error getting packet count for file", end); // ***>>>> this is where Xcode says undefined <<<<***
        mNumPacketsToRead = (UInt32)theFileNumPackets;
        mBufferByteSize = inFileInfo->mFileDataSize;
        numBuffersToQueue = 1;
    }
Here is the exact error:
label 'end' used but not defined
I get that error twice.
If you look at the SoundEngine.cpp source that the snippet comes from, you'll see it's defined on the very next line:
end:
return result;
It's a label that execution jumps to when there's an error.
Uhm, the only place I can find AssertNoError is in Technical Note TN2113, and there it has a completely different format: AssertNoError(theError, "couldn't unregister the ABL"); So where is AssertNoError defined?
User Jeremy P mentions this document as well.
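For what it's worth, the way the snippet uses it implies a macro along the following lines: it checks the result variable in scope and jumps to the label named by its second argument on failure. This is a plausible reconstruction of what SoundEngine.cpp defines, not a verbatim copy:

#include <stdio.h>

/* Plausible reconstruction of SoundEngine.cpp's AssertNoError:
 * tests the `result` in scope, logs the message, and jumps to the
 * handler label (e.g. `end`) on failure. */
#define AssertNoError(inMessage, inHandler)          \
    if (result != noErr) {                           \
        printf("%s: %d\n", inMessage, (int)result);  \
        goto inHandler;                              \
    }

This also explains the compile error in the question: the macro expands to a goto, so every function that uses it must define the end: label, which the truncated snippet omits.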
I'm following the sample code in the CFNetwork Programming Guide, specifically the section on Preventing Blocking When Working with Streams. My code is nearly identical to theirs (below), but when I connect to my server, I get POSIX error 14 (bad address). Is that a bad IP address (except it's not)? A bad memory address for some call I made? Or what?!
I have no idea how to go about debugging this. I'm really pretty new to the whole CFNetwork thing, and was never particularly expert at networking in the first place (the one thing I really loved about Java: easy networking! :D)
Anyway, log follows, with code below. Any hints would be greatly appreciated.
Log:
[6824:20b] [DEBUG] Compat version: 30000011
[6824:20b] [DEBUG] resolved host.
[6824:20b] [DEBUG] writestream opened.
[6824:20b] [DEBUG] readstream client assigned.
[6824:20b] [DEBUG] readstream opened.
[6824:20b] [DEBUG] *** Read stream reported kCFStreamEventErrorOccurred
[6824:20b] [DEBUG] *** POSIX error: 14 - Bad address
[6824:20b] Error closing readstream
[6824:20b] [DEBUG] Writing int: 0x09000000 (0x00000009)
Code:
+ (BOOL) connectToServerNamed:(NSString *)name atPort:(int)port
{
    CFHostRef theHost = CFHostCreateWithName(NULL, (CFStringRef)name);
    CFStreamError error;
    if (CFHostStartInfoResolution(theHost, kCFHostReachability, &error))
    {
        NSLog(@"[DEBUG] resolved host.");
        CFStreamCreatePairWithSocketToCFHost(NULL, theHost, port, &readStream, &writeStream);
        if (CFWriteStreamOpen(writeStream))
        {
            NSLog(@"[DEBUG] writestream opened.");
            CFStreamClientContext myContext = { 0, self, NULL, NULL, NULL };
            CFOptionFlags registeredEvents = kCFStreamEventHasBytesAvailable |
                                             kCFStreamEventErrorOccurred | kCFStreamEventEndEncountered;
            if (CFReadStreamSetClient(readStream, registeredEvents, readCallBack, &myContext))
            {
                NSLog(@"[DEBUG] readstream client assigned.");
                CFReadStreamScheduleWithRunLoop(readStream, CFRunLoopGetCurrent(),
                                                kCFRunLoopCommonModes);
                if (CFReadStreamOpen(readStream))
                {
                    NSLog(@"[DEBUG] readstream opened.");
                    CFRunLoopRun();
                    // Lots of error condition handling snipped.
                    [...]
                    return YES;
                }
void readCallBack(CFReadStreamRef stream, CFStreamEventType event, void *myPtr)
{
    switch (event)
    {
        case kCFStreamEventHasBytesAvailable:
        {
            CFIndex bytesRead = CFReadStreamRead(stream, buffer, kNetworkyBitsBufferSize); // won't block
            if (bytesRead > 0) // <= 0 leads to additional events
            {
                if (listener)
                {
                    UInt8 *tmpBuffer = malloc(sizeof(UInt8) * bytesRead);
                    memcpy(buffer, tmpBuffer, bytesRead);
                    NSLog(@"[DEBUG] received %d bytes", bytesRead);
                    [listener networkDataArrived:tmpBuffer count:bytesRead];
                }
                NSLog(@"[DEBUG] received %d bytes; no listener", bytesRead);
            }
        }
        break;
        case kCFStreamEventErrorOccurred:
        {
            NSLog(@"[DEBUG] *** Read stream reported kCFStreamEventErrorOccurred");
            CFStreamError error = CFReadStreamGetError(stream);
            logError(error);
            [NetworkyBits shutDownRead];
        }
        break;
        case kCFStreamEventEndEncountered:
            NSLog(@"[DEBUG] *** Read stream reported kCFStreamEventEndEncountered");
            [NetworkyBits shutDownRead];
            break;
    }
}

void logError(CFStreamError error)
{
    if (error.domain == kCFStreamErrorDomainPOSIX) // interpret error.error as a UNIX errno
    {
        NSLog(@"[DEBUG] *** POSIX error: %d - %s", (int)error.error, strerror(error.error));
    }
    else if (error.domain == kCFStreamErrorDomainMacOSStatus)
    {
        NSLog(@"[DEBUG] *** MacOS error: %d", (int)error.error);
    }
    else
    {
        NSLog(@"[DEBUG] *** Stream error domain: %d, error: %d", (int)error.domain, (int)error.error);
    }
}
Olie, where does the buffer that you supply to CFReadStreamRead() come from? EFAULT is a bad buffer address... are you sure you've actually initialized this buffer to point to something valid? It's obviously a global or something... which itself is a pretty bad idea. You should allocate it in your function, or make it an ivar (if you're using Obj-C).
I'm not familiar with Cocoa or Objective-C, but I can tell you that POSIX error code 14 is called EFAULT, and it means you made a system call with an invalid pointer value. This is almost certainly some user-supplied buffer pointer to a read or write system call of some sort. Check all of the buffer pointers and make sure they're not NULL. Check the return value from malloc() - you might be failing to allocate a buffer.
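To make that concrete, here is a sketch of the shape the fix usually takes: give CFReadStreamRead() valid local storage (the buffer size value below is an assumption; the question only shows the constant's name). Incidentally, the memcpy in the question's callback also has its arguments reversed: memcpy copies into its first argument, so it should be memcpy(tmpBuffer, buffer, bytesRead).

#include <CoreFoundation/CoreFoundation.h>
#include <stdlib.h>
#include <string.h>

#define kNetworkyBitsBufferSize 4096 /* assumed value; the question only shows the name */

/* Sketch of the fix the answers point at: read into valid, locally
 * allocated storage instead of an uninitialized global pointer. */
static void handleBytesAvailable(CFReadStreamRef stream)
{
    UInt8 buffer[kNetworkyBitsBufferSize]; /* valid stack storage */
    CFIndex bytesRead = CFReadStreamRead(stream, buffer, sizeof(buffer));
    if (bytesRead > 0) {
        UInt8 *copy = malloc((size_t)bytesRead);
        if (copy != NULL) {
            memcpy(copy, buffer, (size_t)bytesRead); /* destination first */
            /* hand `copy` off to the listener here; it then owns the memory */
            free(copy); /* placeholder: remove once a listener takes ownership */
        }
    }
}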