Is it possible to play audio from AudioQueue in real time? - objective-c

I need to get audio from the microphone and pass it to another device over a TCP connection.
I'm not sure if it is possible to play an AudioBuffer from the AudioQueue.
Currently I get the buffer with the code below:
void AudioInputCallback(void *inUserData,
                        AudioQueueRef inAQ,
                        AudioQueueBufferRef inBuffer,
                        const AudioTimeStamp *inStartTime,
                        UInt32 inNumberPacketDescriptions,
                        const AudioStreamPacketDescription *inPacketDescs)
{
    AudioRecorder *self = (AudioRecorder *)inUserData;
    if(!self.recordState.recording)
    {
        NSLog(@"[ AudioRecorder ] AudioInputCallback not recording");
        return;
    }
    if(inNumberPacketDescriptions > 0)
    {
        NSData *data = [[NSData alloc] initWithBytes:inBuffer->mAudioData length:inBuffer->mAudioDataByteSize * 2];
        [self.delegate didReceivedAudioData:data];
        [data release];
    }
}
This block just pushes mAudioData into an NSData object; NSData is the type I use to transfer data between devices.
Now I don't know how to play this mAudioData on the other device. The other thing is how to get audio continuously: once all the buffers are full I don't receive data anymore, so I guess I just need to re-use (clean) the buffers.
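For the continuous-capture part, Audio Queue Services expects the input callback to hand each filled buffer back to the queue with AudioQueueEnqueueBuffer; if no buffers are re-enqueued, the queue runs out and stops delivering data. A minimal sketch of that, which also copies only mAudioDataByteSize bytes rather than twice that:

void AudioInputCallback(void *inUserData,
                        AudioQueueRef inAQ,
                        AudioQueueBufferRef inBuffer,
                        const AudioTimeStamp *inStartTime,
                        UInt32 inNumberPacketDescriptions,
                        const AudioStreamPacketDescription *inPacketDescs)
{
    AudioRecorder *self = (AudioRecorder *)inUserData;
    if(self.recordState.recording && inBuffer->mAudioDataByteSize > 0)
    {
        // Copy exactly the bytes the queue filled for this buffer.
        NSData *data = [[NSData alloc] initWithBytes:inBuffer->mAudioData
                                              length:inBuffer->mAudioDataByteSize];
        [self.delegate didReceivedAudioData:data];
        [data release];
    }
    // Hand the buffer back to the queue so capture keeps running.
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}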
Any ideas are welcome, thanks!

Related

Clicks and distortions in Lame encoded Mp3 file

I'm trying to encode raw PCM data from the microphone to MP3 using the AudioToolbox framework and Lame. Although everything seems to run fine, there is this problem with "clicks" and "distortions" present in the encoded stream.
I'm not sure that I set up the AudioQueue correctly, or that I process the encoded buffer in the right way...
My code to set up audio recording:
AudioStreamBasicDescription streamFormat;
memset(&streamFormat, 0, sizeof(AudioStreamBasicDescription));
streamFormat.mSampleRate = 44100;
streamFormat.mFormatID = kAudioFormatLinearPCM;
streamFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger|kLinearPCMFormatFlagIsPacked;
streamFormat.mBitsPerChannel = 16;
streamFormat.mChannelsPerFrame = 1;
streamFormat.mBytesPerPacket = 2;
streamFormat.mBytesPerFrame = 2;
streamFormat.mFramesPerPacket = 1;
streamFormat.mReserved = 0;
AudioQueueNewInput(&streamFormat, InputBufferCallback, (__bridge void*)(self), nil, nil, 0, &mQueue);
UInt32 bufferByteSize = 44100;
memset(&mEncodedBuffer, 0, sizeof(mEncodedBuffer)); // mEncodedBuffer is unsigned char [72000]
AudioQueueBufferRef buffer;
for (int i = 0; i < 3; i++) {
    AudioQueueAllocateBuffer(mQueue, bufferByteSize, &buffer);
    AudioQueueEnqueueBuffer(mQueue, buffer, 0, NULL);
}
AudioQueueStart(mQueue, nil);
Then the AudioQueue callback function calls lame_encode_buffer and writes the encoded buffer to a file:
void InputBufferCallback(void *inUserData, AudioQueueRef inAQ,
                         AudioQueueBufferRef inBuffer,
                         const AudioTimeStamp *inStartTime, UInt32 inNumPackets,
                         const AudioStreamPacketDescription *inPacketDesc) {
    memset(&mEncodedBuffer, 0, sizeof(mEncodedBuffer));
    int encodedBytes = lame_encode_buffer(glf, (short*)inBuffer->mAudioData, NULL,
                                          inBuffer->mAudioDataByteSize, mEncodedBuffer, 72000);
    //What I don't understand is that if I write the full 'encodedBytes' data, then there are A LOT of distortions and original sound is seriously broken
    NSData* data = [NSData dataWithBytes:mEncodedBuffer length:encodedBytes/2];
    [mOutputFile writeData:data];
}
And when I afterwards try to play the file containing the Lame-encoded data with AVAudioPlayer, I can clearly hear the original sound, but with clicks and distortions throughout.
Can anybody advise what's wrong here?
Your code does not appear to be paying attention to inNumPackets, which is the amount of actual audio data given to the callback.
Also, doing a long operation, such as running an encoder, inside an audio callback might not be fast enough and thus may violate real-time response requirements. Any long function calls should be done outside the callback.
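A hedged sketch of both suggestions, assuming a serial dispatch queue mEncodeQueue created elsewhere (serial, so the chunks stay in order) and the same glf, mEncodedBuffer and mOutputFile as above: the callback copies only inNumPackets worth of PCM and re-enqueues the buffer, and the LAME encode runs off the audio thread.

void InputBufferCallback(void *inUserData, AudioQueueRef inAQ,
                         AudioQueueBufferRef inBuffer,
                         const AudioTimeStamp *inStartTime, UInt32 inNumPackets,
                         const AudioStreamPacketDescription *inPacketDesc)
{
    // 16-bit mono PCM: one packet == one frame == one 2-byte sample.
    NSData *pcm = [NSData dataWithBytes:inBuffer->mAudioData
                                 length:inNumPackets * sizeof(short)];

    dispatch_async(mEncodeQueue, ^{
        int samples = (int)([pcm length] / sizeof(short));
        int encodedBytes = lame_encode_buffer(glf, (short *)[pcm bytes], NULL,
                                              samples, mEncodedBuffer, 72000);
        if (encodedBytes > 0) {
            // Write every encoded byte, not half of them.
            [mOutputFile writeData:[NSData dataWithBytes:mEncodedBuffer
                                                  length:encodedBytes]];
        }
    });

    // Give the buffer back to the queue so recording continues.
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}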

memory is growing in audio buffer code

I have some code that we use in many of our apps: a class that takes the buffer samples, processes them, and then sends a notification back to the main class.
The code is C and Objective-C.
It works just great, but memory keeps growing, which I can see in the Instruments Allocations tool: the "overall bytes" value keeps growing by about 100 KB per second, because of some parts of the code that I have identified.
This is the callback function, with the lines that cause the problem.
It is called many times a second.
I also don't really understand where to put my *pool:
static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
{
    AudioBuffer buffer;
    buffer.mNumberChannels = 1;
    buffer.mDataByteSize = inNumberFrames * 2;
    //NSLog(@"%ld", inNumberFrames);
    buffer.mData = malloc(inNumberFrames * 2);
    // Put buffer in a AudioBufferList
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0] = buffer;
    // block A
    OSStatus status;
    status = AudioUnitRender(audioUnit,
                             ioActionFlags,
                             inTimeStamp,
                             inBusNumber,
                             inNumberFrames,
                             &bufferList);
    //end block A
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    int16_t *q = (int16_t *)bufferList.mBuffers[0].mData;
    int16_t average;
    for(int i = 0; i < inNumberFrames; i++)
    {
        average = q[i];
        if(average > 100) // lineB
            reducer++;
        //blockC
        if(reducer == 150)
        {
            average = preSignal + alpha*(average - preSignal);
            //NSLog(@"average:%d", average);
            //call scene
            [dict setObject:[NSNumber numberWithInt:average] forKey:@"amp"];
            [[NSNotificationCenter defaultCenter] postNotificationName:@"DigitalArrived" object:nil userInfo:dict];
            reducer = 0;
            preSignal = average;
        }
        //end blockC
    }
    free(buffer.mData);
    [pool release];
    return noErr;
}
OK:
Ignore blockC for a second.
Removing blockA and lineB solves it all.
Removing only one of them still leaks.
I just can't understand what is growing here.
Just a guess, but allocating a new NSAutoreleasePool inside of your recording callback function (which is a super time-critical function) is probably a bad idea.
Actually, why are you doing this here at all? Shouldn't you just have one pool for the entire app, in your main.m? This is probably causing some of your leaks.
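For reference, the single app-wide pool the answer refers to is the one in the standard pre-ARC main.m template (the delegate class name here is just a placeholder):

#import <UIKit/UIKit.h>

int main(int argc, char *argv[])
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // @"AppDelegate" is a placeholder for your app delegate class name.
    int retVal = UIApplicationMain(argc, argv, nil, @"AppDelegate");
    [pool release];
    return retVal;
}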
You should not do anything that requires memory allocation inside an Audio Unit render callback. The real-time requirements are too tight for using generic Objective-C.
Since you should not allocate a pool, or any other memory, inside an audio unit callback, you should not use any Objective-C methods that potentially or actually create objects, such as dictionary modifications or notification creation. You may have to drop back to using plain C inside the render callback (set a flag), and do your Objective-C messaging outside the render callback on another thread (after polling the flag(s) in a timer callback, for instance).
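A sketch of the flag-and-timer pattern described above, with illustrative names: the render callback only writes plain C variables, and a repeating NSTimer on the main thread picks the value up and does the Objective-C work.

// Plain C state the render callback is allowed to touch.
static volatile BOOL sAmplitudeReady = NO;
static volatile int16_t sLatestAverage = 0;

// Inside the render callback, replace the dictionary/notification code with:
//     sLatestAverage = average;
//     sAmplitudeReady = YES;

// Scheduled elsewhere, e.g. every 50 ms on the main thread.
- (void)pollAmplitude:(NSTimer *)timer
{
    if (!sAmplitudeReady) return;
    sAmplitudeReady = NO;

    NSDictionary *userInfo = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:sLatestAverage]
                                                         forKey:@"amp"];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"DigitalArrived"
                                                        object:nil
                                                      userInfo:userInfo];
}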

IOS Receiving video from Network

UPDATE
- I have fixed some mistakes in the code below and the images are now displayed on the other device, but I have another problem. While video capture is open, the "master" device sends data continuously; sometimes this capture appears on the "slave" device, and within a very short time the image "blinks" to blank, repeating this over and over for a short period. Any idea about this?
I'm working on an app that needs to send live camera capture and live microphone capture to another device on the network.
I have set up the connection between devices using a TCP server and published it with Bonjour; this works like a charm.
The most important part is sending and receiving video and audio from the "master" device and rendering it on the "slave" device.
First, here is a piece of code where the app gets the camera sample buffer and transforms it into a UIImage:
@implementation AVCaptureManager (AVCaptureVideoDataOutputSampleBufferDelegate)

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    dispatch_sync(dispatch_get_main_queue(), ^{
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        NSData *data = UIImageJPEGRepresentation(image, 0.2);
        [self.delegate didReceivedImage:image];
        [self.delegate didReceivedFrame:data];
        [pool drain];
    });
}

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t bytesPerRow = width * 4;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    CGContextRef context = CGBitmapContextCreate(
        baseAddress,
        width,
        height,
        8,
        bytesPerRow,
        colorSpace,
        kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little
    );
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return image;
}

@end
The call "[self.delegate didReceivedImage:image];" is just to test the image capture on the master device, and the image displays correctly on the capturing device.
Next, here is how I send it over the network:
- (void)sendData:(NSData *)data
{
    if(_outputStream && [_outputStream hasSpaceAvailable])
    {
        NSInteger bytesWritten = [_outputStream write:[data bytes] maxLength:[data length]];
        if(bytesWritten < 0)
            NSLog(@"[ APP ] Failed to write message");
    }
}
Note that I'm using a run loop to write and read the streams; I think this is better than constantly opening and closing the streams.
Next, I receive the "NSStreamEventHasBytesAvailable" event on the slave device; the piece of code that handles this is:
case NSStreamEventHasBytesAvailable:
    /* I can't start a case without an expression, why not? */
    NSLog(@"[ APP ] stream handleEvent NSStreamEventHasBytesAvailable");
    NSUInteger bytesRead;
    uint8_t buffer[BUFFER_SIZE];
    while ([_inputStream hasBytesAvailable])
    {
        bytesRead = [_inputStream read:buffer maxLength:BUFFER_SIZE];
        NSLog(@"[ APP ] bytes read: %i", bytesRead);
        if(bytesRead)
            [data appendBytes:(const void *)buffer length:sizeof(buffer)];
    }
    [_client writeImageWithData:data];
    break;
The value of BUFFER_SIZE is 32768.
I think the while block is not necessary, but I use it so that if I can't read all the available bytes in the first iteration, I can read them in the next.
So this is the point: the stream arrives correctly, but the image serialized in the NSData seems to be corrupted. Next, I just send the data to the client...
[_client writeImageWithData:data];
... and create a UIImage with the data in the client class, simply like this...
[camPreview setImage:[UIImage imageWithData:data]];
In camPreview (yes, it is a UIImageView) I have an image just to display a placeholder on the screen; when I get the image from the network and pass it to camPreview, the placeholder goes blank.
Another thing is about the output: when I start the capture, in the first moments where I receive data, I get these messages from the system:
Error: ImageIO: JPEG Corrupt JPEG data: 28 extraneous bytes before marker 0xbf
Error: ImageIO: JPEG Unsupported marker type 0xbf
After a short time, I don't get these messages anymore.
The point is to find out why the images are not displayed on the "slave" device.
Thanks.
I am not sure how often you are sending images, but even if it is not very often I think I would scan for the SOI and EOI markers in the JPEG data to ensure you have all the data. Here is a post I quickly found
I found an answer: check the JPEG format before rendering.
This resolved my problem, and now I can display the video capture from a "master" iOS device on a "slave" iOS device.
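For reference, a minimal version of that SOI/EOI check (a hypothetical helper, not the exact code used): a complete JPEG starts with the bytes 0xFF 0xD8 and ends with 0xFF 0xD9, so incomplete frames can be skipped instead of being handed to UIImage.

- (BOOL)looksLikeCompleteJPEG:(NSData *)data
{
    NSUInteger len = [data length];
    if (len < 4) return NO;
    const uint8_t *bytes = [data bytes];
    BOOL hasSOI = (bytes[0] == 0xFF && bytes[1] == 0xD8);             // start-of-image marker
    BOOL hasEOI = (bytes[len - 2] == 0xFF && bytes[len - 1] == 0xD9); // end-of-image marker
    return hasSOI && hasEOI;
}

// Only frames that pass the check would be displayed, e.g.:
// if ([self looksLikeCompleteJPEG:data]) [camPreview setImage:[UIImage imageWithData:data]];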

AVplayer Audio routing to speakers and headphones

I am trying to route the audio to the headphones when headphones are plugged in, and play it via the iPhone speakers when no headphones are connected.
I have tried the following:
AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, audioRouteChangeListenerCallback, nil);

void audioRouteChangeListenerCallback (
    void *inUserData,
    AudioSessionPropertyID inPropertyID,
    UInt32 inPropertyValueSize,
    const void *inPropertyValue
) {
    DetailViewController *controller = (__bridge_transfer DetailViewController *)inUserData;
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;
    CFDictionaryRef routeChangeDictionary = (CFDictionaryRef)inPropertyValue;
    CFNumberRef routeChangeReasonRef = (CFNumberRef)CFDictionaryGetValue(routeChangeDictionary, CFSTR(kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 routeChangeReason;
    CFNumberGetValue(routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);
    if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        NSLog(@"UnPlugged");
        [controller headphoneIsPlugged];
        UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
        AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {
        NSLog(@"Plugged in");
    }
}
The headphoneIsPlugged function is not being called at all; the program never even reaches that part of the code.
Now, when I unplug the headphones, the audio just stops and nothing is played through the speaker.
Inside the audio listener I tried putting the following:
[controller.avplayer play];
but still nothing is happening.
Please help
You need to pass the instance of DetailViewController when you add the listener; it will be passed in as the user data. Without it, (__bridge_transfer DetailViewController *)inUserData would return nil.
Your code would be something like this if you call it from within your DetailViewController:
AudioSessionAddPropertyListener (
kAudioSessionProperty_AudioRouteChange,
audioRouteChangeListenerCallback,
(__bridge void *)(self)
);

Cocoa Touch Bonjour how to deal with NSNetService addresses and uint8_t

I'm attempting to make an iOS app communicate with a server that uses Bonjour and uses HTTP commands. So far I have been able to find the local domain and locate the particular service I'm looking for. I am able to resolve the address of the service, but I don't know how to get something useful out of the address. The address from the NSNetService is an NSData object and I have no idea what to do with it. I need to send commands like GET and PUT. What Cocoa classes handle things like this?
I also tried getting input and output streams from the Service, but they seem to be extremely low level streams and I don't know how to properly deal with buffers and all that.
[service getInputStream:&inputStream outputStream:&outputStream]
The NSOutputStream write method takes a uint8_t buffer, which I have no idea how to create.
The NSInputStream read method returns a uint8_t buffer, and I don't know how to interpret it.
I am able to communicate with this server using terminal commands. For instance, sending it the command LIST causes it to print out the list of files I am looking for. How do I send and get information like this in Cocoa?
To write data to the output stream, thereby sending it to the server:
NSString *stringToSend = @"Hello World!\n"; //The "\n" lets the receiving method described below function correctly. I don't know if you need it or not.
NSData *dataToSend = [stringToSend dataUsingEncoding:NSUTF8StringEncoding];
if (outputStream) {
    int remainingToWrite = [dataToSend length];
    void *marker = (void *)[dataToSend bytes];
    while (0 < remainingToWrite) {
        int actuallyWritten = 0;
        actuallyWritten = [outputStream write:marker maxLength:remainingToWrite];
        remainingToWrite -= actuallyWritten;
        marker += actuallyWritten;
    }
}
You can send any data like this; just put it in an NSData object.
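As a usage example, the same write loop could be wrapped in a small helper and used to send the LIST command from the question (a sketch with illustrative names):

- (void)sendString:(NSString *)string toStream:(NSOutputStream *)stream
{
    NSData *dataToSend = [string dataUsingEncoding:NSUTF8StringEncoding];
    NSInteger remainingToWrite = [dataToSend length];
    const uint8_t *marker = [dataToSend bytes];
    while (remainingToWrite > 0) {
        NSInteger actuallyWritten = [stream write:marker maxLength:remainingToWrite];
        if (actuallyWritten <= 0) break;   // stop on error instead of spinning
        remainingToWrite -= actuallyWritten;
        marker += actuallyWritten;
    }
}

// e.g. [self sendString:@"LIST\n" toStream:outputStream];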
To receive data from the server use this code in the input stream's NSStreamDelegate:
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)streamEvent {
    NSInputStream *istream;
    NSOutputStream *ostream;
    switch(streamEvent) {
        case NSStreamEventHasBytesAvailable:;
            istream = (NSInputStream *)aStream;
            ostream = (NSOutputStream *)CFDictionaryGetValue(connections, istream);
            uint8_t buffer[2048];
            int actuallyRead = [istream read:(uint8_t *)buffer maxLength:2048];
            if (actuallyRead > 0) {
                NSData *data;
                data = [NSData dataWithBytes:buffer length:actuallyRead];
                NSString *string = [[[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding] autorelease];
                string = [string stringByReplacingOccurrencesOfString:@"\n" withString:@""];
                //Do something with the string...
            }
            break;
        case NSStreamEventEndEncountered:;
            istream = (NSInputStream *)aStream;
            ostream = nil;
            if (CFDictionaryGetValueIfPresent(connections, istream, (const void **)&ostream)) {
                [self shutdownInputStream:istream outputStream:ostream];
            }
            break;
        case NSStreamEventHasSpaceAvailable:
        case NSStreamEventErrorOccurred:
        case NSStreamEventOpenCompleted:
        case NSStreamEventNone:
        default:
            break;
    }
}
Take a look at Apple's CocoaEcho Sample Code. It should help you.