Error in Audio Unit code (RemoteIO) for iPhone - Objective-C

I have this code to read buffer samples, but I get a strange Mach-O linker error.
The Audio Unit framework couldn't be loaded, so I added AudioToolbox and CoreAudio, as I read I should.
My code is:
#define kOutputBus 0
#define kInputBus 1
AudioComponentInstance audioUnit;
@implementation remoteIO
//callback function :
static OSStatus recordingCallback(void *inRefCon,
AudioUnitRenderActionFlags *ioActionFlags,
const AudioTimeStamp *inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList *ioData)
{
AudioBuffer buffer;
buffer.mNumberChannels = 1;
buffer.mDataByteSize = inNumberFrames * 2;
NSLog(@"%ld", inNumberFrames);
buffer.mData = malloc( inNumberFrames * 2 );
AudioBufferList bufferList;
bufferList.mNumberBuffers = 1;
bufferList.mBuffers[0] = buffer;
OSStatus status;
status = AudioUnitRender(audioUnit,
ioActionFlags,
inTimeStamp,
inBusNumber,
inNumberFrames,
&bufferList);
checkStatus(status); // here is the warning + error
double *q = (double *)(&bufferList)->mBuffers[0].mData;
for(int i=0; i < strlen((const char *)(&bufferList)->mBuffers[0].mData); i++)
{
NSLog(@"%f", q[i]);
}
}
And the reading method:
-(void)startListeningWithFrequency:(float)freq;
{
OSStatus status;
AudioComponentDescription desc;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_RemoteIO;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);
status = AudioComponentInstanceNew( inputComponent, &audioUnit);
checkStatus(status);
UInt32 flag = 1;
status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input,kInputBus, &flag, sizeof(flag));
checkStatus(status);
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate = 44100.00;//44100.00;
audioFormat.mFormatID = kAudioFormatLinearPCM;
audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel = 16;
audioFormat.mBytesPerPacket = 2;
audioFormat.mBytesPerFrame = 2;
status = AudioUnitSetProperty(audioUnit,
kAudioUnitProperty_StreamFormat,
kAudioUnitScope_Output,
kInputBus,
&audioFormat,
sizeof(audioFormat));
checkStatus(status);
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = recordingCallback;
callbackStruct.inputProcRefCon = self;
status = AudioUnitSetProperty(audioUnit,
kAudioOutputUnitProperty_SetInputCallback,
kAudioUnitScope_Global,
kInputBus, &callbackStruct, sizeof(callbackStruct));
checkStatus(status);
status = AudioOutputUnitStart(audioUnit);
}
And what I get is this error and warning:
Undefined symbols for architecture i386:
"_checkStatus", referenced from:
_recordingCallback in remoteIO.o
-[remoteIO startListeningWithFrequency:] in remoteIO.o
ld: symbol(s) not found for architecture i386
collect2: ld returned 1 exit status
What's wrong here?
Thanks.

You have to write your own checkStatus() function, because what it does (e.g. how it reports an error: dialog box, console output, analytics logging, crash dump, etc.), or whether it does anything at all other than return to the audio code, is specific to each app.
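For example, here is a minimal sketch of such a function. The name matches the calls in the question; the NSLog-based reporting is just one possible choice:
static void checkStatus(OSStatus status)
{
    if (status != noErr) {
        // Report and carry on; a real app might assert, show an alert, or abort.
        NSLog(@"OSStatus error: %d", (int)status);
    }
}
Defining it as static above the callback in the same file (or declaring it in a header and linking its implementation) makes the "Undefined symbols" error go away.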

Related

CMSampleBuffer from ReplayKit after encoding to MPEG-TS: video is normal, audio is corrupted

I am trying to stream AAC-encoded audio data received as a CMSampleBuffer. AudioConverterFillComplexBuffer returns a 0 status code.
But after passing this data to my FFMPEG HLSWriter, the audio is not saved correctly (short, truncated signals).
Below is the sample code.
static OSStatus inInputDataProc(AudioConverterRef inAudioConverter,
UInt32 *ioNumberDataPackets,
AudioBufferList *ioData,
AudioStreamPacketDescription **outDataPacketDescription,
void *inUserData)
{
KFAACEncoder *encoder = (__bridge KFAACEncoder *)(inUserData);
UInt32 requestedPackets = *ioNumberDataPackets;
if(requestedPackets > encoder.cycledBuffer.size / 2)
{
//NSLog(@"PCM buffer isn't full enough!");
*ioNumberDataPackets = 0;
return -1;
}
static size_t staticBuffSize = 4096;
static void* staticBuff = nil;
if(!staticBuff)
{
staticBuff = malloc(staticBuffSize);
}
size_t outputBytesSize = requestedPackets * 2;
[encoder.cycledBuffer popToBuffer:staticBuff bytes: outputBytesSize];
ioData->mBuffers[0].mData = staticBuff;
ioData->mBuffers[0].mDataByteSize = (int)outputBytesSize;
*ioNumberDataPackets = ioData->mBuffers[0].mDataByteSize / 2;
return noErr;
}
- (void) encodeSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
CFRetain(sampleBuffer);
dispatch_async(self.encoderQueue,
^{
if (!_audioConverter)
{
[self setupAACEncoderFromSampleBuffer:sampleBuffer];
}
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CFRetain(blockBuffer);
size_t pcmBufferSize = 0;
void* pcmBuffer = nil;
OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &pcmBufferSize, &pcmBuffer);
[_cycledBuffer push:pcmBuffer size:pcmBufferSize];
NSError *error = nil;
if (status != kCMBlockBufferNoErr)
{
error = [NSError errorWithDomain:NSOSStatusErrorDomain code:status userInfo:nil];
}
memset(_aacBuffer, 0, _aacBufferSize);
AudioBufferList outAudioBufferList = {0};
outAudioBufferList.mNumberBuffers = 1;
outAudioBufferList.mBuffers[0].mNumberChannels = 1;
outAudioBufferList.mBuffers[0].mDataByteSize = _aacBufferSize;
outAudioBufferList.mBuffers[0].mData = _aacBuffer;
AudioStreamPacketDescription *outPacketDescription = NULL;
UInt32 ioOutputDataPacketSize = 1;
status = AudioConverterFillComplexBuffer(_audioConverter,
inInputDataProc,
(__bridge void *)(self),
&ioOutputDataPacketSize,
&outAudioBufferList,
NULL);
NSData *data = nil;
if (status == 0)
{
NSData *rawAAC = [NSData dataWithBytes:outAudioBufferList.mBuffers[0].mData length:outAudioBufferList.mBuffers[0].mDataByteSize];
if (_addADTSHeader) {
NSData *adtsHeader = [self adtsDataForPacketLength:rawAAC.length];
NSMutableData *fullData = [NSMutableData dataWithData:adtsHeader];
[fullData appendData:rawAAC];
data = fullData;
} else {
data = rawAAC;
}
} else {
error = [NSError errorWithDomain:NSOSStatusErrorDomain code:status userInfo:nil];
}
if (self.delegate) {
KFFrame *frame = [[KFFrame alloc] initWithData:data pts:pts];
NSLog(@"Bytes of data %lu", (unsigned long)data.length);
dispatch_async(self.callbackQueue, ^{
[self.delegate encoder:self encodedFrame:frame];
});
}
CFRelease(sampleBuffer);
CFRelease(blockBuffer);
});
}

How to play a PCM audio buffer from a socket server using an audio unit and a circular buffer

I hope someone can help me. I am new to Objective-C and OS X, and I am trying to play audio data I am receiving via a socket into my audio queue. I found this link https://stackoverflow.com/a/30318859/4274654, which in a way addresses my issue with a circular buffer.
However, when I try to run my project it returns an error (OSStatus) -10865, which is why the code logs "Error enabling AudioUnit output bus" at this line:
status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &one, sizeof(one));
Here is my code:
Test.h
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import "TPCircularBuffer.h"
@interface Test : Communicator
@property (nonatomic) AudioComponentInstance audioUnit;
@property (nonatomic) TPCircularBuffer circularBuffer;
-(TPCircularBuffer *) outputShouldUseCircularBuffer;
-(void) start;
@end
Test.m
#import "Test.h"
#define kOutputBus 0
#define kInputBus 1
@implementation Test {
BOOL stopped;
}
static OSStatus OutputRenderCallback(void *inRefCon,
AudioUnitRenderActionFlags *ioActionFlags,
const AudioTimeStamp *inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList *ioData){
Test *output = (__bridge Test*)inRefCon;
TPCircularBuffer *circularBuffer = [output outputShouldUseCircularBuffer];
if( !circularBuffer ){
SInt32 *left = (SInt32*)ioData->mBuffers[0].mData;
for(int i = 0; i < inNumberFrames; i++ ){
left[ i ] = 0.0f;
}
return noErr;
};
int32_t bytesToCopy = ioData->mBuffers[0].mDataByteSize;
SInt16* outputBuffer = ioData->mBuffers[0].mData;
uint32_t availableBytes;
SInt16 *sourceBuffer = TPCircularBufferTail(circularBuffer, &availableBytes);
int32_t amount = MIN(bytesToCopy,availableBytes);
memcpy(outputBuffer, sourceBuffer, amount);
TPCircularBufferConsume(circularBuffer,amount);
return noErr;
}
-(void) start
{
[self circularBuffer:&_circularBuffer withSize:24576*5];
stopped = NO;
[self setupAudioUnit];
// [super setup:@"http://localhost" port:5321];
}
-(void) setupAudioUnit
{
AudioComponentDescription desc;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
AudioComponent comp = AudioComponentFindNext(NULL, &desc);
OSStatus status;
status = AudioComponentInstanceNew(comp, &_audioUnit);
if(status != noErr)
{
NSLog(@"Error creating AudioUnit instance");
}
// Enable input and output on AURemoteIO
// Input is enabled on the input scope of the input element
// Output is enabled on the output scope of the output element
UInt32 one = 1;
status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &one, sizeof(one));
if(status != noErr)
{
NSLog(@"Error enabling AudioUnit output bus");
}
// Explicitly set the input and output client formats
// sample rate = 44100, num channels = 1, format = 16-bit signed integer
AudioStreamBasicDescription audioFormat = [self getAudioDescription];
status = AudioUnitSetProperty(_audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, kOutputBus, &audioFormat, sizeof(audioFormat));
if(status != noErr)
{
NSLog(@"Error setting audio format");
}
AURenderCallbackStruct renderCallback;
renderCallback.inputProc = OutputRenderCallback;
renderCallback.inputProcRefCon = (__bridge void *)(self);
status = AudioUnitSetProperty(_audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, kOutputBus, &renderCallback, sizeof(renderCallback));
if(status != noErr)
{
NSLog(@"Error setting rendering callback");
}
// Initialize the AURemoteIO instance
status = AudioUnitInitialize(_audioUnit);
if(status != noErr)
{
NSLog(@"Error initializing audio unit");
}
}
- (AudioStreamBasicDescription)getAudioDescription {
AudioStreamBasicDescription audioDescription = {0};
audioDescription.mFormatID = kAudioFormatLinearPCM;
audioDescription.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked | kAudioFormatFlagsNativeEndian;
audioDescription.mChannelsPerFrame = 1;
audioDescription.mBytesPerPacket = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
audioDescription.mFramesPerPacket = 1;
audioDescription.mBytesPerFrame = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
audioDescription.mBitsPerChannel = 8 * sizeof(SInt16);
audioDescription.mSampleRate = 44100.0;
return audioDescription;
}
-(void)circularBuffer:(TPCircularBuffer *)circularBuffer withSize:(int)size {
TPCircularBufferInit(circularBuffer,size);
}
-(void)appendDataToCircularBuffer:(TPCircularBuffer*)circularBuffer
fromAudioBufferList:(AudioBufferList*)audioBufferList {
TPCircularBufferProduceBytes(circularBuffer,
audioBufferList->mBuffers[0].mData,
audioBufferList->mBuffers[0].mDataByteSize);
}
-(void)freeCircularBuffer:(TPCircularBuffer *)circularBuffer {
TPCircularBufferClear(circularBuffer);
TPCircularBufferCleanup(circularBuffer);
}
-(TPCircularBuffer *) outputShouldUseCircularBuffer
{
return &_circularBuffer;
}
-(void) stop
{
OSStatus status = AudioOutputUnitStop(_audioUnit);
if(status != noErr)
{
NSLog(@"Error stopping audio unit");
}
TPCircularBufferClear(&_circularBuffer);
_audioUnit = nil;
stopped = YES;
}
-(void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)event{
switch (event) {
case NSStreamEventOpenCompleted:
NSLog(@"Stream opened");
break;
case NSStreamEventHasBytesAvailable:
if (stream == [super inputStream]) {
NSLog(@"NSStreamEventHasBytesAvailable");
uint8_t buffer[1024];
NSUInteger len;
while ([[super inputStream] hasBytesAvailable]) {
len = [[super inputStream] read:buffer maxLength:sizeof(buffer)];
if (len > 0) {
//converting buffer to byte data
NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
if (nil != output) {
//NSLog(@"server overideddddd said: %@", output);
}
NSData *data0 = [[NSData alloc] initWithBytes:buffer length:len];
if (nil != data0) {
SInt16* byteData = (SInt16*)malloc(len);
memcpy(byteData, [data0 bytes], len);
double sum = 0.0;
for(int i = 0; i < len/2; i++) {
sum += byteData[i] * byteData[i];
}
Byte* soundData = (Byte*)malloc(len);
memcpy(soundData, [data0 bytes], len);
if(soundData)
{
AudioBufferList *theDataBuffer = (AudioBufferList*) malloc(sizeof(AudioBufferList) *1);
theDataBuffer->mNumberBuffers = 1;
theDataBuffer->mBuffers[0].mDataByteSize = (UInt32)len;
theDataBuffer->mBuffers[0].mNumberChannels = 1;
theDataBuffer->mBuffers[0].mData = (SInt16*)soundData;
NSLog(@"soundData here");
[self appendDataToCircularBuffer:&_circularBuffer fromAudioBufferList:theDataBuffer];
}
}
}
}
}
break;
case NSStreamEventErrorOccurred:
NSLog(@"Can't connect to server");
break;
case NSStreamEventEndEncountered:
[stream close];
[stream removeFromRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
break;
default:
NSLog(@"Unknown event");
}
[super stream:stream handleEvent:event];
}
@end
I would highly appreciate an example of playing buffers returned from a socket server through an audio unit, so that I can listen to the sound as it comes in from the socket server.
Thanks
Your code seems to be asking for a kAudioUnitSubType_VoiceProcessingIO audio unit. But kAudioUnitSubType_RemoteIO would be a more suitable iOS audio unit for just playing buffers of audio samples.
Also, your code does not seem to first select an appropriate audio session category and activate it before playing audio. See Apple's documentation for doing this: https://developer.apple.com/library/content/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/Introduction/Introduction.html
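For reference, here is a minimal sketch of activating a playback session before starting the unit (AVAudioSession on iOS; the Playback category is an assumption based on the output-only use case, and you would #import <AVFoundation/AVFoundation.h>):
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
// Output-only category; pick PlayAndRecord instead if you also capture input.
[session setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[session setActive:YES error:&sessionError];
if (sessionError != nil) {
    NSLog(@"Error activating audio session: %@", sessionError);
}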

Sprite Collision With Three Different Sprites?

I've created three sprites that work as walls in my scene. Then I have a sprite and a score. I want the sprite to set the score to 0 only when it touches the floor (one of the three sprites). This is what I have for the contact handling:
- (void)didBeginContact:(SKPhysicsContact *)contact
{
SKPhysicsBody *firstBody, *secondBody;
if (contact.bodyA.categoryBitMask < contact.bodyB.categoryBitMask)
{
firstBody = contact.bodyA;
secondBody = contact.bodyB;
}
else
{
firstBody = contact.bodyB;
secondBody = contact.bodyA;
}
if ((firstBody.categoryBitMask & shipCategory) != 0 &&
(secondBody.categoryBitMask & obstacleCategory) != 0)
{
score = 0;
myLabel.text = [NSString stringWithFormat:@"%i", score];
}
}
Here are the categoryBitMask values:
static const uint32_t shipCategory = 0x1 << 1;
static const uint32_t obstacleCategory = 0x1 << 1;
static const uint32_t wallCategory = 0x1 << 1;
These are the methods for the sprite, the floor, and the walls:
-(SKSpriteNode *)floorNode
{
floorNode = [SKSpriteNode spriteNodeWithImageNamed:@"rectangle.png"];
floorNode.position = CGPointMake(160,100);
floorNode.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:floorNode.size];
floorNode.physicsBody.categoryBitMask = obstacleCategory;
floorNode.physicsBody.contactTestBitMask = shipCategory;
fireNode.physicsBody.usesPreciseCollisionDetection = YES;
fireNode.physicsBody.collisionBitMask = 2;
floorNode.physicsBody.dynamic = NO;
floorNode.zPosition = 1.0;
return floorNode;
}
-(SKSpriteNode *)walldxNode
{
walldxNode = [SKSpriteNode spriteNodeWithImageNamed:@"wall.png"];
walldxNode.position = CGPointMake(30, 568);
walldxNode.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:walldxNode.size];
walldxNode.physicsBody.categoryBitMask = wallCategory;
walldxNode.physicsBody.dynamic = NO;
return walldxNode;
}
-(SKSpriteNode *)wallsxNode
{
wallsxNode = [SKSpriteNode spriteNodeWithImageNamed:@"wall.png"];
wallsxNode.position = CGPointMake(290, 568);
wallsxNode.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:wallsxNode.size];
wallsxNode.physicsBody.categoryBitMask = wallCategory;
wallsxNode.physicsBody.dynamic = NO;
return wallsxNode;
}
-(SKSpriteNode *)fireButtonNode
{
fireNode = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship.png"];
fireNode.position = CGPointMake(160,450);
fireNode.xScale = 0.32;
fireNode.yScale = 0.32;
fireNode.physicsBody = [SKPhysicsBody bodyWithCircleOfRadius: fireNode.size.height/2];
fireNode.physicsBody.categoryBitMask = shipCategory;
fireNode.physicsBody.dynamic = YES;
fireNode.physicsBody.contactTestBitMask = obstacleCategory;
fireNode.physicsBody.collisionBitMask = 2;
fireNode.physicsBody.usesPreciseCollisionDetection = YES;
fireNode.name = @"fireButtonNode"; // how the node is identified later
fireNode.zPosition = 2.0;
return fireNode;
}
The problem is that the sprite also sets the score to 0 when it collides with the other two walls, which have a different categoryBitMask. I don't know what to do.
Your category bitmasks all have the same value. Make them differ like this:
static const uint32_t shipCategory = 0x1 << 1; // this equals 2
static const uint32_t obstacleCategory = 0x1 << 2; // this equals 4
static const uint32_t wallCategory = 0x1 << 3; // this equals 8
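With distinct values, the existing didBeginContact: check matches only the floor. If you also want to react to wall contacts, a sketch of an extra branch could look like this (the wall handling itself is hypothetical):
if ((firstBody.categoryBitMask & shipCategory) != 0 &&
    (secondBody.categoryBitMask & obstacleCategory) != 0)
{
    score = 0; // only the floor resets the score now
    myLabel.text = [NSString stringWithFormat:@"%i", score];
}
else if ((firstBody.categoryBitMask & shipCategory) != 0 &&
         (secondBody.categoryBitMask & wallCategory) != 0)
{
    // Ship touched a side wall; deliberately leave the score alone.
}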
Batalia is correct, but these days you should probably use an enum rather than some crusty old static const. I did it in some code like this and then filed a bug against the sample code for not using Apple's own current best practices:
// These constants are used to define the physics interactions between physics bodies in the scene.
typedef NS_OPTIONS(NSUInteger, RockBusterCollionsMask) {
RBCmissileCategory = 1 << 0,
RBCasteroidCategory = 1 << 1,
RBCshipCategory = 1 << 2
};

Segmentation Fault 11 | CGEventTap application stops processing mouse events after an arbitrary amount of time

The purpose of this application is to run in the background 24/7 and lock the mouse in the center of the screen. It's for use with a series of Flash programs, to simulate joystick-style movement for the mouse. I've already attempted to use other methods built into Cocoa/Quartz to accomplish this, and none of them worked for my purpose, so this is the way I have to do it.
I have been trying to figure out why, after a seemingly random amount of time, this program simply stops restricting the mouse. The program doesn't give an error or anything like that; it just stops working. The force-quit screen DOES say "Not Responding"; however, many of my mouse-modifying scripts, including this one, always read as "Not Responding" and keep functioning.
Here's the code:
code removed, check below for updated code.
Final Update
Ken Thomases gave me the right answer, I've updated my code based on his suggestions.
Here's the final code that I've gotten to work flawlessly (this ran for 12+ hours without a hitch before I manually stopped it):
#import <Cocoa/Cocoa.h>
#import <CoreMedia/CoreMedia.h>
int screen_width, screen_height;
struct event_tap_data_struct {
CFMachPortRef event_tap;
float speed_modifier;
};
CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *);
int
screen_res(int);
int
main(int argc, char *argv[]) {
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
screen_width = screen_res(0);
screen_height = screen_res(1);
CFRunLoopSourceRef runLoopSource;
CGEventMask event_mask = kCGEventMaskForAllEvents;
CGSetLocalEventsSuppressionInterval(0);
CFMachPortRef eventTap;
struct event_tap_data_struct event_tap_data = {eventTap,0.2};
eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0, event_mask, mouse_filter, &event_tap_data);
event_tap_data.event_tap = eventTap;
if (!eventTap) {
NSLog(@"Couldn't create event tap!");
exit(1);
}
runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, event_tap_data.event_tap, 0);
CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
CGEventTapEnable(event_tap_data.event_tap, true);
CFRunLoopRun();
CFRelease(eventTap);
CFRelease(runLoopSource);
[pool release];
exit(0);
}
int
screen_res(int width_or_height) {
NSRect screenRect;
NSArray *screenArray = [NSScreen screens];
unsigned screenCount = (unsigned)[screenArray count];
for (unsigned index = 0; index < screenCount; index++)
{
NSScreen *screen = [screenArray objectAtIndex: index];
screenRect = [screen visibleFrame];
}
int resolution_array[] = {(int)CGDisplayPixelsWide(CGMainDisplayID()),(int)CGDisplayPixelsHigh(CGMainDisplayID())};
if(width_or_height==0){
return resolution_array[0];
}else {
return resolution_array[1];
}
}
CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *event_tap_data) {
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
if (type == kCGEventTapDisabledByTimeout || type == kCGEventTapDisabledByUserInput) {
CGEventTapEnable(event_tap_data->event_tap,true);
return event;
} else if (type == kCGEventMouseMoved || type == kCGEventLeftMouseDragged || type == kCGEventRightMouseDragged || type == kCGEventOtherMouseDragged){
CGPoint point = CGEventGetLocation(event);
NSPoint old_point;
CGPoint target;
int tX = point.x;
int tY = point.y;
float oX = screen_width/2;
float oY = screen_height/2;
float dX = tX-oX;
float dY = tY-oY;
old_point.x = floor(oX); old_point.y = floor(oY);
dX*=2, dY*=2;
tX = round(oX + dX);
tY = round(oY + dY);
target = CGPointMake(tX, tY);
CGWarpMouseCursorPosition(old_point);
CGEventSetLocation(event,target);
}
[pool release];
return event;
}
(first) Update:
The program is still crashing, but I have now run it as an executable and received an error code.
When it terminates, the console logs "Segmentation Fault: 11".
I've been trying to discover what this means; however, it appears to be an impressively broad term, and I've yet to home in on something useful.
Here is the new code I am using:
#import <Cocoa/Cocoa.h>
#import <CoreMedia/CoreMedia.h>
int screen_width, screen_height;
struct event_tap_data_struct {
CFMachPortRef event_tap;
float speed_modifier;
};
CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *);
int
screen_res(int);
int
main(int argc, char *argv[]) {
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
screen_width = screen_res(0);
screen_height = screen_res(1);
CFRunLoopSourceRef runLoopSource;
CGEventMask event_mask;
event_mask = CGEventMaskBit(kCGEventMouseMoved) | CGEventMaskBit(kCGEventLeftMouseDragged) | CGEventMaskBit(kCGEventRightMouseDragged) | CGEventMaskBit(kCGEventOtherMouseDragged);
CGSetLocalEventsSuppressionInterval(0);
CFMachPortRef eventTap;
CFMachPortRef *eventTapPtr = &eventTap;
struct event_tap_data_struct event_tap_data = {*eventTapPtr,0.2};
eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0, event_mask, mouse_filter, &event_tap_data);
if (!eventTap) {
NSLog(@"Couldn't create event tap!");
exit(1);
}
runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0);
CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
CGEventTapEnable(eventTap, true);
CFRunLoopRun();
CFRelease(eventTap);
CFRelease(runLoopSource);
[pool release];
exit(0);
}
int
screen_res(int width_or_height) {
NSRect screenRect;
NSArray *screenArray = [NSScreen screens];
unsigned screenCount = (unsigned)[screenArray count];
for (unsigned index = 0; index < screenCount; index++)
{
NSScreen *screen = [screenArray objectAtIndex: index];
screenRect = [screen visibleFrame];
}
int resolution_array[] = {(int)CGDisplayPixelsWide(CGMainDisplayID()),(int)CGDisplayPixelsHigh(CGMainDisplayID())};
if(width_or_height==0){
return resolution_array[0];
}else {
return resolution_array[1];
}
}
CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *event_tap_data) {
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
if (type == kCGEventTapDisabledByTimeout || type == kCGEventTapDisabledByUserInput) {
CGEventTapEnable(event_tap_data->event_tap,true);
}
CGPoint point = CGEventGetLocation(event);
NSPoint old_point;
CGPoint target;
int tX = point.x;
int tY = point.y;
float oX = screen_width/2;
float oY = screen_height/2;
float dX = tX-oX;
float dY = tY-oY;
old_point.x = floor(oX); old_point.y = floor(oY);
dX*=2, dY*=2;
tX = round(oX + dX);
tY = round(oY + dY);
target = CGPointMake(tX, tY);
CGWarpMouseCursorPosition(old_point);
CGEventSetLocation(event,target);
[pool release];
return event;
}
You need to re-enable your event tap when it receives kCGEventTapDisabledByTimeout or kCGEventTapDisabledByUserInput.
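The guard at the top of mouse_filter in the final working code above does exactly that:
if (type == kCGEventTapDisabledByTimeout || type == kCGEventTapDisabledByUserInput) {
    // The system disabled the tap (e.g. the callback took too long); turn it back on.
    CGEventTapEnable(event_tap_data->event_tap, true);
    return event;
}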
Update: here are your lines and how they're (failing to) work:
CFMachPortRef eventTap; // uninitialized value
CFMachPortRef *eventTapPtr = &eventTap; // pointer to eventTap
struct event_tap_data_struct event_tap_data = {*eventTapPtr,0.2}; // dereferences pointer, copying uninitialized value into struct
eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0, event_mask, mouse_filter, &event_tap_data); // sets eventTap but has no effect on event_tap_data
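A sketch of the corrected setup, matching the final working code above: create the tap first, then copy the real CFMachPortRef into the struct that the callback receives:
CFMachPortRef eventTap;
struct event_tap_data_struct event_tap_data = {NULL, 0.2}; // no valid tap yet
eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0, event_mask, mouse_filter, &event_tap_data);
event_tap_data.event_tap = eventTap; // now the callback can re-enable the actual tap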

Generating Morse code style tones in Objective-C

I have a class that will let me play a tone using audio units. What I would like to be able to do is have the class play Morse-code style when I send it a phrase or letter.
How would I go about this? I'm hoping someone can point me in the right direction. I have included the tone generator .h and .m files below.
//
// Singer.h
// musiculesdev
//
// Created by Dylan on 2/20/09.
// Copyright 2009 __MyCompanyName__. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <AudioUnit/AudioUnit.h>
@interface Singer : NSObject {
AudioComponentInstance audioUnit;
}
-(void)initAudio; // put this in init?
-(void)start;
-(void)stop;
-(IBAction)turnOnSound:(id)sender;
@end
//
// Singer.m
// musiculesdev
//
// Created by Dylan on 2/20/09.
// Copyright 2009 __MyCompanyName__. All rights reserved.
//
#import <AudioUnit/AudioUnit.h>
#import <math.h>
#import "Singer.h"
#define kOutputBus 0
#define kSampleRate 44100
//44100.0f
#define kWaveform (M_PI * 2.0f / kSampleRate)
@implementation Singer
OSStatus playbackCallback(void *inRefCon,
AudioUnitRenderActionFlags *ioActionFlags,
const AudioTimeStamp *inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList *ioData) {
//Singer *me = (Singer *)inRefCon;
static int phase = 0;
for(UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
int samples = ioData->mBuffers[i].mDataByteSize / sizeof(SInt16);
SInt16 values[samples];
float waves;
for(int j = 0; j < samples; j++) {
waves = 0;
waves += sin(kWaveform * 261.63f * phase);
waves += sin(kWaveform * 120.0f * phase);
waves += sin(kWaveform * 1760.3f * phase);
waves += sin(kWaveform * 880.0f * phase);
waves *= 32500 / 4; // <--------- make sure to divide by how many waves you're stacking
values[j] = (SInt16)waves;
values[j] += values[j]<<16;
phase++;
}
memcpy(ioData->mBuffers[i].mData, values, samples * sizeof(SInt16));
}
return noErr;
}
-(IBAction)turnOnSound:(id)sender {
Singer *singer = [[Singer alloc] init];
[singer start];
}
-(id)init {
NSLog(@"In the singer init!!");
if(self = [super init]) {
[self initAudio];
}
return self;
}
-(void)initAudio {
OSStatus status;
AudioComponentDescription desc;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_RemoteIO;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
AudioComponent outputComponent = AudioComponentFindNext(NULL, &desc);
status = AudioComponentInstanceNew(outputComponent, &audioUnit);
UInt32 flag = 1;
status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &flag, sizeof(flag));
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate = kSampleRate;
audioFormat.mFormatID = kAudioFormatLinearPCM;
audioFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel = 16;
audioFormat.mBytesPerPacket = 2;
audioFormat.mBytesPerFrame = 2;
status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, kOutputBus, &audioFormat, sizeof(audioFormat));
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = playbackCallback;
callbackStruct.inputProcRefCon = self;
status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, kOutputBus, &callbackStruct, sizeof(callbackStruct));
status = AudioUnitInitialize(audioUnit);
}
-(void)start {
OSStatus status;
status = AudioOutputUnitStart(audioUnit);
}
-(void)stop {
OSStatus status;
status = AudioOutputUnitStop(audioUnit);
}
-(void)dealloc {
AudioUnitUninitialize(audioUnit);
[super dealloc];
}
@end
You need to be able to generate tones of a specific duration, separated by silences of a specific duration. As long as you have these two building blocks you can send Morse code:
dot = 1 unit
dash = 3 units
space between dots/dashes within a letter = 1 unit
space between letters = 3 units
space between words = 7 units
The length of a unit determines the overall speed of the Morse code. Start with e.g. 50 ms.
The tone should just be a pure sine wave at an appropriate frequency, e.g. 400 Hz. The silence can just be an alternate buffer containing all zeroes. That way you can "play" both the tone and the silence using the same API, without worrying about timing/synchronisation, etc.
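Building on that, here is a simplified sketch of the keying layer, driving the Singer class's -start and -stop methods from the question. The lookup table is deliberately partial, the 50 ms unit is just the suggested starting point, and the blocking sleeps only keep the example short; a production version would schedule the on/off times or fill silence buffers inside the render callback, as described above:
// Assumed helper: maps a character to its dot/dash pattern.
// Only a few letters are shown; a real table would cover A-Z and 0-9.
static NSString *patternForCharacter(unichar c) {
    static NSDictionary *table = nil;
    if (table == nil) {
        table = [[NSDictionary alloc] initWithObjectsAndKeys:
                 @".", @"e", @"-", @"t", @"...", @"s", @"---", @"o", nil];
    }
    NSString *key = [[NSString stringWithCharacters:&c length:1] lowercaseString];
    return [table objectForKey:key];
}
// Key out one word using the timing units listed above.
// unitSeconds is the base unit, e.g. 0.05 for 50 ms.
- (void)playMorseForWord:(NSString *)word unit:(NSTimeInterval)unitSeconds {
    for (NSUInteger i = 0; i < [word length]; i++) {
        NSString *pattern = patternForCharacter([word characterAtIndex:i]);
        if (pattern == nil) continue; // skip characters the partial table lacks
        for (NSUInteger j = 0; j < [pattern length]; j++) {
            BOOL isDash = ([pattern characterAtIndex:j] == '-');
            [self start]; // tone on: 1 unit for a dot, 3 units for a dash
            [NSThread sleepForTimeInterval:unitSeconds * (isDash ? 3.0 : 1.0)];
            [self stop];  // tone off: 1 unit gap between dots/dashes
            [NSThread sleepForTimeInterval:unitSeconds];
        }
        // Pad the trailing 1-unit gap up to the 3-unit gap between letters.
        [NSThread sleepForTimeInterval:unitSeconds * 2.0];
    }
}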