How to stop monitoring file paths - Objective-C

I am currently monitoring multiple file paths from an array in Objective-C using this method:
- (void)monitorPath:(NSString *)path {
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    int fildes = open([path UTF8String], O_EVTONLY);
    __block typeof(self) blockSelf = self;
    __block dispatch_source_t source = dispatch_source_create(DISPATCH_SOURCE_TYPE_VNODE, fildes,
        DISPATCH_VNODE_DELETE | DISPATCH_VNODE_WRITE | DISPATCH_VNODE_RENAME, queue);
    dispatch_source_set_event_handler(source, ^{
        unsigned long flags = dispatch_source_get_data(source);
        NSLog(@"Path altered");
    });
    dispatch_source_set_cancel_handler(source, ^(void) {
        close(fildes);
    });
    dispatch_resume(source);
}
Is there a method to stop all of the file paths from being monitored?
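Each dispatch source has to be cancelled explicitly, and the method above discards its `source` reference as soon as it returns, so nothing can cancel it later. A minimal sketch (the `monitoredSources` mutable array is an assumed ivar, not part of the original code) keeps every source around and cancels them all at once:

```objc
// Assumed ivar, e.g.: NSMutableArray *monitoredSources;
// In monitorPath:, after dispatch_resume(source), store the source:
//     [monitoredSources addObject:source];

- (void)stopMonitoringAllPaths {
    for (dispatch_source_t source in monitoredSources) {
        // Cancelling runs the cancel handler, which closes the descriptor.
        dispatch_source_cancel(source);
    }
    [monitoredSources removeAllObjects];
}
```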


CHCSV importer memory issue on NSBlockOperation

I'm on Xcode 8.2, Objective-C, macOS (not iOS), ARC enabled.
I'm importing a large CSV file in a custom NSBlockOperation with Dave DeLong's CHCSV parser. My problem: the memory is not freed after the operation is done. The same thing happens even if I do not save the parsed NSArray at all.
Here is the essential part of the code:
SHImportDataOperation.m (custom NSBlockOperation)
// CHCSV parser delegate
@interface Delegate : NSObject <CHCSVParserDelegate>
@property (readonly) NSArray *lines; // <-- parsing result is stored here
@property (readonly) NSString *errorMessage;
@property unsigned long long filesize;
@property SHGlobalAppData *global;
@end
@implementation SHImportDataOperation
@synthesize finished = _finished;
@synthesize executing = _executing;

- (void)main
{
    // ###############################
    // UI stuff
    dispatch_async(dispatch_get_main_queue(), ^{
        // Hide controls
        [_global.dataFile setFileDropped:NO]; // _global.dataFile is an NSObject with relevant file information
    });
    // ###############################
    // Import CSV
    if (![self importCSV:_global.dataFile])
    {
        [self breakImportData]; // Cleanup
        return;
    }
    // ###############################
    // Finishing Import Data
    // UI stuff
    dispatch_async(dispatch_get_main_queue(), ^{
        // Show controls
        [_global.dataFile setFileDropped:YES];
        // Show OK
        [_global.dataFile setImported:YES];
        // Cleanup
        [self finishOperation];
    });
}
// ################################################
// cleanup
- (void)finishOperation
{
    [self willChangeValueForKey:@"isFinished"];
    [self willChangeValueForKey:@"isExecuting"];
    _executing = NO;
    _finished = YES;
    [self didChangeValueForKey:@"isExecuting"];
    [self didChangeValueForKey:@"isFinished"];
}
// ################################################
- (BOOL)importCSV:(SHDataModelData *)dataFile
{
    encoding = NSMacOSRomanStringEncoding;
    NSString *delimiter = @",";
    unichar delimiterUnichar = [delimiter characterAtIndex:0];
    // ###############################
    NSInputStream *stream = [NSInputStream inputStreamWithFileAtPath:dataFile.filePath];
    CHCSVParser *p = [[CHCSVParser alloc] initWithInputStream:stream usedEncoding:&encoding delimiter:delimiterUnichar];
    Delegate *d = [[Delegate alloc] init];
    [d setGlobal:_global]; // Reference needed for UI progress bar
    [p setDelegate:d];
    [p setFilepath:dataFile.filePath];
    [p parse];
    //NSLog(@"Result: %@", d.lines);
    // Save imported data
    //[dataFile setImportData:d.lines]; // Even if not saved, memory is not freed after the operation
    // Even if I nil everything, memory is not freed.
    [d setGlobal:nil];
    [p setDelegate:nil];
    d = nil;
    p = nil;
    return true;
}
And this is how the operation is started:
NSOperationQueue* operationImportQueue = [NSOperationQueue new];
SHImportLayoutOperation *importLayoutOperation = [[SHImportLayoutOperation alloc] init];
[importLayoutOperation setGlobal:[self theAppDataObject]];
SHImportDataOperation *importDataOperation = [[SHImportDataOperation alloc] init];
[importDataOperation setGlobal:[self theAppDataObject]];
NSBlockOperation *importDoneCompletionOperation = [NSBlockOperation blockOperationWithBlock:^{
[self continueWithStuff];
}];
[importDoneCompletionOperation addDependency:importDataOperation];
[importDoneCompletionOperation addDependency:importLayoutOperation];
[operationImportQueue addOperation:importDoneCompletionOperation];
[operationImportQueue addOperation:importDataOperation];
[operationImportQueue addOperation:importLayoutOperation];
Edit 1:
After further testing I can confirm that the NSBlockOperation is successfully performed and removed from memory.
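A common cause of this symptom is autoreleased objects accumulating during a long parse: objects autoreleased on an operation queue's worker thread are only released when that thread's pool drains, which can be much later than the end of the operation. As a first step (a sketch, not a confirmed fix for CHCSV specifically), the parsing work can be wrapped in an explicit @autoreleasepool so everything autoreleased during the parse is released as soon as the block exits:

```objc
- (BOOL)importCSV:(SHDataModelData *)dataFile
{
    @autoreleasepool {
        // Create the stream, parser, and delegate, then run [p parse],
        // exactly as in the method above. Everything autoreleased
        // while parsing is released when this block exits, instead of
        // lingering on the worker thread's pool.
    }
    return true;
}
```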

FSEvents- Detect type of event on the folder

Any idea how to detect the type of event (FSEvent) raised on the folder in the callback method (the gotEvent method in the code below)? E.g. file renamed, file created? I want to perform an operation only for file-renamed and file-created events and ignore the other events.
I have the implementation below:
- (FSEventStreamRef)eventStreamForFileAtPath:(NSString *)fileInputPath {
    if (![[NSFileManager defaultManager] fileExistsAtPath:fileInputPath]) {
        @throw [NSException exceptionWithName:@"FileNotFoundException"
                                       reason:@"There is no file at the path specified in fileInputPath"
                                     userInfo:nil];
    }
    NSString *fileInputDir = [fileInputPath stringByDeletingLastPathComponent];
    NSArray *pathsToWatch = [NSArray arrayWithObjects:fileInputDir, nil];
    void *appPointer = (__bridge void *)self;
    FSEventStreamContext context = {0, appPointer, NULL, NULL, NULL};
    NSTimeInterval latency = 3.0;
    FSEventStreamRef stream = FSEventStreamCreate(NULL,
                                                  &gotEvent,
                                                  &context,
                                                  (__bridge CFArrayRef)pathsToWatch,
                                                  kFSEventStreamEventIdSinceNow,
                                                  (CFAbsoluteTime)latency,
                                                  kFSEventStreamCreateFlagUseCFTypes);
    FSEventStreamScheduleWithRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);
    FSEventStreamStart(stream);
    return stream;
}

static void gotEvent(ConstFSEventStreamRef stream,
                     void *clientCallBackInfo,
                     size_t numEvents,
                     void *eventPathsVoidPointer,
                     const FSEventStreamEventFlags eventFlags[],
                     const FSEventStreamEventId eventIds[]) {
    NSLog(@"File Changed!");
}
The FSEventStreamEventFlags should indicate what happened, according to Apple:
kFSEventStreamEventFlagItemCreated = 0x00000100,
kFSEventStreamEventFlagItemRemoved = 0x00000200,
kFSEventStreamEventFlagItemRenamed = 0x00000800,
kFSEventStreamEventFlagItemModified = 0x00001000,
Evaluating which flag is set should do exactly what you want.

Why doesn't this Grand Central Dispatch code work?

It's my first piece of Grand Central Dispatch code, but it doesn't work. I'm working on Mac OS X 10.8 and the latest Xcode version. I know it's very basic. Thanks.
#import <Foundation/Foundation.h>
#import <dispatch/dispatch.h>

void printResult(int r);

void printResult(int r)
{
    NSLog(@"%i", r);
}

int main(int argc, const char * argv[])
{
    @autoreleasepool {
        dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
        dispatch_async(queue, ^{
            int number = pow(2, 5);
            dispatch_async(dispatch_get_main_queue(), ^{
                printResult(number);
            });
        });
    }
    return 0;
}
First: your application actually exits before the blocks you passed to GCD can finish.
To fix that, you can use GCD groups and the synchronization tools they give you.
@autoreleasepool {
    dispatch_group_t group = dispatch_group_create();
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    dispatch_group_async(group, queue, ^{
        int number = pow(2, 5);
        dispatch_group_async(group, dispatch_get_main_queue(), ^{
            printResult(number);
        });
    });
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
}
return 0;
Second: here you will encounter another problem, a deadlock. The second block is added to the main thread's queue, but the main thread is blocked waiting for the group to finish, so that block can never execute. Add the second block to the queue you created before instead:
dispatch_group_async(group, queue, ^{
    printResult(number);
});
Now you can see 32 in the console, which is what you expect.
int main(int argc, const char * argv[])
{
    @autoreleasepool {
        dispatch_queue_t queue = dispatch_queue_create("com.myQueue", NULL);
        dispatch_async(queue, ^{
            int number = pow(2, 5);
            dispatch_async(dispatch_get_main_queue(), ^{
                printResult(number);
            });
        });
    }
    [[NSRunLoop currentRunLoop] run];
    return 0;
}
I think you just need a runloop here.
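As an alternative sketch to spinning an NSRunLoop, a command-line tool can hand its main thread over to GCD with `dispatch_main()`, which lets blocks submitted to the main queue run. Note that `dispatch_main()` never returns, so this version exits from inside the final block:

```objc
int main(int argc, const char * argv[])
{
    @autoreleasepool {
        dispatch_queue_t queue = dispatch_queue_create("com.myQueue", NULL);
        dispatch_async(queue, ^{
            int number = pow(2, 5);
            dispatch_async(dispatch_get_main_queue(), ^{
                printResult(number);
                exit(0); // dispatch_main() never returns, so exit here
            });
        });
    }
    dispatch_main(); // park the main thread and drain the main queue
}
```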

Segmentation Fault 11 | CGEventTap application stops processing mouse events after an arbitrary amount of time

The purpose of this application is to run in the background 24/7 and lock the mouse in the center of the screen. It is for use with a series of Flash programs to simulate joystick-style movement for the mouse. I've already attempted to use other methods built into Cocoa/Quartz to accomplish this, and none of them worked for my purpose, so this is the way I have to do it.
I have been trying to figure out why, after a seemingly random amount of time, this program simply stops restricting the mouse. The program doesn't give an error or anything like that; it just stops working. The force-quit screen does say "Not Responding"; however, many of my mouse-modifying scripts, including this one, always read as "Not Responding" and keep functioning.
Here's the code:
code removed, check below for updated code.
Final Update
Ken Thomases gave me the right answer, I've updated my code based on his suggestions.
Here's the final code that I've gotten to work flawlessly (this ran for 12+ hours without a hitch before I manually stopped it):
#import <Cocoa/Cocoa.h>
#import <CoreMedia/CoreMedia.h>

int screen_width, screen_height;

struct event_tap_data_struct {
    CFMachPortRef event_tap;
    float speed_modifier;
};

CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *);

int
screen_res(int);

int
main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    screen_width = screen_res(0);
    screen_height = screen_res(1);
    CFRunLoopSourceRef runLoopSource;
    CGEventMask event_mask = kCGEventMaskForAllEvents;
    CGSetLocalEventsSuppressionInterval(0);
    CFMachPortRef eventTap;
    struct event_tap_data_struct event_tap_data = {eventTap, 0.2};
    eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0, event_mask, mouse_filter, &event_tap_data);
    event_tap_data.event_tap = eventTap;
    if (!eventTap) {
        NSLog(@"Couldn't create event tap!");
        exit(1);
    }
    runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, event_tap_data.event_tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
    CGEventTapEnable(event_tap_data.event_tap, true);
    CFRunLoopRun();
    CFRelease(eventTap);
    CFRelease(runLoopSource);
    [pool release];
    exit(0);
}

int
screen_res(int width_or_height) {
    NSRect screenRect;
    NSArray *screenArray = [NSScreen screens];
    unsigned screenCount = (unsigned)[screenArray count];
    for (unsigned index = 0; index < screenCount; index++)
    {
        NSScreen *screen = [screenArray objectAtIndex:index];
        screenRect = [screen visibleFrame];
    }
    int resolution_array[] = {(int)CGDisplayPixelsWide(CGMainDisplayID()), (int)CGDisplayPixelsHigh(CGMainDisplayID())};
    if (width_or_height == 0) {
        return resolution_array[0];
    } else {
        return resolution_array[1];
    }
}

CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *event_tap_data) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    if (type == kCGEventTapDisabledByTimeout || type == kCGEventTapDisabledByUserInput) {
        CGEventTapEnable(event_tap_data->event_tap, true);
        [pool release]; // drain the pool before the early return
        return event;
    } else if (type == kCGEventMouseMoved || type == kCGEventLeftMouseDragged || type == kCGEventRightMouseDragged || type == kCGEventOtherMouseDragged) {
        CGPoint point = CGEventGetLocation(event);
        NSPoint old_point;
        CGPoint target;
        int tX = point.x;
        int tY = point.y;
        float oX = screen_width / 2;
        float oY = screen_height / 2;
        float dX = tX - oX;
        float dY = tY - oY;
        old_point.x = floor(oX); old_point.y = floor(oY);
        dX *= 2, dY *= 2;
        tX = round(oX + dX);
        tY = round(oY + dY);
        target = CGPointMake(tX, tY);
        CGWarpMouseCursorPosition(old_point);
        CGEventSetLocation(event, target);
    }
    [pool release];
    return event;
}
(first) Update:
The program is still crashing, but I have now run it as an executable and received an error code.
When it terminates, the console logs "Segmentation Fault: 11".
I've been trying to find out what this means, but it appears to be an impressively broad term, and I've yet to home in on anything useful.
Here is the new code I am using:
#import <Cocoa/Cocoa.h>
#import <CoreMedia/CoreMedia.h>

int screen_width, screen_height;

struct event_tap_data_struct {
    CFMachPortRef event_tap;
    float speed_modifier;
};

CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *);

int
screen_res(int);

int
main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    screen_width = screen_res(0);
    screen_height = screen_res(1);
    CFRunLoopSourceRef runLoopSource;
    CGEventMask event_mask;
    event_mask = CGEventMaskBit(kCGEventMouseMoved) | CGEventMaskBit(kCGEventLeftMouseDragged) | CGEventMaskBit(kCGEventRightMouseDragged) | CGEventMaskBit(kCGEventOtherMouseDragged);
    CGSetLocalEventsSuppressionInterval(0);
    CFMachPortRef eventTap;
    CFMachPortRef *eventTapPtr = &eventTap;
    struct event_tap_data_struct event_tap_data = {*eventTapPtr, 0.2};
    eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0, event_mask, mouse_filter, &event_tap_data);
    if (!eventTap) {
        NSLog(@"Couldn't create event tap!");
        exit(1);
    }
    runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
    CGEventTapEnable(eventTap, true);
    CFRunLoopRun();
    CFRelease(eventTap);
    CFRelease(runLoopSource);
    [pool release];
    exit(0);
}

int
screen_res(int width_or_height) {
    NSRect screenRect;
    NSArray *screenArray = [NSScreen screens];
    unsigned screenCount = (unsigned)[screenArray count];
    for (unsigned index = 0; index < screenCount; index++)
    {
        NSScreen *screen = [screenArray objectAtIndex:index];
        screenRect = [screen visibleFrame];
    }
    int resolution_array[] = {(int)CGDisplayPixelsWide(CGMainDisplayID()), (int)CGDisplayPixelsHigh(CGMainDisplayID())};
    if (width_or_height == 0) {
        return resolution_array[0];
    } else {
        return resolution_array[1];
    }
}

CGEventRef
mouse_filter(CGEventTapProxy proxy, CGEventType type, CGEventRef event, struct event_tap_data_struct *event_tap_data) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    if (type == kCGEventTapDisabledByTimeout || type == kCGEventTapDisabledByUserInput) {
        CGEventTapEnable(event_tap_data->event_tap, true);
    }
    CGPoint point = CGEventGetLocation(event);
    NSPoint old_point;
    CGPoint target;
    int tX = point.x;
    int tY = point.y;
    float oX = screen_width / 2;
    float oY = screen_height / 2;
    float dX = tX - oX;
    float dY = tY - oY;
    old_point.x = floor(oX); old_point.y = floor(oY);
    dX *= 2, dY *= 2;
    tX = round(oX + dX);
    tY = round(oY + dY);
    target = CGPointMake(tX, tY);
    CGWarpMouseCursorPosition(old_point);
    CGEventSetLocation(event, target);
    [pool release];
    return event;
}
You need to re-enable your event tap when it receives kCGEventTapDisabledByTimeout or kCGEventTapDisabledByUserInput.
Update: here are your lines and how they fail to work:
CFMachPortRef eventTap; // uninitialized value
CFMachPortRef *eventTapPtr = &eventTap; // pointer to eventTap
struct event_tap_data_struct event_tap_data = {*eventTapPtr,0.2}; // dereferences pointer, copying uninitialized value into struct
eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0, event_mask, mouse_filter, &event_tap_data); // sets eventTap but has no effect on event_tap_data
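In other words, the struct captures a copy of `eventTap` while it is still garbage, and the pointer dance through `eventTapPtr` doesn't change that, so the callback later re-enables a junk Mach port. The fix is to fill in the field after `CGEventTapCreate` returns; a minimal sketch:

```objc
CFMachPortRef eventTap;
struct event_tap_data_struct event_tap_data = {NULL, 0.2}; // no valid tap yet
eventTap = CGEventTapCreate(kCGHIDEventTap, kCGHeadInsertEventTap, 0,
                            event_mask, mouse_filter, &event_tap_data);
event_tap_data.event_tap = eventTap; // now the callback re-enables the real tap
```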

Objective-c - How to serialize audio file into small packets that can be played?

So, I would like to take a sound file, convert it into packets, and send it to another computer. I would like the other computer to be able to play the packets as they arrive.
I am using AVAudioPlayer to try to play these packets, but I couldn't find a proper way to serialize the data on peer1 so that peer2 can play it.
The scenario is: peer1 has an audio file, splits it into many small packets, puts them in NSData objects, and sends them to peer2. Peer2 receives the packets and plays them one by one as they arrive.
Does anyone know how to do this? Or even whether it is possible?
EDIT:
Here it is some piece of code to illustrate what I would like to achieve.
// This code is part of peer1, the one who sends the data
- (void)sendData
{
    int packetId = 0;
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"myAudioFile" ofType:@"wav"];
    NSData *soundData = [[NSData alloc] initWithContentsOfFile:soundFilePath];
    NSMutableArray *arraySoundData = [[NSMutableArray alloc] init];
    // Splitting the audio into 2 pieces.
    // This is only an illustration:
    // the idea is to split the data into multiple pieces
    // depending on the size of the file to be sent.
    NSRange soundRange;
    soundRange.length = [soundData length] / 2;
    soundRange.location = 0;
    [arraySoundData addObject:[soundData subdataWithRange:soundRange]];
    soundRange.length = [soundData length] / 2;
    soundRange.location = [soundData length] / 2;
    [arraySoundData addObject:[soundData subdataWithRange:soundRange]];
    for (int i = 0; i < [arraySoundData count]; i++)
    {
        NSData *soundPacket = [arraySoundData objectAtIndex:i];
        if (soundPacket == nil)
        {
            NSLog(@"soundData is nil");
            return;
        }
        NSMutableData *message = [[NSMutableData alloc] init];
        NSKeyedArchiver *archiver = [[NSKeyedArchiver alloc] initForWritingWithMutableData:message];
        [archiver encodeInt:packetId++ forKey:PACKET_ID];
        [archiver encodeObject:soundPacket forKey:PACKET_SOUND_DATA];
        [archiver finishEncoding];
        NSError *error = nil;
        [connectionManager sendMessage:message error:&error];
        if (error) NSLog(@"send greeting failed: %@", [error localizedDescription]);
        [message release];
        [archiver release];
    }
    [soundData release];
    [arraySoundData release];
}
// This is the code on peer2 that receives and plays the piece of audio in each packet
- (void)receiveData:(NSData *)data
{
    NSKeyedUnarchiver *unarchiver = [[NSKeyedUnarchiver alloc] initForReadingWithData:data];
    if ([unarchiver containsValueForKey:PACKET_ID])
        NSLog(@"DECODED PACKET_ID: %i", [unarchiver decodeIntForKey:PACKET_ID]);
    if ([unarchiver containsValueForKey:PACKET_SOUND_DATA])
    {
        NSLog(@"DECODED sound");
        NSData *sound = (NSData *)[unarchiver decodeObjectForKey:PACKET_SOUND_DATA];
        if (sound == nil)
        {
            NSLog(@"sound is nil!");
        }
        else
        {
            NSLog(@"sound is not nil!");
            AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithData:sound error:nil];
            if (audioPlayer)
            {
                [audioPlayer prepareToPlay];
                [audioPlayer play];
            } else {
                NSLog(@"Player couldn't load data");
            }
        }
    }
    [unarchiver release];
}
So, that is what I am trying to achieve. What I really need to know is how to create the packets so that peer2 can play the audio.
It would be a kind of streaming. For now I am not worried about the order in which the packets are received or played; I only need the sound sliced up so that each piece, each slice, can be played without waiting for the whole file to be received by peer2.
Thanks!
Here is the simplest class to play files with AQ (Audio Queue Services).
Note that you can play from any point (just set currentPacketNumber):
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface AudioFile : NSObject {
    AudioFileID fileID; // the identifier for the audio file to play
    AudioStreamBasicDescription format;
    UInt64 packetsCount;
    UInt32 maxPacketSize;
}

@property (readwrite) AudioFileID fileID;
@property (readwrite) AudioStreamBasicDescription format;
@property (readwrite) UInt64 packetsCount;
@property (readwrite) UInt32 maxPacketSize;

- (id)initWithURL:(CFURLRef)url;
- (AudioStreamBasicDescription *)audioFormatRef;
@end

// AudioFile.m
#import "AudioFile.h"

@implementation AudioFile

@synthesize fileID;
@synthesize format;
@synthesize maxPacketSize;
@synthesize packetsCount;

- (id)initWithURL:(CFURLRef)url {
    if (self = [super init]) {
        AudioFileOpenURL(url,
                         0x01, // fsRdPerm, read only
                         0,    // no hint
                         &fileID);
        UInt32 sizeOfPlaybackFormatASBDStruct = sizeof format;
        AudioFileGetProperty(fileID,
                             kAudioFilePropertyDataFormat,
                             &sizeOfPlaybackFormatASBDStruct,
                             &format);
        UInt32 propertySize = sizeof(maxPacketSize);
        AudioFileGetProperty(fileID,
                             kAudioFilePropertyMaximumPacketSize,
                             &propertySize,
                             &maxPacketSize);
        propertySize = sizeof(packetsCount);
        AudioFileGetProperty(fileID, kAudioFilePropertyAudioDataPacketCount, &propertySize, &packetsCount);
    }
    return self;
}

- (AudioStreamBasicDescription *)audioFormatRef {
    return &format;
}

- (void)dealloc {
    AudioFileClose(fileID);
    [super dealloc];
}

@end
// AQPlayer.h
#import <Foundation/Foundation.h>
#import "AudioFile.h"

#define AUDIOBUFFERS_NUMBER 3
#define MAX_PACKET_COUNT 4096

@interface AQPlayer : NSObject {
@public
    AudioQueueRef queue;
    AudioQueueBufferRef buffers[AUDIOBUFFERS_NUMBER];
    NSInteger bufferByteSize;
    AudioStreamPacketDescription packetDescriptions[MAX_PACKET_COUNT];
    AudioFile *audioFile;
    SInt64 currentPacketNumber;
    UInt32 numPacketsToRead;
}

@property (nonatomic) SInt64 currentPacketNumber;
@property (nonatomic, retain) AudioFile *audioFile;

- (id)initWithFile:(NSString *)file;
- (NSInteger)fillBuffer:(AudioQueueBufferRef)buffer;
- (void)play;
@end
// AQPlayer.m
#import "AQPlayer.h"

static void AQOutputCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer) {
    AQPlayer *aqp = (AQPlayer *)inUserData;
    [aqp fillBuffer:(AudioQueueBufferRef)inBuffer];
}

@implementation AQPlayer

@synthesize currentPacketNumber;
@synthesize audioFile;

- (id)initWithFile:(NSString *)file {
    if ((self = [super init])) {
        audioFile = [[AudioFile alloc] initWithURL:(CFURLRef)[NSURL fileURLWithPath:file]];
        currentPacketNumber = 0;
        AudioQueueNewOutput([audioFile audioFormatRef], AQOutputCallback, self, CFRunLoopGetCurrent(), kCFRunLoopCommonModes, 0, &queue);
        bufferByteSize = 4096;
        if (bufferByteSize < audioFile.maxPacketSize) bufferByteSize = audioFile.maxPacketSize;
        numPacketsToRead = bufferByteSize / audioFile.maxPacketSize;
        for (int i = 0; i < AUDIOBUFFERS_NUMBER; i++) {
            AudioQueueAllocateBuffer(queue, bufferByteSize, &buffers[i]);
        }
    }
    return self;
}

- (void)dealloc {
    [audioFile release];
    if (queue) {
        AudioQueueDispose(queue, YES);
        queue = NULL;
    }
    [super dealloc];
}

- (void)play {
    for (int bufferIndex = 0; bufferIndex < AUDIOBUFFERS_NUMBER; ++bufferIndex) {
        [self fillBuffer:buffers[bufferIndex]];
    }
    AudioQueueStart(queue, NULL);
}

- (NSInteger)fillBuffer:(AudioQueueBufferRef)buffer {
    UInt32 numBytes;
    UInt32 numPackets = numPacketsToRead;
    BOOL isVBR = [audioFile audioFormatRef]->mBytesPerPacket == 0 ? YES : NO;
    AudioFileReadPackets(audioFile.fileID,
                         NO,
                         &numBytes,
                         isVBR ? packetDescriptions : 0,
                         currentPacketNumber,
                         &numPackets,
                         buffer->mAudioData);
    if (numPackets > 0) {
        buffer->mAudioDataByteSize = numBytes;
        AudioQueueEnqueueBuffer(queue,
                                buffer,
                                isVBR ? numPackets : 0,
                                isVBR ? packetDescriptions : 0);
        currentPacketNumber += numPackets; // advance so the next fill reads the following packets
    }
    else {
        // End of present data; check whether all packets have been played:
        // if yes, stop playback and dispose of the queue;
        // if no, pause the queue until new data arrives, then start it again.
    }
    return numPackets;
}

@end
It seems you are solving the wrong task, because AVAudioPlayer can only play a whole audio file. You should use Audio Queue Services from the AudioToolbox framework instead, to play audio on a packet-by-packet basis. In fact, you need not divide the audio file into real sound packets; you can use any data block, as in your own example above, but then you should read the received data chunks using Audio File Services or Audio File Stream Services functions (from AudioToolbox) and feed them to the audio queue's buffers.
If you nevertheless want to divide the audio file into sound packets, you can easily do so with Audio File Services functions. An audio file consists of a header, where properties such as the number of packets, sample rate, and number of channels are stored, plus the raw sound data.
Use AudioFileOpenURL to open the audio file and read all of its properties with the AudioFileGetProperty function. Basically you only need the kAudioFilePropertyDataFormat and kAudioFilePropertyAudioDataPacketCount properties:
AudioFileID fileID;                 // the identifier for the audio file
CFURLRef fileURL = ...;             // file URL
AudioStreamBasicDescription format; // structure containing audio header info
UInt64 packetsCount;

AudioFileOpenURL(fileURL,
                 0x01, // fsRdPerm, read only
                 0,    // no hint
                 &fileID);

UInt32 sizeOfPlaybackFormatASBDStruct = sizeof format;
AudioFileGetProperty(fileID,
                     kAudioFilePropertyDataFormat,
                     &sizeOfPlaybackFormatASBDStruct,
                     &format);

UInt32 propertySize = sizeof(packetsCount);
AudioFileGetProperty(fileID, kAudioFilePropertyAudioDataPacketCount, &propertySize, &packetsCount);
Then you can take any range of audio packet data with:
OSStatus AudioFileReadPackets (
    AudioFileID                  inAudioFile,
    Boolean                      inUseCache,
    UInt32                       *outNumBytes,
    AudioStreamPacketDescription *outPacketDescriptions,
    SInt64                       inStartingPacket,
    UInt32                       *ioNumPackets,
    void                         *outBuffer
);
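Putting the two calls together, slicing a range of packets out of the file for sending might look like the sketch below. The helper name is made up, and sizing the buffer from kAudioFilePropertyMaximumPacketSize is an assumption that covers worst-case VBR packets:

```objc
// Hypothetical helper: read *ioNumPackets packets starting at startPacket
// into an NSData suitable for sending to the peer.
NSData *packetRangeData(AudioFileID fileID, UInt32 maxPacketSize,
                        SInt64 startPacket, UInt32 *ioNumPackets)
{
    UInt32 numBytes = 0;
    void *buffer = malloc(maxPacketSize * *ioNumPackets); // worst-case size
    AudioFileReadPackets(fileID,
                         false,     // no caching
                         &numBytes, // bytes actually read
                         NULL,      // packet descriptions (keep them too for VBR playback)
                         startPacket,
                         ioNumPackets, // in: packets requested, out: packets actually read
                         buffer);
    NSData *data = [NSData dataWithBytes:buffer length:numBytes];
    free(buffer);
    return data;
}
```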
Apple has already written something that can do this: AUNetSend and AUNetReceive. AUNetSend is an effect AudioUnit that sends audio to an AUNetReceive AudioUnit on another computer.
Unfortunately, these AUs are not available on the iPhone.