I'm trying to assemble a toy application that interacts with HTTP servers via CoreFoundation. I'm not using NSURLConnection because I want to be able to specify a proxy different from the one set in the OS X System Preferences.
My problem is that I wasn't able to find any working example, and the Apple documentation does not provide a working snippet of code.
Anyhow, this is the code I pulled together:
#import <CoreFoundation/CoreFoundation.h>
#import <CoreServices/CoreServices.h>
#import <Foundation/Foundation.h>
void myCallBack(CFReadStreamRef stream, CFStreamEventType eventType, void *clientCallBackInfo) {
    NSLog(@"Something happened");
    BOOL *done = ((BOOL *)clientCallBackInfo);
    *done = TRUE;
}

int main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    CFStringRef bodyString = CFSTR(""); // Usually used for POST data
    CFStringRef url = CFSTR("http://www.apple.com");
    CFURLRef myURL = CFURLCreateWithString(kCFAllocatorDefault, url, NULL);
    CFStringRef requestMethod = CFSTR("GET");
    CFHTTPMessageRef myRequest = CFHTTPMessageCreateRequest(kCFAllocatorDefault,
                                                            requestMethod,
                                                            myURL,
                                                            kCFHTTPVersion1_1);
    if (myRequest) {
        NSLog(@"Request created: %@ %@", CFHTTPMessageCopyRequestMethod(myRequest),
              CFURLGetString(CFHTTPMessageCopyRequestURL(myRequest)));
    }
    CFDataRef bodyData = CFStringCreateExternalRepresentation(kCFAllocatorDefault, bodyString, kCFStringEncodingASCII, 0);
    CFHTTPMessageSetBody(myRequest, bodyData);
    NSLog(@"HTTPMessage body set");
    CFDataRef mySerializedRequest = CFHTTPMessageCopySerializedMessage(myRequest);
    NSLog(@"Serialized string generated");
    CFStringRef mySerializedString = CFStringCreateFromExternalRepresentation(kCFAllocatorDefault,
                                                                              mySerializedRequest,
                                                                              kCFStringEncodingASCII);
    NSLog(@"Serialized request: %@", mySerializedString);
    CFReadStreamRef myReadStream = CFReadStreamCreateForHTTPRequest(kCFAllocatorDefault, myRequest);
    BOOL done = FALSE;
    CFReadStreamOpen(myReadStream);
    CFOptionFlags registeredEvents = kCFStreamEventHasBytesAvailable | kCFStreamEventErrorOccurred | kCFStreamEventEndEncountered;
    NSLog(@"Setting read stream client");
    if (CFReadStreamSetClient(myReadStream, registeredEvents, myCallBack, (void *)&done)) {
        CFReadStreamScheduleWithRunLoop(myReadStream, CFRunLoopGetCurrent(),
                                        kCFRunLoopCommonModes);
        while (!done) {
            NSLog(@"Wait");
            [[NSRunLoop currentRunLoop] run];
        }
    } else {
        NSLog(@"Error setting readstream client");
    }
    CFRelease(myRequest);
    CFRelease(myURL);
    CFRelease(url);
    CFRelease(mySerializedRequest);
    myRequest = NULL;
    mySerializedRequest = NULL;
    [pool drain];
}
The code compiles but it throws a segfault:
Reading symbols for shared libraries .+++++...................... done
2010-09-06 21:36:34.470 a.out[27973:a0f] Request created: GET http://www.apple.com
2010-09-06 21:36:34.473 a.out[27973:a0f] HTTPMessage body set
2010-09-06 21:36:34.474 a.out[27973:a0f] Serialized string generated
2010-09-06 21:36:34.474 a.out[27973:a0f] Serialized request: GET / HTTP/1.1
2010-09-06 21:36:34.478 a.out[27973:a0f] Setting read stream client
2010-09-06 21:36:34.479 a.out[27973:a0f] Wait
Program received signal EXC_BAD_ACCESS, Could not access memory.
Reason: 13 at address: 0x0000000000000000
0x00007fff83b7b83d in _signalEventSync ()
(gdb) bt
#0 0x00007fff83b7b83d in _signalEventSync ()
#1 0x00007fff83b7b7f4 in _cfstream_solo_signalEventSync ()
#2 0x00007fff83b7b734 in _CFStreamSignalEvent ()
#3 0x00007fff8391c3a1 in HTTPReadStream::streamEvent ()
#4 0x00007fff83b7b883 in _signalEventSync ()
#5 0x00007fff83b7c5d9 in _cfstream_shared_signalEventSync ()
#6 0x00007fff83b1be91 in __CFRunLoopDoSources0 ()
#7 0x00007fff83b1a089 in __CFRunLoopRun ()
#8 0x00007fff83b1984f in CFRunLoopRunSpecific ()
#9 0x00007fff8300fa18 in -[NSRunLoop(NSRunLoop) runMode:beforeDate:] ()
#10 0x00007fff8300f8f7 in -[NSRunLoop(NSRunLoop) run] ()
#11 0x0000000100000c94 in main (argc=1, argv=0x7fff5fbff5b8) at test.m:56
I don't understand where the error is. Can anyone show me the light, or point me to a clear example of how to manage CFHTTP streams?
Thanks,
Andrea
The problem was that I didn't read the documentation of CFReadStreamSetClient carefully. Its last parameter has to be a pointer to a CFStreamClientContext, whereas I was passing an arbitrary pointer directly; that pointer has to go into the context's info field instead.
Fixing this resolved the segmentation fault, and now the test case works as expected.
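For reference, here is a minimal sketch of the corrected setup (a reconstruction of the idea, not necessarily my exact final code): the done flag goes into the info field of a CFStreamClientContext, and a pointer to that context is what gets passed to CFReadStreamSetClient. CFNetwork then hands context.info back to the callback as clientCallBackInfo:

```objc
BOOL done = FALSE;
// Fields: version, info, retain, release, copyDescription
CFStreamClientContext context = {0, &done, NULL, NULL, NULL};
if (CFReadStreamSetClient(myReadStream, registeredEvents, myCallBack, &context)) {
    CFReadStreamScheduleWithRunLoop(myReadStream, CFRunLoopGetCurrent(),
                                    kCFRunLoopCommonModes);
    while (!done) {
        [[NSRunLoop currentRunLoop] run];
    }
}
```

Since the retain/release members are NULL, the stream keeps the info pointer as-is, so done must stay alive for as long as the client is set.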
Related
I'm running some Metal code in a thread, but encountering some issues I don't fully understand. Running the following code with USE_THREAD 0 and USE_AUTORELEASEPOOL 0 works fine but setting either one to 1 results in a SIGSEGV in objc_release:
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x20)
* frame #0: 0x00007fff2020d4af libobjc.A.dylib`objc_release + 31
frame #1: 0x00007fff2022b20f libobjc.A.dylib`AutoreleasePoolPage::releaseUntil(objc_object**) + 167
frame #2: 0x00007fff2020de30 libobjc.A.dylib`objc_autoreleasePoolPop + 161
frame #3: 0x0000000100003d60 a.out`render(void*) + 896
frame #4: 0x0000000100003dd8 a.out`main + 24
frame #5: 0x00007fff20388f3d libdyld.dylib`start + 1
frame #6: 0x00007fff20388f3d libdyld.dylib`start + 1
Using the autoreleasepool I can understand since the objects are already released (since release is called manually on them), but why does the same issue occur when the code is running inside a thread? Is this related to pthreads specifically? Is there a "hidden" autoreleasepool somewhere I am missing?
I understand using an autoreleasepool and not releasing manually will achieve the same result but I am trying to understand what is going on here.
// clang++ main.mm -lobjc -framework Metal
#define USE_THREAD 0
#define USE_AUTORELEASEPOOL 1
#import <Metal/Metal.h>
void * render(void *) {
#if USE_AUTORELEASEPOOL
    @autoreleasepool {
#else
    {
#endif
        NSArray<id<MTLDevice>> * devices = MTLCopyAllDevices();
        id<MTLDevice> device = devices[0];
        id<MTLCommandQueue> command_queue = [device newCommandQueue];
        MTLTextureDescriptor * texture_descriptor = [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm width:640 height:480 mipmapped:NO];
        texture_descriptor.usage = MTLTextureUsageRenderTarget;
        id<MTLTexture> texture = [device newTextureWithDescriptor:texture_descriptor];
        [texture_descriptor release];
        texture_descriptor = NULL;
        id<MTLCommandBuffer> command_buffer = [command_queue commandBuffer];
        MTLRenderPassDescriptor * render_pass_descriptor = [MTLRenderPassDescriptor renderPassDescriptor];
        render_pass_descriptor.colorAttachments[0].texture = texture;
        id<MTLRenderCommandEncoder> render_command_encoder = [command_buffer renderCommandEncoderWithDescriptor:render_pass_descriptor];
        [render_pass_descriptor release];
        render_pass_descriptor = NULL;
        [render_command_encoder endEncoding];
        [render_command_encoder release];
        render_command_encoder = nil;
        [command_buffer commit];
        [command_buffer waitUntilCompleted];
        [command_buffer release];
        command_buffer = nil;
        [texture release];
        texture = nil;
        [command_queue release];
        command_queue = nil;
    }
    return 0;
}

#include <pthread.h>

int main() {
#if USE_THREAD
    pthread_t thread;
    pthread_create(&thread, NULL, render, NULL);
    pthread_join(thread, NULL);
#else
    render(NULL);
#endif
    return 0;
}
I'm not sure if Apple has ever updated their documentation with regard to this behaviour, but since approximately macOS 10.7, all POSIX threads have been given an autorelease pool automatically. You can see it clearly from the dealloc of autoreleased objects:
#import <Foundation/Foundation.h>
#import <pthread.h>

@interface TDWObject : NSObject
@end

@implementation TDWObject

- (void)dealloc {
    NSLog(@"%@", [NSThread callStackSymbols]);
    [super dealloc]; // under MRC, [super dealloc] must come last
}

@end

void *run(void *objPtr) {
    TDWObject *obj = (__bridge TDWObject *)objPtr;
    [obj autorelease];
    return NULL;
}

int main() {
    pthread_t thread;
    TDWObject *obj = [TDWObject new];
    pthread_create(&thread, NULL, run, (__bridge void *)obj);
    pthread_join(thread, NULL);
    return 0;
}
This code (when compiled with MRC) will print the following stack trace:
-[TDWObject dealloc] + 83
_ZN19AutoreleasePoolPage12releaseUntilEPP11objc_object + 168
objc_autoreleasePoolPop + 227
_ZN20objc_tls_direct_baseIP19AutoreleasePoolPageL7tls_key3ENS0_14HotPageDeallocEE5dtor_EPv + 140
_pthread_tsd_cleanup + 607
_pthread_exit + 70
_pthread_start + 136
thread_start + 15
The objc_autoreleasePoolPop function is exactly what gets called at the end of @autoreleasepool blocks nowadays.
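Under the hood, an @autoreleasepool block is lowered to a pair of runtime calls; a rough sketch (simplified, the real codegen also wraps the body for exception cleanup):

```objc
void *pool = objc_autoreleasePoolPush();
// ... body of the block; objects sent -autorelease are added to this pool ...
objc_autoreleasePoolPop(pool); // sends -release to everything autoreleased since the push
```

The pthread case works the same way, except the push happens lazily when the pool is first needed on the thread, and the pop runs from the thread's TSD destructor at thread exit, which is the _pthread_tsd_cleanup frame visible in the stack trace above.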
The following code uses the Vision and AVFoundation frameworks to enable face tracking on the built-in camera on macOS. In some circumstances the code crashes due to EXC_BAD_ACCESS (code=2) on a working thread on the queue com.apple.VN.trackersCollectionManagementQueue (serial). The application works as intended as long as no face is detected, but crashes as soon as it detects a face and attempts to track it by the method
[_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error]
The method is called inside the AVCaptureVideoDataOutputSampleBufferDelegate method
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
From what I understand, EXC_BAD_ACCESS means the process tried to access memory it can't [1], and the error code (2), KERN_PROTECTION_FAILURE, means that the specified memory is valid but does not permit the required forms of access [2]. A (possibly outdated) technical note explains that this is caused by the thread trying to write to read-only memory [3]. From this, I understand the problem is not caused by premature deallocation or memory corruption, but by memory access control across threads.
I believe the problem appeared after an update. The crash happens when Debug Executable is checked in the scheme editor for Game template projects (Metal, SceneKit, and SpriteKit), but does not crash when used in App and Document App templates. The code also works as intended when adapted to iOS on a physical device. I have tried to isolate the problem by trimming away as much code as possible, and the following files can be added to any template.
Header file
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

NS_ASSUME_NONNULL_BEGIN

@interface LSPVision : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (nonatomic) AVCaptureVideoPreviewLayer *previewLayer;

- (void)captureFrame;

@end

NS_ASSUME_NONNULL_END
Implementation file
#import "LSPVision.h"
#import <Vision/Vision.h>

@implementation LSPVision
{
    // AVCapture stuff
    AVCaptureSession *_session;
    AVCaptureVideoDataOutput *_videoDataOutput;
    dispatch_queue_t _videoDataOutputQueue;
    CGSize _captureDeviceResolution;
    // Vision requests
    NSMutableArray *_detectionRequests; // Array of VNDetectFaceRectanglesRequest
    NSMutableArray *_trackingRequests;  // Array of VNTrackObjectRequest
    VNSequenceRequestHandler *_sequenceRequestHandler;
    BOOL _frameCapture;
}

- (nonnull instancetype)init
{
    self = [super init];
    if (self)
    {
        _session = [self _setupAVCaptureSession];
        [self designatePreviewLayerForCaptureSession:_session];
        [self _prepareVisionRequest];
        _frameCapture = YES;
        if (_session) {
            [_session startRunning];
        }
    }
    return self;
}
#pragma mark Setup AVSession

- (AVCaptureSession *)_setupAVCaptureSession {
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device;
#if defined(TARGET_MACOS)
    if (@available(macOS 10.15, *)) {
        AVCaptureDeviceDiscoverySession *discoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionFront];
        device = discoverySession.devices.firstObject;
    }
#endif
    if (device != nil) {
        AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
        if ([captureSession canAddInput:deviceInput]) {
            [captureSession addInput:deviceInput];
        }
        AVCaptureDeviceFormat *lowestResolution = [self _lowestResolution420Format:device];
        if (lowestResolution != nil) {
            if ([device lockForConfiguration:nil]) {
                device.activeFormat = lowestResolution;
                [device unlockForConfiguration];
            }
        }
    }
    if (device != nil) {
        [self _configureVideoDataOutput:device captureSession:captureSession];
        return captureSession;
    }
    NSLog(@"Hold up, something went wrong with AVCaptureSession");
    [self _tearDownAVCapture];
    return nil;
}
- (AVCaptureDeviceFormat *)_lowestResolution420Format:(AVCaptureDevice *)device {
    AVCaptureDeviceFormat *lowestResolutionFormat = nil;
    CMVideoDimensions lowestResolutionDimensions = { .height = (int32_t)10000, .width = (int32_t)10000 };
    for (AVCaptureDeviceFormat *deviceFormat in device.formats) {
        CMFormatDescriptionRef deviceFormatDescription = deviceFormat.formatDescription;
        if (CMFormatDescriptionGetMediaSubType(deviceFormatDescription) == (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange | kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)) {
            CMVideoDimensions candidateDimensions = CMVideoFormatDescriptionGetDimensions(deviceFormatDescription);
            if ((lowestResolutionFormat == nil) || candidateDimensions.width > lowestResolutionDimensions.width) {
                lowestResolutionFormat = deviceFormat;
                lowestResolutionDimensions = candidateDimensions;
                NSLog(@"Device Format: Width: %d, Height: %d", candidateDimensions.width, candidateDimensions.height);
                _captureDeviceResolution.width = candidateDimensions.width;
                _captureDeviceResolution.height = candidateDimensions.height;
            }
        }
    }
    if (lowestResolutionFormat != nil) {
        return lowestResolutionFormat;
    }
    return nil;
}
- (void)designatePreviewLayerForCaptureSession:(AVCaptureSession *)session {
    AVCaptureVideoPreviewLayer *videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    self.previewLayer = videoPreviewLayer;
    videoPreviewLayer.name = @"Camera Preview";
}

- (void)_configureVideoDataOutput:(AVCaptureDevice *)inputDevice captureSession:(AVCaptureSession *)captureSession {
    AVCaptureVideoDataOutput *videoDataOutput = [AVCaptureVideoDataOutput new];
    videoDataOutput.alwaysDiscardsLateVideoFrames = true;
    // Create a serial dispatch queue used for the sample buffer delegate as well as when a still image is captured.
    // A serial dispatch queue must be used to guarantee that video frames will be delivered in order.
    dispatch_queue_t videoDataOutputQueue = dispatch_queue_create("com.example.apple-samplecode.VisionFaceTrack", NULL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([captureSession canAddOutput:videoDataOutput]) {
        [captureSession addOutput:videoDataOutput];
    }
    [videoDataOutput connectionWithMediaType:AVMediaTypeVideo].enabled = true;
    _videoDataOutput = videoDataOutput;
    _videoDataOutputQueue = videoDataOutputQueue;
}
#pragma mark Vision Request

- (void)_prepareVisionRequest {
    NSMutableArray *requests = [NSMutableArray array];
    VNRequestCompletionHandler handlerBlock = ^(VNRequest * _Nonnull request, NSError * _Nullable error) {
        if (error) {
            NSLog(@"Handler error: %@", error);
        }
        VNDetectFaceRectanglesRequest *faceDetectionRequest = (VNDetectFaceRectanglesRequest *)request;
        dispatch_async(dispatch_get_main_queue(), ^{
            for (VNFaceObservation *observation in faceDetectionRequest.results) {
                VNTrackObjectRequest *faceTrackingRequest = [[VNTrackObjectRequest alloc] initWithDetectedObjectObservation:observation];
                NSUInteger count = requests.count;
                [requests insertObject:faceTrackingRequest atIndex:count];
            }
            self->_trackingRequests = [requests copy];
        });
    };
    VNDetectFaceRectanglesRequest *faceDetectionRequest = [[VNDetectFaceRectanglesRequest alloc] initWithCompletionHandler:handlerBlock];
    _detectionRequests = [NSMutableArray arrayWithObject:faceDetectionRequest];
    _sequenceRequestHandler = [[VNSequenceRequestHandler alloc] init];
}
#pragma mark Delegate functions

// AVCaptureVideoDataOutputSampleBufferDelegate
// Handle delegate method callback on receiving a sample buffer.
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (_frameCapture == YES) {
        NSMutableDictionary *requestHandlerOptions = [NSMutableDictionary dictionary];
        CFTypeRef cameraIntrinsicData = CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, nil);
        if (cameraIntrinsicData != nil) {
            [requestHandlerOptions setObject:CFBridgingRelease(cameraIntrinsicData) forKey:VNImageOptionCameraIntrinsics];
        }
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (!pixelBuffer) {
            NSLog(@"Failed to obtain a CVPixelBuffer for the current output frame.");
            return;
        }
#if defined(TARGET_MACOS)
        CGImagePropertyOrientation exifOrientation = kCGImagePropertyOrientationLeftMirrored;
#endif
        NSError *error;
        NSArray *requests;
        if (_trackingRequests.count > 0) {
            requests = _trackingRequests;
        } else {
            // No tracking object detected, so perform initial detection
            VNImageRequestHandler *imageRequestHandler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer orientation:exifOrientation options:requestHandlerOptions];
            NSArray *detectionRequests = _detectionRequests;
            if (detectionRequests == nil) {
                return;
            }
            [imageRequestHandler performRequests:_detectionRequests error:&error];
            if (error) {
                NSLog(@"Failed to perform FaceRectangleRequest: %@", error);
            }
            return;
        }
        // SequenceRequestHandler results in 10-20% CPU utilization
        [_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error];
        if (error) {
            NSLog(@"Failed to perform SequenceRequest: %@", error);
            return;
        }
        // Set up the next round of tracking
        NSMutableArray *newTrackingRequests = [NSMutableArray array];
        for (VNTrackObjectRequest *trackingRequest in requests) {
            NSArray *results = trackingRequest.results;
            trackingRequest.trackingLevel = VNRequestTrackingLevelFast;
            VNDetectedObjectObservation *observation = results[0];
            if (![observation isKindOfClass:[VNDetectedObjectObservation class]]) {
                return;
            }
            if (!trackingRequest.isLastFrame) {
                if (observation.confidence > 0.3f) {
                    trackingRequest.inputObservation = observation;
                } else {
                    trackingRequest.lastFrame = true;
                }
                NSUInteger number = newTrackingRequests.count;
                [newTrackingRequests insertObject:trackingRequest atIndex:number];
            }
        }
        _trackingRequests = newTrackingRequests;
        if (newTrackingRequests.count == 0) {
            // Nothing to track, so abort.
            return;
        }
        NSMutableArray *faceLandmarksRequests = [NSMutableArray array];
        for (VNTrackObjectRequest *trackingRequest in newTrackingRequests) {
            VNRequestCompletionHandler handlerBlock = ^(VNRequest * _Nonnull request, NSError * _Nullable error) {
                if (error != nil) {
                    NSLog(@"Facelandmarks error: %@", error);
                }
                VNDetectFaceLandmarksRequest *landmarksRequest = (VNDetectFaceLandmarksRequest *)request;
                NSArray *results = landmarksRequest.results;
                if (results == nil) {
                    return;
                }
                // Perform all UI updates (drawing) on the main queue, not the background queue on which this handler is being called.
                dispatch_async(dispatch_get_main_queue(), ^{
                    for (VNFaceObservation *faceObservation in results) {
                        [self _setEyePositionsForFace:faceObservation];
                        //NSLog(@"seeing face");
                    }
                });
            };
            VNDetectFaceLandmarksRequest *faceLandmarksRequest = [[VNDetectFaceLandmarksRequest alloc] initWithCompletionHandler:handlerBlock];
            NSArray *trackingResults = trackingRequest.results;
            if (trackingResults == nil) {
                return;
            }
            VNDetectedObjectObservation *observation = trackingResults[0];
            if (observation == nil) {
                return;
            }
            VNFaceObservation *faceObservation = [VNFaceObservation observationWithBoundingBox:observation.boundingBox];
            faceLandmarksRequest.inputFaceObservations = @[faceObservation];
            // Continue to track detected facial landmarks.
            NSUInteger nr = faceLandmarksRequests.count;
            [faceLandmarksRequests insertObject:faceLandmarksRequest atIndex:nr];
            VNImageRequestHandler *imageRequestHandler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer orientation:exifOrientation options:requestHandlerOptions];
            [imageRequestHandler performRequests:faceLandmarksRequests error:&error];
            if (error != nil) {
                NSLog(@"Failed to perform FaceLandmarkRequest: %@", error);
            }
        }
    }
    //_frameCapture = NO;
}
#pragma mark Helper Functions

- (void)captureFrame {
    _frameCapture = YES;
}

- (void)_tearDownAVCapture {
    _videoDataOutput = nil;
    _videoDataOutputQueue = nil;
}

@end
Debugging
The crash seems related to Metal, perhaps across multiple threads. It happens when the Vision framework (from a worker thread) executes Metal Performance Shaders from a private neural-network framework (Espresso). Before the crash there is a deadlock related to command buffers, which ultimately leads to the Address Sanitizer reporting BUS on an unknown address. I assume this is the reason I get KERN_PROTECTION_FAILURE. Other threads are either executing Metal or simply waiting. I don't know whether the semaphores are related to Metal CPU/GPU synchronization or something else. When the code is used with App templates, the Vision framework runs on the main thread and no crash occurs. I can't see how I can address this in any meaningful way except by filing a bug report. That being said, my debugging skills leave much to be desired, so any help is strongly appreciated - not only in solving the issue but also in understanding the problem. Address Sanitizer and Thread Sanitizer are turned on for the following output. Due to size constraints the crash report can be read here. A crashing project (on my computer) can now be viewed and downloaded from Dropbox. My computer is a 2019 MacBook Pro 16.
Console output
ErrorTest1(13661,0x107776e00) malloc: nano zone abandoned due to inability to preallocate reserved vm space.
2020-12-24 09:48:35.709965+0100 ErrorTest1[13661:811227] Metal GPU Frame Capture Enabled
2020-12-24 09:48:36.675326+0100 ErrorTest1[13661:811227] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000b7b50> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-12-24 09:48:36.707535+0100 ErrorTest1[13661:811227] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000bb5a0> 30010C1C-93BF-11D8-8B5B-000A95AF9C6A
2020-12-24 09:48:36.845641+0100 ErrorTest1[13661:811227] [] CMIOHardware.cpp:379:CMIOObjectGetPropertyData Error: 2003332927, failed
2020-12-24 09:48:38.717546+0100 ErrorTest1[13661:811794] [logging-persist] cannot open file at line 44580 of [02c344acea]
2020-12-24 09:48:38.717648+0100 ErrorTest1[13661:811794] [logging-persist] os_unix.c:44580: (0) open(/var/db/DetachedSignatures) - Undefined error: 0
2020-12-24 09:48:38.778975+0100 ErrorTest1[13661:811761] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:61:16: warning: unused variable 'input_slice_count'
const uint input_slice_count = (INPUT_FEATURE_CHANNELS + 3) / 4;
^
2020-12-24 09:48:38.779198+0100 ErrorTest1[13661:811812] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:38.779441+0100 ErrorTest1[13661:811838] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:39.072518+0100 ErrorTest1[13661:811838] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:61:16: warning: unused variable 'input_slice_count'
const uint input_slice_count = (INPUT_FEATURE_CHANNELS + 3) / 4;
^
2020-12-24 09:48:39.073210+0100 ErrorTest1[13661:811842] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:98:16: warning: unused variable 'fm_group'
const uint fm_group = threadgroup_id.z - splitId * params.simdsPerGroupData;
^
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:39.073538+0100 ErrorTest1[13661:811812] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:98:16: warning: unused variable 'fm_group'
const uint fm_group = threadgroup_id.z - splitId * params.simdsPerGroupData;
^
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
LLDB bt
* thread #5, queue = 'com.apple.VN.trackersCollectionManagementQueue', stop reason = EXC_BAD_ACCESS (code=2, address=0x70000deb1ff8)
frame #0: 0x000000010739db33 libsystem_pthread.dylib`___chkstk_darwin + 55
frame #1: 0x000000010739dafc libsystem_pthread.dylib`thread_start + 20
frame #2: 0x000000010724277b libMTLCapture.dylib`___lldb_unnamed_symbol2507$$libMTLCapture.dylib + 585
frame #3: 0x00007fff29f597be MPSNeuralNetwork`___lldb_unnamed_symbol4427$$MPSNeuralNetwork + 1907
frame #4: 0x00007fff29f5a3c2 MPSNeuralNetwork`___lldb_unnamed_symbol4432$$MPSNeuralNetwork + 756
frame #5: 0x00007fff29f5aa39 MPSNeuralNetwork`___lldb_unnamed_symbol4435$$MPSNeuralNetwork + 83
frame #6: 0x00007fff339e50e8 Espresso`Espresso::MPSEngine::mps_convolution_kernel::recreate_kernel() + 230
frame #7: 0x00007fff339e3c95 Espresso`Espresso::MPSEngine::convolution_kernel_base<Espresso::generic_convolution_kernel>::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >) + 455
frame #8: 0x00007fff339e724b Espresso`Espresso::MPSEngine::convolution_kernel_proxy::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >) + 103
frame #9: 0x00007fff338b3a8f Espresso`Espresso::generic_convolution_kernel::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >, std::__1::shared_ptr<Espresso::abstract_batch>) + 49
frame #10: 0x00007fff338bdee1 Espresso`Espresso::load_network_layers_post_dispatch(std::__1::shared_ptr<Espresso::net> const&, std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object> const&, std::__1::shared_ptr<Espresso::cpu_context_transfer_algo_t> const&, std::__1::shared_ptr<Espresso::net_info_ir_t> const&, bool, Espresso::network_shape const&, Espresso::compute_path, bool, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&) + 5940
frame #11: 0x00007fff338ba6ee Espresso`Espresso::load_network_layers_internal(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, std::__1::basic_istream<char, std::__1::char_traits<char> >*, Espresso::compute_path, bool, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&) + 793
frame #12: 0x00007fff338c9294 Espresso`Espresso::load_and_shape_network(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, Espresso::compute_path, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 576
frame #13: 0x00007fff338cb715 Espresso`Espresso::load_network(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::compute_path, bool) + 2496
frame #14: 0x00007fff33d9603c Espresso`EspressoLight::espresso_plan::add_network(char const*, espresso_storage_type_t) + 350
frame #15: 0x00007fff33daa817 Espresso`espresso_plan_add_network + 294
frame #16: 0x00007fff30479b9d Vision`+[VNEspressoHelpers createSingleNetworkPlanFromResourceName:usingProcessingDevice:lowPriorityMode:inputBlobNames:outputBlobNames:explicitNetworkLayersStorageType:espressoResources:error:] + 517
frame #17: 0x00007fff3047992d Vision`+[VNEspressoHelpers createSingleNetworkPlanFromResourceName:usingProcessingDevice:lowPriorityMode:inputBlobNames:outputBlobNames:espressoResources:error:] + 151
frame #18: 0x00007fff303ce123 Vision`-[VNRPNTrackerEspressoModelCacheManager espressoResourcesFromOptions:error:] + 417
frame #19: 0x00007fff303ce8c8 Vision`-[VNObjectTrackerRevision2 initWithOptions:error:] + 262
frame #20: 0x00007fff304152df Vision`__54-[VNTrackerManager _createTracker:type:options:error:]_block_invoke + 207
frame #21: 0x00000001072fc0b0 libdispatch.dylib`_dispatch_client_callout + 8
frame #22: 0x000000010730d3b2 libdispatch.dylib`_dispatch_lane_barrier_sync_invoke_and_complete + 135
frame #23: 0x00007fff30414f01 Vision`-[VNTrackerManager _createTracker:type:options:error:] + 261
frame #24: 0x00007fff30414b52 Vision`-[VNTrackerManager trackerWithOptions:error:] + 509
frame #25: 0x00007fff304dda4a Vision`-[VNRequestPerformer trackerWithOptions:error:] + 85
frame #26: 0x00007fff30343ac4 Vision`-[VNTrackingRequest internalPerformRevision:inContext:error:] + 436
frame #27: 0x00007fff3037fb08 Vision`-[VNRequest performInContext:error:] + 885
frame #28: 0x00007fff303cd9a1 Vision`VNExecuteBlock + 58
frame #29: 0x00007fff304dd105 Vision`-[VNRequestPerformer _performOrderedRequests:inContext:error:] + 674
frame #30: 0x00007fff304dd482 Vision`-[VNRequestPerformer performRequests:inContext:onBehalfOfRequest:error:] + 352
frame #31: 0x00007fff304dd586 Vision`-[VNRequestPerformer performRequests:inContext:error:] + 60
frame #32: 0x00007fff304cbf1a Vision`-[VNSequenceRequestHandler _performRequests:onImageBuffer:gatheredForensics:error:] + 293
frame #33: 0x00007fff304cc122 Vision`-[VNSequenceRequestHandler performRequests:onCVPixelBuffer:orientation:gatheredForensics:error:] + 111
frame #34: 0x00007fff304cc0aa Vision`-[VNSequenceRequestHandler performRequests:onCVPixelBuffer:orientation:error:] + 28
* frame #35: 0x0000000106fc5a97 ErrorTest1`-[LSPVision captureOutput:didOutputSampleBuffer:fromConnection:](self=0x0000608000047c20, _cmd="captureOutput:didOutputSampleBuffer:fromConnection:", output=0x00006030000ce770, sampleBuffer=0x0000614000091240, connection=0x00006030000d0c30) at LSPVision.m:246:9
frame #36: 0x00007fff3786b2e0 AVFCapture`__56-[AVCaptureVideoDataOutput_Tundra _render:sampleBuffer:]_block_invoke + 213
frame #37: 0x00000001077ff3bb libclang_rt.asan_osx_dynamic.dylib`__wrap_dispatch_async_block_invoke + 203
frame #38: 0x00000001072fae78 libdispatch.dylib`_dispatch_call_block_and_release + 12
frame #39: 0x00000001072fc0b0 libdispatch.dylib`_dispatch_client_callout + 8
frame #40: 0x00000001073036b7 libdispatch.dylib`_dispatch_lane_serial_drain + 776
frame #41: 0x0000000107304594 libdispatch.dylib`_dispatch_lane_invoke + 449
frame #42: 0x0000000107312217 libdispatch.dylib`_dispatch_workloop_worker_thread + 1675
frame #43: 0x000000010739eb15 libsystem_pthread.dylib`_pthread_wqthread + 314
frame #44: 0x000000010739dae3 libsystem_pthread.dylib`start_wqthread + 15
Update
The bug seems to be resolved on macOS Monterey 12.1.
More of a comment here, I am trying to reproduce this. I've taken your code as is, but had to comment out [self _setEyePositionsForFace:faceObservation]; in
for (VNFaceObservation *faceObservation in results) {
    //[self _setEyePositionsForFace:faceObservation];
    //NSLog(@"seeing face");
}
as you do not give its implementation. However, with that done, I was able to run the code without any issue. To test further I added the logs as below.
// SequenceRequestHandler results in 10-20% CPU utilization
NSLog(@"aaa");
[_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error];
NSLog(@"bbb");
as I understand your problem is with [_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error]; specifically, but I did not run into trouble and the log showed a lot of repeating aaa and bbbs. To further test I also added an ok log as shown below
if (error != nil) {
    NSLog(@"Failed to perform FaceLandmarkRequest: %@", error);
} else {
    NSLog(@"ok");
}
and that happily printed together with the aaa and bbb.
I also hooked up a button as shown below
- (IBAction)buttonAction:(id)sender {
    NSLog(@"Button");
    [self.v captureFrame];
}
where self.v is an instance of (my) LSPVision and I could push the button as much as I'd like without trouble.
I think the problem either lies somewhere else, maybe even in the _setEyePositionsForFace: that I commented out, or perhaps you can share even more code so I can reproduce it on my side?
FWIW here is a sample of the log
2020-12-27 09:14:54.147536+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.184167+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.268926+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.269374+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.314135+0200 MetalCaptureTest[11392:316676] Button
2020-12-27 09:14:54.316025+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.393732+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.394171+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.432979+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.496887+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.497389+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.533118+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.614813+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.615394+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.663343+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.747860+0200 MetalCaptureTest[11392:317094] ok
EDIT
Thanks, I got the Dropbox project and it is working on my side. No crashes at all. This is the log.
ErrorTest1(11743,0x10900ce00) malloc: nano zone abandoned due to inability to preallocate reserved vm space.
2020-12-27 10:55:10.445333+0200 ErrorTest1[11743:344803] Metal GPU Frame Capture Enabled
2020-12-27 10:55:10.471650+0200 ErrorTest1[11743:344803] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000aabc0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-12-27 10:55:10.528628+0200 ErrorTest1[11743:344803] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000ae130> 30010C1C-93BF-11D8-8B5B-000A95AF9C6A
2020-12-27 10:55:10.608753+0200 ErrorTest1[11743:344803] [] CMIOHardware.cpp:379:CMIOObjectGetPropertyData Error: 2003332927, failed
2020-12-27 10:55:11.408594+0200 ErrorTest1[11743:344873] [logging-persist] cannot open file at line 44580 of [02c344acea]
2020-12-27 10:55:11.408806+0200 ErrorTest1[11743:344873] [logging-persist] os_unix.c:44580: (0) open(/var/db/DetachedSignatures) - Undefined error: 0
2020-12-27 10:55:17.637382+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:17.838354+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:17.987583+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:18.171168+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:18.320957+0200 ErrorTest1[11743:344803] seeing face
FWIW I have the latest macOS Big Sur 11.1 and the latest Xcode 12.3, running on a MacBook Air 2017. From your description I suspect the multithreading could be a problem, but for now my focus is on reproducing it on my side.
I also got an unexplainable crash using Vision in one app, but not in another one I created for testing. Turns out enabling "Metal API Validation" made the problem go away for me. I do have several custom Metal kernels in my app, but I still don't understand what the root problem is.
To show errors on certain conditions, I am using NSRunAlertPanel on Mac OS X (in code ported from Windows, where I was using MessageBox).
Before any window has been created, some code runs and calls this function to show a conditional error.
In the thread com.apple.libdispatch-manager, under the following call stack
0 _dispatch_mgr_invoke
1 _dispatch_mgr_thread
it gives EXC_BAD_INSTRUCTION.
Is it because no window has been created before NSRunAlertPanel is called?
What is the reason for this runtime error? What is the exact alternative to MessageBox on Mac OS X?
Long ShowDebugMessageBox (const wchar_t * Message, const wchar_t * Title)
{
    NSString * message; ///< Message.
    NSString * title;   ///< Title.
    NSInteger response; ///< Response.
    message = WideToNSString (Message);
    title = WideToNSString (Title);
    //response = NSRunAlertPanel(title, message, @"Yes", @"No", @"Cancel");
    response = NSRunCriticalAlertPanel (title, message, @"Okay", @"Cancel", nil);
    switch (response) {
    case NSAlertDefaultReturn:
        return IDYES;
    case NSAlertAlternateReturn:
        return IDNO;
    default:
        return IDCANCEL;
    }
}
NSString * WideToNSString (const wchar_t * Str)
{
    if (!Str) {
        return nil;
    }
    NSString * str; ///< String in NSString.
#if CP_SIZEOFWCHAR == 4
    str = [[NSString alloc] initWithBytes: (CVPtr) Str
                                   length: sizeof(wchar_t) * wcslen(Str)
                                 encoding: NSUTF32LittleEndianStringEncoding];
                               //encoding: NSUTF32StringEncoding];
#else
    str = [[NSString alloc] initWithBytes: (CVPtr) Str
                                   length: sizeof(wchar_t) * wcslen(Str)
                                 encoding: NSUTF16LittleEndianStringEncoding];
                               //encoding: NSUTF16StringEncoding];
#endif
    return str;
}
class File {
public:
    int Open(char * fname, int mode)
    {
        fd = open(fname, mode);
        return fd;
    }
    int Close()
    {
        int result = close(fd);
        //fd = 0; //CAUSE of the PROBLEM
        return result;
    }
    ~File ()
    {
        //ALERT Display message box about the error.
        ALERT(fd != 0);
    }
private:
    int fd;
};
This is the code to show the message box.
Code to get NSString from wchar_t * string (Wide string) is perfectly fine and was tested. It is used in many places and running fine.
Same code on the other application (which creates Window first) is running fine.
The problem occurs when the destructor of File is called. Since fd is not 0, it shows the message box and causes the problem.
When fd is set to 0 in Close(), no alert box is displayed from the destructor. Other alerts are still shown, but no problem occurs.
Is it due to fd?
You haven't provided enough information to say what is causing the exception (please show the code).
I use NSRunCriticalAlertPanel() to display fatal errors in my app, which I am able to call pretty much any time I like:
void criticalAlertPanel(NSString *title, NSString *fmt, ...)
{
    va_list va;
    va_start(va, fmt);
    NSString *message = [[NSString alloc] initWithFormat:fmt arguments:va];
    va_end(va);
    NSRunCriticalAlertPanel(title, message, @"OK", nil, nil);
}
(this code is ARC enabled).
I am implementing a Mac app and I want to handle the following events:
Unhandled exceptions
Program crashes (memory errors etc.)
If I detect them, I can send the details to myself to analyze and fix bugs using one of the crash handlers that I found. Alas, I am unable to figure out how to intercept crashes and exceptions.
First question: do I have to differentiate exceptions from crashes, or is detecting exceptions enough?
How can I catch exceptions and/or crashes, redirecting them to my handler?
PS
I tried following in my MyApp class
NSSetUncaughtExceptionHandler(&uncaughtExceptionHandler);
signal(SIGABRT, SignalHandler);
signal(SIGILL, SignalHandler);
signal(SIGSEGV, SignalHandler);
signal(SIGFPE, SignalHandler);
signal(SIGBUS, SignalHandler);
signal(SIGPIPE, SignalHandler);
but it doesn't work. Every time the app crashes, it goes to the debugger without calling SignalHandler or uncaughtExceptionHandler.
I have found the best way is to create a simple Exception handling delegate class as this allows exceptions in IBAction methods to be caught.
main.mm:
@interface ExceptionDelegate : NSObject
@end

static ExceptionDelegate *exceptionDelegate = nil;

// Forward declaration so main can reference the handler defined below it.
static void signalHandler(int sig, siginfo_t *info, void *context);
int main(int argc, char **argv)
{
int retval = 1;
@autoreleasepool
{
//
// Set exception handler delegate
//
exceptionDelegate = [[ExceptionDelegate alloc] init];
NSExceptionHandler *exceptionHandler = [NSExceptionHandler defaultExceptionHandler];
exceptionHandler.exceptionHandlingMask = NSLogAndHandleEveryExceptionMask;
exceptionHandler.delegate = exceptionDelegate;
//
// Set signal handler
//
int signals[] =
{
SIGQUIT, SIGILL, SIGTRAP, SIGABRT, SIGEMT, SIGFPE, SIGBUS, SIGSEGV,
SIGSYS, SIGPIPE, SIGALRM, SIGXCPU, SIGXFSZ
};
const unsigned numSignals = sizeof(signals) / sizeof(signals[0]);
struct sigaction sa;
sa.sa_sigaction = signalHandler;
sa.sa_flags = SA_SIGINFO;
sigemptyset(&sa.sa_mask);
for (unsigned i = 0; i < numSignals; i++)
sigaction(signals[i], &sa, NULL);
....
}
....
return retval;
}
static void signalHandler(int sig, siginfo_t *info, void *context)
{
    logerr(@"Caught signal %d", sig);
    exit(102);
}
@implementation ExceptionDelegate
- (BOOL)exceptionHandler:(NSExceptionHandler *)exceptionHandler
      shouldLogException:(NSException *)exception
                    mask:(unsigned int)mask
{
    logerr(@"An unhandled exception occurred: %@", [exception reason]);
    return YES;
}
- (BOOL)exceptionHandler:(NSExceptionHandler *)exceptionHandler
   shouldHandleException:(NSException *)exception
                    mask:(unsigned int)mask
{
    exit(101);
    // not reached
    return NO;
}
@end
You'll need to add the ExceptionHandling.framework to your project.
Every now and then I get one of the EXC_BAD_ACCESS errors that seem to plague new Objective-C programmers.
In my program I'm trying to get the time in seconds, convert that to a string, and then convert that to an NSData object for writing to a file. Here is the code I'm using, but it crashes with an EXC_BAD_ACCESS every time I run it. What am I doing wrong?
-(void) startTheClock{
    NSTimeInterval cloqInTime = [NSDate timeIntervalSinceReferenceDate];
    NSString * dateStr = [self stringFromTimeInterval:cloqInTime];
    NSData * data = [[dateStr stringByAppendingString:@", "] dataUsingEncoding:NSUTF8StringEncoding];
    NSLog([@"Data:" stringByAppendingString:[data description]]);
    // [data retain]; // <-- Uncommenting this and the [data release] line doesn't prevent the error
    [self writeData:data]; // <-- EXC_BAD_ACCESS happens here!
    // [data release];
}
When I run this method I get the following output:
timeString: |2011-11-04 16:17:12|
Data:<32303131 2d31312d 30342031 363a3137 3a31322c 20>
As requested here is my stringFromTimeInterval method:
-(NSString *) stringFromTimeInterval:(NSTimeInterval)t{
    NSDate * date = [NSDate dateWithTimeIntervalSinceReferenceDate:t];
    NSDateFormatter *dateFormat = [[NSDateFormatter alloc] init];
    [dateFormat setDateFormat:@"yyyy-MM-dd HH:mm:ss"];
    NSString *timeString = [dateFormat stringFromDate:date];
    NSLog(@"timeString: %@", timeString);
    return timeString;
}
Here is the stacktrace:
#0 0x00007fff84407e90 in objc_msgSend ()
#1 0x00000001054543f0 in 0x00000001054543f0 ()
#2 0x00000001000012ee in -[WorqAppDelegate startTheClock] at /Users/slayton/Documents/Xcode/Worq/Worq/WorqAppDelegate.m:61
#3 0x0000000100001164 in -[WorqAppDelegate cloqInAction:] at /Users/slayton/Documents/Xcode/Worq/Worq/WorqAppDelegate.m:37
#4 0x00007fff8cbb9a1d in -[NSObject performSelector:withObject:] ()
#5 0x00007fff86f69710 in -[NSApplication sendAction:to:from:] ()
#6 0x00007fff86f69642 in -[NSControl sendAction:to:] ()
#7 0x00007fff86f6956d in -[NSCell _sendActionFrom:] ()
#8 0x00007fff86f68a30 in -[NSCell trackMouse:inRect:ofView:untilMouseUp:] ()
#9 0x00007fff86fe88e0 in -[NSButtonCell trackMouse:inRect:ofView:untilMouseUp:] ()
#10 0x00007fff86f6763a in -[NSControl mouseDown:] ()
#11 0x00007fff86f320e0 in -[NSWindow sendEvent:] ()
#12 0x00007fff86eca68f in -[NSApplication sendEvent:] ()
#13 0x00007fff86e60682 in -[NSApplication run] ()
#14 0x00007fff870df80c in NSApplicationMain ()
#15 0x0000000100001092 in main ()
Here is the writeData method, recordFile is an instance of NSFileHandle:
-(BOOL) writeData:(NSData *)data{
    if (recordFile != nil) {
        [recordFile writeData:data];
        return YES;
    }
    NSLog(@"Record file is nil! No data written");
    return NO;
}
Keep an eye on your recordFile object. I'm wondering if this object has been over-released (and thus deallocated) but is not nil. This would cause the [recordFile writeData:data] method to throw an EXC_BAD_ACCESS. What is the lifespan of recordFile?
Going out on a limb, but I'd say that data is probably nil. Have you checked the value of data?
The debugger, and in particular the stack trace, is your friend, but yeah, most likely data is nil.
In the debugger window, at the prompt, type: po data
Or better, just look at the local variables in the left pane and see what data's value is.