EXC_BAD_ACCESS on VNSequenceRequestHandler - objective-c

The following code uses the Vision and AVFoundation frameworks to enable face tracking on the built-in camera on macOS. In some circumstances the code crashes with EXC_BAD_ACCESS (code=2) on a worker thread on the queue com.apple.VN.trackersCollectionManagementQueue (serial). The application works as intended as long as no face is detected, but crashes as soon as it detects a face and attempts to track it with the method
[_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error]
The method is called inside the AVCaptureVideoDataOutputSampleBufferDelegate method
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
From what I understand, EXC_BAD_ACCESS means I can't access memory [1], and the error code (2), KERN_PROTECTION_FAILURE, means that the specified memory is valid but does not permit the required forms of access [2]. A (possibly outdated) technical note explains that this is caused by the thread trying to write to read-only memory [3]. From this I understand the problem is not caused by premature deallocation or memory corruption, but by memory access control across threads.
I believe the problem appeared after an update. The crash happens when Debug Executable is checked in the scheme editor for Game template projects (Metal, SceneKit, and SpriteKit), but not in the App and Document App templates. The code also works as intended when adapted to iOS on a physical device. I have tried to isolate the problem by trimming away as much code as possible; the following files can be added to any template.
Header file
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

NS_ASSUME_NONNULL_BEGIN

@interface LSPVision : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (nonatomic) AVCaptureVideoPreviewLayer *previewLayer;

- (void)captureFrame;

@end

NS_ASSUME_NONNULL_END
Implementation file
#import "LSPVision.h"
#import <Vision/Vision.h>

@implementation LSPVision
{
    // AVCapture stuff
    AVCaptureSession *_session;
    AVCaptureVideoDataOutput *_videoDataOutput;
    dispatch_queue_t _videoDataOutputQueue;
    CGSize _captureDeviceResolution;

    // Vision requests
    NSMutableArray *_detectionRequests; // Array of VNDetectFaceRectanglesRequest
    NSMutableArray *_trackingRequests;  // Array of VNTrackObjectRequest
    VNSequenceRequestHandler *_sequenceRequestHandler;

    BOOL _frameCapture;
}

- (nonnull instancetype)init
{
    self = [super init];
    if (self)
    {
        _session = [self _setupAVCaptureSession];
        [self designatePreviewLayerForCaptureSession:_session];
        [self _prepareVisionRequest];
        _frameCapture = YES;
        if (_session) {
            [_session startRunning];
        }
    }
    return self;
}
#pragma mark Setup AVSession

- (AVCaptureSession *)_setupAVCaptureSession {
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device;
#if defined(TARGET_MACOS)
    if (@available(macOS 10.15, *)) {
        AVCaptureDeviceDiscoverySession *discoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionFront];
        device = discoverySession.devices.firstObject;
    }
#endif
    if (device != nil) {
        AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
        if ([captureSession canAddInput:deviceInput]) {
            [captureSession addInput:deviceInput];
        }
        AVCaptureDeviceFormat *lowestResolution = [self _lowestResolution420Format:device];
        if (lowestResolution != nil) {
            if ([device lockForConfiguration:nil]) {
                device.activeFormat = lowestResolution;
                [device unlockForConfiguration];
            }
        }
    }
    if (device != nil) {
        [self _configureVideoDataOutput:device captureSession:captureSession];
        return captureSession;
    }
    NSLog(@"Hold up, something went wrong with AVCaptureSession");
    [self _tearDownAVCapture];
    return nil;
}
- (AVCaptureDeviceFormat *)_lowestResolution420Format:(AVCaptureDevice *)device {
    AVCaptureDeviceFormat *lowestResolutionFormat = nil;
    CMVideoDimensions lowestResolutionDimensions = { .height = (int32_t)10000, .width = (int32_t)10000 };
    for (AVCaptureDeviceFormat *deviceFormat in device.formats) {
        CMFormatDescriptionRef deviceFormatDescription = deviceFormat.formatDescription;
        // Accept either of the two 420 biplanar pixel formats.
        FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(deviceFormatDescription);
        if (mediaSubType == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange ||
            mediaSubType == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) {
            CMVideoDimensions candidateDimensions = CMVideoFormatDescriptionGetDimensions(deviceFormatDescription);
            // Keep the smallest format seen so far.
            if ((lowestResolutionFormat == nil) || candidateDimensions.width < lowestResolutionDimensions.width) {
                lowestResolutionFormat = deviceFormat;
                lowestResolutionDimensions = candidateDimensions;
                NSLog(@"Device Format: Width: %d, Height: %d", candidateDimensions.width, candidateDimensions.height);
                _captureDeviceResolution.width = candidateDimensions.width;
                _captureDeviceResolution.height = candidateDimensions.height;
            }
        }
    }
    return lowestResolutionFormat;
}
- (void)designatePreviewLayerForCaptureSession:(AVCaptureSession *)session {
    AVCaptureVideoPreviewLayer *videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    self.previewLayer = videoPreviewLayer;
    videoPreviewLayer.name = @"Camera Preview";
}

- (void)_configureVideoDataOutput:(AVCaptureDevice *)inputDevice captureSession:(AVCaptureSession *)captureSession {
    AVCaptureVideoDataOutput *videoDataOutput = [AVCaptureVideoDataOutput new];
    videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    // Create a serial dispatch queue used for the sample buffer delegate as well as when a still image is captured.
    // A serial dispatch queue must be used to guarantee that video frames will be delivered in order.
    dispatch_queue_t videoDataOutputQueue = dispatch_queue_create("com.example.apple-samplecode.VisionFaceTrack", NULL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([captureSession canAddOutput:videoDataOutput]) {
        [captureSession addOutput:videoDataOutput];
    }
    [videoDataOutput connectionWithMediaType:AVMediaTypeVideo].enabled = YES;
    _videoDataOutput = videoDataOutput;
    _videoDataOutputQueue = videoDataOutputQueue;
}
#pragma mark Vision Request

- (void)_prepareVisionRequest {
    NSMutableArray *requests = [NSMutableArray array];
    VNRequestCompletionHandler handlerBlock = ^(VNRequest * _Nonnull request, NSError * _Nullable error) {
        if (error) {
            NSLog(@"Handler error: %@", error);
        }
        VNDetectFaceRectanglesRequest *faceDetectionRequest = (VNDetectFaceRectanglesRequest *)request;
        dispatch_async(dispatch_get_main_queue(), ^{
            for (VNFaceObservation *observation in faceDetectionRequest.results) {
                VNTrackObjectRequest *faceTrackingRequest = [[VNTrackObjectRequest alloc] initWithDetectedObjectObservation:observation];
                [requests addObject:faceTrackingRequest];
            }
            self->_trackingRequests = [requests copy];
        });
    };
    VNDetectFaceRectanglesRequest *faceDetectionRequest = [[VNDetectFaceRectanglesRequest alloc] initWithCompletionHandler:handlerBlock];
    _detectionRequests = [NSMutableArray arrayWithObject:faceDetectionRequest];
    _sequenceRequestHandler = [[VNSequenceRequestHandler alloc] init];
}
#pragma mark Delegate functions

// AVCaptureVideoDataOutputSampleBufferDelegate
// Handle delegate method callback on receiving a sample buffer.
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (_frameCapture == YES) {
        NSMutableDictionary *requestHandlerOptions = [NSMutableDictionary dictionary];
        CFTypeRef cameraIntrinsicData = CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, nil);
        if (cameraIntrinsicData != nil) {
            [requestHandlerOptions setObject:CFBridgingRelease(cameraIntrinsicData) forKey:VNImageOptionCameraIntrinsics];
        }
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (!pixelBuffer) {
            NSLog(@"Failed to obtain a CVPixelBuffer for the current output frame.");
            return;
        }
#if defined(TARGET_MACOS)
        CGImagePropertyOrientation exifOrientation = kCGImagePropertyOrientationLeftMirrored;
#endif
        NSError *error;
        NSArray *requests;
        if (_trackingRequests.count > 0) {
            requests = _trackingRequests;
        } else {
            // No tracking object detected, so perform initial detection
            VNImageRequestHandler *imageRequestHandler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer orientation:exifOrientation options:requestHandlerOptions];
            NSArray *detectionRequests = _detectionRequests;
            if (detectionRequests == nil) {
                return;
            }
            [imageRequestHandler performRequests:_detectionRequests error:&error];
            if (error) {
                NSLog(@"Failed to perform FaceRectangleRequest: %@", error);
            }
            return;
        }
        // SequenceRequestHandler results in 10-20% CPU utilization
        [_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error];
        if (error) {
            NSLog(@"Failed to perform SequenceRequest: %@", error);
            return;
        }
        // Set up the next round of tracking
        NSMutableArray *newTrackingRequests = [NSMutableArray array];
        for (VNTrackObjectRequest *trackingRequest in requests) {
            NSArray *results = trackingRequest.results;
            trackingRequest.trackingLevel = VNRequestTrackingLevelFast;
            VNDetectedObjectObservation *observation = results[0];
            if (![observation isKindOfClass:[VNDetectedObjectObservation class]]) {
                return;
            }
            if (!trackingRequest.isLastFrame) {
                if (observation.confidence > 0.3f) {
                    trackingRequest.inputObservation = observation;
                } else {
                    trackingRequest.lastFrame = YES;
                }
                [newTrackingRequests addObject:trackingRequest];
            }
        }
        _trackingRequests = newTrackingRequests;
        if (newTrackingRequests.count == 0) {
            // Nothing to track, so abort.
            return;
        }
        NSMutableArray *faceLandmarksRequests = [NSMutableArray array];
        for (VNTrackObjectRequest *trackingRequest in newTrackingRequests) {
            VNRequestCompletionHandler handlerBlock = ^(VNRequest * _Nonnull request, NSError * _Nullable error) {
                if (error != nil) {
                    NSLog(@"Facelandmarks error: %@", error);
                }
                VNDetectFaceLandmarksRequest *landmarksRequest = (VNDetectFaceLandmarksRequest *)request;
                NSArray *results = landmarksRequest.results;
                if (results == nil) {
                    return;
                }
                // Perform all UI updates (drawing) on the main queue, not the background queue on which this handler is being called.
                dispatch_async(dispatch_get_main_queue(), ^{
                    for (VNFaceObservation *faceObservation in results) {
                        [self _setEyePositionsForFace:faceObservation];
                        //NSLog(@"seeing face");
                    }
                });
            };
            VNDetectFaceLandmarksRequest *faceLandmarksRequest = [[VNDetectFaceLandmarksRequest alloc] initWithCompletionHandler:handlerBlock];
            NSArray *trackingResults = trackingRequest.results;
            if (trackingResults == nil) {
                return;
            }
            VNDetectedObjectObservation *observation = trackingResults[0];
            if (observation == nil) {
                return;
            }
            VNFaceObservation *faceObservation = [VNFaceObservation observationWithBoundingBox:observation.boundingBox];
            faceLandmarksRequest.inputFaceObservations = @[faceObservation];
            // Continue to track detected facial landmarks.
            [faceLandmarksRequests addObject:faceLandmarksRequest];
            VNImageRequestHandler *imageRequestHandler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer orientation:exifOrientation options:requestHandlerOptions];
            [imageRequestHandler performRequests:faceLandmarksRequests error:&error];
            if (error != nil) {
                NSLog(@"Failed to perform FaceLandmarkRequest: %@", error);
            }
        }
    }
    //_frameCapture = NO;
}
#pragma mark Helper Functions

- (void)captureFrame {
    _frameCapture = YES;
}

- (void)_tearDownAVCapture {
    _videoDataOutput = nil;
    _videoDataOutputQueue = nil;
}

@end
Debugging
The crash seems related to Metal, possibly across multiple threads. It happens when the Vision framework (on a worker thread) executes Metal Performance Shaders from a private neural-network framework (Espresso). Before the crash there is a deadlock related to command buffers, which ultimately leads to the Address Sanitizer reporting BUS on an unknown address. I assume this is why I get KERN_PROTECTION_FAILURE. The other threads are either executing Metal or simply waiting. I don't know whether the semaphores are related to Metal CPU/GPU synchronization or something else. When the code is used with App templates, the Vision framework runs on the main thread and no crash occurs. I can't see how I can address this in any meaningful way except by filing a bug report. That said, my debugging skills leave much to be desired, so any help is strongly appreciated, not only in solving the issue but also in understanding the problem. Address Sanitizer and Thread Sanitizer were turned on for the following output. Due to size constraints the crash report can be read here. A crashing project (on my computer) can now be viewed and downloaded from Dropbox. My computer is a 2019 MacBook Pro 16".
Console output
ErrorTest1(13661,0x107776e00) malloc: nano zone abandoned due to inability to preallocate reserved vm space.
2020-12-24 09:48:35.709965+0100 ErrorTest1[13661:811227] Metal GPU Frame Capture Enabled
2020-12-24 09:48:36.675326+0100 ErrorTest1[13661:811227] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000b7b50> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-12-24 09:48:36.707535+0100 ErrorTest1[13661:811227] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000bb5a0> 30010C1C-93BF-11D8-8B5B-000A95AF9C6A
2020-12-24 09:48:36.845641+0100 ErrorTest1[13661:811227] [] CMIOHardware.cpp:379:CMIOObjectGetPropertyData Error: 2003332927, failed
2020-12-24 09:48:38.717546+0100 ErrorTest1[13661:811794] [logging-persist] cannot open file at line 44580 of [02c344acea]
2020-12-24 09:48:38.717648+0100 ErrorTest1[13661:811794] [logging-persist] os_unix.c:44580: (0) open(/var/db/DetachedSignatures) - Undefined error: 0
2020-12-24 09:48:38.778975+0100 ErrorTest1[13661:811761] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:61:16: warning: unused variable 'input_slice_count'
const uint input_slice_count = (INPUT_FEATURE_CHANNELS + 3) / 4;
^
2020-12-24 09:48:38.779198+0100 ErrorTest1[13661:811812] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:38.779441+0100 ErrorTest1[13661:811838] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:39.072518+0100 ErrorTest1[13661:811838] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:61:16: warning: unused variable 'input_slice_count'
const uint input_slice_count = (INPUT_FEATURE_CHANNELS + 3) / 4;
^
2020-12-24 09:48:39.073210+0100 ErrorTest1[13661:811842] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:98:16: warning: unused variable 'fm_group'
const uint fm_group = threadgroup_id.z - splitId * params.simdsPerGroupData;
^
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:39.073538+0100 ErrorTest1[13661:811812] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:98:16: warning: unused variable 'fm_group'
const uint fm_group = threadgroup_id.z - splitId * params.simdsPerGroupData;
^
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
LLDB bt
* thread #5, queue = 'com.apple.VN.trackersCollectionManagementQueue', stop reason = EXC_BAD_ACCESS (code=2, address=0x70000deb1ff8)
frame #0: 0x000000010739db33 libsystem_pthread.dylib`___chkstk_darwin + 55
frame #1: 0x000000010739dafc libsystem_pthread.dylib`thread_start + 20
frame #2: 0x000000010724277b libMTLCapture.dylib`___lldb_unnamed_symbol2507$$libMTLCapture.dylib + 585
frame #3: 0x00007fff29f597be MPSNeuralNetwork`___lldb_unnamed_symbol4427$$MPSNeuralNetwork + 1907
frame #4: 0x00007fff29f5a3c2 MPSNeuralNetwork`___lldb_unnamed_symbol4432$$MPSNeuralNetwork + 756
frame #5: 0x00007fff29f5aa39 MPSNeuralNetwork`___lldb_unnamed_symbol4435$$MPSNeuralNetwork + 83
frame #6: 0x00007fff339e50e8 Espresso`Espresso::MPSEngine::mps_convolution_kernel::recreate_kernel() + 230
frame #7: 0x00007fff339e3c95 Espresso`Espresso::MPSEngine::convolution_kernel_base<Espresso::generic_convolution_kernel>::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >) + 455
frame #8: 0x00007fff339e724b Espresso`Espresso::MPSEngine::convolution_kernel_proxy::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >) + 103
frame #9: 0x00007fff338b3a8f Espresso`Espresso::generic_convolution_kernel::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >, std::__1::shared_ptr<Espresso::abstract_batch>) + 49
frame #10: 0x00007fff338bdee1 Espresso`Espresso::load_network_layers_post_dispatch(std::__1::shared_ptr<Espresso::net> const&, std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object> const&, std::__1::shared_ptr<Espresso::cpu_context_transfer_algo_t> const&, std::__1::shared_ptr<Espresso::net_info_ir_t> const&, bool, Espresso::network_shape const&, Espresso::compute_path, bool, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&) + 5940
frame #11: 0x00007fff338ba6ee Espresso`Espresso::load_network_layers_internal(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, std::__1::basic_istream<char, std::__1::char_traits<char> >*, Espresso::compute_path, bool, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&) + 793
frame #12: 0x00007fff338c9294 Espresso`Espresso::load_and_shape_network(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, Espresso::compute_path, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 576
frame #13: 0x00007fff338cb715 Espresso`Espresso::load_network(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::compute_path, bool) + 2496
frame #14: 0x00007fff33d9603c Espresso`EspressoLight::espresso_plan::add_network(char const*, espresso_storage_type_t) + 350
frame #15: 0x00007fff33daa817 Espresso`espresso_plan_add_network + 294
frame #16: 0x00007fff30479b9d Vision`+[VNEspressoHelpers createSingleNetworkPlanFromResourceName:usingProcessingDevice:lowPriorityMode:inputBlobNames:outputBlobNames:explicitNetworkLayersStorageType:espressoResources:error:] + 517
frame #17: 0x00007fff3047992d Vision`+[VNEspressoHelpers createSingleNetworkPlanFromResourceName:usingProcessingDevice:lowPriorityMode:inputBlobNames:outputBlobNames:espressoResources:error:] + 151
frame #18: 0x00007fff303ce123 Vision`-[VNRPNTrackerEspressoModelCacheManager espressoResourcesFromOptions:error:] + 417
frame #19: 0x00007fff303ce8c8 Vision`-[VNObjectTrackerRevision2 initWithOptions:error:] + 262
frame #20: 0x00007fff304152df Vision`__54-[VNTrackerManager _createTracker:type:options:error:]_block_invoke + 207
frame #21: 0x00000001072fc0b0 libdispatch.dylib`_dispatch_client_callout + 8
frame #22: 0x000000010730d3b2 libdispatch.dylib`_dispatch_lane_barrier_sync_invoke_and_complete + 135
frame #23: 0x00007fff30414f01 Vision`-[VNTrackerManager _createTracker:type:options:error:] + 261
frame #24: 0x00007fff30414b52 Vision`-[VNTrackerManager trackerWithOptions:error:] + 509
frame #25: 0x00007fff304dda4a Vision`-[VNRequestPerformer trackerWithOptions:error:] + 85
frame #26: 0x00007fff30343ac4 Vision`-[VNTrackingRequest internalPerformRevision:inContext:error:] + 436
frame #27: 0x00007fff3037fb08 Vision`-[VNRequest performInContext:error:] + 885
frame #28: 0x00007fff303cd9a1 Vision`VNExecuteBlock + 58
frame #29: 0x00007fff304dd105 Vision`-[VNRequestPerformer _performOrderedRequests:inContext:error:] + 674
frame #30: 0x00007fff304dd482 Vision`-[VNRequestPerformer performRequests:inContext:onBehalfOfRequest:error:] + 352
frame #31: 0x00007fff304dd586 Vision`-[VNRequestPerformer performRequests:inContext:error:] + 60
frame #32: 0x00007fff304cbf1a Vision`-[VNSequenceRequestHandler _performRequests:onImageBuffer:gatheredForensics:error:] + 293
frame #33: 0x00007fff304cc122 Vision`-[VNSequenceRequestHandler performRequests:onCVPixelBuffer:orientation:gatheredForensics:error:] + 111
frame #34: 0x00007fff304cc0aa Vision`-[VNSequenceRequestHandler performRequests:onCVPixelBuffer:orientation:error:] + 28
* frame #35: 0x0000000106fc5a97 ErrorTest1`-[LSPVision captureOutput:didOutputSampleBuffer:fromConnection:](self=0x0000608000047c20, _cmd="captureOutput:didOutputSampleBuffer:fromConnection:", output=0x00006030000ce770, sampleBuffer=0x0000614000091240, connection=0x00006030000d0c30) at LSPVision.m:246:9
frame #36: 0x00007fff3786b2e0 AVFCapture`__56-[AVCaptureVideoDataOutput_Tundra _render:sampleBuffer:]_block_invoke + 213
frame #37: 0x00000001077ff3bb libclang_rt.asan_osx_dynamic.dylib`__wrap_dispatch_async_block_invoke + 203
frame #38: 0x00000001072fae78 libdispatch.dylib`_dispatch_call_block_and_release + 12
frame #39: 0x00000001072fc0b0 libdispatch.dylib`_dispatch_client_callout + 8
frame #40: 0x00000001073036b7 libdispatch.dylib`_dispatch_lane_serial_drain + 776
frame #41: 0x0000000107304594 libdispatch.dylib`_dispatch_lane_invoke + 449
frame #42: 0x0000000107312217 libdispatch.dylib`_dispatch_workloop_worker_thread + 1675
frame #43: 0x000000010739eb15 libsystem_pthread.dylib`_pthread_wqthread + 314
frame #44: 0x000000010739dae3 libsystem_pthread.dylib`start_wqthread + 15
Update
The bug seems to be resolved on macOS Monterey 12.1.

More of a comment here: I am trying to reproduce this. I've taken your code as is, but had to comment out [self _setEyePositionsForFace:faceObservation]; in
for (VNFaceObservation *faceObservation in results) {
    //[self _setEyePositionsForFace:faceObservation];
    //NSLog(@"seeing face");
}
as you do not give its implementation. However, with that done, I was able to run the code without any issue. To test further I added the logs below.
// SequenceRequestHandler results in 10-20% CPU utilization
NSLog(@"aaa");
[_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error];
NSLog(@"bbb");
as I understand your problem is with [_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error]; specifically, but I did not run into trouble and the log showed a lot of repeating aaas and bbbs. To test further I also added an ok log as shown below
if (error != nil) {
    NSLog(@"Failed to perform FaceLandmarkRequest: %@", error);
} else {
    NSLog(@"ok");
}
and that happily printed together with the aaa and bbb.
I also hooked up a button as shown below
- (IBAction)buttonAction:(id)sender {
    NSLog(@"Button");
    [self.v captureFrame];
}
where self.v is an instance of (my) LSPVision, and I could push the button as much as I liked without trouble.
I think the problem either lies somewhere else, maybe even in the _setEyePositionsForFace that I commented out, or perhaps you can give even more code so I can reproduce it on my side?
FWIW here is a sample of the log
2020-12-27 09:14:54.147536+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.184167+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.268926+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.269374+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.314135+0200 MetalCaptureTest[11392:316676] Button
2020-12-27 09:14:54.316025+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.393732+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.394171+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.432979+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.496887+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.497389+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.533118+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.614813+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.615394+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.663343+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.747860+0200 MetalCaptureTest[11392:317094] ok
EDIT
Thanks, I got the Dropbox project and it is working on my side. No crashes at all. This is the log.
ErrorTest1(11743,0x10900ce00) malloc: nano zone abandoned due to inability to preallocate reserved vm space.
2020-12-27 10:55:10.445333+0200 ErrorTest1[11743:344803] Metal GPU Frame Capture Enabled
2020-12-27 10:55:10.471650+0200 ErrorTest1[11743:344803] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000aabc0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-12-27 10:55:10.528628+0200 ErrorTest1[11743:344803] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000ae130> 30010C1C-93BF-11D8-8B5B-000A95AF9C6A
2020-12-27 10:55:10.608753+0200 ErrorTest1[11743:344803] [] CMIOHardware.cpp:379:CMIOObjectGetPropertyData Error: 2003332927, failed
2020-12-27 10:55:11.408594+0200 ErrorTest1[11743:344873] [logging-persist] cannot open file at line 44580 of [02c344acea]
2020-12-27 10:55:11.408806+0200 ErrorTest1[11743:344873] [logging-persist] os_unix.c:44580: (0) open(/var/db/DetachedSignatures) - Undefined error: 0
2020-12-27 10:55:17.637382+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:17.838354+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:17.987583+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:18.171168+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:18.320957+0200 ErrorTest1[11743:344803] seeing face
FWIW I have the latest OS, Big Sur 11.1, the latest Xcode, 12.3, and am running this on a MacBook Air 2017. From your description I suspect the multithreading could be a problem, but for now my focus is on reproducing it on my side.

I also got an unexplainable crash using Vision in one app, but not in another one I created for testing. Turns out enabling "Metal API Validation" made the problem go away for me. I do have several custom Metal kernels in my app, but I still don't understand what the root problem is.

Related

SIGSEGV in objc_release when using pthreads

I'm running some Metal code in a thread, but encountering some issues I don't fully understand. Running the following code with USE_THREAD 0 and USE_AUTORELEASEPOOL 0 works fine but setting either one to 1 results in a SIGSEGV in objc_release:
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x20)
* frame #0: 0x00007fff2020d4af libobjc.A.dylib`objc_release + 31
frame #1: 0x00007fff2022b20f libobjc.A.dylib`AutoreleasePoolPage::releaseUntil(objc_object**) + 167
frame #2: 0x00007fff2020de30 libobjc.A.dylib`objc_autoreleasePoolPop + 161
frame #3: 0x0000000100003d60 a.out`render(void*) + 896
frame #4: 0x0000000100003dd8 a.out`main + 24
frame #5: 0x00007fff20388f3d libdyld.dylib`start + 1
frame #6: 0x00007fff20388f3d libdyld.dylib`start + 1
The autoreleasepool case I can understand, since the objects are already released (release is called manually on them), but why does the same issue occur when the code is running inside a thread? Is this related to pthreads specifically? Is there a "hidden" autoreleasepool somewhere that I am missing?
I understand that using an autoreleasepool and not releasing manually will achieve the same result, but I am trying to understand what is going on here.
// clang++ main.mm -lobjc -framework Metal

#define USE_THREAD 0
#define USE_AUTORELEASEPOOL 1

#import <Metal/Metal.h>

void * render(void *) {
#if USE_AUTORELEASEPOOL
    @autoreleasepool {
#else
    {
#endif
        NSArray<id<MTLDevice>> * devices = MTLCopyAllDevices();
        id<MTLDevice> device = devices[0];
        id<MTLCommandQueue> command_queue = [device newCommandQueue];

        MTLTextureDescriptor * texture_descriptor = [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm width:640 height:480 mipmapped:NO];
        texture_descriptor.usage = MTLTextureUsageRenderTarget;
        id<MTLTexture> texture = [device newTextureWithDescriptor:texture_descriptor];
        [texture_descriptor release];
        texture_descriptor = NULL;

        id<MTLCommandBuffer> command_buffer = [command_queue commandBuffer];
        MTLRenderPassDescriptor * render_pass_descriptor = [MTLRenderPassDescriptor renderPassDescriptor];
        render_pass_descriptor.colorAttachments[0].texture = texture;
        id<MTLRenderCommandEncoder> render_command_encoder = [command_buffer renderCommandEncoderWithDescriptor:render_pass_descriptor];
        [render_pass_descriptor release];
        render_pass_descriptor = NULL;
        [render_command_encoder endEncoding];
        [render_command_encoder release];
        render_command_encoder = nil;

        [command_buffer commit];
        [command_buffer waitUntilCompleted];
        [command_buffer release];
        command_buffer = nil;

        [texture release];
        texture = nil;
        [command_queue release];
        command_queue = nil;
    }
    return 0;
}

#include <pthread.h>

int main() {
#if USE_THREAD
    pthread_t thread;
    pthread_create(&thread, NULL, render, NULL);
    pthread_join(thread, NULL);
#else
    render(NULL);
#endif
    return 0;
}
I'm not sure whether Apple has ever updated their documentation regarding this behaviour, but since approximately macOS 10.7, all POSIX threads have been populated with an autorelease pool block automatically. You can see it clearly from the dealloc of autoreleased objects:
#import <Foundation/Foundation.h>
#import <pthread.h>

@interface TDWObject : NSObject
@end

@implementation TDWObject

- (void)dealloc {
    NSLog(@"%@", [NSThread callStackSymbols]);
    [super dealloc]; // under MRC, [super dealloc] must be the last statement
}

@end

void *run(void *objPtr) {
    TDWObject *obj = (__bridge TDWObject *)objPtr;
    [obj autorelease];
    return NULL;
}

int main() {
    pthread_t thread;
    TDWObject *obj = [TDWObject new];
    pthread_create(&thread, NULL, run, (__bridge void *)obj);
    pthread_join(thread, NULL);
    return 0;
}
This code (when compiled with MRC) will print the following stacktrace:
-[TDWObject dealloc] + 83
_ZN19AutoreleasePoolPage12releaseUntilEPP11objc_object + 168
objc_autoreleasePoolPop + 227
_ZN20objc_tls_direct_baseIP19AutoreleasePoolPageL7tls_key3ENS0_14HotPageDeallocEE5dtor_EPv + 140
_pthread_tsd_cleanup + 607
_pthread_exit + 70
_pthread_start + 136
thread_start + 15
The objc_autoreleasePoolPop function is exactly what gets called at the end of autorelease pool blocks nowadays.

How is it that Cocoa GUI Class Runs without invoking NSApplication or NSRunLoop

Why does the following code work? It's a small Cocoa program that uses NSOpenPanel to select a file and open it in Emacs.app. It can be run from the command line with the starting directory as an argument.
How does NSOpenPanel run without invoking NSApplication or NSRunLoop? What are the limitations on a Cocoa program that doesn't explicitly start NSApplication or NSRunLoop? I would have thought one of them was that you can't use any kind of GUI. Perhaps invoking NSOpenPanel calls some fallback code that invokes NSRunLoop? I put breakpoints on +[NSApplication alloc] and +[NSRunLoop alloc], and they were not triggered.
main.m:
#import <Cocoa/Cocoa.h>
NSString *selectFileWithStartPath(NSString *path) {
NSString *answer = nil;
NSOpenPanel* panel = [NSOpenPanel openPanel];
panel.allowsMultipleSelection = NO;
panel.canChooseFiles = YES;
panel.canChooseDirectories = NO;
panel.resolvesAliases = YES;
if([panel runModalForDirectory:path file:nil] == NSOKButton)
answer = [[[panel URLs] objectAtIndex:0] path];
return answer;
}
int main(int argc, const char * argv[]) {
NSString *startPath = argc > 1 ? [NSString stringWithUTF8String:argv[1]] : @"/Users/Me/Docs";
printf("%s\n", [startPath UTF8String]);
BOOL isDir;
if([[NSFileManager defaultManager] fileExistsAtPath:startPath isDirectory:&isDir] && isDir) {
system([[NSString stringWithFormat:@"find %@ -name \\*~ -exec rm {} \\;", startPath] UTF8String]);
NSString *file = selectFileWithStartPath(startPath);
if(file) [[NSWorkspace sharedWorkspace] openFile:file withApplication:@"Emacs.app"];
}
}
runModalForDirectory:file: creates and runs its own modal event loop.
From the documentation:
Displays the panel and begins a modal event loop that is terminated
when the user clicks either OK or Cancel.
You can see that also if you pause the program while the "Open" dialog is active
and print the stack backtrace in the debugger console:
frame #0: 0x00007fff8b855a1a libsystem_kernel.dylib`mach_msg_trap + 10
frame #1: 0x00007fff8b854d18 libsystem_kernel.dylib`mach_msg + 64
frame #2: 0x00007fff8549f155 CoreFoundation`__CFRunLoopServiceMachPort + 181
frame #3: 0x00007fff8549e779 CoreFoundation`__CFRunLoopRun + 1161
frame #4: 0x00007fff8549e0b5 CoreFoundation`CFRunLoopRunSpecific + 309
frame #5: 0x00007fff88381a0d HIToolbox`RunCurrentEventLoopInMode + 226
frame #6: 0x00007fff883817b7 HIToolbox`ReceiveNextEventCommon + 479
frame #7: 0x00007fff883815bc HIToolbox`_BlockUntilNextEventMatchingListInModeWithFilter + 65
frame #8: 0x00007fff838ca3de AppKit`_DPSNextEvent + 1434
frame #9: 0x00007fff838c9a2b AppKit`-[NSApplication nextEventMatchingMask:untilDate:inMode:dequeue:] + 122
frame #10: 0x00007fff83c28e2e AppKit`-[NSApplication _realDoModalLoop:peek:] + 642
frame #11: 0x00007fff83c2754e AppKit`-[NSApplication runModalForWindow:] + 117
frame #12: 0x00007fff83ef5d0b AppKit`-[NSSavePanel runModal] + 276
frame #13: 0x0000000100000c61 xxx`selectFileWithStartPath(path=0x00000001000010b8) + 225 at main.m:18
frame #14: 0x0000000100000e0d xxx`main(argc=1, argv=0x00007fff5fbff9c0) + 189 at main.m:26
As you can see in frame #11, -[NSApplication runModalForWindow:] is used to run a modal
event loop for the "Open" dialog.

autosavesInPlace causes New Document save to fail

I have an NSPersistentDocument-based app which fails to save a new document when autosavesInPlace is set to return YES; return NO and the problem disappears.
I create a new document
Make some changes
Save it, thus running NSSaveAsOperation; the name of the document and the URL change and all appears to be well, but the next save will throw a very descriptive
NSPersistentStoreSaveError = 134030, // unclassified save error - something we depend on returned an error
This only happens when the document attempts a save after an NSSaveAsOperation. Any other save type works fine, such as changes to an existing doc. An interesting effect is that if I don't change the name or location, I don't get this issue either.
I'm getting an exception backtrace of
frame #0: 0x00007fff988143c5 libobjc.A.dylib`objc_exception_throw
frame #1: 0x00007fff94c5f5f9 CoreData`-[NSPersistentStore(_NSInternalMethods) _preflightCrossCheck] + 697
frame #2: 0x00007fff94c3198b CoreData`-[NSPersistentStoreCoordinator executeRequest:withContext:error:] + 603
frame #3: 0x00007fff94c5aa98 CoreData`-[NSManagedObjectContext save:] + 456
frame #4: 0x00007fff91baa101 AppKit`-[NSPersistentDocument writeToURL:ofType:forSaveOperation:originalContentsURL:error:] + 3743
frame #5: 0x0000000100002de7 ZZZZ`-[ZZZZDocument writeToURL:ofType:forSaveOperation:originalContentsURL:error:] + 135 at ZZZZDocument.m:209
frame #6: 0x00007fff91baabc7 AppKit`-[NSPersistentDocument writeSafelyToURL:ofType:forSaveOperation:error:] + 611
frame #7: 0x0000000100002ea3 ZZZZ`-[ZZZZDocument writeSafelyToURL:ofType:forSaveOperation:error:] + 115 at ZZZZDocument.m:223
any ideas?
It's not possible for an un-wrapped Core Data file.
If you attempt to trap NSSaveAsOperation and migrate the persistent store for that case, the creation of the ...-journal file will fail because it is outside the sandbox.
-(void)saveToURL:(NSURL *)url ofType:(NSString *)typeName forSaveOperation:(NSSaveOperationType)saveOperation completionHandler:(void (^)(NSError *))completionHandler
{
NSLog(@" save op = %ld to %@ ", (long)saveOperation, url);
NSURL *targeturl = url;
if(saveOperation == NSSaveAsOperation)
{
//migrate pstore
NSPersistentStore *store = [self.managedObjectContext.persistentStoreCoordinator.persistentStores lastObject];
if (store)
{
NSMutableDictionary *opts = [NSMutableDictionary dictionary];
[opts setValue:[NSNumber numberWithBool:YES] forKey:NSInferMappingModelAutomaticallyOption];
[opts setValue:[NSNumber numberWithBool:YES] forKey:NSMigratePersistentStoresAutomaticallyOption];
NSError *error = nil;
NSPersistentStore *newstore = [self.managedObjectContext.persistentStoreCoordinator migratePersistentStore:store toURL:url options:opts withType:store.type error:&error];
if(newstore == nil)
{
NSLog(@"migration error %@", [error localizedDescription]);
}
self.fileURL = url;
}
}
[super saveToURL:targeturl ofType:typeName forSaveOperation:saveOperation completionHandler:completionHandler];
}
So we need to wrap the file in a bundle/folder, which is non-trivial using the NSPersistentDocument framework.
Here's waiting for NSManagedDocument (that's a wishing-well API).

What causes this EXC_CRASH in my iPad app?

I got this symbolicated stack trace from a crash report from my iPad app (excerpt):
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x00000000, 0x00000000
Crashed Thread: 0
0 ImageIO 0x34528eb4 _CGImagePluginIdentifyPNG + 0
1 ImageIO 0x34528d90 _CGImageSourceBindToPlugin + 368
2 ImageIO 0x34528bda CGImageSourceGetCount + 26
3 UIKit 0x341b8f66 _UIImageRefAtPath + 366
4 UIKit 0x342650ce -[UIImage initWithContentsOfFile:] + 50
5 UIKit 0x342b0314 +[UIImage imageWithContentsOfFile:] + 28
6 DesignScene 0x00013a2a -[LTImageCache fetchImageforURL:] (LTImageCache.m:37)
…
Here are the contents of -[LTImageCache fetchImageforURL:]:
- (UIImage *)fetchImageforURL:(NSString *)theUrl {
NSString *key = theUrl.md5Hash;
return [UIImage imageWithContentsOfFile:[self filenameForKey:key]];
}
And the contents of -[LTImageCache filenameForKey:]:
- (NSString *) filenameForKey:(NSString *) key {
return [_cacheDir stringByAppendingPathComponent:key];
}
The _cacheDir ivar is created and retained in -init. So the question is, what caused this crash? Is the problem that:
the return value of -[LTImageCache filenameForKey:] needs to be retained (it's autoreleased)
An unhandled exception somewhere (+[UIImage imageWithContentsOfFile:] claims to return nil if the image is unrecognizable)
Something else…I'm out of guesses
I would think that an autoreleased value would be fine. And in truth, this code has been working fine for months, and this method is called hundreds of times in a session. This is a rare crash under very specific circumstances (the app left loaded overnight, crash upon unlocking the iPad in the morning).
What's causing this crash?
I'm guessing, but it looks like a bogus image file. Is this in your app bundle or do you download it?
I don't think it has anything to do with memory management.
To test, you could try using ImageIO to open the file yourself:
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)self.url, NULL);
if(NULL != imageSource) {
size_t imageCount = CGImageSourceGetCount(imageSource);
if(imageCount > 0) {
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCGImageSourceCreateThumbnailFromImageIfAbsent,
[NSNumber numberWithInteger:maxSize], kCGImageSourceThumbnailMaxPixelSize, nil];
CGImageRef thumbImage = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, (CFDictionaryRef)options);
self.image = [UIImage imageWithCGImage:thumbImage scale:scale orientation:imageOrientation];
CGImageRelease(thumbImage);
}
CFRelease(imageSource); // release the source on every path
} else {
NSLog(@"Unable to open image %@", self.url);
}
and then try to find the image count.
Using maxSize and getting a thumbnail will ensure you don't load a 5 mega-pixel image to put into a 100x100 tile on your UI.
scale is the window's scale (will be 2 for iPhone 4 and 1 for everything else).
To find the orientation you need to use the CGImageSourceCopyPropertiesAtIndex function and then the kCGImagePropertyOrientation key to get the orientation of the particular image.

CFHTTPStream segmentation fault during NSRunLoop run

I'm trying to assemble a toy application to interact with http servers via CoreFoundation. I'm not using NSURLConnection because I want to be able to specify a proxy different from the one set in the OSX System Preferences.
My problem is that I wasn't able to find any working example, and the Apple documentation does not provide a working snippet of code.
Anyhow, this is the code I pulled together:
#import <CoreFoundation/CoreFoundation.h>
#import <CoreServices/CoreServices.h>
#import <Foundation/Foundation.h>
void myCallBack (CFReadStreamRef stream, CFStreamEventType eventType, void *clientCallBackInfo){
NSLog(@"Something happened");
BOOL *done = ((BOOL *)clientCallBackInfo);
*done = TRUE;
}
int main(int argc, char *argv[]){
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
CFStringRef bodyString = CFSTR(""); // Usually used for POST data
CFStringRef url = CFSTR("http://www.apple.com");
CFURLRef myURL = CFURLCreateWithString(kCFAllocatorDefault, url, NULL);
CFStringRef requestMethod = CFSTR("GET");
CFHTTPMessageRef myRequest = CFHTTPMessageCreateRequest(kCFAllocatorDefault,
requestMethod,
myURL,
kCFHTTPVersion1_1);
if (myRequest){
NSLog(@"Request created: %@ %@", CFHTTPMessageCopyRequestMethod(myRequest),
CFURLGetString(CFHTTPMessageCopyRequestURL(myRequest)));
}
CFDataRef bodyData = CFStringCreateExternalRepresentation(kCFAllocatorDefault, bodyString, kCFStringEncodingASCII, 0);
CFHTTPMessageSetBody(myRequest, bodyData);
NSLog(@"HTTPMessage body set");
CFDataRef mySerializedRequest = CFHTTPMessageCopySerializedMessage(myRequest);
NSLog(@"Serialized string generated");
CFStringRef mySerializedString = CFStringCreateFromExternalRepresentation(kCFAllocatorDefault,
mySerializedRequest,
kCFStringEncodingASCII);
NSLog(@"Serialized request: %@", mySerializedString);
CFReadStreamRef myReadStream = CFReadStreamCreateForHTTPRequest(kCFAllocatorDefault, myRequest);
BOOL done = FALSE;
CFReadStreamOpen(myReadStream);
CFOptionFlags registeredEvents = kCFStreamEventHasBytesAvailable | kCFStreamEventErrorOccurred | kCFStreamEventEndEncountered;
NSLog(@"Setting read stream client");
if (CFReadStreamSetClient(myReadStream, registeredEvents, myCallBack, (void *)&done)){
CFReadStreamScheduleWithRunLoop(myReadStream, CFRunLoopGetCurrent(),
kCFRunLoopCommonModes);
while (!done){
NSLog(@"Wait");
[[NSRunLoop currentRunLoop] run];
}
}else{
NSLog(@"Error setting readstream client");
}
CFRelease(myRequest);
CFRelease(myURL);
CFRelease(url);
CFRelease(mySerializedRequest);
myRequest = NULL;
mySerializedRequest = NULL;
[pool drain];
}
The code compiles but it throws a segfault:
Reading symbols for shared libraries .+++++...................... done
2010-09-06 21:36:34.470 a.out[27973:a0f] Request created: GET http://www.apple.com
2010-09-06 21:36:34.473 a.out[27973:a0f] HTTPMessage body set
2010-09-06 21:36:34.474 a.out[27973:a0f] Serialized string generated
2010-09-06 21:36:34.474 a.out[27973:a0f] Serialized request: GET / HTTP/1.1
2010-09-06 21:36:34.478 a.out[27973:a0f] Setting read stream client
2010-09-06 21:36:34.479 a.out[27973:a0f] Wait
Program received signal EXC_BAD_ACCESS, Could not access memory.
Reason: 13 at address: 0x0000000000000000
0x00007fff83b7b83d in _signalEventSync ()
(gdb) bt
#0 0x00007fff83b7b83d in _signalEventSync ()
#1 0x00007fff83b7b7f4 in _cfstream_solo_signalEventSync ()
#2 0x00007fff83b7b734 in _CFStreamSignalEvent ()
#3 0x00007fff8391c3a1 in HTTPReadStream::streamEvent ()
#4 0x00007fff83b7b883 in _signalEventSync ()
#5 0x00007fff83b7c5d9 in _cfstream_shared_signalEventSync ()
#6 0x00007fff83b1be91 in __CFRunLoopDoSources0 ()
#7 0x00007fff83b1a089 in __CFRunLoopRun ()
#8 0x00007fff83b1984f in CFRunLoopRunSpecific ()
#9 0x00007fff8300fa18 in -[NSRunLoop(NSRunLoop) runMode:beforeDate:] ()
#10 0x00007fff8300f8f7 in -[NSRunLoop(NSRunLoop) run] ()
#11 0x0000000100000c94 in main (argc=1, argv=0x7fff5fbff5b8) at test.m:56
I don't understand where the error is. Anyone can show me the light or point me to a clear example on how to manage CFHTTPStreams?
Thanks,
Andrea
The problem was that I didn't read the documentation of CFReadStreamSetClient carefully.
The last parameter has to be a pointer to a CFStreamClientContext, whereas I was passing a reference to an arbitrary pointer, which instead has to be stored in the context's info field.
Fixing this resolved the segmentation fault, and now the test case works as expected.