autosavesInPlace causes New Document save to fail - objective-c

I have an NSPersistentDocument-based app which fails to save a new document when autosavesInPlace is set to return YES; return NO and the problem disappears.
I create a new document
Make some changes
Save it, thus running NSSaveAsOperation; the name of the document and the URL change and all appears to be well, but the next save will throw a very descriptive
NSPersistentStoreSaveError = 134030, // unclassified save error - something we depend on returned an error
This only happens when the document attempts to run a save after an NSSaveAsOperation; any other save type, such as changes to an existing document, works fine. Interestingly, if I don't change the name or location I don't get this issue either.
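For reference, the override in question is the NSDocument class method sketched below (a minimal sketch, not the original code); per the description above, returning NO instead is what makes the problem disappear.
+ (BOOL)autosavesInPlace
{
    return YES; // YES triggers the failing save after NSSaveAsOperation; NO avoids it
}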
I'm getting an exception backtrace of:
frame #0: 0x00007fff988143c5 libobjc.A.dylib`objc_exception_throw
frame #1: 0x00007fff94c5f5f9 CoreData`-[NSPersistentStore(_NSInternalMethods) _preflightCrossCheck] + 697
frame #2: 0x00007fff94c3198b CoreData`-[NSPersistentStoreCoordinator executeRequest:withContext:error:] + 603
frame #3: 0x00007fff94c5aa98 CoreData`-[NSManagedObjectContext save:] + 456
frame #4: 0x00007fff91baa101 AppKit`-[NSPersistentDocument writeToURL:ofType:forSaveOperation:originalContentsURL:error:] + 3743
frame #5: 0x0000000100002de7 ZZZZ`-[ZZZZDocument writeToURL:ofType:forSaveOperation:originalContentsURL:error:] + 135 at ZZZZDocument.m:209
frame #6: 0x00007fff91baabc7 AppKit`-[NSPersistentDocument writeSafelyToURL:ofType:forSaveOperation:error:] + 611
frame #7: 0x0000000100002ea3 ZZZZ`-[ZZZZDocument writeSafelyToURL:ofType:forSaveOperation:error:] + 115 at ZZZZDocument.m:223
Any ideas?

It's not possible for an un-wrappered Core Data file.
If you attempt to trap NSSaveAsOperation and migrate the persistent store for that case, creation of the ...-journal file will fail because it is outside the sandbox.
-(void)saveToURL:(NSURL *)url ofType:(NSString *)typeName forSaveOperation:(NSSaveOperationType)saveOperation completionHandler:(void (^)(NSError *))completionHandler
{
NSLog(#" save op = %ld to %# ",saveOperation,url);
NSURL *targeturl = url;
if(saveOperation == NSSaveAsOperation)
{
//migrate pstore
NSPersistentStore *store = [self.managedObjectContext.persistentStoreCoordinator.persistentStores lastObject];
if (store)
{
NSMutableDictionary *opts = [NSMutableDictionary dictionary];
[opts setValue:[NSNumber numberWithBool:YES] forKey:NSInferMappingModelAutomaticallyOption];
[opts setValue:[NSNumber numberWithBool:YES] forKey:NSMigratePersistentStoresAutomaticallyOption];
NSError *error = NULL;
NSPersistentStore *newstore = [self.managedObjectContext.persistentStoreCoordinator migratePersistentStore:store toURL:url options:opts withType:store.type error:&error];
if(newstore == nil)
{
NSLog(#"migration error %#",[error localizedDescription]);
}
self.fileURL = url;
}
}
[super saveToURL:targeturl ofType:typeName forSaveOperation:saveOperation completionHandler:completionHandler];
}
So we need to wrap the file in a bundle/folder, which is non-trivial with the NSPersistentDocument framework.
Here's waiting for NSManagedDocument (that's a wishing-well API).

Related

EXC_BAD_ACCESS on VNSequenceRequestHandler

The following code uses the Vision and AVFoundation frameworks to enable face tracking on the built-in camera on macOS. In some circumstances the code crashes due to EXC_BAD_ACCESS (code=2) on a worker thread on the queue com.apple.VN.trackersCollectionManagementQueue (serial). The application works as intended as long as no face is detected, but crashes as soon as it detects a face and attempts to track it with the method
[_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error]
The method is called inside the AVCaptureVideoDataOutputSampleBufferDelegate method
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
From what I understand, EXC_BAD_ACCESS means I can't access memory [1], and the error code (2) KERN_PROTECTION_FAILURE means that the specified memory is valid, but does not permit the required forms of access [2]. A (possibly outdated) technical note explains that this is caused by the thread trying to write to read-only memory [3]. From this, I understand the problem is not caused by premature deallocation or memory corruption, but by memory access control across threads.
I believe the problem appeared after an update. The crash happens when Debug Executable is checked in the scheme editor for Game template projects (Metal, SceneKit, and SpriteKit), but not when the code is used in the App and Document App templates. The code also works as intended when adapted to iOS on a physical device. I have tried to isolate the problem by trimming away as much code as possible, and the following files can be added to any template.
Header file
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
NS_ASSUME_NONNULL_BEGIN
@interface LSPVision : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic) AVCaptureVideoPreviewLayer *previewLayer;
- (void)captureFrame;
@end
NS_ASSUME_NONNULL_END
Implementation file
#import "LSPVision.h"
#import <Vision/Vision.h>
@implementation LSPVision
{
// AVCapture stuff
AVCaptureSession *_session;
AVCaptureVideoDataOutput *_videoDataOutput;
dispatch_queue_t _videoDataOutputQueue;
CGSize _captureDeviceResolution;
// Vision requests
NSMutableArray *_detectionRequests; // Array of VNDetectFaceRectanglesRequest
NSMutableArray *_trackingRequests; // Array of VNTrackObjectRequest
VNSequenceRequestHandler *_sequenceRequestHandler;
BOOL _frameCapture;
}
- (nonnull instancetype)init
{
self = [super init];
if(self)
{
_session = [self _setupAVCaptureSession];
[self designatePreviewLayerForCaptureSession:_session];
[self _prepareVisionRequest];
_frameCapture = YES;
if (_session) {
[_session startRunning];
}
}
return self;
}
# pragma mark Setup AVSession
- (AVCaptureSession *)_setupAVCaptureSession {
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *device;
#if defined(TARGET_MACOS)
if (@available(macOS 10.15, *)) {
AVCaptureDeviceDiscoverySession *discoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionFront];
device = discoverySession.devices.firstObject;
}
#endif
if (device != nil) {
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
if ([captureSession canAddInput:deviceInput]) {
[captureSession addInput:deviceInput];
}
AVCaptureDeviceFormat *lowestResolution = [self _lowestResolution420Format:device];
if (lowestResolution != nil) {
if ([device lockForConfiguration:nil]) {
device.activeFormat = lowestResolution;
[device unlockForConfiguration];
}
}
}
if (device != nil) {
[self _configureVideoDataOutput:device captureSession:captureSession];
return captureSession;
}
NSLog(#"Hold up, something went wrong with AVCaptureSession");
[self _tearDownAVCapture];
return nil;
}
- (AVCaptureDeviceFormat *)_lowestResolution420Format:(AVCaptureDevice *)device {
AVCaptureDeviceFormat *lowestResolutionFormat = nil;
CMVideoDimensions lowestResolutionDimensions = { .height = (int32_t)10000, .width = (int32_t)10000 };
for (AVCaptureDeviceFormat *deviceFormat in device.formats) {
CMFormatDescriptionRef deviceFormatDescription = deviceFormat.formatDescription;
if (CMFormatDescriptionGetMediaSubType(deviceFormatDescription) == (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange | kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)) {
CMVideoDimensions candidateDimensions = CMVideoFormatDescriptionGetDimensions(deviceFormatDescription);
if ((lowestResolutionFormat == nil) || candidateDimensions.width > lowestResolutionDimensions.width) {
lowestResolutionFormat = deviceFormat;
lowestResolutionDimensions = candidateDimensions;
NSLog(#"Device Format: Width: %d, Height: %d", candidateDimensions.width, candidateDimensions.height);
_captureDeviceResolution.width = candidateDimensions.width;
_captureDeviceResolution.height = candidateDimensions.height;
}
}
}
if (lowestResolutionFormat != nil) {
return lowestResolutionFormat;
}
return nil;
}
- (void)designatePreviewLayerForCaptureSession:(AVCaptureSession *)session {
AVCaptureVideoPreviewLayer *videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
self.previewLayer = videoPreviewLayer;
videoPreviewLayer.name = @"Camera Preview";
}
- (void)_configureVideoDataOutput:(AVCaptureDevice *)inputDevice captureSession:(AVCaptureSession *)captureSession {
AVCaptureVideoDataOutput *videoDataOutput = [AVCaptureVideoDataOutput new];
videoDataOutput.alwaysDiscardsLateVideoFrames = true;
// Create a serial dispatch queue used for the sample buffer delegate as well as when a still image is captured.
// A serial dispatch queue must be used to guarantee that video frames will be delivered in order.
dispatch_queue_t videoDataOutputQueue = dispatch_queue_create("com.example.apple-samplecode.VisionFaceTrack", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
if ([captureSession canAddOutput:videoDataOutput]) {
[captureSession addOutput:videoDataOutput];
}
[videoDataOutput connectionWithMediaType:AVMediaTypeVideo].enabled = true;
_videoDataOutput = videoDataOutput;
_videoDataOutputQueue = videoDataOutputQueue;
}
# pragma mark Vision Request
- (void)_prepareVisionRequest {
NSMutableArray *requests = [NSMutableArray array];
VNRequestCompletionHandler handlerBlock = ^(VNRequest * _Nonnull request, NSError * _Nullable error) {
if (error) {
NSLog(#"Handler error: %#", error);
}
VNDetectFaceRectanglesRequest *faceDetectionRequest = (VNDetectFaceRectanglesRequest *)request;
dispatch_async(dispatch_get_main_queue(), ^{
for (VNFaceObservation *observation in faceDetectionRequest.results) {
VNTrackObjectRequest *faceTrackingRequest = [[VNTrackObjectRequest alloc] initWithDetectedObjectObservation:observation];
NSUInteger count = requests.count;
[requests insertObject:faceTrackingRequest atIndex:count];
}
self->_trackingRequests = [requests copy];
});
};
VNDetectFaceRectanglesRequest *faceDetectionRequest = [[VNDetectFaceRectanglesRequest alloc] initWithCompletionHandler:handlerBlock];
_detectionRequests = [NSMutableArray arrayWithObject:faceDetectionRequest];
_sequenceRequestHandler = [[VNSequenceRequestHandler alloc] init];
}
# pragma mark Delegate functions
// AVCaptureVideoDataOutputSampleBufferDelegate
// Handle delegate method callback on receiving a sample buffer.
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
if (_frameCapture == YES) {
NSMutableDictionary *requestHandlerOptions = [NSMutableDictionary dictionary];
CFTypeRef cameraIntrinsicData = CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, nil);
if (cameraIntrinsicData != nil) {
[requestHandlerOptions setObject:CFBridgingRelease(cameraIntrinsicData) forKey:VNImageOptionCameraIntrinsics];
}
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (!pixelBuffer) {
NSLog(#"Failed to obtain a CVPixelBuffer for the current output frame.");
return;
}
#if defined(TARGET_MACOS)
CGImagePropertyOrientation exifOrientation = kCGImagePropertyOrientationLeftMirrored;
#endif
NSError *error;
NSArray *requests;
if (_trackingRequests.count > 0) {
requests = _trackingRequests;
} else {
// No tracking object detected, so perform initial detection
VNImageRequestHandler *imageRequestHandler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer orientation:exifOrientation options:requestHandlerOptions];
NSArray *detectionRequests = _detectionRequests;
if (detectionRequests == nil) {
return;
}
[imageRequestHandler performRequests:_detectionRequests error:&error];
if (error) {
NSLog(#"Failed to perform FaceRectangleRequest: %#", error);
}
return;
}
// SequenceRequesthandler results in 10-20% cpu utilization
[_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error];
if (error) {
NSLog(#"Failed to perform SequenceRequest: %#", error);
return;
}
// Setup the next round of tracking
NSMutableArray *newTrackingRequests = [NSMutableArray array];
for (VNTrackObjectRequest *trackingRequest in requests) {
NSArray *results = trackingRequest.results;
trackingRequest.trackingLevel = VNRequestTrackingLevelFast;
VNDetectedObjectObservation *observation = results[0];
if (![observation isKindOfClass:[VNDetectedObjectObservation class]]) {
return;
}
if (!trackingRequest.isLastFrame) {
if (observation.confidence > 0.3f ) {
trackingRequest.inputObservation = observation;
} else {
trackingRequest.lastFrame = true;
}
NSUInteger number = newTrackingRequests.count;
[newTrackingRequests insertObject:trackingRequest atIndex:number];
}
}
_trackingRequests = newTrackingRequests;
if (newTrackingRequests.count == 0) {
// Nothing to track, so abort.
return;
}
NSMutableArray *faceLandmarksRequests = [NSMutableArray array];
for (VNTrackObjectRequest* trackingRequest in newTrackingRequests) {
VNRequestCompletionHandler handlerBlock = ^(VNRequest * _Nonnull request, NSError * _Nullable error) {
if (error != nil) {
NSLog(#"Facelandmarks error: %#", error);
}
VNDetectFaceLandmarksRequest *landmarksRequest = (VNDetectFaceLandmarksRequest *)request;
NSArray *results = landmarksRequest.results;
if (results == nil) {
return;
}
// Perform all UI updates (drawing) on the main queue, not the background queue on which this handler is being called.
dispatch_async(dispatch_get_main_queue(), ^{
for (VNFaceObservation *faceObservation in results) {
[self _setEyePositionsForFace:faceObservation];
//NSLog(#"seeing face");
}
});
};
VNDetectFaceLandmarksRequest *faceLandmarksRequest = [[VNDetectFaceLandmarksRequest alloc] initWithCompletionHandler:handlerBlock];
NSArray *trackingResults = trackingRequest.results;
if (trackingResults == nil) {
return;
}
VNDetectedObjectObservation *observation = trackingResults[0];
if (observation == nil) {
return;
}
VNFaceObservation *faceObservation = [VNFaceObservation observationWithBoundingBox:observation.boundingBox];
faceLandmarksRequest.inputFaceObservations = @[faceObservation];
// Continue to track detected facial landmarks.
NSUInteger nr = faceLandmarksRequests.count;
[faceLandmarksRequests insertObject:faceLandmarksRequest atIndex:nr];
VNImageRequestHandler *imageRequestHandler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer orientation:exifOrientation options:requestHandlerOptions];
[imageRequestHandler performRequests:faceLandmarksRequests error:&error];
if (error != nil) {
NSLog(#"Failed to perform FaceLandmarkRequest: %#", error);
}
}
}
//_frameCapture = NO;
}
# pragma mark Helper Functions
- (void)captureFrame {
_frameCapture = YES;
}
- (void)_tearDownAVCapture {
_videoDataOutput = nil;
_videoDataOutputQueue = nil;
}
@end
Debugging
The crash seems related to Metal, perhaps on multiple threads. The crash happens when the Vision framework (from a worker thread) executes Metal Performance Shaders from a private neural network framework (Espresso). Before the crash, there is a deadlock related to command buffers. This ultimately leads to the Address Sanitizer reporting BUS on an unknown address. I assume this is the reason I get KERN_PROTECTION_FAILURE. Other threads are either executing Metal or simply waiting. I don't know if the semaphores are related to Metal CPU/GPU synchronization or something else. When the code is used with App templates, the Vision framework is run on the main thread and no crash occurs. I can't see how I can address this in any meaningful way except filing a bug report. That being said, my debugging skills leave much to be desired, so any help is strongly appreciated - not only in solving the issue but also in understanding the problem. Address Sanitizer and Thread Sanitizer are turned on for the following output. Due to size constraints the Crash Report can be read here. A crashing project (on my computer) can now be viewed and downloaded from Dropbox. My computer is a 2019 MacBook Pro 16.
Console output
ErrorTest1(13661,0x107776e00) malloc: nano zone abandoned due to inability to preallocate reserved vm space.
2020-12-24 09:48:35.709965+0100 ErrorTest1[13661:811227] Metal GPU Frame Capture Enabled
2020-12-24 09:48:36.675326+0100 ErrorTest1[13661:811227] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000b7b50> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-12-24 09:48:36.707535+0100 ErrorTest1[13661:811227] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000bb5a0> 30010C1C-93BF-11D8-8B5B-000A95AF9C6A
2020-12-24 09:48:36.845641+0100 ErrorTest1[13661:811227] [] CMIOHardware.cpp:379:CMIOObjectGetPropertyData Error: 2003332927, failed
2020-12-24 09:48:38.717546+0100 ErrorTest1[13661:811794] [logging-persist] cannot open file at line 44580 of [02c344acea]
2020-12-24 09:48:38.717648+0100 ErrorTest1[13661:811794] [logging-persist] os_unix.c:44580: (0) open(/var/db/DetachedSignatures) - Undefined error: 0
2020-12-24 09:48:38.778975+0100 ErrorTest1[13661:811761] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:61:16: warning: unused variable 'input_slice_count'
const uint input_slice_count = (INPUT_FEATURE_CHANNELS + 3) / 4;
^
2020-12-24 09:48:38.779198+0100 ErrorTest1[13661:811812] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:38.779441+0100 ErrorTest1[13661:811838] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:39.072518+0100 ErrorTest1[13661:811838] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:61:16: warning: unused variable 'input_slice_count'
const uint input_slice_count = (INPUT_FEATURE_CHANNELS + 3) / 4;
^
2020-12-24 09:48:39.073210+0100 ErrorTest1[13661:811842] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:98:16: warning: unused variable 'fm_group'
const uint fm_group = threadgroup_id.z - splitId * params.simdsPerGroupData;
^
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
2020-12-24 09:48:39.073538+0100 ErrorTest1[13661:811812] [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:98:16: warning: unused variable 'fm_group'
const uint fm_group = threadgroup_id.z - splitId * params.simdsPerGroupData;
^
program_source:121:24: warning: comparison of integers of different signs: 'int' and 'const constant uint' (aka 'const constant unsigned int')
for(int kd = 0; kd < params.inputFeatureChannels; kd++) // _ID = 3, RGB
~~ ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
LLDB bt
* thread #5, queue = 'com.apple.VN.trackersCollectionManagementQueue', stop reason = EXC_BAD_ACCESS (code=2, address=0x70000deb1ff8)
frame #0: 0x000000010739db33 libsystem_pthread.dylib`___chkstk_darwin + 55
frame #1: 0x000000010739dafc libsystem_pthread.dylib`thread_start + 20
frame #2: 0x000000010724277b libMTLCapture.dylib`___lldb_unnamed_symbol2507$$libMTLCapture.dylib + 585
frame #3: 0x00007fff29f597be MPSNeuralNetwork`___lldb_unnamed_symbol4427$$MPSNeuralNetwork + 1907
frame #4: 0x00007fff29f5a3c2 MPSNeuralNetwork`___lldb_unnamed_symbol4432$$MPSNeuralNetwork + 756
frame #5: 0x00007fff29f5aa39 MPSNeuralNetwork`___lldb_unnamed_symbol4435$$MPSNeuralNetwork + 83
frame #6: 0x00007fff339e50e8 Espresso`Espresso::MPSEngine::mps_convolution_kernel::recreate_kernel() + 230
frame #7: 0x00007fff339e3c95 Espresso`Espresso::MPSEngine::convolution_kernel_base<Espresso::generic_convolution_kernel>::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >) + 455
frame #8: 0x00007fff339e724b Espresso`Espresso::MPSEngine::convolution_kernel_proxy::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >) + 103
frame #9: 0x00007fff338b3a8f Espresso`Espresso::generic_convolution_kernel::set_biases(std::__1::shared_ptr<Espresso::blob<float, 1> >, std::__1::shared_ptr<Espresso::abstract_batch>) + 49
frame #10: 0x00007fff338bdee1 Espresso`Espresso::load_network_layers_post_dispatch(std::__1::shared_ptr<Espresso::net> const&, std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object> const&, std::__1::shared_ptr<Espresso::cpu_context_transfer_algo_t> const&, std::__1::shared_ptr<Espresso::net_info_ir_t> const&, bool, Espresso::network_shape const&, Espresso::compute_path, bool, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&) + 5940
frame #11: 0x00007fff338ba6ee Espresso`Espresso::load_network_layers_internal(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, std::__1::basic_istream<char, std::__1::char_traits<char> >*, Espresso::compute_path, bool, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&) + 793
frame #12: 0x00007fff338c9294 Espresso`Espresso::load_and_shape_network(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, Espresso::compute_path, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 576
frame #13: 0x00007fff338cb715 Espresso`Espresso::load_network(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::compute_path, bool) + 2496
frame #14: 0x00007fff33d9603c Espresso`EspressoLight::espresso_plan::add_network(char const*, espresso_storage_type_t) + 350
frame #15: 0x00007fff33daa817 Espresso`espresso_plan_add_network + 294
frame #16: 0x00007fff30479b9d Vision`+[VNEspressoHelpers createSingleNetworkPlanFromResourceName:usingProcessingDevice:lowPriorityMode:inputBlobNames:outputBlobNames:explicitNetworkLayersStorageType:espressoResources:error:] + 517
frame #17: 0x00007fff3047992d Vision`+[VNEspressoHelpers createSingleNetworkPlanFromResourceName:usingProcessingDevice:lowPriorityMode:inputBlobNames:outputBlobNames:espressoResources:error:] + 151
frame #18: 0x00007fff303ce123 Vision`-[VNRPNTrackerEspressoModelCacheManager espressoResourcesFromOptions:error:] + 417
frame #19: 0x00007fff303ce8c8 Vision`-[VNObjectTrackerRevision2 initWithOptions:error:] + 262
frame #20: 0x00007fff304152df Vision`__54-[VNTrackerManager _createTracker:type:options:error:]_block_invoke + 207
frame #21: 0x00000001072fc0b0 libdispatch.dylib`_dispatch_client_callout + 8
frame #22: 0x000000010730d3b2 libdispatch.dylib`_dispatch_lane_barrier_sync_invoke_and_complete + 135
frame #23: 0x00007fff30414f01 Vision`-[VNTrackerManager _createTracker:type:options:error:] + 261
frame #24: 0x00007fff30414b52 Vision`-[VNTrackerManager trackerWithOptions:error:] + 509
frame #25: 0x00007fff304dda4a Vision`-[VNRequestPerformer trackerWithOptions:error:] + 85
frame #26: 0x00007fff30343ac4 Vision`-[VNTrackingRequest internalPerformRevision:inContext:error:] + 436
frame #27: 0x00007fff3037fb08 Vision`-[VNRequest performInContext:error:] + 885
frame #28: 0x00007fff303cd9a1 Vision`VNExecuteBlock + 58
frame #29: 0x00007fff304dd105 Vision`-[VNRequestPerformer _performOrderedRequests:inContext:error:] + 674
frame #30: 0x00007fff304dd482 Vision`-[VNRequestPerformer performRequests:inContext:onBehalfOfRequest:error:] + 352
frame #31: 0x00007fff304dd586 Vision`-[VNRequestPerformer performRequests:inContext:error:] + 60
frame #32: 0x00007fff304cbf1a Vision`-[VNSequenceRequestHandler _performRequests:onImageBuffer:gatheredForensics:error:] + 293
frame #33: 0x00007fff304cc122 Vision`-[VNSequenceRequestHandler performRequests:onCVPixelBuffer:orientation:gatheredForensics:error:] + 111
frame #34: 0x00007fff304cc0aa Vision`-[VNSequenceRequestHandler performRequests:onCVPixelBuffer:orientation:error:] + 28
* frame #35: 0x0000000106fc5a97 ErrorTest1`-[LSPVision captureOutput:didOutputSampleBuffer:fromConnection:](self=0x0000608000047c20, _cmd="captureOutput:didOutputSampleBuffer:fromConnection:", output=0x00006030000ce770, sampleBuffer=0x0000614000091240, connection=0x00006030000d0c30) at LSPVision.m:246:9
frame #36: 0x00007fff3786b2e0 AVFCapture`__56-[AVCaptureVideoDataOutput_Tundra _render:sampleBuffer:]_block_invoke + 213
frame #37: 0x00000001077ff3bb libclang_rt.asan_osx_dynamic.dylib`__wrap_dispatch_async_block_invoke + 203
frame #38: 0x00000001072fae78 libdispatch.dylib`_dispatch_call_block_and_release + 12
frame #39: 0x00000001072fc0b0 libdispatch.dylib`_dispatch_client_callout + 8
frame #40: 0x00000001073036b7 libdispatch.dylib`_dispatch_lane_serial_drain + 776
frame #41: 0x0000000107304594 libdispatch.dylib`_dispatch_lane_invoke + 449
frame #42: 0x0000000107312217 libdispatch.dylib`_dispatch_workloop_worker_thread + 1675
frame #43: 0x000000010739eb15 libsystem_pthread.dylib`_pthread_wqthread + 314
frame #44: 0x000000010739dae3 libsystem_pthread.dylib`start_wqthread + 15
Update
The bug seems to be resolved on macOS Monterey 12.1.
More of a comment here, I am trying to reproduce this. I've taken your code as is, but had to comment out [self _setEyePositionsForFace:faceObservation]; in
for (VNFaceObservation *faceObservation in results) {
//[self _setEyePositionsForFace:faceObservation];
//NSLog(#"seeing face");
}
as you do not give its implementation. However, with that done, I was able to run the code without any issue. To test further I added the logs as below.
// SequenceRequesthandler results in 10-20% cpu utilization
NSLog(#"aaa");
[_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error];
NSLog(#"bbb");
As I understand it, your problem is with [_sequenceRequestHandler performRequests:requests onCVPixelBuffer:pixelBuffer orientation:exifOrientation error:&error]; specifically, but I did not run into trouble, and the log showed a lot of repeating aaas and bbbs. To test further I also added an ok log as shown below
if (error != nil) {
NSLog(#"Failed to perform FaceLandmarkRequest: %#", error);
} else {
NSLog(#"ok");
}
and that happily printed together with the aaa and bbb.
I also hooked up a button as shown below
- (IBAction)buttonAction:(id)sender {
NSLog( #"Button" );
self.v.captureFrame;
}
where self.v is an instance of (my) LSPVision and I could push the button as much as I'd like without trouble.
I think either the problem lies somewhere else, maybe even in the _setEyePositionsForFace that I commented out, or perhaps you can give even more code so I can reproduce it this side?
FWIW here is a sample of the log
2020-12-27 09:14:54.147536+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.184167+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.268926+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.269374+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.314135+0200 MetalCaptureTest[11392:316676] Button
2020-12-27 09:14:54.316025+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.393732+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.394171+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.432979+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.496887+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.497389+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.533118+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.614813+0200 MetalCaptureTest[11392:317094] ok
2020-12-27 09:14:54.615394+0200 MetalCaptureTest[11392:317094] aaa
2020-12-27 09:14:54.663343+0200 MetalCaptureTest[11392:317094] bbb
2020-12-27 09:14:54.747860+0200 MetalCaptureTest[11392:317094] ok
EDIT
Thanks, I got the dropbox project and it is working this side. No crashes at all. This is the log.
ErrorTest1(11743,0x10900ce00) malloc: nano zone abandoned due to inability to preallocate reserved vm space.
2020-12-27 10:55:10.445333+0200 ErrorTest1[11743:344803] Metal GPU Frame Capture Enabled
2020-12-27 10:55:10.471650+0200 ErrorTest1[11743:344803] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000aabc0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-12-27 10:55:10.528628+0200 ErrorTest1[11743:344803] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6030000ae130> 30010C1C-93BF-11D8-8B5B-000A95AF9C6A
2020-12-27 10:55:10.608753+0200 ErrorTest1[11743:344803] [] CMIOHardware.cpp:379:CMIOObjectGetPropertyData Error: 2003332927, failed
2020-12-27 10:55:11.408594+0200 ErrorTest1[11743:344873] [logging-persist] cannot open file at line 44580 of [02c344acea]
2020-12-27 10:55:11.408806+0200 ErrorTest1[11743:344873] [logging-persist] os_unix.c:44580: (0) open(/var/db/DetachedSignatures) - Undefined error: 0
2020-12-27 10:55:17.637382+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:17.838354+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:17.987583+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:18.171168+0200 ErrorTest1[11743:344803] seeing face
2020-12-27 10:55:18.320957+0200 ErrorTest1[11743:344803] seeing face
FWIW I have the latest OS (Big Sur 11.1) and the latest Xcode (12.3), and am running this on a MacBook Air 2017. From your description I suspect the multithreading could be a problem, but for now my focus is on reproducing it this side.
I also got an unexplainable crash using Vision in one app, but not in another one I created for testing. Turns out enabling "Metal API Validation" made the problem go away for me. I do have several custom Metal kernels in my app, but I still don't understand what the root problem is.

Error saving core data object

I am quite confused about one error I am receiving when saving an object. I am receiving the following error (when I print out the detailed description):
2015-05-08 08:19:51.589 Br[11240:208443] Core Data Save Error
NSValidationErrorKey
activityObject
NSValidationErrorPredicate
(null)
NSValidationErrorObject
<BRActivity: 0x7deb2aa0> (entity: BRActivity; id: 0x7deb2780 <x-coredata:///BRActivity/t86C0E8CD-2B6C-4AF7-986A-4797B7BEFDF5> ; data: {
activities = (
);
activityObject = nil;
activityType = 0;
body = "Someone left a comment on your post.";
timestamp = "2015-05-08 04:24:46 +0000";
uuidString = "c6a06b45-e2d5-45bd-9f64-20b13ac87526";
})
NSLocalizedDescription
The operation couldn’t be completed. (Cocoa error 1570.)
2015-05-08 08:19:51.590 Br[11240:208443] Core Data Save Error
From looking on the internet, it seems to be a relationship problem. A Br post has a relationship called activities, the inverse of which is an activity object; that, as we can see from the error, is nil. So what kind of solution are we looking at here? Is there a way to make the relationship "optional" (OK to be nil!), or should I add an activity object? I really don't want to break anything here, so if there's a subtle solution let me know. Thanks a bunch, guys!
Also here is the method surrounding the save:
- (void)importArray:(NSArray *)array entityName:(NSString *)entityName attributeName:attributeName error:(NSError *__autoreleasing *)error {
NSParameterAssert(array);
NSParameterAssert(entityName);
[self.context performBlockAndWait:^{
for (NSDictionary *jsonDictionary in array) {
NSManagedObject *managedObject = [NSManagedObject upsertWithContext:self.context entityName:entityName dictionary:jsonDictionary attributeName:attributeName error:error];
if (nil == managedObject) {
if ([self.context hasChanges]) {
[self.context rollback];
}
return;
}
}
if ([self.context hasChanges]) {
if (![self.context save:error]) {
NSError *err = *error;
NSDictionary *userInfo = [err userInfo];
if ([userInfo valueForKey:@"NSDetailedErrors"] != nil) {
// ...and loop through the array, if so.
NSArray *errors = [userInfo valueForKey:@"NSDetailedErrors"];
for (NSError *anError in errors) {
NSDictionary *subUserInfo = [anError userInfo];
subUserInfo = [anError userInfo];
// Granted, this indents the NSValidation keys rather a lot
// ...but it's a small loss to keep the code more readable.
NSLog(#"Core Data Save Error\n\n \
NSValidationErrorKey\n%#\n\n \
NSValidationErrorPredicate\n%#\n\n \
NSValidationErrorObject\n%#\n\n \
NSLocalizedDescription\n%#",
[subUserInfo valueForKey:#"NSValidationErrorKey"],
[subUserInfo valueForKey:#"NSValidationErrorPredicate"],
[subUserInfo valueForKey:#"NSValidationErrorObject"],
[subUserInfo valueForKey:#"NSLocalizedDescription"]);
}
}
NSLog(#"Error: %#", err.localizedDescription);
}
return;
}
}];
}
Yes, relationships can be optional. Select the relationship and you will see the Optional checkbox in the Data Model inspector pane at the top right.
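For what it's worth, Cocoa error 1570 is NSValidationMissingMandatoryPropertyError, so you can also surface the offending key before saving. A minimal sketch, assuming activity is the BRActivity instance from the error output and activityObject is the relationship it names:
NSError *validationError = nil;
if (![activity validateForInsert:&validationError]) {
    // Cocoa error 1570 = NSValidationMissingMandatoryPropertyError;
    // the userInfo names the object and key that failed validation.
    NSLog(@"Validation failed: %@", validationError.userInfo);
}
// Alternatively, satisfy the non-optional inverse before saving:
// activity.activityObject = somePost; // somePost is hypothetical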

Data corruption when reading realtime H.264 output from AVAssetWriter

I'm using some tricks to try to read the raw output of an AVAssetWriter while it is being written to disk. When I reassemble the individual files by concatenating them, the resulting file is the same exact number of bytes as the AVAssetWriter's output file. However, the reassembled file will not play in QuickTime or be parsed by FFmpeg because there is data corruption. A few bytes here and there have been changed, rendering the resulting file unusable. I assume this is occurring on the EOF boundary of each read, but it isn't consistent corruption.
I plan to eventually use code similar to this to parse out individual H.264 NAL units from the encoder to packetize them and send them over RTP, however if I can't trust the data being read from disk I might have to use another solution.
Is there an explanation/fix for this data corruption? And are there any other resources/links you have found on how to parse the NAL units to packetize over RTP?
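Not related to the corruption itself, but since the question mentions packetizing NAL units: MP4/MOV samples store H.264 NALUs in AVCC form (a length prefix per NALU, typically 4 bytes big-endian as declared in the avcC box) rather than Annex B start codes. A minimal sketch, assuming sampleBytes/sampleLength point at an already-extracted sample from the mdat atom:
const uint8_t *p = sampleBytes;
const uint8_t *end = sampleBytes + sampleLength;
while (p + 4 <= end) {
    // 4-byte big-endian NALU length prefix (AVCC), not a start code
    uint32_t naluLength = ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) | ((uint32_t)p[2] << 8) | p[3];
    p += 4;
    if (naluLength > (uint32_t)(end - p)) break; // truncated sample, stop
    // p now points at one NALU of naluLength bytes; hand it to the RTP packetizer
    p += naluLength;
}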
Full code here: AVAppleEncoder.m
// Modified from
// http://www.davidhamrick.com/2011/10/13/Monitoring-Files-With-GCD-Being-Edited-With-A-Text-Editor.html
- (void)watchOutputFileHandle
{
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
int fildes = open([[movieURL path] UTF8String], O_EVTONLY);
source = dispatch_source_create(DISPATCH_SOURCE_TYPE_VNODE,fildes,
DISPATCH_VNODE_DELETE | DISPATCH_VNODE_WRITE | DISPATCH_VNODE_EXTEND | DISPATCH_VNODE_ATTRIB | DISPATCH_VNODE_LINK | DISPATCH_VNODE_RENAME | DISPATCH_VNODE_REVOKE,
queue);
dispatch_source_set_event_handler(source, ^
{
unsigned long flags = dispatch_source_get_data(source);
if(flags & DISPATCH_VNODE_DELETE)
{
dispatch_source_cancel(source);
//[blockSelf watchStyleSheet:path];
}
if(flags & DISPATCH_VNODE_EXTEND)
{
//NSLog(#"File size changed");
NSError *error = nil;
NSFileHandle *fileHandle = [NSFileHandle fileHandleForReadingFromURL:movieURL error:&error];
if (error) {
[self showError:error];
}
[fileHandle seekToFileOffset:fileOffset];
NSData *newData = [fileHandle readDataToEndOfFile];
if ([newData length] > 0) {
NSLog(#"newData (%lld): %d bytes", fileOffset, [newData length]);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
NSString *movieName = [NSString stringWithFormat:@"%d.%lld.%d.mp4", fileNumber, fileOffset, [newData length]];
NSString *path = [NSString stringWithFormat:@"%@/%@", basePath, movieName];
[newData writeToFile:path atomically:NO];
fileNumber++;
fileOffset = [fileHandle offsetInFile];
}
}
});
dispatch_source_set_cancel_handler(source, ^(void)
{
close(fildes);
});
dispatch_resume(source);
}
Here are some similar questions I have found that don't exactly answer my question:
Get PTS from raw H264 mdat generated by iOS AVAssetWriter
streaming video FROM an iPhone
Parsing h.264 NAL units from a quicktime MOV file
Realtime Audio/Video Streaming FROM iPhone to another device (Browser, or iPhone)
When I eventually figure this out, I will release an open source library to assist people who try to do this in the future.
Thank you!
Update: The corruption doesn't happen at the EOF boundary. It seems like parts of the file are re-written after finishWriting is called. This first file was chunked at 4KB, so the area changed isn't anywhere near an EOF boundary. It seems to be corrupted near new "moov" elements as well when movieFragmentInterval is enabled.
Correct file on the left, broken file on the right.
I ended up abandoning the "read while it's written" approach in favor of a manual chunking approach where I call finishWriting every 5 seconds on a background thread. I was able to drop a negligible number of frames using a method originally described here:
- (void) segmentRecording:(NSTimer*)timer {
AVAssetWriter *tempAssetWriter = self.assetWriter;
AVAssetWriterInput *tempAudioEncoder = self.audioEncoder;
AVAssetWriterInput *tempVideoEncoder = self.videoEncoder;
self.assetWriter = queuedAssetWriter;
self.audioEncoder = queuedAudioEncoder;
self.videoEncoder = queuedVideoEncoder;
//NSLog(#"Switching encoders");
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
[tempAudioEncoder markAsFinished];
[tempVideoEncoder markAsFinished];
if (tempAssetWriter.status == AVAssetWriterStatusWriting) {
if(![tempAssetWriter finishWriting]) {
[self showError:[tempAssetWriter error]];
}
}
if (self.readyToRecordAudio && self.readyToRecordVideo) {
NSError *error = nil;
self.queuedAssetWriter = [[AVAssetWriter alloc] initWithURL:[self newMovieURL] fileType:(NSString *)kUTTypeMPEG4 error:&error];
if (error) {
[self showError:error];
}
self.queuedVideoEncoder = [self setupVideoEncoderWithAssetWriter:self.queuedAssetWriter formatDescription:videoFormatDescription bitsPerSecond:videoBPS];
self.queuedAudioEncoder = [self setupAudioEncoderWithAssetWriter:self.queuedAssetWriter formatDescription:audioFormatDescription bitsPerSecond:audioBPS];
//NSLog(#"Encoder switch finished");
}
});
}
Full source code: https://github.com/chrisballinger/FFmpeg-iOS-Encoder/blob/master/AVSegmentingAppleEncoder.m
When reading a MOV file that is ACTIVELY being recorded on iOS, you MUST check the 4 bytes mentioned for changes, re-write those four bytes, then check for additional data in the file and send the additional data. When done, truncate the file to the file size written.
Obviously this depends on where you are sending the file. I use a send(offset, number of bytes) to the receiver. So I send "additional data", "more additional data", ..., new data at (24,4), "more additional data".
Typically iOS only writes the 4-byte (size of data section) record when the file is about to be closed (i.e. after the last media write); see info on "QuickTime atoms". Unfortunately, this also means the MOV file is not PLAYABLE until recording is completed (and the movie descriptors are written at the END of the file).
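A minimal sketch of the re-check described above, assuming the rewritten 4-byte size field sits at the (24,4) offset mentioned; the real offset depends on the file's atom layout:
[fileHandle seekToFileOffset:24]; // offset 24 is from the example above; find yours by walking the atoms
uint32_t sizeBE = 0;
[[fileHandle readDataOfLength:4] getBytes:&sizeBE length:sizeof(sizeBE)];
uint32_t dataSectionSize = CFSwapInt32BigToHost(sizeBE); // QuickTime atom sizes are big-endian
NSLog(@"data-section size is now %u bytes", dataSectionSize);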

App crashes on drawing of UILabel - NSString is a zombie

I'm working on an iPad app that has a slider that is used to scroll through data. When scrolling, a map is displayed and the data is updated. The problem is, if you scroll fast enough (or somehow trigger the race condition), the app crashes on accessing a zombie NSString. I've been able to track it down in the Profiler and found this:
Event Type RefCt Timestamp Size Responsible Library Responsible Caller
Malloc 1 01:55.166.466 16 Foundation -[NSPlaceholderString initWithFormat:locale:arguments:]
Autorelease <null> 01:55.166.472 0 Foundation +[NSString stringWithFormat:]
CFRetain 2 01:55.166.473 0 My Program -[StateView updateVotes:]
CFRetain 3 01:55.166.476 0 UIKit -[UILabel setText:]
CFRelease 2 01:55.166.504 0 My Program -[StateView updateVotes:]
CFRelease 1 01:55.177.661 0 Foundation -[NSAutoreleasePool release]
CFRelease 0 01:55.439.090 0 UIKit -[UILabel setText:]
Zombie -1 01:55.439.109 0 UIKit -[NSString(UIStringDrawing) drawAtPoint:forWidth:withFont:lineBreakMode:letterSpacing:includeEmoji:]
I'm using ARC on iOS 5, so I'm not in control of the retain/release at all. Even if I were, looking at the above, it is correct. The problem seems to be a race condition between the drawing function and the UILabel string actually changing. The UILabel releases the first string, as a new one has been set, but the drawing function is somehow holding a reference to it without having retained it.
As a note, I have not modified the UILabel in any way.
Any ideas?
--- Code added as update:
Slider update:
-(void)sliderValueChanged:(UISlider *)slider {
float position = slider.value - 1790.0f;
int year;
if(position <= 0.0f) {
year = 1789;
} else {
year = 1792 + (floor(position / 4.0f)*4);
}
[self setYear:year];
}
setYear:
-(void)setYear:(int)year {
if (year == currentYear) {
// year didn't change, so don't do anything
return;
}
[yearLabel setText:[[NSString alloc] initWithFormat:@"%i", year]];
currentYear = year;
[self getMapForYear:year];
}
getMapForYear:
-(void) getMapForYear:(int)year {
[self setToMap:[historicalData objectForKey:[NSNumber numberWithInt:year]]];
}
setToMap:
-(void) setToMap:(HistoricalMap *)map {
// Label the map
for (State *state in [map states]) {
[mapController setVotes:[state votes] forState:[state abbreviation]];
}
}
setVotes:forState:
-(void)setVotes:(NSNumber *)votes forState:(NSString *)stateAbbreviation {
StateView *state = [states objectForKey:stateAbbreviation];
if (state == nil) {
NSLog(#"Invalid State Votes -- %#", stateAbbreviation);
return;
}
[state updateVotes:votes];
[state setNeedsDisplay];
}
updateVotes:
-(void)updateVotes:(NSNumber *)newVotes {
[self setVotes:newVotes];
NSString *voteString = [[NSString alloc] initWithFormat:@"%@", newVotes];
[voteLabel setText:voteString];
if ([newVotes isEqual:[NSNumber numberWithInt:0]]) {
[[self voteLabel] setHidden:YES];
[[self stateAbbreviationLabel] setHidden:YES];
} else {
[[self stateAbbreviationLabel] setHidden:NO];
[[self voteLabel] setHidden:NO];
}
}
I think you are attempting to do far too much during the slider's movement. Creating and executing core data fetch requests alone would seem to be overkill, let alone updating the entire GUI and a screenful of labels. Have you tested the performance of this on a device?
It could be worth profiling these sections of code and seeing where the time is spent. You could look at caching the fetch requests or the results, for example, or you may have to resort to updating only when the slider has stopped, or only for every n increments along the path.
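For the "update only when the slider has stopped" option, one way is to keep sliderValueChanged: cheap (just the year label) and defer the fetch/map work to the touch-up events. A sketch, where the slider outlet, sliderDidFinish:, and yearForSliderValue: are hypothetical helpers rather than code from the question:
// Somewhere in setup, assuming a slider outlet on the controller:
- (void)viewDidLoad {
    [super viewDidLoad];
    [self.slider addTarget:self
                    action:@selector(sliderDidFinish:)
          forControlEvents:UIControlEventTouchUpInside | UIControlEventTouchUpOutside];
}

// Runs the heavy path (fetch + map relabel) once, after the user lets go.
- (void)sliderDidFinish:(UISlider *)slider {
    [self setYear:[self yearForSliderValue:slider.value]]; // same year math as sliderValueChanged:
}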
You have several memory leaks with NSString:
[yearLabel setText:[[NSString alloc] initWithFormat:@"%i", year]]; // leak
Create string with stringWithFormat method instead
[yearLabel setText:[NSString stringWithFormat:@"%i", year]];
[NSString stringWithFormat:] is a better way of formatting the string than any other.

What causes this EXC_CRASH in my iPad app?

I got this symbolicated stack trace from a crash report from my iPad app (excerpt):
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x00000000, 0x00000000
Crashed Thread: 0
0 ImageIO 0x34528eb4 _CGImagePluginIdentifyPNG + 0
1 ImageIO 0x34528d90 _CGImageSourceBindToPlugin + 368
2 ImageIO 0x34528bda CGImageSourceGetCount + 26
3 UIKit 0x341b8f66 _UIImageRefAtPath + 366
4 UIKit 0x342650ce -[UIImage initWithContentsOfFile:] + 50
5 UIKit 0x342b0314 +[UIImage imageWithContentsOfFile:] + 28
6 DesignScene 0x00013a2a -[LTImageCache fetchImageforURL:] (LTImageCache.m:37)
…
Here are the contents of -[LTImageCache fetchImageforURL:]:
- (UIImage *)fetchImageforURL:(NSString *)theUrl {
NSString *key = theUrl.md5Hash;
return [UIImage imageWithContentsOfFile:[self filenameForKey:key]];
}
And the contents of -[LTImageCache filenameForKey:]:
- (NSString *) filenameForKey:(NSString *) key {
return [_cacheDir stringByAppendingPathComponent:key];
}
The _cacheDir ivar is created and retained in -init. So the question is, what caused this crash? Is the problem that:
the return value of -[LTImageCache filenameForKey:] needs to be retained (it's autoreleased)
An unhandled exception somewhere (+[UIImage imageWithContentsOfFile:] claims to return nil if the image is unrecognizable)
Something else…I'm out of guesses
I would think that an autoreleased value would be fine. And in truth, this code has been working fine for months, and this method is called hundreds of times in a session. This is a rare crash under very specific circumstances (the app left loaded overnight, crash upon unlocking the iPad in the morning).
What's causing this crash?
I'm guessing but it looks like a bogus image file. Is this in your app bundle or do you download it?
I don't think it has anything to do with memory management.
To test you could try using ImageIO to open the file yourself.
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)self.url, NULL);
if(NULL != imageSource) {
size_t imageCount = CGImageSourceGetCount(imageSource);
if(imageCount > 0) {
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCGImageSourceCreateThumbnailFromImageIfAbsent,
[NSNumber numberWithInteger:maxSize], kCGImageSourceThumbnailMaxPixelSize, nil];
CGImageRef thumbImage = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, (CFDictionaryRef)options);
self.image = [UIImage imageWithCGImage:thumbImage scale:scale orientation:imageOrientation];
CGImageRelease(thumbImage);
CFRelease(imageSource);
[pool drain];
}
} else {
NSLog(#"Unable to open image %#", self.url);
}
and then try to find the image count.
Using maxSize and getting a thumbnail will ensure you don't load a 5 mega-pixel image to put into a 100x100 tile on your UI.
scale is the window's scale (will be 2 for iPhone 4 and 1 for everything else).
To find the orientation you need to use the CGImageSourceCopyPropertiesAtIndex function and then the kCGImagePropertyOrientation key to get the orientation of the particular image.
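A minimal sketch of that lookup, assuming imageSource is the CGImageSourceRef created in the snippet above:
CFDictionaryRef props = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, NULL);
int orientation = 1; // 1 = "up"; used when no orientation tag is present
if (props != NULL) {
    CFNumberRef orientationNum = CFDictionaryGetValue(props, kCGImagePropertyOrientation);
    if (orientationNum != NULL) {
        CFNumberGetValue(orientationNum, kCFNumberIntType, &orientation);
    }
    CFRelease(props);
}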