I never experienced this issue before; I was able to use the same code 3 months ago. After updating to macOS High Sierra, I can't run this code with the ZXing library. I'm not sure whether it's related, but here's what I have:
Error:
IIOImageWriteSession:113: cannot create: '/Users/****/Desktop/test.PNG.sb-8ff9679f-SRYKGg'
error = 1 (Operation not permitted)
Code:
NSError *error = nil;
ZXMultiFormatWriter *writer = [ZXMultiFormatWriter writer];
ZXBitMatrix *result = [writer encode:@"AM233X05987"
                              format:kBarcodeFormatDataMatrix
                               width:500
                              height:500
                               error:&error];
if (result) {
    CGImageRef image = [[ZXImage imageWithMatrix:result] cgimage];
    CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:@"/Users/****/Desktop/test.PNG"];
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypePNG, 1, NULL);
    if (!destination) {
        NSLog(@"Failed to create CGImageDestination");
        return NO;
    }
    CGImageDestinationAddImage(destination, image, nil);
    if (!CGImageDestinationFinalize(destination)) {
        NSLog(@"Failed to write image to /Users/alex/Desktop/Barcodes/");
        CFRelease(destination);
        return NO;
    }
    CFRelease(destination);
} else {
    NSString *errorMessage = [error localizedDescription];
    NSLog(@"%@", errorMessage);
}
I finally found the origin of the problem: it turns out this happens because the app is sandboxed. Turning off the sandbox fixed the issue.
To get this working with the sandbox on, one solution is to put up a save panel and, if it returns OK, save the file to the save panel's URL.
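For reference, a minimal sketch of that save-panel approach, reusing the image from the code above (the allowed file type and default name are illustrative):
NSSavePanel *panel = [NSSavePanel savePanel];
panel.allowedFileTypes = @[@"png"];            // illustrative: limit to PNG
panel.nameFieldStringValue = @"test";          // illustrative default file name
if ([panel runModal] == NSModalResponseOK) {
    // The sandbox grants write access to the URL the user chose in the panel.
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)panel.URL, kUTTypePNG, 1, NULL);
    if (destination) {
        CGImageDestinationAddImage(destination, image, NULL);
        CGImageDestinationFinalize(destination);
        CFRelease(destination);
    }
}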
Related
I have an OS X application that is supposed to keep a list of files from anywhere on the user's disk.
The first version of the app saved the paths to these files in a Core Data model.
However, if a file is moved or renamed, the tool loses its purpose and the app can crash.
So I decided to use bookmarks. They seem to be working, but every time I try to recover the data, I get the old path of the files. Why is that? What am I missing?
My Core Data entity uses a binary data field to persist the bookmark.
The bookmark itself is done like this:
NSData * bookmark = [filePath bookmarkDataWithOptions:NSURLBookmarkCreationMinimalBookmark
includingResourceValuesForKeys:NULL
relativeToURL:NULL
error:NULL];
And when loading the application, I have a loop that iterates over the whole table and recovers the bookmarks like this:
while (object = [rowEnumerator nextObject]) {
NSError * error = noErr;
NSURL * bookmark = [NSURL URLByResolvingBookmarkData:[object fileBookmark]
options:NSURLBookmarkResolutionWithoutUI
relativeToURL:NULL
bookmarkDataIsStale:NO
error:&error];
if (error != noErr)
DDLogCError(@"%@", [error description]);
DDLogCInfo(@"File Path: %@", [bookmark fileReferenceURL]);
}
If I rename the file, the path is null. I see no difference between storing this NSData object and a string with the path. So I am obviously missing something.
Edit:
I also often get an error like this: CFURLSetTemporaryResourcePropertyForKey failed because it was passed this URL which has no scheme.
I appreciate any help, thanks!
I couldn't find any issue in my code, so I changed my approach.
After looking into the reason for the "no scheme" message, I came to the conclusion that some third-party application would be required for this code to work, and that's undesirable.
I am now using aliases. This is how I create them:
FSRef fsFile, fsOriginal;
AliasHandle aliasHandle;
NSString * fileOriginalPath = [[filePath absoluteString] stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
OSStatus status = FSPathMakeRef((unsigned char*)[fileOriginalPath cStringUsingEncoding: NSUTF8StringEncoding], &fsOriginal, NULL);
status = FSPathMakeRef((unsigned char*)[fileOriginalPath cStringUsingEncoding: NSUTF8StringEncoding], &fsFile, NULL);
OSErr err = FSNewAlias(&fsOriginal, &fsFile, &aliasHandle);
NSData * aliasData = [NSData dataWithBytes: *aliasHandle length: GetAliasSize(aliasHandle)];
And now I recover the path like this:
while (object = [rowEnumerator nextObject]) {
NSData * aliasData = [object fileBookmark];
NSUInteger aliasLen = [aliasData length];
if (aliasLen > 0) {
FSRef fsFile, fsOriginal;
AliasHandle aliasHandle;
OSErr err = PtrToHand([aliasData bytes], (Handle*)&aliasHandle, aliasLen);
Boolean changed;
err = FSResolveAlias(&fsOriginal, aliasHandle, &fsFile, &changed);
if (err == noErr) {
char pathC[2*1024];
OSStatus status = FSRefMakePath(&fsFile, (UInt8*) &pathC, sizeof(pathC));
NSAssert(status == 0, @"FSRefMakePath failed");
NSLog(@"%@", [NSString stringWithCString: pathC encoding: NSUTF8StringEncoding]);
} else {
NSLog(#"The file disappeared!");
}
} else {
NSLog(#"CardCollectionUserDefault was zero length");
}
}
However, I am still curious as to why my previous code failed. I'd appreciate any thoughts on that. Thanks!
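For comparison, here is a minimal sketch of how bookmark resolution is usually written, passing a real BOOL pointer for the stale flag and logging the resolved URL's path rather than its file-reference URL (this is only a sketch against the loop above, not a confirmed fix):
BOOL isStale = NO;
NSError *resolveError = nil;
NSURL *resolvedURL = [NSURL URLByResolvingBookmarkData:[object fileBookmark]
                                               options:NSURLBookmarkResolutionWithoutUI
                                         relativeToURL:nil
                                   bookmarkDataIsStale:&isStale
                                                 error:&resolveError];
if (resolvedURL == nil) {
    NSLog(@"Could not resolve bookmark: %@", resolveError);
} else {
    // The resolved URL reflects the file's current location, even after a move or rename.
    NSLog(@"File Path: %@", [resolvedURL path]);
    if (isStale) {
        // The stored bookmark data should be recreated from resolvedURL and saved again.
    }
}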
I need to read the properties of an image without loading or downloading it. I have implemented a simple method that uses CGImageSourceCreateWithURL to accomplish this.
My problem is that it always returns an error, because the imageSource seems to be NULL. What can I do to fix it?
In the NSURL object I pass URLs like "http://www.example.com/wp-content/uploads/image.jpg", but also ALAssets library IDs used to retrieve images on the phone, like "assets-library://asset/asset.JPG?id=E5F41458-962D-47DD-B5EF-E606E2A8AC7A&ext=JPG".
This is my method:
- (NSString *)getPhotoInfo:(NSString *)paths {
    NSString *xmlList = @"test";
    NSURL *imageFileURL = [NSURL fileURLWithPath:paths];
    NSLog(@"imageFileURL %@", imageFileURL);
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)(imageFileURL), NULL);
    if (imageSource == NULL) {
        // Error loading image
        NSLog(@"Error loading image");
    }
    CGFloat width = 0.0f, height = 0.0f;
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, NULL);
    NSLog(@"image source %@", imageSource);
    return xmlList;
}
I have seen these posts while trying to fix it, but nothing seems to work:
CGImageSourceRef imageSource = CGImageSourceCreateWithURL returns NULL
CGImageSourceCreateWithURL with authentication
accessing UIImage properties without loading in memory the image
In my project ARC is enabled.
Thanks
If you are passing the string "http://www.example.com/wp-content/uploads/image.jpg" to -fileURLWithPath:, it's not going to give you a usable URL, because that string is certainly not a file path; it's a URL string.
Think of -fileURLWithPath: as just prepending "file://localhost/" to the string you pass in, so you'd end up with a URL that looks like "file://localhost/http://www.example.com/wp-content/uploads/image.jpg". That's not good.
You need to call [NSURL URLWithString:paths] if you're going to be passing in entire URL strings, not just filesystem path strings.
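A minimal sketch of that distinction, assuming paths may contain either a plain filesystem path or a full URL string:
NSURL *imageFileURL;
if ([paths hasPrefix:@"/"]) {
    // A plain filesystem path such as "/var/mobile/.../image.jpg"
    imageFileURL = [NSURL fileURLWithPath:paths];
} else {
    // A full URL string such as "http://..." or "assets-library://..."
    imageFileURL = [NSURL URLWithString:paths];
}
// Note: CGImageSourceCreateWithURL may still not be able to read remote or
// assets-library URLs directly; see the answer below for the asset case.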
For working with "assets-library://asset/asset.JPG?id..." URLs, try this code:
- (UIImage *)resizeImageToMaxSize:(CGFloat)max anImage:(UIImage *)anImage
{
    NSData *imgData = UIImageJPEGRepresentation(anImage, 1);
    CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imgData, NULL);
    if (!imageSource)
        return nil;
    CFDictionaryRef options = (__bridge CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:
                              (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                              (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                              (id)[NSNumber numberWithFloat:max], (id)kCGImageSourceThumbnailMaxPixelSize,
                              nil];
    CGImageRef imgRef = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
    UIImage *scaled = [UIImage imageWithCGImage:imgRef];
    // Core Foundation objects are not managed by ARC, so release them explicitly.
    CGImageRelease(imgRef);
    CFRelease(imageSource);
    return scaled;
}
Basically, I am building a screen capture app that should produce an optimally sized image.
I am trying to capture my Mac screen and save it as a JPEG file.
I want to capture the screen using the Quartz API CGWindowListCreateImage and save the file as a JPEG using libjpeg9. I included the static library (libjpeg.a) and the header files (jpeglib.h, jconfig.h, jerror.h and jmorecfg.h) using the "Add Files to Project_Name" option in Xcode.
The code I used to save as JPEG file is
int write_jpeg_file( char *filename )
{
struct jpeg_compress_struct cinfo;
struct jpeg_error_mgr jerr;
/* this is a pointer to one row of image data */
JSAMPROW row_pointer[1];
FILE *outfile = fopen( filename, "wb" );
if ( !outfile )
{
printf("Error opening output jpeg file %s\n!", filename );
return -1;
}
cinfo.err = jpeg_std_error( &jerr );
jpeg_create_compress(&cinfo);
jpeg_stdio_dest(&cinfo, outfile);
/* Setting the parameters of the output file here */
cinfo.image_width = width;
cinfo.image_height = height;
cinfo.input_components = bytes_per_pixel;
cinfo.in_color_space = color_space;
/* default compression parameters, we shouldn't be worried about these */
jpeg_set_defaults( &cinfo );
/* Now do the compression .. */
jpeg_start_compress( &cinfo, TRUE );
/* like reading a file, this time write one row at a time */
while( cinfo.next_scanline < cinfo.image_height )
{
row_pointer[0] = &raw_image[ cinfo.next_scanline * cinfo.image_width * cinfo.input_components];
jpeg_write_scanlines( &cinfo, row_pointer, 1 );
}
/* similar to read file, clean up after we're done compressing */
jpeg_finish_compress( &cinfo );
jpeg_destroy_compress( &cinfo );
fclose( outfile );
/* success code is 1! */
return 1;
}
When I build and run the project, I get the following error:
Parse Issue
Expected }
in the line
typedef enum { FALSE = 0, TRUE = 1 } boolean;
in jmorecfg.h, which is a libjpeg header file. I am not sure why I am getting this error. Please help me.
I know there is a way to save the file using the Cocoa API instead of libjpeg.
I first tried it in Xcode using Cocoa. I am able to get the JPEG file, but the file size is relatively large (~200 KB). When I create the image in C using libjpeg, the file size is ~100 KB.
The code I used:
CGFloat imageCompression = 0.6;
- (NSImage *)captureImageForRect:(NSRect)rect {
CGImageRef screenShot = CGWindowListCreateImage(CGRectInfinite, kCGWindowListOptionOnScreenOnly, kCGNullWindowID, kCGWindowImageDefault);
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithCGImage:screenShot];
NSLog(#"Bits per pixel : %ld", [imageRep bitsPerPixel]);
NSImage *result = [[NSImage alloc] init];
[result addRepresentation:imageRep];
screenShot=nil;
imageRep=nil;
return result;
}
-(NSData *)jpegRepresentationOfImage:(NSImage *)image {
NSBitmapImageRep* myBitmapImageRep=nil;
NSSize imageSize = [image size];
[image lockFocus];
NSRect imageRect = NSMakeRect(0, 0, imageSize.width, imageSize.height);
myBitmapImageRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:imageRect];
[image unlockFocus];
// set up the options for creating a JPEG
NSDictionary* options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithDouble:imageCompression], NSImageCompressionFactor,
[NSNumber numberWithBool:NO], NSImageProgressive,
nil];
NSData* imageData = [myBitmapImageRep representationUsingType:NSJPEGFileType properties:options];
myBitmapImageRep=nil;
return ( imageData);
}
-(void)captureAndSave {
NSString *fileName = @"/Volumes/Official/images/output/Mac_Window_1.jpg";
NSImage *screenshot = [self captureImageForRect:[[NSScreen mainScreen] frame]];
NSData *jpgRep = [self jpegRepresentationOfImage:screenshot];
[jpgRep writeToFile:[fileName stringByExpandingTildeInPath] atomically:NO];
[NSApp terminate:self];
}
Adding the following line to jconfig.h solved the problem. After adding it, I no longer get the compile error and I am able to use libjpeg inside my Objective-C program.
#define USE_MAC_MEMMGR
In jmemmac.c, I found the following code, which helped me figure it out. Just adding the line above worked!
#ifndef USE_MAC_MEMMGR /* make sure user got configuration right */
You forgot to define USE_MAC_MEMMGR in jconfig.h. /* deliberate syntax error */
#endif
In the code below, I am crashing on the CGImageDestinationFinalize() call. The CGImage is a large JPEG photo. It works fine if I use a smaller JPEG, but anything over 10 MB crashes on the device with "out of memory".
I think the CGImageDestination* methods are streaming the write but I must be doing something wrong that is retaining the data.
Is there another API call I should use when writing large JPGs?
BOOL SKImageWriteToDisk(CGImageRef image, NSURL *url, SKImageProfile profile, CGFloat dpi)
{
    CFStringRef type = profile == SKImageProfileCompressed ? kUTTypeJPEG : kUTTypePNG;
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)url, type, 1, NULL);
    if (!destination) {
        return NO;
    }
    NSMutableDictionary *properties =
    [@{
        (id)kCGImagePropertyDPIHeight: @(dpi),
        (id)kCGImagePropertyDPIWidth: @(dpi)
    } mutableCopy];
    if (profile == SKImageProfileCompressed) {
        properties[(id)kCGImageDestinationLossyCompressionQuality] = @(0.8f);
    }
    CGImageDestinationAddImage(destination, image, (__bridge CFDictionaryRef)properties);
    BOOL finalizeSuccess = CGImageDestinationFinalize(destination); // crash!
    CFRelease(destination);
    return finalizeSuccess;
}
I'm trying to take a video created using the iVidCap plugin and add audio to it. Basically the exact same thing as in this question: Writing video + generated audio to AVAssetWriterInput, audio stuttering. I've used the code from this post as a basis to try and modify the iVidCap.mm file myself, but the app always crashes in endRecordingSession.
I'm not sure how I need to modify endRecordingSession to accommodate the audio (the original plugin just creates a video file). Here is the function:
- (int)endRecordingSession:(VideoDisposition)action {
    NSLog(@"Start endRecordingSession");
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSLog(@"Auto released pool");
    NSString *filePath;
    BOOL success = false;
    [videoWriterInput markAsFinished];
    NSLog(@"Mark video writer input as finished");
    //[audioWriterInput markAsFinished];
    // Wait for the video status to become known.
    // Is this really doing anything?
    int status = videoWriter.status;
    while (status == AVAssetWriterStatusUnknown) {
        NSLog(@"Waiting for video to complete...");
        [NSThread sleepForTimeInterval:0.5f];
        status = videoWriter.status;
    }
    NSLog(@"Video completed");
    @synchronized(self) {
        success = [videoWriter finishWriting];
        NSLog(@"Success: %d", success);
        if (!success) {
            // We failed to successfully finalize the video file.
            NSLog(@"finishWriting returned NO");
        } else {
            // The video file was successfully written to the Documents folder.
            filePath = [[self getDocumentsFileURL:videoFileName] path];
            if (action == Save_Video_To_Album) {
                // Move the video to an accessible location on the device.
                NSLog(@"Temporary video filePath=%@", filePath);
                if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(filePath)) {
                    NSLog(@"Video IS compatible. Adding it to photo album.");
                    UISaveVideoAtPathToSavedPhotosAlbum(filePath, self, @selector(copyToPhotoAlbumCompleteFromVideo:didFinishSavingWithError:contextInfo:), nil);
                } else {
                    NSLog(@"Video IS NOT compatible. Could not be added to the photo album.");
                    success = NO;
                }
            } else if (action == Discard_Video) {
                NSLog(@"Video cancelled. Removing temporary video file: %@", filePath);
                [self removeFile:filePath];
            }
        }
        [self cleanupWriter];
    }
    isRecording = false;
    [pool drain];
    return success;
}
Right now it crashes on [videoWriter finishWriting]. I tried adding [audioWriterInput markAsFinished], but then it crashes on that. I would contact the original poster since it seems like they got it working, but there doesn't seem to be a way to send private messages.
Does anyone have any suggestions on how I can get this to work or why it's crashing? I've tried my best to figure this out but I'm pretty new to Obj-C. I can post the rest of the code if needed (a lot of it is in the original post referenced earlier).
The issue might actually be in the writeAudioBuffer function.
If you copied the code from that post but didn't change it, then you will certainly have some problems.
You need to do something like this:
if ( ![self waitForAudioWriterReadiness]) {
NSLog(#"WARNING: writeAudioBuffer dropped frame after wait limit reached.");
return 0;
}
OSStatus status;
CMBlockBufferRef bbuf = NULL;
CMSampleBufferRef sbuf = NULL;
size_t buflen = n * nchans * sizeof(float);
CMBlockBufferRef tmp_bbuf = NULL;
status = CMBlockBufferCreateWithMemoryBlock(
kCFAllocatorDefault,
samples,
buflen,
kCFAllocatorDefault,
NULL,
0,
buflen,
0,
&tmp_bbuf);
if (status != noErr || !tmp_bbuf) {
NSLog(#"CMBlockBufferCreateWithMemoryBlock error");
return -1;
}
// Copy the buffer so that we get a copy of the samples in memory.
// CMBlockBufferCreateWithMemoryBlock does not actually copy the data!
//
status = CMBlockBufferCreateContiguous(kCFAllocatorDefault, tmp_bbuf, kCFAllocatorDefault, NULL, 0, buflen, kCMBlockBufferAlwaysCopyDataFlag, &bbuf);
//CFRelease(tmp_bbuf); // causes abort?!
if (status != noErr) {
NSLog(#"CMBlockBufferCreateContiguous error");
//CFRelease(bbuf);
return -1;
}
CMTime timestamp = CMTimeMake(sample_position_, 44100);
status = CMAudioSampleBufferCreateWithPacketDescriptions(
kCFAllocatorDefault, bbuf, TRUE, 0, NULL, audio_fmt_desc_, 1, timestamp, NULL, &sbuf);
sample_position_ += n;
if (status != noErr) {
NSLog(#"CMSampleBufferCreate error");
return -1;
}
BOOL r = [audioWriterInput appendSampleBuffer:sbuf];
if (!r) {
NSLog(#"appendSampleBuffer error");
}
//CFRelease(bbuf); // crashes, don't know why.. Is there a leak here?
//CFRelease(sbuf);
return 0;
There are a few memory-management details here that I am unsure about.
Additionally be sure to use:
audioWriterInput.expectsMediaDataInRealTime = YES;
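For context, a minimal sketch of how an audio AVAssetWriterInput is typically configured before recording starts; the AAC settings are assumptions for illustration, and videoWriter stands in for the plugin's existing AVAssetWriter:
#import <AVFoundation/AVFoundation.h>

AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(channelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

NSDictionary *audioSettings = @{
    AVFormatIDKey:         @(kAudioFormatMPEG4AAC),   // assumed codec
    AVSampleRateKey:       @44100.0,                  // assumed sample rate
    AVNumberOfChannelsKey: @2,
    AVChannelLayoutKey:    [NSData dataWithBytes:&channelLayout length:sizeof(channelLayout)],
    AVEncoderBitRateKey:   @128000
};

AVAssetWriterInput *audioWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
audioWriterInput.expectsMediaDataInRealTime = YES;

// The input has to be added to the writer before startWriting is called.
[videoWriter addInput:audioWriterInput];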