CGImageSourceCreateWithURL always returns NULL - Objective-C

I need to read the properties of an image without loading or downloading it. I have implemented a simple method that uses CGImageSourceCreateWithURL to accomplish this.
My problem is that it always fails because the imageSource seems to be NULL. What can I do to fix it?
In the NSURL object I pass URLs like:
"http://www.example.com/wp-content/uploads/image.jpg", but also ALAssetsLibrary asset URLs used to retrieve images on the phone, like "assets-library://asset/asset.JPG?id=E5F41458-962D-47DD-B5EF-E606E2A8AC7A&ext=JPG".
This is my method:
-(NSString *)getPhotoInfo:(NSString *)paths {
    NSString *xmlList = @"test";
    NSURL *imageFileURL = [NSURL fileURLWithPath:paths];
    NSLog(@"imageFileURL %@", imageFileURL);
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)(imageFileURL), NULL);
    if (imageSource == NULL) {
        // Error loading image
        NSLog(@"Error loading image");
    }
    CGFloat width = 0.0f, height = 0.0f;
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, NULL);
    NSLog(@"image source %@", imageSource);
    return xmlList;
}
I have looked at these posts to try to fix it, but nothing seems to work:
CGImageSourceRef imageSource = CGImageSourceCreateWithURL returns NULL
CGImageSourceCreateWithURL with authentication
accessing UIImage properties without loading in memory the image
In my project ARC is enabled.
Thanks

If you are passing the string "http://www.example.com/wp-content/uploads/image.jpg" to -fileURLWithPath:, it's going to return nil, because that string is not a file path, it's a URL string.
Think of -fileURLWithPath: as just prepending "file://localhost/" to the string you pass in... so you'd end up with a URL that looks like "file://localhost/http://www.example.com/wp-content/uploads/image.jpg". That's not good.
You need to call [NSURL URLWithString:paths] if you're going to be passing in entire URL strings, not just filesystem paths.
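For illustration, here is a minimal sketch of how the two constructors differ, assuming paths can hold either a full URL string or a plain filesystem path (the helper name URLForImagePath: is made up for this example):
- (NSURL *)URLForImagePath:(NSString *)paths {
    // Strings that already contain a scheme (http://, assets-library://, ...)
    // must go through URLWithString:, not fileURLWithPath:.
    if ([paths rangeOfString:@"://"].location != NSNotFound) {
        return [NSURL URLWithString:paths];
    }
    // Plain filesystem paths like /var/mobile/.../image.jpg get a file:// URL.
    return [NSURL fileURLWithPath:paths];
}
Note that, as far as I know, CGImageSourceCreateWithURL is really meant for URLs whose data it can reach directly (typically file URLs); for an http URL the bytes still have to be downloaded before Image I/O can parse them, and assets-library URLs have to go through ALAssetsLibrary (see the next answer).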

Working with "assets-library://asset/asset.JPG?id.........." URLs, try this code:
-(UIImage *)resizeImageToMaxSize:(CGFloat)max anImage:(UIImage *)anImage
{
    NSData *imgData = UIImageJPEGRepresentation(anImage, 1);
    CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imgData, NULL);
    if (!imageSource)
        return nil;
    CFDictionaryRef options = (__bridge CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:
                              (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                              (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                              (id)[NSNumber numberWithFloat:max], (id)kCGImageSourceThumbnailMaxPixelSize,
                              nil];
    CGImageRef imgRef = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
    UIImage *scaled = [UIImage imageWithCGImage:imgRef];
    // Core Foundation objects are not managed by ARC, so release them explicitly
    CGImageRelease(imgRef);
    CFRelease(imageSource);
    return scaled;
}
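For completeness, a rough sketch of how you might get at the asset behind such a URL in the first place, assuming the (then current) ALAssetsLibrary API; paths holds the assets-library://... string, 640.f is an arbitrary size, and the library object must be kept alive until the blocks run:
#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init]; // keep a strong reference
[library assetForURL:[NSURL URLWithString:paths] resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    // Metadata (EXIF, pixel size, ...) without building a full UIImage:
    NSLog(@"asset metadata %@", [rep metadata]);
    // Or build the full image and hand it to the resize method above:
    UIImage *full = [UIImage imageWithCGImage:[rep fullResolutionImage]];
    UIImage *scaled = [self resizeImageToMaxSize:640.f anImage:full];
    NSLog(@"scaled to %@", NSStringFromCGSize(scaled.size));
} failureBlock:^(NSError *error) {
    NSLog(@"Could not load asset: %@", error);
}];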

Related

ZXing can't write png output

I never experienced this issue before; I was able to use the same code 3 months ago. After updating to macOS High Sierra, I can't run this code with the ZXing library. Not sure if it's related, but here is what I have:
Error:
IIOImageWriteSession:113: cannot create: '/Users/****/Desktop/test.PNG.sb-8ff9679f-SRYKGg'
error = 1 (Operation not permitted)
Code:
NSError *error = nil;
ZXMultiFormatWriter *writer = [ZXMultiFormatWriter writer];
ZXBitMatrix *result = [writer encode:@"AM233X05987"
                              format:kBarcodeFormatDataMatrix
                               width:500
                              height:500
                               error:&error];
if (result) {
    CGImageRef image = [[ZXImage imageWithMatrix:result] cgimage];
    CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:@"/Users/****/Desktop/test.PNG"];
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypePNG, 1, NULL);
    if (!destination) {
        NSLog(@"Failed to create CGImageDestination");
        return NO;
    }
    CGImageDestinationAddImage(destination, image, nil);
    if (!CGImageDestinationFinalize(destination)) {
        NSLog(@"Failed to write image to /Users/alex/Desktop/Barcodes/");
        CFRelease(destination);
        return NO;
    }
    CFRelease(destination);
} else {
    NSString *errorMessage = [error localizedDescription];
    NSLog(@"%@", errorMessage);
}
Finally found the origin of the problem: it turns out this happens because the app is sandboxed. Turning off the Sandbox fixed the issue.
To get this working with the Sandbox ON, one solution is to put up a Save panel and, if it returns OK, save the file to the Save panel's URL.
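A minimal sketch of the Save panel approach, assuming an AppKit app; barcodeImage stands in for the CGImageRef produced from the ZXImage above:
NSSavePanel *panel = [NSSavePanel savePanel];
panel.nameFieldStringValue = @"test.png";
panel.allowedFileTypes = @[@"png"];
if ([panel runModal] == NSModalResponseOK) {
    // The sandbox grants write access to the URL the user picked.
    CGImageDestinationRef destination =
        CGImageDestinationCreateWithURL((__bridge CFURLRef)panel.URL, kUTTypePNG, 1, NULL);
    if (destination) {
        CGImageDestinationAddImage(destination, barcodeImage, NULL);
        if (!CGImageDestinationFinalize(destination)) {
            NSLog(@"Failed to write image to %@", panel.URL);
        }
        CFRelease(destination);
    }
}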

No visible @interface for 'NSData' declares the selector 'initWithBase64Encoding:'

I am trying to build in Xcode but I keep getting this issue:
No visible @interface for 'NSData' declares the selector 'initWithBase64Encoding:'
and
No visible @interface for 'NSData' declares the selector 'base64Encoding'
I have looked everywhere, but there isn't a clear solution to my problem. These are the methods giving me problems:
- (NSString *)stringFromImage:(UIImage *)image
{
    if (image)
    {
        UIImage *convertImage = [GameUtility imageWithImage:image scaledToSize:CGSizeMake(80, 80)];
        NSData *dataObj = UIImageJPEGRepresentation(convertImage, 30);
        return [dataObj base64Encoding];
    }
    return @"";
}
- (UIImage *)imageFromString:(NSString *)imageString
{
    NSData *imageData = [[NSData alloc] initWithBase64Encoding:imageString];
    return [UIImage imageWithData:imageData];
}
If you are targeting OS X 10.9 or iOS 7, you can use the base64EncodedStringWithOptions: method to encode the data and the initWithBase64EncodedString:options: method to decode the string.
If you are targeting platforms earlier than this, you need to either write these methods yourself or find a library that implements them for you (as category methods on NSData). Different libraries will have different names for these methods, so make sure you check that library's documentation or its header.
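For example, the two methods from the question could be rewritten against the built-in API roughly like this (assuming an iOS 7+ deployment target and the same GameUtility resize helper from the question):
- (NSString *)stringFromImage:(UIImage *)image
{
    if (image) {
        UIImage *convertImage = [GameUtility imageWithImage:image scaledToSize:CGSizeMake(80, 80)];
        NSData *dataObj = UIImageJPEGRepresentation(convertImage, 0.3); // quality is 0.0 to 1.0
        return [dataObj base64EncodedStringWithOptions:0];
    }
    return @"";
}
- (UIImage *)imageFromString:(NSString *)imageString
{
    NSData *imageData = [[NSData alloc] initWithBase64EncodedString:imageString
                                                            options:NSDataBase64DecodingIgnoreUnknownCharacters];
    return [UIImage imageWithData:imageData];
}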
You can convert your UIImage to NSData and then convert that data into a string using an encodeBase64WithData: helper:
NSString *Image = [self encodeBase64WithData:[imageDict objectForKey:@"Image"]];
and then convert the string back into a UIImage like this:
[UIImage imageWithData:[self decodeBase64WithString:[registerDataDict objectForKey:@"Image"]]];
Hope it helps you.
Use this base64 category file for encoding and decoding:
- (NSString *)stringFromImage:(UIImage *)image
{
    if (image)
    {
        UIImage *convertImage = [GameUtility imageWithImage:image scaledToSize:CGSizeMake(80, 80)];
        NSData *dataObj = UIImageJPEGRepresentation(convertImage, 30);
        return [dataObj base64EncodedString];
    }
    return @"";
}
- (UIImage *)imageFromString:(NSString *)imageString
{
    NSData *imageData = [imageString base64DecodedData];
    return [UIImage imageWithData:imageData];
}
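As a quick sanity check, a round-trip along these lines should work, assuming the category methods above are in scope (the "photo.jpg" image name is just a placeholder):
UIImage *original = [UIImage imageNamed:@"photo.jpg"]; // placeholder image
NSString *encoded = [self stringFromImage:original];
UIImage *decoded = [self imageFromString:encoded];
NSLog(@"decoded image size: %@", NSStringFromCGSize(decoded.size)); // expect 80x80 after the resize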

Using libjpeg from Objective-C in Xcode

Basically I am writing a screen capture app and want the smallest possible image size.
I am trying to capture my Mac screen and save it as a JPEG file.
I wanted to capture the screen using the Cocoa API [CGWindowListCreateImage] and save the file as JPEG using libjpeg9. I included the static library [libjpeg.a] and the header files [jpeglib.h, jconfig.h, jerror.h and jmorecfg.h] using the "Add Files to Project_Name" option in Xcode.
The code I used to save as a JPEG file is:
int write_jpeg_file( char *filename )
{
    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    /* this is a pointer to one row of image data */
    JSAMPROW row_pointer[1];
    FILE *outfile = fopen( filename, "wb" );
    if ( !outfile )
    {
        printf("Error opening output jpeg file %s\n!", filename );
        return -1;
    }
    cinfo.err = jpeg_std_error( &jerr );
    jpeg_create_compress(&cinfo);
    jpeg_stdio_dest(&cinfo, outfile);
    /* Setting the parameters of the output file here */
    /* width, height, bytes_per_pixel, color_space and raw_image are assumed
       to be file-scope variables filled in by the capture code */
    cinfo.image_width = width;
    cinfo.image_height = height;
    cinfo.input_components = bytes_per_pixel;
    cinfo.in_color_space = color_space;
    /* default compression parameters, we shouldn't be worried about these */
    jpeg_set_defaults( &cinfo );
    /* Now do the compression .. */
    jpeg_start_compress( &cinfo, TRUE );
    /* like reading a file, this time write one row at a time */
    while( cinfo.next_scanline < cinfo.image_height )
    {
        row_pointer[0] = &raw_image[ cinfo.next_scanline * cinfo.image_width * cinfo.input_components ];
        jpeg_write_scanlines( &cinfo, row_pointer, 1 );
    }
    /* similar to read file, clean up after we're done compressing */
    jpeg_finish_compress( &cinfo );
    jpeg_destroy_compress( &cinfo );
    fclose( outfile );
    /* success code is 1! */
    return 1;
}
When I build and run the project, I get the following error:
Parse Issue
Expected }
on the line
typedef enum { FALSE = 0, TRUE = 1 } boolean;
in jmorecfg.h, which is a libjpeg header file. I am not sure why I am getting this error. Please help me.
I know there is a way to save the file using the Cocoa API instead of using libjpeg.
I first tried it in Xcode using Cocoa and I am able to get the JPEG file, but the file size is relatively big (~200 KB). When I create the image in C using libjpeg, the file size is ~100 KB.
The code I used:
CGFloat imageCompression = 0.6;

- (NSImage *)captureImageForRect:(NSRect)rect {
    CGImageRef screenShot = CGWindowListCreateImage(CGRectInfinite, kCGWindowListOptionOnScreenOnly, kCGNullWindowID, kCGWindowImageDefault);
    NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithCGImage:screenShot];
    NSLog(@"Bits per pixel : %ld", [imageRep bitsPerPixel]);
    NSImage *result = [[NSImage alloc] init];
    [result addRepresentation:imageRep];
    screenShot = nil;
    imageRep = nil;
    return result;
}

- (NSData *)jpegReresentationOfImage:(NSImage *)image {
    NSBitmapImageRep *myBitmapImageRep = nil;
    NSSize imageSize = [image size];
    [image lockFocus];
    NSRect imageRect = NSMakeRect(0, 0, imageSize.width, imageSize.height);
    myBitmapImageRep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:imageRect];
    [image unlockFocus];
    // set up the options for creating a JPEG
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithDouble:imageCompression], NSImageCompressionFactor,
                             [NSNumber numberWithBool:NO], NSImageProgressive,
                             nil];
    NSData *imageData = [myBitmapImageRep representationUsingType:NSJPEGFileType properties:options];
    myBitmapImageRep = nil;
    return imageData;
}

- (void)captureAndSave {
    NSString *fileName = @"/Volumes/Official/images/output/Mac_Window_1.jpg";
    NSImage *screenshot = [self captureImageForRect:[[NSScreen mainScreen] frame]];
    NSData *jpgRep = [self jpegReresentationOfImage:screenshot];
    [jpgRep writeToFile:[fileName stringByExpandingTildeInPath] atomically:NO];
    [NSApp terminate:self];
}
Adding the following line to jconfig.h solved the problem. After adding it, I no longer get the compile error and I am able to use libjpeg inside my Objective-C program:
#define USE_MAC_MEMMGR
In jmemmac.c, I found the following code, which helped me figure it out. Just adding the line above worked!
#ifndef USE_MAC_MEMMGR /* make sure user got configuration right */
You forgot to define USE_MAC_MEMMGR in jconfig.h. /* deliberate syntax error */
#endif

Crash when calling CGImageDestinationFinalize with large photo

In the code below I am crashing on the CGImageDestinationFinalize() line. The CGImage is a large JPG photo. It works fine if I use a smaller JPG, but anything over 10MB crashes on the device with "out of memory".
I think the CGImageDestination* functions stream the write, but I must be doing something wrong that is retaining the data.
Is there another API call I should use when writing large JPGs?
BOOL SKImageWriteToDisk(CGImageRef image, NSURL *url, SKImageProfile profile, CGFloat dpi)
{
    CFStringRef type = profile == SKImageProfileCompressed ? kUTTypeJPEG : kUTTypePNG;
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)url, type, 1, NULL);
    if (!destination) {
        return NO;
    }
    NSMutableDictionary *properties =
    [@{
        (id)kCGImagePropertyDPIHeight: @(dpi),
        (id)kCGImagePropertyDPIWidth: @(dpi)
    } mutableCopy];
    if (profile == SKImageProfileCompressed) {
        properties[(id)kCGImageDestinationLossyCompressionQuality] = @(0.8f);
    }
    CGImageDestinationAddImage(destination, image, (__bridge CFDictionaryRef)properties);
    BOOL finalizeSuccess = CGImageDestinationFinalize(destination); // crash!
    CFRelease(destination);
    return finalizeSuccess;
}

add/pass (exif) thumbnail to CGImageDestinationRef

Apple's CGImageDestinationRef reference mentions that it can contain thumbnail images as well as properties for each image. The properties part works nicely, by setting the properties parameter when adding an image to the CGImageDestinationRef, but I can't figure out how to add the thumbnail.
I want to add a thumbnail (or pass it along directly from a CGImageSourceRef) to a CGImageDestinationRef, so that when the resulting image is opened with a CGImageSourceRef I can use CGImageSourceCreateThumbnailAtIndex to retrieve the original thumbnail.
NSDictionary *thumbOpts = [NSDictionary dictionaryWithObjectsAndKeys:
                           (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                           (id)kCFBooleanFalse, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                           [NSNumber numberWithInt:128], (id)kCGImageSourceThumbnailMaxPixelSize,
                           nil];
CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(iSource, 0, (CFDictionaryRef)thumbOpts);
The code above returns the thumbnail for images that have one stored, but when I pass the image through a CGImageDestinationRef, the thumbnail is lost.
Is there some way to keep the original thumbnail (or create a new one)? The CGImageProperties Reference doesn't seem to have any key where the thumbnail could be stored.
One way I've found to get a thumbnail back is to specify kCGImageSourceCreateThumbnailFromImageIfAbsent in the options dictionary.
Specify the dictionary as follows:
NSDictionary *thumbOpts = [NSDictionary dictionaryWithObjectsAndKeys:
                           (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                           (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                           [NSNumber numberWithInt:128], (id)kCGImageSourceThumbnailMaxPixelSize,
                           nil];
Update: In order to add a secondary (thumbnail) image to the original CGImageDestinationRef, do the following after you've added the primary image:
size_t thumbWidth = 640;
size_t thumbHeight = imageHeight / (imageWidth / thumbWidth);
size_t thumbBytesPerRow = thumbWidth * 4;
size_t thumbTotalBytes = thumbBytesPerRow * thumbHeight;
void *thumbData = malloc(thumbTotalBytes);
CGContextRef thumbContext = CGBitmapContextCreate(thumbData,
thumbWidth,
thumbHeight,
8,
thumbBytesPerRow,
CGColorSpaceCreateDeviceRGB(),
kCGImageAlphaNoneSkipFirst);
CGRect thumbRect = CGRectMake(0.f, 0.f, thumbWidth, thumbHeight);
CGContextDrawImage(thumbContext, thumbRect, fullImage);
CGImageRef thumbImage = CGBitmapContextCreateImage(thumbContext);
CGContextRelease(thumbContext);
CGImageDestinationAddImage(destination, thumbImage, nil);
CGImageRelease(thumbImage);
Then in order to get the thumbnail back out, do the following
CFURLRef imageFileURLRef = (__bridge CFURLRef)url;
NSDictionary *sourceOptions = @{(id)kCGImageSourceShouldCache: (id)kCFBooleanFalse,
                                (id)kCGImageSourceTypeIdentifierHint: (id)kUTTypeTIFF};
CFDictionaryRef sourceOptionsRef = (__bridge CFDictionaryRef)sourceOptions;
CGImageSourceRef imageSource = CGImageSourceCreateWithURL(imageFileURLRef, sourceOptionsRef);
CGImageRef thumb = CGImageSourceCreateImageAtIndex(imageSource, 1, sourceOptionsRef);
UIImage *thumbImage = [[UIImage alloc] initWithCGImage:thumb];
CFRelease(imageSource);
CGImageRelease(thumb);
OK, so since no one responded (even during the bounty), and after the hours I spent trying to pass the thumbnail, my guess is that the documentation for CGImageDestination is misleading. There doesn't seem to be any (documented) way to pass an image thumbnail to a CGImageDestinationRef (at least not up to, and including, OS X 10.7.0 and iOS 5.0).
If anyone thinks otherwise, post an answer and I'll accept it (if it's correct).
There is a half-documented property for embedding EXIF thumbnails on iOS 8.0+ and OS X 10.10+: kCGImageDestinationEmbedThumbnail.
Apple's developer documentation is empty, but the header file (CGImageDestination.h) contains the following comment:
/* Enable or disable thumbnail embedding for JPEG and HEIF.
 * The value should be kCFBooleanTrue or kCFBooleanFalse. Defaults to kCFBooleanFalse */
IMAGEIO_EXTERN const CFStringRef kCGImageDestinationEmbedThumbnail IMAGEIO_AVAILABLE_STARTING(__MAC_10_10, __IPHONE_8_0);
However, you can't specify arbitrary thumbnail data; the thumbnail is created for you automatically.
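A rough sketch of how that key might be used when writing a JPEG (image and url are placeholders and the 0.8 quality is arbitrary):
NSDictionary *destOptions = @{
    (id)kCGImageDestinationLossyCompressionQuality: @(0.8),
    (id)kCGImageDestinationEmbedThumbnail: (id)kCFBooleanTrue   // JPEG/HEIF only, iOS 8 / OS X 10.10+
};
CGImageDestinationRef destination =
    CGImageDestinationCreateWithURL((__bridge CFURLRef)url, kUTTypeJPEG, 1, NULL);
if (destination) {
    CGImageDestinationAddImage(destination, image, (__bridge CFDictionaryRef)destOptions);
    if (!CGImageDestinationFinalize(destination)) {
        NSLog(@"Failed to write image to %@", url);
    }
    CFRelease(destination);
}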