Using NSBitmapImageRep to convert a GIF - objective-c

I am a developer from China. I am trying to use NSBitmapImageRep to convert a GIF image into NSData, but the converted GIF has no animation; it seems to contain only the first frame. This is my code. I don't know what the problem is. I hope to get your help. Thank you.
NSData *gifData = [gifImage TIFFRepresentation];
NSBitmapImageRep *bitMapRep = [NSBitmapImageRep imageRepWithData:gifData];
NSData *data = [bitMapRep representationUsingType:bitmapType properties:nil];
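As far as I can tell, the animation is lost because -TIFFRepresentation flattens the NSImage, so the round trip through NSBitmapImageRep only ever sees one frame. A minimal sketch of how to confirm this, assuming the original GIF bytes are still available somewhere (gifURL below is a hypothetical file URL):
#import <ImageIO/ImageIO.h>
// Reading the original file bytes keeps every frame, whereas the
// TIFF round trip above flattens the image to a single frame.
NSData *originalGifData = [NSData dataWithContentsOfURL:gifURL]; // gifURL is hypothetical
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)originalGifData, NULL);
size_t frameCount = CGImageSourceGetCount(src); // > 1 for an animated GIF
NSLog(@"frames: %zu", frameCount);
CFRelease(src);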

Related

How can I extract image information dumped in NSData

I have as an input the dump of an image in an NSData object. Now I want to extract relevant information about the image from this object, like the number of pixels, number of bits per pixel, etc.
Can anyone tell me how to extract this info from the NSData object dump?
P.S.: I have gone through the documentation of the NSData class, but could not pick out the relevant methods.
The easiest way is to actually build a UIImage object from the NSData and then extract the info from that UIImage:
UIImage *image = [UIImage imageWithData:yourData];
NSLog(@"Image is %.0fx%.0f", image.size.width, image.size.height);
If you are only interested in the properties of the image and don't want to actually build its full representation in memory, take a look at CGImageSource:
#import <ImageIO/ImageIO.h>
CGImageSourceRef imgSrc = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
size_t nbImages = CGImageSourceGetCount(imgSrc);
for (size_t idx = 0; idx < nbImages; ++idx)
{
    // CGImageSourceCopyPropertiesAtIndex follows the Copy rule, so hand
    // ownership to ARC instead of using a plain __bridge cast.
    NSDictionary *props = CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(imgSrc, idx, NULL));
    NSLog(@"properties for image %zu in imageSource: %@", idx, props);
}
CFRelease(imgSrc);
[EDIT] For this to work, obviously add ImageIO.framework to your "Link Binary With Libraries" build phase.
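If you need the specific values the question mentions, they can be read straight out of that properties dictionary; a small sketch (the keys are declared in ImageIO's CGImageProperties.h, and note that kCGImagePropertyDepth is bits per color sample rather than bits per pixel):
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
NSDictionary *props = CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(src, 0, NULL));
NSNumber *pixelWidth    = props[(__bridge NSString *)kCGImagePropertyPixelWidth];
NSNumber *pixelHeight   = props[(__bridge NSString *)kCGImagePropertyPixelHeight];
NSNumber *bitsPerSample = props[(__bridge NSString *)kCGImagePropertyDepth]; // bits per color sample
NSLog(@"%@ x %@ px, %@ bits per sample", pixelWidth, pixelHeight, bitsPerSample);
CFRelease(src);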
Convert the data to a UIImage, then take a look at this post.
UIImage *image = [UIImage imageWithData:data];

Convert data from URL into Image in iPhone

Here I'm working with a static URL.
My code:
NSURL *url = [[NSURL alloc] initWithString:@"http://cns.bu.edu/~lgrady/diagonal_line_nobreak_seg.jpg"];
NSData *myData = [NSData dataWithContentsOfURL:url];
img.image = [UIImage imageWithData:myData];
With this I can see the image in the image view.
But when I pass an internal web-service URL, I get the image back as base64-encoded bytes instead of an image, like below:
{"AppBanner":"data:image\/png;base64,iVBORw0KGgoAAAANSUhEUg.....VORK5CYII="}
With the above code I can't convert this data into an image.
So how can I convert this data into an image?
Any solution?
Thanks in advance.
Use NSData-Base64 to convert the base64 string into NSData and then into a UIImage:
NSString *strBase64 = [yourResponseDict objectForKey:@"AppBanner"];
//decode strBase64 using the NSData-Base64 library
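If you'd rather not pull in the NSData-Base64 category, the SDK's built-in base64 API (iOS 7 and later) does the same job; a sketch, assuming the JSON has already been parsed into yourResponseDict, and noting that the "data:image/png;base64," prefix has to be stripped before decoding:
NSString *strBase64 = [yourResponseDict objectForKey:@"AppBanner"];
// Drop everything up to and including the comma so only the raw base64 payload remains.
NSRange comma = [strBase64 rangeOfString:@","];
if (comma.location != NSNotFound) {
    strBase64 = [strBase64 substringFromIndex:comma.location + 1];
}
NSData *imageData = [[NSData alloc] initWithBase64EncodedString:strBase64
                                                         options:NSDataBase64DecodingIgnoreUnknownCharacters];
img.image = [UIImage imageWithData:imageData]; // img is the image view from the question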

Converting a UIImage to jpeg/png file for sharing using AddThis Plug-in

I want to convert a UIImage into a format such as a jpeg or png so that I can then share that file using the iOS plug-in called "AddThis".
I tried to share it using just the UIImage but the plug-in doesn't support it so I need to find a way to convert the UIImage to a jpeg first, then add it into this code:
[AddThisSDK shareImage:[UIImage imageNamed:@"test.jpg"] withService:@"twitter" title:@"I'm sharing something" description:@"Random description of image"];
The code has to have shareImage:[UIImage imageNamed:@""], otherwise an error occurs.
So far I've tried to convert it using UIImageJPEGRepresentation but I don't think I've done it properly. To be honest I tried to do it similarly to how you'd convert it straight from taking an image:
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"photo_boom.jpg"];
[UIImageJPEGRepresentation(shareImage, 1.0) writeToFile:jpgPath atomically:YES];
NSError *error;
NSFileManager *fileMgr = [NSFileManager defaultManager];
NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"photo_boom.jpg"];
NSLog(@"Documents directory: %@", [fileMgr contentsOfDirectoryAtPath:documentsDirectory error:&error]);
Something tells me this isn't the correct way... mainly because I haven't been able to get it to work.
I'd really appreciate any kind of help!
Basically, I've made a UIImage from converting a UIView:
UIGraphicsBeginImageContext(firstPage.bounds.size);
[firstPage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
I then want to give it a jpeg format, because when I tried to simply pass it in as
[AddThisSDK shareImage:image withService:@"twitter" title:@"I'm sharing something" description:@"Random description of image"];
it gives me an error.
UIImage has its own internal representation of an image, so it's irrelevant whether you load it with jpeg or png data.
The API call you're interested in takes a UIImage as its first parameter, so something along the lines of
[AddThisSDK shareImage:[UIImage imageNamed:@"photo_boom.jpg"]
           withService:@"twitter"
                 title:@"I'm sharing something"
           description:@"Random description of image"];
should work, provided photo_boom.jpg is included in your bundle. If you're loading a previously saved image from a folder, you'll need something like this:
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"photo_boom.jpg"];
UIImage *myImage = [UIImage imageWithContentsOfFile:jpgPath];
[AddThisSDK shareImage:myImage
           withService:@"twitter"
                 title:@"I'm sharing something"
           description:@"Random description of image"];
If that doesn't work, have you tried putting a breakpoint on the AddThisSDK line, and checking the value of image? Type po image on the console.
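Putting the two halves together, a sketch of the full round trip the question describes: render the view, write a JPEG into the Documents directory, reload it, then share it. The view name firstPage and the AddThis parameters come from the question; everything else is illustrative.
UIGraphicsBeginImageContext(firstPage.bounds.size);
[firstPage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *rendered = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Write the JPEG into Documents rather than straight into the home directory.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *jpgPath = [documentsDir stringByAppendingPathComponent:@"photo_boom.jpg"];
[UIImageJPEGRepresentation(rendered, 0.9) writeToFile:jpgPath atomically:YES];
UIImage *reloaded = [UIImage imageWithContentsOfFile:jpgPath];
[AddThisSDK shareImage:reloaded
           withService:@"twitter"
                 title:@"I'm sharing something"
           description:@"Random description of image"];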

rotate CGImage on disk using CGImageSource/CGImageDestination?

I'm working on an application that needs to take a picture using UIImagePickerController, display a thumbnail of it in the app, and also submit that picture to a server using ASIFormDataRequest (from ASIHTTPRequest).
I want to use setFile:withFileName:andContentType:forKey: from ASIFormDataRequest since in my experience it's faster than trying to submit an image using UIImageJPEGRepresentation and submitting raw NSData. To that end I'm using CGImageDestination and creating an image destination with a url, saving that image to disk, and then uploading that file on disk.
In order to create the thumbnail I'm using CGImageSourceCreateThumbnailAtIndex (see docs) and creating an image source with the path of the file I just saved.
My problem is that no matter what options I pass into the image destination or the thumbnail creation call, my thumbnail always comes out rotated 90 degrees counterclockwise. The uploaded image is also rotated. I've tried explicitly setting the orientation in the options of the image using CGImageDestinationSetProperties but it doesn't seem to take. The only solution I've found is to rotate the image in memory, but I really want to avoid that since doing so doubles the time it takes for the thumbnail+saving operation to finish. Ideally I'd like to be able to rotate the image on disk.
Am I missing something in how I'm using CGImageDestination or CGImageSource? I'll post some code below.
Saving the image:
NSURL *filePath = [NSURL fileURLWithPath:self.imagePath];
CGImageRef imageRef = [self.image CGImage];
CGImageDestinationRef ref = CGImageDestinationCreateWithURL((CFURLRef)filePath, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(ref, imageRef, NULL);
NSDictionary *props = [NSDictionary dictionaryWithObjectsAndKeys:
                       [NSNumber numberWithFloat:1.0], kCGImageDestinationLossyCompressionQuality,
                       nil];
//Note that setting kCGImagePropertyOrientation didn't work here for me
CGImageDestinationSetProperties(ref, (CFDictionaryRef) props);
CGImageDestinationFinalize(ref);
CFRelease(ref);
Generating the thumbnail:
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)filePath, NULL);
if (!imageSource)
return;
CFDictionaryRef options = (CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:
(id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
(id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
(id)[NSNumber numberWithFloat:THUMBNAIL_SIDE_LENGTH], (id)kCGImageSourceThumbnailMaxPixelSize, nil];
CGImageRef imgRef = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
UIImage *thumb = [UIImage imageWithCGImage:imgRef];
CGImageRelease(imgRef);
CFRelease(imageSource);
And then to upload the image I just use
[request setFile:path withFileName:fileName andContentType:contentType forKey:@"photo"];
where path is the path to the file saved with the code above.
As far as I know and after trying lots of different things, this cannot be done with current public APIs and has to be done in memory.
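For reference, the in-memory fallback mentioned above usually looks something like this: redraw the UIImage into a fresh context so the pixels are physically rotated to match its imageOrientation, then save and upload that copy. A sketch, assuming ARC:
UIImage *NormalizedImage(UIImage *image)
{
    // Drawing honours imageOrientation, so the result has "up" pixel data.
    if (image.imageOrientation == UIImageOrientationUp)
        return image;
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}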

Error saving NSImage as NSData

I am using the following code to save a frame of a movie to my desktop:
NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]] autorelease];
[image addRepresentation:imageRep];
CVBufferRelease(imageBuffer);
NSArray *representations = [image representations];
NSData *bitmapData = [NSBitmapImageRep representationOfImageRepsInArray:representations usingType:NSJPEGFileType properties:nil];
[bitmapData writeToFile:@"/Users/ricky/Desktop/MyImage.jpeg" atomically:YES];
At the second last line of code, I receive the following messages in the console, with no result being saved to the desktop:
<Error>: CGImageDestinationFinalize image destination does not have enough images
CGImageDestinationFinalize failed for output type 'public.jpeg'
The NSImage is still an allocated object for the entire method call, so I'm not sure why I am receiving complaints about an insufficient number of images.
I'd appreciate any help.
Thanks in advance,
Ricky.
I think the source of the problem is that you're passing an array of NSCIImageRep objects to representationOfImageRepsInArray:usingType:properties:, which I believe expects an array of NSBitmapImageRep objects.
What you want to do is create an NSBitmapImageRep from your CIImage. Then you can use that to write to disk. That would be roughly:
CIImage *myImage = [CIImage imageWithCVImageBuffer:imageBuffer];
NSBitmapImageRep *bitmapRep = [[[NSBitmapImageRep alloc] initWithCIImage:myImage] autorelease];
NSData *jpegData = [bitmapRep representationUsingType:NSJPEGFileType properties:nil];
[jpegData writeToFile:@"/Users/ricky/Desktop/MyImage.jpeg" atomically:YES];
Of course, you'd want to handle any error cases and probably pass a properties dictionary to fine-tune the JPEG creation.
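For that properties dictionary, the usual thing to tune for JPEG output is the compression level via NSImageCompressionFactor; a small sketch, reusing bitmapRep from the snippet above:
NSDictionary *jpegProps = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.9]
                                                      forKey:NSImageCompressionFactor];
NSData *jpegData = [bitmapRep representationUsingType:NSJPEGFileType properties:jpegProps];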
I'm sorry, I don't really know why your code doesn't work, but here is a different way of approaching it (and I think more efficient than your CVImageBuffer to CIImage to NSCIImageRep to NSImage to NSData route, albeit at a slightly lower level):
CVImageBuffer to CGImage
CGImage to jpg file
I don't have code ready-made to do this, but extracting the right stuff from those examples should be straightforward.
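A sketch of that lower-level route (CVImageBuffer to CGImage via a CIContext, then CGImage to a JPEG file with CGImageDestination), written in the same non-ARC style as the question; contextWithOptions: is assumed to be available, and older OS X SDKs may need one of the other CIContext constructors:
CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];
// kUTTypeJPEG comes from CoreServices; link ImageIO for the destination calls.
NSURL *outURL = [NSURL fileURLWithPath:@"/Users/ricky/Desktop/MyImage.jpeg"];
CGImageDestinationRef dest = CGImageDestinationCreateWithURL((CFURLRef)outURL, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(dest, cgImage, NULL);
CGImageDestinationFinalize(dest);
CFRelease(dest);
CGImageRelease(cgImage);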