Objective-C: How to store UIImage

I want to store an image from the imagePickerController delegate method... I know the returned image is huge, so I will resize it. I have used SQLite for data persistence but have no idea how I can store the image...

UIImagePNGRepresentation() takes a UIImage as an argument and returns an NSData object (the image encoded as PNG). You can then write that to disk, to a database, or wherever. When it's time to get the image back, pull the data from its source and use -[UIImage initWithData:] to reconstitute it.
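For example, a minimal round trip might look like this (filePath is an assumed location, say in your app's Documents directory):

// Encode the (resized) image as PNG and write it to disk.
NSData *pngData = UIImagePNGRepresentation(resizedImage);
[pngData writeToFile:filePath atomically:YES];

// Later: read the data back and reconstitute the image.
NSData *storedData = [NSData dataWithContentsOfFile:filePath];
UIImage *restoredImage = [[UIImage alloc] initWithData:storedData];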

Related

How to convert (OS_dispatch_data *) 5018112 bytes into NSData to put into UIImage

I've been looking for an answer to this, and some posts seem like they might be what I need, but I'm not sure. I found Question #9152851 and Question #2617625 and poked around a bunch of links, but I need some direction here.
Essentially, I'm dispatching an async call to process an image using OpenCV. You can see from the code here that I'm turning it into an NSData * before sending it back to my delegate.
NSData *processedData = [NSData dataWithBytes:processedImage.data length:(processedImage.rows * processedImage.cols)];
[self.delegate onProcessedBitmapReady:processedData withFocusQuality:focusQuality];
But when I get back to my delegate, my processedBitmap is of type (OS_dispatch_data *) and shows only a byte count as its value. So, when I try to create the UIImage, it comes out null.
- (void)onProcessedBitmapReady:(NSData *)processedBitmap withFocusQuality:(double)focusQuality
{
    // Use converted image from self.captureCommand onComplete
    UIImage *image = [[UIImage alloc] initWithData:processedBitmap];
    [self saveImage:image];
}
(A screen capture of the debugger values was attached here.)
So, how do I convert those bytes (or whatever they are) into something that I can stuff into a UIImage?
Thank you in advance for your help.
NSData is actually a class cluster: the public class just provides the interface, and there are multiple specialized implementations of it. It appears that OS_dispatch_data is one such specialized implementation, made for passing data objects around between blocks, especially since your UIImage creation doesn't crash (as it would if you passed it a non-NSData object or plain garbage memory). Instead, it looks like UIImage simply doesn't recognize the format the image is in!
By the way, Apple has a great guide about the concept of class clusters, which can be found here.
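Note that -[UIImage initWithData:] expects encoded image data (PNG, JPEG, and so on), not raw pixels. Since the buffer here is rows * cols bytes, i.e. one byte per pixel, one option is to wrap the raw bytes in a CGImage yourself. A minimal sketch, assuming the OpenCV output is 8-bit single-channel grayscale (an assumption about your pixel format):

- (UIImage *)imageFromGrayscaleData:(NSData *)pixelData width:(size_t)width height:(size_t)height
{
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)pixelData);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    // 8 bits per component, 8 bits per pixel, width bytes per row.
    CGImageRef cgImage = CGImageCreate(width, height, 8, 8, width,
                                       colorSpace, kCGBitmapByteOrderDefault,
                                       provider, NULL, false,
                                       kCGRenderingIntentDefault);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    return image;
}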

Efficient Cocoa Animation w/raw bitmap data

I have a raw bitmap image of RGBA malloc'ed data; rows are, of course, a multiple of 4 bytes. This data actually originates from an AVI (24-bit BGR format), but I convert it to 32-bit ARGB. There's about 8 MB of 32-bit data (1920x1080) per frame.
For each frame:
I convert that frame's data into an NSData object via -[NSData initWithBytes:length:].
I then convert that into a CIImage object via +[CIImage imageWithBitmapData:bytesPerRow:size:format:colorSpace:].
From that CIImage, I draw it into my final NSOpenGLView context using drawImage:inRect:fromRect:. Due to the "mosaic" nature of the target images, approximately 15-20 such calls are made, with various source/destination rects.
Using a 30 Hz NSTimer that calls [self setNeedsDisplay:YES] on the NSOpenGLView, I can attain about 20-25 fps on a 2012 Mac mini (2.6 GHz i7) -- it's not rock solid at 30 Hz. This is to be expected with an NSTimer instead of a CVDisplayLink.
But... ignoring the NSTimer issue for now, are there any suggestions/pointers on making this frame-by-frame rendering a little more efficient?
Thanks!
NB: I would like to stick with CIImage objects as I'll want to access transition effects at some point.
Every frame, the call to NSData's initWithBytes:length: causes an 8 MB memory allocation and an 8 MB copy.
You can get rid of this per-frame allocation/copy by replacing the NSData object with a persistent NSMutableData object (set up once at the beginning) and using its mutableBytes as the destination buffer for the frame's 24- to 32-bit conversion.
(Alternatively, if you prefer to manage the destination-buffer memory yourself, keep the object as plain NSData, but initialize it with initWithBytesNoCopy:length:freeWhenDone:, passing NO as the last parameter.)
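A rough sketch of the persistent-buffer idea (convertBGRToARGB and aviFrameBytes are hypothetical stand-ins for your existing conversion routine and source frame):

// Set up once, at the beginning:
NSMutableData *frameBuffer = [NSMutableData dataWithLength:1920 * 1080 * 4];
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

// Per frame: convert directly into the reused buffer instead of
// allocating and copying a fresh 8 MB NSData.
convertBGRToARGB(aviFrameBytes, frameBuffer.mutableBytes, 1920, 1080);
CIImage *frameImage = [CIImage imageWithBitmapData:frameBuffer
                                       bytesPerRow:1920 * 4
                                              size:CGSizeMake(1920, 1080)
                                            format:kCIFormatARGB8
                                        colorSpace:colorSpace];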

UIImage to raw NSData / avoid compression

I have my own image downloader class; it holds a queue and downloads images one (or a certain number) at a time, writes them to the cache folder, and retrieves them from the cache folder when necessary. I also have a UIImageView subclass to which I can pass a URL; through the image downloader class it will check whether the image already exists on the device and show it if it does, or download it and show it once it has finished.
After an image finishes downloading, I do the following: I create a UIImage from the downloaded NSData, save the downloaded NSData to disk, and return the UIImage.
// This is executed in a background thread
downloadedImage = [UIImage imageWithData:downloadedData];
BOOL saved = [fileManager createFileAtPath:filePath contents:downloadedData attributes:attributes];
// Send downloadedImage to the main thread and do something with it
To retrieve an existing image I do this.
// This is executed in a background thread
if ([fileManager fileExistsAtPath:filePath])
{
    NSData *imageData = [fileManager contentsAtPath:filePath];
    retrievedImage = [UIImage imageWithData:imageData];
    // Send retrievedImage to the main thread and do something with it
}
As you can see, I always create a UIImage directly from the downloaded NSData; I never create NSData using UIImagePNGRepresentation, so the image never gets re-encoded. When you create a UIImage from compressed NSData, UIImage decompresses it right before rendering on the main thread, which blocks the UI. Since I now have a UITableView with a ton of small images in it that have to be downloaded or retrieved from disk, this is unacceptable, as it would slow down my scrolling immensely.
Now my problem. The user is also able to select a photo from the camera roll and save it, and it also has to appear in my UITableView. But I can't seem to find a way to turn the UIImage from the camera roll into NSData without using UIImagePNGRepresentation. So here's my question.
How can I convert a UIImage into uncompressed NSData so I can convert it back to a UIImage later using imageWithData so that it doesn't have to be decompressed before rendering?
or
Is there any way I can do the decompression before sending the UIImage to the main thread and cache it so it only has to be decompressed once?
Thanks in advance.
How can I convert a UIImage into uncompressed NSData so I can convert it back to a UIImage later using imageWithData so that it doesn't have to be decompressed before rendering?
What you're really asking here, I take it, is how to store the UIImage on disk in such a way that you can later read the UIImage from disk as fast as possible. You don't really care whether it is stored as NSData; you just want to be able to read it quickly. I suggest you use the ImageIO framework. Save by way of an image destination and fetch later by way of an image source.
http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Conceptual/ImageIOGuide/ikpg_dest/ikpg_dest.html
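A minimal sketch of that round trip (fileURL is an assumed location in your cache folder):

#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Save via an image destination.
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)fileURL, kUTTypePNG, 1, NULL);
CGImageDestinationAddImage(destination, image.CGImage, NULL);
CGImageDestinationFinalize(destination);
CFRelease(destination);

// Fetch later via an image source.
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
CGImageRef cgImage = CGImageSourceCreateImageAtIndex(source, 0, NULL);
UIImage *restored = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CFRelease(source);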
Is there any way I can do the decompression before sending the UIImage to the main thread and cache it so it only has to be decompressed once?
Yes, good question. That was going to be my second suggestion: use threading. This is what people have to do with tables all the time. When the table asks for the image, you either have the image already or you don't. If you don't, you supply a filler image and, in the background, fetch the real image. When the real image is ready, you have arranged to get a notification. Back on the main thread, you tell the table view to ask for the data for that row again; this time you've got the image and you supply it. The user will thus see a slight delay before the image appears. I'm sure you've seen lots of apps that behave this way (New York Times is a good example).
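In outline, inside tableView:cellForRowAtIndexPath:, the pattern might look like this (imageCache and loadImageAtURL:completion: are hypothetical names for your own cache and loader):

UIImage *cached = [self.imageCache objectForKey:url];
if (cached != nil) {
    cell.imageView.image = cached;
} else {
    // Show a filler image now; fetch the real one in the background.
    cell.imageView.image = self.placeholderImage;
    [self loadImageAtURL:url completion:^(UIImage *image) {
        [self.imageCache setObject:image forKey:url];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Ask the table for this row again, now that the image is ready.
            [tableView reloadRowsAtIndexPaths:@[indexPath]
                             withRowAnimation:UITableViewRowAnimationNone];
        });
    }];
}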
I have one further suggestion, and it may be the best of all. You speak of it taking time to decompress the image from disk. But this should take no time at all if the image is small. But the image should be small, because it's going to go into a small place - a table cell. In other words, you should shrink the images beforehand, when you first receive them, so that you are ready with the small version of each image when asked. It is a huge waste of time and memory to supply a large image that is to go into a small space.
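For example, a sketch of pre-shrinking, done once when the image first arrives (the 80x80 target is an assumed cell size):

- (UIImage *)thumbnailForImage:(UIImage *)image
{
    CGSize thumbnailSize = CGSizeMake(80.0, 80.0);
    UIGraphicsBeginImageContextWithOptions(thumbnailSize, YES, 0.0);
    // Drawing also forces decompression, so the result is ready to render.
    [image drawInRect:(CGRect){CGPointZero, thumbnailSize}];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return thumbnail;
}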
ADDED LATER: Of course you do understand that a lot of this worry would be unnecessary if you weren't saving the images to disk. I'm not at all clear on why you need to do that. I hope you have a good reason for it; but it's a heck of a lot faster, obviously, if you just hold the images ready in memory.
I found a solution:
CGImageRef downloadedImageRef = downloadedImage.CGImage;
CGDataProviderRef provider = CGImageGetDataProvider(downloadedImageRef);
NSData *data = CFBridgingRelease(CGDataProviderCopyData(provider));
// Then you can save the data
If you download the data and save it to disk, the data is compressed in PNG, JPEG, or GIF format; you are not going to be downloading uncompressed image data. So the root of your question about doing the decompression first needs to be addressed before you save the file to disk. Decompressing before you save will make the file a lot bigger, but it means that no decompression is needed before the data is read back into a CGImageRef or UIImage. It is loading and then decompressing a bunch of images that is eating your CPU and making scrolling slow.
But it is not a solution to simply hold everything in memory already decompressed, because that will use up all your app's memory and crash the phone before long. You might get away with it for some small number of images, but this is a basic design flaw that you need to address when first writing your code. If you like, you can have a look at my blog post on this topic, video-and-memory-usage-on-ios-devices; the post deals with video, but you have exactly the same issue when dealing with lots of different images. I suggest that you write your small images to disk in an uncompressed format like TIFF or BMP; that way, reading them back in is easy, as long as ImageIO supports that specific format.

UIImage causing memory leaks

Instruments is telling me that a lot of memory is being allocated when I rapidly set the image of a UIImageView in my app. I have a UIImageView that changes its image every frame in my game. When profiled with zombie checking in Instruments, the app seems to be constantly gaining live bytes at an enormous rate. Is there a way I can deallocate the UIImageView's current image to stop it from doing this? I am using ARC.
My code to assign the UIImageView's image is as follows:
aPlanet.image = [UIImage imageNamed:tempPlanetName];
Where aPlanet is the UIImageView and tempPlanetName is the name of the image. This is called every frame.
The +[UIImage imageNamed:] method loads the image and keeps the newly created UIImage object in a system-managed cache. To get rid of this problem you should use:
NSString *imgPath = [[NSBundle mainBundle] pathForResource:@"imageName" ofType:@"png"];
aPlanet.image = [[UIImage alloc] initWithContentsOfFile:imgPath];
If you are using ARC, you don't need to bother about releasing this newly allocated UIImage object created with initWithContentsOfFile:.
When you use +[UIImage imageNamed:], it loads and caches that image file. This is intended for the reuse of icons and other image resources that will be utilized more than once in your application.
Apart from it seeming somewhat unusual to update an image view with a new image every frame, you should look into alternative means of loading images that you will not need more than once - or, even if you do, when you need more control over their lifecycle.
For example, have a look at +[UIImage imageWithContentsOfFile:] (documented in the Apple Developer Library reference). The documentation explicitly states that this method does not cache the image contents.
I hope this helps. For every frame, though, I doubt your performance will be good enough with this approach; but that is probably the topic of a different question, should the need arise.

NSImage size problem

I'm using the same image resource in two different controllers, and in each controller the image is shown at a different size. The problem is that once the image has been shown at a size smaller than the original, the next time I get the image via [NSImage imageNamed:@"resource.png"], its size is set to the last size it took. I tried invoking the recache method on NSImage and also tried setting the cache mode to every possible value, but it didn't work.
Any ideas?
You should never modify an instance of NSImage obtained from imageNamed:. The returned instance is shared with other clients, so it should not be changed.
If you have to setSize: on the image, just make a copy and use that one:
NSImage *image = [[[NSImage imageNamed:@"foo.png"] copy] autorelease];
[image setSize:(NSSize){128, 128}];
The thing is that
[NSImage imageNamed:]
as you mentioned, returns the cached image for as long as it is in the cache. So what you need to do is either release the previous reference or call the object's setName: method, passing nil. Here is the documentation reference:
The NSImage class may cache a reference to the returned image object for performance in some cases. However, the class holds onto cached objects only while the object exists. If the image object is subsequently released, either because its retain count was 0 or it was not referenced anywhere in a garbage-collected application, the object may be quietly removed from the cache. Thus, if you plan to hold onto a returned image object, you must retain it like you would any Cocoa object. You can clear an image object from the cache explicitly by calling the object’s setName: method and passing nil for the image name.
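So, to force imageNamed: to load a fresh copy from the resource rather than return the stale cached instance, you can clear the name first. A small sketch:

// Evict the shared instance from the name cache so the next
// imageNamed: call loads a fresh copy from the resource file.
NSImage *cached = [NSImage imageNamed:@"resource.png"];
[cached setName:nil];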