Rotate image in IKImageBrowserView (Cocoa/Objective-C)

I'm looking for your help with Cocoa's IKImageBrowserView.
I have the browser, with images inside it. The images are taken from a folder that I provide to it, roughly like this:
[importedImages removeAllObjects];
[images removeAllObjects];
NSArray *onlyPNGs = [self getAllPngsInLibraryFolder];
NSString *imagesPath = [[FileHelperManager defaultManager] getPathToImagesFolder];
for (NSString *str in onlyPNGs)
{
    NSString *mergedPath = [imagesPath stringByAppendingString:str];
    [self addAnImageWithPath:mergedPath];
}
[self updateDatasource];
When I choose a particular image from the image browser, I show it in an NSImageView and provide the ability to modify some of its parameters in the database where I store information about all of the images.
The problem I have run into is that I want to support changing the rotation of the chosen image. I can rotate the NSImageView preview of the selected image without any problem, but how can I then rotate that image inside the image browser as well?
It would be great if somebody could help me with that.
Here is an illustration of what I am trying to achieve.
Steps 1 and 2 are fine, but how can I do step 3?
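One possible approach (a hedged sketch, not from the original thread): IKImageBrowserView draws whatever your IKImageBrowserItem data-source objects return from -imageRepresentation, so you can rotate the NSImage held by the selected item and then reload the browser. The item class MyBrowserItem, its image/version properties, the imageBrowser outlet, and the helper methods below are assumptions for illustration only.
// Hypothetical helper: returns a copy of `source` rotated by `degrees` around its centre.
- (NSImage *)rotatedImage:(NSImage *)source byDegrees:(CGFloat)degrees
{
    NSSize size = source.size;
    NSImage *rotated = [[NSImage alloc] initWithSize:size];

    [rotated lockFocus];
    NSAffineTransform *transform = [NSAffineTransform transform];
    [transform translateXBy:size.width / 2.0 yBy:size.height / 2.0];
    [transform rotateByDegrees:degrees];
    [transform translateXBy:-size.width / 2.0 yBy:-size.height / 2.0];
    [transform concat];
    [source drawInRect:NSMakeRect(0, 0, size.width, size.height)
              fromRect:NSZeroRect
             operation:NSCompositeSourceOver
              fraction:1.0];
    [rotated unlockFocus];
    return rotated;
}

// Hypothetical action: rotate the browser's selected item and refresh its thumbnail.
- (void)rotateSelectedItemBy:(CGFloat)degrees
{
    NSUInteger index = [[imageBrowser selectionIndexes] firstIndex];
    if (index == NSNotFound) return;

    MyBrowserItem *item = [images objectAtIndex:index]; // assumed data-source item class
    item.image = [self rotatedImage:item.image byDegrees:degrees];
    item.version += 1; // returned from the item's -imageVersion so the cached thumbnail is discarded

    // Persist the new rotation in your database here as well.
    [imageBrowser reloadData];
}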

Related

Really simple Xcode array issue

I'm programming a quiz app in Xcode and I have 10 images that I want to use in UIImageViews. I want to know how to create an array of all those images.
I also have a switch statement in which I want to pull one image from that array to use as the UIImage. If you could explain how to retrieve an image from an array as well, I would be very grateful :)
Thanks in advance!!
Instead of storing UIImages in the array, you can also store the image names:
NSArray *imagesArray = @[@"<image-name-1>", @"<image-name-2>", ..., @"<image-name-10>"];
When you want to get the image to set on the UIImageView:
UIImageView *imageView = [[UIImageView alloc] initWithFrame:<your-frame>];
[imageView setImage:[UIImage imageNamed:imagesArray[<your-index>]]];
Let me know if you need any more info.
If you need to retrieve an object from an array, objectAtIndex: can do the job for you. It doesn't matter whether you store UIImage instances or UIImageView instances; this will work with any kind of object.
Here's a little example that retrieves each item from the array using objectAtIndex::
NSArray *_imageViews = ...
for (int idx = 0; idx < _imageViews.count; idx++)
{
    UIImageView *_imageView = [_imageViews objectAtIndex:idx];
    /* do whatever you have to do with the image view */
}
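For the switch statement mentioned in the question, here is a minimal sketch using the name-array approach from the first answer; questionIndex and the imageView variable are assumptions for illustration:
// Hypothetical: pick an image name from the array based on the current question.
NSString *imageName = nil;
switch (questionIndex) {
    case 0:
        imageName = imagesArray[0];
        break;
    case 1:
        imageName = imagesArray[1];
        break;
    default:
        imageName = imagesArray[9];
        break;
}
imageView.image = [UIImage imageNamed:imageName];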

Save position in UIImagePickerController

I am developing an app which allows you to select photos and place them into different positions. The workflow is basically:
Tap an area of the screen
UIImagePickerController displays
Select a photo
Photo displays in the tapped area of the screen
I would like it so that if the user goes through this workflow for a second time, the UIImagePickerController when displayed will be showing the same album, and position within that album, that the user was last at.
I've tried saving a reference to the UIImagePickerController, as well as the UIPopoverController, so that they are created only once. However, every time I present the popover containing the UIImagePickerController, it is back at the main photos menu (e.g. Camera Roll, Photo Library, My Photo Stream).
Any ideas for how to achieve what I'm after?
You can use ALAssetsLibrary, but this will cost you more effort. The first time, use -enumerateGroupsWithTypes:usingBlock:failureBlock: to list all albums and remember the user's choice. The second time, just use that album's (an ALAssetsGroup) -enumerateAssetsUsingBlock: to list all the images and videos. Apple has a few demos you can look at: PhotosByLocation and MyImagePicker.
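A minimal sketch of that first step, assuming an assetsLibrary property holding an ALAssetsLibrary instance (the ALAssetsLibrary calls are standard; everything else is illustrative):
#import <AssetsLibrary/AssetsLibrary.h>

// List the albums so the user's choice can be remembered (e.g. by album name).
[self.assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupAlbum
                                  usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (group == nil) return; // nil marks the end of the enumeration
    NSString *name = [group valueForProperty:ALAssetsGroupPropertyName];
    NSLog(@"Found album: %@", name);
} failureBlock:^(NSError *error) {
    NSLog(@"Could not enumerate albums: %@", error);
}];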
Keep a UIImagePickerController object in your .h file (for example, imagePicker), and alloc it once (for example, in viewDidLoad):
imagePicker = [[UIImagePickerController alloc] init];
imagePicker.delegate = self;
imagePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
[self.view addSubview:imagePicker.view];
imagePicker.view.hidden = YES;
imagePicker.view.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
imagePicker.view.bounds = CGRectMake(0,20,self.view.frame.size.width, self.view.frame.size.height);
In didFinishPickingMediaWithInfo:
if ([[info valueForKey:UIImagePickerControllerMediaType] isEqualToString:@"public.image"]) {
    imagePicker.view.hidden = YES;
}
When you want to show the image picker view again, just do:
imagePicker.view.hidden = NO;
Just to point you in the right direction: you can use the asset library to show the images as a picker, and Apple's sample code MyImagePicker shows how. The method [[assetsLibrary] enumerateGroupsWithTypes:ALAssetsGroupAlbum usingBlock:^(ALAssetsGroup *group, BOOL *stop) can be used for photo albums. Using the asset library you can check which image was selected last and then use the method,
- (void)enumerateAssetsAtIndexes:(NSIndexSet *)indexSet options:(NSEnumerationOptions)options usingBlock:(ALAssetsGroupEnumerationResultsBlock)enumerationBlock;
Next time, you can use this method to enumerate from whichever image onwards you want. It accepts an index set such as [NSIndexSet indexSetWithIndexesInRange:NSMakeRange(index, count)], which should help indicate the last selected image.
To learn more about how to use the asset library, check this.
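A hedged sketch of that second step, assuming group is the remembered ALAssetsGroup and lastIndex is the position the user left off at:
NSUInteger count = [group numberOfAssets];
NSIndexSet *indexes = [NSIndexSet indexSetWithIndexesInRange:
                          NSMakeRange(lastIndex, count - lastIndex)];
[group enumerateAssetsAtIndexes:indexes
                        options:0
                     usingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if (result == nil) return; // nil marks the end of the enumeration
    // Feed these assets into your custom picker UI, starting from the saved position.
}];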
It should be possible to reach into the resulting UITableView and then find its content offset. You can do this by searching the subviews of the UIImagePickerController's view property for a table view.
for (UIView *view in controller.view.subviews) {
    if ([view isKindOfClass:[UITableView class]]) {
        contentOffset = [(UITableView *)view contentOffset];
    }
}
When you re-present the view controller, you will want to restore the content offset in a similar fashion.
Note, I haven't actually tested to see the view hierarchy of the UIImagePickerController. Verify its structure by printing its subviews. There is also no guarantee that the structure will stay the same, since you are diving into the private implementation (though it's important to note you are not actually using any private APIs so this is okay).
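Restoring could then look something like this (a sketch under the same untested-hierarchy caveat; savedContentOffset is an assumed CGPoint you stashed earlier):
for (UIView *view in controller.view.subviews) {
    if ([view isKindOfClass:[UITableView class]]) {
        // Scroll the picker's table back to where the user left it.
        [(UITableView *)view setContentOffset:savedContentOffset animated:NO];
    }
}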
Use ALAssetsLibrary. It gives the application access to the images and videos on the device. There is a demo from Apple:
https://developer.apple.com/library/ios/#documentation/AssetsLibrary/Reference/ALAssetsLibrary_Class/Reference/Reference.html
Look at that, or,
if you want to make a customized album for images and videos, here is a great example:
https://github.com/Kjuly/ALAssetsLibrary-CustomPhotoAlbum

Objective-C: Comparing an image against another image that has been previously saved

I have a question about comparing UIImages in Objective-C when one image has already been through a save and load process using the NSSearchPathForDirectoriesInDomains method.
The aim of what I want is to direct the user to a new screen upon a click depending on what the image displays.
For simplicity let's say that there are two possibilities - a black image and a green image. Clicking on the black image takes you to xib1 and clicking the green image takes you to xib2.
This is simple enough and has been working until I have implemented a save and load system.
To save I do the following:
paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
documentsDirectory = [paths objectAtIndex:0];
pngFilePath = [NSString stringWithFormat:@"%@/test.png",documentsDirectory];
data1 = [NSData dataWithData:UIImagePNGRepresentation([level1part1 objectAtIndex:0])];
[data1 writeToFile:pngFilePath atomically:YES];
and to load I do the following:
paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
documentsDirectory = [paths objectAtIndex:0];
pngFilePath = [NSString stringWithFormat:@"%@/test.png",documentsDirectory];
UIImage *image = [UIImage imageWithContentsOfFile:pngFilePath];
[button1 setImage:image forState:UIControlStateNormal];
This is fine, and when I quit the program and restart it the image is retained on the screen as I want it to be. Hypothetically, let's say that the image now appearing on button1 is the green image.
When I call the following code after clicking on the button with id of sender (this is button1):
if(sender.currentImage == [UIImage imageNamed:self.greenImage])
{
VisitAlreadyCorrectScreen *screen = [[VisitAlreadyCorrectScreen alloc] initWithNibName:nil bundle:nil];
screen.modalTransitionStyle = UIModalTransitionStyleCoverVertical;
[self presentModalViewController:screen animated:YES];
}
even though the currentImage is the green image and is the same picture as the green image I am comparing it to, the comparison doesn't work. I think this is because I saved the green image during the save process and reloaded it, so the two are held in different places in memory, as verified by the following NSLog:
Current Image: <UIImage: 0x95614c0>, and Yes Image: <UIImage: 0xde748f0>
I cannot work out how to compare the two images so that in this case they match (they both relate to the same image I have in the resource folder). Does anyone have any suggestions at all?
Please let me know if I have not explained well enough what the problem is!
Thanks in advance!
You can compare the image name, or the image URL if it was downloaded from the Internet; that will also be faster than comparing the images themselves.
Also, the problem is that by using the == operator you are comparing the memory addresses of the images, 0x95614c0 and 0xde748f0. That's why they are not equal: you are checking whether they are the same object, not whether the images are equal.
To compare the image contents, as mentioned in Fls'Zen's answer, use:
if ([UIImagePNGRepresentation(blackImage) isEqualToData:UIImagePNGRepresentation(greenImage)])
Your images certainly have different addresses, since one is loaded from your application bundle and one is loaded from the documents directory. The [UIImage imageNamed:] method only returns images from the application bundle.
If you really want to compare the images by contents, check out this SO question. In the first answer, a hash value is computed for an image; in your code, you could compare the hash values of the two images. The second answer compares the images directly, in case hashes make you nervous.
I recommend going a different route and having your application track which image is loaded outside of the image itself.
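A minimal sketch of that last suggestion, tracking the state yourself instead of comparing pixel data; the tag constants below are assumptions for illustration:
// Hypothetical tag values identifying which image a button currently shows.
enum { ButtonImageBlack = 0, ButtonImageGreen = 1 };

// When setting the image, record which one it is:
[button1 setImage:image forState:UIControlStateNormal];
button1.tag = ButtonImageGreen;

// When the button is tapped, branch on the tag instead of the UIImage pointer:
if (sender.tag == ButtonImageGreen) {
    VisitAlreadyCorrectScreen *screen = [[VisitAlreadyCorrectScreen alloc] initWithNibName:nil bundle:nil];
    screen.modalTransitionStyle = UIModalTransitionStyleCoverVertical;
    [self presentModalViewController:screen animated:YES];
}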

How To Load Images From NSHomeDirectory

So, I'm currently saving images in my UIImageView via this method:
- (void)applicationDidEnterBackground:(UIApplication*)application {
NSString *image1 = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/image1.png"];
NSString *image2 = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/image2.png"];
}
The images save just fine, but I have no idea how to set the UIImageViews images in viewDidLoad. This is what I've been trying:
- (void)viewDidLoad
{
self.imageView.image = [UIImage imageNamed:[(NSHomeDirectory *)Documents objectForKey:@"image1"]];
self.imageView2.image = [UIImage imageNamed:[(NSHomeDirectory *)Documents objectForKey:@"image1"]];
}
But, obviously that is not working. I'm having trouble understanding the basics here. Any help would be great, thanks!
The -imageNamed: method looks in the application's main bundle if the named image hasn't been cached yet. From the docs:
The name of the file. If this is the first time the image is being loaded, the method looks for an image with the specified name in the application’s main bundle.
(See UIImage class reference)
You want +imageWithContentsOfFile: instead.
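A minimal sketch of the loading side, assuming the same Documents/image1.png and Documents/image2.png paths used when saving:
- (void)viewDidLoad
{
    [super viewDidLoad];

    NSString *path1 = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/image1.png"];
    NSString *path2 = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/image2.png"];

    // Load from disk instead of the main bundle.
    self.imageView.image = [UIImage imageWithContentsOfFile:path1];
    self.imageView2.image = [UIImage imageWithContentsOfFile:path2];
}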

Memory management with image saved from UIImagePickerController

I'm writing an app in which the user takes a photo of themselves and then goes through a series of views to adjust the image, using a navigation controller. This works perfectly fine if the user takes the photo with the front camera (set as the default on devices that support it), but when I repeat the process with the back camera I get about halfway through and it crashes after throwing a memory warning.
After profiling in Instruments I see that my app's memory footprint holds at about 20-25 MB when using the lower-resolution front camera image, but with the back camera every view change adds another 33 MB or so until it crashes at about 350 MB (on a 4S).
Below is the code I'm using to handle saving the photo to the documents directory and then reading that file location to set the image on a UIImageView. The "read" portion of this code is recycled through several view controllers (in viewDidLoad) to set the saved image as the background image of each view as I go.
I have removed all my image-modification code to strip this down to the bare minimum in an attempt to isolate the problem, and I can't seem to find it. As it stands right now, all the app does is take a photo in the first view and then use that photo as the background image for about 10 more views, allocating them as the user navigates through the view stack.
Now, obviously the higher-resolution photos use more memory, but what I don't understand is why the low-resolution photos don't seem to use more and more memory as I go, whereas the high-resolution photos continuously use more and more until a crash.
How I am saving and reading the image:
- (void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
jpgData = UIImageJPEGRepresentation(image, 1);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0];
filePath = [documentsPath stringByAppendingPathComponent:#"image.jpeg"];
[jpgData writeToFile:filePath atomically:YES];
[self dismissModalViewControllerAnimated:YES];
[disableNextButton setEnabled:YES];
jpgData = [NSData dataWithContentsOfFile:filePath];
UIImage *image2 = [UIImage imageWithData:jpgData];
[imageView setImage:image2];
}
Now I know that I could try scaling the image before I save it, which I plan on looking into next, but I don't see why this doesn't work as is. Maybe I was falsely under the impression that ARC automatically deallocated views and their subviews when they leave the top of the stack.
Can anyone shed some light on why I'm stockpiling my device's memory? (Hopefully it's something simple I'm completely overlooking.) Did I somehow manage to throw ARC out the window?
EDIT: How I call for the image in my other views
- (void)loadBackground
{
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0];
NSString *filePath = [documentsPath stringByAppendingPathComponent:#"image.jpeg"];
UIImage *image = [UIImage imageWithContentsOfFile:filePath];
[backgroundImageView setImage:image];
}
How navigation between my view controllers is established:
EDIT 2:
What my basic declarations look like:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface PhotoPickerViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
{
IBOutlet UIImageView *imageView;
NSData *jpgData;
NSString *filePath;
UIImagePickerController *imagePicker;
IBOutlet UIBarButtonItem *disableNextButton;
}
@end
If relevant, how I call up my image picker:
- (void)callCameraPicker
{
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == YES)
{
NSLog(@"Camera is available and ready");
imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
imagePicker.delegate = self;
imagePicker.allowsEditing = NO;
imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices)
{
if([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2.0)
{
imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
}
}
imagePicker.modalTransitionStyle = UIModalTransitionStyleCoverVertical;
[self presentModalViewController:imagePicker animated:YES];
}
else
{
NSLog(@"Camera is not available");
UIAlertView *cameraAlert = [[UIAlertView alloc] initWithTitle:@"Error"
message:@"Your device doesn't seem to have a camera!"
delegate:self cancelButtonTitle:@"Dismiss"
otherButtonTitles:nil];
[cameraAlert show];
}
}
EDIT 3: I logged viewDidUnload, and it was in fact not being called, so I'm now calling loadBackground in viewWillAppear and setting my backgroundImageView to nil in viewDidDisappear. I expected this to help, but it made no difference.
- (void)viewWillAppear:(BOOL)animated
{
[self loadBackground];
}
- (void)viewDidDisappear:(BOOL)animated
{
NSLog(#"ViewDidDisappear");
backgroundImageView = nil;
}
The relationship between UIImage and UIImageView is not necessarily intuitive for everyone.
UIImage is a high level representation of your image data - alone, it does nothing in terms of displaying the data.
UIImageView works with UIImage to allow you to display an image.
There is no reason why multiple instances of UIImageView cannot display the same UIImage. This is nice and efficient, because there is only one in-memory representation of the image data, being shared by multiple views.
What you seem to be doing is creating a new UIImage for each one of your views, by loading it from disk. So this is a poor general design in two respects: instantiating what is effectively the same UIImage over and over again, and re-loading the same image data from disk repeatedly.
Your memory problem is really a separate issue, where you are not properly releasing the image data you keep loading into UIImage objects and UIImageViews.
In theory, you should be able to take the very first UIImage you're getting from UIImagePickerController and simply pass that reference around to your views, without reloading from disk.
If you need to save and reload from disk because of higher-level functional requirements (e.g. because the image is being changed by the user and you want to keep saving it), you'll need to be sure you are fully tearing down the previous UIView by removing it from its view hierarchy. It is helpful to set a breakpoint in the view's dealloc method to confirm it is being removed and dealloced, and to make sure any ivar references to subviews (it appears your backgroundImageView is an ivar) are set to nil. If you are not properly tearing down that backgroundImageView, it continues to hold a reference to the UIImage you set as its image property.
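A hedged sketch of passing the one UIImage along instead of re-reading it from disk in every controller; the AdjustmentViewController class and its backgroundImage property are assumptions for illustration:
// In the next controller's header: a property to receive the shared image.
@property (nonatomic, strong) UIImage *backgroundImage;

// In the previous controller, when pushing the next one:
AdjustmentViewController *next = [[AdjustmentViewController alloc] initWithNibName:nil bundle:nil];
next.backgroundImage = image; // the UIImage obtained from the picker
[self.navigationController pushViewController:next animated:YES];

// In AdjustmentViewController, set it on the view without touching the disk:
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [backgroundImageView setImage:self.backgroundImage];
}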
There are a couple of things that are curious about the code you posted:
None of your view-callback implementations call super. That’s bad! Make extra sure that you are calling super in viewDidUnload and (if you implemented it) didReceiveMemoryWarning.
Make sure you implement didReceiveMemoryWarning in a meaningful way!
You really should not be re-creating that image over and over again! I assume you are not editing the actual image because you use JPEG compression on it which — even at 100% quality — will deteriorate your image with every save…
Check your implementation of viewDidUnload and make sure to set every one of your IBOutlets to nil.
ARC is not Pixie Dust™! It just saves you a bit of typing, it does not free you from designing and maintaining your object graphs!
From your question, I see at the very least these graphs that refer to your image:
image 1 <- image-view 1 <- view-controller 1 <- navigation-controller <- key window <- application
image 1 <- image-view 1 <- view 1 <- view-controller 1 <- navigation-controller <- key window <- application
This is repeated for every view-controller, with an index shift on the view-controller, view, image view, and image. While you do need separate views and image-views for your view-controllers, I cannot think of a reason why you would want several copies of the same image.
So the first axe on your memory consumption clearly is to no longer create all those copies of the same image data — I’d estimate that this will get you half of the low-hanging memory savings.
The next thing is that ARC can only free the memory consumed by your objects if it is no longer referenced.
Memory wise, views are not exactly lightweight objects and when you build up a deep navigation stack you end up with gobs of them.
So you need to axe any unneeded strong references to those views, as well.
The level at which this has to happen is the view-controller. The latest point at which this should happen is in the view-controller's implementation of viewDidUnload.
Why the view-controller?
From what you described, the image itself is only referenced by the UIImageView — this is a bad choice, IMHO, but I digress…
UIViewController is designed to “know” when its view is needed and when it's safe to dispose of it — that's why it implements didReceiveMemoryWarning and viewDidUnload:
If the memory pressure gets too high and the view-controller's view is not “on screen”, the root implementation of didReceiveMemoryWarning will let go of its view and call viewDidUnload on itself afterwards.
This is why you must call through to super in your implementations of both of those methods.
In addition, this is why if you have strong IBOutlets that refer to subviews of the view-controller’s view, you must nil them in viewDidUnload or the system cannot reclaim the memory they occupy.
At its heart UIViewController is a big-ass finite state-machine. All of those “something-will/did-whatever” callbacks are used to transition between those states and most of the default implementations do some very important book-keeping to keep all that state in order.
If you are not invoking them in your overrides, you'll end up in inconsistent states and bad things — like this out-of-memory crasher — happen.
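A minimal sketch of what those overrides might look like in this (pre-iOS 6) setup, calling super and letting go of the outlet:
- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release retained subviews of the main view so their images can be freed.
    backgroundImageView = nil;
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Release any cached data or images that can be recreated later.
}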
Just create a separate folder and save your captured images in it. After your operation completes successfully, clear that folder's contents (or the folder itself) using NSFileManager.
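A hedged sketch of that cleanup, assuming a dedicated "Captures" folder inside Documents (the folder name is illustrative):
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *capturesPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"Captures"];

NSError *error = nil;
// Removes the folder and everything inside it.
if (![[NSFileManager defaultManager] removeItemAtPath:capturesPath error:&error]) {
    NSLog(@"Could not remove captures folder: %@", error);
}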