Restarting UIPanGestureRecognizer while UIView is animating - cocoa-touch

I am trying to allow a user to grab a UIView that is being animated at the completion of a UIPanGestureRecognizer method. I am currently able to stop the view and center it under the finger using touchesBegan and the view's presentation layer. But I want the user to be able to retrigger the UIPanGestureRecognizer method without having to lift the finger up and replace it on the UIView.
I've been working on this challenge for a week or so, and have run through various search strings on SO and elsewhere, read through Ray Wenderlich's tutorials, etc., but still haven't found a solution. It looks like I'm trying to do something similar to this question that seemed unresolved:
iOS - UIPanGestureRecognizer : drag during animation
I have just been working with UIView animation blocks and have yet to dive into CABasicAnimation work. Is it possible to do what I want by changing the code below somehow?
Thank you in advance.
In FlingViewController.h:
#import <UIKit/UIKit.h>
@interface FlingViewController : UIViewController <UIGestureRecognizerDelegate>
{
UIView *myView;
}
@property (nonatomic, weak) UIView *myView;
@end
In FlingViewController.m:
#import "FlingViewController.h"
#import <QuartzCore/QuartzCore.h>
@interface FlingViewController ()
@end
@implementation FlingViewController
@synthesize myView = _myView;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
// Allocate and init view
myView = [[UIView alloc] initWithFrame:CGRectMake(self.view.center.x - 50,
self.view.center.y - 50,
100, 100)];
// Set view background color
[myView setBackgroundColor:[UIColor purpleColor]];
// Create pan gesture recognizer
UIPanGestureRecognizer *gesture = [[UIPanGestureRecognizer alloc]
initWithTarget:self action:@selector(handleGesture:)];
// Add gesture recognizer to view
gesture.delegate = self;
[myView addGestureRecognizer:gesture];
// Add myView as subview
[self.view addSubview:myView];
NSLog(#"myView added");
}
- (void)handleGesture:(UIPanGestureRecognizer *)recognizer
{
if ((recognizer.state == UIGestureRecognizerStateBegan) ||
(recognizer.state == UIGestureRecognizerStateChanged))
{
[recognizer.view.layer removeAllAnimations];
CGPoint translation = [recognizer translationInView:self.view];
recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
recognizer.view.center.y + translation.y);
[recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
else if (recognizer.state == UIGestureRecognizerStateEnded)
{
CGPoint velocity = [recognizer velocityInView:self.view];
CGFloat magnitude = sqrtf((velocity.x * velocity.x) + (velocity.y * velocity.y));
CGFloat slideMult = magnitude / 200;
float slideFactor = 0.1 * slideMult;
CGPoint finalPoint = CGPointMake(recognizer.view.center.x + (velocity.x * slideFactor),
recognizer.view.center.y + (velocity.y * slideFactor));
finalPoint.x = MIN(MAX(finalPoint.x, 0), self.view.bounds.size.width);
finalPoint.y = MIN(MAX(finalPoint.y, 0), self.view.bounds.size.height);
[UIView animateWithDuration:slideFactor * 2
delay:0
options:(UIViewAnimationOptionCurveEaseOut|
UIViewAnimationOptionAllowUserInteraction |
UIViewAnimationOptionBeginFromCurrentState)
animations:^{
recognizer.view.center = finalPoint;
}
completion:^(BOOL finished){
NSLog(#"Animation complete");
}];
}
}
// Currently, this method will detect when user touches
// myView's presentationLayer and will center that view under the finger
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint touchPoint = [touch locationInView:self.view];
if ([[myView.layer presentationLayer] hitTest:touchPoint])
{
NSLog(#"Touched!");
[myView.layer removeAllAnimations];
myView.center = touchPoint;
// *** NOTE ***
// At this point, I want the user to be able to fling
// the view around again without having to lift the finger
// and replace it on the view; i.e. when myView is moving
// and the user taps it, it should stop moving and follow
// user's pan gesture again
// Similar to unresolved SO question:
// https://stackoverflow.com/questions/13234234/ios-uipangesturerecognizer-drag-during-animation
}
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end

In case it helps anyone else, I've found a solution that seems to work pretty well, based on Thomas Rued/Jack's response to this SO question:
CAAnimation - User Input Disabled
I am using two UIPanGestureRecognizers to move the UIView -- one (gesture/handleGesture:) is attached to the UIView (myView), and another (bigGesture/superGesture:) is attached to that UIView's superview (bigView). The superview detects the position of the subview's presentation layer as it is animating.
The only issue (so far) is that the user can still grab the animated view by touching the ending location of myView's animation before the animation completes. Ideally, the presentation layer's location would be the only place where the user can interact with the view. If anyone has insight into preventing this premature interaction, it would be appreciated (one possible direction is sketched after the code below).
#import "FlingViewController.h"
#import <QuartzCore/QuartzCore.h>
@interface FlingViewController ()
@end
@implementation FlingViewController
@synthesize myView = _myView;
@synthesize bigView = _bigView;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
// *** USING A GESTURE RECOGNIZER ATTACHED TO THE SUPERVIEW APPROACH
// CAME FROM SO USER THOMAS RUED/JACK AT FOLLOWING LINK:
// https://stackoverflow.com/questions/7221688/caanimation-user-input-disabled
// Set up size for superView (bigView)
CGRect screenRect = [[UIScreen mainScreen] bounds];
// Allocate big view
bigView = [[UIView alloc] initWithFrame:(screenRect)];
[bigView setBackgroundColor:[UIColor greenColor]];
[self.view addSubview:bigView];
// Allocate and init myView
myView = [[UIView alloc] initWithFrame:CGRectMake(10, 10, 100, 100)];
// Set view background color
[myView setBackgroundColor:[UIColor purpleColor]];
// Create pan gesture recognizer
UIPanGestureRecognizer *gesture = [[UIPanGestureRecognizer alloc]
initWithTarget:self action:@selector(handleGesture:)];
UIPanGestureRecognizer *bigGesture = [[UIPanGestureRecognizer alloc]
initWithTarget:self action:@selector(superGesture:)];
// Add gesture recognizer to view
gesture.delegate = self;
bigGesture.delegate = self;
[myView addGestureRecognizer:gesture];
[bigView addGestureRecognizer:bigGesture];
// Add myView as subview of bigView
[bigView addSubview:myView];
NSLog(#"myView added");
}
// This gesture recognizer is attached to the superView,
// and will detect myView's presentation layer while animated
- (void)superGesture:(UIPanGestureRecognizer *)recognizer
{
CGPoint location = [recognizer locationInView:self.view];
for (UIView* childView in recognizer.view.subviews)
{
CGRect frame = [[childView.layer presentationLayer] frame];
if (CGRectContainsPoint(frame, location))
// ALTERNATE 'IF' STATEMENT; BOTH SEEM TO WORK:
//if ([[childView.layer presentationLayer] hitTest:location])
{
NSLog(#"location = %.2f, %.2f", location.x, location.y);
[childView.layer removeAllAnimations];
childView.center = location;
}
if ((recognizer.state == UIGestureRecognizerStateEnded) &&
(CGRectContainsPoint(frame, location)))
{
CGPoint velocity = [recognizer velocityInView:self.view];
CGFloat magnitude = sqrtf((velocity.x * velocity.x) + (velocity.y * velocity.y));
CGFloat slideMult = magnitude / 200;
float slideFactor = 0.1 * slideMult;
CGPoint finalPoint = CGPointMake(childView.center.x + (velocity.x * slideFactor),
childView.center.y + (velocity.y * slideFactor));
finalPoint.x = MIN(MAX(finalPoint.x, 0), self.view.bounds.size.width);
finalPoint.y = MIN(MAX(finalPoint.y, 0), self.view.bounds.size.height);
[UIView animateWithDuration:slideFactor * 2
delay:0
options:(UIViewAnimationOptionCurveEaseOut|
UIViewAnimationOptionAllowUserInteraction |
UIViewAnimationOptionBeginFromCurrentState)
animations:^{
childView.center = finalPoint;
}
completion:^(BOOL finished){
NSLog(#"Big animation complete");
}];
}
}
}
// This gesture recognizer is attached to myView,
// and will handle movement when myView is not animating
- (void)handleGesture:(UIPanGestureRecognizer *)recognizer
{
if ((recognizer.state == UIGestureRecognizerStateBegan) ||
(recognizer.state == UIGestureRecognizerStateChanged))
{
[recognizer.view.layer removeAllAnimations];
CGPoint translation = [recognizer translationInView:self.view];
recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
recognizer.view.center.y + translation.y);
[recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
else if (recognizer.state == UIGestureRecognizerStateEnded)
{
CGPoint velocity = [recognizer velocityInView:self.view];
CGFloat magnitude = sqrtf((velocity.x * velocity.x) + (velocity.y * velocity.y));
CGFloat slideMult = magnitude / 200;
float slideFactor = 0.1 * slideMult;
CGPoint finalPoint = CGPointMake(recognizer.view.center.x + (velocity.x * slideFactor),
recognizer.view.center.y + (velocity.y * slideFactor));
finalPoint.x = MIN(MAX(finalPoint.x, 0), self.view.bounds.size.width);
finalPoint.y = MIN(MAX(finalPoint.y, 0), self.view.bounds.size.height);
[UIView animateWithDuration:slideFactor * 2
delay:0
options:(UIViewAnimationOptionCurveEaseOut|
UIViewAnimationOptionAllowUserInteraction |
UIViewAnimationOptionBeginFromCurrentState)
animations:^{
recognizer.view.center = finalPoint;
}
completion:^(BOOL finished){
NSLog(#"Animation complete");
}];
}
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end
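One idea for the premature-interaction issue mentioned above (a sketch only, not something I have fully tested): since the view controller is already the delegate of both recognizers, the recognizer attached to myView could be asked to ignore touches that land on the model frame (which already sits at the animation's end point) unless they also hit the presentation layer:
// UIGestureRecognizerDelegate (the controller is already set as the delegate above).
// Only let the recognizer attached to myView receive touches that are over the
// view's on-screen (presentation layer) position.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    if (gestureRecognizer.view == myView)
    {
        CGPoint point = [touch locationInView:bigView];
        return ([[myView.layer presentationLayer] hitTest:point] != nil);
    }
    return YES;
}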

Related

Touch method UIImageView inside UIScrollView

I am wondering how I can use a touch method for a UIImageView inside a UIScrollView in Xcode.
When I add the UIImageView as a subview of self.view, the touch method works. But when I add the UIImageView as a subview of the UIScrollView, it doesn't. How can I solve this?
This is my code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
touches = [event allTouches];
for (UITouch *touch in touches) {
NSLog(#"Image Touched");
}
}
- (void)viewDidLoad
{
[super viewDidLoad];
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 44, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height * 0.9)];
scrollView.scrollEnabled = TRUE;
scrollView.bounces = TRUE;
scrollView.contentSize = CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
scrollView.userInteractionEnabled = YES;
[self.view addSubview:scrollView];
UIImageView *ImageView = [[UIImageView alloc] initWithFrame:CGRectMake([UIScreen mainScreen].bounds.size.width * 0.04, 10, [UIScreen mainScreen].bounds.size.width * 0.28, [UIScreen mainScreen].bounds.size.height * 0.22)];
ImageView.image = [UIImage imageNamed:@"image.png"];
ImageView.layer.borderColor = [UIColor whiteColor].CGColor;
ImageView.layer.borderWidth = 1;
ImageView.userInteractionEnabled = YES;
[scrollView addSubview:ImageView];
}
Give UIGestureRecognizers a try. They are far easier to manage when multiple layers of views are handling touches.
- (void)touchedImage:(UITapGestureRecognizer *)gesture {
// When the gesture has ended, perform your action.
if (gesture.state == UIGestureRecognizerStateEnded) {
NSLog(#"Touched Image");
}
}
- (void)viewDidLoad
{
[super viewDidLoad];
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 44, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height * 0.9)];
scrollView.scrollEnabled = TRUE;
scrollView.bounces = TRUE;
scrollView.contentSize = CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
scrollView.userInteractionEnabled = YES;
[self.view addSubview:scrollView];
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake([UIScreen mainScreen].bounds.size.width * 0.04, 10, [UIScreen mainScreen].bounds.size.width * 0.28, [UIScreen mainScreen].bounds.size.height * 0.22)];
imageView.image = [UIImage imageNamed:@"image.png"];
imageView.layer.borderColor = [UIColor whiteColor].CGColor;
imageView.layer.borderWidth = 1;
imageView.userInteractionEnabled = YES;
[scrollView addSubview:imageView];
// Create a tap gesture
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(touchedImage:)];
[imageView addGestureRecognizer:tap];
}
If you're dealing with multiple image views, the same gesture recognizer won't work for all of them; a gesture recognizer can only be attached to one view at a time. Try creating a new gesture recognizer per image view.
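A minimal sketch of that, assuming imageViews is an array you keep of your image views:
// A gesture recognizer can be attached to only one view at a time,
// so create a fresh one for each image view.
for (UIImageView *oneImageView in imageViews) {
    oneImageView.userInteractionEnabled = YES;
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(touchedImage:)];
    [oneImageView addGestureRecognizer:tap];
}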
User interaction is disabled by default on UIImageView; you can enable it by setting imageView.userInteractionEnabled = YES;
You'll have to derive a CustomScrollView from UIScrollView and override all of the touch methods (touchesBegan, touchesMoved, touchesEnded, and touchesCancelled), like below:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UIView *view in self.subviews)
{
UITouch *touch = [touches anyObject];
// locationInView: returns the point in the subview's own coordinate system, so test against its bounds
CGPoint pt = [touch locationInView:view];
if (CGRectContainsPoint(view.bounds, pt))
{
[view touchesMoved:touches withEvent:event];
}
}
}
After you do this, any view you add to an instance of CustomScrollView will receive its touches properly.
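In case it's useful, here is a minimal sketch of such a subclass; the class name CustomScrollView and the forwarding of the other three touch methods are my assumptions, since only touchesMoved: was shown above:
// CustomScrollView.h -- sketch of the subclass described above
#import <UIKit/UIKit.h>

@interface CustomScrollView : UIScrollView
@end

// CustomScrollView.m
#import "CustomScrollView.h"

@implementation CustomScrollView

// Find the subview (if any) whose area contains the touch.
- (UIView *)subviewForTouches:(NSSet *)touches
{
    UITouch *touch = [touches anyObject];
    for (UIView *view in self.subviews)
    {
        // locationInView: is in the subview's own coordinates, so test against bounds
        if (CGRectContainsPoint(view.bounds, [touch locationInView:view]))
        {
            return view;
        }
    }
    return nil;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[self subviewForTouches:touches] touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[self subviewForTouches:touches] touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[self subviewForTouches:touches] touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[self subviewForTouches:touches] touchesCancelled:touches withEvent:event];
}

@end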
I had the same problem with an image view as a subview in a scroll view when applying a transformation.
Upon scrolling, my gesture recognizer no longer worked, so I checked the offset of the scroll view and reset my image so the gesture was recognized.
And of course we need to set userInteractionEnabled to true and add the gesture in viewDidLoad():
override func viewDidLoad() {
super.viewDidLoad()
imageView.userInteractionEnabled = true
addGesture()
}
func addGesture(){
let singleTap = UITapGestureRecognizer(target: self, action: #selector(MyViewController.enableGesture(_:)));
singleTap.numberOfTapsRequired = 1
self.imageView.addGestureRecognizer(singleTap)
self.imageView.tag = 1
}
override func scrollViewDidScroll(scrollView: UIScrollView) {
let offset = scrollView.contentOffset.y
if offset > 0 {
imageView.layer.transform = avatarTransform
}
else {
imageView.transform = CGAffineTransformIdentity
}
}

Bounce UIImageView back when dragged off screen

What I need is for a UIImageView to bounce back when it is dragged off the screen and then let go. I have it working for the left and top sides; this is what I am doing.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if (!CGRectContainsPoint(self.view.frame, imageView.frame.origin)){
CGFloat newX = 0.0f;
CGFloat newY = 0.0f;
// If off screen upper and left
if (imageView.frame.origin.x < 0.0f){
CGFloat negX = imageView.frame.origin.x * -1;
newX = negX;
}else{
newX = imageView.frame.origin.x;
}
if (imageView.frame.origin.y < 0.0f){
CGFloat negY = imageView.frame.origin.y * -1;
newY = negY;
}else{
newY = imageView.frame.origin.y;
}
CGRect newPoint = CGRectMake(newX, newY, imageView.frame.size.width, imageView.frame.size.height);
[UIView beginAnimations:@"BounceAnimations" context:nil];
[UIView setAnimationDuration:.5];
[letterOutOfBounds play];
[imageView setFrame:newPoint];
[UIView commitAnimations];
}
}
So I would like to achieve the same thing for the right and bottom sides, but I have been stuck on this for a while. Any ideas?
How about something like the following?
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UIImageView *imageView = nil; // placeholder: substitute the image view you are dragging
BOOL moved = NO;
CGRect newPoint = imageView.frame;
// If off screen left
if (newPoint.origin.x < 0.0f){
newPoint.origin.x *= -1.0;
moved = YES;
}
// if off screen up
if (newPoint.origin.y < 0.0f){
newPoint.origin.y *= -1.0;
moved = YES;
}
// if off screen right
CGFloat howFarOffRight = (newPoint.origin.x + newPoint.size.width) - imageView.superview.frame.size.width;
if (howFarOffRight > 0.0)
{
newPoint.origin.x -= howFarOffRight * 2;
moved = YES;
}
// if off screen bottom
CGFloat howFarOffBottom = (newPoint.origin.y + newPoint.size.height) - imageView.superview.frame.size.height;
if (howFarOffBottom > 0.0)
{
newPoint.origin.y -= howFarOffBottom * 2;
moved = YES;
}
if (moved)
{
[UIView beginAnimations:@"BounceAnimations" context:nil];
[UIView setAnimationDuration:.5];
[letterOutOfBounds play];
[imageView setFrame:newPoint];
[UIView commitAnimations];
}
}
As I read your code, the logic is "if the view is off the left side, move it back onto the view by the same distance it was off the screen." To be honest, that doesn't quite make sense to me (why, when bouncing back, should the final coordinate depend upon how far off the screen the view was?), but I've tried to honor it in the "off screen right" and "off screen bottom" logic. Obviously my logic uses the superview of imageView to determine the width of the containing view, but if that's not appropriate, replace it with whatever is.
Edit:
I personally do this stuff with gesture recognizers, such as:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[self.imageView addGestureRecognizer:pan];
self.imageView.userInteractionEnabled = YES;
Thus, a gesture recognizer to animate moving the image back would be:
- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
static CGRect originalFrame; // you could make this an ivar if you want, but just for demonstration purposes
if (gesture.state == UIGestureRecognizerStateBegan)
{
originalFrame = self.imageView.frame;
}
else if (gesture.state == UIGestureRecognizerStateChanged)
{
CGPoint translate = [gesture translationInView:gesture.view];
CGRect newFrame = originalFrame;
newFrame.origin.x += translate.x;
newFrame.origin.y += translate.y;
gesture.view.frame = newFrame;
}
else if (gesture.state == UIGestureRecognizerStateEnded || gesture.state == UIGestureRecognizerStateCancelled)
{
CGRect newFrame = gesture.view.frame;
newFrame.origin.x = fmaxf(newFrame.origin.x, 0.0);
newFrame.origin.x = fminf(newFrame.origin.x, gesture.view.superview.bounds.size.width - newFrame.size.width);
newFrame.origin.y = fmaxf(newFrame.origin.y, 0.0);
newFrame.origin.y = fminf(newFrame.origin.y, gesture.view.superview.bounds.size.height - newFrame.size.height);
// animate however you want ... I generally just do animateWithDuration
[UIView animateWithDuration:0.5 animations:^{
gesture.view.frame = newFrame;
}];
}
}
Or, if you want a gesture recognizer that just prevents the dragging of the image off the screen in the first place, it would be:
- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
static CGRect originalFrame;
if (gesture.state == UIGestureRecognizerStateBegan)
{
originalFrame = self.imageView.frame;
}
else if (gesture.state == UIGestureRecognizerStateChanged)
{
CGPoint translate = [gesture translationInView:gesture.view];
CGRect newFrame = originalFrame;
newFrame.origin.x += translate.x;
newFrame.origin.x = fmaxf(newFrame.origin.x, 0.0);
newFrame.origin.x = fminf(newFrame.origin.x, gesture.view.superview.bounds.size.width - newFrame.size.width);
newFrame.origin.y += translate.y;
newFrame.origin.y = fmaxf(newFrame.origin.y, 0.0);
newFrame.origin.y = fminf(newFrame.origin.y, gesture.view.superview.bounds.size.height - newFrame.size.height);
gesture.view.frame = newFrame;
}
}
By the way, in iOS 7, you can give the animation of the image view back to its original location a little bounciness by using the new animateWithDuration variant with the usingSpringWithDamping and initialSpringVelocity parameters:
[UIView animateWithDuration:1.0
delay:0.0
usingSpringWithDamping:0.3
initialSpringVelocity:0.1
options:0
animations:^{
// set the new `frame` (or update the constraint constant values that
// will dictate the `frame` and call `layoutIfNeeded`)
}
completion:nil];
Alternatively, in iOS7, you can also use UIKit Dynamics to add a UISnapBehavior:
self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];
self.animator.delegate = self;
UISnapBehavior *snap = [[UISnapBehavior alloc] initWithItem:self.viewToAnimate snapToPoint:CGPointMake(self.viewToAnimate.center.x, self.view.frame.size.height - 50)];
// optionally, you can control how much bouncing happens when it finishes, e.g., for a lot of bouncing:
//
// snap.damping = 0.2;
// you can also slow down the snap by adding some resistance
//
// UIDynamicItemBehavior *resistance = [[UIDynamicItemBehavior alloc] initWithItems:@[self.viewToAnimate]];
// resistance.resistance = 20.0;
// resistance.angularResistance = 200.0;
// [self.animator addBehavior:resistance];
[self.animator addBehavior:snap];
I think the easiest way is to check whether your imageView has gone out of your self.view.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if (!CGRectContainsRect(self.view.frame, imageView.frame)){
// Your animation to bounce.
}
}
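For the "// Your animation to bounce" part, a minimal sketch (the clamping back inside self.view is my own simplification rather than the distance-based bounce above):
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!CGRectContainsRect(self.view.frame, imageView.frame)) {
        // Clamp the frame back inside the containing view and animate it there.
        CGRect newFrame = imageView.frame;
        newFrame.origin.x = fmaxf(newFrame.origin.x, 0.0f);
        newFrame.origin.x = fminf(newFrame.origin.x, self.view.bounds.size.width - newFrame.size.width);
        newFrame.origin.y = fmaxf(newFrame.origin.y, 0.0f);
        newFrame.origin.y = fminf(newFrame.origin.y, self.view.bounds.size.height - newFrame.size.height);
        [UIView animateWithDuration:0.5 animations:^{
            imageView.frame = newFrame;
        }];
    }
}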

Multiple images/buttons staying relative to their position on a zoomable image

I've made a map with a UIScrollView. I want to place small images or buttons onto the map and have them stay in the correct position relative to the map when you zoom in, and be able to tap them at any time. So when zoomed out, a button on Country A will still be on Country A when zoomed in, and will move off screen when you scroll away from that country while zoomed in. How could I go about doing this?
As I understand it, you want to place custom views on your own custom map, keeping their sizes the same while they move as you scroll or zoom the image view.
You have to place the views in the scroll view's superview and recalculate their positions when you zoom or scroll:
CustomMapViewController.h:
@interface CustomMapViewController : UIViewController <UIScrollViewDelegate>
{
UIScrollView *_scrollView;
UIImageView *_mapImageView;
NSArray *_customViews;
}
CustomMapViewController.m:
#import "CustomMapViewController.h"
enum {
kAddContactButton = 1,
kInfoDarkButton,
kInfoLightButton,
kLogoImage,
};
@implementation CustomMapViewController
- (void)dealloc
{
[_scrollView release]; _scrollView = nil;
[_mapImageView release]; _mapImageView = nil;
[_customViews release]; _customViews = nil;
[super dealloc];
}
- (void) loadView
{
[super loadView];
UIImageView *mapImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"map.png"]];
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
scrollView.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
scrollView.delegate = self;
scrollView.minimumZoomScale = 0.2;
scrollView.maximumZoomScale = 2.0;
[scrollView addSubview:mapImageView];
scrollView.contentSize = mapImageView.frame.size;
[self.view addSubview:scrollView];
_scrollView = scrollView;
_mapImageView = mapImageView;
// Add custom views
UIButton *btn1 = [UIButton buttonWithType:UIButtonTypeContactAdd];
btn1.tag = kAddContactButton;
[self.view addSubview:btn1];
UIButton *btn2 = [UIButton buttonWithType:UIButtonTypeInfoDark];
btn2.tag = kInfoDarkButton;
[self.view addSubview:btn2];
UIButton *btn3 = [UIButton buttonWithType:UIButtonTypeInfoLight];
btn3.tag = kInfoLightButton;
[self.view addSubview:btn3];
UIImageView *image = [[[UIImageView alloc] initWithImage:[UIImage imageNamed:@"logo.png"]] autorelease];
image.tag = kLogoImage;
[self.view addSubview:image];
_customViews = [[NSArray alloc] initWithObjects:btn1, btn2, btn3, image, nil];
[self _zoomToFit];
}
- (void) _zoomToFit
{
UIScrollView *scrollView = _scrollView;
CGFloat contentWidth = scrollView.contentSize.width;
CGFloat contentHeigth = scrollView.contentSize.height;
CGFloat viewWidth = scrollView.frame.size.width;
CGFloat viewHeight = scrollView.frame.size.height;
CGFloat width = viewWidth / contentWidth;
CGFloat heigth = viewHeight / contentHeigth;
CGFloat scale = MIN(width, heigth); // to fit
// CGFloat scale = MAX(width, heigth); // to fill
// May be should add something like this
if ( scale < _scrollView.minimumZoomScale ) {
_scrollView.minimumZoomScale = scale;
} else if ( scale > _scrollView.maximumZoomScale ) {
_scrollView.maximumZoomScale = scale;
}
_scrollView.zoomScale = scale;
}
////////////////////////////////////////////////////////////////////////////////////////
#pragma mark - Positions
- (void) _updatePositionForViews:(NSArray *)views
{
CGFloat scale = _scrollView.zoomScale;
CGPoint contentOffset = _scrollView.contentOffset;
for ( UIView *view in views ) {
CGPoint basePosition = [self _basePositionForView:view];
[self _updatePositionForView:view scale:scale basePosition:basePosition offset:contentOffset];
}
}
- (CGPoint) _basePositionForView:(UIView *)view
{
switch (view.tag) {
case kAddContactButton:
return CGPointMake(50.0, 50.0);
case kInfoDarkButton:
return CGPointMake(250.0, 250.0);
case kInfoLightButton:
return CGPointMake(450.0, 250.0);
case kLogoImage:
return CGPointMake(650.0, 450.0);
default:
return CGPointZero;
}
}
- (void) _updatePositionForView:(UIView *)view scale:(CGFloat)scale basePosition:(CGPoint)basePosition offset:(CGPoint)offset;
{
CGPoint position;
position.x = (basePosition.x * scale) - offset.x;
position.y = (basePosition.y * scale) - offset.y;
CGRect frame = view.frame;
frame.origin = position;
view.frame = frame;
}
//////////////////////////////////////////////////////////////////////////////////////
#pragma mark - UIScrollViewDelegate
- (void)scrollViewDidZoom:(UIScrollView *)scrollView
{
[self _updatePositionForViews:_customViews];
}
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
[self _updatePositionForViews:_customViews];
}
- (UIView *) viewForZoomingInScrollView:(UIScrollView *)scrollView;
{
return _mapImageView;
}
@end

Saving UIImageView and overlaid UIView as an image (Xcode/iOS)

I have an image loaded into a UIImageView, which has a UIView overlayed (with a CGRect drawn inside it), I am looking for a way to save what is displayed on the screen as a new image. I am using storyboards and ARC.
I have a UIViewController, which contains UIImageView. The UIView is displayed on top of this, and a circle is drawn where the user touches the screen. This all works fine, but now I want to save what is displayed as an image (JPG).
Screenshot of my storyboard:
Below is the code from my PhotoEditViewController so far. passedPhoto is the photo that is loaded into the UIImageView.
#import "PhotoEditViewController.h"
@interface PhotoEditViewController ()
@end
@implementation PhotoEditViewController
@synthesize selectedPhoto = _selectedPhoto;
@synthesize backButton;
@synthesize savePhoto;
- (void)viewDidLoad {
[super viewDidLoad];
[passedPhoto setImage:_selectedPhoto];
}
- (void)viewDidUnload {
[self setBackButton:nil];
[self setSavePhoto:nil];
[super viewDidUnload];
}
- (IBAction)savePhoto:(id)sender{
NSLog(#"save photo");
// this is where it needs to happen!
}
@end
Below is the code from my PhotoEditView which handles the creation of the circle overlay:
#import "PhotoEditView.h"
@implementation PhotoEditView
@synthesize myPoint = _myPoint;
- (void)setMyPoint:(CGPoint)myPoint {
_myPoint = myPoint;
self.backgroundColor = [UIColor colorWithWhite:1.0 alpha:0.01];
[self setNeedsDisplay];
}
- (id)initWithFrame:(CGRect)frame {
self = [super initWithFrame:frame];
return self;
}
-(void) touchesBegan: (NSSet *) touches withEvent: (UIEvent *) event {
NSSet *allTouches = [event allTouches];
switch ([allTouches count]){
case 1: {
UITouch *touch = [[allTouches allObjects] objectAtIndex:0];
CGPoint point = [touch locationInView:self];
NSLog(#"x=%f", point.x);
NSLog(#"y=%f", point.y);
self.myPoint = point;
}
break;
}
}
- (void)drawRect:(CGRect)rect {
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetLineWidth(ctx, 4.0);
CGContextSetRGBFillColor(ctx, 0, 0, 0, 0.5);
CGContextSetRGBStrokeColor(ctx, 1.0, 0.0, 0.0, 1);
CGContextStrokePath(ctx);
CGRect circlePoint = CGRectMake(self.myPoint.x - 50, self.myPoint.y - 50, 100.0, 100.0);
CGContextStrokeEllipseInRect(ctx, circlePoint);
}
@end
Any help would be appreciated.
Have you tried this:
UIGraphicsBeginImageContext(imageView.bounds.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
[customView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageView *imgView = [[UIImageView alloc] initWithImage:viewImage];
This is just the same approach as for getting an UIImage from a UIView, only you draw two views in the same context.
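In case it helps, folding that into the savePhoto: action from the question might look roughly like this (a sketch only; overlayView is a hypothetical outlet to the PhotoEditView instance, which isn't shown in the question, and it is assumed to share the same frame as passedPhoto, per the storyboard description):
- (IBAction)savePhoto:(id)sender
{
    // Render both layers (the photo and the circle overlay) into one bitmap.
    UIGraphicsBeginImageContextWithOptions(passedPhoto.bounds.size, NO, 0.0);
    [passedPhoto.layer renderInContext:UIGraphicsGetCurrentContext()];
    [overlayView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // JPEG data if you want to write the file yourself:
    // NSData *jpegData = UIImageJPEGRepresentation(combined, 0.9);

    // Or drop it straight into the photo library:
    UIImageWriteToSavedPhotosAlbum(combined, nil, nil, NULL);
}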

UIScroll view for comic App

I'm working on an iPad app for a comic, but I can't get my scroll view to work. I need it to do:
Pinch zoom
Rotation
Double tap and move through the comic strip
Here's an example of what I want to get: http://vimeo.com/16073699 (around second 35)
Right now I have pinch zoom, but I can't get the scroll view to rotate and center the image.
Here´s my controller code:
#define ZOOM_VIEW_TAG 100
#define ZOOM_STEP 2
#define PAGE_TIME 10
- (void)viewDidLoad{
[super viewDidLoad];
self.myImage.image = [UIImage imageNamed:@"Tira01.jpg"];
[self.myImage setTag:ZOOM_VIEW_TAG];
UITapGestureRecognizer *recognizer;
recognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(Tap)];
[recognizer setNumberOfTouchesRequired:1];
[recognizer setNumberOfTapsRequired:1];
[[self myImage] addGestureRecognizer:recognizer];
[recognizer release];
self.drecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleTap:)];
[(UITapGestureRecognizer *)drecognizer setNumberOfTouchesRequired:1];
[(UITapGestureRecognizer *)drecognizer setNumberOfTapsRequired:2];
[[self myImage] addGestureRecognizer:drecognizer];
[drecognizer release];
}
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
return [myScrollView viewWithTag:ZOOM_VIEW_TAG];
}
- (void)handleDoubleTap:(UIGestureRecognizer *)gestureRecognizer {
UIInterfaceOrientation currentOrientation = [[UIDevice currentDevice] orientation];
if(!self.zoomed)
{
// double tap zooms in
float newScale = [self.myScrollView zoomScale] * ZOOM_STEP;
CGRect zoomRect = [self zoomRectForScale:newScale withCenter:[gestureRecognizer locationInView:gestureRecognizer.view]];
[self.myScrollView zoomToRect:zoomRect animated:YES];
self.zoomed = YES;
}
else {
float newScale = [self.myScrollView zoomScale] / ZOOM_STEP;
CGRect zoomRect = [self zoomRectForScale:newScale withCenter:[gestureRecognizer locationInView:gestureRecognizer.view]];
[self.myScrollView zoomToRect:zoomRect animated:YES];
self.zoomed = NO;
}
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation{
return YES;
}
- (CGRect)zoomRectForScale:(float)scale withCenter:(CGPoint)center {
CGRect zoomRect;
// the zoom rect is in the content view's coordinates.
// At a zoom scale of 1.0, it would be the size of the imageScrollView's bounds.
// As the zoom scale decreases, so more content is visible, the size of the rect grows.
zoomRect.size.height = [self.myScrollView frame].size.height / scale;
zoomRect.size.width = [self.myScrollView frame].size.width / scale;
// choose an origin so as to get the right center.
zoomRect.origin.x = center.x - (zoomRect.size.width / 2.0);
zoomRect.origin.y = center.y - (zoomRect.size.height / 2.0);
return zoomRect;
}
In Interface Builder I have:
-> UIView(ScaleToFill mode)
--> UIScrollView (ScaleToFill mode)
---> UIImageView (AspectFit mode)
I don't know what I'm doing wrong, but it's driving me crazy :(
Regarding the UIScrollView centering the content: I might be wrong here, but the UIScrollView doesn't do that automatically.
In my app I'm doing it manually, by moving the content in the scroll view so that it is centered.
Just as a quick explanation of the code below: if the content (image) frame is smaller than the bounds of the scroll view, adjust the content's position horizontally and vertically so that it stays centered.
The code below is from my class derived from UIScrollView. If you're using the default UIScrollView, you can apply the same logic from the parent view or from the scroll view's delegate (a delegate-based sketch follows the code below).
- (void)layoutSubviews
{
[super layoutSubviews];
if (![self centersContent])
{
return;
}
// center the image as it becomes smaller than the size of the screen
CGSize boundsSize = self.bounds.size;
CGRect frameToCenter = imageView.frame;
// center horizontally
if (frameToCenter.size.width < boundsSize.width)
frameToCenter.origin.x = (boundsSize.width - frameToCenter.size.width) / 2;
else
frameToCenter.origin.x = 0;
// center vertically
if (frameToCenter.size.height < boundsSize.height)
frameToCenter.origin.y = (boundsSize.height - frameToCenter.size.height) / 2;
else
frameToCenter.origin.y = 0;
imageView.frame = frameToCenter;
}
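If you would rather not subclass UIScrollView, the same centering can be driven from the scroll view's delegate in the view controller from the question (a sketch; myImage is the image view property shown above):
// UIScrollViewDelegate: re-center the zooming image view whenever the zoom scale changes.
- (void)scrollViewDidZoom:(UIScrollView *)scrollView
{
    CGSize boundsSize = scrollView.bounds.size;
    CGRect frameToCenter = self.myImage.frame;
    // Center the image while it is smaller than the scroll view, otherwise pin it to the origin.
    frameToCenter.origin.x = (frameToCenter.size.width < boundsSize.width)
        ? (boundsSize.width - frameToCenter.size.width) / 2 : 0;
    frameToCenter.origin.y = (frameToCenter.size.height < boundsSize.height)
        ? (boundsSize.height - frameToCenter.size.height) / 2 : 0;
    self.myImage.frame = frameToCenter;
}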