UIPanGestureRecognizer on multiple views - Objective-C

I have 2 imageviews that I want to be able to move. This is my code:
panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(gestureRecognizerMethod:)];
imageview = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"image.png"]];
imageview.frame = CGRectMake(390, 100, 80, 16);
imageview.userInteractionEnabled = YES;
[imageview addGestureRecognizer:panRecognizer];
[self addSubview:imageview];
imageview2 = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"image.png"]];
imageview2.frame = CGRectMake(390, 120, 80, 17);
imageview2.userInteractionEnabled = YES;
[imageview2 addGestureRecognizer:panRecognizer];
[self addSubview:imageview2];
And in my gestureRecognizer method:
- (void)gestureRecognizerMethod:(UIPanGestureRecognizer *)recognizer
{
if (recognizer.view == imageview)
{
if (recognizer.state == UIGestureRecognizerStateBegan || recognizer.state == UIGestureRecognizerStateChanged)
{
CGPoint startPoint = [recognizer locationInView:self];
imageview.center = startPoint;
}
}
if (recognizer.view == imageview2)
{
if (recognizer.state == UIGestureRecognizerStateBegan || recognizer.state == UIGestureRecognizerStateChanged)
{
CGPoint startPoint = [recognizer locationInView:self];
imageview2.center = startPoint;
}
}
}
I'm only able to move one of the views. What is wrong?

Create a second gesture recognizer for the second image view. You can use the same target and action for both recognizers, but any one recognizer can only be attached to a single view.
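For illustration, a minimal sketch, assuming the same imageview/imageview2 ivars and the gestureRecognizerMethod: action from the question:
UIPanGestureRecognizer *panRecognizer1 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(gestureRecognizerMethod:)];
[imageview addGestureRecognizer:panRecognizer1];
UIPanGestureRecognizer *panRecognizer2 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(gestureRecognizerMethod:)];
[imageview2 addGestureRecognizer:panRecognizer2];
Inside the action method, recognizer.view tells you which image view is being dragged, so the existing checks against imageview and imageview2 keep working unchanged.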

Try this:
ResizeImage on GitHub

Related

UIAttachmentBehaviour works incorrectly

I would like to drag one UIView and have it move the other views attached to it, as if they were all connected by string. I added two attachment behaviours: one between my two views, and one between the draggable view and an anchor point. It works, but my views continue to move even after the gesture has ended.
What am I doing wrong?
Here is my code:
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
self.views = [[NSMutableArray alloc] init];
self.v2 = [[UIView alloc] initWithFrame:CGRectMake(50, 450, 200, 200)];
self.v2.backgroundColor = [UIColor colorWithRed:34.0f/255.0f green:167.0f/255.0f blue:240.0f/255.0f alpha:1.0];
[self.view addSubview:self.v2];
self.v1 = [[UIView alloc] initWithFrame:CGRectMake(150, 450, 200, 200)];
self.v1.backgroundColor = [UIColor colorWithRed:210.0f/255.0f green:82.0f/255.0f blue:127.0f/255.0f alpha:1.0];
[self.view addSubview:self.v1];
self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];
self.gravity = [[UIGravityBehavior alloc]initWithItems:@[self.v1, self.v2]];
self.gravity.gravityDirection = CGVectorMake(0.0, 0.0);
[_animator addBehavior:_gravity];
_collision = [[UICollisionBehavior alloc]initWithItems:@[self.v1]];
_collision.collisionDelegate = self;
_collision.translatesReferenceBoundsIntoBoundary = YES;
[_animator addBehavior:_collision];
UIDynamicItemBehavior *itemBehaviour = [[UIDynamicItemBehavior alloc] initWithItems:@[self.v1]];
itemBehaviour.elasticity = 0.6;
itemBehaviour.allowsRotation = NO;
[_animator addBehavior:itemBehaviour];
UIPanGestureRecognizer *gesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[self.v1 addGestureRecognizer:gesture];
UIAttachmentBehavior *attach = [[UIAttachmentBehavior alloc] initWithItem:self.v2 attachedToItem:self.v1];
attach.damping = 0.50f;
attach.length = 150.0f;
attach.frequency = 1.9;
[_animator addBehavior:attach];
}
-(void)handlePan:(UIPanGestureRecognizer *)gesture
{
CGPoint touchPoint = [gesture locationInView:self.view];
static float y ;
if (gesture.state == UIGestureRecognizerStateBegan) {
self.attachmentBehavior = [[UIAttachmentBehavior alloc] initWithItem:self.v1 attachedToAnchor:touchPoint];
self.attachmentBehavior.damping = 0.50f;
self.attachmentBehavior.frequency = 1.9;
[_animator addBehavior:self.attachmentBehavior];
y = self.v1.center.y;
} else if (gesture.state == UIGestureRecognizerStateChanged ) {
CGPoint center = self.v1.center;
center.x = touchPoint.x;
center.y = y;
self.attachmentBehavior.anchorPoint = center;
} else if (gesture.state == UIGestureRecognizerStateEnded) {
[self.animator removeBehavior:self.attachmentBehavior];
}
}
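One thing worth noting about the handler above: when the pan ends it removes only the anchor attachment, while the spring attachment between v1 and v2 (damping 0.5, frequency 1.9) keeps oscillating with whatever velocity the items still have. A hedged sketch of one common way to settle that residual motion, by adding resistance in the UIGestureRecognizerStateEnded branch (the dampingBehaviour name is hypothetical; the animator and views are the ones from viewDidLoad above):
// Added alongside [self.animator removeBehavior:self.attachmentBehavior];
UIDynamicItemBehavior *dampingBehaviour = [[UIDynamicItemBehavior alloc] initWithItems:@[self.v1, self.v2]];
dampingBehaviour.resistance = 8.0; // bleeds off the leftover spring velocity so the views come to rest
[self.animator addBehavior:dampingBehaviour];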

Pinch zooming in an image viewer

I have a UIImageView and I need pinch zooming.
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 90, 320, 38)];
[imageView setImage:[UIImage imageNamed:@"accesspanel.png"]];
[self.view addSubview: imageView];
You can add the image view inside a scroll view and use the scroll view's delegate for this purpose:
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
return self.imageView;
}
- (void)viewDidLoad
{
[super viewDidLoad];
self.scrollView.minimumZoomScale=0.5;
self.scrollView.maximumZoomScale=6.0;
self.scrollView.contentSize=CGSizeMake(1280, 960);
self.scrollView.delegate=self;
}
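If the scroll view and image view aren't wired up in a storyboard, here is a minimal programmatic sketch under the same assumptions (the scrollView and imageView property names are carried over from the answer; the view controller is assumed to adopt UIScrollViewDelegate):
self.imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"accesspanel.png"]];
self.scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
self.scrollView.minimumZoomScale = 0.5;
self.scrollView.maximumZoomScale = 6.0;
self.scrollView.contentSize = self.imageView.bounds.size;
self.scrollView.delegate = self;
[self.scrollView addSubview:self.imageView];
[self.view addSubview:self.scrollView];
With viewForZoomingInScrollView: returning the image view, the pinch-to-zoom behaviour comes from UIScrollView itself; no extra gesture recognizer is needed.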
First, add a pinch gesture recognizer to your image view:
UIPinchGestureRecognizer *pgr = [[UIPinchGestureRecognizer alloc]
initWithTarget:self action:@selector(handlePinchGesture:)];
pgr.delegate = self;
[imageView addGestureRecognizer:pgr];
Second, implement the pinch handler:
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer {
if([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
// Reset the last scale, necessary if there are multiple objects with different scales.
lastScale = [gestureRecognizer scale];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
[gestureRecognizer state] == UIGestureRecognizerStateChanged) {
CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];
// Constants to adjust the max/min values of zoom.
const CGFloat kMaxScale = 2.0;
const CGFloat kMinScale = 1.0;
CGFloat newScale = 1 - (lastScale - [gestureRecognizer scale]); // new scale is in the range (0-1)
newScale = MIN(newScale, kMaxScale / currentScale);
newScale = MAX(newScale, kMinScale / currentScale);
CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
[gestureRecognizer view].transform = transform;
lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call
}
}
UIPinchGestureRecognizer *pinchGestureRecognizer=[[UIPinchGestureRecognizer alloc]initWithTarget:self action:@selector(pinchGestureDetected:)];
[pinchGestureRecognizer setDelegate:self];
[_third_imageview addGestureRecognizer:pinchGestureRecognizer];
- (void)pinchGestureDetected:(UIPinchGestureRecognizer *)recognizer
{
UIGestureRecognizerState state = [recognizer state];
if (state == UIGestureRecognizerStateBegan || state == UIGestureRecognizerStateChanged)
{
CGFloat scale = [recognizer scale];
[recognizer.view setTransform:CGAffineTransformScale(recognizer.view.transform, scale, scale)];
[recognizer setScale:1.0];
}
}

Touch method UIImageView inside UIScrollView

I am wondering how I can use a touch method for a UIImageView inside a UIScrollView in Xcode.
When I add the UIImageView as a subview of self.view, the touch method works. But when I add the UIImageView as a subview of the UIScrollView, it doesn't. How can I solve this?
This is my code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
touches = [event allTouches];
for (UITouch *touch in touches) {
NSLog(#"Image Touched");
}
}
- (void)viewDidLoad
{
[super viewDidLoad];
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 44, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height * 0.9)];
scrollView.scrollEnabled = TRUE;
scrollView.bounces = TRUE;
scrollView.contentSize = CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
scrollView.userInteractionEnabled = YES;
[self.view addSubview:scrollView];
UIImageView *ImageView = [[UIImageView alloc] initWithFrame:CGRectMake([UIScreen mainScreen].bounds.size.width * 0.04, 10, [UIScreen mainScreen].bounds.size.width * 0.28, [UIScreen mainScreen].bounds.size.height * 0.22)];
ImageView.image = [UIImage imageNamed:@"image.png"];
ImageView.layer.borderColor = [UIColor whiteColor].CGColor;
ImageView.layer.borderWidth = 1;
ImageView.userInteractionEnabled = YES;
[scrollView addSubview:ImageView];
}
Give UIGestureRecognizers a try. They are far easier to manage when multiple layers of views are handling touches.
- (void)touchedImage:(UITapGestureRecognizer *)gesture {
// When the gesture has ended, perform your action.
if (gesture.state == UIGestureRecognizerStateEnded) {
NSLog(#"Touched Image");
}
}
- (void)viewDidLoad
{
[super viewDidLoad];
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 44, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height * 0.9)];
scrollView.scrollEnabled = TRUE;
scrollView.bounces = TRUE;
scrollView.contentSize = CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
scrollView.userInteractionEnabled = YES;
[self.view addSubview:scrollView];
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake([UIScreen mainScreen].bounds.size.width * 0.04, 10, [UIScreen mainScreen].bounds.size.width * 0.28, [UIScreen mainScreen].bounds.size.height * 0.22)];
imageView.image = [UIImage imageNamed:#"image.png"];
imageView.layer.borderColor = [UIColor whiteColor].CGColor;
imageView.layer.borderWidth = 1;
imageView.userInteractionEnabled = YES;
[scrollView addSubview:imageView];
// Create a tap gesture
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(touchedImage:)];
[imageView addGestureRecognizer:tap];
}
If you're attaching the same gesture recognizer to multiple image views, it won't work. Use a new gesture recognizer per image view.
User interaction on a UIImageView is disabled by default; you can enable it by setting imageView.userInteractionEnabled = YES;
You'll have to derive a CustomScrollView from UIScrollView and override all the touch methods (touchesBegan, touchesMoved, touchesEnded and touchesCancelled) like below:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UIView *view in self.subviews)
{
UITouch *touch = [touches anyObject];
CGPoint pt = [touch locationInView:view];
// pt is in the subview's own coordinate space, so test it against bounds rather than frame
if (CGRectContainsPoint(view.bounds, pt))
{
[view touchesMoved:touches withEvent:event];
}
}
}
After you do this, any view you add to an instance of CustomScrollView will receive its touches properly.
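For illustration, a hedged sketch of what one of the remaining overrides could look like under the same forwarding pattern (only touchesBegan is shown; touchesEnded and touchesCancelled would mirror it):
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UIView *view in self.subviews)
{
UITouch *touch = [touches anyObject];
CGPoint pt = [touch locationInView:view];
if (CGRectContainsPoint(view.bounds, pt))
{
[view touchesBegan:touches withEvent:event];
}
}
}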
I had the same problem with an image view as a subview in a scroll view when applying a transformation.
Upon scroll my gesture recognizer no longer worked so I checked the offset of the scroll view and reset my image so the gesture was recognized.
And of course we need to set userInteractionEnabled to true and add the gesture in viewDidLoad():
override func viewDidLoad() {
super.viewDidLoad()
imageView.userInteractionEnabled = true
addGesture()
}
func addGesture(){
let singleTap = UITapGestureRecognizer(target: self, action: #selector(MyViewController.enableGesture(_:)));
singleTap.numberOfTapsRequired = 1
self.imageView.addGestureRecognizer(singleTap)
self.imageView.tag = 1
}
override func scrollViewDidScroll(scrollView: UIScrollView) {
let offset = scrollView.contentOffset.y
if offset > 0 {
imageView.layer.transform = avatarTransform
}
else {
imageView.transform = CGAffineTransformIdentity
}
}

Add tap-to-focus functionality to ZBarReaderViewController

I am using the ZBar SDK to read barcodes. It works fine, but I have added an overlay view to the ZBarReaderViewController (reader in my code), and now that I am trying to add tap-to-focus functionality it crashes. Below is my code. Thanks in advance; any ideas would be appreciated.
-(IBAction)scanBarCode:(id)sender
{
barcodeClicked = 1;
NSLog(#"TBD: scan barcode here...");
// ADD: present a barcode reader that scans from the camera feed
reader = [ZBarReaderViewController new];
reader.readerDelegate = self;
reader.supportedOrientationsMask = ZBarOrientationMaskAll;
ZBarImageScanner *scanner = reader.scanner;
// TODO: (optional) additional reader configuration here
// EXAMPLE: disable rarely used I2/5 to improve performance
[scanner setSymbology: ZBAR_I25
config: ZBAR_CFG_ENABLE
to: 0];
reader.showsZBarControls = NO;
UIView *ovlView = [[UIView alloc] init];
[ovlView setFrame:CGRectMake(0, 0, 320, 480)];
[ovlView setBackgroundColor:[UIColor clearColor]];
UIImageView *leftBracket = [[UIImageView alloc] init];
[leftBracket setFrame:CGRectMake(21, 100, 278, 15)];
[leftBracket setImage:[UIImage imageNamed:@"TopBracket.png"]];
UIImageView *rightBracket = [[UIImageView alloc] init];
[rightBracket setFrame:CGRectMake(21, 240, 278, 15)];
[rightBracket setImage:[UIImage imageNamed:@"BottomBracket.png"]];
UIToolbar *bottomBar = [[UIToolbar alloc] init];
[bottomBar setBarStyle:UIBarStyleBlackOpaque];
if(UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone && [UIScreen mainScreen].bounds.size.height * [UIScreen mainScreen].scale >= 1136)
{
[bottomBar setFrame:CGRectMake(0, 524, 320, 44)];
}
else
[bottomBar setFrame:CGRectMake(0, 436, 320, 44)];
UIBarButtonItem *cancel = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemCancel target:self action:@selector(cancelCamera)];
/*UIBarButtonItem *flexItem = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemFlexibleSpace
target:nil
action:nil];*/
//UIBarButtonItem *infoButton = [[UIBarButtonItem alloc] initWithTitle:@" Info " style:UIBarButtonItemStyleBordered target:self action:@selector(infoButton)];
/*UIButton *info = [UIButton buttonWithType:UIButtonTypeInfoLight];
[info addTarget:self action:@selector(infoButton) forControlEvents:UIControlEventTouchUpInside];
UIBarButtonItem *infoButton = [[UIBarButtonItem alloc] initWithCustomView:info];
[bottomBar setItems:[NSArray arrayWithObjects:cancel,flexItem,infoButton, nil]];*/
[bottomBar setItems:[NSArray arrayWithObjects:cancel, nil]];
[ovlView addSubview:leftBracket];
[ovlView addSubview:rightBracket];
[ovlView addSubview:bottomBar];
reader.cameraOverlayView = ovlView;
// present and release the controller
[self presentModalViewController:reader
animated: YES];
[reader release];
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UIView * previewView = [[[[[[[[[[
reader.view // UILayoutContainerView
subviews] objectAtIndex:0] // UINavigationTransitionView
subviews] objectAtIndex:0] // UIViewControllerWrapperView
subviews] objectAtIndex:0] // UIView
subviews] objectAtIndex:0] // PLCameraView
subviews] objectAtIndex:0]; // PLPreviewView
[previewView touchesBegan:touches withEvent:event];
}
Thanks for MacN00b's answer! It pointed me in the right direction, and I have implemented tap-to-focus for the ZBar view controller. Here is the idea:
Assign a custom view to the ZBarReaderViewController's cameraOverlayView. Then add a UITapGestureRecognizer to the custom view to catch the tap, get the touch point, and make the camera focus on that point. You may also want to draw a little rectangle around the touch point (that is what I did).
Here is the code (assigning the custom view to cameraOverlayView):
UIView *view = [[UIView alloc] init];
UITapGestureRecognizer* tapScanner = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(focusAtPoint:)];
[view addGestureRecognizer:tapScanner];
reader.cameraOverlayView = view;
Then, in the focusAtPoint: selector:
- (void)focusAtPoint:(id) sender{
CGPoint touchPoint = [(UITapGestureRecognizer*)sender locationInView:_reader.cameraOverlayView];
double focus_x = touchPoint.x/_reader.cameraOverlayView.frame.size.width;
double focus_y = (touchPoint.y+66)/_reader.cameraOverlayView.frame.size.height;
NSError *error;
NSArray *devices = [AVCaptureDevice devices];
for (AVCaptureDevice *device in devices){
NSLog(#"Device name: %#", [device localizedName]);
if ([device hasMediaType:AVMediaTypeVideo]) {
if ([device position] == AVCaptureDevicePositionBack) {
NSLog(#"Device position : back");
CGPoint point = CGPointMake(focus_y, 1-focus_x);
if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]){
[device setFocusPointOfInterest:point];
CGRect rect = CGRectMake(touchPoint.x-30, touchPoint.y-30, 60, 60);
UIView *focusRect = [[UIView alloc] initWithFrame:rect];
focusRect.layer.borderColor = [UIColor whiteColor].CGColor;
focusRect.layer.borderWidth = 2;
focusRect.tag = 99;
[_reader.cameraOverlayView addSubview:focusRect];
[NSTimer scheduledTimerWithTimeInterval: 1
target: self
selector: #selector(dismissFocusRect)
userInfo: nil
repeats: NO];
[device setFocusMode:AVCaptureFocusModeAutoFocus];
[device unlockForConfiguration];
}
}
}
}
}
I add a white rectangle around the touch point and then use the dismissFocusRect selector to remove it. Here is the code:
- (void) dismissFocusRect{
for (UIView *subView in _reader.cameraOverlayView.subviews)
{
if (subView.tag == 99)
{
[subView removeFromSuperview];
}
}
}
I hope this helps!
Look at Apple's documentation, in the "Focus Modes" section: https://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html It covers how to implement tap-to-focus properly. I would try to implement it along these lines:
CGRect screenRect = [[UIScreen mainScreen] bounds];
screenWidth = screenRect.size.width;
screenHeight = screenRect.size.height;
double focus_x = thisFocusPoint.center.x/screenWidth;
double focus_y = thisFocusPoint.center.y/screenHeight;
[[self captureManager].videoDevice lockForConfiguration:&error];
[[self captureManager].videoDevice setFocusPointOfInterest:CGPointMake(focus_x,focus_y)];
Well, if you are using that view controller, how about adding a method like this? It should be fine to implement in the barcode view controller.
- (void) focusAtPoint:(CGPoint)point
{
AVCaptureDevice *device = [[self videoInput] device];
if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
NSError *error;
if ([device lockForConfiguration:&error]) {
[device setFocusPointOfInterest:point];
[device setFocusMode:AVCaptureFocusModeAutoFocus];
[device unlockForConfiguration];
} else {
id delegate = [self delegate];
if ([delegate respondsToSelector:#selector(acquiringDeviceLockFailedWithError:)]) {
[delegate acquiringDeviceLockFailedWithError:error];
}
}
}
}
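For completeness, a hedged usage sketch showing how this method might be driven from a tap gesture on the overlay view, converting the tap location into the normalized (0-1) coordinate space that setFocusPointOfInterest: expects (the handleFocusTap: name is hypothetical; the axis swap mirrors the CGPointMake(focus_y, 1-focus_x) mapping used for the back camera in the earlier answer):
- (void)handleFocusTap:(UITapGestureRecognizer *)tap
{
CGPoint p = [tap locationInView:tap.view];
// The focus point of interest is normalized to the sensor's native (landscape) orientation,
// hence the axis swap relative to the portrait overlay view.
CGPoint focusPoint = CGPointMake(p.y / tap.view.bounds.size.height,
1.0 - p.x / tap.view.bounds.size.width);
[self focusAtPoint:focusPoint];
}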

How can I detect a touch event on a custom UIImageView?

I am using a custom UIImageView and trying to detect touches, but I am unable to detect a touch on that particular image view.
-(void)viewDidLoad {
[super viewDidLoad];
mainView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 200)];
image = [[UIImageView alloc] initWithFrame:CGRectMake(10, 300, 100, 100)];
image.image = [UIImage imageNamed:#"CyanSquare.png"];
image2 = [[UIImageView alloc] initWithFrame:CGRectMake(150, 600, 100, 100)];
image2.image = [UIImage imageNamed:#"CyanSquare.png"];
image.tag = 1;
image2.tag = 2;
[self.mainView addSubview:image];
[self.mainView addSubview:image2];
[self.view addSubview:mainView];
}
-(void)touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
UITouch *touch = [[event allTouches] anyObject];
[super touchesBegan:touches withEvent:event];
if ([touch view] == [image viewWithTag:1]) {
NSLog(#"touch beggin 1");
mainView.image = [UIImage imageNamed:#"VideoBkGround.png"];
}
if ([touch view] == [image2 viewWithTag:2])
{
NSLog(#"touch beggin 2");
mainView.image = [UIImage imageNamed:#"VideoRunning.png"];
}
}
With this code I cannot detect the touch. Please help me.
Without the custom views, touches are detected.
You need to enable user interaction for UIImageViews you want to interact with.
#property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled
Something like this:
image.userInteractionEnabled = YES;
image2.userInteractionEnabled = YES;
Otherwise the image views will not receive touches.
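For illustration, a hedged sketch of the same idea using tap gesture recognizers instead of overriding touchesBegan: (the imageTapped: name is hypothetical; image, image2 and mainView are the ivars from the question, with tags 1 and 2 already assigned there):
// In viewDidLoad, after the image views are created:
image.userInteractionEnabled = YES;
image2.userInteractionEnabled = YES;
[image addGestureRecognizer:[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)]];
[image2 addGestureRecognizer:[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)]];
// The shared action distinguishes the two image views by the tag set in the question:
- (void)imageTapped:(UITapGestureRecognizer *)gesture
{
if (gesture.view.tag == 1) {
mainView.image = [UIImage imageNamed:@"VideoBkGround.png"];
} else if (gesture.view.tag == 2) {
mainView.image = [UIImage imageNamed:@"VideoRunning.png"];
}
}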