Change Gesture Recogniser Position? (Objective-C)

I have a gesture recogniser to detect a tap on a UIImageView. When the iPad's orientation changes I force a position change of the image, but this causes the gesture recogniser to be repositioned incorrectly. How can I resolve this?
EDIT:
strapTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(strapTap:)];
[(UITapGestureRecognizer *)strapTap setNumberOfTapsRequired:1];
strapTap.delegate = self;
[leatherButtonedStrap addGestureRecognizer:strapTap];
The above is how I set up the gesture, adding it to my UIImageView. One thing that might matter: I add this view to my main view, remove it, and then re-add it when the user presses a certain button. It's hard to follow what's happening without seeing the entire class, but let me know if this is enough to go on.

Make sure that when you call locationInView: in strapTap:, you pass leatherButtonedStrap as the parameter:
CGPoint location = [recognizer locationInView:leatherButtonedStrap];
so that the location is relative to leatherButtonedStrap's coordinate system and not its superview's.
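Putting that together, the handler might look like this minimal sketch (the log statement is just for illustration):

```objc
- (void)strapTap:(UITapGestureRecognizer *)recognizer
{
    // Relative to the image view itself, so the coordinates stay valid
    // even after the view is repositioned on rotation.
    CGPoint location = [recognizer locationInView:leatherButtonedStrap];
    NSLog(@"Tapped at %@", NSStringFromCGPoint(location));
}
```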

Related

Detect swipe down and close View with animation Objective C

I have a search button that, on tap, opens a UIView covering most of the screen (like a UIAlertView). The UIView contains a UIImageView, two UITextViews, and a close UIButton. What I want to add now is a gesture recognizer that works like this: when the user drags the UIImageView down, it also drags the UIView and closes it.
I tried with the following code in order to detect the Swipe gesture:
UISwipeGestureRecognizer *swipeGesture = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeToDoMethod)];
[swipeGesture setDirection:UISwipeGestureRecognizerDirectionDown];
[[self popupImage] addGestureRecognizer: swipeGesture];
But I don't know how to proceed after this. Can someone point me in the right direction to get the effect I want?
You're missing the delegate:
....
[swipeGesture setDirection:UISwipeGestureRecognizerDirectionDown];
swipeGesture.delegate=self;
....
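For the close effect itself, one common approach is to animate the popup off-screen in the swipe handler and then remove it. A minimal sketch (the popupView property name is an assumption; adjust to your own view):

```objc
- (void)swipeToDoMethod
{
    [UIView animateWithDuration:0.3
                     animations:^{
                         // Slide the popup down past the bottom of the screen.
                         CGRect frame = self.popupView.frame;
                         frame.origin.y = CGRectGetHeight(self.view.bounds);
                         self.popupView.frame = frame;
                     }
                     completion:^(BOOL finished) {
                         // Tear the popup down once it is off-screen.
                         [self.popupView removeFromSuperview];
                     }];
}
```

If you want the view to track the finger while dragging rather than snap closed on a swipe, a UIPanGestureRecognizer is usually the better fit, since UISwipeGestureRecognizer only fires once the swipe completes.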

Add NSGestureRecognizer to MKMapView

I'm trying to add an NSPressGestureRecognizer to an MKMapView. It works, but unfortunately it has an unpleasant side effect.
NSPressGestureRecognizer *gp = [[NSPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPress:)];
[gp setMinimumPressDuration:1.0];
[mapView addGestureRecognizer:gp];
[self addPin]; //Adds a pin to the map
It works fine, and handleLongPress: is called. What isn't working is the ability to drag the pin on the map.
Whenever I click and hold to 'pick up' the pin, handleLongPress: is called instead. (Dragging works fine when I comment out the [mapView addGestureRecognizer:gp] line.)
Is it not possible to add a custom long-press gesture recogniser to an MKMapView without overriding the existing gesture recognizers?
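One thing worth trying (an untested sketch, not a confirmed fix): make your controller the recogniser's delegate and allow it to recognise simultaneously with the map's built-in recognisers, so your press doesn't pre-empt the pin drag.

```objc
// After creating gp, before adding it to the map view:
gp.delegate = self;

// In the class adopting NSGestureRecognizerDelegate:
- (BOOL)gestureRecognizer:(NSGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(NSGestureRecognizer *)otherGestureRecognizer
{
    // Let the map's own recognisers (including pin dragging) keep working.
    return YES;
}
```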

UIButton inside UIView doesn't respond to touch events

I've put a UIButton inside a custom UIView and the button is not receiving any touch events (it doesn't even enter the highlighted state, so my problem isn't just being unable to wire up a touch-up-inside handler). I've tried both placing it in the XIB in Interface Builder and adding the UIButton to the UIView programmatically; neither worked. All my views are inside a UIScrollView, so I first thought the UIScrollView might be blocking them. To rule that out, I added a button programmatically in exactly the same way I add my custom view to the UIScrollView, and that button worked, eliminating the possibility that the UIScrollView is the cause. My view's hierarchy is like this:
The button is over the image view, and the front layer doesn't completely cover my button, so there's no reason I shouldn't be able to physically interact with it. On my custom view's code side, I create the view like this:
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        UIView *sub = [[[NSBundle mainBundle] loadNibNamed:@"ProfileView" owner:self options:nil] objectAtIndex:0];
        [self addSubview:sub];
        [sub setUserInteractionEnabled:YES];
        [self setUserInteractionEnabled:YES];
        CALayer *layer = sub.layer;
        layer.masksToBounds = YES;
        layer.borderWidth = 5.0;
        layer.borderColor = [UIColor whiteColor].CGColor;
        layer.cornerRadius = 30.0;
        /*layer.shadowOffset = CGSizeZero;
        layer.shadowRadius = 20.0;
        layer.shadowColor = [[UIColor blackColor] CGColor];
        layer.shadowOpacity = 0.8;
        */
    }
    return self;
}
I've tried all combinations of setUserInteractionEnabled:, with no luck (and yes, I've also checked them in Interface Builder). I've also read in another question with a similar problem that I should override canBecomeFirstResponder to return YES, and I've done that too. But the problem persists: I can't tap the button. I haven't given the button any special properties or settings; it's just a regular one. The other objects in the view (the labels below, the image view behind the button, etc.) work properly. What could possibly be wrong here?
Thanks,
Can.
UPDATE: Here is a quick reproduction of the problem: https://dl.dropbox.com/u/79632924/Test.zip
Try to run and click the button.
Looking at the test project, I believe your problem is in the way you create TestView: you don't specify a frame for it, so the parent view has zero size. The subviews you see from the XIB extend outside the parent view's bounds and therefore never receive anything through the responder chain.
You should either specify the frame when creating TestView, or adjust the frame after loading the XIB file.
I have had this problem as well. The cause for me was that the UIButton's superview frame had a height of 0, so I believe that even though a touch was happening, it was not being passed down to the button.
After making sure the button's superview had a larger rectangle as its frame, the button's actions worked.
The root cause on my side was a faulty Auto Layout setup (I forgot to set the height constraint for the button's superview).
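A minimal sketch of that kind of fix, assuming the superview is built in code (the view name and constant are placeholders):

```objc
// Give the button's superview an explicit height so it actually
// occupies space and can receive (and forward) touches.
containerView.translatesAutoresizingMaskIntoConstraints = NO;
[containerView.heightAnchor constraintEqualToConstant:60.0].active = YES;
```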
I've found the solution. I was initializing my custom view as:
MyView *view = [[MyView alloc] init];
When I initialized it with a frame of my view's size instead, it started responding to events:
CGRect rect = CGRectMake(0, 0, width, height);
MyView *view = [[MyView alloc] initWithFrame:rect];
Storyboard Solution
Just for anyone wanting a solution to this when using storyboards and constraints.
Add a constraint between the superview (containing the button) and the UIButton with an equal heights constraint.
In my case, I had selected embed UIButton in a UIView with no inset on the storyboard. Adding the additional height constraint between the UIButton and the superview allowed the UIButton to respond to touches.
You can confirm the issue by starting the View Debugger and visually confirm that the superview of the UIButton is not selectable.
(Xcode 11; this should also work in earlier versions.)

I want to draw a focus indicator on camera view

I made an application using the camera.
I didn't use the default camera toolbar, so I added a toolbar to the camera overlay view.
I want to draw a tap-to-focus indicator, so I used UITapGestureRecognizer.
Here is the code:
UITapGestureRecognizer *focusRect = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(drawFocusRect:)];
focusRect.numberOfTapsRequired = 1;
focusRect.numberOfTouchesRequired = 1;
[cameraPicker.view addGestureRecognizer:focusRect];
[focusRect release];
In the drawFocusRect: method I have:
NSLog(@"tapped");
But it doesn't fire.
When I changed numberOfTouchesRequired to 2, it works.
So it seems the recogniser doesn't fire for a single tap.
How can I implement this single-tap gesture handling?
I think the camera view's own focus view sits on top of all the other views, and it already has a single-tap action of its own, which is why your single-tap recogniser never fires. If you want to do this, you should look at the AVCam demo from WWDC 2010.
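For the drawing part itself, once the tap does come through, the handler could look like this rough sketch (the box size, colour, and timings are made up; the question's code uses manual retain/release, so add the matching release calls if you're not on ARC):

```objc
- (void)drawFocusRect:(UITapGestureRecognizer *)recognizer
{
    CGPoint tapPoint = [recognizer locationInView:recognizer.view];

    // Draw a bordered square centred on the tap, then fade it out.
    UIView *focusBox = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 80, 80)];
    focusBox.center = tapPoint;
    focusBox.backgroundColor = [UIColor clearColor];
    focusBox.layer.borderColor = [UIColor yellowColor].CGColor;
    focusBox.layer.borderWidth = 2.0;
    [recognizer.view addSubview:focusBox];

    [UIView animateWithDuration:0.5
                          delay:0.5
                        options:0
                     animations:^{ focusBox.alpha = 0.0; }
                     completion:^(BOOL finished) { [focusBox removeFromSuperview]; }];
}
```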

Touch event of a picture without using UIButton

Is it possible to detect a touch on a picture without using a UIButton? And if it is, how can I do this?
All I can think of is changing the background of a button in code and then using an IBAction to detect the touch.
By the way, sorry for my English.
Sure. You can put a UITapGestureRecognizer on a UIImageView (remember to set the image view's userInteractionEnabled property to YES):
// assumes "myImageView" is your UIImageView that has user interaction enabled
UITapGestureRecognizer *tap = [[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tap:)] autorelease];
[myImageView addGestureRecognizer:tap];
later in your controller...
- (void)tap:(UITapGestureRecognizer *)recognizer
{
// whatever you want
}
Alternatively, you can set up a UIButton to act as a dumb image: just set the background image for the normal state, disable all the adjustments on highlighting and touches, and you should have exactly what you're thinking of.
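That alternative might look like this sketch (the image name and action selector are placeholders):

```objc
// A button that just shows a picture and reports taps.
UIButton *imageButton = [UIButton buttonWithType:UIButtonTypeCustom];
imageButton.frame = CGRectMake(0, 0, 100, 100);
[imageButton setBackgroundImage:[UIImage imageNamed:@"myPicture"]
                       forState:UIControlStateNormal];
// Don't dim the image on touch, so it behaves like a plain picture.
imageButton.adjustsImageWhenHighlighted = NO;
[imageButton addTarget:self
                action:@selector(pictureTapped:)
      forControlEvents:UIControlEventTouchUpInside];
```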
If I've misunderstood what you're getting at, feel free to clarify.