Get UIEvent to cancel UIGestureRecognizer - objective-c

I have a simple swipe gesture that I want to be very low priority. I want it to be cancelled by events that happen for controls on the view. At first, I thought this would be simple. It's easy to cancel events when a gesture happens, but I can't seem to do the opposite.
My solution is to cancel the gesture if it conflicts with anything that is touchable. Here is the code I hacked together:
    - (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
    {
        CGPoint touch = [gestureRecognizer locationInView:self.view];
        return [self.view hitTest:touch withEvent:nil] == self.view;
    }
I feel this is the wrong solution to the problem. What am I missing? What is the right way to get events to cancel gestures?
For more context, I have two UISwipeGestureRecognizers (swipe left and swipe right) added to the view. There is also a UISlider in the view (part of an embedded MPVolumeView). When I moved the slider to change the volume, the left or right swipe would fire.

This is the correct way to do what you want. You are telling the gesture recognizer that it should only begin if the touch is directly in the view, not in any subview (according to hitTest:, which is good because it lets the views decide whether they are hit or not).
It's always better to prevent it from starting rather than trying to cancel it afterwards. However, if you do want to cancel the gesture after it has started, set enabled = NO and then back to YES again.
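For example, a minimal sketch of cancelling an in-flight recognizer this way (swipeLeft is a stand-in name for one of your recognizers):
    // Toggling enabled forces a recognizer that is currently tracking
    // touches into the cancelled state; it will recognize again afterwards.
    self.swipeLeft.enabled = NO;
    self.swipeLeft.enabled = YES;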
If you need to allow the gesture for some subviews but not controls you can test if the view returned by hitTest: is a subclass of UIControl (for example) using isKindOfClass:.
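A sketch of that check, reusing the delegate method from the question (the UIControl test is one plausible filter, not the only one):
    - (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
    {
        CGPoint touch = [gestureRecognizer locationInView:self.view];
        UIView *hitView = [self.view hitTest:touch withEvent:nil];
        // Let the swipe begin anywhere except over a control (e.g. the UISlider).
        return ![hitView isKindOfClass:[UIControl class]];
    }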
I don't know what type of gesture this is for, but usually you don't need to do this, because UIKit will automatically find the deepest view that wants the touches, and that view will 'eat' them so outer gesture recognizers won't get them. However, I can imagine this doesn't hold true for some combinations of recognizer and control.

Same view controller pops multiple times

Required:
I want to enable iOS7 swipe to back feature with custom navigation back button item.
Current Implementation:
After researching a lot, I found the following solution to be best:
Set the delegate of the gesture recognizer as follows:
    self.navigationController.interactivePopGestureRecognizer.delegate = (id<UIGestureRecognizerDelegate>)self;
This creates a lot of bugs, as mentioned in this Stack Overflow answer. To avoid them, subclassing UINavigationController seems to be the only feasible option. I did that as described in this blog post by Keighl.
Problem:
The basic swipe-to-go-back feature works, but the strange thing is that, sometimes, the same view controller that is being dismissed appears again after the pop action is completed.
That is, suppose the navigation stack looks like A -> B. Popping B will bring up B again. This keeps happening until eventually view controller B actually gets dismissed and A appears.
This happens to all views in all view controllers, not just a specific one.
Also, I have ensured that the push method is called only once everywhere.
I also tried logging the navigation stack at each point, but there is only one instance of each view controller.
Point to note:
I need to disable the swipe feature in certain views. I did this by disabling and re-enabling the swipe gesture in viewDidAppear and viewDidDisappear respectively.
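A minimal sketch of that pattern, assuming the recognizer stays valid across the transition:
    // In a view controller that should not support swipe-to-go-back:
    - (void)viewDidAppear:(BOOL)animated {
        [super viewDidAppear:animated];
        self.navigationController.interactivePopGestureRecognizer.enabled = NO;
    }

    - (void)viewDidDisappear:(BOOL)animated {
        [super viewDidDisappear:animated];
        // Note: self.navigationController may already be nil here if this
        // controller was popped, which is one place this pattern can break.
        self.navigationController.interactivePopGestureRecognizer.enabled = YES;
    }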
Please provide your valuable suggestions or a solution to this problem. Thanks!
Short answer: You should add a UIScreenEdgePanGestureRecognizer to your view controller if you want to add a pop gesture where none exists. Modifying the existing interactivePopGestureRecognizer is probably not the right approach. Do this:
    [self.view addGestureRecognizer:({
        UIScreenEdgePanGestureRecognizer *gesture =
            [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self
                                                              action:@selector(pop)];
        // An edge pan recognizer must be told which edge it tracks.
        gesture.edges = UIRectEdgeLeft;
        gesture;
    })];
and
    - (void)pop {
        // Pop the top view controller off the navigation stack.
        [self.navigationController popViewControllerAnimated:YES];
    }
Long answer: Forcing the interactivePopGestureRecognizer.delegate is what breaks your code.
If you need to cast self as such:
    self.navigationController.interactivePopGestureRecognizer.delegate =
        (id<UIGestureRecognizerDelegate>)self;
...it is because self does not conform to UIGestureRecognizerDelegate. The following should compile, link, build, and run without a cast, or you are setting yourself up for trouble:
    self.navigationController.interactivePopGestureRecognizer.delegate = self;
Note that conforming to UIGestureRecognizerDelegate specifically allows you to tweak a gesture's behavior at runtime, provided you implement one of the following methods and the tweak applies to a gesture you own:
gestureRecognizerShouldBegin:
gestureRecognizer:shouldReceiveTouch:
gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:
gestureRecognizer:shouldRequireFailureOfGestureRecognizer:
gestureRecognizer:shouldBeRequiredToFailByGestureRecognizer:
By constantly changing the delegate of that interactivePopGestureRecognizer, which you did not create, all you are doing is preventing the built-in iOS behavior from taking place.
From the documentation
UINavigationController -interactivePopGestureRecognizer
The gesture recognizer responsible for popping the top view controller off the navigation stack. (read-only)
In plain English: use this property if you need to combine that gesture with your own gestures, but you are not supposed to modify its behavior:
...You can use this property to retrieve the gesture recognizer and tie it to the behavior of other gesture recognizers in your user interface...

Detecting all touches in an app

In an iPad app, wherever a user is touching the screen I want to display an image, highlighting the points they are touching. The app contains a number of nested views, all of which should receive touches and behave normally.
It seems simple, but I've failed to find a good way to do it. Using touchesBegan:withEvent: and the related methods on the root view controller does not work, because the events are not fired when a subview is touched. I've also created a 'dummy' gesture recognizer which just passes touch events on to another class that draws the images. That works reasonably well and buttons still work, but it breaks UIScrollViews and, I'm guessing, other subviews with gesture recognizers.
Is there no way to just observe all touch events without affecting where those touches are headed?
Thanks.
Your dummy gesture recognizer should be OK. Just watch out for how you set its states (possible -> began -> ...).
Basically, your gesture recognizer forwards all touches, so it can stay in the possible or began state the whole time any touch exists.
To avoid problems with other gesture recognizers, return YES from this delegate method:
    - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
        return YES;
    }
Another option is to subclass your app's main UIWindow and override this method:
    - (void)sendEvent:(UIEvent *)event
Here you should have access to all events. It's quite easy to filter them.
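A minimal sketch of such a subclass (TouchIndicatorWindow and the logging are illustrative names, not from the original answer):
    @interface TouchIndicatorWindow : UIWindow
    @end

    @implementation TouchIndicatorWindow

    - (void)sendEvent:(UIEvent *)event {
        // Inspect every touch before delivery continues as normal.
        if (event.type == UIEventTypeTouches) {
            for (UITouch *touch in [event allTouches]) {
                CGPoint point = [touch locationInView:self];
                NSLog(@"touch at %@", NSStringFromCGPoint(point));
                // ...draw the highlight image at `point` here...
            }
        }
        // Always call super so touches reach their normal targets.
        [super sendEvent:event];
    }

    @end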
You can apply a UITapGestureRecognizer to the entire view and set the cancelsTouchesInView property to NO. This will allow you to be notified of all taps on the view and its subviews without intercepting all of the touch events.
Additionally, you can implement the -gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: delegate method to keep this gesture recognizer from stomping on the ones used by views like UIScrollView.
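Something along these lines, assuming this runs in a view controller and handleTap: is your own handler:
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTap:)];
    // Let touches through to buttons, scroll views, and other subviews.
    tap.cancelsTouchesInView = NO;
    tap.delegate = self; // needed for the simultaneous-recognition delegate method
    [self.view addGestureRecognizer:tap];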
You can try overriding hitTest:withEvent:
    - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
This may be what you are looking for.
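As a hedged sketch, in a UIView subclass you could observe the point and then let hit-testing proceed unchanged:
    - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
        // Record the touch location without changing who receives the touch.
        NSLog(@"hit test at %@", NSStringFromCGPoint(point));
        return [super hitTest:point withEvent:event];
    }
One caveat: hitTest:withEvent: can be called more than once per touch, so it is better suited to locating touches than to counting them.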

UIButton firing selector inconsistently

I have a UIButton linked up in IB correctly (I believe). The button fires inconsistently: every time I reload the view to show updated info, the button sometimes works and sometimes doesn't, with no errors. I can't find a pattern to when it works and when it doesn't; the same code runs every time I open the view, yet the button still works only when it wants to. Besides linking it in IB, I have also tried removing the IB connection and calling addTarget in viewDidLoad:
    [_buttonScreen addTarget:self action:@selector(buttonScreenClicked) forControlEvents:UIControlEventTouchUpInside];
If I add NSLog(@"Clicked"); to buttonScreenClicked, I see that the method doesn't always get called. What would cause this? I have made sure that I set:
    [_buttonScreen setAlpha:0.1];
    [_buttonScreen setHidden:NO];
    [_buttonScreen setUserInteractionEnabled:YES];
I have no image, text, or color in the button, but it still works sometimes.
I'm using AFKPageFlipper on the same view, but it had the same problem before I added AFKPageFlipper, so I don't think it's that.
If anyone could point me in any direction to start troubleshooting this problem, I would appreciate it.
Thanks
I just had the same problem and worked it out. The 5 seconds is the clue.
Somewhere you have a gesture recognizer covering the same space as your button. More specifically, you have a gesture recognizer that is eating your taps but not your long presses. If you just tap the button, the gesture recognizer runs off with your event; hold your finger down long enough and the gesture recognizer no longer considers it a tap, so the event is passed through to your button.
Instrument your Tap gesture recognizer handlers and the problem should pop out at you.
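If you do find such a recognizer, one common fix (my assumption, not stated in this answer) is to stop it from swallowing the tap:
    // tapRecognizer is the hypothetical recognizer covering the button.
    tapRecognizer.cancelsTouchesInView = NO;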
Make sure you don't have any other UIView descendants overlaying the button (like a transparent UIScrollView), as these will intercept the touch events first.
Also make sure that the containing view (the view holding the button) is correctly sized. By default you can place a view outside the bounds of another view, and since clipsToBounds defaults to NO you will see it but not be able to interact with it.
Things to try:
Do you have any other actions on the button?
Do you have any other UIViews that could be intercepting the touches (above, below, or hidden)?
Also, please check that you have only one UIViewController instance for this screen. Other issues may arise because of that.
What happens if you don't set the alpha level?
Do you release the object properly, and only in dealloc?
Hope this helps

Using hitTest logic only for touchesBegan and NOT gesture recognizers

I have been developing a simple game for iOS which involves dragging views and using rotation and other gesture recognizers. Dragging is implemented through touchesBegan/Moved/Ended, and rotation through a recognizer.
The views are irregularly shaped, and their borders sometimes overlap, so I implemented Ole Belgeman's UIImage+ColorAtPixel in my picture view and overrode pointInside:withEvent: in the main element view. pointInside: invokes the method in the picture view, which checks the alpha at the touch point and returns NO if a transparent section has been touched. Essentially, hitTest: ignores this branch.
But the side effect is that hitTest: ignores all touches on the transparent section, so the rotation recognizer only works on the non-transparent zone. For some views, which are too small, it becomes impossible to use the rotation gesture :(
Is there any way to avoid this problem and use the hitTest logic only for touchesBegan? I tried to work out a solution, but it seems that hitTest: runs strictly before any touch handling.
Checking the transparency in touchesBegan works, but when you touch a transparent section that overlaps the non-transparent section of another view, that other view doesn't receive the touch.
I just can't figure out the trick...
Thank you in advance for any help!
I would make the dragging use a UIPanGestureRecognizer, so that you can implement the delegate method gestureRecognizer:shouldReceiveTouch: to return NO when your pan recognizer is considering a touch in the transparent area. Leave the method unimplemented for your rotation recognizer, or return YES for it, so it receives everything. A sketch follows below.
In addition, using gesture recognizers for both kinds of actions has other benefits, like the ability to specify dependencies with requireGestureRecognizerToFail:.
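A sketch of that delegate method, assuming a pointIsTransparent: helper wrapping the ColorAtPixel check (the property and helper names are mine):
    - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
           shouldReceiveTouch:(UITouch *)touch
    {
        // Filter only the pan (drag) recognizer; rotation receives every touch.
        if (gestureRecognizer == self.panRecognizer) {
            CGPoint point = [touch locationInView:self.pictureView];
            return ![self pointIsTransparent:point];
        }
        return YES;
    }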
Try checking whether the UIEvent parameter passed to pointInside:withEvent: is different when the call comes from the gesture recognizer than when it comes from touchesBegan/Moved/Ended.
If it is different, then I guess that would solve your problem.
Just put a breakpoint or an NSLog in pointInside: to inspect the event parameter in each case and see if you can differentiate.
Good Luck!

UIGestureRecognizer - detecting tap and drag like trackpad

Is there any way to cascade UIGestureRecognizers to detect a tap and then a drag?
For example, I want to detect when the user taps and then drags their finger around.
This would be similar to how tap-and-drag works on trackpads.
So I want to detect a tap, and then I want a UIPanGestureRecognizer to send me continuous updates.
I want to use the standard UIGestureRecognizer classes to create this new gesture instead of building my own from raw touches.
Is this even possible?
Although I haven't found a solution of the kind I expected, I found a better one.
Surprisingly, tap-and-drag can be implemented with just a UILongPressGestureRecognizer.
You have to:
set numberOfTapsRequired to 1 to detect the initial tap.
set minimumPressDuration to something small, so drags are detected quickly without waiting.
e.g.:
    UILongPressGestureRecognizer *mouseDrag =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handleDrag:)];
    mouseDrag.numberOfTapsRequired = 1;
    mouseDrag.minimumPressDuration = 0.05;
    // clickLeft is another recognizer in this setup; it must wait for the
    // drag recognizer to fail before it can fire.
    [clickLeft requireGestureRecognizerToFail:mouseDrag];
To handle the drag, you must check the recognizer's state and handle it appropriately as a continuous gesture, as sketched below.
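A minimal sketch of such a handler (handleDrag: matches the selector above):
    - (void)handleDrag:(UILongPressGestureRecognizer *)gesture
    {
        CGPoint point = [gesture locationInView:gesture.view];
        switch (gesture.state) {
            case UIGestureRecognizerStateBegan:
                // The tap-and-hold was recognized: start the drag at `point`.
                break;
            case UIGestureRecognizerStateChanged:
                // The finger is moving: move the dragged object to `point`.
                NSLog(@"dragging at %@", NSStringFromCGPoint(point));
                break;
            case UIGestureRecognizerStateEnded:
            case UIGestureRecognizerStateCancelled:
                // The finger lifted or the gesture was cancelled: finish up.
                break;
            default:
                break;
        }
    }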