UIGestureRecognizer blocking table view scrolling - cocoa-touch

I'm using a custom UIGestureRecognizer subclass to track gestures on my InfoView class. The InfoView class is a subview of a custom UITableViewCell subclass called InfoCell.
I've added my gesture recognizer to my root view (the parent view of everything else on screen), because the purpose of my custom gesture recognizer is to allow dragging of InfoCell views between tables. Now, everything works as it should except one thing. I'm using the following code in my UIGestureRecognizer subclass to detect touches on the InfoView:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UIView *touchView = [[touches anyObject] view];
    if ([touchView isKindOfClass:[InfoView class]]) {
        // Do stuff
    }
}
The problem is that touches on the InfoView are being intercepted, so they are never forwarded to the UITableView that contains the InfoCell (the InfoView's parent view). This means I can no longer scroll the table view by dragging on the InfoView, which is an issue because the InfoView covers the entire InfoCell.
Is there any way I can forward the touches onto the table view so that it can scroll? I've tried a bunch of things already:
[super touchesBegan:touches withEvent:event];
[touchView.superview.superview touchesBegan:touches withEvent:event]; (touchView.superview.superview gets a reference to its parent UITableView)
But nothing has worked so far. Also, the cancelsTouchesInView property of my UIGestureRecognizer is set to NO, so that's not what is interfering with the touches.
Help is appreciated. Thanks!

Check out the UIGestureRecognizerDelegate method: - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
If this returns YES it will prevent your gesture recognizer from stomping on the one that UIScrollView is using to detect scrolling.
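For example, a minimal sketch (dragRecognizer is an illustrative name for your custom recognizer; its delegate must be set to an object adopting UIGestureRecognizerDelegate):
// during setup:
// dragRecognizer.delegate = self;

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES; // don't block the scroll view's pan recognizer
}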

UIGestureRecognizer has a cancelsTouchesInView property, which is set to YES by default. This means that touches in a UIView are cancelled when a gesture is recognized. Try setting it to NO to allow the UIScrollView to keep receiving touch events.
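For example (myRecognizer is an illustrative name):
myRecognizer.cancelsTouchesInView = NO;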

I had a line in my touchesBegan method that set the state property of the gesture recognizer to UIGestureRecognizerStateBegan. Removing this line seems to fix the problem.
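A minimal sketch of that idea: defer the state change until the touch actually moves, so ordinary drags can still reach the table view. (Assigning to self.state in a subclass requires importing UIGestureRecognizerSubclass.h.)
#import <UIKit/UIGestureRecognizerSubclass.h> // required to assign to self.state

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    if (self.state == UIGestureRecognizerStatePossible) {
        self.state = UIGestureRecognizerStateBegan; // begin only once movement is confirmed
    } else {
        self.state = UIGestureRecognizerStateChanged;
    }
}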

You can try adding this delegate method:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]]) {
        UIPanGestureRecognizer *panGestureRec = (UIPanGestureRecognizer *)gestureRecognizer;
        // Begin only for predominantly horizontal pans, leaving vertical
        // drags for the scroll view (self is assumed to be a view here).
        CGPoint velocity = [panGestureRec velocityInView:self];
        if (fabs(velocity.x) > fabs(velocity.y)) {
            return YES;
        }
    }
    return NO;
}
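Note that gestureRecognizerShouldBegin: as used here is the UIGestureRecognizerDelegate method, so it is only called if you have set yourself as the recognizer's delegate (UIView also declares a method of the same name that subviews can override).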

Related

How to identify a button touch event from other touch events

I have an app in which the user can interact with many objects. These are several UIButtons, a few UILabels and a lot of UIImageViews.
All interaction centers on touching the UIImageView objects. With a touch I can move the images around and tell them to do this or that. However, my current obstacle is getting the app to distinguish touches on a UIButton from the other touches.
Why? The logic in my touchesBegan: method is meant only for the UIImageViews, but the moment I touch a button, or any other object, the app interprets the touch as if it occurred on a UIImageView.
So my question boils down to: is there a good way of identifying whether a touch occurred on a UIButton, UIImageView, or UILabel? That way I can filter the irrelevant touches in my app from the relevant ones.
EDIT:
The code below outlines how I capture the touch event, but I don't know how to tell from the touch event whether I touched a button or a view.
touch = [touches anyObject];
touchLocation = [touch locationInView:[self view]];
To know whether a UIButton was pressed, try this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    UIView *touchedView = [touch view];
    if ([touchedView isMemberOfClass:[UIButton class]]) {
        // do something when the button is touched
    }
}
Call hitTest:withEvent: on the view with the touch event to get the view that is actually being touched. Then you can use isKindOfClass: to check what type of view it is and respond accordingly.
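For example, a minimal sketch (assuming this runs in your view controller's touches handler):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    UIView *hitView = [self.view hitTest:location withEvent:event];
    if ([hitView isKindOfClass:[UIImageView class]]) {
        // handle image-view touches; buttons, labels, etc. are filtered out here
    }
}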
You can use this method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if ([self pointInside:point withEvent:event]) {
        // The touch is inside this object
    }
    return [super hitTest:point withEvent:event];
}
wain has already given the explanation for it:
https://stackoverflow.com/a/18051856/1865424

How to track UITouches between two UIViewControllers? (using containment)

I have two view controllers appearing on the screen at the same time using view controller containment, implemented the same as in Apple's code example. Call them view controller A (vcA) and view controller B (vcB), and the container view controller (containerVC).
vcA and vcB each have a grid of objects, and I want to be able to drag objects from vcA to vcB. More specifically, I want a touch that originates in vcA to hit the touchesMoved:withEvent: method in vcB once it is within the bounds of vcB.
I have overridden the touchesMoved:withEvent: method on the containerVC, track the touch via a hit test, and have tried forwarding the touch down the UIView hierarchy like so:
// in the containerVC
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint locationPoint = [[touches anyObject] locationInView:self.view];
    UIView *touchIsInViewControllerA = [vcA.view hitTest:locationPoint withEvent:event];
    if (touchIsInViewControllerA) {
        NSLog(@"Touch is in vcA");
        return;
    }
    UIView *touchIsInViewControllerB = [vcB.view hitTest:locationPoint withEvent:event];
    if (touchIsInViewControllerB) {
        NSLog(@"Touch is in vcB");
        [vcB touchesMoved:touches withEvent:event]; // this causes a crash
    }
}
This seems to be recursive, with containerVC pushing the touch event down the hierarchy, then vcB passing the touch event back up the hierarchy.
My question: Is there a way to keep vcB from passing the touch event back up the responder chain to containerVC? Or should I approach this a different way - make vcB a delegate of vcA and leave containerVC out of the equation?
Note: I'm guessing a common response will be to give up the VC containment pattern and keep it all in one view controller, but for reasons not shown in this example I think keeping them separate will work better for me - unless it's utter insanity and super hacky to do so...
This problem can be solved easily by setting up an individual gesture recognizer for each view controller.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // touchIsInViewControllerA would be determined with a hit test,
    // as in the question's code
    if (touchIsInViewControllerA) {
        // call a delegate method of view controller A
    } else {
        // call a delegate method of view controller B
    }
}
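A minimal sketch of that idea (names are illustrative, and the delegate wiring is left out):
// In the container view controller:
- (void)attachPanRecognizers {
    [vcA.view addGestureRecognizer:[[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)]];
    [vcB.view addGestureRecognizer:[[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)]];
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    // Convert into the container's coordinate space so a drag crossing
    // the vcA/vcB boundary is easy to detect (assumes both child views
    // are direct subviews of self.view).
    CGPoint p = [pan locationInView:self.view];
    if (CGRectContainsPoint(vcA.view.frame, p)) {
        // call a delegate method of view controller A
    } else if (CGRectContainsPoint(vcB.view.frame, p)) {
        // call a delegate method of view controller B
    }
}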

How to propagate touches?

For example, I have a custom UIView class, and on the view I put a UIButton.
In the View Class
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    NSLog(@"touch view");
}
But when I touch the button, it intercepts the view's touch message. Is there a way to receive touches when the user touches the view's subview?
I noticed that when a UIGestureRecognizer is bound to the superview, its events are received even when the user touches a subview.
There is no (Apple-allowed or stable) way to "pass" touches to another view. Instead, call a method in your other view from your touches method.
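For example, a hedged sketch (otherView and handleTouchAtPoint: are illustrative names, not UIKit API):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    CGPoint p = [[touches anyObject] locationInView:otherView];
    [otherView handleTouchAtPoint:p]; // a method you define on your other view
}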

UIView smaller than another UIView

I have a UIView with a height of 30; then I add a UITableView as a subview at origin.y = 30. Because the table sits "under" (outside the bounds of) the main view, I cannot select the cells. Consequently, I implemented this method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touch intercepted");
    if (oldSet && oldEvent) {
        [table touchesEnded:oldSet withEvent:oldEvent];
    }
    oldSet = touches;
    oldEvent = event;
    [table touchesBegan:touches withEvent:event];
    [table touchesEnded:touches withEvent:event];
}
The taps work, but swipes do not, so I cannot scroll the table. Any ideas on how to resolve this?
Passing the touches from an overlying UIView to an underlying UITableView does not work (as you're seeing, the taps go through but not the scrolling movements). This is because UITableView (as a subclass of UIScrollView) does all sorts of under-the-hood weirdness with the responder chain. (Your code is also kind of odd, in that you call both touchesBegan and touchesEnded from the overlying view's touchesBegan method, but that isn't why you can't scroll the table.)
A simple way to achieve what you want (assuming I've understood that correctly) is to override the hitTest:withEvent: method on your overlying view and have the method return the underlying UITableView (instead of self, which is the default implementation):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UITableView *tv = (UITableView *)[self.superview.subviews objectAtIndex:0];
    return tv;
}
Note that this code assumes the UITableView is the first view added to your view controller, and that the overlying UIView is added after that.
Edit: OK, after re-reading, I think I understand your question. You have a main view with a height of 30, and to that view you've added a UITableView at y = 30. I think adding this method to your main view will do what you need:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (point.y < self.frame.size.height) {
        return self;
    } else {
        // assumes myTableView is a reference to the UITableView you've added
        return myTableView;
    }
}
Essentially, you're telling the OS to pass all touches on to myTableView if they're outside the bounds of your main view.
However, a simpler way would be to add both the 30-pixel-high main view (the orange thing) and the UITableView as subviews of another view (or view controller). That way, you wouldn't have to do anything kludgy to get your table to behave like a normal table.
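A rough sketch of that restructuring (names are illustrative):
// In the view controller that owns both views:
[self.view addSubview:myTableView];   // the table gets its full frame as a sibling
[self.view addSubview:orangeBarView]; // the 30-point-high main view on top
Because the table is now a sibling rather than an out-of-bounds subview, it receives touches normally, with no hitTest: overrides needed.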

Allowing interaction with a UIView under another UIView

Is there a simple way of allowing interaction with a button in a UIView that lies under another UIView - where there are no actual objects from the top UIView on top of the button?
For instance, at the moment I have a UIView (A) with an object at the top and an object at the bottom of the screen and nothing in the middle. This sits on top of another UIView that has buttons in the middle (B). However, I cannot seem to interact with the buttons in the middle of B.
I can see the buttons in B - I've set the background of A to clearColor - but the buttons in B do not seem to receive touches despite the fact that there are no objects from A actually on top of those buttons.
EDIT - I still want to be able to interact with the objects in the top UIView
Surely there is a simple way of doing this?
You should create a UIView subclass for your top view and override the following method:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // UIView will be "transparent" to touch events if we return NO
    return (point.y < MIDDLE_Y1 || point.y > MIDDLE_Y2);
}
You may also look at the hitTest:withEvent: method.
While many of the answers here will work, I'm a little surprised to see that the most convenient, generic and foolproof answer hasn't been given. @Ash came closest, except that there is something strange going on with returning the superview... don't do that.
This answer is taken from an answer I gave to a similar question, here.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self) return nil;
    return hitView;
}
[super hitTest:point withEvent:event] will return the deepest view in that view's hierarchy that was touched. If hitView == self (i.e. if there is no subview under the touch point), return nil, specifying that this view should not receive the touch. The way the responder chain works means that the view hierarchy above this point will continue to be traversed until a view is found that will respond to the touch. Don't return the superview, as it is not up to this view whether its superview should accept touches or not!
This solution is:
convenient, because it requires no references to any other views/subviews/objects;
generic, because it applies to any view that acts purely as a container for touchable subviews, and the configuration of the subviews does not affect the way it works (as it does if you override pointInside:withEvent: to return a particular touchable area);
foolproof: there's not much code, and the concept isn't difficult to get your head around.
I use this often enough that I have abstracted it into a subclass to save pointless view subclasses for one override. As a bonus, add a property to make it configurable:
@interface ISView : UIView
@property (nonatomic, assign) BOOL onlyRespondToTouchesInSubviews;
@end

@implementation ISView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self && self.onlyRespondToTouchesInSubviews) return nil;
    return hitView;
}
@end
Then go wild and use this view wherever you might use a plain UIView. Configuring it is as simple as setting onlyRespondToTouchesInSubviews to YES.
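Usage is then as simple as (a sketch):
ISView *overlay = [[ISView alloc] initWithFrame:self.view.bounds];
overlay.onlyRespondToTouchesInSubviews = YES;
[self.view addSubview:overlay];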
There are several ways you could handle this. My favorite is to override hitTest:withEvent: in a view that is a common superview (maybe indirectly) to the conflicting views (sounds like you call these A and B). For example, something like this (here A and B are UIView pointers, where B is the "hidden" one, that is normally ignored):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint pointInB = [B convertPoint:point fromView:self];
    if ([B pointInside:pointInB withEvent:event])
        return B;
    return [super hitTest:point withEvent:event];
}
You could also modify the pointInside:withEvent: method as gyim suggested. This lets you achieve essentially the same result by effectively "poking a hole" in A, at least for touches.
Another approach is event forwarding, which means overriding touchesBegan:withEvent: and similar methods (like touchesMoved:withEvent: etc) to send some touches to a different object than where they first go. For example, in A, you could write something like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([self shouldForwardTouches:touches]) {
        [B touchesBegan:touches withEvent:event];
    } else {
        // Do whatever A does with touches.
    }
}
However, this will not always work the way you expect! The main thing is that built-in controls like UIButton will always ignore forwarded touches. Because of this, the first approach is more reliable.
There's a good blog post explaining all this in more detail, along with a small working Xcode project to demo the ideas, available here:
http://bynomial.com/blog/?p=74
You have to set upperView.userInteractionEnabled = NO; otherwise, the upper view will intercept the touches.
The Interface Builder version of this is a checkbox at the bottom of the View Attributes panel called "User Interaction Enabled". Uncheck it and you should be good to go.
Custom implementation of pointInside:withEvent: indeed seemed like the way to go, but dealing with hard-coded coordinates seemed odd to me. So I ended up checking whether the CGPoint was inside the button CGRect using the CGRectContainsPoint() function:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return CGRectContainsPoint(disclosureButton.frame, point);
}
Recently I wrote a class that will help with just that. Using it as a custom class for a UIButton or UIView will pass through touch events that land on a transparent pixel.
This solution is somewhat better than the accepted answer because you can still click a UIButton that is under a semi-transparent UIView, while the non-transparent part of the UIView will still respond to touch events.
As you can see in the GIF, the Giraffe button is a simple rectangle but touch events on transparent areas are passed on to the yellow UIButton underneath.
Link to class
I guess I'm a bit late to this party, but I'll add this possible solution:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView != self) return hitView;
    return [self superview];
}
If you use this code to override a custom UIView's standard hitTest function, it will ignore ONLY the view itself. Any subviews of that view will return their hits normally, and any hits that would have gone to the view itself are passed up to its superview.
-Ash
Just riffing on the Accepted Answer and putting this here for my reference. The Accepted Answer works perfectly. You can extend it like this to allow your view's subviews to receive the touch, OR pass it on to any views behind us:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // If one of our subviews wants it, return YES
    for (UIView *subview in self.subviews) {
        CGPoint pointInSubview = [subview convertPoint:point fromView:self];
        if ([subview pointInside:pointInSubview withEvent:event]) {
            return YES;
        }
    }
    // otherwise return NO, as if userInteractionEnabled were NO
    return NO;
}
Note: You don't even have to do recursion on the subview tree, because each pointInside:withEvent: method will handle that for you.
This approach is quite clean and also keeps transparent subviews from reacting to touches. Just subclass UIView and add the following method to its implementation:
@implementation PassThroughUIView
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *v in self.subviews) {
        CGPoint localPoint = [v convertPoint:point fromView:self];
        if (v.alpha > 0.01 && ![v isHidden] && v.userInteractionEnabled &&
            [v pointInside:localPoint withEvent:event])
            return YES;
    }
    return NO;
}
@end
Setting the userInteractionEnabled property to NO might help. E.g.:
UIView *topView = [[UIView alloc] initWithFrame:[self bounds]];
[self addSubview:topView];
[topView setUserInteractionEnabled:NO];
(Note: In the code above, 'self' refers to a view)
This way, you only display the topView, but it won't get user inputs. All user touches will pass through this view, and the bottom view will respond to them. I'd use this topView for displaying transparent images, or animating them.
My solution here:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint pointInView = [self.toolkitController.toolbar convertPoint:point fromView:self];
    if ([self.toolkitController.toolbar pointInside:pointInView withEvent:event]) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
Hope this helps
There's something you can do to intercept the touch in both views.
Top view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Do code in the top view
    [bottomView touchesBegan:touches withEvent:event]; // And pass them on to bottomView
    // You have to implement touchesBegan, touchesEnded, and touchesCancelled
    // in both the top and bottom views.
}
But that's the idea.
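A sketch of the matching forwarding methods (bottomView is assumed to be a reference the top view holds):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [bottomView touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [bottomView touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [bottomView touchesCancelled:touches withEvent:event];
}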
Here is a Swift version:
override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
    return !CGRectContainsPoint(buttonView.frame, point)
}
Swift 3
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    for subview in subviews {
        if subview.frame.contains(point) {
            return true
        }
    }
    return false
}
I have never built a complete user interface using the UI toolkit, so I don't have much experience with it. Here is what I think should work though.
Every UIView, and thus the UIWindow, has a subviews property, which is an NSArray containing all the subviews.
The first subview you add to a view will receive index 0, the next index 1, and so forth. You can also replace addSubview: with insertSubview:atIndex: or insertSubview:aboveSubview: and similar methods that determine the position of your subview in the hierarchy.
So check your code to see which view you add first to your UIWindow. That will be index 0; the other will be 1.
Now, from one of your subviews, to reach another you would do the following:
UIView *theOtherView = [[[self superview] subviews] objectAtIndex:0];
// or, using property syntax:
// UIView *theOtherView = [self.superview.subviews objectAtIndex:0];
Let me know if that works for your case!
(below this marker is my previous answer):
If views need to communicate with each other, they should do so via a controller (that is, using the popular MVC model).
When you create a new view, you can make sure it registers itself with a controller.
So the technique is to make sure your views register with a controller (which can store them by name or whatever you prefer in a Dictionary or Array). Either you can have the controller send a message for you, or you can get a reference to the view and communicate with it directly.
If your view doesn't have a link back the controller (which may be the case) then you can make use of singletons and/or class methods to get a reference to your controller.
I think the right way is to use the view chain built into the view hierarchy.
For your subviews that are pushed onto the main view, do not use the generic UIView; instead subclass UIView (or one of its variants like UIImageView) to make MyView : UIView (or whatever supertype you want, such as UIImageView). In the implementation of MyView, implement the touchesBegan:withEvent: method. This method will then get invoked when that view is touched. All you need to have in that implementation is an instance method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Cannot handle this event here; pass it up to the superview.
    [self.superview touchesBegan:touches withEvent:event];
}
This touchesBegan: is a responder API, so you don't need to declare it in your public or private interface; it's one of those magic APIs you just have to know about. The self.superview call will eventually bubble the request up to the view controller. In the view controller, then, implement touchesBegan:withEvent: to handle the touch.
Note that the touch's location (CGPoint) is automatically adjusted relative to the receiving view for you as the event is bounced up the view hierarchy chain.
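For instance, a minimal sketch of the controller-side handler (assuming the controller's view contains the subclassed view):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self.view];
    NSLog(@"Touch bubbled up to the controller at %@", NSStringFromCGPoint(p));
}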
Just want to post this, because I had a somewhat similar problem and spent a substantial amount of time trying to implement the answers here without any luck. What I ended up doing:
// iterate over a copy, since adding each recognizer to bottomView detaches it from topView
for (UIGestureRecognizer *recognizer in [topView.gestureRecognizers copy])
{
    recognizer.delegate = self;
    [bottomView addGestureRecognizer:recognizer];
}
topView.abView.userInteractionEnabled = NO;
and implementing this UIGestureRecognizerDelegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
The bottom view was a navigation controller with a number of segues, and I had a sort of door on top of it that could close with a pan gesture. The whole thing was embedded in yet another VC. Worked like a charm. Hope this helps.
Swift 4 implementation of the hitTest-based solution:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let hitView = super.hitTest(point, with: event)
    if hitView == self { return nil }
    return hitView
}
Derived from Stuart's excellent, mostly foolproof answer and Segev's useful implementation, here is a Swift 4 extension that you can drop into any project:
extension UIColor {
    static func colorOfPoint(point: CGPoint, in view: UIView) -> UIColor {
        var pixel: [CUnsignedChar] = [0, 0, 0, 0]
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: &pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        context!.translateBy(x: -point.x, y: -point.y)
        view.layer.render(in: context!)
        let red   = CGFloat(pixel[0]) / 255.0
        let green = CGFloat(pixel[1]) / 255.0
        let blue  = CGFloat(pixel[2]) / 255.0
        let alpha = CGFloat(pixel[3]) / 255.0
        return UIColor(red: red, green: green, blue: blue, alpha: alpha)
    }
}
And then with hitTest:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    guard UIColor.colorOfPoint(point: point, in: self).cgColor.alpha > 0 else { return nil }
    return super.hitTest(point, with: event)
}