Retrieve and change the point at which touchUpInside changes to touchUpOutside - Objective-C

I have made a UISlider work just like the "slide to unlock" slider.
What I need to do is determine the point at which lifting your finger off is classed as touchUpOutside rather than touchUpInside. This is the point where you have slid your finger too far past the end of the slider.
I guess it's the same as with a UIButton: you can press the button, then slide your finger off it, and depending on how far you go, it can still be classed as touchUpInside.
If possible, I'd like to mark the target area with a circle.
Once I've managed to find where this point is, is it possible to change it so that I could have a bigger target area?
I really don't know where to start with this. Thanks

According to the docs, the UIControlEventTouchUpOutside event is triggered when the finger is lifted outside the bounds of the control. That area is tied to the control's bounds, so if you enlarged it, the slider would scale with it. Why not just tie the action for UIControlEventTouchUpOutside to the same method as UIControlEventTouchUpInside?
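A minimal sketch of that suggestion, assuming a hypothetical unlockSlider: action method on the target:
// Wire both touch-up events to the same action, so a lift just outside the
// slider still unlocks. unlockSlider: is a hypothetical action method.
[slider addTarget:self
           action:@selector(unlockSlider:)
 forControlEvents:UIControlEventTouchUpInside | UIControlEventTouchUpOutside];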

It's taken me a few hours, but I've managed to sort this.
I've done a lot of testing overriding touchesMoved:, touchesEnded: and sendAction:to:forEvent:, and it seems that any touch within 70px of the frame counts as a touch INSIDE. So for a UISlider that's 292x52, any touch from x:-70 to x:362 or y:-70 to y:122 counts as an inside touch, even though it's outside the frame.
I have come up with this code for a custom subclass that allows a bigger area of 100px around the frame to count as an inside touch:
#import "UICustomSlider.h"
#implementation UICustomSlider {
BOOL callTouchInside;
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
callTouchInside = NO;
[super touchesMoved:touches withEvent:event];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touchLocation = [[touches anyObject] locationInView:self];
if (touchLocation.x > -100 && touchLocation.x < self.bounds.size.width +100 && touchLocation.y > -100 && touchLocation.y < self.bounds.size.height +100) callTouchInside = YES;
[super touchesEnded:touches withEvent:event];
}
-(void)sendAction:(SEL)action to:(id)target forEvent:(UIEvent *)event
{
if (action == #selector(sliderTouchOutside)) { // This is the selector used for UIControlEventTouchUpOutside
if (callTouchInside == YES) {
NSLog(#"Overriding an outside touch to be an inside touch");
[self sendAction:#selector(UnLockIt) to:target forEvent:event]; // This is the selector used for UIControlEventTouchUpInside
} else {
[super sendAction:action to:target forEvent:event];
}
} else {
[super sendAction:action to:target forEvent:event];
}
}
With a little bit more tweaking I should be able to use it for the opposite as well (treating a closer touch as an outside touch).
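For reference, here is a hedged sketch of how the subclass above might be wired up; the selector names match the comments in sendAction:to:forEvent:, and the frame matches the 292x52 slider mentioned earlier:
UICustomSlider *slider = [[UICustomSlider alloc] initWithFrame:CGRectMake(14.0, 200.0, 292.0, 52.0)];
// UnLockIt fires for a lift inside; sliderTouchOutside fires for a lift
// outside and is converted to UnLockIt by the override when close enough.
[slider addTarget:self action:@selector(UnLockIt) forControlEvents:UIControlEventTouchUpInside];
[slider addTarget:self action:@selector(sliderTouchOutside) forControlEvents:UIControlEventTouchUpOutside];
[self.view addSubview:slider];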

Related

How to make OpenGL view and overlaying UIView react to the same touch event

I have an OpenGL view with sprites in it, which have UIViews with the same position and size on top of the OpenGL view to make them accessible. The OpenGL view should be able to receive touch events, and the UIViews on top should too. In my UIView class I override the hitTest:withEvent: method so the UIViews get the touch input; otherwise only the OpenGL view gets the touch:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSEnumerator *reverseE = [self.subviews reverseObjectEnumerator];
    UIView *iSubView;

    while ((iSubView = [reverseE nextObject]))
    {
        UIView *viewWasHit = [iSubView hitTest:[self convertPoint:point toView:iSubView] withEvent:event];
        if (viewWasHit)
        {
            return viewWasHit;
        }
    }
    return [super hitTest:point withEvent:event];
}
Now my UIViews receive touch input, but the OpenGL elements underneath don't. How can I change that so both of them receive the touch?
In my opinion, in such cases it is best if only your main view receives touches, and you then check whether any of the subviews was touched, using CGRectContainsPoint for instance (a sketch of this appears after the code below). If you really want to override the subviews, then I suggest you override the touch methods to call the superview's touch methods, for instance to override touchesBegan::
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.superview touchesBegan:touches withEvent:event];
    // Do additional stuff
}
This will work with touches but probably not with gestures since they cancel touch events and have a different pipeline.
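For reference, a minimal sketch of the first suggestion, where the main view claims all touches and checks subview frames itself; overlayView here is a hypothetical property for one of the accessibility UIViews:
// In the main view: handle the touch for both layers ourselves.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInView:self];
    if (CGRectContainsPoint(self.overlayView.frame, location)) {
        // React on behalf of the overlay view here...
    }
    // ...and then let the OpenGL view process the touch as usual.
}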

Dragging a table view cell that contains UIButtons

I'm trying to implement a UITableView whose rows can be dragged to the right and left (and show something behind them).
The code works fine, I've implemented it using the following methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
My problem is that the rows also contain UIButtons that, when tapped, should act as buttons, but when dragged, should drag the entire cell.
I've found this solution, which basically bubbles up the events when clicking on the UIButtons:
[super touchesBegan:touches withEvent:event];
[self.nextResponder touchesBegan:touches withEvent:event];
But it seems that the event touchesMoved: only bubbles up once.
I've seen all sorts of questions in this area. Example. But I don't see any solutions or responses.
Any help, suggestion or creative workaround would be appreciated!
Instead of implementing touchesBegan, etc. why not use a UIPanGestureRecognizer? I tested this with just a simple rectangular view which was mostly covered by a UIButton. The view was dragged, no matter where I touched, and the button method fired if I clicked over the button.
@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
    [self.theView addGestureRecognizer:panGesture]; // theView is an IBOutlet for the small view containing a button
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    self.currentViewFrame = self.theView.frame;
}

- (IBAction)handlePanGesture:(UIPanGestureRecognizer *)sender {
    CGPoint translate = [sender translationInView:self.view];
    CGRect newFrame = self.currentViewFrame;
    newFrame.origin.x += translate.x;
    newFrame.origin.y += translate.y;
    sender.view.frame = newFrame;
    if (sender.state == UIGestureRecognizerStateEnded) {
        self.currentViewFrame = newFrame;
    }
}

- (IBAction)doClick:(id)sender {
    NSLog(@"click");
}

@end
Just check to see which one was touched using the tag system.
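In hedged form, that check might look like this inside handlePanGesture:, assuming you assigned the tags yourself when creating the views:
// Hypothetical tag check; 100 is an arbitrary tag given elsewhere to the
// draggable cell view.
if (sender.view.tag == 100) {
    // Run the drag logic for the cell
}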

Get TouchEvent to the subview of my scrollview iOS SDK [duplicate]

I am trying to solve a basic problem with drag and drop on iPhone. Here's my setup:
I have a UIScrollView which has one large content subview (I'm able to scroll and zoom it)
Content subview has several small tiles as subviews that should be dragged around inside it.
My UIScrollView subclass has this method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *tile = [contentView pointInsideTiles:[self convertPoint:point toView:contentView] withEvent:event];
    if (tile) {
        return tile;
    } else {
        return [super hitTest:point withEvent:event];
    }
}
Content subview has this method:
- (UIView *)pointInsideTiles:(CGPoint)point withEvent:(UIEvent *)event {
    for (TileView *tile in tiles) {
        if ([tile pointInside:[self convertPoint:point toView:tile] withEvent:event])
            return tile;
    }
    return nil;
}
And tile view has this method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.superview];
    self.center = location;
}
This works, but not fully correctly: the tile sometimes "falls down" during the drag. More precisely, it stops receiving touchesMoved: invocations, and the scroll view starts scrolling instead. I noticed that this depends on the drag speed: the faster I drag, the quicker the tile "falls".
Any ideas on how to keep the tile glued to the dragging finger?
I was struggling with this same problem - I was trying to build an interface with a lot of "cards" (UIView subclasses) on a cork board, with the cork board area scrollable but the cards still drag-and-droppable. I was using the hitTest: solution above, but one of the Apple engineers asked me why I was doing it that way. The simpler solution they suggested was as follows:
1) In the UIScrollView class, set the value of canCancelContentTouches to NO - this tells the UIScrollView class to allow touches within subviews (or, in this case, in subviews of subviews).
2) In my "card" class, set exclusiveTouch to YES - this tells the subview it owns the touches inside of it.
After this, I was able to drag the cards around and still scroll the board underneath. It's a lot simpler and cleaner than the hitTest: solution above.
(BTW, for extra credit, if you are using iOS 3.2 or 4.0 or later, use the UIPanGestureRecognizer class to handle the drag-and-drop logic - the motion is a lot smoother than overriding touchesBegan:/touchesMoved:/touchesEnded:.)
Solved: it turned out that there should also be touchesBegan: and touchesEnded: implementations in the tile (in my case, having empty methods helped, as sketched below); otherwise the gesture started propagating to parent views, and they were intercepting it. The dependency on the drag speed was imaginary.
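A minimal sketch of that fix, assuming the tile otherwise relies on the touchesMoved: override shown above:
// Empty overrides in the tile view; they keep the gesture from propagating
// to the parent views.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {}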
First, set:
scrollview.canCancelContentTouches = NO;
yourSubView.exclusiveTouch = YES;
Then, in your subview's gesture handler function:
- (void)handleSubviewMove:(UIPanGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        if (_parentScrollView != nil) {
            _parentScrollView.scrollEnabled = NO;
        }
    }
    if (gesture.state == UIGestureRecognizerStateEnded) {
        if (_parentScrollView != nil) {
            _parentScrollView.scrollEnabled = YES;
        }
    }
    CGPoint translation = [gesture translationInView:[self superview]];
    /* Handle your view's movement here. */
    [gesture setTranslation:CGPointZero inView:[self superview]];
}
Based on the code you've shared, it looks like your touchesMoved: method will only be called for touches that land within the tile. At each touch you re-center the tile on that point, so slow movements give frequent updates and the tile keeps up with the fingertip before the gesture exits it. When a gesture is faster, the successive touch points are farther apart, and you lose the gesture as soon as one point lands far enough from the last one to already be outside the tile.
You can work around this by capturing the movements in a superview large enough to cover the whole draggable area, and controlling the movement of the tile from within that superview (a sketch follows below).
By the way, is there a reason you're overriding the hitTest: method? It might be easier (and possibly more efficient?) to use the built-in implementation.
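A minimal sketch of that workaround, assuming a hypothetical draggedTile property on the covering superview, set when a touch first lands on a tile:
// In the large covering superview: move the tracked tile from here, so a
// fast move never escapes the touch-receiving view.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    self.draggedTile.center = [touch locationInView:self];
}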

UIScrollView with UIPinchGesture

I have a UIScrollView with a UIPinchGestureRecognizer attached to it. My problem is that doing a pinch gesture also moves the UIScrollView's content; I can see this when NSLogging the UIScrollView's X/Y. I was wondering if anyone has any ideas on how to prevent this happening on the scroll view?
I already set the minimum and maximum zoom scale:
[scrollView setMaximumZoomScale: 1.0];
[scrollView setMinimumZoomScale: 1.0];
I have also subclassed the UIScrollView and implemented touchesBegan: and touchesEnded:, but I am unsure how I would ignore a touch on the scroll view if two fingers are used.
Please advise.
You can make good use of the scrollEnabled property. Also, to cancel the touch when two fingers are used:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSSet *allTouches = [event allTouches];
    if ([allTouches count] > 1)
    {
        [self touchesCancelled:touches withEvent:event];
    }
    else
    {
        // Pass the touch along.
    }
}

Allowing interaction with a UIView under another UIView

Is there a simple way of allowing interaction with a button in a UIView that lies under another UIView - where there are no actual objects from the top UIView on top of the button?
For instance, at the moment I have a UIView (A) with an object at the top and an object at the bottom of the screen and nothing in the middle. This sits on top of another UIView that has buttons in the middle (B). However, I cannot seem to interact with the buttons in the middle of B.
I can see the buttons in B - I've set the background of A to clearColor - but the buttons in B do not seem to receive touches despite the fact that there are no objects from A actually on top of those buttons.
EDIT - I still want to be able to interact with the objects in the top UIView
Surely there is a simple way of doing this?
You should create a UIView subclass for your top view and override the following method:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // UIView will be "transparent" for touch events if we return NO
    return (point.y < MIDDLE_Y1 || point.y > MIDDLE_Y2);
}
You may also look at the hitTest:event: method.
While many of the answers here will work, I'm a little surprised to see that the most convenient, generic and foolproof answer hasn't been given here. @Ash came closest, except that there is something strange going on with returning the superview... don't do that.
This answer is taken from an answer I gave to a similar question, here.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self) return nil;
    return hitView;
}
[super hitTest:point withEvent:event] will return the deepest view in that view's hierarchy that was touched. If hitView == self (i.e. if there is no subview under the touch point), return nil, specifying that this view should not receive the touch. The way the responder chain works means that the view hierarchy above this point will continue to be traversed until a view is found that will respond to the touch. Don't return the superview, as it is not up to this view whether its superview should accept touches or not!
This solution is:
convenient, because it requires no references to any other views/subviews/objects;
generic, because it applies to any view that acts purely as a container for touchable subviews, and the configuration of the subviews does not affect the way it works (as it does if you override pointInside:withEvent: to return a particular touchable area).
foolproof, there's not much code... and the concept isn't difficult to get your head around.
I use this often enough that I have abstracted it into a subclass to save pointless view subclasses for one override. As a bonus, add a property to make it configurable:
@interface ISView : UIView
@property (nonatomic, assign) BOOL onlyRespondToTouchesInSubviews;
@end

@implementation ISView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self && self.onlyRespondToTouchesInSubviews) return nil;
    return hitView;
}

@end
Then go wild and use this view wherever you might use a plain UIView. Configuring it is as simple as setting onlyRespondToTouchesInSubviews to YES.
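For example, a minimal usage sketch:
// An overlay that only responds where its subviews are; touches elsewhere
// fall through to whatever sits underneath.
ISView *overlay = [[ISView alloc] initWithFrame:self.view.bounds];
overlay.onlyRespondToTouchesInSubviews = YES;
[self.view addSubview:overlay];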
There are several ways you could handle this. My favorite is to override hitTest:withEvent: in a view that is a common superview (maybe indirectly) to the conflicting views (sounds like you call these A and B). For example, something like this (here A and B are UIView pointers, where B is the "hidden" one, that is normally ignored):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint pointInB = [B convertPoint:point fromView:self];
    if ([B pointInside:pointInB withEvent:event])
        return B;
    return [super hitTest:point withEvent:event];
}
You could also modify the pointInside:withEvent: method as gyim suggested. This lets you achieve essentially the same result by effectively "poking a hole" in A, at least for touches.
Another approach is event forwarding, which means overriding touchesBegan:withEvent: and similar methods (like touchesMoved:withEvent: etc) to send some touches to a different object than where they first go. For example, in A, you could write something like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([self shouldForwardTouches:touches]) {
        [B touchesBegan:touches withEvent:event];
    }
    else {
        // Do whatever A does with touches.
    }
}
However, this will not always work the way you expect! The main thing is that built-in controls like UIButton will always ignore forwarded touches. Because of this, the first approach is more reliable.
There's a good blog post explaining all this in more detail, along with a small working Xcode project to demo the ideas, available here:
http://bynomial.com/blog/?p=74
You have to set upperView.userInteractionEnabled = NO; otherwise the upper view will intercept the touches.
The Interface Builder version of this is a checkbox at the bottom of the View Attributes panel called "User Interaction Enabled". Uncheck it and you should be good to go.
Custom implementation of pointInside:withEvent: indeed seemed like the way to go, but dealing with hard-coded coordinates seemed odd to me. So I ended up checking whether the CGPoint was inside the button CGRect using the CGRectContainsPoint() function:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return CGRectContainsPoint(disclosureButton.frame, point);
}
I recently wrote a class that will help with just that. Using it as a custom class for a UIButton or UIView will pass on touch events that landed on a transparent pixel.
This solution is a somewhat better than the accepted answer because you can still click a UIButton that is under a semi transparent UIView while the non transparent part of the UIView will still respond to touch events.
As you can see in the GIF, the Giraffe button is a simple rectangle but touch events on transparent areas are passed on to the yellow UIButton underneath.
Link to class
I guess I'm a bit late to this party, but I'll add this possible solution:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView != self) return hitView;
    return [self superview];
}
If you use this code to override a custom UIView's standard hitTest function, it will ignore ONLY the view itself. Any subviews of that view will return their hits normally, and any hits that would have gone to the view itself are passed up to its superview.
-Ash
Just riffing on the Accepted Answer and putting this here for my reference. The Accepted Answer works perfectly. You can extend it like this to allow your view's subviews to receive the touch, OR pass it on to any views behind us:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // If one of our subviews wants it, return YES
    for (UIView *subview in self.subviews) {
        CGPoint pointInSubview = [subview convertPoint:point fromView:self];
        if ([subview pointInside:pointInSubview withEvent:event]) {
            return YES;
        }
    }
    // Otherwise return NO, as if userInteractionEnabled were NO
    return NO;
}
Note: You don't even have to do recursion on the subview tree, because each pointInside:withEvent: method will handle that for you.
This approach is quite clean and also prevents transparent subviews from reacting to touches. Just subclass UIView and add the following method to its implementation:
@implementation PassThroughUIView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *v in self.subviews) {
        CGPoint localPoint = [v convertPoint:point fromView:self];
        if (v.alpha > 0.01 && ![v isHidden] && v.userInteractionEnabled && [v pointInside:localPoint withEvent:event])
            return YES;
    }
    return NO;
}

@end
Setting the userInteractionEnabled property to NO might help. E.g.:
UIView *topView = [[UIView alloc] initWithFrame:[self bounds]];
[self addSubview:topView];
[topView setUserInteractionEnabled:NO];
(Note: in the code above, 'self' refers to a view.)
This way, the topView is only for display and won't get user input. All those user touches will pass through it, and the bottom view will respond to them. I'd use this topView for displaying transparent images, or animating them.
My solution here:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint pointInView = [self.toolkitController.toolbar convertPoint:point fromView:self];
    if ([self.toolkitController.toolbar pointInside:pointInView withEvent:event]) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
Hope this helps
There's something you can do to intercept the touch in both views.
Top view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Do code in the top view
    [bottomView touchesBegan:touches withEvent:event]; // And pass them on to bottomView
    // You have to implement touchesBegan, touchesEnded, and touchesCancelled in both the top and bottom views.
}
But that's the idea.
Here is a Swift version:
override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
    return !CGRectContainsPoint(buttonView.frame, point)
}
Swift 3
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    for subview in subviews {
        if subview.frame.contains(point) {
            return true
        }
    }
    return false
}
I have never built a complete user interface using the UI toolkit, so I don't have much experience with it. Here is what I think should work though.
Every UIView, and thus the UIWindow, has a subviews property, which is an NSArray containing all the subviews.
The first subview you add to a view will receive index 0, the next index 1, and so forth. You can also replace addSubview: with insertSubview:atIndex: or insertSubview:aboveSubview: and similar methods that determine the position of your subview in the hierarchy.
So check your code to see which view you add first to your UIWindow. That will be 0, the other will be 1.
Now, from one of your subviews, to reach another you would do the following:
UIView *theOtherView = [[[self superview] subviews] objectAtIndex:0];
// or using the properties syntax
UIView *theOtherView = [self.superview.subviews objectAtIndex:0];
Let me know if that works for your case!
(below this marker is my previous answer):
If views need to communicate with each other, they should do so via a controller (that is, using the popular MVC model).
When you create a new view, you can make sure it registers itself with a controller.
So the technique is to make sure your views register with a controller (which can store them by name or whatever you prefer in a Dictionary or Array). Either you can have the controller send a message for you, or you can get a reference to the view and communicate with it directly.
If your view doesn't have a link back the controller (which may be the case) then you can make use of singletons and/or class methods to get a reference to your controller.
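A hedged sketch of the registration idea, assuming a simple shared registry class (not part of UIKit; all names here are hypothetical):
#import <UIKit/UIKit.h>

// A shared registry that views register with, so other views can reach them
// through the controller layer instead of digging through the hierarchy.
@interface ViewRegistry : NSObject
+ (instancetype)sharedRegistry;
- (void)registerView:(UIView *)view forKey:(NSString *)key;
- (UIView *)viewForKey:(NSString *)key;
@end

@implementation ViewRegistry {
    NSMutableDictionary<NSString *, UIView *> *_views;
}

+ (instancetype)sharedRegistry {
    static ViewRegistry *instance;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ instance = [ViewRegistry new]; });
    return instance;
}

- (instancetype)init {
    if ((self = [super init])) {
        _views = [NSMutableDictionary dictionary];
    }
    return self;
}

- (void)registerView:(UIView *)view forKey:(NSString *)key {
    _views[key] = view;
}

- (UIView *)viewForKey:(NSString *)key {
    return _views[key];
}

@end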
I think the right way is to use the responder chain built into the view hierarchy.
For your subviews that are pushed onto the main view, do not use the generic UIView; instead subclass UIView (or one of its variants like UIImageView) to make MyView : UIView (or whatever supertype you want, such as UIImageView). In the implementation for MyView, implement the touchesBegan: method. This method will then get invoked when that view is touched. All you need in that implementation is an instance method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Cannot handle this event here; pass it off to super.
    [self.superview touchesBegan:touches withEvent:event];
}
This touchesBegan: is a responder API, so you don't need to declare it in your public or private interface; it's one of those magic APIs you just have to know about. The self.superview call will bubble the request up, eventually reaching the view controller. In the view controller, implement touchesBegan: to handle the touch.
Note that the touches location (CGPoint) is automatically adjusted relative to the encompassing view for you as it is bounced up the view hierarchy chain.
Just want to post this, because I had a somewhat similar problem and spent a substantial amount of time trying to implement the answers here without any luck. What I ended up doing:
for (UIGestureRecognizer *recognizer in topView.gestureRecognizers)
{
    recognizer.delegate = self;
    [bottomView addGestureRecognizer:recognizer];
}
topView.abView.userInteractionEnabled = NO;
and implementing UIGestureRecognizerDelegate:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
The bottom view was a navigation controller with a number of segues, and I had a sort of door on top of it that could close with a pan gesture. The whole thing was embedded in yet another VC. Worked like a charm. Hope this helps.
Swift 4 implementation for the hitTest-based solution:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let hitView = super.hitTest(point, with: event)
    if hitView == self { return nil }
    return hitView
}
Derived from Stuart's excellent, and mostly foolproof answer, and Segev's useful implementation, here is a Swift 4 package that you can drop into any project:
extension UIColor {
    static func colorOfPoint(point: CGPoint, in view: UIView) -> UIColor {
        var pixel: [CUnsignedChar] = [0, 0, 0, 0]
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: &pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)

        context!.translateBy(x: -point.x, y: -point.y)
        view.layer.render(in: context!)

        let red: CGFloat = CGFloat(pixel[0]) / 255.0
        let green: CGFloat = CGFloat(pixel[1]) / 255.0
        let blue: CGFloat = CGFloat(pixel[2]) / 255.0
        let alpha: CGFloat = CGFloat(pixel[3]) / 255.0

        return UIColor(red: red, green: green, blue: blue, alpha: alpha)
    }
}
And then with hitTest:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    guard UIColor.colorOfPoint(point: point, in: self).cgColor.alpha > 0 else { return nil }
    return super.hitTest(point, with: event)
}