Is there any way to cascade UIGestureRecognizers to detect a tap and then a drag?
For example, I want to detect when the user taps and then drags their finger around.
This would be similar to how drag works on trackpads.
So I want to detect a tap, and then have a UIPanGestureRecognizer send me continuous updates.
I want to use standard UIGesture classes to create this new gesture instead of creating my own using raw touches.
Is this even possible?
Although I haven't found the solution the way I expected, I found a better one.
Surprisingly, a single UILongPressGestureRecognizer is enough to implement tap and drag.
You have to:
set the numberOfTapsRequired to 1 to detect the initial tap.
set minimumPressDuration to something small so drags are detected quickly, without waiting.
e.g.:
UILongPressGestureRecognizer *mouseDrag = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleDrag:)];
mouseDrag.numberOfTapsRequired = 1;    // require the initial tap
mouseDrag.minimumPressDuration = 0.05; // begin the drag almost immediately after the tap
// clickLeft is a separate tap recognizer that should yield to the drag:
[clickLeft requireGestureRecognizerToFail:mouseDrag];
To handle the drag, check the recognizer's state in the action method and treat it as a continuous gesture.
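A minimal sketch of that action method (handleDrag: is the selector from the snippet above):
- (void)handleDrag:(UILongPressGestureRecognizer *)recognizer
{
    CGPoint point = [recognizer locationInView:self.view];
    switch (recognizer.state) {
        case UIGestureRecognizerStateBegan:
            // Finger is down after the initial tap: start the drag at point.
            break;
        case UIGestureRecognizerStateChanged:
            // Finger moved: update whatever you are dragging to point.
            break;
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
            // Finger lifted (or the gesture was interrupted): finish the drag.
            break;
        default:
            break;
    }
}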
I've attached a UITapGestureRecognizer to a UIView in my application. If the user double-taps it, the buttons inside that view get randomly re-arranged. Working fine, lovely.
However, the user can also trigger this by double-tapping one of the buttons themselves, or even by tapping two buttons on different parts of the screen.
Is there a sensible / easy way to have this double-tap only work if the two taps are within x number of pixels, and on the view itself, not any elements within it such as these UIButtons?
I think the usual way to do this is with the gestureRecognizer:shouldReceiveTouch: delegate method. Check out this question for a lengthy discussion and all the details.
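In short, assuming the view controller is the recognizer's delegate, a sketch would be:
// Refuse touches that land on a button, so only taps on the view itself count.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    return ![touch.view isKindOfClass:[UIButton class]];
}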
One way to do this would be to attach a single tap gesture recognizer to the buttons -- this will preempt the button's normal touch events, so you would have to put the button's action method in the gesture recognizer's action method. Then, you would add a dependency to the double tapper to have it only fire if the single tapper fails:
[self.doubleTapper requireGestureRecognizerToFail:self.tapper];
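For reference, a fuller sketch of that setup, with hypothetical names (tapper, doubleTapper, buttonTapped:, doubleTapped:); buttonTapped: should call the code that used to be the button's action:
self.tapper = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(buttonTapped:)];
[button addGestureRecognizer:self.tapper]; // preempts the button's normal touch handling

self.doubleTapper = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doubleTapped:)];
self.doubleTapper.numberOfTapsRequired = 2;
[self.view addGestureRecognizer:self.doubleTapper];

// The double tap fires only if the single tap on a button fails first.
[self.doubleTapper requireGestureRecognizerToFail:self.tapper];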
I have a simple swipe gesture that I want to be very low priority. I want it to be cancelled by events that happen for controls on the view. At first, I thought this would be simple. It's easy to cancel events when a gesture happens, but I can't seem to do the opposite.
My solution is to cancel the gesture if it conflicts with any thing that is touchable. Here is the code I hacked together:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint touch = [gestureRecognizer locationInView:self.view];
    // Only begin if the touch landed on the view itself, not on a subview.
    return [self.view hitTest:touch withEvent:nil] == self.view;
}
I feel this is the wrong solution to the problem. What am I missing? What is the right way to get events to cancel gestures?
For more context, I have two UISwipeGestureRecognizers (swipe left and swipe right) added to the view. There is also a UISlider in the view (part of an embedded MPVolumeView). When I moved the slider to change the volume, the left or right swipe would fire.
This is the correct way to do what you want. You are telling the gesture recognizer that it should only begin if the touch is directly in the view, not any subviews (according to hitTest: which is good because it allows the views to decide if they are hit or not).
It's always better to prevent it from starting rather than trying to cancel it afterwards. However, if you do want to cancel the gesture after it has started, set enabled = NO and then back to YES again.
If you need to allow the gesture for some subviews but not controls you can test if the view returned by hitTest: is a subclass of UIControl (for example) using isKindOfClass:.
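For example, a variant of the asker's gestureRecognizerShouldBegin: that allows subviews but still refuses any UIControl:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint touch = [gestureRecognizer locationInView:self.view];
    UIView *hitView = [self.view hitTest:touch withEvent:nil];
    // Allow the swipe everywhere except over controls (slider, button, etc.).
    return ![hitView isKindOfClass:[UIControl class]];
}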
I don't know what type of gesture this is for, but usually you don't need to do this: UIKit automatically finds the deepest view that wants the touches, and that view will 'eat' them so outer gesture recognizers won't get them. However, I can imagine this doesn't hold true for some combinations of recognizer and control.
I have a menu which I'd like to have automatically hide if it's inactive after a certain amount of time. This menu is composed of a hierarchy of UIViewControllers, which present various different views.
I'm thinking along the lines of running a timer, which invalidates and starts over whenever there's a touch.
Is it possible to catch all touch events in a set of UIViews? Perhaps just keep a boolean lying around and use the main UIWindow to catch touch events?
EDIT:
My app is a kiosk app of sorts, with a main screen and a menu. When the menu is up, I want it to run an auto dismiss timer, which resets after any touch in the entire menu screen. The menu is displayed over the entire screen, modally.
One way to be sure is to subclass UIApplication and override its - (void)sendEvent:(UIEvent *)event method. Every touch event happening in your app goes through this method, so you can check whether the UIEvent's type is UIEventTypeTouches and reset the timer.
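A minimal sketch of that subclass; the notification name is my own placeholder, post or call whatever resets your menu's timer:
@interface TimeoutApplication : UIApplication
@end

@implementation TimeoutApplication
- (void)sendEvent:(UIEvent *)event
{
    [super sendEvent:event]; // always forward, or the app stops responding
    if (event.type == UIEventTypeTouches) {
        // Any touch anywhere in the app lands here; reset the idle timer.
        [[NSNotificationCenter defaultCenter] postNotificationName:@"AppTouchActivity" object:nil];
    }
}
@end
Remember to pass the subclass name as the third argument to UIApplicationMain in main.m so UIKit actually instantiates it.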
Another way to do this simply involves adding a transparent layer over the whole user-accessible UI and overriding hitTest:withEvent:.
You can have an invisible view on top of your modal view controllers and give it either a gesture recognizer that restarts the timer, or a
-touchesBegan:withEvent:
override that does the same and then forwards the touches to the nextResponder.
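A sketch of that overlay, assuming a TouchCatcherView of my own naming with a block the menu controller sets to restart its timer:
@interface TouchCatcherView : UIView
@property (nonatomic, copy) void (^onTouch)(void);
@end

@implementation TouchCatcherView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.onTouch) self.onTouch(); // restart the auto-dismiss timer
    // Forward the touches so the views underneath still respond.
    [self.nextResponder touchesBegan:touches withEvent:event];
}
@end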
I am having a hard time finding a good example of drag and drop on iOS. I do have working code that does drag and drop, but it is not working as expected: the drag does not happen immediately after the touch and drag. I have to touch the image, load a dragImage into the main VC, and then re-touch and drag the dragImage for it to work.
Somehow the newly added UIImageView (dragImage) does not pick up touch events when it did not receive the initial touchesBegan event.
If you're using touchesBegan as your way of handling drag/drop that might be the first stumbling block. It's probably going to be much easier for you to use the various UIGestureRecognizers that are provided in iOS 4.0 and greater (unless you have a specific reason for supporting iOS 3).
With gesture recognizers you can simply attach one to your UIImageView and have the view change its center point to where the touch is whenever your recognizer is triggered. If you're not keen on using gesture recognizers perhaps you could provide some of your existing code so we can get a feeling for what's going on here.
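For example, a pan-recognizer version of a basic drag; imageView is assumed to be the view you want to move:
// Somewhere in setup, e.g. viewDidLoad:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
imageView.userInteractionEnabled = YES; // UIImageView defaults to NO
[imageView addGestureRecognizer:pan];

// The action method just moves the view's center under the finger:
- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    recognizer.view.center = [recognizer locationInView:recognizer.view.superview];
}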
I have the problem that in Notification Center widgets touch events are not being registered. Let's say I have a simple widget with a view (_view) and a UIButton with target:self forEvent:touchDown. If I then press the button on my device, nothing happens. I need to hold it for a short period of time; only then does the "touch" (more like a hold) get recognized and the button's action fire. I've seen widgets where touch events work fine (UISettings, SBSettings 5). What do I need to modify in order for it to behave like a "normal" UIView?
I ended up figuring it out myself. I just added a UITapGestureRecognizer to the UIButton. The gesture's selector gets called immediately when the screen is touched and doesn't have the annoying "delay" effect of the UI objects. I have used it with three UI objects so far: UIButton, UIBarButtonItem, and UISegmentedControl. For the segmented control, simply detect the x-coordinate of the touch and then select the relevant segment. It should also work with UISlider, UISwitch, etc. The only object that isn't working for me is UITextField: the clear button on it isn't responding to the clear selector, so I wanted to add a gesture for that too, without success.
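As a sketch of that segmented-control trick, assuming equal-width segments and a tap recognizer attached to the control itself:
- (void)handleSegmentTap:(UITapGestureRecognizer *)recognizer
{
    UISegmentedControl *control = (UISegmentedControl *)recognizer.view;
    CGFloat x = [recognizer locationInView:control].x;
    // Map the x-coordinate to a segment, assuming they are equally wide.
    CGFloat segmentWidth = control.bounds.size.width / control.numberOfSegments;
    NSInteger index = MIN((NSInteger)(x / segmentWidth), control.numberOfSegments - 1);
    control.selectedSegmentIndex = index;
    // Fire the control's normal value-changed actions by hand:
    [control sendActionsForControlEvents:UIControlEventValueChanged];
}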