I have been developing a simple game for iOS that involves dragging views and using rotation and other gesture recognizers. Dragging is implemented through touchesBegan/Moved/Ended, and rotation through a gesture recognizer.
The views are irregularly shaped and their borders sometimes overlap, so I implemented Ole Begemann's UIImage+ColorAtPixel in my picture view and overrode pointInside:withEvent: in the main element view. pointInside:withEvent: invokes the method in the picture view, which checks the alpha at the touch point and returns NO if a transparent section has been touched. Essentially, hitTest:withEvent: ignores this branch.
But the side effect is that hitTest:withEvent: ignores all touches on the transparent section, so the rotation recognizer only works in the non-transparent zone. For some views that are too small, this makes the rotation gesture impossible to use :(
Is there any way to avoid this problem and apply the hitTest:withEvent: logic only to touchesBegan? I tried to work out a solution, but it seems that hitTest:withEvent: runs strictly before any touch handling.
Checking the transparency in touchesBegan works, but when you touch a transparent section that overlaps a non-transparent section of another view, the latter doesn't receive the touch.
I just can't figure out the trick...
Thank you in advance for any help!
I would make the dragging use a UIPanGestureRecognizer, so that you can implement the delegate method -gestureRecognizer:shouldReceiveTouch: to return NO when your pan recognizer is considering touches in the transparent area. Leave the method unimplemented for your rotation recognizer, or return YES, so it receives everything.
In addition, using gesture recognizers for both kinds of actions has other benefits, like the ability to specify dependencies with -requireGestureRecognizerToFail:.
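A rough sketch of that delegate method, assuming self.panRecognizer is the pan recognizer and a hypothetical isPointTransparent: helper that wraps the alpha check you already have:

```objc
// Sketch: only the pan recognizer filters out touches on transparent pixels;
// the rotation recognizer has no such check and receives everything.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch
{
    if (gestureRecognizer == self.panRecognizer) {
        CGPoint p = [touch locationInView:self.pictureView];
        // isPointTransparent: is a hypothetical wrapper around the
        // UIImage+ColorAtPixel alpha check
        return ![self.pictureView isPointTransparent:p];
    }
    return YES; // rotation (and anything else) still receives all touches
}
```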
Try to check whether the UIEvent parameter passed to pointInside:withEvent: when it comes from the gesture recognizer is different from the one passed when it is called from touchesBegan/Moved/Ended.
If it is different, then I guess this solves your problem.
Just put a breakpoint or an NSLog in pointInside:withEvent: to see the event parameter in each case and check whether you can differentiate them.
Good Luck!
I created a custom transition for a navigation controller where, as the user pans up, the next controller's view is revealed below while the current controller's view moves upward. I want that view to follow the touch (as if it were glued to the finger at the touch point), but I don't know how to pass that translation from the pan gesture recognizer to the object that implements UIViewControllerAnimatedTransitioning. Well, I do, but I cannot access it from inside the [UIView animateWithDuration:...] block (it seems that block is executed once; I thought it would be executed as the percentage of completion changes). How can I accomplish this?
To ask the question in a different way: if you use the Photos app in iOS 7, when you are looking at a photo, touch with two fingers and pinch/move, and you will see that the view follows the finger movements. Any example code for this?
You'll need to create a separate animation controller as a subclass of UIPercentDrivenInteractiveTransition to go along with your custom transition animation. This is the class that will calculate the percentage of how complete your animation is. There's too much to explain in a single SO answer, but have a look at the docs here. You can also refer to one of my implementations of a custom transition animation with interactive abilities here to see it in action.
Croberth's answer is correct. You actually have two choices.
If you want to keep your custom animation, then use a UIPercentDrivenInteractiveTransition and keep updating it as the gesture proceeds, as in this example of mine:
https://github.com/mattneub/Programming-iOS-Book-Examples/blob/master/bk2ch06p296customAnimation2/ch19p620customAnimation1/AppDelegate.m
However, I prefer to split the controller into two separate cases: if we are interactive (using a gesture), I just keep updating the view positions myself, manually, as the gesture proceeds, including completing or reversing it at the end, as in this code:
https://github.com/mattneub/Programming-iOS-Book-Examples/blob/master/bk2ch06p300customAnimation3/ch19p620customAnimation1/AppDelegate.m
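The percent-driven approach described above might be sketched roughly like this; nextController, self.interactionController, and the 0.5 completion threshold are illustrative assumptions, not taken from either linked example:

```objc
// Sketch: a pan gesture driving a UIPercentDrivenInteractiveTransition.
// self.interactionController is assumed to be the object you return from the
// navigation controller delegate's interactionControllerForAnimationController: method.
- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    CGFloat percent = -[pan translationInView:pan.view].y / pan.view.bounds.size.height;
    percent = MAX(0.0, MIN(1.0, percent));

    switch (pan.state) {
        case UIGestureRecognizerStateBegan:
            self.interactionController = [UIPercentDrivenInteractiveTransition new];
            // nextController is the hypothetical controller being revealed
            [self.navigationController pushViewController:nextController animated:YES];
            break;
        case UIGestureRecognizerStateChanged:
            // This is what makes the view track the finger: the animation's
            // progress is repeatedly set from the gesture's translation.
            [self.interactionController updateInteractiveTransition:percent];
            break;
        case UIGestureRecognizerStateEnded:
            if (percent > 0.5) [self.interactionController finishInteractiveTransition];
            else               [self.interactionController cancelInteractiveTransition];
            self.interactionController = nil;
            break;
        default:
            [self.interactionController cancelInteractiveTransition];
            self.interactionController = nil;
            break;
    }
}
```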
I have a custom UIScrollView subclass with some content views inside. Some of them have a UITapGestureRecognizer. Everything works fine when the scroll view is not scrolling, but while it is scrolling, the content views do not receive the tap action. What is the simplest way to handle a tap on a subview while the scroll view is scrolling?
Details:
MyScrollView scrolls horizontally. It contains a lot of content views (e.g. MyContentView). Each MyContentView is about one third of the MyScrollView width, so about 3-4 MyContentView elements are visible at a time. The main behavior of MyScrollView is to 1) make sure that after scrolling, one of the MyContentView elements ends up at the center of the screen, and 2) scroll to center a MyContentView if the user taps on it. So the main answer I hope to get is how to "properly" handle a tap on a MyContentView while MyScrollView is decelerating.
I found some similar questions and answers, but none of them satisfied me. The best was to implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: in the UITapGestureRecognizer delegate. But in that case I sometimes get both tap and scroll events (when I tap, make a tiny drag, and release my finger so the tap is still recognizable; let's call it a quasi-tap), which leads to bugs for me even if the scroll view is not scrolling when I begin the tap. When the user makes a quasi-tap, my application tries to scroll to the tapped MyContentView element and then immediately handles normal scrolling. It looks even worse because some other functionality starts to run after the tap is handled (functionality that must not run during normal scrolling).
I need a solution where the scroll view waits long enough to decide that it is not a tap event and only then scrolls. Conversely, if a tap event has been recognized, the scroll must not happen.
You can also go with custom delegate methods, using @protocol. Implement those delegate methods in the view controller where your UIScrollView has been added.
For example, in MyContentView's touchesBegan:withEvent: method:
[self.delegate contentViewTapped:self];
Now in ContainerView class where scroll view is added, implement that method:
- (void)contentViewTapped:(MyContentView *)myContentView {
    NSLog(@"ContentView no: %ld", (long)myContentView.tag); // if the tag was set when adding this view to the scroll view
}
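For completeness, the protocol to go with the snippet above might be declared roughly like this (names are illustrative):

```objc
// Illustrative declaration, e.g. in MyContentView.h
@class MyContentView;

@protocol MyContentViewDelegate <NSObject>
- (void)contentViewTapped:(MyContentView *)myContentView;
@end

@interface MyContentView : UIView
// weak to avoid a retain cycle between the container and its content views
@property (nonatomic, weak) id<MyContentViewDelegate> delegate;
@end
```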
Go through some examples of @protocol usage.
Hope this is what you required.
Enjoy Coding :)
This is built into UIScrollView - take a look at the delaysContentTouches and canCancelContentTouches properties. This should alleviate the problem when dragging a small bit after a tap.
This is all system built-in behaviour. I would suggest sticking with what Apple has provided for the feel of your interface (how it reacts to small drags, for instance) so that your app doesn't feel out of place on a user's phone.
EDIT:
Alternatively, you could disable scrolling of your scroll view in your gesture recognizer and re-enable it once the gesture has ended or been cancelled.
Further Edit:
I don't understand - I've created a sample project that illustrates how to intercept touches in a subview of a scroll view using gesture recognizer delegate methods. Pay close attention to the "Cancellable Content Touches" and "Delays Content Touches" properties of the scroll view. They're both YES for very important reasons.
Your scroll view should be delaying content touches until it has determined whether the user is attempting a tap, a quasi-tap (as you put it), or a pan for the scroll view. Apple has already written the functionality you're trying to build; UIScrollView will already do what you want.
The problem is that the system doesn't want a scroll view's subviews intercepting tap events while the scroll view is scrolling. To this end, it cancels touch events if it determines that the user is actually trying to pan. Setting "Delays Content Touches" enables this behaviour. Ensure it's turned on and you should be fine.
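The two Interface Builder checkboxes mentioned above correspond to UIScrollView properties that can just as well be set in code (assuming an existing scrollView reference):

```objc
// Sketch: the same scroll view configuration, set in code instead of
// Interface Builder.
self.scrollView.delaysContentTouches = YES;    // briefly hold touches to see if a drag starts
self.scrollView.canCancelContentTouches = YES; // cancel a subview's touches once panning begins
```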
I am having a hard time finding a good example of drag and drop on iOS. I do have working code that does drag and drop, but it is not behaving as expected. The drag does not happen immediately after I touch and drag. I have to touch the image, load a dragImage into the main VC, and then re-touch and drag the dragImage for it to work.
Somehow the newly added UIImageView (dragImage) does not receive touch events, or fails to pick them up when it did not receive the initial touchesBegan event.
If you're using touchesBegan as your way of handling drag/drop that might be the first stumbling block. It's probably going to be much easier for you to use the various UIGestureRecognizers that are provided in iOS 4.0 and greater (unless you have a specific reason for supporting iOS 3).
With gesture recognizers you can simply attach one to your UIImageView and have the view change its center point to wherever the touch is whenever your recognizer fires. If you're not keen on using gesture recognizers, perhaps you could provide some of your existing code so we can get a feel for what's going on here.
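A minimal sketch of that recognizer-based approach, assuming self.imageView is the view being dragged (note that UIImageView has user interaction disabled by default, which may also explain the missed touches described above):

```objc
// Sketch: attaching a pan recognizer so the image view follows the finger.
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.imageView.userInteractionEnabled = YES; // off by default on UIImageView
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleDrag:)];
    [self.imageView addGestureRecognizer:pan];
}

- (void)handleDrag:(UIPanGestureRecognizer *)pan
{
    // Move the view's center to wherever the finger currently is.
    pan.view.center = [pan locationInView:pan.view.superview];
}
```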
I am doing some drawing on a CALayer and want to be able to have the user single tap different parts of the drawing and trigger a response. I tried looking into gesture recognizers, and it seems that they need to be tied to a UIView. Any idea how I can get my desired behavior using CALayers?
You need a responder to be able to respond to touches. From the view that is hosting this layer (at some point in your tree this needs to be true), you can use -[CALayer hitTest:] to find the deepest sublayer under the touch point.
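A sketch of that idea, using a tap recognizer on the hosting view and resolving the touch to a sublayer (the handler name is illustrative):

```objc
// Sketch: a tap on the hosting view, resolved to a sublayer of the drawing.
- (void)handleTap:(UITapGestureRecognizer *)tap
{
    CGPoint p = [tap locationInView:self.view];
    // Note: hitTest: expects the point in the receiver's superlayer's
    // coordinate space; for a view's root layer, convert if the layer's
    // frame is not at the superlayer's origin.
    CALayer *hit = [self.view.layer hitTest:p]; // deepest sublayer containing p
    if (hit != nil && hit != self.view.layer) {
        // trigger the response for that part of the drawing
    }
}
```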
Totally stumped debugging a simple swipe movement in my application. When I swipe in a left direction, I correctly get a touchesEnded method call.
BUT, if I drag in the other direction, touchesEnded does NOT get called.
Anyone have a clue as to why this might be?
I'm using a TapDetectingView as the view for my view controller class, and user interaction and multiple touch are both enabled for this view in IB.
I had some issues too, using gesture recognizers and touchesEnded on a single view.
My problem was that I was only able to detect swipes OR touches when using gesture recognizers alone.
What I ended up doing was using UISwipeGestureRecognizers to detect swipes and the touchesEnded method for taps.
Using tap and swipe gesture recognizers on the same view didn't work.
If you're trying to capture touchesEnded for a swipe gesture, try checking for touchesCancelled instead. I handle my swipe ends via touchesCancelled and then call touchesEnded from within that method. For some reason touchesEnded is only called for single taps and long presses, not swipes. The nice thing is that you get a timestamp and coordinates to use for figuring out acceleration.
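That forwarding trick might look roughly like this in the view subclass (a sketch; when a swipe recognizer claims the gesture, the view's in-flight touches are cancelled rather than ended):

```objc
// Sketch: route cancelled touches (produced when a swipe recognizer wins)
// into the same end-of-touch handling as a normal touch-up.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    [self touchesEnded:touches withEvent:event]; // reuse the normal end-of-touch logic
}
```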