Debugging touchesEnded NOT being called when swiping in 1 direction only - objective-c

Totally stumped debugging a simple swipe movement in my application. When I swipe to the left, I correctly get a touchesEnded call.
BUT, if I drag in the other direction, touchesEnded does NOT get called.
Anyone have a clue as to why this might be?
I'm using a TapDetectingView as the view for my view controller class, and user interaction and multiple touch are both enabled for this view in IB.

I had some issues too using gesture recognizers and touchesEnded on a single view.
My problem was that, using gesture recognizers alone, I could detect either swipes OR taps, but not both.
What I ended up doing was using UISwipeGestureRecognizers to detect swipes and the touchesEnded method for taps.
Using UITapGestureRecognizers and UISwipeGestureRecognizers on the same view didn't work for me.
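A rough sketch of that split (the handleSwipe: selector and the logging are mine, not the original answerer's code):

- (void)viewDidLoad {
    [super viewDidLoad];

    // One swipe recognizer per direction you care about.
    UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
    [self.view addGestureRecognizer:swipeLeft];

    UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
    [self.view addGestureRecognizer:swipeRight];
}

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    NSLog(@"Swipe detected, direction: %lu", (unsigned long)recognizer.direction);
}

// Plain taps still reach the responder chain because no tap recognizer competes for them.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    NSLog(@"Tap ended at %@", NSStringFromCGPoint([touch locationInView:self.view]));
}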

If you're trying to capture touchesEnded on a swipe gesture, try checking for touchesCancelled. I handle my swipe ends via touchesCancelled and then call touchesEnded from within that method. For some reason touchesEnded is only called for single taps and long presses, not for swipes. The nice thing is that you still get a timestamp and coordinates to use for figuring out acceleration.
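A minimal sketch of that forwarding, assuming a swipe recognizer elsewhere is what cancels the touch sequence:

// The swipe recognizer claims the gesture, so the view gets touchesCancelled
// instead of touchesEnded; forward it so the end-of-swipe logic still runs.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    NSLog(@"Swipe cancelled at %@, timestamp %.3f",
          NSStringFromCGPoint([touch locationInView:self.view]), touch.timestamp);
    [self touchesEnded:touches withEvent:event];
}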

Related

Handle tap event by subview of UIScrollView while scrolling

I have a custom UIScrollView subclass with some content views inside. Some of them have a UITapGestureRecognizer. Everything works fine when the scroll view is not scrolling, but while it is scrolling the content views do not receive the tap action. What is the simplest way to handle a tap on a subview while the scroll view is scrolling?
Details:
MyScrollView scrolls horizontally and contains many content views (MyContentView). Each MyContentView is about one third of MyScrollView's width, so roughly 3-4 MyContentView elements are visible at any moment. The main behavior of MyScrollView is 1) to make sure that after scrolling, one of the MyContentView elements ends up at the center of the screen, and 2) to scroll a MyContentView to the center if the user taps on it. So the main answer I hope to get is how to "properly" handle the tap action in MyContentView while MyScrollView is decelerating.
I found some similar questions and answers, but none of them satisfied me. The best was to implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: in the UITapGestureRecognizer's delegate. But with that approach I sometimes get both the tap and the scroll events (when I tap, make a tiny drag, and release my finger so the tap is still recognizable; let's call it a quasi-tap), and that leads to bugs for me even if the scroll view is not scrolling when I begin the tap. When the user makes a quasi-tap, my application tries to scroll to the tapped MyContentView element and then immediately handles the normal scrolling. It gets even worse because some other functionality starts to run after the tap is handled (and it must not run during normal scrolling).
I need a solution where the scroll view waits long enough to decide that this is not a tap event and only then scrolls. Conversely, if a tap has been recognized, the scroll must not happen.
You can go with custom delegate methods as well, using @protocol. Implement those delegate methods in the view controller where your UIScrollView has been added.
For example, in MyContentView:
In the touchesBegan method:
[self.delegate contentViewTapped:self];
Now, in the ContainerView class where the scroll view is added, implement that method:
- (void)contentViewTapped:(MyContentView *)myContentView {
    NSLog(@"ContentView no: %ld", (long)myContentView.tag); // assumes a tag was set when adding this view to the scroll view
}
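For completeness, a minimal version of the protocol declaration might look like this (the protocol and property names are assumptions; only contentViewTapped: appears in the snippets above):

// MyContentView.h
#import <UIKit/UIKit.h>

@class MyContentView;

@protocol MyContentViewDelegate <NSObject>
- (void)contentViewTapped:(MyContentView *)myContentView;
@end

@interface MyContentView : UIView
@property (nonatomic, weak) id<MyContentViewDelegate> delegate;
@end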
Go through the examples for @protocol.
Hope this is what you required.
Enjoy Coding :)
This is built into UIScrollView - take a look at the delaysContentTouches and canCancelContentTouches properties. This should alleviate the problem when dragging a small bit after a tap.
This is all system built-in behaviour. I would suggest sticking with what Apple has provided for the feel of your interface (how it reacts to small drags, for instance) so that your app doesn't feel out of place on a user's phone.
EDIT:
Alternatively, you could disable scrolling of your scroll view in your gesture recognizer and re-enable it once the gesture has ended or been cancelled.
Further Edit:
I don't understand - I've created a sample project that illustrates how to intercept touches in a subview of a scroll view using gesture recognizer delegate methods. Pay close attention to the "Cancellable Content Touches" and "Delays Content Touches" properties of the scroll view. They're both YES for very important reasons.
Your scroll view should be delaying content touches until it has determined whether the user is attempting a tap, a quasi-tap (as you put it), or a pan for the scroll view. Apple has already written the functionality you're trying to build; UIScrollView will already do what you want.
The problem is that the system doesn't want a scroll view's subviews intercepting tap events while the scroll view is scrolling. To this end, it cancels touch events if it determines that the user is actually trying to pan. "Cancellable Content Touches" enables this behaviour. Ensure it and "Delays Content Touches" are turned on and you should be fine.
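Done in code rather than in IB, the relevant configuration is roughly this (the scrollView and contentView property names are assumptions):

// Let the scroll view hold touches briefly so it can tell a tap from a pan,
// and let it cancel the tap once a real drag starts.
self.scrollView.delaysContentTouches = YES;    // "Delays Content Touches" in IB
self.scrollView.canCancelContentTouches = YES; // "Cancellable Content Touches" in IB

// A plain tap recognizer on each content view; the scroll view's pan gesture
// cancels it automatically when the user actually scrolls.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleContentTap:)];
[self.contentView addGestureRecognizer:tap];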

Drag and drop a UIImageView that is on a UIScrollView

I am having a hard time finding a good example of drag and drop on iOS. I do have working code that does drag and drop, but it does not behave as expected. The drag does not happen immediately after I touch and drag. I have to touch the image, load a dragImage into the main VC, and then touch again and drag the dragImage for it to work.
Somehow the newly added UIImageView (dragImage) does not receive or pick up touch events when it did not receive the initial touchesBegan event.
If you're using touchesBegan as your way of handling drag and drop, that might be the first stumbling block. It's probably going to be much easier for you to use the various UIGestureRecognizers provided in iOS 4.0 and greater (unless you have a specific reason for supporting iOS 3).
With gesture recognizers you can simply attach one to your UIImageView and have the view change its center point to wherever the touch is each time the recognizer fires. If you're not keen on using gesture recognizers, perhaps you could provide some of your existing code so we can get a feeling for what's going on here.
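For example, a sketch with a pan recognizer (dragImageView is an assumed property name):

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
    self.dragImageView.userInteractionEnabled = YES; // UIImageView has this off by default
    [self.dragImageView addGestureRecognizer:pan];
}

// Move the view by the pan's translation, then reset the translation so the
// next callback reports only the incremental movement.
- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:recognizer.view.superview];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointZero inView:recognizer.view.superview];
}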

UIButton firing selector inconsistently

I have a UIButton linked up in IB correctly (I believe). The button fires inconsistently: every time I reload the view to show updated info, the button sometimes works and sometimes does not. It gives no errors. I can't find a pattern to when it works and when it doesn't; the same code runs every time I open the view, and it still works only when it wants to. Besides linking it in IB, I have also tried calling addTarget in viewDidLoad and removing the IB connection, but it still has the same inconsistency:
[_buttonScreen addTarget:self action:@selector(buttonScreenClicked) forControlEvents:UIControlEventTouchUpInside];
If I add NSLog(@"Clicked"); to buttonScreenClicked, I see that the method doesn't always get called. What would cause it to do this? I have made sure that I set:
[_buttonScreen setAlpha:0.1];
[_buttonScreen setHidden:NO];
[_buttonScreen setUserInteractionEnabled:YES];
I have no image, text, or color in the button, but it still works sometimes.
I'm using AFKPageFlipper on the same view, but it had the same problem before I added AFKPageFlipper, so I don't think it's that.
If anyone could point me in any direction to start troubleshooting this problem, I would appreciate it.
Thanks
I just had the same problem and worked it out. The 5 seconds is the clue.
Somewhere you have a gesture recognizer covering the same space as your button. More specifically, you have a gesture recognizer that is eating your taps but not your long presses. If you just tap the button, the gesture recognizer runs off with your event; hold your finger down long enough and the gesture recognizer no longer considers it a tap, so the event is passed through to your button.
Instrument your tap gesture recognizer handlers and the problem should pop out at you.
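If that turns out to be the case, one common fix (a sketch, assuming you control the tap recognizer's delegate) is to stop the recognizer from receiving touches that land on the button:

// UIGestureRecognizerDelegate: let touches on UIButtons bypass the tap recognizer
// so the button's UIControlEventTouchUpInside fires normally.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if ([touch.view isKindOfClass:[UIButton class]]) {
        return NO;
    }
    return YES;
}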
Make sure you don't have any other UIView descendants overlaying the button (like a transparent UIScrollView) as these will intercept the touch events first.
Also make sure that the containing view (the view the button is in) is correctly sized. By default you can place a view outside the bounds of another view, and since clipsToBounds defaults to NO you will see it but not be able to interact with it.
Things to try:
Do you have any other actions on the button?
Do you have any other UIViews which could possibly be intercepting the touches (above or below, or not shown)?
Also, please check that you have only one UIViewController instance for this screen. Other issues may arise because of that.
What happens if you don't set the alpha level?
Do you release the object properly, and only in dealloc?
Hope this helps

Using hitTest logic only for touchesBegan and NOT gesture recognizers

I have been developing a simple game for iOS which involves dragging and using rotation (and other) gesture recognizers. Dragging is implemented through touchesBegan/Moved/Ended, and rotation through a gesture recognizer.
The views are irregularly shaped and their borders sometimes overlap, so I implemented Ole Begemann's UIImage+ColorAtPixel in my picture view and overrode pointInside:withEvent: in the main element view. pointInside: invokes the method in the picture view, which checks the alpha at the touch point and returns NO if a transparent section has been touched. Essentially, hitTest ignores this branch.
But the side effect is that hitTest ignores all touches on the transparent section, and the rotation recognizer only works in the non-transparent zone. For some views, which are too small, it becomes impossible to use the rotation gesture :(
Is there any way to avoid this problem and use the hitTest logic only for touchesBegan? I tried to work out a solution, but it seems that hitTest runs strictly before any touch handling.
Checking the transparency in touchesBegan works, but when you touch a transparent section that overlaps the non-transparent section of another view, the latter doesn't receive the touch.
I just can't figure out the trick...
Thank you in advance for any help!
I would make the dragging use a UIPanGestureRecognizer, so that you can implement the delegate method -gestureRecognizer:shouldReceiveTouch: to return NO when your pan recognizer is being offered touches in the transparent area. Leave it unimplemented (or return YES) for your rotation recognizer so it receives everything.
In addition, using gesture recognizers for both kinds of actions has other benefits, like the ability to specify dependencies with -requireGestureRecognizerToFail:.
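A sketch of that delegate check, assuming a pointIsTransparent: helper built on the ColorAtPixel category, plus panRecognizer and pictureView properties from your setup:

// Only the pan (drag) recognizer is filtered by transparency; the rotation
// recognizer has no such check, so it still receives touches everywhere.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if (gestureRecognizer == self.panRecognizer) {
        CGPoint point = [touch locationInView:self.pictureView];
        return ![self.pictureView pointIsTransparent:point]; // hypothetical helper
    }
    return YES;
}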
Try checking whether the UIEvent parameter passed to pointInside:withEvent: when it comes from the gesture recognizer is different from the one passed when it is called from touchesBegan/Moved/Ended.
If it is different, then I guess that solves your problem.
Just put a breakpoint or an NSLog in pointInside: to inspect the event parameter in each case and see if you can differentiate.
Good Luck!

touchesBegan method is not being called

I am trying to detect touches, but the touchesBegan method is not being called.
In my view controller, I have implemented the touchesBegan method. My nib's File's Owner is set to the correct view controller. The nib itself consists of the view, with a scroll view and a tab bar. Nested in the scroll view is an image view, which has user interaction enabled. What is preventing touches from being registered, or preventing my implementation of touchesBegan from being called?
I've scoured the Internet and Apple docs, and I can't see what I am doing wrong. Also, I'm not really sure what code I can post here to help with my query. Thanks.
Okay, after a lot more reading, I've now got a scroll view and an image view, both created programmatically. The image view is a subview of the scroll view, and the scroll view has been subclassed so that its touchesEnded method can decide whether the touch was a single tap, in which case it calls the view controller's touchesEnded method; otherwise it calls super. This works just fine. However, why can't this be done without subclassing the scroll view? Is it my lack of understanding of how UIScrollView works, or is it just a limitation of it?
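For reference, the subclass described above amounts to roughly this (the class name and the single-tap test are mine):

#import <UIKit/UIKit.h>

// A scroll view that hands single taps up the responder chain (eventually
// reaching the view controller) and leaves everything else to normal scrolling.
@interface TapForwardingScrollView : UIScrollView
@end

@implementation TapForwardingScrollView

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch.tapCount == 1 && !self.dragging) {
        [self.nextResponder touchesEnded:touches withEvent:event];
    } else {
        [super touchesEnded:touches withEvent:event];
    }
}

@end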