Recognizing UIScrollView movement by pixel - objective-c

I need to change UIScrollView subviews according to their place on the screen, so that they will get smaller while moving up and bigger while moving down.
Is there any way to know the contentOffset with the change of every pixel?
I handle the scrollViewDidScroll: delegate method, but when the movement is fast there can be a jump of some 200 px between two calls.
Any ideas?

You have basically three approaches:
1. subclass UIScrollView and override touchesBegan:/touchesMoved:/touchesEnded:;
2. add your own UIPanGestureRecognizer to your current UIScrollView;
3. set a timer, and each time it fires, update your view by reading _scrollView.contentOffset.
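For the third option, a CADisplayLink is usually a better fit than a plain NSTimer, since it fires once per display refresh and therefore tracks the screen as finely as the hardware allows. A minimal sketch, assuming hypothetical _displayLink and _scrollView ivars (the names are placeholders):

```objc
//-- requires QuartzCore (CADisplayLink); ivar names are placeholders
- (void)startWatchingScrollView {
    _displayLink = [CADisplayLink displayLinkWithTarget:self
                                               selector:@selector(tick:)];
    [_displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                       forMode:NSRunLoopCommonModes];
}

- (void)tick:(CADisplayLink *)link {
    //-- read the offset once per frame and resize the subviews accordingly
    CGFloat offset = _scrollView.contentOffset.y;
    //-- <update subview transforms based on offset here>
}
```

Remember to invalidate the display link when the view goes away, or it will retain its target.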
In the first case, the touch handling methods would look like this (shown for touchesBegan:):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    _initialLocation = [touch locationInView:self];
    _initialTime = touch.timestamp;
    // <more processing here>
    //-- this forwards the touch so the scroll view behaves as if your own logic were not there
    [super touchesBegan:touches withEvent:event];
}
I am pretty sure you need to do the same for all of touchesBegan:, touchesMoved:, and touchesEnded:. Also think about touchesCancelled:.
In the second case, you would do something like:
//-- add the gesture recognizer to the scroll view somewhere
UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panView:)];
panRecognizer.delegate = self;
[scrollView addGestureRecognizer:panRecognizer];
//-- define this delegate method inside the same class to make both your gesture
//-- recognizer and UIScrollView's own work together
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
The third approach is trivial to implement, though I am not sure it will give better results than the other two.

Related

Cocos2d v3 CCButton conflicting with UIGestureRecognizers

I've added a UIGestureRecognizer to a class that inherits from NSObject, and I'm handling the gestures with the UIGestureRecognizers. Now I need to add an upper layer with some CCButtons on it, so my hierarchy would be pretty much this:
- CCScene
-- CCNode added to the scene (with the gesture recognizers)
-- another CCNode added with a few CCButton
The UIGestureRecognizers are working well, but the CCButton is never called because the touch is handled by the gestures; if I remove the gestures, the CCButton is called.
They are in conflict and I don't know why. I've read that CCNode touches are swallowed starting from the first node in the hierarchy (where the first has the highest z-order, I think) down to the last. Since both my CCNodes are added to the scene, and the second CCNode (added with a higher z-order) comes first, I would think it should get the touch BEFORE the UIGestureRecognizer of the other one...
What am I missing?
here's how I've added the gestures:
UITapGestureRecognizer *myGesture = [[UITapGestureRecognizer alloc] initWithTarget:self.myClassImplementation action:@selector(gesture:)];
[myGesture setDelegate:self.myClassImplementation];
[[[CCDirector sharedDirector] view] addGestureRecognizer:myGesture];
I tried simply adding the CCButton directly to the CCScene, but it isn't called either. Can someone explain why they are in conflict, so I can find a way to make it work?
Gesture recognizers get all the standard responder touch events first. So you should handle this either in the gesture recognizer's delegate method or in the gesture action.
Here is how you can do it in delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    CGPoint touchLocation = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    CCResponderManager *responder = [[CCDirector sharedDirector] responderManager];
    CCNode *node = [responder nodeAtPoint:touchLocation];
    if ([node isKindOfClass:[CCButton class]])
    {
        NSLog(@"button");
        return NO;
    }
    else
    {
        NSLog(@"not button");
        return YES;
    }
}

How to identify a button touch event from other touch events

I have an app in which the user can interact with many objects. These are several UIButtons, a few UILabels and a lot of UIImageViews.
The focus of all interaction centers on touching the UIImageView objects. With a touch I can move the images around and tell them to do this or that. However, my current obstacle is knowing how to have the app properly distinguish touches that occur on a UIButton.
Why? The logic in my touchesBegan: handler is meant only for the UIImageViews; however, the moment I touch a button, or any other object, the app interprets the touch as if it occurred on a UIImageView object.
So my question boils down to: is there a good way of identifying whether a touch occurred on a UIButton, UIImageView, or UILabel object? That way I can filter the irrelevant touches in my app from the relevant ones.
EDIT:
The code below outlines how I capture the touch event; however, I do not know how to tell from the touch event whether I touched a button or a view.
touch = [touches anyObject];
touchLocation = [touch locationInView:[self view]];
To know whether a UIButton was pressed, follow this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    UIView *touchedView = [touch view];
    if ([touchedView isMemberOfClass:[UIButton class]])
    {
        //-- do something when the button is touched
    }
}
Call hitTest:withEvent: on the view with the touch event to get the view that is actually being touched. Then you can use isKindOfClass: to check what type of view it is and respond accordingly.
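A sketch of how that might look from inside touchesBegan: in a view controller (hypothetical; it assumes the relevant subviews all belong to self.view):

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    //-- hitTest:withEvent: returns the deepest subview containing the point
    UIView *hitView = [self.view hitTest:point withEvent:event];
    if ([hitView isKindOfClass:[UIImageView class]]) {
        //-- run the image-dragging logic only when an image view was hit
    }
}
```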
You can use this method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if ([self pointInside:point withEvent:event]) {
        //-- the touch is inside this view
    }
    return [super hitTest:point withEvent:event];
}
wain has already given you the explanation for it:
https://stackoverflow.com/a/18051856/1865424

Get TouchEvent to the subview of my scrollview iOS SDK [duplicate]

I am trying to solve a basic problem with drag and drop on iPhone. Here's my setup:
I have a UIScrollView which has one large content subview (I'm able to scroll and zoom it)
Content subview has several small tiles as subviews that should be dragged around inside it.
My UIScrollView subclass has this method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *tile = [contentView pointInsideTiles:[self convertPoint:point toView:contentView] withEvent:event];
    if (tile) {
        return tile;
    } else {
        return [super hitTest:point withEvent:event];
    }
}
Content subview has this method:
- (UIView *)pointInsideTiles:(CGPoint)point withEvent:(UIEvent *)event {
    for (TileView *tile in tiles) {
        if ([tile pointInside:[self convertPoint:point toView:tile] withEvent:event])
            return tile;
    }
    return nil;
}
And tile view has this method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.superview];
    self.center = location;
}
This works, but not fully correctly: the tile sometimes "falls down" during the drag. More precisely, it stops receiving touchesMoved: invocations, and the scroll view starts scrolling instead. I noticed that this depends on the drag speed: the faster I drag, the sooner the tile "falls".
Any ideas on how to keep the tile glued to the dragging finger?
I was struggling with this same problem - I was trying to build an interface with a lot of "cards" (UIView subclasses) on a cork board, with the cork board area scrollable but the cards still drag-and-droppable. I was using the hitTest: solution above, but one of the Apple engineers asked me why I was doing it that way. The simpler solution they suggested was as follows:
1) In the UIScrollView, set canCancelContentTouches to NO - this tells the UIScrollView to allow touches within subviews (or, in this case, subviews of subviews).
2) In my "card" class, set exclusiveTouch to YES - this tells the subview that it owns the touches inside it.
After this, I was able to drag the cards around and still scroll. It's a lot simpler and cleaner than the hitTest: solution above.
(BTW, for extra credit: if you are targeting iOS 3.2/4.0 or later, use the UIPanGestureRecognizer class to handle the drag-and-drop logic - the motion is a lot smoother than overriding touchesBegan:/touchesMoved:/touchesEnded:.)
Solved: it turned out that the tile also needed touchesBegan: and touchesEnded: implementations (in my case, empty methods were enough); otherwise the gesture started propagating to the parent views, which intercepted it somehow. The dependency on the drag speed was imaginary.
First, set:
scrollView.canCancelContentTouches = NO;
yourSubView.exclusiveTouch = YES;
Then in your subview gesture handle function,
- (void)handleSubviewMove:(UIPanGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        if (_parentScrollView != nil) {
            _parentScrollView.scrollEnabled = NO;
        }
    }
    if (gesture.state == UIGestureRecognizerStateEnded) {
        if (_parentScrollView != nil) {
            _parentScrollView.scrollEnabled = YES;
        }
    }
    CGPoint translation = [gesture translationInView:[self superview]];
    /* handle your view's movement here. */
    [gesture setTranslation:CGPointZero inView:[self superview]];
}
Based on the code you've shared, it looks like your touchesMoved: method will only be called for gestures within the tile. At each touch, you move the tile to be centered on that touch, so slow movements will each give an update within the tile -- and the tile will "catch up" with the fingertip -- before the gesture exits the tile. When a gesture is faster, however, the (x,y) touchesMoved events will be farther apart, and you'll lose the gesture when one (x,y) point is far enough away from the last one that it is outside of the tile already.
You can work around this by capturing the movements in a superview large enough to cover the whole draggable area, and controlling the movement of the tile from within that superview.
By the way, is there a reason you're overriding the hitTest: method? It might be easier (and possibly more efficient?) to use the built-in implementation.

How to replace TouchesBegan with UIGestureRecognizer

here's the problem:
I'd like to move to using UIGestureRecognizer in my Apps.
For this reason I'd like to ditch TouchBegan/TouchEnded event's from my views.
However, I don't understand how to detect when the touch begins (the user puts a finger on the screen) with UIGestureRecognizers.
The simplest one is UITapGestureRecognizer, but its associated selector fires only when the tap gesture is completed (well... that makes complete sense, of course). Still, the problem remains: how can I stop using touchesBegan: and get that event anyway from a UIGestureRecognizer?
Thanks!
Here is an example:
//-- pan gesture
UIPanGestureRecognizer *recognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
recognizer.minimumNumberOfTouches = 3; //-- number of fingers
recognizer.delegate = self;
[self.view addGestureRecognizer:recognizer];
[recognizer release]; //-- not needed under ARC
- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan)
    {
        //-- do something
    }
    else if (recognizer.state == UIGestureRecognizerStateEnded)
    {
        //-- do something
    }
}
Also adopt UIGestureRecognizerDelegate in the .h file. You may need to set self.view.userInteractionEnabled = YES depending on the view you're using; e.g., for a UIImageView you need to set userInteractionEnabled = YES, since the default is NO.
For what you are trying to do, you can't. The gesture recognizers are for high-level gestures, so they behave the same across all apps (think swipes, the timing required for a double tap, etc.). For low-level control, and to do things the recognizers can't, you will still have to implement logic in touchesBegan:, touchesEnded:, etc.
Why not implement your own touchesBegan in a UIGestureRecognizer subclass -- intercept the message, extract the information you'd like, and then pass the message along to super's touchesBegan?
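A sketch of such a subclass (the class name is made up; note that overriding the touch methods requires importing UIGestureRecognizerSubclass.h):

```objc
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h> //-- needed to set self.state

@interface TouchDownGestureRecognizer : UIGestureRecognizer
@end

@implementation TouchDownGestureRecognizer
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    //-- extract whatever you need from the touches here, then recognize
    //-- immediately so the action fires on touch-down rather than touch-up
    self.state = UIGestureRecognizerStateRecognized;
}
@end
```

Attached to a view like any other recognizer, its action message fires as soon as a finger lands on the screen.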

UIGestureRecognizer blocking table view scrolling

I'm using a custom UIGestureRecognizer subclass to track gestures on my InfoView class. The InfoView class is a subview of a custom UITableViewCell subclass called InfoCell.
I've added my gesture recognizer to my root view (the parent view of everything else on screen, because the purpose of my custom gesture recognizer is to allow dragging of InfoCell views between tables). Now, everything works as it should except one thing. I'm using the following code in my UIGestureRecognizer subclass to detect touches on the InfoView view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UIView *touchView = [[touches anyObject] view];
    if ([touchView isKindOfClass:[InfoView class]]) {
        // Do stuff
    }
}
The problem here is that the touches on the InfoView object are being intercepted, therefore they are not being forwarded to the UITableView which contains the InfoCell, which is the parent view of the InfoView. This means that I can no longer scroll the table view by dragging on the InfoView view, which is an issue because the InfoView covers the entire InfoCell.
Is there any way I can forward the touches onto the table view so that it can scroll? I've tried a bunch of things already:
[super touchesBegan:touches withEvent:event];
[touchView.superview.superview touchesBegan:touches withEvent:event]; (touchView.superview.superview gets a reference to its parent UITableView)
But nothing has worked so far. Also, the cancelsTouchesInView property of my UIGestureRecognizer is set to NO, so that's not interfering with the touches.
Help is appreciated. Thanks!
Check out the UIGestureRecognizerDelegate method gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:.
If this returns YES it will prevent your gesture recognizer from stomping on the one that UIScrollView is using to detect scrolling.
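Implemented in your recognizer's delegate, that is simply:

```objc
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES; //-- let both recognizers track the same touches
}
```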
UIGestureRecognizer has a property "cancelsTouchesInView" which is set to YES by default. This means that touches in a UIView are cancelled when a gesture is recognized. Try to set it to NO to allow the UIScrollView to receive further touch events.
I had a line in my touchesBegan method that set the state property of the gesture recognizer to UIGestureRecognizerStateBegan. Removing this line seems to fix the problem.
You can try adding this delegate method:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer class] == [UIPanGestureRecognizer class]) {
        UIPanGestureRecognizer *panGestureRec = (UIPanGestureRecognizer *)gestureRecognizer;
        CGPoint point = [panGestureRec velocityInView:self];
        if (fabsf(point.x) > fabsf(point.y)) {
            return YES;
        }
    }
    return NO;
}