Enabling multi-touch... need help with storing the 2nd touch - cocoa-touch

I need to be able to store 2 touches, but I have no idea how to go about it...
This is how I'm handling a single touch:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Checks whether the screen has been touched, stores the touch's location,
    // and converts the coordinates so they are available for use.
    UITouch *myTouch = [touches anyObject];
    CGPoint locationLeft = [myTouch locationInView:[myTouch view]];
    locationLeft = [[CCDirector sharedDirector] convertToGL:locationLeft];
    // ...
}
How would I be able to store the 2nd touch?
Thanks in advance.

You should use ccTouchBegan (note the singular 'touch' instead of 'touches') when you need to handle multiple touches. (IMO people should just ditch ccTouchesBegan/Moved/Ended and use ccTouchBegan/Moved/Ended instead.)
Each of ccTouchBegan/Moved/Ended is called once per touch, which means you can easily differentiate between multiple touches. Example:
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:1 swallowsTouches:YES];
}

- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if (self.firstTouch == nil) {
        // we got the first touch
        self.firstTouch = touch;
    }
    else if (self.secondTouch == nil) {
        // we got the second touch
        self.secondTouch = touch;
    }
    // return YES to consume the touch (otherwise it'll cascade down the layers)
    return YES;
}

- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    if (touch == self.firstTouch) {
        // the first touch moved
        // do stuff
    }
    else if (touch == self.secondTouch) {
        // the second touch moved
        // do stuff
    }
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    if (touch == self.firstTouch) {
        // first touch ended, so remove both touches
        self.firstTouch = nil;
        self.secondTouch = nil;
    }
    else if (touch == self.secondTouch) {
        // second touch ended, so remove only the second touch
        self.secondTouch = nil;
    }
}
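The snippet above assumes firstTouch and secondTouch exist as properties on the layer. A minimal sketch of how they might be declared (the class name is illustrative, and assign is used because UITouch objects are owned by the system and shouldn't be retained by the layer):
// Assumed property declarations for the example above (class name is illustrative).
// UITouch objects belong to UIKit, so hold them with assign rather than retain.
@interface MultiTouchLayer : CCLayer

@property (nonatomic, assign) UITouch *firstTouch;
@property (nonatomic, assign) UITouch *secondTouch;

@end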

Have you tried iterating through the touches like this?
for (UITouch *touch in touches) {
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    location = [self convertToNodeSpace:location];
    // do something with each touch's location
}
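If you'd rather stay with ccTouchesBegan, another option is to pull the individual touches out of the set and hold on to them. A rough sketch, assuming multi-touch is enabled and the firstTouch/secondTouch properties shown earlier:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Keep references to up to two touches from the set
    // (firstTouch/secondTouch are the assumed properties shown earlier).
    for (UITouch *touch in touches) {
        if (self.firstTouch == nil) {
            self.firstTouch = touch;
        } else if (self.secondTouch == nil && touch != self.firstTouch) {
            self.secondTouch = touch;
        }
    }
}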

Related

Move two CCSprites in Cocos2d/spritebuilder

I have two sprites in my scene that I'd like to move in different directions. I tried the following method, which I found by searching around, but I can't get anything to work. This is the best I have to date.
If anyone can lend a hand, that would be great.
-(void)touchMoved:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint touchLocation = [touch locationInNode:self];
    if (CGRectContainsPoint([redLeash boundingBox], touchLocation))
    {
        redLeash.position = touchLocation;
    }
    else if (CGRectContainsPoint([blueLeash boundingBox], touchLocation))
    {
        blueLeash.position = touchLocation;
    }
}
Try this. I typed it up here, so there might be some syntax errors, but it should work.
-(void)touchMoved:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint touchLocation = [touch locationInNode:self];
    if (CGRectContainsPoint([redLeash boundingBox], touchLocation))
    {
        [redLeash setPosition:touchLocation];
    }
    else if (CGRectContainsPoint([blueLeash boundingBox], touchLocation))
    {
        [blueLeash setPosition:touchLocation];
    }
}

Observe touches on multiple views in one motion

I have a parent view with 3 separate child views. The child views are spread out within the parent with no overlap (and with some space in between). As a user moves her finger around the screen (without lifting it), I'd like to track touches as they enter and exit each of the child views.
Example: If the user begins touching somewhere on the screen outside of the child views, then swipes her finger over child 1, off of child 1, over child 2, and then lets go, I would expect these events to be triggered:
Touch began
Touch entered child 1
Touch exited child 1
Touch entered child 2
Touch ended
It seems as if touchesBegan:withEvent: and touchesEnded:withEvent: methods would be helpful in this case, but when I define them on the child view controllers, they don't do exactly what I want -- if the user begins touching outside the child view, then swipes over the child view, no touch events are triggered on the child itself.
Current Solution: I'm currently using a solution that feels really hacky to accomplish this. I'm observing touchesBegan:withEvent:, touchesEnded:withEvent:, and touchesMoved:withEvent: on the parent, grabbing the coordinates of each event, and determining if they lie within the bounds of a child. If they do, I trigger the appropriate events as described above.
This method mostly works, but feels very inefficient. It feels like the framework should handle this work for me. My state management code also sometimes misses an "enter" or "exit" trigger and I suspect it's because touch events were either dropped or came to me in an unexpected order. Am I missing a better method here?
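For what it's worth, a stripped-down sketch of that parent-side tracking approach (childViews, touchedChild, and the enter/exit helpers are illustrative assumptions, not code from the question):
// Sketch: track enter/exit by hit-testing child frames from the parent on every move.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];

    // Find which child (if any) currently contains the touch.
    UIView *hit = nil;
    for (UIView *child in self.childViews) {
        if (CGRectContainsPoint(child.frame, location)) {
            hit = child;
            break;
        }
    }

    // Fire enter/exit only when the containing child changes.
    if (hit != self.touchedChild) {
        if (self.touchedChild) { [self touchDidExitChild:self.touchedChild]; }
        if (hit)               { [self touchDidEnterChild:hit]; }
        self.touchedChild = hit;
    }
}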
The simplest solution would be something like:
- (void)viewDidLoad
{
    [super viewDidLoad];

    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
    [self.view addGestureRecognizer:pan];

    // Do any additional setup after loading the view.
}

- (void)pan:(UIPanGestureRecognizer *)sender
{
    static NSInteger startViewIndex;
    static NSInteger endViewIndex;

    CGPoint location = [sender locationInView:self.view];

    if (sender.state == UIGestureRecognizerStateBegan)
    {
        if (CGRectContainsPoint(self.view0.frame, location))
            startViewIndex = 0;
        else if (CGRectContainsPoint(self.view1.frame, location))
            startViewIndex = 1;
        else if (CGRectContainsPoint(self.view2.frame, location))
            startViewIndex = 2;
        else
            startViewIndex = -1;
    }
    else if (sender.state == UIGestureRecognizerStateEnded)
    {
        if (CGRectContainsPoint(self.view0.frame, location))
            endViewIndex = 0;
        else if (CGRectContainsPoint(self.view1.frame, location))
            endViewIndex = 1;
        else if (CGRectContainsPoint(self.view2.frame, location))
            endViewIndex = 2;
        else
            endViewIndex = -1;

        if (startViewIndex != -1 && endViewIndex != -1 && startViewIndex != endViewIndex)
        {
            // successfully moved between subviews!
            NSLog(@"Moved from %ld to %ld", (long)startViewIndex, (long)endViewIndex);
        }
    }
}
Perhaps a little more elegant would be to define your own custom gesture recognizer. That way, if you aren't dragging from one of your subviews, the gesture fails, which allows any other gesture recognizers you might have elsewhere to work (probably not an issue unless you're using multiple gesture recognizers). It also isolates the gory details of the gesture logic from the rest of your view controller:
@interface PanBetweenSubviewsGestureRecognizer : UIPanGestureRecognizer
{
    NSMutableArray *_arrayOfFrames;
}

@property NSInteger startingIndex;
@property NSInteger endingIndex;

@end

// Needed in the implementation file to assign self.state in a UIGestureRecognizer subclass.
#import <UIKit/UIGestureRecognizerSubclass.h>

@implementation PanBetweenSubviewsGestureRecognizer

@synthesize startingIndex = _startingIndex;
@synthesize endingIndex = _endingIndex;

- (void)dealloc
{
    _arrayOfFrames = nil;
}

- (id)initWithTarget:(id)target action:(SEL)action
{
    self = [super initWithTarget:target action:action];
    if (self)
    {
        _arrayOfFrames = [[NSMutableArray alloc] init];
    }
    return self;
}

- (void)addSubviewToArrayOfFrames:(UIView *)view
{
    [_arrayOfFrames addObject:[NSValue valueWithCGRect:view.frame]];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];

    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];

    for (NSInteger i = 0; i < [_arrayOfFrames count]; i++)
    {
        if (CGRectContainsPoint([[_arrayOfFrames objectAtIndex:i] CGRectValue], location))
        {
            self.startingIndex = i;
            return;
        }
    }

    self.startingIndex = -1;
    self.endingIndex = -1;
    self.state = UIGestureRecognizerStateCancelled;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];

    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];

    for (NSInteger i = 0; i < [_arrayOfFrames count]; i++)
    {
        if (CGRectContainsPoint([[_arrayOfFrames objectAtIndex:i] CGRectValue], location))
        {
            self.endingIndex = i;
            return;
        }
    }

    self.endingIndex = -1;
    self.state = UIGestureRecognizerStateCancelled;
}

@end
Which you could then use as follows:
- (void)viewDidLoad
{
    [super viewDidLoad];

    PanBetweenSubviewsGestureRecognizer *pan = [[PanBetweenSubviewsGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
    [pan addSubviewToArrayOfFrames:self.view0];
    [pan addSubviewToArrayOfFrames:self.view1];
    [pan addSubviewToArrayOfFrames:self.view2];
    [self.view addGestureRecognizer:pan];

    // Do any additional setup after loading the view.
}

- (void)pan:(PanBetweenSubviewsGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateEnded && sender.startingIndex >= 0 && sender.endingIndex >= 0 && sender.startingIndex != sender.endingIndex)
    {
        // successfully moved between subviews!
        NSLog(@"Moved from %ld to %ld", (long)sender.startingIndex, (long)sender.endingIndex);
    }
}

Get original touch location from UIPanGestureRecognizer

My view has a UIPanGestureRecognizer added to it.
The problem is that by the time the touch is recognized as a pan gesture, it has to be moved a few points, and I can't extract the original touch location at the UIGestureRecognizerStateBegan state.
In this state the translationInView: is (0,0), and any subsequent moves are calculated from this point, and not from the original.
Is there a way to extract the original location from the gesture recognizer itself, or do I need to override the touchesBegan: method?
You should be able to set the delegate for the gesture recogniser and implement
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
to get the initial touch.
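A rough sketch of that delegate approach, assuming the view controller conforms to UIGestureRecognizerDelegate, is set as the recognizer's delegate (pan.delegate = self), and stores the point in an assumed initialTouchPoint property:
// Sketch: grab the very first touch location before the pan has moved at all.
// initialTouchPoint is an assumed CGPoint property, not part of the original answer.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    self.initialTouchPoint = [touch locationInView:self.view];
    return YES; // still let the pan gesture recognizer receive the touch
}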
You can solve this problem by implementing a custom PanGestureRecognizer which saves the original touch point and makes it available to the caller.
I went this route as I also wanted to control the distance moved to trigger a pan gesture.
This might be overkill for your situation, but works perfectly and is less difficult than it sounds.
To get the touchpoint coordinates, you just call:
[sender touchPointInView:yourView]
instead of:
[sender locationInView:yourView]
Here's the code for PanGestureRecognizer.m:
#import "PanGestureRecognizer.h"
#import <UIKit/UIGestureRecognizerSubclass.h>

@implementation PanGestureRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];

    UITouch *touch = [touches anyObject];

    // touchPoint is defined as: @property (assign, nonatomic) CGPoint touchPoint;
    self.touchPoint = [touch locationInView:nil];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];

    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:nil];

    // customize the pan distance threshold
    CGFloat dx = fabs(p.x - self.touchPoint.x);
    CGFloat dy = fabs(p.y - self.touchPoint.y);

    if (dx > 2 || dy > 2) {
        if (self.state == UIGestureRecognizerStatePossible) {
            [self setState:UIGestureRecognizerStateBegan];
        }
        else {
            [self setState:UIGestureRecognizerStateChanged];
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];

    if (self.state == UIGestureRecognizerStateChanged) {
        [self setState:UIGestureRecognizerStateEnded];
    }
    else {
        [self setState:UIGestureRecognizerStateCancelled];
    }
}

- (void)reset
{
}

// this returns the original touch point
- (CGPoint)touchPointInView:(UIView *)view
{
    CGPoint p = [view convertPoint:self.touchPoint fromView:nil];
    return p;
}

@end
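The matching header isn't shown above; a minimal sketch of what PanGestureRecognizer.h could look like, based on the property and method the implementation uses:
// PanGestureRecognizer.h - assumed header matching the implementation above.
#import <UIKit/UIKit.h>

@interface PanGestureRecognizer : UIPanGestureRecognizer

// Original touch point, captured in touchesBegan: (in window coordinates).
@property (assign, nonatomic) CGPoint touchPoint;

// Returns the original touch point converted into the given view's coordinates.
- (CGPoint)touchPointInView:(UIView *)view;

@end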
If you use locationInView: rather than translationInView:, you will get absolute coordinates instead of relative coordinates. However, you do have to start the pan to get input. To work around this, your view can be a UIButton, and you can trigger a - (IBAction)buttonWasTouched:(UIButton *)button forEvent:(UIEvent *)event connected to "touch down", like so:
-(IBAction)shakeIntensityPanGesturebuttonWasTouched:(UIButton *)button forEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view]; // touch location in local coordinates
}
You can use the UIGestureRecognizerState:
- (void)onPanGesture:(UIPanGestureRecognizer *)sender {
    // origin and current are assumed to be CGPoint ivars on this class
    if (sender.state == UIGestureRecognizerStateBegan) {
        origin = [sender locationInView:sender.view];
    } else {
        current = [sender locationInView:sender.view];
        // Initial touch is stored in origin
    }
}
Swift (ish):
var state = recognizer.state
if state == UIGestureRecognizerState.Began {
    println(recognizer.locationInView(viewForGesture))
    // this will be the point of first touch
}

objective-c touch-events

I've got a set of Images and would like to know which I have touched. How could I implement that...?
To be more precise:
A "Home-Class" will instantiate a couple of Image-Classes:
Image *myImageView = [[Image alloc] initWithImage:myImage];
The image-class looks something like this:
- (id)initWithImage:(UIImage *)anImage
{
    if ((self = [super initWithImage:anImage]))
    {
        self.userInteractionEnabled = YES;
    }
    return self;
}
Later on, I also use these touch-event methods in the Image class:
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{}
- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{}
My problem at the moment: the touchesBegan/Ended methods are fired no matter where I touch the screen, but I would like to find out which of the Images has been touched.
Whenever you get a touch, check whether it happened within your image's area. Here is some example code; let's suppose you have a UIImageView called img.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];

    // Check whether the touch falls inside the image view's frame.
    if (CGRectContainsPoint(img.frame, location)) {
        // your code here...
    }
}
Inside your *.h (interface) file:
#interface MyViewController : UIViewController{
IBOutlet UIImageView *imageViewOne;
IBOutlet UIImageView *imageViewTwo;
UIImageView * alphaImage;
}
-(BOOL)isTouch:(UITouch *)touch WithinBoundsOf:(UIImageView *)imageView;
Place UIImageView components on your *.xib and bind them with 'imageViewOne' and 'imageViewTwo' using the "File's owner".
Go to the *.m (implementation) file and:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];

    if ([self isTouch:touch WithinBoundsOf:imageViewOne])
    {
        NSLog(@"Fires first action...");
    }
    else if ([self isTouch:touch WithinBoundsOf:imageViewTwo])
    {
        NSLog(@"Fires second action...");
    }
}

// (Optional 01) This is used to reset the transparency of the touched UIImageView
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [alphaImage setAlpha:1.0];
}

- (BOOL)isTouch:(UITouch *)touch WithinBoundsOf:(UIImageView *)imageView
{
    CGRect _frameRectangle = [imageView frame];
    CGFloat _imageTop = _frameRectangle.origin.y;
    CGFloat _imageLeft = _frameRectangle.origin.x;
    CGFloat _imageRight = _frameRectangle.size.width + _imageLeft;
    CGFloat _imageBottom = _frameRectangle.size.height + _imageTop;

    CGPoint _touchPoint = [touch locationInView:self.view];

    /*NSLog(@"image top %f", _imageTop);
    NSLog(@"image bottom %f", _imageBottom);
    NSLog(@"image left %f", _imageLeft);
    NSLog(@"image right %f", _imageRight);
    NSLog(@"touch happens at %f-%f", _touchPoint.x, _touchPoint.y);*/

    if (_touchPoint.x >= _imageLeft &&
        _touchPoint.x <= _imageRight &&
        _touchPoint.y >= _imageTop &&
        _touchPoint.y <= _imageBottom) {
        [imageView setAlpha:0.5]; // optional 01 - adds a transparency-changing effect
        alphaImage = imageView;   // optional 01 - marks the UIImageView that currently has the transparency-changing effect
        return YES;
    } else {
        return NO;
    }
}
That's how I handled it. I got the idea after reading the post by "itsaboutcode".

Swiping a row in a tableview, clicking other, selects swiped

I have a tableview, and whenever I swipe a row in section A and after that select a row in section B, it thinks I selected the swiped row in section A! I placed a breakpoint to verify and am 100% sure that it thinks it's that cell AND that it calls it when I select the row in section B.
By swipe I mean that you place your finger on some part of a cell, then drag it across (left or right, doesn't matter) and then release it. This doesn't call didSelectRowAtIndexPath:, since it is not a tap.
Example:
Swipe on indexpath A.1
Tap on indexpath B.4
OS calls tableView:didSelectRowAtIndexPath: A.1
Am I doing something wrong? What could go wrong?
Full code that handles touches in the specific cells:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    RDLogString(@"(%p) Received touches began", self);
    moveCount = 0;
    UITouch *touch = [touches anyObject];
    touchBegin = [touch locationInView:nil];
    [[self nextResponder] touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet * const)touches withEvent:(UIEvent * const)event {
    RDLogString(@"(%p) Received touches moved", self);
    moveCount++;
    [[self nextResponder] touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet * const)touches withEvent:(UIEvent * const)event {
    RDLogString(@"(%p) Received touches ended", self);
    if (![self checkUserSwipedWithTouches:touches]) {
        [[self nextResponder] touchesEnded:touches withEvent:event];
    }
}

- (BOOL)checkUserSwipedWithTouches:(NSSet * const)touches {
    CGPoint touchEnd = [[touches anyObject] locationInView:nil];
    NSInteger distance = touchBegin.x - touchEnd.x;

    // This code shows an animation if the user swiped
    if (distance > SWIPED_HORIZONTAL_THRESHOLD) {
        [self userSwipedRightToLeft:YES];
        return YES;
    } else if (distance < (-SWIPED_HORIZONTAL_THRESHOLD)) {
        [self userSwipedRightToLeft:NO];
        return YES;
    }
    return NO;
}
I fixed it: when a swipe is detected and handled, instead of not forwarding anything, I now send touchesCancelled. That makes a lot of sense in retrospect, I must admit, but it wasn't really clear that I should do that, and I couldn't find proper documentation on what to do if you don't want the touch to be handled further.
Code that works:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    moveCount = 0;
    UITouch *touch = [touches anyObject];
    touchBegin = [touch locationInView:nil];
    [[self nextResponder] touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet * const)touches withEvent:(UIEvent * const)event {
    moveCount++;
    [[self nextResponder] touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet * const)touches withEvent:(UIEvent * const)event {
    // If we DO NOT handle the touch, send touchesEnded
    if (![self checkUserSwipedWithTouches:touches]) {
        [[self nextResponder] touchesEnded:touches withEvent:event];
    } else { // If we DO handle the touch, send touchesCancelled
        self.selected = NO;
        self.highlighted = NO;
        [[self nextResponder] touchesCancelled:touches withEvent:event];
    }
}