I am going to develop a ball game where I have to touch two or more balls simultaneously. How will I detect these multiple touches? I came to know that I can detect multiple touches the following way:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    NSSet *touch2 = [event allTouches];
    [touch2 count]; // counts the number of touches
}
But this only detects the number of touches. I also need the (x, y) coordinates of each touch point. And when I throw a ball (touch inside it, then slide my finger), how will I identify which ball is moving? In other words, which touchesBegan: does a given touchesMoved: belong to? And if touchesMoved: is called for each ball, how do I reset each ball's position? I get two touch positions, (x1, y1) and (x2, y2), for the two balls, so how do I tell which ball belongs to (x1, y1) and which to (x2, y2)?
In your code above, touch2 is a set of UITouch objects. You can get at each object like so:
UITouch *touch = [[touch2 allObjects] objectAtIndex:0];
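From there, locationInView: gives the (x, y) coordinates the question asks for. A minimal sketch, assuming this runs inside your view's touch handler where touch2 is in scope:

for (UITouch *touch in touch2) {
    // Coordinates in this view's coordinate system.
    CGPoint point = [touch locationInView:self];
    NSLog(@"touch at (%f, %f)", point.x, point.y);
}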
EDIT: to add information about touchesMoved
touchesBegan is called when one or more fingers are placed on the screen.
At this point you will need to determine which ball corresponds to each touch (by using the coordinates of each touch). You will need to store this mapping.
touchesMoved will be called continually as the fingers are moved across the screen. Using the mapping you calculated earlier, you can determine which ball corresponds to which UITouch and apply some movement to it as you see fit.
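As a sketch of that mapping (BallView and the ballAtPoint: hit-test helper are assumptions about your code), you could key an NSMapTable by the UITouch objects, since UITouch does not conform to NSCopying and so cannot be a plain NSDictionary key:

// @class BallView; -- hypothetical view class for one ball.
// Created once, e.g. in init; maps each active UITouch to its ball:
// touchToBall = [NSMapTable weakToWeakObjectsMapTable];
NSMapTable *touchToBall;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self];
        BallView *ball = [self ballAtPoint:point]; // assumed hit test
        if (ball) {
            [touchToBall setObject:ball forKey:touch];
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // The same UITouch instance persists for the whole gesture,
        // so the lookup identifies which ball this finger grabbed.
        BallView *ball = [touchToBall objectForKey:touch];
        ball.center = [touch locationInView:self];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [touchToBall removeObjectForKey:touch];
    }
}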
Perhaps you should also read Handling a Complex Multi-Touch Sequence in the Apple docs.
I have a UIImageView called draggableImageView that is a subview of a UIView called tiltedView that is tilted backwards via CATransform3DRotate and with perspective transform.m34 = -1 / 400.0. That way, when I drag draggableImageView around, its perspective and absolute size change so that it appears to be moving in 2-space within the tiltedView.
I would like to calculate the frame for draggableImageView within tiltedView but using a different coordinate system (in this case, a UIView called viewForCoordinates that is not tilted and encompasses the whole screen). However, using [self convertRect:self.frame toView:self.viewForCoordinates] from within draggableImageView outputs the same sized frame regardless of where within tiltedView my draggableImageView is located.
I figure using self.frame is more appropriate than self.bounds, since bounds is agnostic to the superview while frame depends on it. Based on the above, though, I'm guessing that convertRect is just converting self.frame without regard to the perspective manipulation applied to its superview.
I'm not sure what code would be helpful in this case, so please let me know what code I can provide to help get this question answered.
I solved this! I think there were a few issues:
1) When the DraggableImageView object is touched, make sure the object is indeed being touched via if ([touch view] == self) within - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event.
2) Convert the touch location to the superview's coordinate system via self.touchLocation = [[touches anyObject] locationInView:self.superview].
3) Convert DraggableImageView's frame from the superview's coordinate system to viewForCoordinates' coordinate system. <<< I think my error was that I converted from self rather than from the superview.
4) When making changes to DraggableImageView's frame or location based on viewForCoordinates, calculate the changes within viewForCoordinates' coordinate system and then convert back to DraggableImageView's superview. <<< I think I forgot to do this conversion.
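Put together as a sketch (touchLocation and viewForCoordinates are the properties mentioned above; the delta-based movement is an assumption), touchesMoved: inside DraggableImageView might look like:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // 1) Confirm this view is really the one being touched.
    if ([touch view] != self) return;

    // 2) Touch location in the superview's coordinate system.
    self.touchLocation = [touch locationInView:self.superview];

    // 3) Convert the frame from the *superview* (not from self)
    //    into viewForCoordinates' untilted coordinate system.
    CGRect untilted = [self.superview convertRect:self.frame
                                           toView:self.viewForCoordinates];

    // 4) Apply the movement in viewForCoordinates' system...
    CGPoint from = [touch previousLocationInView:self.viewForCoordinates];
    CGPoint to = [touch locationInView:self.viewForCoordinates];
    untilted.origin.x += to.x - from.x;
    untilted.origin.y += to.y - from.y;

    // ...and convert back to the superview before assigning.
    self.frame = [self.superview convertRect:untilted
                                    fromView:self.viewForCoordinates];
}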
I can't find this anywhere in the documentation. When this message is called on my subclass of UIView:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
How can I get the touch for which this message was called?
On both the NSSet and UIEvent I can get only sets of touches, but no unique identifier so I can determine which touch triggered the message.
PS: why on earth would they send an NSSet of touches, and also provide the possibility to get all touches from [event allTouches]?
You can't get the touch, because sometimes multiple touches triggered the message. If the user has two fingers on the screen, and moves both, you can get a single touchesMoved:withEvent: in which both touches are updated.
You need to process every touch in the touches set. If you've disabled multitouch for the view, so that you know there will only ever be one touch in the set, you can use touches.anyObject to get the touch. But if you haven't disabled multitouch, you need to loop over all of the touches in the set.
The message includes a set of touches separate from event.allTouches because the user might have three fingers down but only move one or two of them. The touches set only contains the moved touches, but event.allTouches contains all of the user's touches, including the touches that have not moved since the last message.
The unique identifier for the touch is the UITouch object itself. When the user puts a finger on the screen, iOS creates a UITouch object and updates that same object as the user moves the finger. So you can use the UITouch object as a key, for example in an NSMapTable (a plain NSDictionary won't work, because UITouch does not conform to NSCopying), or you can attach your own objects to it using objc_setAssociatedObject.
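For instance, a sketch of attaching per-finger state with objc_setAssociatedObject (the key name and the randomColor/drawAt:withColor: helpers are illustrative assumptions):

#import <objc/runtime.h>

static const void *kStrokeKey = &kStrokeKey; // arbitrary unique key

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // Attach our own object; iOS updates this same UITouch
        // instance for the lifetime of the finger.
        objc_setAssociatedObject(touch, kStrokeKey, [self randomColor],
                                 OBJC_ASSOCIATION_RETAIN_NONATOMIC);
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // Reading it back identifies which finger this touch is.
        UIColor *color = objc_getAssociatedObject(touch, kStrokeKey);
        [self drawAt:[touch locationInView:self] withColor:color];
    }
}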
I have a UIButton whose multipleTouchEnabled property is NO. Still, when I drag two fingers across it, the UIEvent I get has 2 UITouches in its allTouches property.
I want to only respond to dragging a finger around my button and ignore the second touch. If I use [[event allTouches] anyObject], I get (obviously) a random touch each time, and instead of a smooth drag across the screen my drag jumps around wildly.
My initial thought is that somehow I am still enabling multitouch, but I find no confirmation for this theory.
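Note that [event allTouches] reports every touch the event knows about, regardless of the view's multipleTouchEnabled setting. One way to sketch ignoring the extra finger (an illustration, assuming a custom control that overrides the touch handlers) is to remember the first UITouch and respond only to it:

__weak UITouch *trackedTouch; // ivar: the one finger we follow

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (trackedTouch == nil) {
        trackedTouch = [touches anyObject]; // adopt the first finger
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (trackedTouch && [touches containsObject:trackedTouch]) {
        CGPoint point = [trackedTouch locationInView:self];
        // ... smooth drag handling using `point` only ...
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (trackedTouch && [touches containsObject:trackedTouch]) {
        trackedTouch = nil; // our finger lifted
    }
}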
I'm pretty new to iPhone development, and I am trying to program a basic bubble popper. I have been able to program a game that creates a 10x10 grid using an NSMutableArray and allows you to touch an object and "pop" it.
When I went to use a sprite atlas instead of just a UIImage (animated graphics are much more exciting), it changed the y coordinate so that 0 is at the bottom left of the screen, whereas before I believe (0,0) started at the top left. So now my touch has the y inverted: when I touch the (0,0) element it activates the (0,9) element, (0,1) activates the (0,8) element, and so on.
I have been searching for a while to find out what I might be doing wrong, or whether there is a better solution to programming a touch-detection function for what seems to be a simple enough game. Any ideas?
Two things:
1) Please provide source code, because it makes your issue easier to understand.
2) Have you looked at game programming libraries (http://cocos2d.org/, for example)?
Assuming you are handling the touch from within CCLayer's ccTouchesBegan:withEvent: method:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [self convertTouchToNodeSpace:touch];
    // location now contains the coordinates of the touch in the
    // cocos2d coordinate system (origin at the bottom left)
    [self popBubbleAtLocation:location];
}
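From that point, mapping back to your 10x10 grid needs no manual y inversion, because the node-space location already uses cocos2d's bottom-left origin. A sketch, assuming equally sized cells filling the layer (popBubbleAtLocation: matching the call above):

- (void)popBubbleAtLocation:(CGPoint)location {
    // Assumed: a 10x10 grid of equal cells covering this layer.
    CGSize cell = CGSizeMake(self.contentSize.width / 10.0f,
                             self.contentSize.height / 10.0f);
    NSInteger col = (NSInteger)(location.x / cell.width);
    NSInteger row = (NSInteger)(location.y / cell.height);
    // Note: row 0 is the *bottom* row in cocos2d coordinates.
    NSLog(@"pop bubble at row %ld, col %ld", (long)row, (long)col);
}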
I'm not too informed on the nitty-gritty of modifying the responder chain, so if this is stupid please don't bash me ;)
Basically, I have two view stacks (blue) as subviews of a parent view (red).
They both occupy the full frame of the parent at any one time, so obviously only the one on top gets touch events, which travel upstream to the parent view (red) and on to the window.
In some cases I want the touch input to be picked up by the eligible child view of the other view stack, that is, the view that would be receiving these inputs if the current topmost view had userInteractionEnabled set to NO.
Setting userInteractionEnabled works, but it feels like a dirty hack. The gist is that this topmost view is mostly transparent, and I want events touched in the specified region to end up on the other stack.
Here is a picture to help explain visually; keep in mind both blue views are 100% of the parent.
http://c.crap.ps/35a5
You can override hitTest:withEvent: in each of the views to control who gets to "consume" the touch.
In your example I am assuming that the greenish areas are subviews that you want to consume the touch events. If so, then try this for the hitTest method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    return (hitView == self) ? nil : hitView;
}
This method checks if the touch hits any of the subviews. If it does then it lets that subview consume the touch, otherwise it lets the touch continue through the hierarchy.
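As a usage sketch, the override would live in a small UIView subclass used for the topmost, mostly transparent view (PassthroughView is a hypothetical name), so that touches on its empty area fall through to the other stack:

@interface PassthroughView : UIView
@end

@implementation PassthroughView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    // nil tells UIKit to keep searching, so views in the sibling
    // stack underneath get a chance to consume the touch.
    return (hitView == self) ? nil : hitView;
}
@end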