Detecting touch of a sprite in iPhone development - Objective-C

I'm pretty new to iPhone development and I am trying to program a basic bubble popper. I have been able to program a game that creates a 10x10 grid using an NSMutableArray and allows you to touch an object and "pop" it. When I switched from a plain UIImage to a sprite atlas (animated graphics are much more exciting), the y coordinate origin moved to the bottom-left of the screen, whereas before I believe (0,0) started at the top-left. So now my touch has the y inverted: when I touch the (0,0) element it activates the (0,9) element, my (0,1) activates the (0,8) element, and so on. I have been searching for a while to find out what I might be doing wrong, or whether there is a better way to program a touch-detection function for what seems to be a simple enough game. Any ideas?

Two things:
1. Please provide source code, because it makes your issue easier to understand.
2. Have you looked at game programming libraries (http://cocos2d.org/ for example)?

Assuming you are handling the touch from within CCLayer's ccTouchesBegan:withEvent: method:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [self convertTouchToNodeSpace:touch];
    // location now contains the coordinates of the touch in cocos2d's
    // coordinate system (origin at the bottom-left of the screen)
    [self popBubbleAtLocation:location];
}
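To make the row flip concrete, here is a plain-C sketch of the mapping a method like popBubbleAtLocation: would need: cocos2d node space has its origin at the bottom-left, so a y coordinate counts rows from the bottom, and the top-left-indexed row is recovered by subtracting from the last row index. The grid and cell sizes here are assumptions for illustration, not values from the question.

```c
#include <assert.h>

/* Hypothetical grid dimensions: a 10x10 grid of 32pt cells. */
#define GRID_ROWS 10
#define GRID_COLS 10
#define CELL_SIZE 32.0f

/* Map a point in cocos2d node space (origin at the BOTTOM-left) to a
 * grid cell indexed from the TOP-left, flipping the row index. */
static void gridCellForLocation(float x, float y, int *col, int *row) {
    *col = (int)(x / CELL_SIZE);
    /* y / CELL_SIZE counts rows from the bottom of the screen; subtract
     * from the last row index to get a top-left-based row. */
    *row = (GRID_ROWS - 1) - (int)(y / CELL_SIZE);
}
```

With this, a touch near the bottom-left of the screen maps to column 0, row 9, which matches the inversion the question describes when no flip is applied.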

Related

Using convertRectToView on UIImageView in CATransform3DRotate Rotated UIView

I have a UIImageView called draggableImageView that is a subview of a UIView called tiltedView that is tilted backwards via CATransform3DRotate and with perspective transform.m34 = -1 / 400.0. That way, when I drag draggableImageView around, its perspective and absolute size change so that it appears to be moving in 2-space within the tiltedView.
I would like to calculate the frame for draggableImageView within tiltedView but using a different coordinate system (in this case, a UIView called viewForCoordinates that is not tilted and encompasses the whole screen). However, using [self convertRect:self.frame toView:self.viewForCoordinates] from within draggableImageView outputs the same sized frame regardless of where within tiltedView my draggableImageView is located.
I figure using self.frame is more appropriate than self.bounds, since bounds is agnostic of the superview while frame depends on it. However, based on the above, I'm guessing that convertRect: is simply converting self.frame without accounting for the perspective manipulation applied to its superview.
I'm not sure what code would be helpful in this case, so please let me know what code I can provide to help get this question answered.
I solved this!
So I think these were the main issues:
1) When a DraggableImageView object is touched, make sure the object is indeed being touched via if ([touch view] == self) within - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event.
2) Convert the touch location to the superview's coordinate system via self.touchLocation = [[touches anyObject] locationInView:self.superview].
3) Convert DraggableImageView's frame from the superview's coordinate system to viewForCoordinates's coordinate system. <<< I think my error was that I converted from self rather than from superview.
4) If making changes to DraggableImageView's frame or location based on viewForCoordinates, calculate the changes within viewForCoordinates's coordinate system and then convert back to DraggableImageView's superview. <<< I think I forgot to do this conversion.

&& [[event allTouches] count] > 1

I have a UIButton whose multipleTouchEnabled property is NO. Still, when I drag two fingers across it, the UIEvent I get has 2 UITouches in its allTouches property.
I want to respond only to dragging one finger around my button and ignore the second touch. If I use [[event allTouches] anyObject], I (obviously) get a random touch each time, and instead of a smooth drag across the screen my drag jumps around wildly.
My initial thought is that somehow I am still enabling multitouch, but I find no confirmation for this theory.
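One common fix is to track a single touch by identity: a touch object remains the same for the lifetime of that touch, so the handler can remember which one it is following and ignore the rest instead of calling anyObject. Here is a plain-C sketch of that bookkeeping, with touches modeled as opaque pointers; the function names are hypothetical, and in the real Objective-C code they would be called from touchesBegan:, touchesMoved:, and touchesEnded:/touchesCancelled: respectively.

```c
#include <stddef.h>
#include <assert.h>

/* Identity of the single touch we are following (NULL when none). */
static const void *trackedTouch = NULL;

/* Call from touchesBegan: claim the first touch, ignore later ones. */
static void touchBegan(const void *touch) {
    if (trackedTouch == NULL)
        trackedTouch = touch;
}

/* Call from touchesMoved: returns 1 only for the touch being tracked,
 * so a second finger never perturbs the drag. */
static int shouldHandleMove(const void *touch) {
    return touch == trackedTouch;
}

/* Call from touchesEnded/touchesCancelled: release the tracked touch. */
static void touchEnded(const void *touch) {
    if (touch == trackedTouch)
        trackedTouch = NULL;
}
```

The key point is that the drag logic keys off one stable touch identity rather than pulling an arbitrary member out of allTouches each event.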

iOS: Non-square hit areas for buttons

I need to make some triangular buttons that overlap each other.
While UIButtons can take transparent images as backgrounds, and UIControls can have custom views, the hit area of these is always rectangular. How can I create a triangular hit area for my buttons?
I come from a Flash background, so I would normally create a hit area for my view, but I don't believe I can do this in Cocoa.
Any tips?
You can achieve this by subclassing UIButton and providing your own:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Return YES if point is inside the receiver's bounds; otherwise NO.
}
Apple's UIView Documentation provides the details, such as confirming that point is already in the receiver's coordinate system.
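The geometry inside that override can be a standard point-in-triangle test. Here is a plain-C sketch (usable directly from the Objective-C method, since Objective-C is a superset of C) using edge cross products: the point is inside the triangle exactly when it lies on the same side of all three edges. The triangle vertices used in the test are illustrative assumptions.

```c
#include <assert.h>

/* Sign of the cross product of (b - a) and (p - a): tells which side of
 * the directed edge a->b the point p lies on. */
static float edgeSide(float px, float py,
                      float ax, float ay, float bx, float by) {
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

/* A point is inside the triangle iff it is on the same side of all
 * three edges (the three signs are all non-negative or all
 * non-positive; points on an edge count as inside). */
static int pointInTriangle(float px, float py,
                           float x1, float y1,
                           float x2, float y2,
                           float x3, float y3) {
    float d1 = edgeSide(px, py, x1, y1, x2, y2);
    float d2 = edgeSide(px, py, x2, y2, x3, y3);
    float d3 = edgeSide(px, py, x3, y3, x1, y1);
    int hasNeg = (d1 < 0) || (d2 < 0) || (d3 < 0);
    int hasPos = (d1 > 0) || (d2 > 0) || (d3 > 0);
    return !(hasNeg && hasPos);
}
```

In the subclass, pointInside:withEvent: would return the result of this test with the triangle's corners expressed in the button's own coordinate system, which is what makes the overlapping buttons possible: each one accepts only the touches inside its own triangle.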

Conflicts between UIGestureRecognizer and touch handling in iPhone

I'm working on a small app for iPhone which uses OpenGL.
The app is supposed to :
Move around the screen with a moving touch.
Zoom in/Out by tapping
I'm overriding these methods to handle the coordinates of the touches and move around the screen with my finger, with success, working perfectly:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
I'm using UIGestureRecognizers for double taps, pinch gestures, swipes, and counter-pinch, with success as well; all gestures are detected perfectly.
But I'm having conflicts. Sometimes, for example, I do a double tap and it's recognized correctly, but it's also passed to my touchesMoved method, resulting in erratic movements. The same thing happens with pinch and counter-pinch gestures: most of the time they zoom in and out correctly, but sometimes my touchesMoved method receives the touches as well and moves the screen.
Is it bad practice to override these methods and use UIGestureRecognizers at the same time? If not, is there a way to make both work without conflicts?
I tried setting delaysTouchesBegan and delaysTouchesEnded to YES on my recognizers, but this results in my touches not being passed to any of my overridden methods!
I was thinking of using a UIGestureRecognizer to handle swipes, scrapping the overridden touch methods, and using the value of the swipe to calculate how much to move my screen. Is that possible?
I can't say why you sometimes receive spurious touchesMoved: events, but it does sound like you may want to just ditch the overridden methods and use a UIPanGestureRecognizer to handle moving around the screen.
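If the pan recognizer route is taken, the remaining work is just accumulating the translation it reports into a scene offset. A pan recognizer reports a cumulative translation since the gesture began, so one common pattern is to add the translation to the offset and then reset the recognizer's translation to zero in the handler, so each callback delivers only the incremental delta. Here is a plain-C sketch of that accumulation; the names are hypothetical.

```c
#include <assert.h>

/* 2D offset of the scene being dragged around. */
typedef struct { float x, y; } Vec2;

static Vec2 sceneOffset = {0.0f, 0.0f};

/* Call from the pan handler with the recognizer's current translation.
 * Accumulates the delta into the scene offset and returns the value to
 * set back as the recognizer's translation (always zero), so the next
 * callback reports only the movement since this one. */
static Vec2 applyPanTranslation(Vec2 translation) {
    sceneOffset.x += translation.x;
    sceneOffset.y += translation.y;
    Vec2 zero = {0.0f, 0.0f};
    return zero;
}
```

In the Objective-C handler, the translation would come from the recognizer's translationInView: and the returned zero would be written back with setTranslation:inView:.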

Multiple touch problem in Cocoa Touch/iPhone

I am going to develop a ball game where I have to touch two or more balls simultaneously. How will I detect these multiple touches? I have found that I can detect multiple touches the following way:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    NSSet *touch2 = [event allTouches];
    NSUInteger count = [touch2 count]; // counts the number of touches
}
But this only detects the number of touches. I also need to find the (x, y) coordinates of each touch point. And when I throw a ball (touch inside a ball and then slide my finger), how will I identify which ball is moving? That is, will touchesMoved identify which touchesBegan it belongs to? And if touchesMoved is called for each ball, how will I reset the ball positions, given that I get two touch positions (x1, y1) and (x2, y2) for two balls: how will I know which ball belongs to (x1, y1) or (x2, y2)?
In your code above, touch2 is a set of UITouch objects.
You can get at each object like so:
UITouch *touch = [[touch2 allObjects] objectAtIndex:0];
EDIT: to add information about touchesMoved
touchesBegan is called when one or more fingers are placed on the screen.
At this point you will need to determine which ball corresponds to each touch (by using the coordinates of each touch). You will need to store this mapping.
touchesMoved will be called continually as the fingers are moved across the screen. Using the mapping you calculated earlier, you can determine which ball corresponds to which UITouch and apply some movement to it as you see fit.
Perhaps you should read "Handling a Complex Multi-Touch Sequence" in the Apple docs.