touches in cocos explanation - objective-c

I am new to Objective-C and iPhone game development. I am having some difficulty understanding the code used to implement touches in cocos. Could anyone give me some explanation, please?
-(BOOL)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)events
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGPoint convertedLocation = [[Director sharedDirector] convertCoordinate:location];
    lady.position = convertedLocation;
    return kEventHandled;
}
Please explain this bunch of code to me; I would like to know how it works, line by line.
Thanks in advance

UITouch ref http://developer.apple.com/iphone/library/documentation/UIKit/Reference/...
Example code that uses UITouch
You can probably google for more
It looks like standard UITouch handling code.
Get any touch from the set of touches sent in the event.
Get the location in the given view, i.e. the location in view coordinates starting from (0,0) in the corner of that specific view.
Then convert to 'Director' coordinates; the Director is a shared (singleton) object... (no idea exactly what that object does, but you get the idea of what they are doing).
Set the position of something (lady) to the location of the touch event.
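Putting those steps next to the code, here is the same snippet with comments. The note about the y-axis describes how older cocos2d versions convert UIKit coordinates to OpenGL coordinates, so treat this as a sketch rather than a definitive reading:

-(BOOL)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Pick any one touch out of the set (fine for single-touch handling).
    UITouch *touch = [touches anyObject];

    // Location in UIKit view coordinates: origin at the top-left corner of the touched view.
    CGPoint location = [touch locationInView:[touch view]];

    // Ask the shared Director singleton to convert to cocos2d coordinates
    // (in old cocos2d this flips the y-axis so the origin is at the bottom-left).
    CGPoint convertedLocation = [[Director sharedDirector] convertCoordinate:location];

    // Move the 'lady' sprite to the touched point.
    lady.position = convertedLocation;

    // Tell cocos2d this touch was handled.
    return kEventHandled;
}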

Related

How to enable a certain part of an image to be clicked and display content using Objective-C?

Hi, I am trying to implement an application which allows the user to click on a certain part of an image, for example a fish (fins, tail), and navigate to another xib page that shows content for that part. Does anyone know how to enable clicking on images and pinpoint the various parts to click? Please help! Thanks a lot!
P.S. I have found out how to click on the image and get the coordinates of the click. How do I use those coordinates in an if statement? For example,
if (location == i)
{
    // go to this page
}
else if (location == a)
{
    // go to this page
}
If you have a static image, the simplest way to achieve this is to add a UIButton of type custom over each part of the image. Set its background color (and everything else) to clearColor so it appears transparent.
On the touchUpInside event of that button, you can redirect to the other view.
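A minimal sketch of that idea, assuming the fin sits roughly at the frame used below; FishPartViewController and finTapped: are hypothetical names for your own detail screen and action:

// In the view controller that shows the fish image (e.g. call this from viewDidLoad).
- (void)addFinButton
{
    UIButton *finButton = [UIButton buttonWithType:UIButtonTypeCustom];
    finButton.frame = CGRectMake(120.0, 80.0, 60.0, 40.0); // region covering the fin; adjust to your image
    finButton.backgroundColor = [UIColor clearColor];      // invisible overlay on top of the image view
    [finButton addTarget:self
                  action:@selector(finTapped:)
        forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:finButton];
}

- (void)finTapped:(id)sender
{
    // Navigate to the page for this part (FishPartViewController is a placeholder).
    FishPartViewController *detail = [[FishPartViewController alloc] init];
    [self.navigationController pushViewController:detail animated:YES];
}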
Try it like this; maybe it helps you:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint lastLocation = [touch locationInView:self.view];
    if (![[touch view] isKindOfClass:[UIImageView class]]) {
        if (lastLocation.x > 103 && lastLocation.y < 269) { // place your location bounds here
            // put your condition based on the image view here and push the new view.
        }
    }
}

Drawing Straight Lines with Finger on iPhone

Background: I am trying to create a really simple iPhone app that will allow the user to draw multiple straight lines on the screen with their finger.
I'm using these two methods in my UIViewController to capture the coordinates of each line's endpoints.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
Question:
I would like the line to show up as soon as touchesEnded has fired, and then be able to keep drawing more lines on the screen. How do I do this? I don't necessarily need the code, but I need help with the big-picture idea of how to put it together. Also, I'm not a huge fan of xibs and like to do everything programmatically, if that affects the answer.
What I've Tried: I've tried using Quartz 2D, but it seems that in order to use it you have to do your drawing in the drawRect method of a separate subclassed view. So I would have to create a new view for each line? And then my coordinates would be messed up, because I'd have to translate the touch positions from the UIViewController to the view.
I've also tried OpenGL, which I've had a bit more success with (using the GLPaint sample as a template), but OpenGL seems like overkill for just drawing some straight lines on the screen.
You don't need multiple views, and you don't need OpenGL.
Make a subclass of UIView -- call it CanvasView.
Make an object to represent "a line" in your canvas -- it would be a subclass of NSObject with CGPoint properties for start and end.
CanvasView should keep an array of the lines that are in the canvas.
In -[CanvasView drawRect:], loop through the lines in the array, and draw each one.
In -[CanvasView touchesBegan:withEvent:], stash the start point in an instance variable.
In -[CanvasView touchesEnded:withEvent:], make a new line with the start and end points, and add it to your array of lines. Call [self setNeedsDisplay] to cause the view to be redrawn.
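A rough sketch of that structure, assuming ARC; the Line and CanvasView names and the 2-point line width are just illustrative:

@interface Line : NSObject
@property (nonatomic) CGPoint start;
@property (nonatomic) CGPoint end;
@end

@implementation Line
@end

@interface CanvasView : UIView
@end

@implementation CanvasView {
    NSMutableArray *_lines; // finished Line objects
    CGPoint _startPoint;    // where the current touch began
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    _startPoint = [[touches anyObject] locationInView:self];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    Line *line = [[Line alloc] init];
    line.start = _startPoint;
    line.end = [[touches anyObject] locationInView:self];
    if (_lines == nil) {
        _lines = [NSMutableArray array];
    }
    [_lines addObject:line];
    [self setNeedsDisplay]; // redraw so the new line shows up immediately
}

- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    [[UIColor blackColor] setStroke];
    for (Line *line in _lines) {
        CGContextMoveToPoint(context, line.start.x, line.start.y);
        CGContextAddLineToPoint(context, line.end.x, line.end.y);
    }
    CGContextStrokePath(context);
}

@end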
You mentioned that you just need the big picture of the idea, so here it is:
Take a view, a subclass of UIView.
Capture the touch events and then draw in its drawRect: method.
This is all that Kurt Revis mentioned in his answer.
Now, to avoid the drawing slowing down, keep an offscreen image that holds what has already been drawn and just keep adding new lines on top of it. That way performance won't degrade when the array of lines to draw gets big.
Hope you get the idea.
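One hedged way to do that caching, assuming a view like the CanvasView sketch above; cachedImage is an assumed UIImage instance variable and appendLineToCacheFrom:to: is just an illustrative name:

// Render each finished line into an offscreen image once...
- (void)appendLineToCacheFrom:(CGPoint)start to:(CGPoint)end
{
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    [cachedImage drawAtPoint:CGPointZero]; // previous cache, if any
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    [[UIColor blackColor] setStroke];
    CGContextMoveToPoint(context, start.x, start.y);
    CGContextAddLineToPoint(context, end.x, end.y);
    CGContextStrokePath(context);
    cachedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self setNeedsDisplay];
}

// ...then drawRect: becomes one image blit instead of re-stroking every line.
- (void)drawRect:(CGRect)rect
{
    [cachedImage drawAtPoint:CGPointZero];
}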
As far as I know you can also use Core Graphics to draw a line, and from your question you don't need to create a view for every single line; instead, your single view's graphics context will be the drawing sheet for all the drawing, so you are almost at the solution. Just by taking the touch coordinates you can draw the lines on the view.
CGPoint previousPoint; // this must have global scope; it is used in all the touch events

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    previousPoint = [touch locationInView:self]; // take the starting touch point
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];

    // Note: UIGraphicsGetCurrentContext() only returns a valid drawing context
    // inside drawRect:, so in practice record the points here and stroke them there.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextMoveToPoint(context, previousPoint.x, previousPoint.y);
    CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
    CGContextStrokePath(context); // the original had 'bluecontext', which is undefined here
}
Hope this helps; let me know if you run into any issues.

How to determine the location of a touch using view property

I'm trying to develop some simple game apps. For a Pong-styled game, I have a moving ball that stays in bounds and two paddles. I implemented code that moves paddle 1 so that it reflects the ball as expected.
When I tried to add the same behavior to the other paddle, I tried this:
UITouch *touch = [[event allTouches] anyObject]; // Picks up the touch
CGPoint location = [touch locationInView:self.view]; // Gets coordinates
// to help move the paddle on the X axis; Y axis is determined by the paddle,
// so it only moves along one axis
if ([touch view] == paddle2) {
    // move the second paddle
    ...
}
else {
    // move the first paddle
    ...
}
However, any touch only moves paddle1, indicating the condition is never satisfied. Based on the documentation, I thought that by sending the [touch view] message, the image view that was touched would return itself.
What am I doing wrong? Is there a simpler way to do this?
Do not use different views for the paddles; use one host view with different CALayers for drawing your objects.
You can move your paddles by assigning a new origin to their layers.
The view should process the following methods
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
to find out where you touched it.
Assuming you know the frames of your paddles, you can decide whether the touch hit them or not.
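A rough sketch of that approach, assuming paddle1Layer and paddle2Layer are CALayer instance variables of the host view and QuartzCore is linked for CATransaction:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self];

    // Decide which paddle (if either) the finger is over by testing its frame.
    CALayer *hitPaddle = nil;
    if (CGRectContainsPoint(paddle2Layer.frame, location)) {
        hitPaddle = paddle2Layer;
    } else if (CGRectContainsPoint(paddle1Layer.frame, location)) {
        hitPaddle = paddle1Layer;
    }

    if (hitPaddle != nil) {
        // Slide the paddle along the x-axis only; disable the implicit
        // animation so it tracks the finger without lagging.
        [CATransaction begin];
        [CATransaction setDisableActions:YES];
        hitPaddle.position = CGPointMake(location.x, hitPaddle.position.y);
        [CATransaction commit];
    }
}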
I hope this helps.

containsPoint doesn't work with CAShapeLayer?

I have two CAShapeLayers inside the main layer of a UIView.
The CAShapeLayers have complex shapes, and I need to know whether a touched point lies within a shape's boundaries. I also need to know which shape got touched.
I am trying containsPoint:, but nothing works.
After banging my head against this for two days, I was able to produce this bizarre code, and it looks like it is working!
The goal was to hit-test a CAShapeLayer. The CAShapeLayer is moving on the screen, so the shape is not in a constant place, and hit-testing the CGPath directly is not straightforward.
Feel free to add any input...
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];
    CGAffineTransform transf = CGAffineTransformMakeTranslation(-shapeLayer.position.x, -shapeLayer.position.y);
    if (CGPathContainsPoint(shapeLayer.path, &transf, p, NO)) {
        // the touch is inside the shape
    }
}
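To tell which of several shapes was touched, one hedged variant is to convert the point into each layer's own coordinate space and test its path; shapeLayers is an assumed array holding your CAShapeLayers, and this assumes default anchor points and bounds:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];
    for (CAShapeLayer *layer in shapeLayers) {
        // Convert from the view's root layer into the shape layer's coordinates,
        // which accounts for the layer's current position on screen.
        CGPoint local = [layer convertPoint:p fromLayer:self.layer];
        if (CGPathContainsPoint(layer.path, NULL, local, NO)) {
            // 'layer' is the shape that was touched.
            break;
        }
    }
}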

Problem with touchesMoved and drawRect

I have a sticky bug that I can't seem to figure out, and I think it has something to do with the way touchesMoved is implemented.
In touchesMoved, I check to see where the touch is (an if statement) and then accordingly call setNeedsDisplayInRect: on a 40 by 40 area near the touch point. What happens in drawRect is that a black image is put down if there was a white image there before, and vice versa. At the same time as calling setNeedsDisplayInRect:, I set a boolean in an array of booleans so I can keep track of what the current image is, and therefore display the opposite. (Actually, I don't always flip the image... I look at what the first touch is going to do, like switch from black to white, and then put white images on all the subsequent touches, so it's kind of like drawing or erasing with the images.)
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    CGPoint lastTouchPoint = [touch previousLocationInView:self];

    touchX = touchPoint.x;
    touchY = touchPoint.y;

    int lastX = (int)floor((lastTouchPoint.x+0.001)/40);
    int lastY = (int)floor((lastTouchPoint.y+0.001)/40);
    int currentX = (int)(floor((touchPoint.x+0.001)/40));
    int currentY = (int)(floor((touchPoint.y+0.001)/40));

    if ((abs((currentX-lastX)) >= 1) || (abs((currentY-lastY)) >= 1))
    {
        if ([soundArray buttonStateForRow:currentX column:currentY] == firstTouchColor) {
            [soundArray setButtonState:!firstTouchColor row:(int)(floor((touchPoint.x+0.001)/40)) column:(int)(floor((touchPoint.y+0.001)/40))];
            [self setNeedsDisplayInRect:(CGRectMake((CGFloat)(floor((touchPoint.x+0.001)/40)*40), (CGFloat)(floor((touchPoint.y+0.001)/40)*40), (CGFloat)40.0, (CGFloat)40.0))];
        }
    }
}
My problem is that the boolean array seems to be out of whack with the images I'm putting down. This only happens if I drag really fast across the screen. Eventually the boolean array and the images are no longer in sync, even though I set them at the same time. Any idea what is causing this, or what I can do to fix it?
Here's my drawRect:
- (void)drawRect:(CGRect)rect {
    if ([soundArray buttonStateForRow:(int)(floor((touchX+0.001)/40)) column:(int)(floor((touchY+0.001)/40))])
        [whiteImage drawAtPoint:(CGPointMake((CGFloat)(floor((touchX+0.001)/40)*40), (CGFloat)(floor((touchY+0.001)/40))*40))];
    else
        [blackImage drawAtPoint:(CGPointMake((CGFloat)(floor((touchX+0.001)/40)*40), (CGFloat)(floor((touchY+0.001)/40))*40))];
}
I figured out the answer to this. touchX and touchY were instance variables, and they were getting reset in touchesMoved before each call to drawRect was complete. Therefore, if I moved fast on the screen, touchesMoved would get called, then call drawRect, then touchesMoved would get called again before drawRect had used touchX and touchY, so the drawing would get out of sync with the boolean array backend.
To solve this, I stopped using touchX and touchY in drawRect, and started deriving the same point from the dirty rect that was being passed in from touchesMoved.
tada!
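For reference, a minimal sketch of what that fix can look like, assuming a single 40x40 cell is invalidated per pass so the dirty rect's origin identifies the cell (soundArray, whiteImage and blackImage are the same objects as above):

- (void)drawRect:(CGRect)rect
{
    // The invalidated rect is the 40x40 cell that touchesMoved: flagged,
    // so derive the grid cell from its origin instead of from touchX/touchY.
    int cellX = (int)(rect.origin.x / 40);
    int cellY = (int)(rect.origin.y / 40);
    CGPoint cellOrigin = CGPointMake(cellX * 40.0, cellY * 40.0);

    if ([soundArray buttonStateForRow:cellX column:cellY]) {
        [whiteImage drawAtPoint:cellOrigin];
    } else {
        [blackImage drawAtPoint:cellOrigin];
    }
}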