Cocos2d 'snap to grid' logic - Objective-C

I'm building a game in Cocos2d and I'm having a hard time implementing what I want. Imagine an 8 x 10 grid of squares. You can touch a square and drag that square's row or column horizontally or vertically, but not both. After you release the row or column, the squares 'snap' back into place in the grid according to their closest row/column positions.
Everything works programmatically, with no overlaps or misplaced squares or invalid positions. However, I just cannot seem to get this 'snap to grid' to animate the way I want.
I'm using the MVC design pattern to separate any views from any game logic. The view is simply handed an array of 'squaresToUpdate', looks at every square, finds the corresponding sprite, and updates the position of the sprite based on the position of the square.
The problem arises when trying to do something along the lines of this:
-update():
    for each square in squaresToUpdate:
        if square is not being dragged at the moment:
            setup a 'CCMoveTo' to bring the sprite in line with square
I can't work out how to let the squares move freely while being dragged, but run a CCMoveTo to snap them once they're released. Either I end up creating a new action on every update, or the squares just freak out.
I don't know if it's my logic that is broken, if CCMoveTo is not doing what I want, or if this problem is actually much harder than I initially thought. Could someone help me out with this logic?
Good old pencil and paper gives me this, but I'm not sure it's 100%:
- (void)update:(ccTime)dt {
    NSMutableArray *toDraw = [self.rootView whatAmIDrawing];
    for (GameObject *o in toDraw) {
        CCSprite *sprite = (CCSprite *)[self.batch getChildByTag:o.tag];
        if (o.moving == NO) {
            if (o.snapping == NO) {
                o.snapping = YES;
                CCMoveTo *move = [CCMoveTo actionWithDuration:self.rootView.model.activeGame.snapSpeed
                                                     position:o.position];
                [sprite runAction:move];
            }
        } else {
            o.snapping = NO;
            sprite.position = o.position;
            sprite.rotation = o.rotation;
        }
    }
    [super update:dt];
}

I would override the cocos2d touch methods:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event
- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
ccTouchBegan would compare the touch point with every game object and set a "touched" flag.
ccTouchMoved would set the position of any "touched" object to the new touch position, and set a "moving" flag.
ccTouchEnded would clear all "touched" flags, run the CCMoveTo to the new position for any object with the "moving" flag, and then clear all "moving" flags.
The update: method would then just have to set the position of every sprite to the position of its object.
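Here is a minimal sketch of that flow, assuming the layer keeps its game objects in a self.objects array, that GameObject exposes boundingBox, touched and moving properties, and that sprites are looked up by tag from a batch node (these names and the snap duration are illustrative assumptions, and the row/column drag constraint from the question is omitted for brevity):

- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint p = [self convertTouchToNodeSpace:touch];
    for (GameObject *o in self.objects) {          // self.objects: assumed array of game objects
        if (CGRectContainsPoint(o.boundingBox, p)) {
            o.touched = YES;
            return YES;                            // claim the touch
        }
    }
    return NO;
}

- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint p = [self convertTouchToNodeSpace:touch];
    for (GameObject *o in self.objects) {
        if (o.touched) {
            o.position = p;                        // follow the finger directly
            o.moving = YES;
        }
    }
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    for (GameObject *o in self.objects) {
        if (o.moving) {
            CCSprite *sprite = (CCSprite *)[self.batch getChildByTag:o.tag];
            // snap the sprite to the object's grid-aligned position
            [sprite runAction:[CCMoveTo actionWithDuration:0.2f position:o.position]];
        }
        o.touched = NO;
        o.moving = NO;
    }
}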

Related

How to control CAKeyframeAnimation with touch gesture?

I have a CAEmitterLayer animated along a closed bezier path (shaped like a figure '8', built from four control points) with a CAKeyframeAnimation. Now I want to control the animation by sliding a finger along (but not necessarily on) the path. Is this possible, and if so, how?
Make a CGPoint click; variable to remember your initial "drag" point, then create a local NSEvent monitor...
// click and yourDelta are instance (or __block) variables
[NSEvent addLocalMonitorForEventsMatchingMask:(NSMouseMovedMask | NSLeftMouseDownMask)
                                      handler:^NSEvent *(NSEvent *e) {
    if (e.type == NSLeftMouseDown) {
        click = e.locationInWindow;
    } else {
        // "yourDelta": the offset of the initial click from the current location
        NSPoint p = e.locationInWindow;
        yourDelta = NSMakePoint(click.x - p.x, click.y - p.y);
    }
    return e;
}];
"yourDelta" is the offset of that initial point from the current location... you could also achieve similar results with scroll events, by monitoring NSEventScrollWheelMask... and looking at the e.deltaX and e.deltaY values.
Edit: I'm not as familiar with event handling on iOS, but the same technique could be applied with the normal touch handlers, i.e.:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)e {
    click = [[touches anyObject] locationInView:_yourView];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)e {
    // "yourDelta": offset of the initial touch point from the current location
    CGPoint p = [[touches anyObject] locationInView:_yourView];
    yourDelta = CGPointMake(click.x - p.x, click.y - p.y);
}
As to "seeking" your animation.. One possible way is to simply [layer addAnimation:theNewAnimation], using your previous toValue's, but instead of basing the fromValue's of 0, or your model layer... use your layer.presentationLayer values instead? It is hard to say without seeing the full content of your CAKeyframeAnimation.

Drawing Straight Lines with Finger on iPhone

Background: I am trying to create a really simple iPhone app that will allow the user to draw multiple straight lines on the screen with their finger.
I'm using these two methods in my UIViewController to capture the coordinates of each line's endpoints.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
Question:
I would like the line to show up as soon as touchesEnded has fired and then keep drawing more lines on the screen. How do I do this? I don't necessarily need the code, but I need help with the big picture idea of how to put it together. Also, I'm not a huge fan of xibs and like to do things all programmatically if that affects the answer.
What I've Tried: I've tried using Quartz 2D, but it seems that in order to use it you have to do your drawing in the drawRect: method of a separate subclassed view. So would I have to create a new view for each line? And then my coordinates would be messed up, because I'd have to translate the touch positions from the UIViewController to the view.
I've also tried with OpenGL, which I've had a bit more success with (using the GLPaint sample as a template) but OpenGL seems like overkill for just drawing some straight lines on the screen.
You don't need multiple views, and you don't need OpenGL.
Make a subclass of UIView -- call it CanvasView.
Make an object to represent "a line" in your canvas -- it would be a subclass of NSObject with CGPoint properties for start and end.
CanvasView should keep an array of the lines that are in the canvas.
In -[CanvasView drawRect:], loop through the lines in the array, and draw each one.
In -[CanvasView touchesBegan:withEvent:], stash the start point in an instance variable.
In -[CanvasView touchesEnded:withEvent:], make a new line with the start and end points, and add it to your array of lines. Call [self setNeedsDisplay] to cause the view to be redrawn.
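A minimal sketch of that structure, assuming a simple Line model class as described (the class and property names other than CanvasView are illustrative):

// A trivial model object for one straight line.
@interface Line : NSObject
@property (nonatomic) CGPoint start;
@property (nonatomic) CGPoint end;
@end

@implementation Line
@end

@interface CanvasView : UIView
@property (nonatomic, strong) NSMutableArray *lines;   // of Line
@end

@implementation CanvasView {
    CGPoint _startPoint;                                // stashed in touchesBegan:
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _startPoint = [[touches anyObject] locationInView:self];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    Line *line = [[Line alloc] init];
    line.start = _startPoint;
    line.end = [[touches anyObject] locationInView:self];
    if (!self.lines) self.lines = [NSMutableArray array];
    [self.lines addObject:line];
    [self setNeedsDisplay];                             // redraw with the new line included
}

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
    for (Line *line in self.lines) {
        CGContextMoveToPoint(context, line.start.x, line.start.y);
        CGContextAddLineToPoint(context, line.end.x, line.end.y);
    }
    CGContextStrokePath(context);
}

@end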
You mentioned that you just need the big picture of the idea, so here it is:
Create a subclass of UIView.
Capture the touch events and then do the drawing in the drawRect: method.
This is all what @kurtRevis mentioned in his answer.
Now, to avoid slow drawing, keep an offscreen image that holds everything rendered so far, and just keep adding new lines on top of it. That way performance won't degrade when the array of lines to draw gets large.
Hope that makes sense.
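For illustration, one hedged way to do that caching on top of the CanvasView sketch above (the cachedImage property and appendLineToCache: method are assumptions, not an established API):

// Render each finished line into a cached UIImage once, so drawRect: just
// blits the image instead of re-stroking every line in the array.
- (void)appendLineToCache:(Line *)line {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    [self.cachedImage drawAtPoint:CGPointZero];          // previous contents, if any
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextMoveToPoint(context, line.start.x, line.start.y);
    CGContextAddLineToPoint(context, line.end.x, line.end.y);
    CGContextStrokePath(context);
    self.cachedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    [self.cachedImage drawInRect:self.bounds];           // constant cost, however many lines
}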
As far as I know you can also use Core Graphics to draw the lines, and from your question you don't need to create a view for every single line; your single view's graphics context can be the drawing sheet for everything, so you are almost there. Just by taking the touch coordinates you can draw the lines on the view.
CGPoint previousPoint; // Global scope: shared across all the touch events

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    previousPoint = [touch locationInView:self]; // take the starting touch point
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    // Note: UIGraphicsGetCurrentContext() is only valid while the view is drawing
    // (i.e. inside drawRect:), so in practice record these points here and do the
    // stroking in drawRect: after calling setNeedsDisplay.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextMoveToPoint(context, previousPoint.x, previousPoint.y);
    CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
    CGContextStrokePath(context);
}
Hope this helps; let me know if you run into any issues.

Drag and drop UIButton but limit to boundaries

I’m not asking for givemesamplecodenow responses, just a nudge in the right direction would be much appreciated. My searches haven’t been any good.
I have a UIButton that I am able to move freely around the screen.
I would like to limit the drag area of this object so it can only be moved up and down or side to side. I believe I need to get the x,y coordinates of the boundaries and then restrict movement outside this area. But that's as far as I have got; my knowledge doesn't stretch any further than that.
Has anyone implemented something similar in the past?
Adam
So let's say you're in the middle of the drag operation. You're moving the button instance around by setting its center to the center of whatever gesture is causing the movement.
You can impose restrictions by testing the gesture's center and resetting the center values if you don't like them. The below assumes a button wired to an action for all Touch Drag events but the principle still applies if you're using gesture recognizers or touchesBegan: and friends.
- (IBAction)handleDrag:(UIButton *)sender forEvent:(UIEvent *)event
{
    CGPoint point = [[[event allTouches] anyObject] locationInView:self.view];
    if (point.y > 200)
    {
        point.y = 200; // No dragging this button lower than 200px from the origin!
    }
    sender.center = point;
}
If you want a button that slides only on one axis, that's easy enough:
- (IBAction)handleDrag:(UIButton *)sender forEvent:(UIEvent *)event
{
    CGPoint point = [[[event allTouches] anyObject] locationInView:self.view];
    point.y = sender.center.y; // Always stick to the same y value
    sender.center = point;
}
Or perhaps you want the button draggable only inside the region of a specific view. This might be easier to define if your boundaries are complicated.
- (IBAction)handleDrag:(UIButton *)sender forEvent:(UIEvent *)event
{
    CGPoint point = [[[event allTouches] anyObject] locationInView:self.someView];
    if ([self.someView pointInside:point withEvent:nil])
    {
        // Only if the gesture center is inside the specified view will the button be moved
        sender.center = point;
    }
}
Presumably you'd be using touchesBegan:, touchesMoved:, etc., so it should be as simple as testing whether the touch point is outside your view's bounds in touchesMoved:. If it is outside, ignore it, but if it's inside, adjust the position of the button.
I suspect you may find this function useful:
bool CGRectContainsPoint ( CGRect rect, CGPoint point );
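For illustration, a minimal sketch of that check in touchesMoved:, assuming the allowed region is a CGRect ivar named allowedFrame and the button is reachable as self.dragButton (both are assumptions, not from the question):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(allowedFrame, point)) {
        self.dragButton.center = point;   // inside the boundary: move the button
    }
    // outside the boundary: ignore the touch
}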

How to determine the location of a touch using view property

I'm trying to develop some simple game apps. For a Pong-styled game, I have a moving ball that stays in bounds and two paddles. I implemented code that moves paddle 1 so that it reflects the ball as expected.
When I tried to add the same behavior to the other paddle, I tried this:
UITouch *touch = [[event allTouches] anyObject];     // Picks up the touch
CGPoint location = [touch locationInView:self.view]; // Gets coordinates to help move the
                                                     // paddle on the X axis; the Y axis is
                                                     // determined by the paddle, so it only
                                                     // moves along one axis
if ([touch view] == paddle2) {
    // move the second paddle
    ...
}
else {
    // move the first paddle
    ...
}
However, any touch only moves paddle1, which indicates the condition is never true. Based on the documentation, I thought that sending the [touch view] message would return the image view that was touched.
What am I doing wrong? Is there a simpler way to do this?
Do not use different views for the paddles; use one host view with different CALayers for drawing your objects.
You can move your paddles by assigning a new origin to the layers.
The view should process the following methods
- (void)touchesBegan:(NSSet*)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
to find out where you touched it.
Assuming you know the frames of your paddles, you can decide whether or not a given touch hit them.
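A rough sketch of that idea, assuming the two paddles are CALayer properties (paddle1Layer, paddle2Layer) on the host view; the names are illustrative:

#import <QuartzCore/QuartzCore.h>

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInView:self];

    // Decide which paddle the finger is over by testing the layer frames.
    CALayer *hit = nil;
    if (CGRectContainsPoint(self.paddle2Layer.frame, location)) {
        hit = self.paddle2Layer;
    } else if (CGRectContainsPoint(self.paddle1Layer.frame, location)) {
        hit = self.paddle1Layer;
    }

    if (hit) {
        [CATransaction begin];
        [CATransaction setDisableActions:YES];                     // no implicit animation while tracking
        hit.position = CGPointMake(location.x, hit.position.y);    // move along the X axis only
        [CATransaction commit];
    }
}

In practice you would probably record which paddle was hit in touchesBegan: and keep moving that one in touchesMoved:, so the finger can drift off the paddle while dragging.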
I hope this helps.

Problem with touchesMoved and drawRect

I have a sticky bug that I can't seem to figure out, and I think it has something to do with the way touchesMoved is implemented.
In touchesMoved, I check where the touch is (an if statement) and then, accordingly, call setNeedsDisplayInRect: on a 40 by 40 area near the touch point. In drawRect:, a black image is put down if there was a white image there before, and vice versa. At the same time that I call setNeedsDisplayInRect:, I set a boolean in an array of booleans so I can keep track of what the current image is, and therefore display the opposite. (Actually, I don't always flip the image... I look at what the first touch is going to do, like switch from black to white, and then put white images on all the subsequent touches, so it's kind of like drawing or erasing with the images.)
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    CGPoint lastTouchPoint = [touch previousLocationInView:self];
    touchX = touchPoint.x;
    touchY = touchPoint.y;

    int lastX = (int)floor((lastTouchPoint.x + 0.001) / 40);
    int lastY = (int)floor((lastTouchPoint.y + 0.001) / 40);
    int currentX = (int)floor((touchPoint.x + 0.001) / 40);
    int currentY = (int)floor((touchPoint.y + 0.001) / 40);

    if ((abs(currentX - lastX) >= 1) || (abs(currentY - lastY) >= 1))
    {
        if ([soundArray buttonStateForRow:currentX column:currentY] == firstTouchColor) {
            [soundArray setButtonState:!firstTouchColor row:currentX column:currentY];
            [self setNeedsDisplayInRect:CGRectMake(currentX * 40.0f, currentY * 40.0f, 40.0f, 40.0f)];
        }
    }
}
My problem is that the boolean array seems to be out of whack with the images I'm putting down. This only happens if I drag really fast across the screen. Eventually the boolean array and the images are no longer in sync, even though I set them at the same time. Any idea what is causing this, or what I can do to fix it?
Here's my drawRect:
- (void)drawRect:(CGRect)rect {
    int x = (int)floor((touchX + 0.001) / 40);
    int y = (int)floor((touchY + 0.001) / 40);
    CGPoint origin = CGPointMake(x * 40.0f, y * 40.0f);

    if ([soundArray buttonStateForRow:x column:y])
        [whiteImage drawAtPoint:origin];
    else
        [blackImage drawAtPoint:origin];
}
I figured out the answer to this. touchX and touchY were instance variables, and they were getting reset in touchesMoved before each call to drawRect was complete. Therefore, if I moved fast on the screen, touchesMoved would get called, then call drawRect, then touchesMoved would get called again before drawRect had used touchX and touchY, so the drawing would get out of sync with the boolean array backend.
To solve this, I stopped using touchX and touchY in drawRect:, and started deriving the same point from the dirty rect passed into drawRect: (the one supplied to setNeedsDisplayInRect: in touchesMoved).
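For illustration, here is roughly what that drawRect: change looks like, deriving the grid cell from the rect parameter instead of the touchX/touchY ivars (a sketch, not the author's exact code):

- (void)drawRect:(CGRect)rect {
    // The dirty rect passed in is the 40x40 cell that touchesMoved invalidated,
    // so its origin already identifies the cell; no shared ivars needed.
    int x = (int)floor((rect.origin.x + 0.001) / 40);
    int y = (int)floor((rect.origin.y + 0.001) / 40);
    CGPoint origin = CGPointMake(x * 40.0f, y * 40.0f);

    if ([soundArray buttonStateForRow:x column:y])
        [whiteImage drawAtPoint:origin];
    else
        [blackImage drawAtPoint:origin];
}

Note that UIKit can coalesce several invalidated rects into one larger dirty rect, in which case you would need to loop over every 40x40 cell that intersects rect rather than assuming a single cell.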
tada!