Understanding CGPoint method - objective-c

I am having trouble understanding some of the math in the following tutorial:
Sprite Kit Tutorial
I am not sure how to comprehend offset. About half way through the tutorial, Ray uses the following code:
UITouch * touch = [touches anyObject];
CGPoint location = [touch locationInNode:self];
// 2 - Set up initial location of projectile
SKSpriteNode * projectile = [SKSpriteNode spriteNodeWithImageNamed:@"projectile"];
projectile.position = self.player.position;
// 3- Determine offset of location to projectile
CGPoint offset = rwSub(location, projectile.position);
where rwSub is
static inline CGPoint rwSub(CGPoint a, CGPoint b) {
    return CGPointMake(a.x - b.x, a.y - b.y);
}
I know this code works, but I don't understand it. I tried NSLogging the touch point and the offset point, and they do not form a triangle like it shows in the picture:
(source: raywenderlich.com)
This is what I got from my output:
Touch Location
X: 549.000000 Y: 154.000000
Offset
X: 535.500000 Y: -6.000000
This does not form a vector in the correct direction... but it still works?
Is anyone able to explain how the offset works?

Offset is the difference between the point you touched and the ninja's position. So the touch you logged is 535 points to the right of the player, and 6 points down (-6).
So it is going in the correct direction, relative to the player.
The tutorial also forces the ninja star to travel offscreen, via
// 6 - Get the direction of where to shoot
CGPoint direction = rwNormalize(offset);
// 7 - Make it shoot far enough to be guaranteed off screen
CGPoint shootAmount = rwMult(direction, 1000);
// 8 - Add the shoot amount to the current position
CGPoint realDest = rwAdd(shootAmount, projectile.position);
Draw some pictures, it will help you understand.

The offset in this case simply represents the location of the touch relative to the character, and lets you know where the projectile will be aimed.
In the tutorial, on the next lines you can see :
// 4 - Bail out if you are shooting down or backwards
if (offset.x <= 0) return;
In this example offset.x < 0 means that the projectile is targeting something behind the ninja on the x axis, where 0 is the x-coordinate of the character.
The idea here is to translate the projectile's target coordinates into the character's own frame of reference, to better understand their positions relative to each other.

Related

Why does mouseDragged position lag behind actual mouse?

I'm trying to make a SpriteKit game where the player can drag groups of sprites around, but I can't figure out how to get the sprite to follow the mouse cursor. Using the SpriteKit boilerplate, I get this:
Here is the relevant logic for how I move the "Hello, world!" sprite in the SKNode babies
SKNode *babies;
CGPoint dragStart;   // cursor location when the drag starts
CGPoint babiesStart; // babies' position when the drag starts

- (void)mouseDown:(NSEvent *)theEvent {
    dragStart = [theEvent locationInWindow];
    babiesStart = babies.position;
}

- (void)mouseDragged:(NSEvent *)theEvent {
    CGPoint translation = CGPointMake([theEvent locationInWindow].x - dragStart.x,
                                      [theEvent locationInWindow].y - dragStart.y);
    float adjust = 1.0;
    babies.position = CGPointMake(babiesStart.x + translation.x * adjust,
                                  babiesStart.y + translation.y * adjust);
}
I've tried a number of different methods, such as deltaX and Y on theEvent but I get the same result. The only solution I've found is to play with the adjust variable, but that's clearly a hack.
NSEvent has another method in SpriteKit, - (CGPoint)locationInNode:(SKNode *)node. By using this I was able to get correct offset values for moving the SKNode along with the mouse.
Please try [theEvent locationInNode:self];
If you don't accumulate the delta in mouseDragged, losing some movement is inevitable.
In my case the following works quite ok.
override func mouseDown(with event: NSEvent) {
    previous = event.location(in: self)
}

override func mouseDragged(with event: NSEvent) {
    current = event.location(in: self)
    ...
    delta += (current - previous)
    previous = current
    ...
}

override func update(_ currentTime: TimeInterval) {
    ...
    // use up your delta
    delta = .zero
}
cheers
My guess is that the issue is with coordinate spaces. You're performing calculations based on -[NSEvent locationInWindow] which is, of course, in the window coordinate system. In what coordinate system is babies.position? It's at least in a view's coordinate system, although maybe SpriteKit also imposes another coordinate space.
To convert the point to the view's coordinate space, you will want to use NSPoint point = [theView convertPoint:[theEvent locationInWindow] fromView:nil];. To convert the point from the view's coordinate space to the scene's, you'd use CGPoint cgpoint = [theScene convertPointFromView:NSPointToCGPoint(point)];. If babies is not the scene object, then to convert to the coordinate system used by babies.position, you'd do cgpoint = [babies.parent convertPoint:cgpoint fromNode:scene];. You'd then compute translation by taking the difference between babiesStart and cgpoint.
Update: actually, you wouldn't compare the result with babiesStart as such. You'd compare it with the result of the same coordinate transformation done on the original cursor location. So, you'd compute dragStart similar to how you'd compute cgpoint. Later, you'd take the difference between those.
This is normal behavior.
When you take the mouse position as reported by the event, some time passes before the Sprite Kit view is redrawn, at which point the cursor has already moved to a new position.
There isn't much you can do about it, except maybe predict the position for very fast movement: factor in the distances of the previous mouse events to predict where the next position is likely to be, then take the actual mouse position and adjust it a little in the most recent general movement direction.
Usually this is overkill though.
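If you did want to try prediction, a minimal sketch would look like the following. One-step linear extrapolation is my own simplification of the idea above, not anything SpriteKit provides:

```c
/* One-step linear extrapolation: replay the most recent movement delta
   to guess where the cursor will be by the time the frame is drawn. */
typedef struct { double x, y; } Point;

static Point predictNext(Point prev, Point last) {
    return (Point){ last.x + (last.x - prev.x),
                    last.y + (last.y - prev.y) };
}
```

With the last two samples at (100, 100) and (110, 104), this predicts (120, 108). A real implementation would blend the prediction with the raw position and damp it for slow movement.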

How can I ensure I still get correct touch inputs when my scene is offset?

How can I accept touch input beyond the scene's bounds, so that no matter what I set self.position to, touches can still be detected?
I'm creating a tile-based game from Ray Wenderlich's tutorial on Cocos2d version 3.0. I am at the point of setting the view of the screen to a zoomed in state on my tile map. I have successfully been able to do that, although now my touches are not responding since I'm out of the coordinate space the touches used to work on.
This method is called to set the zoomed view to the player's position:
-(void)setViewPointCenter:(CGPoint)position {
    CGSize winSize = [CCDirector sharedDirector].viewSizeInPixels;
    int x = MAX(position.x, winSize.width / 2);
    int y = MAX(position.y, winSize.height / 2);
    x = MIN(x, (_tileMap.mapSize.width * _tileMap.tileSize.width) - winSize.width / 2);
    y = MIN(y, (_tileMap.mapSize.height * _tileMap.tileSize.height) - winSize.height / 2);
    CGPoint actualPosition = ccp(x, y);
    CGPoint centerOfView = ccp(winSize.width / 2, winSize.height / 2);
    NSLog(@"centerOfView%@", NSStringFromCGPoint(centerOfView));
    CGPoint viewPoint = ccpSub(centerOfView, actualPosition);
    NSLog(@"viewPoint%@", NSStringFromCGPoint(viewPoint));
    // This changes the position of the helloworld layer/scene so that
    // we can see the portion of the tilemap we're interested in.
    // That however makes my touchBegan method stop firing.
    self.position = viewPoint;
}
This is what the NSLog prints from the method:
2014-01-30 07:05:08.725 TestingTouch[593:60b] centerOfView{512, 384}
2014-01-30 07:05:08.727 TestingTouch[593:60b] viewPoint{0, -832}
As you can see, the y coordinate is -832. If I comment out the line self.position = viewPoint then self.position reads {0, 0} and touches are detectable again, but then we don't have a zoomed view on the character. Instead it shows the view on the bottom left of the map.
Here's a video demonstration.
How can I fix this?
Update 1
Here is the github page to my repository.
Update 2
Mark has been able to come up with a temporary solution so far by setting the hitAreaExpansion to a large number like so:
self.hitAreaExpansion = 10000000.0f;
This will cause touches to respond again all over! However, if there is a solution that would not require me to set the property with an absolute number then that would be great!
-edit 3-(tldr version):
setting the contentSize of the scene/layer to the size of the tilemap solves this issue:
[self setContentSize: self.tileMap.contentSize];
original replies below:
You would take the touch coordinate and subtract the layer position.
Generally something like:
touchLocation = ccpSub(touchLocation, self.position);
if you were to scale the layer, you would also need appropriate translation for that as well.
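As a sketch of both corrections together (the function name and the single uniform scale factor are illustrative assumptions, not Cocos2d API):

```c
/* Undo the layer's offset, then its scale, to map a screen-space touch
   into layer-space coordinates. With scale == 1 this is just ccpSub. */
typedef struct { double x, y; } Point;

static Point screenToLayer(Point touch, Point layerPos, double scale) {
    return (Point){ (touch.x - layerPos.x) / scale,
                    (touch.y - layerPos.y) / scale };
}
```

With the logged viewPoint of {0, -832}, a touch at the screen center (512, 384) maps to (512, 1216) in layer space — far outside the layer's default touchable area, which is consistent with the touches no longer firing.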
-edit 1-:
So, I had a chance to take another look, and it looks like my 'ridiculous' number was not ridiculous enough, or I had made another change. Anyway, if you simply add
self.hitAreaExpansion = 10000000.0f; // I'll let you find a more reasonable number
the touches will now get registered.
As for the underlying issue, I believe it to be one of content scale that is not set correctly, but again, I'll now leave that to you. I did however find out, while looking through some of the tilemap class, that tileSize is said to be in pixels, not points, which I guess is somehow related to this.
-edit 2-:
It bugged me with the sub-optimal answer, so I looked a little further. Forgive me, I hadn't looked at v3 until I saw this question. :p
after inspecting the base class and observing the scene/layer's value of:
- (BOOL)hitTestWithWorldPos:(CGPoint)pos;
it became obvious that the content size of the scene/layer was being set to the current view size, which in the case of an iPad is (1024, 768)
The position of the layer after the setViewPointCenter call is fully above the initial view's position, hence the touch was being suppressed. By setting the layer/scene contentSize to the size of the tilemap, the touchable area is now expanded over the entire map, which allows the node to process the touch.

Trying to get my head around simulating momentum / inertia with a UIRotationGestureRecognizer

Okay, so I'm trying to make a 'Wheel of Fortune' type of effect with a wheel shape in iOS, where I can grab and spin a wheel. I can currently drag and spin the wheel around to my heart's content, but upon releasing my finger, it stops dead. I need to apply some momentum or inertia to it, to simulate the wheel spinning down naturally.
I've got the velocity calculation in place, so when I lift my finger up I NSLog out a velocity (between a minimum of 1 and a maximum of 100), which ranges from anywhere between 1 and over 1800 (at my hardest flick!), now I'm just trying to establish how I would go about converting that velocity into an actual rotation to apply to the object, and how I'd go about slowing it down over time.
My initial thoughts were something like: begin rotating full circles on a loop at the same speed as the velocity that was given, then on each subsequent rotation, slow the speed by some small percentage. This should give the effect that a harder spin goes faster and takes longer to slow down.
I'm no mathematician, so my approach may be wrong, but if anybody has any tips on how I could get this to work, at least in a basic state, I'd be really grateful. There's a really helpful answer here: iPhone add inertia/momentum physics to animate "wheel of fortune" like rotating control, but it's more theoretical and lacking in practical information on how exactly to apply the calculated velocity to the object, etc. I'm thinking I'll need some animation help here, too.
EDIT: I'm also going to need to work out if they were dragging the wheel clockwise or anti-clockwise.
Many thanks!
I have written something analogous for my program Bit, but my case I think is a bit more complex because I rotate in 3D: https://itunes.apple.com/ua/app/bit/id366236469?mt=8
Basically what I do is I set up an NSTimer that calls some method regularly. I just take the direction and speed to create a rotation matrix (as I said, 3D is a bit nastier :P ), and I multiply the speed with some number smaller than 1 so it goes down. The reason for multiplying instead of subtracting is that you don't want the object to rotate twice as long if the spin from the user is twice as hard since that becomes annoying to wait on I find.
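The multiplicative decay is easy to sanity-check in plain C. The 0.95 factor and 1.0 cutoff below are arbitrary tuning values of my own, not numbers from the answer:

```c
/* Multiply the speed by a constant factor < 1 each tick until it falls
   below a cutoff; returns how many ticks the spin-down takes. */
static int ticksUntilStopped(double speed, double decay, double cutoff) {
    int ticks = 0;
    while (speed >= cutoff) {
        speed *= decay;
        ticks++;
    }
    return ticks;
}
```

A flick of 100 units dies out in 90 ticks, and a flick twice as hard in 104 — noticeably longer, but nowhere near twice as long, which is exactly why multiplying beats subtracting here.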
As for figuring out which direction the wheel is spinning, just store that in the touchesEnded:withEvent: method where you have all the information. Since you say you already have the tracking working as long as the user has the finger down this should hopefully be obvious.
What I have in 3D is something like:
// MyView.h
@interface MyView : UIView {
    NSTimer *animationTimer;
}
- (void)startAnimation;
@end

// MyAppDelegate.m
@implementation MyAppDelegate
- (void)applicationDidFinishLaunching:(UIApplication *)application {
    [myView startAnimation];
}
@end

// MyView.m
GLfloat rotationMomentum = 0;
GLfloat rotationDeltaX = 0.0f;
GLfloat rotationDeltaY = 0.0f;

@implementation MyView

- (void)startAnimation {
    animationTimer = [NSTimer scheduledTimerWithTimeInterval:(NSTimeInterval)((1.0 / 60.0) * animationFrameInterval)
                                                      target:self
                                                    selector:@selector(drawView:)
                                                    userInfo:nil
                                                     repeats:TRUE];
}

- (void)drawView:(id)sender {
    addRotationByDegree(rotationMomentum);
    rotationMomentum /= 1.05;
    if (rotationMomentum < 0.1)
        rotationMomentum = 0.1; // never stop rotating completely
    [renderer render];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    CGPoint loc = [aTouch locationInView:self];
    CGPoint prevloc = [aTouch previousLocationInView:self];
    rotationDeltaX = loc.x - prevloc.x;
    rotationDeltaY = loc.y - prevloc.y;
    GLfloat distance = sqrt(rotationDeltaX * rotationDeltaX + rotationDeltaY * rotationDeltaY) / 4;
    rotationMomentum = distance;
    addRotationByDegree(distance);
    self->moved = TRUE;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
}

@end
I've left out the addRotationByDegree function but what it does is that it uses the global variables rotationDeltaX and rotationDeltaY and applies a rotational matrix to an already stored matrix and then saves the result. In your example you probably want something much simpler, like (I'm assuming now that only movements in the X direction spin the wheel):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    CGPoint loc = [aTouch locationInView:self];
    CGPoint prevloc = [aTouch previousLocationInView:self];
    GLfloat distance = loc.x - prevloc.x;
    rotationMomentum = distance;
    addRotationByDegree(distance);
    self->moved = TRUE;
}

void addRotationByDegree(GLfloat distance) {
    angleOfWheel += distance; // probably need to divide the number by something reasonable here to have the spin be nicer
}
It's going to be a rough answer as I don't have any detailed example at hand.
If you have the velocity when you lift your finger already then it should not be hard.
The velocity you have is in pixels per second or something like that.
First you need to convert that linear speed to an angular speed. That can be done by knowing the perimeter of the circle, 2*PI*radius, and then doing 2*PI/perimeter*velocity (which simplifies to velocity/radius) to get the angular speed in radians per second.
If your wheel didn't have any friction in its axis it would run forever at that speed. Well, you can just pick an arbitrary value for this friction, which is a deceleration and can be expressed in pixels per second squared, or radians per second squared for an angular deceleration. Then it's just a matter of dividing the angular speed by this angular deceleration and you get the time until it stops.
With the animation time you can use the equation finalAngle = initialAngle + angularSpeed*animationTime - angularAcceleration/2*animationTime*animationTime to get the final angle your wheel is going to be at the end of the animation. Then just do an animation on the transformation and rotate it by that angle for the time you got and say that your animation should ease out.
This should look realistic enough. If not you'll need to give an animation path for the rotation property of your wheel based on some samples from the equation from above.
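The kinematics above, worked through in plain C (the sample numbers in the usage note are made up for illustration):

```c
/* omega = v / r, i.e. 2*PI/perimeter * v with the 2*PI cancelled out. */
static double angularSpeed(double linearSpeed, double radius) {
    return linearSpeed / radius;
}

/* Time until the wheel stops under constant angular deceleration alpha. */
static double spinDownTime(double omega, double alpha) {
    return omega / alpha;
}

/* finalAngle = initialAngle + omega*t - alpha/2 * t*t, with t = omega/alpha. */
static double finalAngle(double initialAngle, double omega, double alpha) {
    double t = omega / alpha;
    return initialAngle + omega * t - 0.5 * alpha * t * t;
}
```

For instance, a rim speed of 300 px/s on a 150 px wheel gives 2 rad/s; with a deceleration of 0.5 rad/s² it stops after 4 s, having turned 4 radians past its starting angle — those are the values to feed the ease-out animation.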

iOS: Dynamically drawing on canvas(context)

I'm currently making an application which draws a network of signs to the screen (on a CGContextRef). So far everything is going great, but now I'm finding myself in a situation I can't solve:
I'm trying to draw an object dynamically, knowing only the line it's on (I have the start and end points' x and y coordinates). With these I found the middle of the line; this is where the symbol should be drawn. From there I found the angle of the line (with the top as 0). This is the information I have right now:
CGPoint firstLocation;
CGPoint secondLocation;
CGPoint middleLocation;
double x1 = firstLocation.x;
double y1 = firstLocation.y;
double x2 = middleLocation.x;
double y2 = middleLocation.y;
float a = (atan2(y2-y1, x2-x1) * (180/M_PI)) - 90;
I looked at using a transform function (like CGAffineTransform) on a CGRect, but this doesn't seem to work, as I need to rotate the rect around its center, and a CGRect would only rotate around its origin.
I want to create the following symbols with the above information:
Any help is appreciated, and if you need any more information please tell me!
In my app I do something similar. I have a path that I add a transform to before drawing. The transform shifts the path to the midpoint, rotates it, and shifts it back:
// Rotate the path such that it points to the end coordinate
CGAffineTransform t = CGAffineTransformTranslate(
CGAffineTransformRotate(
CGAffineTransformMakeTranslation(middleLocation.x, middleLocation.y),
-a),
-middleLocation.x, -middleLocation.y);
CGMutablePathRef path = CGPathCreateMutable();
CGPoint points[8] = { ... these are the 8 points in my path ... };
CGPathAddLines(path, &t, points, 8);
You don't have to use CGPathAddLines, that was just the easiest way for me to construct the path. All of the CGPathAdd... functions can take a transform.
If you're not using CGPath, you could do a similar transform in the context itself by doing CGContextTranslateCTM and CGContextRotateCTM.

Update coordinates of sprite in game

This has really been a hassle, but I have two sprites, both of them 17-pixel-long arms. Both of them have anchor points at ccp(0.5f, 0.0f), and what I want is for arm2's position, when arm1 rotates, to be equal to the end of arm1 opposite its anchor point. Like, at a 45 degree angle, the CGPoint would just be ccp(arm1.position.y, arm1.position.x + 17);
So I update the rotation for arm1 in my ccTime function, and it calls another method to do the math for the angle rotation. Basically what happens is... arm2 rotates reaaaally fast around in the correct circular area, meaning something is right, but the rotation is just super fast.
-(void)callEveryFrame:(ccTime)dt {
    // if the user is holding down on the screen, arm1 rotates.
    if (held == true) {
        _theArm2.position = [self fixAngle];
        timer = timer + .0166; // this gets reset after touchesEnded.
        _theArm1.rotation = _theArm1.rotation - (timer * 10);
    }
}

-(CGPoint)fixAngle {
    CGFloat theAngle = _theArm1.rotation;
    // I'm not too sure how the degrees work in cocos2d, so I added 90 to the
    // angle of rotation's original position; it works until the rotation variable changes.
    CGFloat thyAngle = theAngle + 90;
    CGFloat theMathx = (17 * cosf(thyAngle)); // 17 is the length of the arm
    CGFloat theMathy = (17 * sinf(thyAngle));
    theMathx = theMathx + 100; // these 2 updates just change the position, because arm1's
    theMathy = theMathy + 55;  // CGPoint is located at ccp(100,57)
    return CGPointMake(theMathx, theMathy);
}
Sorry if the code is... bad. I'm relatively new to programming, but everything works except the stupid arm2 likes to rotate really fast in a circle.
I will love whoever solves my problem for the rest of my/their lives.
EDIT:
Per discussion on this thread, it looks like you are using degrees where radians should be used, but not where I thought. Try this:
CGFloat theMathx = (17*cosf(CC_DEGREES_TO_RADIANS(thyAngle))); //17 is the length of the arm
CGFloat theMathy = (17*sinf(CC_DEGREES_TO_RADIANS(thyAngle)));
Cocos2D's rotation property is in degrees, but cosf and sinf expect radians, so you need to convert the angle (including the extra 90 degrees) before doing the trig.