Simulate "Newton's law of universal gravitation" using Box2D - physics

I want to simulate Newton's law of universal gravitation using Box2D.
I went through the manual but couldn't find a way to do this.
Basically what I want to do is place several objects in space (zero gravity) and simulate the movement.
Any tips?

It's pretty easy to implement:
for ( int i = 0; i < numBodies; i++ ) {
    b2Body* bi = bodies[i];
    b2Vec2 pi = bi->GetWorldCenter();
    float mi = bi->GetMass();
    // start at i + 1 so each pair is handled exactly once and a body
    // never attracts itself (r would be zero and the force would blow up)
    for ( int k = i + 1; k < numBodies; k++ ) {
        b2Body* bk = bodies[k];
        b2Vec2 pk = bk->GetWorldCenter();
        float mk = bk->GetMass();
        b2Vec2 delta = pk - pi;
        float r = delta.Length();
        float force = G * mi * mk / (r * r);
        delta.Normalize();
        bi->ApplyForce(  force * delta, pi );
        bk->ApplyForce( -force * delta, pk );
    }
}

Unfortunately, Box2D doesn't have native support for it, but you can implement it yourself: Box2D and radial gravity code

As others have said, Box2D has no built-in support for it, but you can add support to the library itself in b2_island.cpp. Just replace
v += h * b->m_invMass * (b->m_gravityScale * b->m_mass * gravity + b->m_force);
with
int planet_x = 0;
int planet_y = 0;
// vector pointing from the body toward the planet
b2Vec2 gravityVector = b2Vec2(planet_x, planet_y) - b->GetPosition();
gravityVector.Normalize();
gravityVector.x *= 10.0f;
gravityVector.y *= 10.0f;
v += h * b->m_invMass * (b->m_gravityScale * b->m_mass * gravityVector + b->m_force);
That's a simple solution if you have only one planet.
If you want the force to weaken with distance, you could divide the vector by the square of its length instead of normalizing it. That would also make it possible to add up the gravity of two planets: iterate over a planet list and sum the gravity vectors (see the sketch below). Implementing a function like b2World::CreatePlanet might be useful then.
The 10.0f is just an approximation of Earth's 9.81f; you may need to adjust it. If the mass of the planet is relevant, multiply in a constant to make it look more realistic, or simply increase the density of the object so it matches the real mass of a planet.
Of course you could also set the world gravity to (0, 0) and compute the forces yourself before each step for every object, but that may not perform as well.
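A minimal sketch of that per-step approach, assuming you maintain your own lists of bodies; ApplyPlanetGravity, planets, satellites, and G are names made up for illustration, not Box2D API. Depending on your Box2D version, ApplyForce may also take a third wake parameter.
#include <vector>
#include <Box2D/Box2D.h>

// Call once per frame, before world.Step(). World gravity is set to (0, 0).
void ApplyPlanetGravity(const std::vector<b2Body*>& planets,
                        const std::vector<b2Body*>& satellites,
                        float G)
{
    for (b2Body* body : satellites) {
        b2Vec2 total(0.0f, 0.0f);
        for (b2Body* planet : planets) {
            if (planet == body) continue;
            b2Vec2 delta = planet->GetWorldCenter() - body->GetWorldCenter();
            float r2 = delta.LengthSquared();
            if (r2 < 1e-4f) continue;            // avoid the singularity at r = 0
            delta.Normalize();
            // F = G * m1 * m2 / r^2, pointing from the body toward the planet
            total += (G * planet->GetMass() * body->GetMass() / r2) * delta;
        }
        body->ApplyForce(total, body->GetWorldCenter());
    }
}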

Related

How do I randomize the starting direction of a ball in SpriteKit?

I've started trying a few things with SpriteKit for game development. I was creating a brick-breaking game and ran into an issue: how do I randomize the starting direction of the ball?
My ball has the following properties
ball.physicsBody.friction = 0;
ball.physicsBody.linearDamping = 0;
ball.physicsBody.restitution = 1 ; //energy lost on impact or bounciness
To start in a different direction each game, I randomize a selection from four preset vectors. I'm using the applyImpulse method to send the ball in a particular direction, and I need to make sure the ball doesn't move too slowly if the vector values are low.
int initialDirection = arc4random() % 10;
CGVector myVector;
if (initialDirection < 2)
{
    myVector = CGVectorMake(4, 7);
}
else if (initialDirection > 3 && initialDirection <= 6)
{
    myVector = CGVectorMake(-7, -5);
}
else if (initialDirection > 6 && initialDirection <= 8)
{
    myVector = CGVectorMake(-5, -8);
}
else
{
    myVector = CGVectorMake(8, 5);
}
//apply the vector
[ball.physicsBody applyImpulse:myVector];
Is this the right way to do it? I tried using applyForce method but then, ball slowed down after the force was applied.
Is there any way I can randomize the direction and still maintain a speed for my ball ?
The basic steps:
1. Randomly select an angle in [0, 2π)
2. Select the magnitude of the impulse
3. Form the vector by converting the magnitude/angle pair to vector components
Here's an example of how to do that
ObjC:
CGFloat angle = arc4random_uniform(1000) / 1000.0 * 2.0 * M_PI; // random angle in [0, 2π)
CGFloat magnitude = 4;
CGVector vector = CGVectorMake(magnitude * cos(angle), magnitude * sin(angle));
[ball.physicsBody applyImpulse:vector];
Swift:
let angle = CGFloat(arc4random_uniform(1000)) / 1000.0 * 2.0 * CGFloat.pi // random angle in [0, 2π)
let magnitude: CGFloat = 4
let vector = CGVector(x: magnitude * cos(angle), y: magnitude * sin(angle))
ball.physicsBody?.applyImpulse(vector)

Calculating the center of mass of a body being tracked using Kinect?

I am working with the Kinect for my research project. I have previously calculated joint angles and joint coordinates from the Kinect, and now I would like to calculate the center of mass of the body being tracked.
Any ideas would be appreciated, and code snippets would be immensely helpful.
I owe a lot to Stack Overflow; without the community's help this would not have been possible.
Thanks in advance.
Here is the code where I want to include the center-of-mass calculation. This function retrieves the tracked skeleton:
Skeleton GetFirstSkeleton(AllFramesReadyEventArgs e)
{
    using (SkeletonFrame skeletonFrameData = e.OpenSkeletonFrame())
    {
        if (skeletonFrameData == null)
        {
            return null;
        }
        skeletonFrameData.CopySkeletonDataTo(allSkeletons);
        // get the first tracked skeleton
        Skeleton first = (from s in allSkeletons
                          where s.TrackingState == SkeletonTrackingState.Tracked
                          select s).FirstOrDefault();
        return first;
    }
}
I have tried using the code below, but it doesn't fit into my code as-is. Can anyone please help me include the center-of-mass calculation?
foreach (SkeletonData data in skeletonFrame.Skeletons)
{
    SkeletonFrame allskeleton = e.SkeletonFrame;
    // Count passive and active persons, up to six in the group
    int numberOfSkeletonsT = (from s in allskeleton.Skeletons
                              where s.TrackingState == SkeletonTrackingState.Tracked
                              select s).Count();
    int numberOfSkeletonsP = (from s in allskeleton.Skeletons
                              where s.TrackingState == SkeletonTrackingState.PositionOnly
                              select s).Count();
    int totalSkeletons = numberOfSkeletonsP + numberOfSkeletonsT;
    //Console.WriteLine("TotalSkeletons = " + totalSkeletons);

    if (data.TrackingState == SkeletonTrackingState.PositionOnly)
    {
        foreach (Joint joint in data.Joints)
        {
            if (joint.Position.Z != 0)
            {
                double centerofmassX = com.Position.X;
                double centerofmassY = com.Position.Y;
                double centerofmassZ = com.Position.Z;
                Console.WriteLine(centerofmassX + centerofmassY + centerofmassZ);
            }
        }
    }
}
See a couple of resources here:
http://mathwiki.ucdavis.edu/Calculus/Vector_Calculus/Multiple_Integrals/Moments_and_Centers_of_Mass#Three-Dimensional_Solids
http://www.slideshare.net/GillianWinters/center-of-mass-presentation
http://en.wikipedia.org/wiki/Locating_the_center_of_mass
Basically, no matter what, you are going to need the mass of your user. This can be a simple input; from it you can determine how much weight the person puts on each foot and use the equations described in these sources. Another option is to use plumb lines on a 2D planar representation of the user, but that won't be an accurate 3D center of mass.
Here is an example of how to find how much mass is on each foot, using the equation found at http://www.vitutor.com/geometry/distance/line_plane.html:
Vector3 v = new Vector3(skeleton.Joints[JointType.Head].Position.X,
                        skeleton.Joints[JointType.Head].Position.Y,
                        skeleton.Joints[JointType.Head].Position.Z);
double mass = 0; // TODO: supply the user's mass as input (see above)
double leftM, rightM;
double A = sFrame.FloorClipPlane.X,
       B = sFrame.FloorClipPlane.Y,
       C = sFrame.FloorClipPlane.Z;
// angle between the head vector and the floor plane, in degrees:
// sin(angle) = |A*x + B*y + C*z| / (|(A,B,C)| * |v|)
double angle = Math.Asin(Math.Abs(A * v.X + B * v.Y + C * v.Z) /
               (Math.Sqrt(A * A + B * B + C * C) * Math.Sqrt(v.X * v.X + v.Y * v.Y + v.Z * v.Z)))
               * (180.0 / Math.PI);
if (angle == 90.0)
{
    leftM = mass / 2.0;
    rightM = mass / 2.0;
}
else
{
    double distanceFrom90 = 90.0 - angle;
    if (distanceFrom90 > 0)
    {
        double leftMultiple = distanceFrom90 / 90.0;
        leftM = mass * leftMultiple;
        rightM = mass - leftM;
    }
    else
    {
        double rightMultiple = distanceFrom90 / 90.0;
        rightM = rightMultiple * mass;
        leftM = mass - rightM;
    }
}
This is of course assuming that the user is standing on both feet, but you could modify the code to create a new plane based on the user's feet instead of the one generated automatically by the Kinect.
To then find the center of mass, you have to choose a datum. I would choose the head, as it is the top of the person and you can measure down from it easily:
double distanceFromDatumLeft  = Math.Sqrt(Math.Pow(headX - footLeftX, 2) + Math.Pow(headY - footLeftY, 2) + Math.Pow(headZ - footLeftZ, 2));
double distanceFromDatumRight = Math.Sqrt(Math.Pow(headX - footRightX, 2) + Math.Pow(headY - footRightY, 2) + Math.Pow(headZ - footRightZ, 2));
double momentLeft = distanceFromDatumLeft * leftM;
double momentRight = distanceFromDatumRight * rightM;
double momentSum = momentLeft + momentRight;
//measured in units from the datum
double centerOfGravity = momentSum / mass;
You can then, of course, show this on screen by plotting a point that is centerOfGravity units below the head.
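As a minimal sketch of that final step, in plain C++-style pseudocode rather than the Kinect SDK (Point3 and the parameter names are made up for illustration; the head coordinates and centerOfGravity come from the snippets above):
struct Point3 { double x, y, z; };

// The marker sits centerOfGravity units straight down (negative Y) from the head joint.
Point3 centerOfGravityMarker(double headX, double headY, double headZ, double centerOfGravity)
{
    Point3 p = { headX, headY - centerOfGravity, headZ };
    return p;
}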

Lego NXT-RobotC ultrasonic sensor

I am a newbie in programming, so I need help with my ultrasonic-sensor-driven NXT robot.
The sensor is attached to motor A, and I'd like it to scan the room from the robot's centerline to 90° left and 90° right in 30° increments (seven measurements total), store the data in an array, and, based on the largest distance, point the robot in the direction that measurement was taken so it avoids obstacles.
Is this possible at all? Or is there some better solution?
Any advice or suggestion is more than welcome.
This would work reasonably well for avoiding obstacles. As for implementing it, it depends on what you are programming the robot in. I only know leJOS (Java), in which a function for finding the angle to head toward would look something like this:
public static int scanArea(RegulatedMotor motorTop, UltrasonicSensor sonar) {
    int theAngle = 0;
    int largestDist = 0;
    int currentDist;
    for (int rotateAngle = -90; rotateAngle <= 90; rotateAngle += 30) {
        motorTop.rotateTo(rotateAngle);
        currentDist = sonar.getDistance();
        if (currentDist > largestDist) {
            largestDist = currentDist;
            theAngle = rotateAngle;
        }
        Delay.msDelay(25);
    }
    motorTop.rotateTo(0);
    return theAngle;
}
If you're coding the robot in any text-based language, that should be fairly easy to convert (assuming you have a function such as rotateTo; otherwise you would have to use relative movements, as sketched below). I don't know how easy this would be in the graphical programming language you were using originally.
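A minimal sketch of the relative-movement variant, in C-style code; rotateBy and readDistance are hypothetical placeholders for whatever relative-rotation and sonar-read calls your environment provides:
// rotateBy() and readDistance() are assumed stand-ins, not real API calls.
void rotateBy(int degrees);
int  readDistance(void);

int scanAreaRelative(void) {
    int bestAngle = -90;
    int largestDist = 0;
    rotateBy(-90);                    // swing from the centerline to far left
    for (int angle = -90; angle <= 90; angle += 30) {
        int dist = readDistance();    // one ultrasonic reading per position
        if (dist > largestDist) {
            largestDist = dist;
            bestAngle = angle;
        }
        if (angle < 90)
            rotateBy(30);             // step to the next position
    }
    rotateBy(-90);                    // return the sensor to the centerline
    return bestAngle;                 // turn the robot this many degrees
}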
I would suggest attaching the ultrasonic sensor to a 180° servo mounted vertically. You can take a measurement at a specific angle by assigning that position to the servo, like this:
int scanArea() {
    int largestDistance = 0;
    int Angle = 0;
    for (int servovalue = 0; servovalue <= 255; servovalue += 30) {
        servo[servo1] = servovalue;              // point the sensor
        wait1Msec(200);                          // let the servo reach position
        if (SensorValue[Sonar] > largestDistance) {
            largestDistance = SensorValue[Sonar];
            Angle = servovalue;
        }
    }
    return Angle;
}
This assumes that servo1 is your vertical servo, but it should work if you're programming in RobotC.

gluUnProject for iOS

To get 3D world coordinates from 2D screen coordinates on iOS, is there any possible way other than a port of gluUnProject?
I've been fiddling around with this for days on end now, and I can't seem to get the hang of it.
-(void)receivePoint:(CGPoint)loke
{
    GLfloat projectionF[16];
    GLfloat modelViewF[16];
    GLint viewportI[4];
    glGetFloatv(GL_MODELVIEW_MATRIX, modelViewF);
    glGetFloatv(GL_PROJECTION_MATRIX, projectionF);
    glGetIntegerv(GL_VIEWPORT, viewportI);
    loke.y = (float) viewportI[3] - loke.y;
    float nearPlanex, nearPlaney, nearPlanez, farPlanex, farPlaney, farPlanez;
    gluUnProject(loke.x, loke.y, 0, modelViewF, projectionF, viewportI, &nearPlanex, &nearPlaney, &nearPlanez);
    gluUnProject(loke.x, loke.y, 1, modelViewF, projectionF, viewportI, &farPlanex, &farPlaney, &farPlanez);
    float rayx = farPlanex - nearPlanex;
    float rayy = farPlaney - nearPlaney;
    float rayz = farPlanez - nearPlanez;
    float rayLength = sqrtf((rayx*rayx)+(rayy*rayy)+(rayz*rayz));
    // normalize the ray vector
    rayx /= rayLength;
    rayy /= rayLength;
    rayz /= rayLength;
    float collisionPointx, collisionPointy, collisionPointz;
    for (int i = 0; i < 50; i++)
    {
        collisionPointx = rayx * rayLength/i*50;
        collisionPointy = rayy * rayLength/i*50;
        collisionPointz = rayz * rayLength/i*50;
    }
}
There's a good chunk of my code. Yeah, I could have easily used a struct but I was too mentally fat to do it at the time. That's something I could go back and fix later.
Anywho, the point is that when I log the output after calling gluUnProject, the near-plane and far-plane results aren't even close to accurate. In fact, they both come back with exactly the same values; not to mention, the first click always produces x, y, and z all equal to "nan."
Am I skipping over something extraordinarily important here?
There is no gluUnProject function in OpenGL ES 2.0, so which port are you using? More importantly, ES 2.0 has no GL_MODELVIEW_MATRIX or GL_PROJECTION_MATRIX to query with glGetFloatv, which is most likely your problem: you have to keep track of your own matrices and unproject against those.
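A minimal sketch of unprojecting by hand against matrices you track yourself, using GLKit's math types (iOS 5+); the function name unprojectPoint is made up, and modelView/projection are assumed to be the matrices you already build for your ES 2.0 shaders:
// Requires GLKit: #import <GLKit/GLKMath.h>
GLKVector3 unprojectPoint(float winX, float winY, float winZ, // winZ: 0 = near, 1 = far
                          GLKMatrix4 modelView, GLKMatrix4 projection,
                          const int viewport[4])
{
    // window coordinates -> normalized device coordinates in [-1, 1]
    GLKVector4 ndc = GLKVector4Make((winX - viewport[0]) / viewport[2] * 2.0f - 1.0f,
                                    (winY - viewport[1]) / viewport[3] * 2.0f - 1.0f,
                                    winZ * 2.0f - 1.0f,
                                    1.0f);
    // NDC -> world space via the inverse of projection * modelView
    bool invertible = false;
    GLKMatrix4 invPM = GLKMatrix4Invert(GLKMatrix4Multiply(projection, modelView), &invertible);
    GLKVector4 world = GLKMatrix4MultiplyVector4(invPM, ndc);
    // perspective divide
    return GLKVector3Make(world.x / world.w, world.y / world.w, world.z / world.w);
}
GLKit also ships GLKMathUnproject, which wraps essentially this same computation for you.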

Line/Ray-intersection not working as expected

I've been working on cobbling together a ray tracer. You know, for fun. So far most things are going as planned, but as soon as I started transforming my test spheres, it all went awry.
The fundamental concept is to use a standard canonical shape in object space, transform the camera rays into object space, and then intersect there.
As long as the sphere is identical in object space and world space, it works as expected, but as soon as the spheres are scaled, normals and intersection points go wild.
I've been wracking my brains, and poring over this code over and over, but I just can't find the mistake. Fresh eyes would be much appreciated.
@implementation RTSphere
- (CGFloat)intersectsRay:(RTRay *)worldRay atPoint:(RTVector *)intersection normal:(RTVector *)normal material:(RTMaterial **)material {
    RTRay *objectRay = [worldRay rayByTransformingByMatrix:self.inverseTransformation];
    RTVector D = objectRay.direction;
    RTVector O = objectRay.start;
    CGFloat A, B, C;
    A = RTVectorDotProduct(D, D);
    B = 2 * RTVectorDotProduct(D, O);
    C = RTVectorDotProduct(O, O) - 0.25;
    CGFloat BB4AC = B * B - 4 * A * C;
    if (BB4AC < 0.0) {
        return -1.0;
    }
    CGFloat t0 = (-B - sqrt(BB4AC)) / 2 * A;
    CGFloat t1 = (-B + sqrt(BB4AC)) / 2 * A;
    if (t0 > t1) {
        CGFloat tmp = t0;
        t0 = t1;
        t1 = tmp;
    }
    if (t1 < 0.0) {
        return -1.0;
    }
    CGFloat t;
    if (t0 < 0.0) {
        t = t1;
    } else {
        t = t0;
    }
    if (material) {
        *material = self.material;
    }
    if (intersection) {
        RTVector isect_o = RTVectorAddition(objectRay.start, RTVectorMultiply(objectRay.direction, t));
        *intersection = RTVectorMatrixMultiply(isect_o, self.transformation);
        if (normal) {
            RTVector normal_o = RTVectorSubtraction(isect_o, RTMakeVector(0.0, 0.0, 0.0));
            RTVector normal_w = RTVectorUnit(RTVectorMatrixMultiply(normal_o, self.transformationForNormal));
            *normal = normal_w;
        }
    }
    return t;
}
@end
Why are the normals and intersection points not translating into world space as expected?
Edit: I'm moderately confident that my vector and matrix functions are mathematically sound, and I suspect it's chiefly an error of method, but I recognize that I could be wrong.
There is a lot of RT* code here "behind the scenes" that we have no way to know is correct, so I would start by making sure you have good unit tests of those math functions. The ones I would most suspect, from my experience managing transforms, are rayByTransformingByMatrix: and the value of inverseTransformation. I've found these very easy to get wrong when you combine transformations: rotating and then scaling is not the same as scaling and then rotating.
At what point does it go wrong for you? Are you sure objectRay itself is correct? (If it isn't, then the rest of this function doesn't matter.) Again, unit tests are your friend. You should hand-calculate several situations and then write unit tests to ensure that your methods return the right answers; see the sketch below.
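For example, here is a self-contained sketch (plain C++, deliberately not using your RT* types) of the kind of hand-calculated check meant here. It shows that rotating 90° then scaling x by 2 is not the same as scaling then rotating, applied to the point (1, 0):
#include <cassert>
#include <cstdio>

// Minimal 2D illustration: the order of scale and rotation matters.
struct Vec2 { double x, y; };

Vec2 rotate90(Vec2 v) { return { -v.y, v.x };        } // 90° about the origin
Vec2 scaleX2(Vec2 v)  { return { 2.0 * v.x, v.y };   } // scale x by 2

int main() {
    Vec2 p = { 1.0, 0.0 };

    Vec2 rotThenScale = scaleX2(rotate90(p)); // rotate first, then scale: (0, 1)
    Vec2 scaleThenRot = rotate90(scaleX2(p)); // scale first, then rotate: (0, 2)

    // Hand-calculated expectations; these assertions are the "unit tests".
    assert(rotThenScale.x == 0.0 && rotThenScale.y == 1.0);
    assert(scaleThenRot.x == 0.0 && scaleThenRot.y == 2.0);

    printf("rotate-then-scale: (%g, %g)\n", rotThenScale.x, rotThenScale.y);
    printf("scale-then-rotate: (%g, %g)\n", scaleThenRot.x, scaleThenRot.y);
    return 0;
}
The same style of test, applied to your actual matrix composition and to rayByTransformingByMatrix:, should pin down where the transform chain diverges from your hand calculation.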