Lego NXT-RobotC ultrasonic sensor - robot

I am a newbie in programming, so I need help with my ultrasonic-sensor-driven NXT robot.
The sensor is attached to motor A, and I'd like it to scan the room from the robot's centerline to 90° left and 90° right in 30° increments (seven measurements total), store the readings in an array, and then point the robot in the direction of the largest measured distance to avoid obstacles.
Is this possible at all? Or is there some better solution?
Any advice or suggestion is more than welcome.

This would work reasonably well for avoiding obstacles. As for implementing it, it depends on what language you are programming the robot in. I only know leJOS (Java), in which a function for finding the angle to go would look something like:
public static int scanArea(RegulatedMotor motorTop, UltrasonicSensor sonar) {
    int theAngle = 0;
    int largestDist = 0;
    for (int rotateAngle = -90; rotateAngle <= 90; rotateAngle += 30) {
        motorTop.rotateTo(rotateAngle);
        Delay.msDelay(25);                    // let the sensor settle after the move
        int currentDist = sonar.getDistance();
        if (currentDist > largestDist) {      // remember the most open direction
            largestDist = currentDist;
            theAngle = rotateAngle;
        }
    }
    motorTop.rotateTo(0);                     // re-center the sensor
    return theAngle;
}
If you're coding the robot in any text-based language, that should be fairly easy to convert (assuming you have a function such as rotateTo; otherwise you would have to use relative movements and keep track of the current angle yourself, as in the sketch below). I don't know how easy this would be to do in the graphical programming language you were originally using.
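If rotateTo isn't available, here is a rough sketch of the relative-movement variant (written as C++; rotateBy and readDistanceCm are hypothetical placeholders for whatever motor and sensor calls your platform provides, stubbed out so the sketch compiles):

void rotateBy(int degrees) { /* turn the sensor motor by this many degrees */ }
int readDistanceCm() { return 0; /* read the ultrasonic sensor */ }

int scanArea() {
    int bestAngle = 0;
    int largestDist = -1;
    int currentAngle = 0;                 // heading we track ourselves
    for (int target = -90; target <= 90; target += 30) {
        rotateBy(target - currentAngle);  // relative move to the next bearing
        currentAngle = target;
        int d = readDistanceCm();
        if (d > largestDist) {            // remember the most open direction
            largestDist = d;
            bestAngle = target;
        }
    }
    rotateBy(-currentAngle);              // re-center the sensor
    return bestAngle;
}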

I would suggest attaching the ultrasonic sensor to a vertically mounted 180° servo. You can take a measurement at a specific angle by assigning the corresponding position value to the servo, using something like this:
int scanArea() {
    int largestDist = 0;
    int bestPosition = 0;
    for (int servovalue = 0; servovalue <= 255; servovalue += 30) {
        Servo[servo1] = servovalue;               // command the servo to the next position
        wait1Msec(500);                           // give the servo time to reach it
        if (SensorValue[Sonar] > largestDist) {   // keep the most open direction
            largestDist = SensorValue[Sonar];
            bestPosition = servovalue;
        }
    }
    return bestPosition;
}
It assumes that servo1 is your vertical servo, but this should work if you're programming in RobotC.

Related

How do I randomize the starting direction of a ball in Spritekit?

I've started trying a few things with SpriteKit for game development. I'm creating a brick-breaking game, and I've run into an issue: how do I randomize the starting direction of the ball?
My ball has the following properties
ball.physicsBody.friction = 0;
ball.physicsBody.linearDamping = 0;
ball.physicsBody.restitution = 1; // energy lost on impact, or bounciness
To start the ball in a different direction each time, I've randomized the selection among four vectors. I'm using the applyImpulse method to direct the ball in a particular direction, and I need to make sure the ball does not go too slowly if the vector values are low.
int initialDirection = arc4random() % 10;
CGVector myVector;
if (initialDirection < 2)
{
    myVector = CGVectorMake(4, 7);
}
else if (initialDirection > 3 && initialDirection <= 6)
{
    myVector = CGVectorMake(-7, -5);
}
else if (initialDirection > 6 && initialDirection <= 8)
{
    myVector = CGVectorMake(-5, -8);
}
else
{
    myVector = CGVectorMake(8, 5);
}
//apply the vector
[ball.physicsBody applyImpulse:myVector];
Is this the right way to do it? I tried using the applyForce method, but then the ball slowed down after the force was applied.
Is there any way I can randomize the direction and still maintain the ball's speed?
The basic steps:
1. Randomly select an angle in [0, 2*PI).
2. Select the magnitude of the impulse.
3. Form the vector by converting the magnitude/angle pair to vector components.
Here's an example of how to do that:
ObjC:
CGFloat angle = arc4random_uniform(1000) / 1000.0 * 2 * M_PI; // random angle in [0, 2*PI)
CGFloat magnitude = 4;
CGVector vector = CGVectorMake(magnitude * cos(angle), magnitude * sin(angle));
[ball.physicsBody applyImpulse:vector];
Swift
let angle = CGFloat(arc4random_uniform(1000)) / 1000.0 * 2 * CGFloat.pi // random angle in [0, 2*PI)
let magnitude: CGFloat = 4
let vector = CGVector(dx: magnitude * cos(angle), dy: magnitude * sin(angle))
ball.physicsBody?.applyImpulse(vector)

Turning Motor on Lego NXT returns error 0002EA Type 2

I am writing a program in RobotC for the Lego NXT to imitate the behaviour of a puppy. This section of code is supposed to rotate the head, which is connected to motor port 3, and read the value of the ultrasonic sensor. If the dog is called while the head is turned, it turns in the direction it was already facing. The following function is called when the ultrasonic sensor reads a value (meaning the robot has come close to a wall):
void SonarSensor()
{
    int sensorValleft;
    int sensorValright;
    bool alreadyTurned = false;
    int i, j;
    i = 0;
    j = 0;
    motor[1] = 0;
    motor[2] = 0;
    motor[3] = -SPEED/2;
    wait10Msec(15);
    motor[3] = 0;
    sensorValleft = SensorValue[3];
    while (i < 100)
    {
        if (SensorValue[4] > 40) //calibrate sound sensor
        {
            //turn left
            motor[1] = SPEED;
            motor[2] = -SPEED;
            wait10Msec(25);
            i = 1000;
            j = 1000;
            alreadyTurned = true;
        }
        else
        {
            i++;
            wait1Msec(5);
        }
    }
    motor[3] = SPEED/2;
    wait10Msec(30);
    motor[3] = 0;
    sensorValright = SensorValue[3];
    while (j < 100)
    {
        if (SensorValue[3] > 1) //calibrate sound sensor
        {
            //turn right
            motor[1] -= SPEED;
            motor[2] = SPEED;
            wait10Msec(25);
            j = 1000;
            alreadyTurned = true;
        }
        else
        {
            j++;
            wait1Msec(5);
        }
    }
    if (alreadyTurned == false)
    {
        if (sensorValleft > sensorValright)
        {
            //turn left
            motor[1] = SPEED;
            motor[2] = -SPEED;
            wait10Msec(25);
        }
        else
        {
            //turn right
            motor[1] = -SPEED;
            motor[2] = SPEED;
            wait10Msec(25);
        }
    }
}
When the head (motor[3]) rotates for the first time, the error 0002EA Type 2 appears on the NXT screen. At first we thought it was because we were over-rotating the motor and causing it to be obstructed, so we tried playing around with the wait times, but it made no difference.
Any ideas on what causes this error or how to fix it would be appreciated.
Thanks,
Dominique
The answer as to why only motor[3] causes an error is actually quite simple. The motorA, motorB, and motorC values are defined in an enum, where motorA=0, motorB=1, and motorC=2. So, motor[1] and motor[2] are equivalent to calling motor[motorB] and motor[motorC]. However, motor[3] isn't equivalent to anything. It's trying to set the power of a motor that doesn't exist. motor[0] would be ok, however, and would correspond to motor[motorA].
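Conceptually, the indexing works like the following simplified C++ sketch (an illustration, not RobotC's actual definitions):

enum tMotor { motorA = 0, motorB = 1, motorC = 2 };
int motor[3];   // one power level per physical port A, B, C

// motor[motorB] and motor[1] address the same element, while
// motor[3] indexes past the end of the array, hence the runtime error.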
While debugging, I started putting breakpoints in to see where the error occurred, and it was always on the line motor[3] = -SPEED/2;. It turns out that for the third motor the proper syntax is motor[motorA] = -SPEED/2;. I am not sure why only this motor returned the error, as I was setting new speeds on the two other motors using
motor[1] = SPEED;
motor[2] = SPEED;
However, this was the way to eliminate the error.

Microsoft Kinect SDK depth data to real world coordinates

I'm using the Microsoft Kinect SDK to get the depth and color information from a Kinect and then convert that information into a point cloud. I need the depth information to be in real world coordinates with the centre of the camera as the origin.
I've seen a number of conversion functions, but these are apparently for OpenNI and non-Microsoft drivers. I've read that the depth information coming from the Kinect is already in millimetres and is contained in 11 bits... or something.
How do I convert this bit information into real world coordinates that I can use?
Thanks in advance!
This is catered for within the Kinect for Windows library using the Microsoft.Research.Kinect.Nui.SkeletonEngine class, and the following method:
public Vector DepthImageToSkeleton(
    float depthX,
    float depthY,
    short depthValue
)
This method maps a point from the depth image produced by the Kinect to a vector in skeleton space, i.e. real-world measurements relative to the sensor.
From there (this is how I've created meshes in the past), you enumerate the byte array in the bitmap created by the Kinect depth image and build a list of Vector points, similar to the following:
var width = image.Image.Width;
var height = image.Image.Height;
var greyIndex = 0;
var points = new List<Vector>();
// nui is the Kinect Runtime instance; maximumDepth is a caller-chosen cutoff
for (var y = 0; y < height; y++)
{
    for (var x = 0; x < width; x++)
    {
        short depth;
        switch (image.Type)
        {
            case ImageType.DepthAndPlayerIndex:
                // the lower 3 bits hold the player index, so shift them out
                depth = (short)((image.Image.Bits[greyIndex] >> 3) | (image.Image.Bits[greyIndex + 1] << 5));
                if (depth <= maximumDepth)
                {
                    points.Add(nui.SkeletonEngine.DepthImageToSkeleton(
                        (float)x / image.Image.Width,
                        (float)y / image.Image.Height,
                        (short)(depth << 3)));
                }
                break;
            case ImageType.Depth: // depth comes back mirrored
                depth = (short)(image.Image.Bits[greyIndex] | (image.Image.Bits[greyIndex + 1] << 8));
                if (depth <= maximumDepth)
                {
                    points.Add(nui.SkeletonEngine.DepthImageToSkeleton(
                        (float)(width - x - 1) / image.Image.Width,
                        (float)y / image.Image.Height,
                        (short)(depth << 3)));
                }
                break;
        }
        greyIndex += 2;
    }
}
By doing so, the end result is a list of vectors with coordinates in metres; if you want centimetres, multiply by 100 (etc.).

Rendering painted lines as nodes in Cocos

I'm working on a drawing app for iPad using Cocos-iOS, and I'm having performance issues with drawing lines as a type of CCNode. I understand that a node's draw is called every time the canvas is repainted, and the current code is far too heavy to run that often:
for (LineNodePoint *point in self.points) {
    start = end;
    end = point;
    if (start && end) {
        float distance = ccpDistance(start.point, end.point);
        if (distance > 1) {
            int d = (int)distance;
            float difx = end.point.x - start.point.x;
            float dify = end.point.y - start.point.y;
            for (int i = 0; i < d; i++) {
                float delta = i / distance;
                [[self.brush sprite] setPosition:ccp(start.point.x + (difx * delta), start.point.y + (dify * delta))];
                [[self.brush sprite] visit];
            }
        }
    }
}
Very heavy...
I either need a better way to draw the lines or to be able to cache the drawing as a raster.
Thanks in advance for any help.
How about ccDrawLine or CCMutableTexture? CCMutableTexture is for manipulating pixels, and it uses CCRenderTexture internally, which gives you the raster caching you mentioned.
ccDrawLine:
- cocos2d for iPhone 1.0.0 API reference
CCMutableTexture:
- Fast set/getPixel for an opengl texture?
- [render texture] pixel manipulation (integrated CCMutableTexture functionality)
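If you go the raster-caching route, a common pattern is to stamp the brush sprite into a render texture once per new segment, so the accumulated drawing renders as a single cached node each frame. A rough sketch of the idea in cocos2d-x-style C++ (cocos2d-iphone's CCRenderTexture API is analogous; the canvas size and position here are made-up values):

#include "cocos2d.h"
USING_NS_CC;

// Create the canvas once and add it to the scene; after that, the baked
// texture draws as one node regardless of how many stamps it contains.
static CCRenderTexture* makeCanvas(CCNode* parent) {
    CCRenderTexture* canvas = CCRenderTexture::create(1024, 768);
    canvas->setPosition(ccp(512, 384));
    parent->addChild(canvas);
    return canvas;
}

// Stamp one line segment into the canvas exactly once, instead of
// re-visiting every brush position on every repaint.
static void stampSegment(CCRenderTexture* canvas, CCSprite* brush,
                         CCPoint start, CCPoint end) {
    canvas->begin();                      // subsequent visits render to the texture
    float distance = ccpDistance(start, end);
    for (int i = 0; i < (int)distance; i++) {
        float t = i / distance;
        brush->setPosition(ccp(start.x + (end.x - start.x) * t,
                               start.y + (end.y - start.y) * t));
        brush->visit();                   // baked into the canvas, not the screen
    }
    canvas->end();
}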

Simulate "Newton's law of universal gravitation" using Box2D

I want to simulate Newton's law of universal gravitation using Box2D.
I went through the manual but couldn't find a way to do this.
Basically what I want to do is place several objects in space (zero gravity) and simulate the movement.
Any tips?
It's pretty easy to implement:
for (int i = 0; i < numBodies; i++) {
    b2Body* bi = bodies[i];
    b2Vec2 pi = bi->GetWorldCenter();
    float mi = bi->GetMass();
    // start at i + 1 so no body attracts itself (r would be zero)
    for (int k = i + 1; k < numBodies; k++) {
        b2Body* bk = bodies[k];
        b2Vec2 pk = bk->GetWorldCenter();
        float mk = bk->GetMass();
        b2Vec2 delta = pk - pi;
        float r = delta.Length();
        float force = G * mi * mk / (r * r);
        delta.Normalize();
        // equal and opposite forces on the pair
        bi->ApplyForce( force * delta, pi);
        bk->ApplyForce(-force * delta, pk);
    }
}
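For context, here is a minimal self-contained sketch of how that loop might be driven, assuming a Box2D 2.2-era API (newer versions add a wake parameter to ApplyForce); G is a hand-tuned constant rather than the physical 6.674e-11, and the body setup is made up:

#include <Box2D/Box2D.h>
#include <vector>

static const float G = 10.0f;   // tuning constant, chosen for gameplay feel

// The nested pair loop from above, wrapped over a body list.
static void applyMutualGravity(std::vector<b2Body*>& bodies) {
    for (size_t i = 0; i < bodies.size(); i++) {
        for (size_t k = i + 1; k < bodies.size(); k++) {
            b2Vec2 delta = bodies[k]->GetWorldCenter() - bodies[i]->GetWorldCenter();
            float r = delta.Length();
            if (r < 0.1f) continue;   // clamp to avoid huge forces up close
            float force = G * bodies[i]->GetMass() * bodies[k]->GetMass() / (r * r);
            delta.Normalize();
            bodies[i]->ApplyForce( force * delta, bodies[i]->GetWorldCenter());
            bodies[k]->ApplyForce(-force * delta, bodies[k]->GetWorldCenter());
        }
    }
}

int main() {
    b2World world(b2Vec2(0.0f, 0.0f));   // zero gravity; we supply our own
    std::vector<b2Body*> bodies;

    b2CircleShape circle;
    circle.m_radius = 0.5f;
    for (int n = 0; n < 3; n++) {        // three bodies spread along the x axis
        b2BodyDef bd;
        bd.type = b2_dynamicBody;
        bd.position.Set(n * 5.0f, 0.0f);
        b2Body* body = world.CreateBody(&bd);
        body->CreateFixture(&circle, 1.0f);   // density 1 gives mass from area
        bodies.push_back(body);
    }

    for (int step = 0; step < 600; step++) {  // ten seconds at 60 Hz
        applyMutualGravity(bodies);
        world.Step(1.0f / 60.0f, 8, 3);
    }
    return 0;
}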
Unfortunately, Box2D doesn't have native support for it, but you can implement it yourself: Box2D and radial gravity code
As others have said, Box2D has no built-in support for it, but you can add support to the library yourself in b2_islands.cpp. Just replace
v += h * b->m_invMass * (b->m_gravityScale * b->m_mass * gravity + b->m_force);
with
int planet_x = 0;
int planet_y = 0;
b2Vec2 gravityVector = (b2Vec2(planet_x, planet_y) - b->GetPosition());
gravityVector.Normalize();
gravityVector.x = gravityVector.x * 10.0f;
gravityVector.y = gravityVector.y * 10.0f;
v += h * b->m_invMass * (b->m_gravityScale * b->m_mass * gravityVector + b->m_force);
That's a simple solution if you have only one planet.
If you want less force the further away you are, you could scale the vector by 1/r² (where r is its original length) instead of just normalizing it. That would also make it possible to add up the gravity of two planets: iterate over a planet list and sum the gravity vectors.
Implementing a helper such as b2World::CreatePlanet might then be useful as well.
The 10.0f is just an approximation of Earth's 9.81 m/s², so you may need to adjust it. If the planet's mass matters, multiply in a constant to make the motion look more realistic, or simply increase the density of the planet object so it matches a planet's real mass.
Of course, you can also set the world gravity to (0, 0) and compute the force for every object before each step, but that may not perform as well.
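For completeness, a sketch of that last variant (zero world gravity plus a per-body pass before each step), which avoids patching b2_islands.cpp; the planet position and strength constant are made up:

#include <Box2D/Box2D.h>

// Pull every dynamic body toward a fixed planet with 1/r^2 falloff,
// i.e. Newton's law rather than the constant pull of the patch above.
static void applyPlanetGravity(b2World& world, const b2Vec2& planet, float strength) {
    for (b2Body* b = world.GetBodyList(); b; b = b->GetNext()) {
        if (b->GetType() != b2_dynamicBody) continue;   // skip static geometry
        b2Vec2 dir = planet - b->GetPosition();
        float r = dir.Length();
        if (r < 0.1f) continue;                         // avoid blow-up at the centre
        dir.Normalize();
        b2Vec2 force = (strength * b->GetMass() / (r * r)) * dir;
        b->ApplyForce(force, b->GetWorldCenter());
    }
}

// Call once per frame, before world.Step(...):
//   applyPlanetGravity(world, b2Vec2(0.0f, 0.0f), 50.0f);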