Object Oriented Design Class Circle - Am I doing this right?

Problem
Use object-oriented design to design a class called Circle that will receive the diameter of a circle, and calculate and display the circumference and the area of that circle. Design the class table…write an algorithm for each operation…write a test or driver algorithm to test the solution
Class Table
Class: Circle
Attributes:
    diameter
Responsibilities:
    receive diameter
    calculate circumference
    calculate area
    display circumference
    display area
Operations:
    +setDiameter()
    -calculateCircumference()
    -calculateArea()
    +displayCircumference()
    +displayArea()
Algorithm
Class Circle
    diameter1

    setDiameter(inDiameter1)
        diameter1 = inDiameter1
    END

    displayDiameter()
        calculateCircumference(Circumference)
        calculateArea(Area)
        Display "The Diameter is", diameter1
        Display "The Circumference is", Circumference
        Display "The Area is", Area
    END

    calculateCircumference(Circumference)
        Circumference = diameter1 * 3.14
    END

    calculateArea(Area)
        Area = (diameter1 * diameter1) * 3.14
    END
Test or Driver Algorithm
Create Circle as NewCircle()
testCircle()
    inDiameter1 = 5
    Circle.setDiameter(inDiameter1)
    Circle.displayCircumference()
    Circle.displayArea()
Am I doing this right???

You have just a couple of problems that are trivial, really:
Your function named displayDiameter() is actually displaying the circumference and area in addition to the diameter; there is nothing logically wrong with this, it just seems to be an unfortunate choice of function name.
Your calculateArea is calculating the area as diameter^2 * pi. Shouldn't it be (1/4) * diameter^2 * pi (i.e. pi * r^2, with r = diameter/2)?
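If it helps, here is a minimal Java sketch of the same design with the area formula corrected (the class and method names follow your class table, and the driver mirrors your test algorithm):
public class Circle {
    private double diameter;

    public void setDiameter(double inDiameter) {
        diameter = inDiameter;
    }

    private double calculateCircumference() {
        // circumference = pi * d
        return Math.PI * diameter;
    }

    private double calculateArea() {
        // area = pi * r^2 = (1/4) * pi * d^2
        return 0.25 * Math.PI * diameter * diameter;
    }

    public void displayCircumference() {
        System.out.println("The Circumference is " + calculateCircumference());
    }

    public void displayArea() {
        System.out.println("The Area is " + calculateArea());
    }

    // Test or driver
    public static void main(String[] args) {
        Circle newCircle = new Circle();
        newCircle.setDiameter(5);
        newCircle.displayCircumference();
        newCircle.displayArea();
    }
}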

Related

My points are drawn at 0 latitude, 0 longitude

This is my click listener on the map:
@Override
public boolean onSingleTap(MotionEvent point) {
    Point mapPoint=view.toMapPoint(point.getX(), point.getY());
    SpatialReference sp = SpatialReference.create(SpatialReference.WKID_WGS84);
    Point p1Done=(Point) GeometryEngine.project(mapPoint, view.getSpatialReference(), sp);
    UtilsMapa.addPoint(p1Done, view, "Marker",view.getGraphicLayer());
    return super.onSingleTap(point);
}
The addPoint method:
public static void addPoint(Point point, MapViewExt mView, String textMarker, GraphicsLayer gLayer){
    SimpleMarkerSymbol simpleMarker = new SimpleMarkerSymbol(Color.RED, 10, SimpleMarkerSymbol.STYLE.CIRCLE);
    TextSymbol bassRockTextSymbol = new TextSymbol(10, textMarker, BLUE, TextSymbol.HorizontalAlignment.LEFT,
            TextSymbol.VerticalAlignment.BOTTOM);
    // When reaching this line, point has 0=-3.0246720297389555 1=40.83564363734672, which is where I tapped the screen
    Graphic graphic=new Graphic(point,simpleMarker);
    gLayer.addGraphic(graphic);
}
The graphic IS added (a red spot) but it is drawn at 0,0 coords... Why is that? How can I get the point drawn in the proper place?
Thank you.
Answer
Don't project the point into WGS 1984. Use the point that MapView.toMapPoint(double, double) returns to you.
Detailed explanation
In ArcGIS Runtime 10.2.x for Android, a Point object doesn't keep track of its spatial reference. It's really just an x-value and a y-value. If you give a Point to the MapView, the MapView will assume that the Point is in the same spatial reference as the map, which by default is usually in Web Mercator.
Here's what your code is doing:
Point mapPoint=view.toMapPoint(point.getX(), point.getY());
mapPoint is now a point in MapView's spatial reference, which is probably Web Mercator. The point is probably some number of millions of meters from (0, 0), but since the Point class is blissfully unaware of spatial references, it just contains an x-value in the millions and a y-value in the millions, with no units.
SpatialReference sp = SpatialReference.create(SpatialReference.WKID_WGS84);
Point p1Done=(Point) GeometryEngine.project(mapPoint, view.getSpatialReference(), sp);
p1Done is now a point in WGS 1984, which is some number of degrees from (0, 0). You have a comment that says the x-value is -3.0246720297389555 and the y-value is 40.83564363734672, or maybe the other way around, but the important point is that the x-value is between -180 and 180 and the y-value is between -90 and 90, with no units.
UtilsMapa.addPoint(p1Done, view, "Marker",view.getGraphicLayer());
Your helper method is called with p1Done, which is called point in your helper method.
Graphic graphic=new Graphic(point,simpleMarker);
A graphic is created with a point whose x-value is -3.0246720297389555 and y-value is 40.83564363734672.
gLayer.addGraphic(graphic);
The graphic is added to a graphics layer, which has the spatial reference of the map that contains it, which is probably Web Mercator. The map correctly displays the point -3.0246720297389555 meters west of (0, 0) and 40.83564363734672 meters north of (0, 0).
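So the fix is simply to hand the unprojected point to your helper. A minimal sketch of the corrected handler, reusing only the identifiers already shown in your code:
@Override
public boolean onSingleTap(MotionEvent point) {
    // toMapPoint returns a point already in the MapView's spatial reference,
    // which is what the graphics layer expects, so no projection is needed.
    Point mapPoint = view.toMapPoint(point.getX(), point.getY());
    UtilsMapa.addPoint(mapPoint, view, "Marker", view.getGraphicLayer());
    return super.onSingleTap(point);
}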
Side note: upgrade to ArcGIS Runtime 100.x
If you had used Runtime 100.x, you might have avoided this problem, since Runtime 100.x geometries, including Point, do keep track of their spatial reference.

Change measurement unit of Ruler in NSScrollview

I'm trying to make a drawing application, so I have a subclass of NSScrollView that I use to show the rulers.
[self setHasHorizontalRuler: true];
[self setHasVerticalRuler:YES];
[self setRulersVisible:true];
[self setAutoresizesSubviews:YES];
The problem is that the numbers on the ruler are in different units than the ones I use to draw. Here the points I wanted to draw are (0,0), (22,12) and (5,7).
I know there's a register-measurement-unit method and a setter for NSRulerView, plus some default units, but I can't find what the default ones are, or any example of how to use this from an NSScrollView subclass. Should I just multiply every coordinate by a constant? In that case, what constant is that?
The default units are listed in the description of the registerUnitWithName:abbreviation:unitToPointsConversionFactor:stepUpCycle:stepDownCycle: class method, and the current ruler unit can be found from the property measurementUnits.
The description tells you the Points/Unit for each pre-defined Unit Name. The measurement unit of your drawing view is points, so to draw at a given ruler location, take that location and multiply each coordinate by the Points/Unit for the measurementUnits of the appropriate ruler (horizontal/vertical). E.g. consider your location (5, 7): with the ruler units set to Centimeters, the Points/Unit is 28.35, so your location is (141.75, 198.45) in points.
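As a plain-Java sketch of that arithmetic only (the class and method names here are illustrative, not part of AppKit; the factors are the Points/Unit values listed in the NSRulerView documentation):
import java.util.Map;

class RulerUnits {
    // Points per unit for NSRulerView's pre-defined units.
    static final Map<String, Double> POINTS_PER_UNIT = Map.of(
            "Inches", 72.0,
            "Centimeters", 28.35,
            "Points", 1.0,
            "Picas", 12.0);

    // Convert a location given in ruler units into points, the drawing view's units.
    static double[] rulerUnitsToPoints(double x, double y, String unitName) {
        double factor = POINTS_PER_UNIT.get(unitName);
        return new double[] { x * factor, y * factor };
    }

    public static void main(String[] args) {
        // (5, 7) in centimeters -> (141.75, 198.45) in points
        double[] p = rulerUnitsToPoints(5, 7, "Centimeters");
        System.out.println(p[0] + ", " + p[1]);
    }
}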
HTH

Test if MKCircle intersects with MKPolygon

I'm looking for some guidance in testing whether an MKPolygon intersects an MKCircle. Currently I'm using:
if ([circle intersectsMapRect:[poly boundingMapRect]]) {
    // they do intersect
}
I'm finding this returns inaccurate results simply because it draws a rectangle around my circle, thus giving me intersections that shouldn't otherwise be reported.
Searching the topic has led me to Chad Saxon's polygon-polygon intersection project. This could be useful if I could somehow convert my MKCircle to a multi-sided polygon - which could be possible, but ultimately I believe this is the roundabout way to solve this.
I'm ultimately wondering if there is a simple solution I've overlooked before delving into porting my own custom geometry-ray-testing-algorithm implementation.
A couple of thoughts:
If you use that polygon intersection project, be aware that it has a few leaks in it. I issued a pull request that fixes a few of them (and a few other random observations). I would be wary of adopting any of that view controller code (as it has other issues), too, but the category seems ok if you're ok with the various limitations it entails (notably, the clockwise limitation, which is not really an issue if you're only determining if they intersected).
Rather than converting the circle to a series of polygons and then using that polygon intersection class, I might consider an alternative approach that leverages the fact that you can detect intersection with a circle by looking at the distance between the relevant points of the polygon and the center of the circle, and comparing it to the circle's radius. It seems that there are three aspects of the problem:
If the distance between any of the polygon's vertices and the center of the circle is less than the radius of the circle, then the polygon and circle intersect.
Does the polygon encompass the circle? (This is the special case where the distance from all of the sides of the polygon is greater than the circle's radius, but the circle and polygon still obviously intersect.) This is easily checked by seeing whether the CGPath of the polygon's view contains the center of the circle, using CGPathContainsPoint.
The only complicated part is to check whether any side of the polygon intersects the circle, namely whether the minimum distance between the sides of the polygon and the center of the circle is less than the radius of the circle.
To calculate the distance of each side from the center of the circle, I might therefore iterate through each side of the polygon, and for those sides facing the circle's center (i.e. those for which an imaginary line perpendicular to the polygon's side and passing through the center of the circle actually crosses the line segment), you could:
Calculate the constants a, b, and c of the line equation ax + by + c = 0 for the side running between the polygon vertices (x1, y1) and (x2, y2):
a = (y1 - y2)
b = (x2 - x1)
c = (x1*y2 - x2*y1)
Calculate the distance from that line to the center of the circle (x0, y0):
distance = |a*x0 + b*y0 + c| / sqrt(a^2 + b^2)
If that distance is less than the radius of the circle, then you know that the polygon intersects the circle.
I put a sample project employing this technique on github.
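For illustration, here is a plain-Java, geometry-only sketch of those three checks (the names are mine; the edge test uses the equivalent point-to-segment distance rather than the line-equation form above, and the MapKit-specific CGPathContainsPoint step is replaced with a standard ray-casting containment test):
final class CirclePolygonIntersection {

    // 1. Is any polygon vertex inside the circle?
    static boolean anyVertexInsideCircle(double[][] poly, double cx, double cy, double r) {
        for (double[] v : poly) {
            if (Math.hypot(v[0] - cx, v[1] - cy) <= r) {
                return true;
            }
        }
        return false;
    }

    // 2. Is the circle's center inside the polygon? (Covers the polygon-encompasses-circle case.)
    static boolean centerInsidePolygon(double[][] poly, double cx, double cy) {
        boolean inside = false;
        for (int i = 0, j = poly.length - 1; i < poly.length; j = i++) {
            double xi = poly[i][0], yi = poly[i][1];
            double xj = poly[j][0], yj = poly[j][1];
            if ((yi > cy) != (yj > cy)
                    && cx < (xj - xi) * (cy - yi) / (yj - yi) + xi) {
                inside = !inside;
            }
        }
        return inside;
    }

    // 3. Is any edge of the polygon closer to the circle's center than the radius?
    static boolean anyEdgeIntersectsCircle(double[][] poly, double cx, double cy, double r) {
        for (int i = 0, j = poly.length - 1; i < poly.length; j = i++) {
            double x1 = poly[j][0], y1 = poly[j][1];
            double x2 = poly[i][0], y2 = poly[i][1];
            double dx = x2 - x1, dy = y2 - y1;
            double len2 = dx * dx + dy * dy;
            double t = len2 == 0 ? 0 : ((cx - x1) * dx + (cy - y1) * dy) / len2;
            t = Math.max(0, Math.min(1, t));            // clamp to the segment
            double px = x1 + t * dx, py = y1 + t * dy;  // closest point on the segment
            if (Math.hypot(cx - px, cy - py) <= r) {
                return true;
            }
        }
        return false;
    }

    static boolean circleIntersectsPolygon(double[][] poly, double cx, double cy, double r) {
        return anyVertexInsideCircle(poly, cx, cy, r)
                || centerInsidePolygon(poly, cx, cy)
                || anyEdgeIntersectsCircle(poly, cx, cy, r);
    }
}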
Just to add a bit of substance to the solution, here is a useful MKCircle category I wrote that checks whether a point (in this case a polygon vertex) is inside the circle or not. Enjoy!
//MKCircle+PointInCircle.h
#import <Foundation/Foundation.h>
#import <MapKit/MapKit.h>
@interface MKCircle (PointInCircle)
- (BOOL)coordInCircle:(CLLocationCoordinate2D)coord;
@end
//MKCircle+PointInCircle.m
#import "MKCircle+PointInCircle.h"
@implementation MKCircle (PointInCircle)

- (BOOL)coordInCircle:(CLLocationCoordinate2D)coord {
    CLLocation *locFrom = [[CLLocation alloc] initWithLatitude:self.coordinate.latitude longitude:self.coordinate.longitude];
    CLLocation *locTo = [[CLLocation alloc] initWithLatitude:coord.latitude longitude:coord.longitude];
    double distance = [locFrom distanceFromLocation:locTo];
    BOOL isInside = (distance <= self.radius);
    return isInside;
}

@end

Reflecting a circle off another circle

Working with iPhone and Objective C.
I am working on a game and I need to correctly reflect a ball off a circle object. I am trying to do it as a line and circle intersection. I have my ball position outside the circle and I have the new ball position that would be inside the circle at the next draw update. I know the intersect point of the line (ball path) and the circle. Now I want to rotate the ending point of the ball path about the intersection point to get the correct angle of reflection off the tangent.
The following are known:
ball current x,y
ball end x,y
ball radius
circle center x,y
circle radius
intersection point of ball path and circle x and y
I know I need to find the angle of incidence between the tangent line and the incoming ball path, which will also equal my angle of reflection. I think once I know those two angles I can subtract them from 180 to get my rotation angle, then rotate my end point about the intersection point by that amount. I just don't know how.
First, you should note that the center of the ball doesn't have to be inside the circle for there to be a reflection or bounce. As long as the distance between the ball's center and the circle's edge is less than the radius of the ball, there will be a bounce.
If the radius of the circle is R and the radius of the ball is r, things are simplified if you convert to the case where the circle has radius R+r and the ball has radius 0. For the purposes of collision detection and reflection/bouncing, this is equivalent.
If you have the point of intersection between the (enlarged) circle and the ball's path, you can easily compute the normal N to the circle at that point (it is the unit vector in the direction from the center of the circle to the collision point).
For an incoming vector V the reflected vector is V-2(N⋅V) N, where (N⋅V) is the dot product. For this problem, the incoming vector V is the vector from the intersection point to the point inside the circle.
As for the reflection formula given above, it is relatively easy to derive using vector math, but you can also Google search terms like "calculate reflection vector". The signs in the formula will vary with the assumed directions of V and N. Mathworld has a derivation although, as noted, the signs are different.
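As a concrete sketch of that reflection step (plain Java; the names and the (x, y)-array representation are just for illustration):
final class CircleBounce {
    // circleCenter: center of the enlarged (R + r) circle
    // hit:          intersection of the ball's path with that circle
    // end:          the ball's would-be end point inside the circle
    static double[] reflectedEndPoint(double[] circleCenter, double[] hit, double[] end) {
        // Unit normal N at the collision point: from the circle's center toward the hit point.
        double nx = hit[0] - circleCenter[0];
        double ny = hit[1] - circleCenter[1];
        double nLen = Math.hypot(nx, ny);
        nx /= nLen;
        ny /= nLen;

        // Incoming vector V: from the hit point to the point inside the circle.
        double vx = end[0] - hit[0];
        double vy = end[1] - hit[1];

        // Reflected vector R = V - 2 (N . V) N
        double dot = nx * vx + ny * vy;
        double rx = vx - 2 * dot * nx;
        double ry = vy - 2 * dot * ny;

        // The corrected end point continues from the hit point along R.
        return new double[] { hit[0] + rx, hit[1] + ry };
    }
}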
I only know the solution to the geometry part.
Let:
r1 => Radius of ball
r2 => Radius of circle
You can calculate the distance between the two centers using the Pythagorean theorem.
If that distance is less than r1 + r2, then handle the collision.
For the collision, I would refer you here. It's in Python, but I think it should give you an idea of what to do, and hopefully you can even implement it in Objective-C. The tutorial is by PeterCollingRidge.

Calculating collision for a moving circle, without overlapping the boundaries

Let's say I have a circle bouncing around inside a rectangular area. At some point this circle will collide with one of the surfaces of the rectangle and reflect back. The usual way I'd do this would be to let the circle overlap that boundary and then reflect the velocity vector. The fact that the circle actually overlaps the boundary isn't usually a problem, nor really noticeable at low velocity. At high velocity it becomes quite clear that the circle is doing something it shouldn't.
What I'd like to do is to programmatically take reflection into account and place the circle at its proper position before displaying it on the screen. This means that I have to calculate the point where it hits the boundary between its current position and its future position -- rather than calculating its new position and then checking if it has hit the boundary.
This is a little bit more complicated than the usual circle/rectangle collision problem. I have a vague idea of how I should do it -- basically create a bounding rectangle between the current position and the new position, which brings up a slew of problems of its own (since the rectangle is rotated according to the direction of the circle's velocity). However, I'm thinking that this is a common problem, and that a common solution already exists.
Is there a common solution to this kind of problem? Perhaps some basic theories which I should look into?
Since you just have a circle and a rectangle, it's actually pretty simple. A circle of radius r bouncing around inside a rectangle of dimensions w, h can be treated the same as a point p at the circle's center, constrained to the inset rectangle whose x runs from r to w-r and whose y runs from r to h-r.
Now position update becomes simple. Given your point at position x, y and a per-frame velocity of dx, dy, the updated position is x+dx, y+dy - except when you cross a boundary. If, say, you end up with x+dx > W (letting W = w-r), then you do the following:
crossover = (x+dx) - W // this is how far "past" the edge your ball went
x = W - crossover // so you bring it back the same amount on the correct side
dx = -dx // and flip the velocity to the opposite direction
And similarly for y. You'll have to set up a similar (reflected) check for the opposite boundaries in each dimension.
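Here is that per-axis update as a small plain-Java sketch (names are illustrative; lo and hi are r and w - r, or r and h - r, and at most one bounce per frame is assumed):
final class BoxBounce {
    // Returns { newPosition, newVelocity } for one axis.
    static double[] stepAxis(double pos, double vel, double lo, double hi) {
        pos += vel;
        if (pos > hi) {
            double crossover = pos - hi; // how far past the edge the ball went
            pos = hi - crossover;        // bring it back the same amount on the correct side
            vel = -vel;                  // flip the velocity to the opposite direction
        } else if (pos < lo) {
            double crossover = lo - pos;
            pos = lo + crossover;
            vel = -vel;
        }
        return new double[] { pos, vel };
    }
}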
At each step, you can calculate the projected/expected position of the circle for the next frame.
If this lies outside the rectangle, you can then use the distance from the old circle position to the rectangle's edge, and the amount "past" the rectangle's edge that the next position lies at (the interpenetration), to linearly interpolate and determine the precise time when the circle "hits" the rectangle edge.
For example, if the circle is 10 pixels away from the rectangle's edge and is predicted to move to 5 pixels beyond it, you know that for 2/3rds of the timestep (10/15ths) it moves on its original path, then is reflected and continues on its new path for the remaining 1/3rd of the timestep (5/15ths). By calculating these two parts of the motion and "adding" the translations together, you can find the correct new position.
(Of course, it gets more complicated if you hit near a corner, as there may be several collisions during the timestep, off different edges. And if you have more than one circle moving, things get a lot more complex. But that's where you can start for the case you've asked about)
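Here is that interpolation for one axis as a plain-Java sketch (illustrative names; only the upper boundary hi is handled, and at most one bounce per timestep is assumed):
final class InterpolatedBounce {
    // Returns { newPosition, newVelocity } for one axis.
    static double[] stepAxis(double pos, double vel, double hi) {
        double next = pos + vel;                    // projected position for the next frame
        if (next > hi) {
            double tHit = (hi - pos) / vel;         // fraction of the timestep before impact (e.g. 10/15)
            double remaining = (1.0 - tHit) * vel;  // motion left after the bounce (e.g. 5/15 of the step)
            pos = hi - remaining;                   // travel the remainder in the reflected direction
            vel = -vel;
        } else {
            pos = next;
        }
        return new double[] { pos, vel };
    }
}
For straight-line motion within a frame this ends up at the same position as the simple crossover method above; the extra value is tHit, which tells you exactly when in the timestep the impact occurred, which is what helps with corners and multiple collisions.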
Reflection across a rectangular boundary is incredibly simple. Just take the amount that the object passed the boundary and subtract it from the boundary position. If the position without reflecting would be (-0.8,-0.2) for example and the upper left corner is at (0,0), the reflected position would be (0.8,0.2).