Here is what I want:
http://postimage.org/image/9pq8m79hx/
I know the coordinates of the O point and of the X point. Is there any possibility of finding the V angle (the angle from the North orientation) using iOS methods?
Yep.
#import <math.h>
float a = -1 * atan2(y1 - y0, x1 - x0);
if (a >= 0) {
    a += M_PI / 2;
} else if (a < 0 && a >= -M_PI / 2) {
    a += M_PI / 2;
} else {
    a += 2 * M_PI + M_PI / 2;
}
if (a > 2 * M_PI) a -= 2 * M_PI;
Now a will contain the angle in radians, in the interval 0...2π.
Doesn't even need any iOS-specific APIs. Remember: iOS still has all the features of libc.
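If you'd rather have the whole conversion in one place, here is the same idea as a compact Swift sketch (the sample points are placeholders, not from the question):

import Foundation

// Bearing from north, clockwise, in 0..<2π, for the vector (x0, y0) -> (x1, y1),
// assuming standard math coordinates (y grows upward).
func bearingFromNorth(x0: Double, y0: Double, x1: Double, y1: Double) -> Double {
    // atan2 gives the counterclockwise angle from east; negate it and
    // rotate by π/2 so the angle is measured clockwise from north.
    var a = -atan2(y1 - y0, x1 - x0) + .pi / 2
    if a < 0 { a += 2 * .pi }  // normalize into 0..<2π
    return a
}

// (1, 1) seen from the origin lies to the northeast: bearing π/4 (45°).
print(bearingFromNorth(x0: 0, y0: 0, x1: 1, y1: 1))  // ≈ 0.785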
I'm not sure user529758's answer addresses the question, which I read as being about changing East as 0 degrees to North as 0 degrees. The code below works; the key line is the one that adds 90 degrees, which moves 0 degrees from East to North.
- (CGFloat)bearingFromNorthBetweenStartPoint:(CGPoint)startPoint andEndPoint:(CGPoint)endPoint {
    // get the vector from the start point to the end point
    CGPoint origin = CGPointMake(endPoint.x - startPoint.x, endPoint.y - startPoint.y);

    // get the bearing in radians
    CGFloat bearingInRadians = atan2f(origin.y, origin.x);

    // convert the bearing from radians to degrees
    CGFloat bearingInDegrees = bearingInRadians * (180.0 / M_PI);

    // shift the bearing so that 0 / 360 degrees is North rather than East
    bearingInDegrees = 90 + bearingInDegrees;

    // log for debugging:
    if (bearingInDegrees >= 0) {
        NSLog(@"Bearing >= 0 in degrees: %.1f degrees", bearingInDegrees);
    } else {
        bearingInDegrees = 360 + bearingInDegrees;
        NSLog(@"Bearing in degrees: %.1f degrees", bearingInDegrees);
    }
    return bearingInDegrees;
}
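A quick sanity check of the conversion, as a Swift sketch of the same math (assuming UIKit-style coordinates where y grows downward, so "up" on screen is north):

import CoreGraphics
import Foundation

// Hypothetical Swift port of the method above: bearing from north in degrees, 0..<360.
func bearingFromNorth(from start: CGPoint, to end: CGPoint) -> CGFloat {
    let vector = CGPoint(x: end.x - start.x, y: end.y - start.y)
    var degrees = atan2(vector.y, vector.x) * 180 / .pi + 90
    if degrees < 0 { degrees += 360 }  // wrap negative angles into 0..<360
    return degrees
}

print(bearingFromNorth(from: .zero, to: CGPoint(x: 0, y: -1)))  // 0.0  (up / north)
print(bearingFromNorth(from: .zero, to: CGPoint(x: 1, y: 0)))   // 90.0 (right / east)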
I have a problem with collision detection between a circle and a rectangle. I have tried to solve it with the Pythagorean theorem, but none of the checks works: the collision fires for the rectangular bounding box of the circle instead of the circle itself.
if (CGRectIntersectsRect(player.frame, visibleEnemy.frame)) {
    if ([visibleEnemy spriteTyp] == jumper || [visibleEnemy spriteTyp] == wobble) {
        if ((visibleEnemy.center.x - player.frame.origin.x) * (visibleEnemy.center.x - player.frame.origin.x) +
            (visibleEnemy.center.y - player.frame.origin.y) * (visibleEnemy.center.y - player.frame.origin.y) <=
            (visibleEnemy.bounds.size.width/2 * visibleEnemy.bounds.size.width/2)) {
            NSLog(@"Check 1");
            normalAction = NO;
        }
        if ((visibleEnemy.center.x - (player.frame.origin.x + player.bounds.size.width)) *
            (visibleEnemy.center.x - (player.frame.origin.x + player.bounds.size.width)) +
            (visibleEnemy.center.y - player.frame.origin.y) * (visibleEnemy.center.y - player.frame.origin.y) <=
            (visibleEnemy.bounds.size.width/2 * visibleEnemy.bounds.size.width/2)) {
            NSLog(@"Check 2");
            normalAction = NO;
        }
        else {
            NSLog(@"Check 3");
            normalAction = NO;
        }
    }
}
Here is how I did it in one of my small gaming projects. It gave me the best results and it's simple. My code detects whether there is a collision between a circle and a line, so you can easily adapt it to circle-rectangle collision detection by checking all 4 edges of the rectangle.
Let's say that the ball has radius ballRadius and location (xBall, yBall). The line is defined by two points, (xStart, yStart) and (xEnd, yEnd).
Implementation of a simple collision detection:
float ballRadius = ...;
float x1 = xStart - xBall;
float y1 = yStart - yBall;
float x2 = xEnd - xBall;
float y2 = yEnd - yBall;
float dx = x2 - x1;
float dy = y2 - y1;
float dr = sqrtf(powf(dx, 2) + powf(dy, 2));
float D = x1*y2 - x2*y1;
float delta = powf(ballRadius*0.9,2)*powf(dr,2) - powf(D,2);
if (delta >= 0)
{
    // Collision detected
}
If delta is greater than zero, there are two intersections between the ball (circle) and the line. If delta is equal to zero, there is exactly one intersection, a perfect tangential collision.
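To adapt this to the circle-rectangle case, the same delta test can be run against all four edges of the rectangle. A small Swift sketch of that idea (note the assumptions: each edge is treated as an infinite line, just like the formula above, and the 0.9 tweak factor is left out):

import CoreGraphics

// delta >= 0 means the circle (center, radius) meets the infinite line through p1-p2.
func circleIntersectsLine(center: CGPoint, radius: CGFloat, p1: CGPoint, p2: CGPoint) -> Bool {
    // shift coordinates so the circle sits at the origin
    let x1 = p1.x - center.x, y1 = p1.y - center.y
    let x2 = p2.x - center.x, y2 = p2.y - center.y
    let dx = x2 - x1, dy = y2 - y1
    let drSquared = dx * dx + dy * dy
    let d = x1 * y2 - x2 * y1
    let delta = radius * radius * drSquared - d * d  // same discriminant as above
    return delta >= 0
}

func circleIntersectsRect(center: CGPoint, radius: CGFloat, rect: CGRect) -> Bool {
    let corners = [CGPoint(x: rect.minX, y: rect.minY), CGPoint(x: rect.maxX, y: rect.minY),
                   CGPoint(x: rect.maxX, y: rect.maxY), CGPoint(x: rect.minX, y: rect.maxY)]
    // test each edge (corner i to corner i+1, wrapping around)
    return (0..<4).contains { i in
        circleIntersectsLine(center: center, radius: radius,
                             p1: corners[i], p2: corners[(i + 1) % 4])
    }
}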
I hope it will help you.
I am trying to determine the midpoint between two locations in an MKMapView. I am following the method outlined here (and here) and rewrote it in Objective-C, but the map is being centered somewhere northeast of Baffin Island, which is nowhere near the two points.
My method, based on the Java method linked above:
+(CLLocationCoordinate2D)findCenterPoint:(CLLocationCoordinate2D)_lo1 :(CLLocationCoordinate2D)_loc2 {
    CLLocationCoordinate2D center;

    double lon1 = _lo1.longitude * M_PI / 180;
    double lon2 = _loc2.longitude * M_PI / 100;
    double lat1 = _lo1.latitude * M_PI / 180;
    double lat2 = _loc2.latitude * M_PI / 100;

    double dLon = lon2 - lon1;
    double x = cos(lat2) * cos(dLon);
    double y = cos(lat2) * sin(dLon);

    double lat3 = atan2(sin(lat1) + sin(lat2), sqrt((cos(lat1) + x) * (cos(lat1) + x) + y * y));
    double lon3 = lon1 + atan2(y, cos(lat1) + x);

    center.latitude = lat3 * 180 / M_PI;
    center.longitude = lon3 * 180 / M_PI;
    return center;
}
The two parameters have the following data:
_loc1:
latitude = 45.4959839
longitude = -73.67826455
_loc2:
latitude = 45.482889
longitude = -73.57522299
Both are correctly placed on the map (in and around Montreal). I am trying to center the map on the midpoint between the two, yet my method returns the following:
latitude = 65.29055
longitude = -82.55425
which is somewhere in the Arctic, when it should be around 500 miles further south.
In case someone needs this in Swift, I have written a library function in Swift to calculate the midpoint between MULTIPLE coordinates:
/** Degrees to Radians **/
class func degreeToRadian(angle: CLLocationDegrees) -> CGFloat {
    return CGFloat(angle) / 180.0 * CGFloat(M_PI)
}

/** Radians to Degrees **/
class func radianToDegree(radian: CGFloat) -> CLLocationDegrees {
    return CLLocationDegrees(radian * CGFloat(180.0 / M_PI))
}
class func middlePointOfListMarkers(listCoords: [CLLocationCoordinate2D]) -> CLLocationCoordinate2D {
    var x = 0.0 as CGFloat
    var y = 0.0 as CGFloat
    var z = 0.0 as CGFloat

    for coordinate in listCoords {
        let lat: CGFloat = degreeToRadian(coordinate.latitude)
        let lon: CGFloat = degreeToRadian(coordinate.longitude)
        x = x + cos(lat) * cos(lon)
        y = y + cos(lat) * sin(lon)
        z = z + sin(lat)
    }

    x = x / CGFloat(listCoords.count)
    y = y / CGFloat(listCoords.count)
    z = z / CGFloat(listCoords.count)

    let resultLon: CGFloat = atan2(y, x)
    let resultHyp: CGFloat = sqrt(x * x + y * y)
    let resultLat: CGFloat = atan2(z, resultHyp)

    let newLat = radianToDegree(resultLat)
    let newLon = radianToDegree(resultLon)
    return CLLocationCoordinate2D(latitude: newLat, longitude: newLon)
}
A detailed answer can be found here.
Updated for Swift 5:
func geographicMidpoint(betweenCoordinates coordinates: [CLLocationCoordinate2D]) -> CLLocationCoordinate2D {
    guard coordinates.count > 1 else {
        return coordinates.first ??                           // return the only coordinate
            CLLocationCoordinate2D(latitude: 0, longitude: 0) // return null island if no coordinates were given
    }

    var x = Double(0)
    var y = Double(0)
    var z = Double(0)

    for coordinate in coordinates {
        let lat = coordinate.latitude.toRadians()
        let lon = coordinate.longitude.toRadians()
        x += cos(lat) * cos(lon)
        y += cos(lat) * sin(lon)
        z += sin(lat)
    }

    x /= Double(coordinates.count)
    y /= Double(coordinates.count)
    z /= Double(coordinates.count)

    let lon = atan2(y, x)
    let hyp = sqrt(x * x + y * y)
    let lat = atan2(z, hyp)

    return CLLocationCoordinate2D(latitude: lat.toDegrees(), longitude: lon.toDegrees())
}
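Note that toRadians() and toDegrees() are not part of Foundation; this version assumes small helper extensions along these lines:

import CoreLocation

extension CLLocationDegrees {
    // CLLocationDegrees is a typealias for Double
    func toRadians() -> Double { return self * .pi / 180 }
    func toDegrees() -> Double { return self * 180 / .pi }
}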
Just a hunch, but I noticed your lon2 and lat2 variables are being computed with M_PI/100 and not M_PI/180.
double lon1 = _lo1.longitude * M_PI / 180;
double lon2 = _loc2.longitude * M_PI / 100;
double lat1 = _lo1.latitude * M_PI / 180;
double lat2 = _loc2.latitude * M_PI / 100;
Changing those to 180 might help you out a bit.
For Swift users, the corrected variant as @dinjas suggests:
import Foundation
import MapKit

extension CLLocationCoordinate2D {

    // MARK: CLLocationCoordinate2D+MidPoint
    func middleLocationWith(location: CLLocationCoordinate2D) -> CLLocationCoordinate2D {
        let lon1 = longitude * M_PI / 180
        let lon2 = location.longitude * M_PI / 180
        let lat1 = latitude * M_PI / 180
        let lat2 = location.latitude * M_PI / 180

        let dLon = lon2 - lon1
        let x = cos(lat2) * cos(dLon)
        let y = cos(lat2) * sin(dLon)

        let lat3 = atan2(sin(lat1) + sin(lat2), sqrt((cos(lat1) + x) * (cos(lat1) + x) + y * y))
        let lon3 = lon1 + atan2(y, cos(lat1) + x)

        let center: CLLocationCoordinate2D = CLLocationCoordinate2DMake(lat3 * 180 / M_PI, lon3 * 180 / M_PI)
        return center
    }
}
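Usage, with the Montreal coordinates from the question (the result is approximate):

import CoreLocation

let loc1 = CLLocationCoordinate2D(latitude: 45.4959839, longitude: -73.67826455)
let loc2 = CLLocationCoordinate2D(latitude: 45.482889, longitude: -73.57522299)
let center = loc1.middleLocationWith(location: loc2)
// center ≈ (45.489, -73.627), between the two points as expected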
It's important to say that the formula the OP used to calculate the geographic midpoint is based on this formula, which explains the cos/sin/sqrt calculation.
This formula will give you the geographic midpoint for any long distance, including routes that cross the four quadrants of the globe and the prime meridian.
But if your calculation is short-range, around 1 kilometer, using a simple average will produce the same midpoint results.
For example:
let firstPoint = CLLocation(....)
let secondPoint = CLLocation(....)
let midPointLat = (firstPoint.coordinate.latitude + secondPoint.coordinate.latitude) / 2
let midPointLong = (firstPoint.coordinate.longitude + secondPoint.coordinate.longitude) / 2
You can actually use it for up to about 10 km, but expect some deviation; if you only need a fast estimate of a short-range midpoint, it will be sufficient.
I think you are overthinking it a bit. Just do:
float lon3 = (lon1 + lon2) / 2;
float lat3 = (lat1 + lat2) / 2;
lat3 and lon3 will be the center point.
I want to get the angle between two lines, so I used this code:
int posX = (ScreenWidth) >> 1;
int posY = (ScreenHeight) >> 1;
double radians, degrees;
radians = atan2f( y - posY , x - posX);
degrees = -CC_RADIANS_TO_DEGREES(radians);
NSLog(@"%f %f", degrees, radians);
But it doesn't work.
The log output is: 146.309935 -2.553590
What's wrong? I can't figure out the reason. Please help me.
If you simply use
radians = atan2f( y - posY , x - posX);
you'll get the angle relative to the horizontal line y = posY (the blue angle in the figure).
You'll need to add M_PI_2 to your radians value to get the correct result.
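In code, the fix amounts to something like this (a Swift sketch; posX/posY and the touched point are example values standing in for the question's variables):

import Foundation

let posX = 512.0, posY = 384.0   // screen center, example values
let x = 612.0, y = 284.0         // some touched point, example values

// measure against the vertical line x = posX instead of the horizontal one
let radians = atan2(y - posY, x - posX) + .pi / 2
let degrees = radians * 180 / .pi  // 45.0 for this sample point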
Here's a function I use. It works great for me...
float cartesianAngle(float x, float y) {
float a = atanf(y / (x ? x : 0.0000001));
if (x > 0 && y > 0) a += 0;
else if (x < 0 && y > 0) a += M_PI;
else if (x < 0 && y < 0) a += M_PI;
else if (x > 0 && y < 0) a += M_PI * 2;
return a;
}
EDIT: After some research I found out you can just use atan2(y,x). Most compiler libraries have this function. You can ignore my function above.
If you have 3 points and want to calculate the angle between them, here is a quick and correct way to calculate the right angle value:
double AngleBetweenThreePoints(CGPoint pointA, CGPoint pointB, CGPoint pointC)
{
    CGFloat a = pointB.x - pointA.x;
    CGFloat b = pointB.y - pointA.y;
    CGFloat c = pointB.x - pointC.x;
    CGFloat d = pointB.y - pointC.y;

    CGFloat atanA = atan2(a, b);
    CGFloat atanB = atan2(c, d);

    return atanB - atanA;
}
This will work for you if you specify a point on one of the lines, the intersection point, and a point on the other line.
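For example, a Swift port with a quick check (hypothetical names; the math is unchanged):

import CoreGraphics
import Foundation

func angleBetweenThreePoints(_ pointA: CGPoint, _ pointB: CGPoint, _ pointC: CGPoint) -> CGFloat {
    let atanA = atan2(pointB.x - pointA.x, pointB.y - pointA.y)
    let atanB = atan2(pointB.x - pointC.x, pointB.y - pointC.y)
    return atanB - atanA
}

// Two perpendicular lines meeting at the origin:
let angle = angleBetweenThreePoints(CGPoint(x: 0, y: -1),  // point on the first line
                                    CGPoint(x: 0, y: 0),   // intersection point
                                    CGPoint(x: 1, y: 0))   // point on the second line
// angle == -π/2; take the absolute value (or normalize) if you need 0...π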
Let me just start with the code.
- (NSPoint*) pointFromPoint:(NSPoint*)point withDistance:(float)distance towardAngle:(float)angle; {
float newX = distance * cos(angle);
float newY = distance * sin(angle);
NSPoint * anNSPoint;
anNSPoint.x = newX;
anNSPoint.y = newY;
return thePoint;
}
This should, based on my knowledge, be perfect. It should return an x value of 0 and a y value of 2 if I call this code:
somePoint = [NSPoint pointFromPoint:somePoint withDistance:2 towardAngle:90];
Instead, I get an x value of 1.05 and a y value of 1.70. How can I find the x and y coordinates based on an angle and a distance?
Additional note: I have looked on math.stackexchange.com, but the formulas there led me to this. I need the code, not the plain math, because I know I will probably screw this up.
A working version of your function, which accepts values in degrees instead of radians, would look like this:
- (NSPoint)pointFromPoint:(NSPoint)origin withDistance:(float)distance towardAngle:(float)angle
{
    double radAngle = angle * M_PI / 180.0;  // degrees -> radians
    return NSMakePoint(origin.x + distance * cos(radAngle), origin.y + distance * sin(radAngle));
}
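A quick check of the same math in Swift confirms the result the question expected:

import CoreGraphics
import Foundation

// Hypothetical Swift version of the fixed method above.
func point(from origin: CGPoint, distance: CGFloat, towardDegrees angle: CGFloat) -> CGPoint {
    let radians = angle * .pi / 180  // convert degrees to radians first
    return CGPoint(x: origin.x + distance * cos(radians),
                   y: origin.y + distance * sin(radians))
}

let p = point(from: .zero, distance: 2, towardDegrees: 90)
// p.x ≈ 0, p.y == 2, just as expected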
Your problem is that you're giving the angle in degrees (e.g. 90), but the math expects it in radians. Try replacing the 90 with M_PI_2.
I have some problems figuring out where my error is. Here is my situation:
I have an image and the corresponding GPS coordinates of its top-left and bottom-right corners, e.g.:
topLeft.longitude = 8.235128;
topLeft.latitude = 49.632383;
bottomRight.longitude = 8.240547;
bottomRight.latitude = 49.629808;
Now I have a point that lies on that map:
p.longitude = 8.238567;
p.latitude = 49.630664;
I draw my image fullscreen in landscape (1024×748).
Now I want to calculate the exact pixel position (x, y) of my point.
To do that, I am trying to use the great-circle distance approach from here: Link.
CGFloat DegreesToRadians(CGFloat degrees)
{
    return degrees * M_PI / 180;
}

- (float)calculateDistanceP1:(CLLocationCoordinate2D)p1 andP2:(CLLocationCoordinate2D)p2 {
    double circumference = 40000.0; // Earth's circumference in km at the equator
    double distance = 0.0;

    double latitude1Rad = DegreesToRadians(p1.latitude);
    double longitude1Rad = DegreesToRadians(p1.longitude);
    double latitude2Rad = DegreesToRadians(p2.latitude);
    double longitude2Rad = DegreesToRadians(p2.longitude);

    double longitudeDiff = fabs(longitude1Rad - longitude2Rad);
    if (longitudeDiff > M_PI)
    {
        longitudeDiff = 2.0 * M_PI - longitudeDiff;
    }

    double angleCalculation =
        acos(sin(latitude2Rad) * sin(latitude1Rad) + cos(latitude2Rad) * cos(latitude1Rad) * cos(longitudeDiff));
    distance = circumference * angleCalculation / (2.0 * M_PI);
    NSLog(@"%f", distance);
    return distance;
}
Here is my code for getting the pixel position:
- (CGPoint)calculatePoint:(CLLocationCoordinate2D)p {
    float x_coord;
    float y_coord;

    CLLocationCoordinate2D x1;
    CLLocationCoordinate2D x2;
    x1.longitude = p.longitude;
    x1.latitude = topLeft.latitude;
    x2.longitude = p.longitude;
    x2.latitude = bottomRight.latitude;

    CLLocationCoordinate2D y1;
    CLLocationCoordinate2D y2;
    y1.longitude = topLeft.longitude;
    y1.latitude = p.latitude;
    y2.longitude = bottomRight.longitude;
    y2.latitude = p.latitude;

    float distanceX = [self calculateDistanceP1:x1 andP2:x2];
    float distanceY = [self calculateDistanceP1:y1 andP2:y2];
    float distancePX = [self calculateDistanceP1:x1 andP2:p];
    float distancePY = [self calculateDistanceP1:y1 andP2:p];

    x_coord = fabs(distancePX * (1024 / distanceX)) - 1;
    y_coord = fabs(distancePY * (748 / distanceY)) - 1;
    return CGPointMake(x_coord, y_coord);
}
x1 and x2 are the points at the longitude of p with the latitudes of topLeft and bottomRight.
y1 and y2 are the points at the latitude of p with the longitudes of topLeft and bottomRight.
So I get the distance between left and right on the longitude of p, and the distance between top and bottom on the latitude of p (needed to calculate the pixel position).
Then I calculate the distance between x1 and p (my distance between x_0 and x_p), and after that the distance between y1 and p (the distance between y_0 and y_p).
Last but not least, the pixel position is calculated and returned.
The result is that my point ends up at the red position and NOT at the blue position:
Maybe you can spot my mistake, or have suggestions for improving the accuracy.
Maybe I didn't understand your question, but shouldn't you be using the Converting Map Coordinates methods of MKMapView?
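If the image is shown in an MKMapView (an assumption; the question may be using a plain image view), the conversion is one call:

import MapKit

let mapView = MKMapView(frame: CGRect(x: 0, y: 0, width: 1024, height: 748))
let p = CLLocationCoordinate2D(latitude: 49.630664, longitude: 8.238567)

// geographic coordinate -> pixel position inside the map view
let pixelPoint = mapView.convert(p, toPointTo: mapView)
// and back again, from a view point to a coordinate
let coordinate = mapView.convert(pixelPoint, toCoordinateFrom: mapView)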
See this image: I used your coordinates, and simply did the following:
x_coord = 1024 * (p.longitude - topLeft.longitude)/(bottomRight.longitude - topLeft.longitude);
y_coord = 748 - (748 * (p.latitude - bottomRight.latitude)/(topLeft.latitude - bottomRight.latitude));
The red dot marks this point. For such small distances you don't really need to use great circles, and your rounding errors will be making things much more inaccurate.
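Packaged as a small Swift helper for reuse (a sketch; the 1024×748 size and the corner coordinates come from the question):

import CoreGraphics
import CoreLocation

// Plain linear interpolation of a coordinate into image pixel space;
// fine for spans of a few kilometers, where the map is effectively flat.
func pixelPosition(of p: CLLocationCoordinate2D,
                   topLeft: CLLocationCoordinate2D,
                   bottomRight: CLLocationCoordinate2D,
                   imageSize: CGSize) -> CGPoint {
    let x = imageSize.width * CGFloat((p.longitude - topLeft.longitude) /
                                      (bottomRight.longitude - topLeft.longitude))
    let y = imageSize.height * CGFloat((topLeft.latitude - p.latitude) /
                                       (topLeft.latitude - bottomRight.latitude))
    return CGPoint(x: x, y: y)
}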