Swift ReverseGeocode CLPlacemark areasOfInterest almost always nil. Should I be using something else? - xcode8

I am using CLGeocoder and reverseGeocodeLocation to get my CLPlacemark. Why does name mostly come back as a street address, and why does areasOfInterest almost always come back as nil? areasOfInterest does appear for really major places like Apple HQ and airports, but stores such as Walmart and Publix come back nil. Should I be searching another way? Am I expecting more information than is available with this method? Apple has these points of interest on their maps, so is there another way I should be trying to get this information?
Here are a few lat/long locations I've tried in my area that aren't bringing back the store names. These came from Google, and when put into Apple Maps they bring you right on top of the correct location, but reverse geocoding still doesn't associate the name. This makes me think I should be doing something different to bring back the name of the store. Other info like a description or category would be nice as well.
Note: I only want the information; I am not trying to place it on a map or anything.
Walmart: 35.0944° N, 85.3319° W
Aquarium: 35.0558° N, 85.3111° W
Publix: 35.0651° N, 85.3083° W
Here's a small bit of my code. It all works; I just wanted to give you an idea of what I'm bringing back and how.
CLGeocoder().reverseGeocodeLocation(manager.location!, completionHandler: { (placemarks, error) -> Void in
    if error == nil, let placemarks = placemarks, placemarks.count >= 1 {
        let thePlacemarks = placemarks[0]
        print(placemarks)
        print(thePlacemarks.areasOfInterest?.description)
        print(thePlacemarks.administrativeArea?.description)
        print(thePlacemarks.country?.description)
        print(thePlacemarks.inlandWater?.description)
        print(thePlacemarks.isoCountryCode?.description)
        print(thePlacemarks.locality?.description)
        print(thePlacemarks.location?.description)
        print(thePlacemarks.name?.description)
        print(thePlacemarks.ocean?.description)
        print(thePlacemarks.subAdministrativeArea?.description)
        print()
    }
})
Any help would be great!
Thanks!

So... not necessarily ideal, but by using MapKit I was able to do an MKLocalSearch to get what I wanted. This only works because I have an array in my code of the specific locations I am interested in. See my code below.
import MapKit

let listArr = ["Walmart", "Publix", "Game Stop"]
Then, somewhere in your view controller:
func searchArr() // Iterates through the array and checks whether any of the locations are within 30 meters
{
    for element in listArr
    {
        let request = MKLocalSearchRequest()
        request.naturalLanguageQuery = element
        request.region = MKCoordinateRegionMakeWithDistance(currentCoordinates, 3200, 3200)
        MKLocalSearch(request: request).start { (response, error) in
            guard error == nil else { print(error!); return }
            guard let response = response else { return }
            guard response.mapItems.count > 0 else { return }
            print(response.mapItems[0])
            let distance = self.calculateDistance(fromlat: self.currentCoordinates.latitude,
                                                  fromlon: self.currentCoordinates.longitude,
                                                  tolat: response.mapItems[0].placemark.coordinate.latitude,
                                                  tolon: response.mapItems[0].placemark.coordinate.longitude)
            if distance < 30 // within 30 meters of the current location
            {
                print("the distance between the two points is: \(distance) meters")
            }
        }
    }
}
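For context, here's a hedged sketch of one way searchArr() could be driven. It assumes currentCoordinates is a stored property on the view controller that you update from the CLLocationManager delegate (the original post doesn't show where it's set, so this wiring is hypothetical):

func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    guard let latest = locations.last else { return }
    currentCoordinates = latest.coordinate // assumed stored property
    searchArr()
}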
Here is a little function I found to get the distance between two coordinates.
func calculateDistance(fromlat: Double, fromlon: Double, tolat: Double, tolon: Double) -> Double {
    let DEG_TO_RAD = 0.017453292519943295769236907684886
    let EARTH_RADIUS_IN_METERS = 6372797.560856
    let latitudeArc: Double = (fromlat - tolat) * DEG_TO_RAD
    let longitudeArc: Double = (fromlon - tolon) * DEG_TO_RAD
    var latitudeH: Double = sin(latitudeArc * 0.5)
    latitudeH *= latitudeH
    var longitudeH: Double = sin(longitudeArc * 0.5)
    longitudeH *= longitudeH
    let tmp: Double = cos(fromlat * DEG_TO_RAD) * cos(tolat * DEG_TO_RAD)
    return EARTH_RADIUS_IN_METERS * 2.0 * asin(sqrt(latitudeH + tmp * longitudeH))
}
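Note that Core Location can also do this for you: CLLocation has a built-in distance(from:) method that returns meters, so the haversine helper above is optional. A minimal sketch (storeCoordinate here stands for the map item's placemark coordinate):

let from = CLLocation(latitude: currentCoordinates.latitude, longitude: currentCoordinates.longitude)
let to = CLLocation(latitude: storeCoordinate.latitude, longitude: storeCoordinate.longitude)
let meters = from.distance(from: to) // CLLocationDistance, in meters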

Related

AGSMAPView callout not showing on touch point

I am new to ArcGIS. Any help will be appreciated.
I am showing a callout in the didTap delegate method like this:
func geoView(_ geoView: AGSGeoView, didTapAtScreenPoint screenPoint: CGPoint, mapPoint: AGSPoint) {
    isFromSearch = false
    MBProgressHUD.showAdded(to: self.view, animated: true)
    self.mapView.identifyLayers(atScreenPoint: screenPoint, tolerance: 12, returnPopupsOnly: false, maximumResultsPerLayer: 10) { (identifyLayerResults: [AGSIdentifyLayerResult]?, error: Error?) in
        // Check for errors and ensure identifyLayerResults is not nil
        MBProgressHUD.hide(for: self.view, animated: true)
        if let error = error {
            print(error)
            return
        }
        guard let identifyLayerResults = identifyLayerResults else { return }
        // Iterate the identify layer results
        guard identifyLayerResults.count > 0 else { return }
        guard identifyLayerResults[0].sublayerResults.count > 0 else { return }
        guard identifyLayerResults[0].sublayerResults[0].geoElements.count > 0 else { return }
        let result = identifyLayerResults[0].sublayerResults[0].geoElements[0].attributes
        self.identifyLayerResult = identifyLayerResults[0]
        var title: String? = nil
        var subtitle: String? = nil
        if (result["SiteCode"] as? String) != nil && (result["SiteName"] as? String) != nil {
            title = result["SiteCode"] as? String
            subtitle = result["SiteName"] as? String
        } else {
            title = result["company"] as? String
            subtitle = result["identifier"] as? String
        }
        self.mapView.callout.title = title
        self.mapView.callout.detail = subtitle
        self.mapView.callout.show(at: mapPoint, screenOffset: .zero, rotateOffsetWithMap: false, animated: true)
    }
}
Everything works fine the first time. But the user can also search for places using a REST API, and then the map view moves to that point and shows a callout:
https://******/arcgis/rest/services/Google/MobileiOS3/MapServer/find?
It returns a site, and I create a viewpoint using the latitude and longitude and show the callout with a zoom-out and zoom-in animation. The code is given below:
let pointView = AGSViewpoint(latitude: center.latitude, longitude: center.longitude, scale: 12E7)
self.mapView.setViewpoint(pointView, duration: 2) { _ in
    let pointView1 = AGSViewpoint(latitude: center.latitude, longitude: center.longitude, scale: 12E4)
    self.mapView.setViewpoint(pointView1, duration: 2) { _ in
        let wgs84 = AGSSpatialReference(wkid: 4236)
        let point = AGSPoint(x: center.latitude, y: center.longitude, spatialReference: wgs84)
        let marker = AGSPictureMarkerSymbol(image: UIImage(named: "BluePushpin.png")!)
        marker.leaderOffsetX = 9
        marker.leaderOffsetY = -16
        let graphics = AGSGraphic(geometry: point, symbol: marker, attributes: nil)
        self.mGraphicOverlay.graphics.add(graphics)
        let cgPoint = CGPoint(x: self.mapView.center.x, y: self.mapView.center.y - (self.mapView.callout.frame.height + 33))
        print(cgPoint)
        self.mapView.callout.show(at: graphics.geometry as! AGSPoint, screenOffset: cgPoint, rotateOffsetWithMap: false, animated: true)
    }
}
After that, when I tap any point on the map, the callout always shows at the top-left corner, whereas the didTap delegate was working fine the first time.
When I debug the code and print the callout frame, it always shows zero x and zero y.
There are a few things going on here:
Firstly, I think you've got the wrong spatial reference. You're using 4236 but WGS84 is 4326.
Note, you can avoid this type of typo by just referencing AGSSpatialReference.wgs84(), so you could say this:
let point = AGSPoint(x: center.latitude, y: center.longitude, spatialReference: .wgs84())
But look closely at that: you're also using latitude as X and longitude as Y. It's unfortunately confusing, but when you create an x,y point you need to specify longitude,latitude, not latitude,longitude:
let point = AGSPoint(x: center.longitude, y: center.latitude, spatialReference: .wgs84())
It's a common mistake. We have a helper function to simplify working with lat/lon source data:
AGSPointMakeWGS84(center.latitude, center.longitude)
You're also doing a bit more work than you need to (especially in forcing spatial reference conversions) and are introducing a few potential areas where errors could creep in.
So, assuming you actually need to set the map scale twice (I guess you are zooming to the location, and then zooming in a bit to focus attention), you could try something like this, which seems much less prone to errors:
let centerPoint = AGSPointMakeWGS84(center.latitude, center.longitude)
self.mapView.setViewpoint(AGSViewpoint(center: centerPoint, scale: 12E7), duration: 2) { _ in
    self.mapView.setViewpoint(AGSViewpoint(center: centerPoint, scale: 12E4), duration: 2) { _ in
        let marker = AGSPictureMarkerSymbol(image: UIImage(named: "BluePushpin.png")!)
        marker.leaderOffsetX = 9
        marker.leaderOffsetY = -16
        let graphics = AGSGraphic(geometry: centerPoint, symbol: marker, attributes: nil)
        self.mGraphicOverlay.graphics.add(graphics)
        let cgPoint = CGPoint(x: self.mapView.center.x, y: self.mapView.center.y - (self.mapView.callout.frame.height + 33))
        print(cgPoint)
        self.mapView.callout.show(at: centerPoint, screenOffset: cgPoint, rotateOffsetWithMap: false, animated: true)
    }
}
I confess I'm not entirely sure what your callout screenOffset calculation is doing, but I've left that as is.
I might also suggest adding the graphic before you animate the view, or after the first animation and before the second one (i.e. you're looking at the right location but have yet to zoom in a bit), but that's up to you.
Also, could I suggest that you post questions like this over at the ArcGIS Runtime SDK for iOS forum? We do monitor this space if you use the right tags, but you'll generally get more eyes on your questions over there.

ARKit – Spatial Audio barely changes the volume over distance

I created an SCNNode and added an audio source to it.
It is a mono audio file, and everything is set up correctly.
It works as spatial audio; that's not the problem.
The problem is that as I get closer or farther away, the volume barely changes. I know it changes if I get very, very far away, but it's nothing like Apple demonstrated here:
https://youtu.be/d9kb1LfNNU4?t=23
In some other games I see the audio volume really change within a single step of distance.
With mine, after one step you can't even tell the volume changed. You need at least 4 steps.
Does anyone have a clue why?
Code below:
SCNNode *audioNode = [[SCNNode alloc] init];
SCNAudioSource *audioSource = [[SCNAudioSource alloc] initWithFileNamed:audioFileName];
audioSource.loops = YES;
[audioSource load];
audioSource.volume = 0.05; // <-- i used different values. won't change much either
audioSource.positional = YES;
//audioSource.shouldStream = NO; // <-- makes no difference
[audioNode addAudioPlayer:[SCNAudioPlayer audioPlayerWithSource:audioSource]];
[audioNode runAction:[SCNAction playAudioSource:audioSource waitForCompletion:NO] completionHandler:nil];
[massNode addChildNode:audioNode];
Maybe it's the scale of the nodes?
The whole scene is around 4 feet in size.
When I add an object I usually scale it to 0.005 (otherwise it gets way too big).
But I also tried with one that was already at the right size from the .scn file.
It shouldn't affect anything, though, since the result is a coffee-table-sized scene and I can see the objects fine.
Updated.
Here's working code for controlling a sound's decay (works on iOS and macOS):
import AVFoundation
import ARKit
class ViewController: UIViewController, AVAudioMixing {

    @IBOutlet var sceneView: SCNView!
    // @IBOutlet var sceneView: ARSCNView!

    func destination(forMixer mixer: AVAudioNode,
                     bus: AVAudioNodeBus) -> AVAudioMixingDestination? {
        return nil
    }
    var volume: Float = 0.0
    var pan: Float = 0.0
    var sourceMode: AVAudio3DMixingSourceMode = .bypass
    var pointSourceInHeadMode: AVAudio3DMixingPointSourceInHeadMode = .bypass
    var renderingAlgorithm = AVAudio3DMixingRenderingAlgorithm.sphericalHead
    var rate: Float = 1.2
    var reverbBlend: Float = 40.0
    var obstruction: Float = -100.0
    var occlusion: Float = -100.0
    var position = AVAudio3DPoint(x: 0, y: 0, z: 10)

    let audioNode = SCNNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        let myScene = SCNScene()
        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(0, 0, 0)
        myScene.rootNode.addChildNode(cameraNode)

        // let sceneView = view as! SCNView
        sceneView.scene = myScene
        sceneView.backgroundColor = UIColor.orange

        let myPath = Bundle.main.path(forResource: "Mono_Audio", ofType: "mp3")
        let myURL = URL(fileURLWithPath: myPath!)
        let mySource = SCNAudioSource(url: myURL)!
        mySource.loops = true
        mySource.isPositional = true   // Positional Audio
        mySource.shouldStream = false  // FALSE for Positional Audio
        mySource.volume = volume
        mySource.reverbBlend = reverbBlend
        mySource.rate = rate
        mySource.load()

        let player = SCNAudioPlayer(source: mySource)
        let sphere: SCNGeometry = SCNSphere(radius: 0.1)
        let sphereNode = SCNNode(geometry: sphere)
        sphereNode.addChildNode(audioNode)
        myScene.rootNode.addChildNode(sphereNode)
        audioNode.addAudioPlayer(player)

        sceneView.audioEnvironmentNode.distanceAttenuationParameters.maximumDistance = 2
        sceneView.audioEnvironmentNode.distanceAttenuationParameters.referenceDistance = 0.1
        sceneView.audioEnvironmentNode.renderingAlgorithm = .auto

        // sceneView.audioEnvironmentNode.reverbParameters.enable = true
        // sceneView.audioEnvironmentNode.reverbParameters.loadFactoryReverbPreset(.plate)

        let hither = SCNAction.moveBy(x: 0, y: 0, z: 1, duration: 2)
        let thither = SCNAction.moveBy(x: 0, y: 0, z: -1, duration: 2)
        let sequence = SCNAction.sequence([hither, thither])
        let loop = SCNAction.repeatForever(sequence)
        sphereNode.runAction(loop)
    }
}
And, yes, you're absolutely right – there are some obligatory settings.
But there are 7 of them:
use AVAudioMixing protocol with its stubs (properties and methods).
use MONO audio file.
use source.isPositional = true.
use source.shouldStream = false.
assign maximumDistance value to distanceAttenuationParameters property.
assign referenceDistance value to distanceAttenuationParameters property.
and the location of mySource.load() in your code is very important.
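To make the distance-related items in that list concrete, here is a minimal sketch of just the attenuation setup pulled out of the full example above (the values are illustrative and need tuning to your scene's scale; rolloffFactor and the attenuation model are optional extras beyond the list):

let attenuation = sceneView.audioEnvironmentNode.distanceAttenuationParameters
attenuation.referenceDistance = 0.1              // full volume within 0.1 m of the source
attenuation.maximumDistance = 2.0                // attenuation stops changing beyond 2 m
attenuation.rolloffFactor = 2.0                  // steeper falloff between those distances
attenuation.distanceAttenuationModel = .exponential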
P.S. If the aforementioned tips didn't help, use these additional instance properties to make your sound even quieter, via the attenuation curve, obstacles, and the orientation of the implicit listener:
var rolloffFactor: Float { get set } // attenuation's graph, default = 1
var obstruction: Float { get set } // default = 0.0
var occlusion: Float { get set } // default = 0.0
var listenerAngularOrientation: AVAudio3DAngularOrientation { get set } //(0,0,0)
It definitely works if you write it in Objective-C as well.
In this example the audioNode is 1 meter away from the listener.
If none of the above answers seem to work, try the following code:
sceneView.audioEnvironmentNode.reverbParameters.enable = true
And if even that barely seems to work, or if you want a more pronounced effect, there is a property called level where you can set the reverb level:
sceneView.audioEnvironmentNode.reverbParameters.level = 40
(the level of the reverbParameters ranges from -40 to 40)

Count colors in image: `NSCountedSet` and `colorAtX` are very slow

I'm making an OS X app which creates a color scheme from the main colors of an image.
As a first step, I'm using NSCountedSet and colorAtX to get all the colors from an image and count their occurrences:
func sampleImage(#width: Int, height: Int, imageRep: NSBitmapImageRep) -> (NSCountedSet, NSCountedSet) {
    // Store all colors from image
    var colors = NSCountedSet(capacity: width * height)
    // Store the colors from left edge of the image
    var leftEdgeColors = NSCountedSet(capacity: height)

    // Loop over the image pixels
    var x = 0
    var y = 0
    while x < width {
        while y < height {
            // Instruments shows that `colorAtX` is very slow
            // and using `NSCountedSet` is also very slow
            if let color = imageRep.colorAtX(x, y: y) {
                if x == 0 {
                    leftEdgeColors.addObject(color)
                }
                colors.addObject(color)
            }
            y++
        }
        // Reset y every x loop
        y = 0
        // We sample a vertical line every x pixels
        x += 1
    }
    return (colors, leftEdgeColors)
}
My problem is that this is very slow. In Instruments, I can see there are two big bottlenecks: NSCountedSet and colorAtX.
So first I thought of replacing NSCountedSet with a pure Swift equivalent, but the new implementation was unsurprisingly much slower than NSCountedSet.
For colorAtX, there's this interesting SO answer but I haven't been able to translate it to Swift (and I can't use a bridging header to Objective-C for this project).
My problem when trying to translate this is I don't understand the unsigned char and char parts in the answer.
What should I try to scan the colors faster than with colorAtX?
Continue working on adapting the Objective-C answer because it's a good answer? Despite being stuck for now, maybe I can achieve this later.
Use another Foundation/Cocoa method that I don't know of?
Anything else that I could try to improve my code?
TL;DR
colorAtX is slow, and I don't understand how to adapt this Objective-C answer to Swift because of unsigned char.
The fastest alternative to colorAtX() would be iterating over the raw bytes of the image using let bitmapBytes = imageRep.bitmapData and composing the colour yourself from that information, which should be really simple if it's just RGBA data. Instead of your for x/y loop, do something like this...
let bitmapBytes = imageRep.bitmapData
var colors = Dictionary<UInt32, Int>()
var index = 0
for _ in 0..<(width * height) {
    let r = UInt32(bitmapBytes[index++])
    let g = UInt32(bitmapBytes[index++])
    let b = UInt32(bitmapBytes[index++])
    let a = UInt32(bitmapBytes[index++])
    let finalColor = (r << 24) + (g << 16) + (b << 8) + a
    if colors[finalColor] == nil {
        colors[finalColor] = 1
    } else {
        colors[finalColor]!++
    }
}
You will have to check the order of the RGBA values though, I just guessed!
The quickest way to maintain a count might just be a [UInt32: Int] dictionary of pixel values to counts, doing something like colors[color]++. Later on, if you need to, you can convert a value back to an NSColor using NSColor(calibratedRed:green:blue:alpha:).
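For reference, here's a modernized sketch of the same raw-bytes idea in current Swift. It is only a sketch under assumptions: it presumes 8-bit RGBA samples, so check bitmapFormat, samplesPerPixel, and bytesPerRow for your actual image rep, and it skips the left-edge set from the question.

import AppKit

func countColors(in imageRep: NSBitmapImageRep) -> [UInt32: Int] {
    guard let bytes = imageRep.bitmapData else { return [:] }
    let width = imageRep.pixelsWide
    let height = imageRep.pixelsHigh
    let samplesPerPixel = imageRep.samplesPerPixel  // expected to be 4 (RGBA)
    let bytesPerRow = imageRep.bytesPerRow          // may include row padding
    var counts = [UInt32: Int](minimumCapacity: 1024)
    for y in 0..<height {
        let row = bytes + y * bytesPerRow
        for x in 0..<width {
            let p = row + x * samplesPerPixel
            // Pack R, G, B, A into a single UInt32 key
            let color = (UInt32(p[0]) << 24) | (UInt32(p[1]) << 16) | (UInt32(p[2]) << 8) | UInt32(p[3])
            counts[color, default: 0] += 1
        }
    }
    return counts
}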

Determine whether a CLLocationCoordinate2D is within a defined region (bounds)?

I am trying to find a simple method to determine whether a CLLocationCoordinate2D lies within the boundaries of an arbitrary shape defined by a series of other CLLocationCoordinate2D's. The shapes may be large enough that great-circle paths need to be considered.
Core Location used to have a circular region and the containsCoordinate: call to test against, but this has been deprecated in iOS 7 and the docs do not contain a hint of what might replace it. I cannot find any other examples, notably one that works on polygons.
There are many similar questions here on SO, but they are not related to iOS specifically, and again, I can't seem to find one that works generally on great-circle polys.
Here's an example (using Algonquin Provincial Park) of an approach that may work for you.
To use CGPathContainsPoint for this purpose, an MKMapView is not required.
Nor is it necessary to create an MKPolygon or even to use the CLLocationCoordinate2D or MKMapPoint structs. They just make the code easier to understand.
The screenshot below was created from the data only for illustration purposes.
int numberOfCoordinates = 10;

//This example draws a crude polygon with 10 coordinates
//around Algonquin Provincial Park. Use as many coordinates
//as you like to achieve the accuracy you require.
CLLocationCoordinate2D algonquinParkCoordinates[numberOfCoordinates];
algonquinParkCoordinates[0] = CLLocationCoordinate2DMake(46.105, -79.4);
algonquinParkCoordinates[1] = CLLocationCoordinate2DMake(46.15487, -78.80759);
algonquinParkCoordinates[2] = CLLocationCoordinate2DMake(46.16629, -78.12095);
algonquinParkCoordinates[3] = CLLocationCoordinate2DMake(46.11964, -77.70896);
algonquinParkCoordinates[4] = CLLocationCoordinate2DMake(45.74140, -77.45627);
algonquinParkCoordinates[5] = CLLocationCoordinate2DMake(45.52630, -78.22532);
algonquinParkCoordinates[6] = CLLocationCoordinate2DMake(45.18662, -78.06601);
algonquinParkCoordinates[7] = CLLocationCoordinate2DMake(45.11689, -78.29123);
algonquinParkCoordinates[8] = CLLocationCoordinate2DMake(45.42230, -78.69773);
algonquinParkCoordinates[9] = CLLocationCoordinate2DMake(45.35672, -78.90647);

//Create CGPath from the above coordinates...
CGMutablePathRef mpr = CGPathCreateMutable();
for (int p = 0; p < numberOfCoordinates; p++)
{
    CLLocationCoordinate2D c = algonquinParkCoordinates[p];
    if (p == 0)
        CGPathMoveToPoint(mpr, NULL, c.longitude, c.latitude);
    else
        CGPathAddLineToPoint(mpr, NULL, c.longitude, c.latitude);
}

//Set up some test coordinates and test them...
int numberOfTests = 7;
CLLocationCoordinate2D testCoordinates[numberOfTests];
testCoordinates[0] = CLLocationCoordinate2DMake(45.5, -78.5);
testCoordinates[1] = CLLocationCoordinate2DMake(45.3, -79.1);
testCoordinates[2] = CLLocationCoordinate2DMake(45.1, -77.9);
testCoordinates[3] = CLLocationCoordinate2DMake(47.3, -79.6);
testCoordinates[4] = CLLocationCoordinate2DMake(45.5, -78.7);
testCoordinates[5] = CLLocationCoordinate2DMake(46.8, -78.4);
testCoordinates[6] = CLLocationCoordinate2DMake(46.1, -78.2);

for (int t = 0; t < numberOfTests; t++)
{
    CGPoint testCGPoint = CGPointMake(testCoordinates[t].longitude, testCoordinates[t].latitude);
    BOOL tcInPolygon = CGPathContainsPoint(mpr, NULL, testCGPoint, FALSE);
    NSLog(@"tc[%d] (%f,%f) in polygon = %@",
          t,
          testCoordinates[t].latitude,
          testCoordinates[t].longitude,
          (tcInPolygon ? @"Yes" : @"No"));
}

CGPathRelease(mpr);
Here are the results of the above test:
tc[0] (45.500000,-78.500000) in polygon = Yes
tc[1] (45.300000,-79.100000) in polygon = No
tc[2] (45.100000,-77.900000) in polygon = No
tc[3] (47.300000,-79.600000) in polygon = No
tc[4] (45.500000,-78.700000) in polygon = Yes
tc[5] (46.800000,-78.400000) in polygon = No
tc[6] (46.100000,-78.200000) in polygon = Yes
This screenshot is to illustrate the data only (actual MKMapView is not required to run the code above):
Anna's solution converted to Swift 3.0:
extension CLLocationCoordinate2D {
    func contained(by vertices: [CLLocationCoordinate2D]) -> Bool {
        let path = CGMutablePath()
        for vertex in vertices {
            if path.isEmpty {
                path.move(to: CGPoint(x: vertex.longitude, y: vertex.latitude))
            } else {
                path.addLine(to: CGPoint(x: vertex.longitude, y: vertex.latitude))
            }
        }
        let point = CGPoint(x: self.longitude, y: self.latitude)
        return path.contains(point)
    }
}
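For example, reusing the Algonquin Park vertices and two of the test points from the Objective-C answer above, usage of the extension looks like this (expected results match the log output shown earlier):

let algonquinPark: [CLLocationCoordinate2D] = [
    CLLocationCoordinate2D(latitude: 46.105,   longitude: -79.4),
    CLLocationCoordinate2D(latitude: 46.15487, longitude: -78.80759),
    CLLocationCoordinate2D(latitude: 46.16629, longitude: -78.12095),
    CLLocationCoordinate2D(latitude: 46.11964, longitude: -77.70896),
    CLLocationCoordinate2D(latitude: 45.74140, longitude: -77.45627),
    CLLocationCoordinate2D(latitude: 45.52630, longitude: -78.22532),
    CLLocationCoordinate2D(latitude: 45.18662, longitude: -78.06601),
    CLLocationCoordinate2D(latitude: 45.11689, longitude: -78.29123),
    CLLocationCoordinate2D(latitude: 45.42230, longitude: -78.69773),
    CLLocationCoordinate2D(latitude: 45.35672, longitude: -78.90647)
]

let inside = CLLocationCoordinate2D(latitude: 45.5, longitude: -78.5)
let outside = CLLocationCoordinate2D(latitude: 45.3, longitude: -79.1)
print(inside.contained(by: algonquinPark))   // true  (tc[0] above)
print(outside.contained(by: algonquinPark))  // false (tc[1] above)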

Making sense of a list of GPS values in an iOS application

I have a web service that interfaces with the google maps API to generate a polygon on a google map. The service takes the GPS values and stores them for retrieval.
The problem is that when I try to use these values in my iPhone app, the MKPolyline is either just a mess or a bunch of zig-zag lines.
Is there a way to make sense of these values so I can reconstruct the polygon?
My current code looks like this
private void GenerateMap()
{
    var latCoord = new List<double>();
    var longCoord = new List<double>();
    var pad = AppDelegate.Self.db.GetPaddockFromCrop(crop);

    mapMapView.MapType = MKMapType.Standard;
    mapMapView.ZoomEnabled = true;
    mapMapView.ScrollEnabled = false;

    mapMapView.OverlayRenderer = (m, o) =>
    {
        if (o.GetType() == typeof(MKPolyline))
        {
            var p = new MKPolylineRenderer((MKPolyline)o);
            p.LineWidth = 2.0f;
            p.StrokeColor = UIColor.Green;
            return p;
        }
        else
            return null;
    };

    scMapType.ValueChanged += (s, e) =>
    {
        switch (scMapType.SelectedSegment)
        {
            case 0:
                mapMapView.MapType = MKMapType.Standard;
                break;
            case 1:
                mapMapView.MapType = MKMapType.Satellite;
                break;
            case 2:
                mapMapView.MapType = MKMapType.Hybrid;
                break;
        }
    };

    if (pad.Boundaries != null)
    {
        var bounds = pad.Boundaries.OrderBy(t => t.latitude).ThenBy(t => t.longitude).ToList();
        foreach (var l in bounds)
        {
            double lat = l.latitude;
            double lon = l.longitude;
            latCoord.Add(lat);
            longCoord.Add(lon);
        }

        if (latCoord.Count > 0)
        {
            var coord = new List<CLLocationCoordinate2D>();
            for (int i = 0; i < latCoord.Count; ++i)
            {
                var c = new CLLocationCoordinate2D();
                c.Latitude = latCoord[i];
                c.Longitude = longCoord[i];
                coord.Add(c);
            }
            var line = MKPolyline.FromCoordinates(coord.ToArray());
            mapMapView.AddOverlay(line);
            mapMapView.SetVisibleMapRect(line.BoundingMapRect, true);
        }
    }
}
MKPolygon / MKPolygonRenderer gives the same sort of random line mess. The OrderBy LINQ makes no difference other than to make the random lines a zig-zag going up or down the view.
Since you don't know the order the points were captured in, you can't trace the actual path traveled around the perimeter of the paddock; this is why your polylines are turning into silly-walks all over the map. Lacking that information, you can at best make an educated guess.
Some possible heuristics you might want to try:
Take the average of all the points to get a "somewhere in the middle" point, then order by atan2(l.latitude - middle.latitude, l.longitude - middle.longitude). (Be careful, atan2 is undefined at (0, 0)!)
Take the convex hull of the points captured: for a relatively small number of points you can get away with the simple quadratic time Jarvis's march. This has the approximate effect of wrapping a notional rubber band around the outside of the map push-pins by discarding points that would form concavities, and should also give you the order of the remaining points.
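To illustrate the first heuristic, here's a minimal sketch in Swift (the question's code is Xamarin C#, but the idea carries over directly). It assumes the boundary is roughly star-shaped around its centroid, so angular ordering reproduces the perimeter; the function name is just for illustration:

import Foundation
import CoreLocation

func orderedAroundCentroid(_ points: [CLLocationCoordinate2D]) -> [CLLocationCoordinate2D] {
    guard points.count > 2 else { return points }
    // "Somewhere in the middle" point: the plain average of the coordinates.
    let midLat = points.map { $0.latitude }.reduce(0, +) / Double(points.count)
    let midLon = points.map { $0.longitude }.reduce(0, +) / Double(points.count)
    // Sort by angle around the centroid. A point exactly at the centroid
    // hits the atan2(0, 0) degenerate case mentioned above.
    return points.sorted {
        atan2($0.latitude - midLat, $0.longitude - midLon) <
        atan2($1.latitude - midLat, $1.longitude - midLon)
    }
}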