CLGeocoder geocodeAddressString result accuracy - MapKit

I'm currently working on an app where the user inputs an address, which is then converted into coordinates. A database of locations is then queried, and locations within, say, 5 km of the search location are returned.
The problem I'm having is the accuracy of the result returned by geocodeAddressString. When searching for Auckland, New Zealand, I'm getting back -36.90000, 174.70000, which is about 10 km off the correct result. It's a few suburbs over.
Is there any way to improve on this? The Google Maps result is -36.848479, 174.763373, which as you can see is much sharper and closer to what I'm after.
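For reference, here is roughly how I'm making the call (a minimal sketch; the surrounding setup is simplified and the variable names are my own):
// Minimal sketch of forward geocoding with CLGeocoder.
// The handler receives zero or more CLPlacemark results; I compare
// the first placemark's location against Google's result.
CLGeocoder *geocoder = [[CLGeocoder alloc] init];
[geocoder geocodeAddressString:@"Auckland, New Zealand"
             completionHandler:^(NSArray *placemarks, NSError *error) {
    if (error || placemarks.count == 0) {
        NSLog(@"Geocoding failed: %@", error);
        return;
    }
    CLLocation *location = [placemarks[0] location];
    NSLog(@"lat: %f, lon: %f",
          location.coordinate.latitude,
          location.coordinate.longitude);
}];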
Thanks!

Related

Weather API (OpenWeatherMap): no wind direction (deg)

Is there anyone using the OpenWeatherMap API? Just a quick question: I'm using their current conditions API, but it doesn't return a wind direction (deg) property at this lat/long (14.6760, 121.0437), though it does when I input a different location. Does this mean the wind is going north ("N") if I don't receive a deg property from OpenWeatherMap?
Edit: I just noticed it's not just at those coordinates. If the wind is only at 1.5 mph, it doesn't have a direction. Does this mean it's automatically north?
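In the meantime I'm reading the field defensively instead of assuming a default. A minimal sketch (assuming the standard JSON shape of the /weather response; responseData is the raw NSData from my HTTP request, and treating a missing deg as "unknown" is my own choice, not the API's):
// Sketch: treat a missing wind.deg as "direction unknown" rather than north.
NSDictionary *json = [NSJSONSerialization JSONObjectWithData:responseData
                                                     options:0
                                                       error:NULL];
NSDictionary *wind = json[@"wind"];
NSNumber *speed = wind[@"speed"];
NSNumber *deg   = wind[@"deg"];   // can be absent at very low wind speeds
if (deg != nil) {
    NSLog(@"Wind: %.1f at %.0f deg", speed.doubleValue, deg.doubleValue);
} else {
    NSLog(@"Wind: %.1f, direction not reported", speed.doubleValue);
}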

How to get user location using the accelerometer, gyroscope, and magnetometer on iPhone?

The simple equations for user location using the built-in inertial measurement unit (IMU), an approach also called pedestrian dead reckoning (PDR), are given as:
x = x(previous) + step_length * sin(heading_direction)
y = y(previous) + step_length * cos(heading_direction)
We can use an instance of the CMMotionManager class to access raw values from the accelerometer, gyroscope, and magnetometer. We can also get attitude values as roll, pitch, and yaw. The step length can be calculated as the double square root (i.e., the fourth root) of the acceleration. However, I'm confused about the heading direction. Some of the published literature uses a combination of magnetometer and gyroscope data to estimate the heading direction. I can see that CLHeading also gives heading information. There are some online tutorials, especially for the Android platform, like this one, that estimate user location, but they don't give any proper mathematical explanation.
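To make the update step concrete, here is a minimal sketch of the per-step position update I use (the variable names are my own; stepLength comes from my step detector, and headingDirection is in degrees):
// One PDR update per detected step. Heading is measured clockwise
// from north, so sin() advances x (east) and cos() advances y (north).
double headingRadians = headingDirection * M_PI / 180.0;
x += stepLength * sin(headingRadians);
y += stepLength * cos(headingRadians);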
I've followed many online resources like this, this, this, and this to make a PDR app. My app can detect the steps and gives the step length properly; however, its output is full of errors. I think the error is due to the lack of a proper heading direction. I've used the following relation to get the heading direction from the magnetometer:
magnetometerHeading = atan2(-self.motionManager.magnetometerData.magneticField.y, self.motionManager.magnetometerData.magneticField.x);
Similarly, from the gyroscope:
gyroscopeHeading += -self.motionManager.gyroData.rotationRate.z*180/M_PI;
Finally, I give proportional weights to the previous heading direction, gyroscopeHeading, and magnetometerHeading, as follows:
headingDirection = (2*headingDirection/5)+(magnetometerHeading/5)+(2*gyroscopeHeading/5);
I followed this method from a published journal paper. However, I'm getting a lot of error in my results. Is my approach wrong? What exactly should I do to get a proper heading direction so that the localization error is as small as possible?
Any help would be appreciated.
Thank you.
EDIT
I noticed that while calculating the heading direction from the gyroscope data, I didn't multiply the rotation rate (which is in radians/sec) by the delta time. To fix this, I added the following code:
// Start device-motion updates once, before reading deviceMotion.
[_motionManager startDeviceMotionUpdates];
CMDeviceMotion *motion = self.motionManager.deviceMotion;
if (!previousTime)
    previousTime = motion.timestamp;
double deltaTime = motion.timestamp - previousTime;
previousTime = motion.timestamp;
Then I updated the gyroscope heading with:
gyroscopeHeading += -self.motionManager.gyroData.rotationRate.z*deltaTime*180/M_PI;
The localization result is still not close to the real location. Is my approach correct?
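For completeness, here is a minimal, self-contained sketch of the integration loop using a device-motion handler (the queue setup and 50 Hz interval are my own assumptions; previousTime and gyroscopeHeading are instance variables):
// Sketch: integrate the yaw rate into a heading estimate on each update.
self.motionManager.deviceMotionUpdateInterval = 0.02; // 50 Hz
[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error) {
    if (error || !motion) return;
    if (!self->previousTime) self->previousTime = motion.timestamp;
    double deltaTime = motion.timestamp - self->previousTime;
    self->previousTime = motion.timestamp;
    // rotationRate.z is in rad/s; integrate, then convert to degrees.
    self->gyroscopeHeading += -motion.rotationRate.z * deltaTime * 180.0 / M_PI;
}];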

How to get reliable U.S. state responses by reverse geocoding?

Google sometimes returns the incorrect U.S. state when reverse geocoding a lat/long. Presumably this is because Google is trying to return the nearest street address, which in some cases is not in the same state as the lat/long you are trying to reverse geocode.
Though it may not be a common scenario in practice, it's pretty easy to reproduce by playing around with a map: http://gmaps-samples.googlecode.com/svn/trunk/geocoder/reverse.html
For my application, I am less concerned about getting the nearest address and more concerned about always getting the correct U.S. state for a lat/long. Is there a way to achieve this with Google's API?
Thank you
Iterate over all results and pick the one with "administrative_area_level_1" in results[i].types
This is better than taking the "equivalent" address component from the first result, i.e. finding "administrative_area_level_1" in results[0].address_components[j].types
When reverse geocoding snaps your latlng to a nearest address that happens to be in a different state (or country), the state/country address component of the first result will be that of the snapped address, while a subsequent result will carry the state/country of the input latlng itself.
Example: 42.834185,-0.302811 is in Spain, but snaps to an address in France.
https://google-developers.appspot.com/maps/documentation/utils/geocoder/#q%3D42.834185%252C-0.302811
results[0].address_components[3].types = ["administrative_area_level_1", "political"]
results[0].address_components[3].short_name = "FR"
results[6].types = ["administrative_area_level_1", "political"]
results[6].short_name = "ES"
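As a sketch of that lookup (here in Objective-C against the Geocoding web service's JSON, to match the rest of this thread; results is the parsed "results" array from the response):
// Pick the result whose top-level types contain administrative_area_level_1,
// instead of digging into results[0].address_components.
NSString *stateShortName = nil;
for (NSDictionary *result in results) {
    if ([result[@"types"] containsObject:@"administrative_area_level_1"]) {
        NSDictionary *component = [result[@"address_components"] firstObject];
        stateShortName = component[@"short_name"];
        break;
    }
}
// For 42.834185,-0.302811 this yields "ES" rather than the "FR" in results[0].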

Different distance between two points on iOS and Android

I'm trying to measure the distance between two points (longitude, latitude). My problem is that I get different results on iOS than on Android.
I've checked it with this site and the result was that the Android values are correct.
I'm using this Core Location method to get the distance on iOS: distanceFromLocation:
Here are my test locations:
P1: 48.643798, 9.453735
P2: 49.495150, 9.782150
Distance iOS: 97717 m
Distance Android: 97673 m
How is this possible and how can I fix this?
So I was having a different issue and stumbled upon the answer to both of our questions:
On iOS you can do the following:
meters1 = [P1 distanceFromLocation:P2]
// meters1 is 97,717
meters2 = [P2 distanceFromLocation:P1]
// meters2 is 97,630
I've searched and searched but haven't been able to find a reason for the difference. Since they are the exact same points, the distance should be the same no matter which way you are traveling. I submitted it to Apple as a bug, and they closed it as a duplicate but still have not fixed it. I would suggest that anyone who wants this fixed also submit it as a bug.
In the meantime, the average of the two is actually the correct value:
meters = (meters1 + meters2)/2
// meters (the average of the first two) is 97,673
Apparently Android does not have this problem.
The longitude and latitude are not all you need. You also have to use the same reference model, such as WGS84 or ETRS89.
The earth is not an exact ellipsoid, so these models are used instead; none of them is entirely exact, and depending on which model you use, distances come out somewhat different.
Please make sure you use the same reference for iOS and Android.
There is more than one way to calculate distance between long/lat coords based on how you compensate for the curvature of the earth, and there's no right or wrong approach. Most likely the two platforms use a slightly different model.
Here are some formulae for calculating it yourself. http://www.movable-type.co.uk/scripts/latlong.html
If you absolutely need them to be the same, just implement your own calculation using one of these formulae, then you can ensure you get the same result on both platforms.
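For example, here is a sketch of the haversine formula from that page in Objective-C/C (a spherical-earth approximation using the 6371 km mean radius; identical code on both platforms will agree exactly):
// Haversine great-circle distance in meters on a sphere of radius 6371 km.
static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
    double toRad = M_PI / 180.0;
    double dLat = (lat2 - lat1) * toRad;
    double dLon = (lon2 - lon1) * toRad;
    double a = sin(dLat / 2) * sin(dLat / 2)
             + cos(lat1 * toRad) * cos(lat2 * toRad)
             * sin(dLon / 2) * sin(dLon / 2);
    return 6371000.0 * 2 * atan2(sqrt(a), sqrt(1 - a));
}
// distanceMeters(48.643798, 9.453735, 49.495150, 9.782150) ≈ 97.5 km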

How to analyze the min/max loc returned by OpenCV's cvMatchTemplate?

I am trying to detect objects in an image in an iPhone app.
I am using the cvMatchTemplate function, and I manage to see some patterns in the result it returns (I chose CV_TM_CCOEFF_NORMED).
Positive Results (result image is 163x371):
http://encryptedpixel.files.wordpress.com/2011/07/photo-13-7-11-11-52-19-am.jpeg
cvMinMaxLoc returns: min (102,244) max(11,210)
The min point makes some sense here: the position of the dark spot really is (102,244) in the 163x371 result image.
Negative Results:
cvMinMaxLoc returns: min (114,370) max(0,0)
This doesn't make sense: there is no match at all, so why is there still a min point at (114,370)?
I need to know how to analyze these results programmatically, so that I can say "Hey, I found the object!" in Objective-C for an iPhone app.
Thanks!
cvMinMaxLoc will always return the positions of the minimum and maximum values of its input. It only "doesn't make sense" in your particular application. You should check the value at the returned position and, for example, threshold it to decide whether it's a probable match for your template. A template match will yield a very low or a very high value, depending on the method you chose.
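For example, with CV_TM_CCOEFF_NORMED a strong match shows up as a maximum near 1.0, so the check might look like the sketch below (image, templ, and result are the IplImages you already have; the 0.8 threshold is an assumption you'd tune for your templates):
double minVal, maxVal;
CvPoint minLoc, maxLoc;
cvMatchTemplate(image, templ, result, CV_TM_CCOEFF_NORMED);
cvMinMaxLoc(result, &minVal, &maxVal, &minLoc, &maxLoc, NULL);
// CV_TM_CCOEFF_NORMED yields values in [-1, 1]; near 1 means a strong match.
if (maxVal > 0.8) {
    NSLog(@"Found the object at (%d, %d), score %.2f", maxLoc.x, maxLoc.y, maxVal);
} else {
    NSLog(@"No match (best score %.2f)", maxVal);
}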