I am trying to make an app which uses both the GPS and the magnetometer to find the direction to Mecca (the Masjid al-Haram mosque). It has some special features like a date picker for upcoming prayers, the time remaining until the next prayer calculated from the current location's time zone, weather at the current location, and some more. If anyone has sample code related to this, please reply.
Thanks in advance.
The magnetometer can be asked to return just the current device heading, via CLLocationManager. It can also return unfiltered 3D magnetometer data, but there's no reason to use that here; from the CLHeading object, just use the trueHeading property. That will give you the same information shown in the Compass app.
To work out the heading to Mecca from where you are, you can use the formulas given here. Google Maps gives me a geolocation of about 21.436, 39.832 for Masjid al-Haram (I'm not sure it's terribly accurate, so I've rounded to a deliberately low precision), so you could get the bearing from whatever location CLLocationManager reports with something like:
#define toRadians(x) ((x)*M_PI / 180.0)
#define toDegrees(x) ((x)*180.0 / M_PI)
...
double y = sin(toRadians(39.832 - currentLocation.longitude)) * cos(toRadians(21.436));
double x = cos(toRadians(currentLocation.latitude)) * sin(toRadians(21.436)) -
           sin(toRadians(currentLocation.latitude)) * cos(toRadians(21.436)) * cos(toRadians(39.832 - currentLocation.longitude));
double bearing = toDegrees(atan2(y, x)); // can be negative; use fmod(bearing + 360.0, 360.0) for a 0-360 compass bearing
You can then rotate a pointer on screen by the difference between the device's heading and the one you've just calculated. Probably easiest to use a CGAffineTransform on a UIView's transform property.
That's all typed as I answer by the way, not tested. I'd check it against a reliable source before you depend on it.
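If you want to sanity-check the math independently of the iOS code, the same initial-bearing formula written as a standalone Java sketch (not iOS code; same rounded Kaaba coordinates as above, result normalized to a 0-360 compass bearing) would be roughly:

static double bearingToMecca(double latDeg, double lonDeg) {
    // same rounded target coordinates as above
    double targetLat = Math.toRadians(21.436);
    double targetLon = Math.toRadians(39.832);
    double lat = Math.toRadians(latDeg);
    double lon = Math.toRadians(lonDeg);

    double y = Math.sin(targetLon - lon) * Math.cos(targetLat);
    double x = Math.cos(lat) * Math.sin(targetLat)
             - Math.sin(lat) * Math.cos(targetLat) * Math.cos(targetLon - lon);

    double bearing = Math.toDegrees(Math.atan2(y, x));
    return (bearing + 360.0) % 360.0;   // normalize to [0, 360)
}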
The simple equations for estimating the user's location with the built-in inertial measurement unit (IMU), a technique also called pedestrian dead reckoning (PDR), are:
x = x_previous + step_length * sin(heading)
y = y_previous + step_length * cos(heading)
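In code, each detected step just updates the previous position like this (a rough Java-style sketch; stepLength and heading come from the step detector and heading estimate discussed below):

static double[] pdrStep(double x, double y, double stepLength, double heading) {
    // heading measured clockwise from north, in radians; stepLength in metres
    double newX = x + stepLength * Math.sin(heading);   // east-west component
    double newY = y + stepLength * Math.cos(heading);   // north-south component
    return new double[] { newX, newY };
}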
We can use an instance of CMMotionManager to access raw values from the accelerometer, gyroscope, and magnetometer. We can also get attitude values as roll, pitch, and yaw. The step length can be estimated as the double square root (i.e. fourth root) of the acceleration. However, I'm confused about the heading direction. Some of the published literature uses a combination of magnetometer and gyroscope data to estimate the heading direction. I can see that CLHeading also gives heading information. There are some online tutorials, especially for the Android platform, like this one, on estimating user location. However, they don't give a proper mathematical explanation.
I've followed many online resources like this, this, this, and this to make a PDR app. My app can detect the steps and gives the step length properly; however, its output is full of errors. I think the error is due to the lack of a proper heading direction. I've used the following relation to get the heading direction from the magnetometer:
magnetometerHeading = atan2(-self.motionManager.magnetometerData.magneticField.y, self.motionManager.magnetometerData.magneticField.x); // note: atan2 returns radians
Similarly, from the gyroscope:
gyroscopeHeading += -self.motionManager.gyroData.rotationRate.z * 180 / M_PI; // degrees
Finally, I give proportional weights to the previous heading direction, gyroscopeHeading, and magnetometerHeading as follows:
headingDirection = (2 * headingDirection / 5) + (magnetometerHeading / 5) + (2 * gyroscopeHeading / 5);
I followed this method from a published journal paper. However, I'm getting a lot of error in my results. Is my approach wrong? What exactly should I do to get a proper heading direction so that the localization error is as small as possible?
Any help would be appreciated.
Thank you.
EDIT
I noticed that while calculating the heading direction from the gyroscope data, I didn't multiply the rotation rate (which is in radians/sec) by the delta time. To fix this, I added the following code:
[_motionManager startDeviceMotionUpdates]; // start updates before reading deviceMotion
CMDeviceMotion *motion = self.motionManager.deviceMotion;
if (!previousTime)
    previousTime = motion.timestamp;
double deltaTime = motion.timestamp - previousTime;
previousTime = motion.timestamp;
Then I updated the gyroscope heading with:
gyroscopeHeading += -self.motionManager.gyroData.rotationRate.z * deltaTime * 180 / M_PI;
The localization result is still not close to the real location. Is my approach correct?
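To make clear what I am trying to implement: the kind of consistent-units fusion I mean would look roughly like this rough Java-style sketch (alpha is just a placeholder weight, not the value from the paper; everything is kept in degrees):

static double fuseHeading(double previousHeading, double gyroRateZ, double magHeading, double dt) {
    final double alpha = 0.9;  // placeholder weight: how much to trust the integrated gyro
    // integrate the gyro rate (rad/s) over dt and convert to degrees,
    // using the same sign convention as the code above
    double gyroHeading = previousHeading - Math.toDegrees(gyroRateZ) * dt;
    // blend with the magnetometer heading, which must also be in degrees here;
    // note: naive blending misbehaves near the 0/360 wrap-around
    double heading = alpha * gyroHeading + (1.0 - alpha) * magHeading;
    return (heading + 360.0) % 360.0;
}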
I have to zoom a PDF file that is inside a ScrollPane.
The ScrollPane itself is inside a StackPane.
In the beginning I scale my PDF to fit the width of my ScrollPane. As a result, the PDF's height doesn't fit the ScrollPane's height.
I already managed to zoom by changing my scaleFactor on mouse-wheel events. Unfortunately, I can't zoom in on a specific point.
I guess I have to change the ScrollPane's values depending on the mouse coordinates, but I just can't find the correct calculation. Can somebody please help me?
For example I tried
scrollPane.setVvalue(e.getY() / scrollPane.getHeight())
With this line of code my view just jumps up or down, depending on whether I click on the upper bound or the lower bound of my viewport.
I also understand that it has to behave like that, but I can't figure out what has to be added or changed.
I use JPedal to display my PDF.
I hope you understand what I am looking for.
Tell me if you need more information.
Edit:
Here is a snippet of how I managed to implement dragging:
eventRegion.addEventFilter(MouseEvent.MOUSE_PRESSED, e -> {
    dragStartX = e.getX();
    dragStartY = e.getY();
});
eventRegion.addEventFilter(MouseEvent.MOUSE_DRAGGED, e -> {
    double deltaX = dragStartX - e.getX();
    double deltaY = dragStartY - e.getY();
    scrollPane.setHvalue(Math.min(scrollPane.getHvalue() + deltaX / scrollPane.getWidth(), scrollPane.getHmax()));
    scrollPane.setVvalue(Math.min(scrollPane.getVvalue() + deltaY / scrollPane.getHeight(), scrollPane.getVmax()));
    e.consume();
});
I think zooming to the mouse position could be done in a similar way, by just setting the Hvalue and Vvalue.
Any ideas how I can calculate these values?
This example has JavaFX 8 code for a zoomable, pannable ScrollPane with zoom to the mouse pointer, reset zoom, and fit-to-width of a rectangle, which can really be any Node. Be sure to check out the answer to the question to get fitWidth() to work correctly. I am using this solution for an ImageView now, and it is slick.
Just for all related questions about "zooming where the mouse is":
I had the same problem and came up with the following code snippet.
public void setZoom(final double x, final double y, final double factor) {
    // save the point before scaling
    final Point2D sceneToLocalPointBefore = this.sceneToLocal(x, y);

    // do the scale
    this.setScaleX(factor);
    this.setScaleY(factor);

    // save the point after scaling
    final Point2D sceneToLocalPointAfter = this.sceneToLocal(x, y);

    // calculate the difference between the point before and after the scale
    final Point2D diffMousePoint = sceneToLocalPointBefore.subtract(sceneToLocalPointAfter);

    // translate the pane so that the point under the mouse stays put
    this.setTranslateX(this.getTranslateX() - diffMousePoint.getX() * this.getScaleX());
    this.setTranslateY(this.getTranslateY() - diffMousePoint.getY() * this.getScaleY());
}
The basic idea is to move the underlying Pane back to where the mouse point was before scaling. The important part is that we convert the mouse position into the local coordinate system of the Pane. After scaling we do the same conversion again and calculate the difference. Once we know the difference, we can move the Pane back. I think this solution is very simple and straightforward.
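For example, you could hook it up to scroll events roughly like this (a sketch: zoomPane stands for the Pane subclass that contains setZoom, and accumulating the factor multiplicatively is just one option):

zoomPane.setOnScroll(e -> {
    double step = e.getDeltaY() > 0 ? 1.1 : 1 / 1.1;     // zoom in or out per scroll notch
    double newFactor = zoomPane.getScaleX() * step;       // accumulate the absolute scale factor
    zoomPane.setZoom(e.getSceneX(), e.getSceneY(), newFactor);
    e.consume();
});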
My setup in JavaFX is the following: I have a javafx.scene.layout.BorderPane as the root of my javafx.scene.Scene. In the center I put a Pane. This is the Pane I act on (i.e. put other Nodes in, zoom, move, etc.). If anyone is interested in how I actually did it, just mail me.
Good programming!
Good day! I just want to ask what value I need to use in my code, and in the condition, in order to detect a user's normal speaking voice, so that once a voice is detected I start recording automatically and stop the recording when it is silent (i.e. the recorder no longer detects a voice). This is my code; I adapted it from code that detects when a user blows into the mic.
- (void)levelTimerCallback:(NSTimer *)timer {
    [recorder updateMeters];

    const double ALPHA = 0.05;
    double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
    lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;

    [recorder record];
    if (lowPassResults < 0.95) {
        NSLog(@"Recording");
        [recorder record];
    }
}
I'm new to Objective-C; any help would be very helpful to me. Thanks in advance.
There is no set level you can use to detect the volume of normal speech. Leaving aside issues of background noise and so on, there is no standard translation between audio levels as numbers in a computer and sound levels in the air.
Think about it: what are the input levels? What type of mic is it? How far away is the user? You don't know any of these things, so there's no way to know the answer.
You might want to think about looking for a relative change in volume rather than an absolute level (although this is iffy as well), or a different user experience entirely.
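If you do try the relative-change route, the basic shape is a slowly adapting baseline plus a threshold on how far the current level rises above it. A rough sketch (Java here just to show the logic; feed it your smoothed meter value each timer tick, and expect to tune the constants per device and mic):

class SpeechGate {
    private double baseline = -1.0;       // learned background level
    private boolean speaking = false;

    boolean update(double level) {        // call once per timer tick with the smoothed level
        if (baseline < 0) baseline = level;              // first sample seeds the baseline
        if (!speaking) {
            // adapt the baseline only while silent, so speech doesn't drag it up
            baseline = 0.95 * baseline + 0.05 * level;
            if (level > baseline * 2.0) speaking = true; // clear jump above background: start recording
        } else if (level < baseline * 1.2) {
            speaking = false;                            // dropped back toward background: stop recording
        }
        return speaking;
    }
}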
I'm trying to measure the distance between two points (longitude, latitude). My problem is that I get different results on iOS than on Android.
I've checked it with this site, and the result was that the Android values are correct.
I'm using this Core Location method to get the distance on iOS: distanceFromLocation: (on CLLocation).
Here are my test locations:
P1: 48.643798, 9.453735
P2: 49.495150, 9.782150
Distance iOS: 97717 m
Distance Android: 97673 m
How is this possible and how can I fix this?
So I was having a different issue and stumbled upon the answer to both of our questions:
On iOS you can do the following:
CLLocationDistance meters1 = [P1 distanceFromLocation:P2];
// meters1 is 97,717
CLLocationDistance meters2 = [P2 distanceFromLocation:P1];
// meters2 is 97,630
I've searched and searched but haven't been able to find a reason for the difference. Since they are the exact same points, it should show the same distance no matter which way you are traveling. I submitted it to Apple as a bug and they closed it as a duplicate but have still not fixed it. I would suggest to anyone who wants this to be fixed to also submit it as a bug.
In the meantime, the average of the two is actually the correct value:
meters = (meters1 + meters2)/2
// meters (the average of the first two) is 97,673
Apparently Android does not have this problem.
The longitude and latitude are not all that you need. You also have to use the same reference model, like WGS84 or ETRS89.
The Earth is not an exact ellipsoid, so you need a model; none of the models is entirely exact, and depending on which model you use, distances come out somewhat different.
Please make sure you use the same reference for iOS and Android.
There is more than one way to calculate the distance between long/lat coordinates, based on how you compensate for the curvature of the Earth, and there's no right or wrong approach. Most likely the two platforms use slightly different models.
Here are some formulae for calculating it yourself: http://www.movable-type.co.uk/scripts/latlong.html
If you absolutely need them to be the same, just implement your own calculation using one of these formulae; then you can ensure you get the same result on both platforms.
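For example, a minimal haversine implementation (shown in Java; it assumes a spherical Earth with a 6,371 km mean radius, so the absolute distances will differ slightly from ellipsoidal results, but both platforms will agree exactly):

static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
    final double R = 6371000.0;                      // mean Earth radius in metres
    double phi1 = Math.toRadians(lat1);
    double phi2 = Math.toRadians(lat2);
    double dPhi = Math.toRadians(lat2 - lat1);
    double dLambda = Math.toRadians(lon2 - lon1);
    double a = Math.sin(dPhi / 2) * Math.sin(dPhi / 2)
             + Math.cos(phi1) * Math.cos(phi2) * Math.sin(dLambda / 2) * Math.sin(dLambda / 2);
    double c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    return R * c;                                    // distance in metres
}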
How would I go about this one? I want to tween a value from one value to another in x time, while also taking into account that it'd be nice to have an 'ease' at the start and end.
I know, I shouldn't ask really, but I've tried myself, and I'm stuck.
Please assume that to cause a delay, you need to call function wait(time).
One simple approach that might work for you is to interpolate along the unit circle:
To do this, you simply evaluate points along the circle, which ensures a fairly smooth movement, and ease-in as well as ease-out. You can control the speed of the interpolation by changing how quickly you alter the angle.
Assuming you're doing 1-dimensional interpolation (i.e. a simple scalar interpolation, like from 3.5 to 6.9 or whatever), it might be handy to use angles from -π/2 to π/2. The corresponding Y-values are given by the sine function; all you need to do is apply suitable scaling:
angle = -math.pi / 2          -- start of the ease-in
start = 3.5
finish = 6.9                  -- 'end' is a reserved word in Lua, so use another name
radius = (finish - start) / 2
value = start + radius + radius * math.sin(angle)   -- sweep angle up to math.pi / 2 over time
I'm not 100% sure if this is legal Lua, didn't test it. If not, it's probably trivial to convert.
You might look at the Tweener ActionScript library for inspiration.
For instance, you could borrow the necessary equations from here.
If you need further help, please ask.