I am using an LG Optimus 2X smartphone, which has gyroscope and accelerometer sensors, in an indoor tracking application based on pedestrian dead reckoning. I want to use the gyroscope to get the correct orientation of the phone, so I integrate the gyro data over time to obtain angles. But these angles are not very accurate. How can I get drift-free angles from the gyroscope?
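To make the drift problem concrete, here is a minimal Python sketch (not tied to any phone API; the rate, bias, and sample rate are made-up numbers) of integrating rate samples. Subtracting a bias estimated while the device sits still removes the constant part of the error, but random noise still accumulates:

```python
import numpy as np

def integrate_gyro(rates, dt, bias=0.0):
    """Integrate angular-rate samples (rad/s) into angles (rad).

    Subtracting a bias estimated while the device is stationary
    removes the constant part of the drift; random noise still
    accumulates, which is why pure integration degrades over time.
    """
    return np.cumsum((np.asarray(rates) - bias) * dt)

# Simulated: true rate 0.5 rad/s, constant sensor bias 0.02 rad/s, 100 Hz
dt = 0.01
true_rate, bias = 0.5, 0.02
samples = np.full(200, true_rate + bias)

raw = integrate_gyro(samples, dt)              # drifts by bias * t
corrected = integrate_gyro(samples, dt, bias=bias)
```

In practice the remaining noise-driven drift is why gyro angles are usually fused with an absolute reference (accelerometer for pitch/roll, magnetometer for yaw) rather than used alone.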
I am using the method described in this manuscript and it works like a charm in my application. It gives very accurate orientations.
I am curious. What method do you use for tracking the pedestrian? How do you use orientation?
The best pedometer algorithm I have found so far is this. It seems to me you have something better. Could you share it?
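Not the algorithm from the linked answer, but a common baseline for comparison is peak detection on the accelerometer magnitude. Here is a hypothetical Python sketch (the threshold and minimum step gap are made-up tuning values, not from any cited source):

```python
import numpy as np

def count_steps(accel_mag, fs, threshold=1.2, min_gap=0.3):
    """Count steps as peaks in accelerometer magnitude (in g).

    A sample is a step candidate if it exceeds `threshold` g and is a
    local maximum; candidates closer than `min_gap` seconds to the
    previous step are rejected to avoid double counting.
    """
    min_samples = int(min_gap * fs)
    steps, last = 0, -min_samples
    for i in range(1, len(accel_mag) - 1):
        if (accel_mag[i] > threshold
                and accel_mag[i] >= accel_mag[i - 1]
                and accel_mag[i] > accel_mag[i + 1]
                and i - last >= min_samples):
            steps += 1
            last = i
    return steps

# Synthetic walk: a 2-steps-per-second bounce on top of 1 g gravity,
# sampled at 50 Hz for 5 seconds
fs = 50
t = np.arange(0, 5, 1 / fs)
mag = 1.0 + 0.4 * np.maximum(0, np.sin(2 * np.pi * 2.0 * t))
steps = count_steps(mag, fs)
```

Real accelerometer data is far noisier than this synthetic trace, so a low-pass filter before peak detection usually helps.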
I am searching for a method to get one-meter accuracy between mobile devices, with no luck so far. I was looking at GPS and location-based geofencing as an option, but I don't think I am going to get one-meter accuracy that way. The other option is Bluetooth, but I am not sure about that either.
Can I build a React Native app that can tell whether two mobile devices with the app installed are within as little as one meter of each other, using GPS or any other device-specific sensors?
I am not sure about your exact requirement, but to answer your question: yes, you should be able to get a 1-meter displacement measurement between two geo-locations.
Maybe you can check how to achieve 1-meter accuracy in Android.
There is another way that does not need two devices: you can find the distance to any object using a phone with two cameras. If that fits your requirement, you can explore that area as well.
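For the two-camera idea, the underlying relation is plain stereo triangulation. Here is a small sketch (the focal length and baseline below are hypothetical example values, not those of any specific phone):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth for a stereo camera pair.

    Z = f * B / d, where f is the focal length in pixels, B the
    baseline between the two lenses in metres, and d the disparity
    (horizontal pixel shift of the same point between the two images).
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical phone: f = 1000 px, B = 1 cm.
# A 10 px disparity then corresponds to 1 m of depth.
z = depth_from_disparity(1000.0, 0.01, 10.0)
```

Note that depth error grows quadratically with distance for a fixed disparity error, so short-baseline phone cameras are only accurate at close range.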
Is it possible to perform quaternion/Euler angle calculations from only accelerometer and gyroscope readings?
I'd like to detect orientation for a small PCB that I designed and built around the InvenSense ICM-20689 (the SPI version of the popular MPU-6050/6000), but without a magnetometer. I can incorporate a magnetometer into the next revision, but I'd prefer not to if I can get away without it, as it costs valuable PCB real estate on a wearable device that I'm trying to make very small. I've seen complementary filters used to give 2 of 3 Euler angles without a magnetometer, so I'd like to understand the trade-offs of not using one.
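For the pitch/roll case, a complementary filter of the kind mentioned can be sketched in a few lines of Python (the alpha value and the sample numbers below are illustrative assumptions):

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyro rate (rad/s) and accelerometer tilt (rad) into one angle.

    angle = alpha * (angle + gyro * dt) + (1 - alpha) * accel_angle
    The gyro term tracks fast motion; the accelerometer term slowly
    pulls the estimate back toward the gravity-derived tilt,
    cancelling gyro drift.
    """
    angle = accel_angles[0]
    out = []
    for w, a in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + w * dt) + (1 - alpha) * a
        out.append(angle)
    return out

# Stationary device, true tilt 0.1 rad: a biased gyro (0.01 rad/s)
# alone would drift to 0.3 rad over 20 s, but the filter stays
# pinned near the accelerometer reading.
n, dt = 2000, 0.01
est = complementary_filter([0.01] * n, [0.1] * n, dt)
```

The trade-off of omitting the magnetometer is that this only works for pitch and roll: yaw is unobservable from gravity, so heading will drift at roughly the gyro's bias rate.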
At my university we have several Kinect 1 and Kinect 2 devices. I am testing the quality of the Kinect Fusion results on both devices, and unexpectedly Kinect 2 produces worse results.
My testing environment:
Static camera scanning a static scene.
In this case, comparing the results from Kinect 1 and Kinect 2, the Kinect 2 appears to produce a much smoother and nicer point cloud. But if I inspect the scans from a different angle, the Kinect 2 result is far worse, even though the point cloud is smoother. As you can see in the pictures, the resulting point cloud looks fine from the same viewpoint as the camera, but as soon as I look at it from a different angle, the Kinect 2 result is so bad that you can't even tell there is a mug in the red circle.
Moving camera scanning a static scene
In this case Kinect 2 produces even worse results relative to Kinect 1 than in the static case above. In fact, I can't reconstruct with Kinect 2 at all while moving it. Kinect 1, on the other hand, does a pretty good job with a moving camera.
Does anybody have any idea why Kinect 2 fails these tests against Kinect 1? As mentioned above, we have several Kinect cameras at my university and I tested more than one of each, so this should not be a hardware problem.
I experienced similar results when I was using Kinect for 3D reconstruction: Kinect 2 produced worse results than Kinect 1. In fact, I tried the InfiniTAM framework for 3D reconstruction, and it yielded similar results. What was different in my case compared to yours was that I was moving the camera around, and the camera tracking was awful.
When I asked the authors of InfiniTAM about this, they provided the following likely explanation:
... the Kinect v2 has a time of flight camera rather than a structured
light sensor. Due to imperfections in the modulation of the active
illumination, it is known that most of these time of flight sensors
tend to have biased depth values, e.g. at a distance of 2m, everything
is about 5cm closer than measured, at a distance of 3m everything is
about 5cm further away than measured...
Apparently, this is not an issue with structured light cameras (Kinect v1 and the like). You can follow the original discussion here.
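If the bias is repeatable, it can in principle be calibrated out. Here is a hypothetical Python sketch of a piecewise-linear correction; the calibration pairs below are made-up numbers that merely mirror the two distances quoted above:

```python
def correct_tof_depth(measured_m, calibration):
    """Piecewise-linear depth correction for a ToF camera.

    `calibration` is a list of (measured, true) pairs obtained by
    imaging a flat target at known distances; measurements between
    calibration points are corrected by linear interpolation, and
    measurements outside the calibrated range keep the nearest
    point's offset.
    """
    pts = sorted(calibration)
    if measured_m <= pts[0][0]:
        return pts[0][1] + (measured_m - pts[0][0])
    for (m0, t0), (m1, t1) in zip(pts, pts[1:]):
        if measured_m <= m1:
            f = (measured_m - m0) / (m1 - m0)
            return t0 + f * (t1 - t0)
    return pts[-1][1] + (measured_m - pts[-1][0])

# Made-up calibration matching the quoted biases: scene ~5 cm closer
# than measured at 2 m, ~5 cm further at 3 m
cal = [(2.0, 1.95), (3.0, 3.05)]
d = correct_tof_depth(2.5, cal)
```

In reality the ToF "wiggling" error is periodic in distance, so two calibration points are far too few; this only shows the shape of the fix.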
I am currently working on an application where I need to find the user's heart rate. I found plenty of applications that do this, but I was not able to find a single private or public API that supports it.
Is there any framework available that could help? I was also wondering whether the UIAccelerometer class could be useful, and what level of accuracy it could achieve.
How can this feature be implemented: by putting a finger on the iPhone camera, by placing the microphone on the jaw or wrist, or some other way?
Is there any way to detect blood-circulation changes and derive the heart rate from the camera or UIAccelerometer? Any API or sample code? Thank you.
There is no API for detecting heart rate; these apps do it in a variety of ways.
Some use the accelerometer to measure the device shaking with each pulse. Others use the camera lens with the flash on, and detect blood moving through the finger from the changing light levels.
Various digital signal processing (DSP) techniques can be used to discern very low-level periodic signals from a long enough set of samples taken at an appropriate sample rate (accelerometer readings or reflected light color).
Some of the advanced math functions in the Accelerate framework API can be used as building blocks for these various DSP techniques. An explanation would require several chapters of a Digital Signal Processing textbook, so that might be a good place to start.
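The core idea can be sketched outside Accelerate. Here is a Python/NumPy version (the 30 Hz sample rate, the 40-200 BPM band, and the synthetic signal are illustrative assumptions) that finds the dominant frequency in a physiologically plausible band:

```python
import numpy as np

def estimate_bpm(signal, fs, lo_bpm=40, hi_bpm=200):
    """Estimate heart rate as the dominant FFT peak in a BPM band.

    Works on any roughly periodic sample stream (accelerometer
    magnitude or camera brightness); limiting the search to a
    plausible band rejects DC offset and out-of-band noise.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                       # remove DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    band = (freqs >= lo_bpm / 60) & (freqs <= hi_bpm / 60)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60

# Synthetic pulse at 72 BPM buried in noise, 30 Hz "camera" frame rate
fs = 30
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
sig = np.sin(2 * np.pi * (72 / 60) * t) + 0.5 * rng.standard_normal(t.size)
bpm = estimate_bpm(sig, fs)
```

The frequency resolution is the reciprocal of the window length, so a 20-second window resolves heart rate to about 3 BPM; longer windows are more precise but respond more slowly to changes.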
I want to detect heart rate using the iPhone SDK. Does someone know a method for calculating the heartbeat rate?
The Fast Fourier Transform (FFT) is a class of algorithms that can quickly turn samples into an analysis of how prominently certain frequencies occur in them. For more, check out:
Wikipedia: FFT
Literate program example: Cooley-Tukey FFT
This is relevant to your problem because: (1) heart rate is itself a frequency, and (2) most of the sound that comes through the body that you can measure will be within a certain frequency range. Dropping frequencies outside this range means dropping all or mostly noise.
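To illustrate point (2), here is a crude Python/NumPy sketch of dropping out-of-band frequencies by zeroing FFT bins (a real app would use a proper filter design such as a Butterworth band-pass; the signal and band edges below are made-up):

```python
import numpy as np

def bandpass(signal, fs, lo_hz, hi_hz):
    """Crude FFT-domain band-pass: zero every bin outside [lo_hz, hi_hz].

    Only frequencies a heartbeat can plausibly occupy survive;
    DC offset and slow drifts (e.g. breathing) are removed.
    """
    x = np.asarray(signal, dtype=float)
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    spec[(freqs < lo_hz) | (freqs > hi_hz)] = 0
    return np.fft.irfft(spec, n=len(x))

# A 1.2 Hz "pulse" plus a 0.1 Hz breathing drift and a DC offset;
# passing 0.7-3.3 Hz (42-198 BPM) keeps only the pulse.
fs = 30
t = np.arange(0, 10, 1 / fs)
raw = 2.0 + np.sin(2 * np.pi * 0.1 * t) + 0.3 * np.sin(2 * np.pi * 1.2 * t)
clean = bandpass(raw, fs, 0.7, 3.3)
```

After filtering, the heart rate can be read off either as the dominant remaining frequency or by counting zero crossings.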
Good luck!
Well, I've seen various implementations. Some use the accelerometer to detect minute movements in your arm/hand when you hold the phone; some use the microphone; you could also build a manual 'tap' interface where you tap the screen while checking your own pulse.