How to replay IMU and GPS data from previous flights in robot_localization (ROS 2)?

How can I replay IMU and GPS data from previous flights through the robot_localization package in ROS 2? I have IMU data from a VN200 and GPS data from a HERE3, and I want to pass the data through the robot_localization Kalman filter.
My goal is to compare the output of the robot_localization Kalman filter with that of the VN200's built-in EKF. I think a visual way to compare them would be to plot graphs in PlotJuggler. Please provide a step-by-step guide if possible; I'm a newbie to robotics and ROS 2.
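One rough outline (a sketch, not a verified recipe; the topic names /imu/data, /gps/fix, and /odometry/gps below are assumptions, so check yours with `ros2 topic list`): record the sensor topics with `ros2 bag record /imu/data /gps/fix`, run robot_localization's ekf_node together with navsat_transform_node (which converts the GPS fix into odometry the EKF can fuse), replay the bag with `ros2 bag play`, and plot the result. A minimal parameter-file sketch for the EKF:

```yaml
# ekf.yaml: hypothetical parameter sketch for robot_localization's ekf_node.
# Frame and topic names must be matched to your own setup.
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: false
    world_frame: odom
    odom_frame: odom
    base_link_frame: base_link
    imu0: /imu/data
    # fuse orientation, angular velocity, and linear acceleration from the IMU
    # (rows: x y z / roll pitch yaw / vx vy vz / vroll vpitch vyaw / ax ay az)
    imu0_config: [false, false, false,
                  true,  true,  true,
                  false, false, false,
                  true,  true,  true,
                  true,  true,  true]
    odom0: /odometry/gps        # assumed output topic of navsat_transform_node
    # fuse x/y/z position from the GPS-derived odometry
    odom0_config: [true,  true,  true,
                   false, false, false,
                   false, false, false,
                   false, false, false,
                   false, false, false]
```

For the comparison itself, PlotJuggler's ROS 2 plugin can subscribe to both /odometry/filtered and the VN200's own EKF topic at the same time while the bag plays, so the two estimates can be plotted on one graph.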

Related

GPS trajectory dataset

Stack Overflow community,
I have a CSV file containing GPS coordinates (latitude and longitude) for a set of towns (shown in the image below).
I am working on a COVID-19 contact-tracing project, so I want to generate a GPS trajectory dataset for a number of people. My dataset must contain many close GPS points, and I want to know whether there is a way to find the closest GPS points around every town's GPS coordinates.
Please, I need a solution as soon as possible.
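One common approach (a minimal sketch, assuming plain (lat, lon) tuples in degrees; the function names are my own) is to compute great-circle distances with the haversine formula and keep the points that fall within a chosen radius of each town:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in metres."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def points_within(points, center, radius_m):
    """Return every point within radius_m metres of center."""
    return [p for p in points
            if haversine_m(center[0], center[1], p[0], p[1]) <= radius_m]

# Demo with made-up coordinates: one degree of longitude at the
# equator is roughly 111 km.
towns = [(0.0, 0.0), (0.0, 0.5), (10.0, 10.0)]
nearby = points_within(towns, center=(0.0, 0.0), radius_m=60000)
```

For many towns and many points, this brute-force scan is O(n*m); if that becomes too slow, a spatial index (e.g. a k-d tree over projected coordinates) is the usual next step.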

Fusing asynchronous measurements from multiple sensors

I have a set of 12 IMUs mounted on the end effector of my robot arm, which I read using a microcontroller to determine its movement. With my controller I can read two sensors simultaneously using direct memory access. After acquiring the measurements I would like to fuse them to compensate for sensor error and generate a more reliable reading than a single sensor would give.
After some research, my understanding is that I can use a Kalman filter to reach my desired outcome, but I still have the problem that all the sensor values have different time stamps: since I can read only two at a time, even if both time stamps in a pair are synchronized perfectly, the next pair will have a different time stamp, if only in the µs range.
I know controls-engineering principles but am completely new to the topic of sensor fusion, and Google presents me with too many results to find a solution in a reasonable amount of time.
So my question: can anybody point me in the right direction by naming a keyword I should look for, or literature I should work through, to better understand this topic?
Thank you!
The topic you are dealing with is not an easy one. Have a look at multi-rate Kalman filters.
The idea is that you design a different Kalman filter for each combination of sensors whose data is available at the same time, and use the appropriate filter when you have data from those sensors, while the system state is passed between the various filters.
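The underlying mechanism can be sketched in one dimension: a single shared state is predicted forward to each measurement's timestamp and then updated with whichever sensor happens to report, each with its own noise. All numbers below are made up for illustration.

```python
# Minimal 1-D sketch of asynchronous Kalman filtering: the state is
# propagated to each measurement's timestamp, then corrected with that
# measurement, so differently-timed sensors can share one estimate.

class Kalman1D:
    def __init__(self, x0, p0, q):
        self.x, self.p, self.q = x0, p0, q  # state, variance, process noise
        self.t = 0.0

    def predict(self, t):
        dt = t - self.t
        self.t = t
        self.p += self.q * dt          # random-walk process model

    def update(self, z, r):
        k = self.p / (self.p + r)      # Kalman gain for this sensor's variance r
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)

kf = Kalman1D(x0=0.0, p0=1.0, q=0.01)
# Each measurement carries its own timestamp and per-sensor variance:
stream = [(0.010, 1.02, 0.04),   # sensor A
          (0.012, 0.97, 0.09),   # sensor B, read slightly later
          (0.020, 1.01, 0.04)]   # sensor A again
for t, z, r in sorted(stream):   # process strictly in time order
    kf.predict(t)
    kf.update(z, r)
```

The multi-rate idea from the answer above corresponds to choosing a different measurement model (here, a different r and source) per sensor combination while the predicted state is shared between them.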

How to save Gnuradio Waterfall Plot?

I want to measure spectrum Occupancy of any one of the GSM band using Gnuradio and a USRP for 24 hours.
Is there any way to save the waterfall plot of gnuradio in image file or any other format?
If not is there any other way to show the spectrum occupancy for certain amount of time in one image or graph?
Is there any way to save the waterfall plot of gnuradio in image file or any other format?
Middle-mouse-button -> Save.
If not is there any other way to show the spectrum occupancy for certain amount of time in one image or graph?
This is a typical case for "offline processing and visualization". I'd recommend you build a GNU Radio flow graph that takes the samples from the USRP, applies decimating band-pass filters (ideally matched to the GSM pulse shape), calculates the power of the resulting sample streams (complex_to_mag_squared), and then saves these power vectors.
Then you could later easily visualize them with e.g. numpy/matplotlib, or whatever tool you prefer.
The problem really is that GSM spectrum access happens on the order of microseconds, and you want to observe for 24 hours; no visualization in this world can both represent accurately what's happening and still be compact. You will need to come up with some intelligent measure built on top of the pure occupancy information.
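One such condensed measure (a sketch, with a made-up threshold and synthetic data; GSM-calibrated thresholds would need real measurements) is simply the duty cycle: the fraction of saved power samples above a detection threshold per channel.

```python
# Hypothetical offline step: given per-channel power samples (as saved by
# complex_to_mag_squared -> file sink), estimate occupancy as the fraction
# of samples above a threshold.

def occupancy(powers, threshold):
    """Fraction of power samples above threshold, in 0.0 .. 1.0."""
    if not powers:
        return 0.0
    return sum(1 for p in powers if p > threshold) / len(powers)

# In practice you would read GNU Radio's float32 file with the stdlib
# array module:
#   import array
#   powers = array.array("f")
#   with open("channel0.f32", "rb") as f:
#       powers.frombytes(f.read())
samples = [0.01, 0.02, 0.90, 0.85, 0.015, 0.88]  # synthetic demo data
busy_fraction = occupancy(samples, threshold=0.5)
```

Plotting one such occupancy value per channel per minute reduces 24 hours of microsecond-scale activity to a single readable heatmap or set of curves.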

Is the raw depth data from the Kinect 2 completely unfiltered?

As the title says, is the raw data really raw, or does the Kinect apply some sort of filtering (median, bilateral, etc.) to reduce the noise? I am comparing the data with other non-consumer ToF cameras, and the raw values from the Kinect 2 seem pretty smooth.
No, some filters are applied.
But Microsoft doesn't publish any information about what's going on inside their Kinect SDK/hardware, so we can only guess.
The best information about this comes from libfreenect2, the open source driver for Kinect v2. One of the developers said:
[libfreenect's] current depth processing code [...] is doing the same things as the shader shipped with the K4W2 Preview SDK (might have changed in the meantime). The bilateral filter is applied to the complex valued images before computing the amplitude/phase(depth). Its only aware of intensity edges in these images. The "edge-aware" filter basically tries to filter the flying pixels at the object boundaries by calculating some statistics in a local neighborhood. Both filters can be disabled in libfreenect2.
(emphasis mine, Source)
Of course we don't know if there's anything else going on or if something changed in the release version of the Microsoft SDK.
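To illustrate what a bilateral filter does in this context (a toy 1-D sketch of the general technique, not the actual K4W2 shader, which operates on the complex-valued images): each sample is averaged with neighbours weighted by both spatial distance and value difference, so noise on flat surfaces is smoothed while sharp depth edges survive.

```python
import math

def bilateral_1d(signal, sigma_s=1.5, sigma_r=10.0, radius=3):
    """Toy 1-D bilateral filter: neighbours are weighted by spatial
    distance (sigma_s) and by value similarity (sigma_r), so large
    jumps in value are not averaged across."""
    out = []
    for i, v in enumerate(signal):
        wsum = vsum = 0.0
        for j in range(max(0, i - radius), min(len(signal), i + radius + 1)):
            w = (math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)) *
                 math.exp(-((signal[j] - v) ** 2) / (2 * sigma_r ** 2)))
            wsum += w
            vsum += w * signal[j]
        out.append(vsum / wsum)
    return out

# A noisy step edge, e.g. a depth jump between two surfaces (values in mm):
depth = [100, 101, 99, 100, 200, 201, 199, 200]
smooth = bilateral_1d(depth)
```

After filtering, the two plateaus are smoothed toward 100 and 200 respectively, but the edge between index 3 and 4 stays sharp; a plain Gaussian blur would have produced intermediate "flying pixel" values there instead.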
By the way, here's a recent paper comparing some current ToF sensors:
A Comparative Error Analysis of Current Time-of-Flight Sensors - Peter Fürsattel et al.

How to implement lowpass filter to reduce noise in gyroscope values?

I am new to labview and I need help.
I am using a myRIO with a gyroscope, and when I display the gyroscope values I get noise.
My question is: how can I implement a lowpass filter to reduce the noise in the X, Y, and Z rates of the gyroscope?
I searched a lot, but I did not understand how to determine the sampling frequency or the low and high cutoff frequencies.
Thank you so much.
If your data is noisy, you should try to fix the problem before you digitize the data. If a physical low-pass filter will do the trick, install one. The better the signal before the DAQ, the better the data will be once it's digitized.
Some other signal-conditioning considerations: keep the wire from the gyroscope to the DAQ as short as possible; eliminate any sources of noise from the environment if you can (like large rotating magnets; seriously, I once helped someone who was complaining about noise while using an unshielded wire next to an MRI machine); and if you're going to add any signal conditioning, amplify close to your sensor.
If you would still like to filter in software, there's an example included with LabVIEW that demonstrates both the point-by-point VIs and the array-based VIs. It's called PtByBp and Array Based Filter.vi and can be found in the Example Finder under Analysis, Signal Processing and Mathematics >> Filtering and Conditioning.
Please install this FREE toolkit from ni.com: http://sine.ni.com/nips/cds/view/p/lang/en/nid/212733
It contains examples and good, ready-to-use applications showing how to use the myRIO gyroscope and how to do proper DSP.
Sampling frequency is how fast you sample; look for this value in the ADC settings. For the low and high cutoffs, experiment with the values. Doing an FFT on your signal may help you determine the spectral density and decide where to cut.
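For reference, the simplest software lowpass is a single-pole IIR filter (exponential smoothing), applied independently to each axis. A minimal sketch, where fs and fc are assumptions you must replace with your own myRIO/ADC sample rate and chosen cutoff:

```python
import math

def lowpass(samples, fs, fc):
    """First-order IIR lowpass: y[n] = y[n-1] + a * (x[n] - y[n-1]),
    with fs the sampling frequency and fc the cutoff, both in Hz."""
    a = 1.0 - math.exp(-2.0 * math.pi * fc / fs)  # smoothing coefficient
    y, out = samples[0], []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

# Synthetic demo: a constant rate of 5 deg/s with alternating +/-1 noise,
# sampled at 100 Hz and filtered with a 2 Hz cutoff.
noisy = [5.0 + (1.0 if i % 2 else -1.0) for i in range(200)]
smooth = lowpass(noisy, fs=100.0, fc=2.0)
```

Lowering fc smooths more but adds lag to the gyroscope response; the FFT suggestion above is the right way to pick a cutoff below your noise band but above the motion you care about.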