"ERROR 010067": Error in executing grid expression in ArcMap 10.8 - kernel density analysis - arcgis

I'm attempting to estimate the kernel density of crime incidents in London. The geographic coordinate system of my data is GCS_OSGB_1936 and the projected coordinate system is British_National_Grid.
"Error 010067" keeps popping up. Any advice on how I can resolve this error and carry on with the kernel density estimation would be highly appreciated. Thanks in advance!

Related

Triggering Lucid cameras with Lidar sensor in a drone-mounted setup

I am working on a drone-mounted setup which consists of one Lidar sensor and two Lucid cameras. Currently, the sensors are triggered by a CPU that is also mounted on the drone. However, triggering the sensors this way takes computational resources away from the CPU, which is also used to process the data captured by the sensors.
I am looking for a solution to trigger the Lucid cameras with the Lidar sensor, in order to free up the computational power of the CPU. I have no prior experience in this area, and I would be grateful for any guidance or advice on how to resolve this issue.
Thank you in advance for your time and help.
Sincerely,
Anh.

Is it possible to determine if a vehicle rolled back on a slope or hill using GPS or accelerometer?

I have a vehicle with a tracker installed. The device has a GPS system, a 3-axis accelerometer, a 3-axis magnetometer and a gyroscope. Is it possible to determine by how much the vehicle rolled back on a slope or hill? Using the GPS angle wasn't an option, as the angle given for a short backward movement isn't always reliable. Can the accelerometer be used in such a scenario?
You're right that the GPS angle (heading) will not help you in a single-antenna setup. On its own, a GPS receiver needs a minimum distance of movement to determine heading.
A simple GPS receiver, when used without GPS corrections (which is the case for off-the-shelf GPS devices and mobile phones/tablets), has an accuracy of roughly 5 meters. That's why a short backward movement will not yield the desired results.
In construction/mining applications, there is often a fixed GPS base station nearby that broadcasts GPS corrections, which allows a vehicle-mounted GPS receiver to apply the corrections, reduce error and ultimately get centimeter-level accuracy.
So in conclusion, your 3-axis accelerometer will likely be the only sensor you can rely on until your vehicle has rolled back at least 5 meters.
If your accelerometer is sensitive enough, you'll get measurable sensor values. However, if your rollback is very slow, so that the G-forces are almost imperceptible to the accelerometer, then you're out of luck.
This is assuming that you want near real-time detection of vehicle rollback.
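As a rough illustration of what accelerometer-only detection could look like, here is a Python sketch (the axis convention, sample rate and numbers are assumptions, not from the question): estimate the slope angle while the vehicle is stationary, remove the gravity component from the forward-axis readings, and double-integrate what is left. In practice, accelerometer bias and noise make naive double integration diverge after a few seconds, so this only works for short events like a brief rollback.

```python
# Rough sketch of rollback detection from the 3-axis accelerometer alone.
# Assumptions (not from the question): x is the vehicle's forward axis, samples
# arrive at a fixed rate, and the slope angle is estimated while stationary so
# the gravity component along x can be removed before integrating.
import math

G = 9.81  # m/s^2

def estimate_slope_angle(ax_static):
    """Pitch angle (rad) from the forward-axis reading while the vehicle is still."""
    return math.asin(max(-1.0, min(1.0, ax_static / G)))

def rollback_distance(ax_samples, dt, slope_angle):
    """Double-integrate forward acceleration; a negative result means rollback."""
    gravity_along_x = G * math.sin(slope_angle)   # constant bias on a fixed slope
    v = 0.0   # velocity along the forward axis (m/s)
    s = 0.0   # displacement along the forward axis (m)
    for ax in ax_samples:
        v += (ax - gravity_along_x) * dt          # remove the slope-induced bias
        s += v * dt
    return s

# Example: 2 s of gentle backward acceleration (-0.3 m/s^2) at 100 Hz on a 5 deg slope
slope = estimate_slope_angle(G * math.sin(math.radians(5)))
samples = [G * math.sin(math.radians(5)) - 0.3] * 200
print(rollback_distance(samples, dt=0.01, slope_angle=slope))   # about -0.6 m
```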

Is GPS error the same on two devices near each other?

There are a fair number of GPS questions here...
The GPS is not always completely accurate.
If two devices are nearby, will they have the same error?
I have GPS in my lawnmower. If an app subtracts the GPS location of the mower from the GPS location of the phone, will the errors cancel out, so that it can show the direction to the mower with greater accuracy than just looking at the mower's GPS location on the map?
No, the error can be different even if the devices are of the same model.
The error is due to signal diversity (which can change if you move the receiver even a little), the location of the antenna (which is definitely not the same for the two devices) and the accuracy of the GPS receiver's components. All of these can vary even for two devices of the same model. If the models are not the same, they can be using different technologies, like SiRF3 or SiRF4, so once again the error will probably not be the same.
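A quick back-of-the-envelope calculation shows why differencing the two fixes does not help at lawnmower ranges: if the errors really are independent, they add in quadrature rather than cancelling. The 5 m figure below simply reuses the typical uncorrected accuracy mentioned in the answer above.

```python
# Back-of-the-envelope: independent errors add in quadrature when you subtract
# the two fixes, so the phone-to-mower vector is *less* certain than either fix.
import math

sigma_phone = 5.0   # m, typical uncorrected consumer GPS (figure from the answer above)
sigma_mower = 5.0   # m

sigma_diff = math.hypot(sigma_phone, sigma_mower)   # ~7.1 m for the difference vector

# Rough bearing uncertainty at a few ranges (small-angle approximation, so the
# close-range number really just says "essentially meaningless")
for range_m in (5, 20, 100):
    bearing_sigma_deg = math.degrees(sigma_diff / range_m)
    print(f"{range_m:>3} m away -> bearing uncertain by roughly {bearing_sigma_deg:.0f} degrees")
```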

When is it needed to fuse IMU sensor data with GPS-RTK, and when is it not?

I'm using a high-accuracy GPS-RTK setup to precisely locate a mobile robotic platform in the field (down to 10 cm accuracy). I also have a 9DOF IMU mounted on the platform (SparkFun 9DOF Razor IMU).
The question is: do I really need to perform sensor fusion between the IMU and GPS, like this ROS node does (http://wiki.ros.org/robot_localization), to estimate the robot pose? Or is it enough to read the roll, pitch and yaw data from the IMU to know the heading, along with the GPS longitude, latitude and altitude?
What cases make it essential to perform this type of fusion?
Thanks in advance.
It is essential to perform fusion because:
1) The roll, pitch and yaw data from the IMU are not perfect; they will drift over time due to gyro errors. The magnetic field sensor in the IMU module limits this drift, but only crudely. Fusion allows the GPS-RTK measurements to be used to continuously estimate the dominant error sources in the IMU and maintain better attitude information (see the sketch below).
2) The IMU supports position estimation when GPS-RTK is lost through signal blockage or any other outage, so the robotic platform is not lost when and if GPS signals are interrupted.
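To make point 1 concrete, here is a toy complementary filter in Python: the gyro yaw rate is integrated every step, and whenever a heading derived from successive RTK fixes is available it pulls the estimate part of the way back, which keeps the gyro bias from running away. This is only a didactic sketch with made-up numbers; robot_localization itself implements an extended Kalman filter, not this filter.

```python
# Toy illustration of point 1 (a complementary filter, not robot_localization's EKF):
# integrate the gyro yaw rate, and whenever a GPS-derived heading is available,
# pull the estimate part of the way toward it so the gyro bias cannot run away.
import math

def wrap(angle):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

def fuse_heading(yaw_rates, gps_headings, dt, alpha=0.9):
    """yaw_rates: gyro rad/s per step; gps_headings: rad, or None when unavailable."""
    heading = 0.0
    for rate, gps_heading in zip(yaw_rates, gps_headings):
        heading = wrap(heading + rate * dt)          # IMU prediction (drifts on its own)
        if gps_heading is not None:                  # GPS correction when available
            heading = wrap(heading + (1.0 - alpha) * wrap(gps_heading - heading))
    return heading

# Example: a gyro with a +0.01 rad/s bias while the true heading stays at 0 rad;
# a GPS-derived heading of 0 rad arrives every 10th step.
n, dt = 1000, 0.1
rates = [0.01] * n
gps = [0.0 if i % 10 == 0 else None for i in range(n)]
print(fuse_heading(rates, gps, dt))   # stays near 0.1 rad; pure integration would drift to ~1 rad
```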

How to increase interrupt sampling frequency on BeagleBone Black?

Currently I'm attempting to read a 600 PPR optical encoder with a simple attachInterrupt() function through the built-in Cloud9 IDE (Node.js). The issue is that if the rotary encoder is rotated too quickly, the position data gets lost; it appears that the frequency of the signal provided by the encoder exceeds the interrupt's sampling rate.
My question is: is there a way to increase the sampling rate to somewhere in the range of 100 kHz? Currently it seems to sample at roughly 2 kHz.
Thank you for your help!
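One direction that is commonly suggested for this kind of problem on the BeagleBone Black (an assumption on my part, not something confirmed in this thread) is to stop servicing every edge in software and let the board's hardware eQEP quadrature-decoder peripheral do the counting, polling the running count at whatever rate is convenient. A minimal Python sketch follows, assuming eQEP is enabled and exposed through sysfs; the exact path varies with the kernel and device-tree overlay in use.

```python
# Sketch only: poll the hardware-maintained eQEP count instead of servicing each
# edge in software. The sysfs path below is an example; it depends on which eQEP
# instance is used and on the kernel / device-tree overlay.
import time

EQEP_POSITION = "/sys/devices/platform/ocp/48300000.epwmss/48300180.eqep/position"

def read_position(path=EQEP_POSITION):
    """Read the running quadrature count maintained by the eQEP peripheral."""
    with open(path) as f:
        return int(f.read().strip())

last = read_position()
for _ in range(50):                 # poll at a leisurely 10 Hz; counting happens in hardware
    time.sleep(0.1)
    now = read_position()
    print(f"count={now}  delta={now - last}")
    last = now
```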