Model analysis in Java 3D - java-3d

In a Java3D application I have two planes. How can I find whether they intersect and, if they do, what the angle between the planes is? And how do I compute the direction vector of their intersection line?
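A minimal sketch of the vector math, written here in Python with NumPy rather than Java3D (the same operations map directly onto `javax.vecmath.Vector3f`'s `cross`, `dot`, and `angle` methods): two planes with normals n1, n2 intersect unless the normals are parallel; the angle between the planes equals the angle between the normals, and the intersection line's direction is the cross product of the normals.

```python
import numpy as np

def plane_intersection(n1, d1, n2, d2):
    """Planes are given as n . x = d.

    Returns (angle_rad, unit_direction, point_on_line), or None
    when the planes are parallel and have no unique intersection line.
    """
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    direction = np.cross(n1, n2)          # direction of the intersection line
    if np.linalg.norm(direction) < 1e-9:
        return None                       # parallel (or identical) planes
    cos_a = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    angle = np.arccos(np.clip(cos_a, -1.0, 1.0))
    # One point on the line: solve the two plane equations together with
    # direction . x = 0, which pins down a unique solution.
    A = np.vstack([n1, n2, direction])
    point = np.linalg.solve(A, np.array([d1, d2, 0.0]))
    return angle, direction / np.linalg.norm(direction), point
```

For example, the planes z = 0 and x = 0 meet at a right angle along the y-axis.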

Related

How do we project from camera to lidar coordinate when both of the sensors share same coordinate systems?

I am working on an object detection task. I am able to detect objects in the KITTI point cloud, and I am trying to use the same code on my own point cloud dataset. In the KITTI dataset the camera and lidar sensors use different coordinate systems; I have attached an image here for reference. For the camera the axes are (z, x, y) and for the lidar the axes are (x, y, z).
For the KITTI dataset they have also provided calibration information, and I am able to understand the projection matrix. I went through a few materials.
Camera-Lidar Projection.
In the above link he has calculated projection matrix as:
R_ref2rect_inv = np.linalg.inv(R_ref2rect)
P_cam_ref2velo = np.linalg.inv(velo2cam_ref)
proj_mat = R_ref2rect_inv @ P_cam_ref2velo
My question is:
For my dataset the sensor setup is almost the same, so we can say the lidar and camera share the same coordinate system. In this case, how do I project a point from the camera to the lidar?
In other words,
If my object center is (x=7.5, y=1.7, z=0.58) in the camera coordinate system, how do I find the same point in the lidar point cloud?
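If the two sensors really share one origin and only the axis labels differ, the mapping is a fixed permutation (possibly with sign flips), not a full calibration matrix. The sketch below assumes the usual KITTI conventions (camera: x right, y down, z forward; lidar: x forward, y left, z up) and zero translation between the sensors; the permutation and signs must be adjusted to your actual rig.

```python
import numpy as np

# Assumed axis convention (KITTI-style): camera x right, y down, z forward;
# lidar x forward, y left, z up. Adjust permutation/signs for your own rig.
CAM_TO_LIDAR = np.array([
    [ 0.0,  0.0, 1.0],   # lidar x =  camera z (forward)
    [-1.0,  0.0, 0.0],   # lidar y = -camera x (left)
    [ 0.0, -1.0, 0.0],   # lidar z = -camera y (up)
])

def cam_to_lidar(pt_cam):
    """Map a 3D point from camera to lidar coordinates (no translation assumed)."""
    return CAM_TO_LIDAR @ np.asarray(pt_cam, float)
```

Under these assumed conventions, a camera point (7.5, 1.7, 0.58) would land at (0.58, -7.5, -1.7) in lidar coordinates; if your sensors have an offset between them, add the translation vector after the rotation.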

Butterworth filtering and bezier smoothing of a trajectory

So I have a dataset that represents the x-y-z coordinates of a linear motion. The dataset is noisy and I am trying to extract a smooth trajectory as close as possible to the real trajectory using a Bézier curve, fitting each point as a control point of the curve. The results are only moderately satisfying, so I was wondering whether pre-filtering with a low-pass Butterworth filter would give better results.
In general, is it useful to combine a smoothing technique like Bézier curves or cubic smoothing splines with low-pass filtering?
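A hedged sketch of the pre-filtering step using SciPy: `butter` designs the low-pass filter and `filtfilt` applies it forward and backward, so the smoothed trajectory is not phase-shifted relative to the raw one. The cutoff frequency and filter order here are placeholder assumptions to tune against your sampling rate and noise level.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_trajectory(xyz, fs, cutoff_hz=2.0, order=4):
    """Zero-phase low-pass each coordinate of an (N, 3) trajectory.

    fs is the sampling rate in Hz; cutoff_hz and order are tuning
    assumptions. filtfilt runs the filter forward and backward, so the
    output is time-aligned with the input.
    """
    b, a = butter(order, cutoff_hz / (fs / 2.0))  # normalized cutoff
    return np.column_stack([filtfilt(b, a, xyz[:, i]) for i in range(3)])
```

You would then fit the Bézier or spline to the filtered points instead of the raw ones; the filter removes high-frequency noise the curve fit would otherwise chase.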

How to apply quantization to vertex coordinates?

The question title is pretty self-explanatory.
I couldn't find any option to quantize vertex coordinates.
I think MeshLab decides this itself, but I want to choose it myself (32-bit, 16-bit, 10-bit, etc.).
Is MeshLab capable of quantizing vertices? If not, what is the default bit width of its vertex representation?
To my knowledge, MeshLab is not capable of doing that. I suggest exporting your mesh in the PLY ASCII file format and using MATLAB or Python to process the coordinate lines.
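Along the lines of that suggestion, here is a minimal sketch of uniform quantization in Python: snap each coordinate to a grid of 2^bits levels spanning the mesh's bounding box, then write the dequantized values back into the ASCII PLY. The uniform per-axis grid is an assumption; other quantizers (e.g. a shared cubic grid) are equally valid.

```python
import numpy as np

def quantize_vertices(verts, bits):
    """Snap vertex coordinates to a uniform grid with 2**bits levels per axis.

    verts: (N, 3) float array. Returns dequantized floats lying on the
    grid, which is what you would write back into the ASCII PLY file.
    """
    verts = np.asarray(verts, float)
    lo = verts.min(axis=0)
    span = verts.max(axis=0) - lo
    span[span == 0] = 1.0                        # avoid divide-by-zero on flat axes
    levels = (1 << bits) - 1
    q = np.round((verts - lo) / span * levels)   # integer codes 0..levels
    return q / levels * span + lo
```

The maximum rounding error per axis is span / (2 * (2^bits - 1)), so you can pick the bit depth from the geometric tolerance you need.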

2D image decomposition

I have a matrix and I want to decompose it into different matrices ranging from low- to high-frequency content. As I understand it, this can be done with a wavelet transform. I found something like the figure below for a 1D signal, and I want to do a similar procedure for my 2D matrix using MATLAB: decompose it into different matrices with low- to high-frequency components at different levels.
I tried the toolbox; however, I have problems extracting the data.
How can I do this using MATLAB?
You are looking for the wavedec2 function.
There's a basic example in the function documentation here.
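If Python is also an option, PyWavelets provides the same multilevel 2D decomposition as MATLAB's wavedec2; a minimal sketch (the Haar wavelet and 2 levels are arbitrary choices for illustration):

```python
import numpy as np
import pywt  # PyWavelets: pip install PyWavelets

# Decompose a 2D matrix into one approximation (low-frequency) band plus
# horizontal/vertical/diagonal detail (high-frequency) bands per level.
matrix = np.arange(64, dtype=float).reshape(8, 8)
coeffs = pywt.wavedec2(matrix, wavelet='haar', level=2)

cA2 = coeffs[0]                # coarsest approximation (lowest frequencies)
(cH2, cV2, cD2) = coeffs[1]    # level-2 detail bands
(cH1, cV1, cD1) = coeffs[2]    # level-1 detail bands (highest frequencies)
```

`pywt.waverec2(coeffs, 'haar')` reconstructs the original matrix, so you can zero out individual bands to isolate frequency ranges.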

Plotting a function like MATLAB's ezplot in Python using matplotlib

I have a classification function that classifies a data point into one of two classes. The problem is, I need a way to plot the decision boundary between the two classes. While this is easy for a linear function, it's cumbersome to find the equation of the boundary in general. The ezplot function in MATLAB seems able to do it: it plots the result automatically, works with linear and quadratic functions, and doesn't require you to provide the coordinates. In matplotlib, you can only plot if you are given the coordinates. Does anyone know how to do this with matplotlib?
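One common workaround: evaluate the classifier's decision function on a dense grid and draw its zero contour with `plt.contour`, which is effectively what ezplot does for an implicit equation f(x, y) = 0. A minimal sketch (the grid bounds, resolution, and the example quadratic are assumptions):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

def plot_decision_boundary(score, xlim, ylim, n=300):
    """Plot the zero level set of score(x, y), like ezplot('f(x,y)=0').

    score: vectorized function returning the classifier's decision value;
    the boundary is wherever it crosses zero.
    """
    xs = np.linspace(*xlim, n)
    ys = np.linspace(*ylim, n)
    X, Y = np.meshgrid(xs, ys)
    Z = score(X, Y)
    plt.contour(X, Y, Z, levels=[0.0])  # the zero contour is the boundary
    return Z

# Example: a quadratic boundary x^2 + y^2 - 1 = 0 (the unit circle).
Z = plot_decision_boundary(lambda x, y: x**2 + y**2 - 1, (-2, 2), (-2, 2))
```

Because the boundary is traced numerically from grid samples, this works for any classifier whose decision value you can evaluate, not just linear or quadratic ones.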