Butterworth filtering and Bézier smoothing of a trajectory

So I have a dataset of x-y-z coordinates from a roughly linear motion. The dataset is noisy, and I am trying to extract a smooth trajectory as close as possible to the real one by fitting a Bézier curve, using each data point as a control point. The results are only moderately satisfying, so I was wondering whether pre-filtering with a low-pass Butterworth filter would give better results.
In general, is it useful to combine a smoothing technique such as Bézier curves or cubic smoothing splines with low-pass filtering?
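One common pipeline is exactly that: zero-phase low-pass filter each coordinate first, then fit a smoothing curve through the filtered points. A minimal SciPy sketch of the idea (the toy signal, noise level, filter cutoff, and spline smoothing factor are made-up illustration values, not from the question):

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.interpolate import splprep, splev

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
# Hypothetical smooth 3-D path standing in for the real trajectory
true = np.stack([t, np.sin(2 * np.pi * t), 0.5 * t ** 2])
noisy = true + 0.05 * rng.standard_normal(true.shape)

# Step 1: zero-phase low-pass Butterworth on each coordinate
# (4th order, cutoff at 0.1 of the Nyquist frequency -- tune to your data)
b, a = butter(4, 0.1)
filtered = filtfilt(b, a, noisy, axis=1)

# Step 2: fit a smoothing spline through the pre-filtered points
tck, u = splprep(filtered, s=len(t) * 0.05 ** 2)
smooth = np.asarray(splev(u, tck))

rmse = lambda est: float(np.sqrt(np.mean((est - true) ** 2)))
print(rmse(noisy), rmse(filtered), rmse(smooth))
```

Using `filtfilt` rather than a single forward pass matters here: it cancels the filter's phase delay, so the pre-filtered trajectory stays aligned with the original samples before the curve fit.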

Related

Gaussian process regression for optimization, how hyperparameters affect the gradient of a Gaussian process

I am studying predictive control with a Gaussian process model. When I use the hyperparameters obtained by maximizing the marginal likelihood, I cannot get good results. When I manually change the hyperparameters, the model's MSE becomes larger, but the control performance improves somewhat. Can someone explain this? Thanks.
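As a rough illustration of how a hyperparameter like the kernel lengthscale changes GP predictions (a pure-NumPy sketch on an invented toy dataset, not the asker's model), one can compare the posterior mean for two lengthscales:

```python
import numpy as np

def rbf(a, b, ell, sf=1.0):
    """Squared-exponential kernel with lengthscale ell and signal std sf."""
    d = a[:, None] - b[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 5.0, 20)                      # toy training inputs
y = np.sin(X) + 0.1 * rng.standard_normal(X.size)  # noisy targets
Xs = np.linspace(0.0, 5.0, 100)                    # test inputs

def gp_mean(ell, noise=0.1):
    """GP posterior mean at Xs for a given lengthscale."""
    K = rbf(X, X, ell) + noise ** 2 * np.eye(X.size)
    alpha = np.linalg.solve(K, y)
    return rbf(Xs, X, ell) @ alpha

# A short lengthscale chases the noise; a longer one tracks the trend.
mse_short = float(np.mean((gp_mean(0.2) - np.sin(Xs)) ** 2))
mse_long = float(np.mean((gp_mean(1.0) - np.sin(Xs)) ** 2))
print(mse_short, mse_long)
```

The point of the sketch is that marginal-likelihood-optimal hyperparameters minimize a data-fit/complexity trade-off, which is not the same objective as downstream control performance, so the two can disagree.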

Multivariate Gaussian likelihood without matrix inversion

There are several tricks available for sampling from a multivariate Gaussian without matrix inversion, the Cholesky/LU decompositions among them. Are there any tricks for calculating the likelihood of a multivariate Gaussian without doing the full matrix inversion?
I'm working in Python with NumPy arrays. scipy.stats.multivariate_normal is absurdly slow for the task, taking significantly longer than just doing the matrix inversion directly with numpy.linalg.inv.
So at this point I'm trying to understand what best practice is.
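The standard trick is the same Cholesky factorization used for sampling: the log-determinant comes from the factor's diagonal, and the Mahalanobis term from a triangular solve, so no explicit inverse is ever formed. A sketch along those lines (the function name is mine):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def mvn_logpdf(x, mean, cov):
    """Log-density of N(mean, cov) at x via a Cholesky factorization."""
    d = np.asarray(x, float) - np.asarray(mean, float)
    c, low = cho_factor(cov, lower=True)
    # log|cov| = 2 * sum(log(diag(L))) from cov = L L^T
    logdet = 2.0 * np.sum(np.log(np.diag(c)))
    # Mahalanobis term d^T cov^{-1} d via two triangular solves
    maha = d @ cho_solve((c, low), d)
    k = d.size
    return -0.5 * (k * np.log(2.0 * np.pi) + logdet + maha)
```

If you evaluate many points against the same covariance, factor once and reuse `(c, low)`; that amortized reuse is where most of the speedup over repeated inversion comes from.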

What's the difference between Keras' AUC(curve='PR') and Scikit-learn's average_precision_score?

I am quite confused on the difference between Keras' AUC(curve='PR') and Scikit-learn's average_precision_score. My objective is to compute the Area Under the Precision-Recall Curve (AUPRC), for both Scikit-learn and Keras models. However, these two metrics yield vastly different results!
Did I miss something in the TensorFlow-Keras documentation at https://www.tensorflow.org/api_docs/python/tf/keras/metrics/AUC regarding the use of the AUC function?
As stated in the Scikit-learn documentation, they use a different implementation method:
References [Manning2008] and [Everingham2010] present alternative variants of AP that interpolate the precision-recall curve. Currently, average_precision_score does not implement any interpolated variant. References [Davis2006] and [Flach2015] describe why a linear interpolation of points on the precision-recall curve provides an overly-optimistic measure of classifier performance. This linear interpolation is used when computing area under the curve with the trapezoidal rule in auc.
In the average_precision_score function documentation, you can also read:
This implementation is not interpolated and is different from computing the area under the precision-recall curve with the trapezoidal rule, which uses linear interpolation and can be too optimistic.
I encourage you to look in detail at the different functions and their descriptions available in the metrics module. I also highly recommend reading the related paper.
Lastly, there's also a potentially interesting thread here: [AUC] result of tf.metrics.auc does not match sklearn's.
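To make the difference concrete, here is a toy pure-NumPy comparison of the step-wise AP sum against a trapezoidal integral over the same precision-recall points (a simplified illustration; sklearn and Keras build the curve with more care, e.g. threshold grouping and end-point handling):

```python
import numpy as np

def pr_points(y_true, scores):
    """Precision and recall at each rank, scores sorted descending."""
    order = np.argsort(-np.asarray(scores))
    y = np.asarray(y_true)[order]
    tp = np.cumsum(y)
    precision = tp / np.arange(1, y.size + 1)
    recall = tp / y.sum()
    return precision, recall

def average_precision(y_true, scores):
    """Step-wise AP: sum_n (R_n - R_{n-1}) * P_n, no interpolation."""
    p, r = pr_points(y_true, scores)
    return float(np.sum(np.diff(np.concatenate(([0.0], r))) * p))

def trapezoid_pr_auc(y_true, scores):
    """Trapezoidal rule over the same points (linear interpolation)."""
    p, r = pr_points(y_true, scores)
    return float(np.sum(np.diff(r) * (p[1:] + p[:-1]) / 2.0))

y, s = [1, 0, 1, 1], [0.9, 0.8, 0.7, 0.6]
print(average_precision(y, s), trapezoid_pr_auc(y, s))
```

Even on this four-point example the two rules integrate the same curve differently, which is the gap the sklearn documentation is warning about.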

2D image decomposition

I have a matrix and I want to decompose it into different matrices covering low- to high-frequency bands. As far as I can tell, this can be done with a wavelet transform. I found something like the figure below for a 1D signal, and I want to do a similar procedure for my 2D matrix using MATLAB: decompose it into matrices with low- to high-frequency components at different levels.
I tried the toolbox; however, I have problems extracting the data.
How can I do this in MATLAB?
You are looking for the wavedec2 function. There's a basic example in the function documentation here.
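For intuition about what a single decomposition level produces, here is a minimal one-level 2-D Haar transform in NumPy (a hand-rolled sketch, not wavedec2 itself), splitting a matrix into an approximation (LL) and three detail subbands (LH, HL, HH):

```python
import numpy as np

def haar2d_level1(A):
    """One orthonormal level of the 2-D Haar wavelet transform."""
    A = np.asarray(A, float)
    # rows: pairwise average (lowpass) and difference (highpass)
    lo = (A[:, ::2] + A[:, 1::2]) / np.sqrt(2)
    hi = (A[:, ::2] - A[:, 1::2]) / np.sqrt(2)
    # columns: the same split applied to each intermediate result
    LL = (lo[::2] + lo[1::2]) / np.sqrt(2)  # coarse approximation
    LH = (lo[::2] - lo[1::2]) / np.sqrt(2)  # horizontal details
    HL = (hi[::2] + hi[1::2]) / np.sqrt(2)  # vertical details
    HH = (hi[::2] - hi[1::2]) / np.sqrt(2)  # diagonal details
    return LL, LH, HL, HH

A = np.arange(16, dtype=float).reshape(4, 4)
LL, LH, HL, HH = haar2d_level1(A)
```

Further levels come from re-applying the split to LL, which is what wavedec2 does internally for the wavelet and level you choose.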

plotting a function like ezplot in matlab in python using matplotlib

I have a classification function that classifies a data point into one of two classes. The problem is, I need a way to plot the decision boundary between the two classes. While this is easy for a linear function, it is cumbersome to find the equation of the boundary in general. The ezplot function in MATLAB seems able to do it: it plots the result automatically, works with linear and quadratic functions, and doesn't require you to provide the coordinates. In matplotlib, you can only plot if you are given the coordinates. Does anyone know how to do this with matplotlib?
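The usual matplotlib substitute for ezplot is to evaluate the decision function on a dense grid and draw its zero level set with contour. A sketch with an invented quadratic decision function standing in for the classifier:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; drop this for on-screen plots
import matplotlib.pyplot as plt

# Hypothetical quadratic decision function; the boundary is its zero level set
def decision(x, y):
    return x ** 2 + 0.5 * y ** 2 - 1.0

xx, yy = np.meshgrid(np.linspace(-2, 2, 300), np.linspace(-2, 2, 300))
zz = decision(xx, yy)

fig, ax = plt.subplots()
ax.contour(xx, yy, zz, levels=[0], colors="k")  # draw only the boundary
ax.set_aspect("equal")
```

Unlike ezplot you supply the grid yourself, but since contour only traces where the function changes sign, you never need the boundary's explicit equation.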