How to calculate feature contributions of XGBoost and LightGBM models? - xgboost

It is possible to extract feature contributions of a Random Forest model with the Python library "treeinterpreter", as described at http://blog.datadive.net/random-forest-interpretation-with-scikit-learn/.
Is there a similar way to calculate feature contributions of an XGBoost or LightGBM model? Thank you!

Related

How to apply a higher weight to a feature in BQML?

I'm attempting to train an XGBoost classification model using BQML, but I'd like to give one feature a higher weight. I couldn't find any documentation about assigning feature weights. There is CLASS_WEIGHTS to assign weights to class labels, but that is not what I want. BQML documentation.
It seems this feature is not available yet, and I may have to build the models by hand using sklearn.

Random forest via TensorFlow 2.3: is it possible?

I would like to implement random forest regression via TensorFlow 2.3, but I cannot find any example for it. Is it possible to do random forest regression with TensorFlow 2.3?
The same problem applies to SVM and SVR.
I cannot use sklearn, because the running system has to use Go. Maybe I could do the random forest regression via sklearn, but how would I read the model with TensorFlow? I think it is not possible.

Is there a Tensorflow or Keras equivalent to fastai's interp.plot_top_losses?

Is there a Tensorflow or Keras equivalent to fastai's interp.plot_top_losses? If not, how can I manually obtain the predictions with the greatest loss?
Thank you.
I found the answer: it is ktrain! It comes with a learning rate finder, learning rate schedules, ready-to-use pre-trained models, and many more features inspired by fastai.
https://github.com/amaiya/ktrain
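Without ktrain, the manual route is straightforward: compute the loss per sample with no reduction, then sort. A framework-agnostic numpy sketch with stand-in predictions (substitute your model's `predict` output and your true labels):

```python
import numpy as np

rng = np.random.default_rng(0)
# stand-ins: `probs` would come from model.predict(X), `y` are the true labels
probs = rng.dirichlet(np.ones(3), size=64)
y = rng.integers(0, 3, size=64)

# unreduced categorical cross-entropy, one loss value per sample
per_sample_loss = -np.log(probs[np.arange(len(y)), y] + 1e-12)

# indices of the k worst predictions, analogous to plot_top_losses
k = 5
top_idx = np.argsort(per_sample_loss)[::-1][:k]
```

In Keras the same unreduced losses can be obtained by constructing the loss with `reduction` disabled instead of the default mean over the batch.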

Feature importance from tf.estimator.BoostedTreesRegressor

I am trying to extract feature importances from a model built in Python using tf.estimator.BoostedTreesRegressor.
The standard way to achieve this is to iterate over all trees in the ensemble and compute statistics from each tree's split importances.
There are examples in sklearn and xgboost, but I have not found how to do this in TensorFlow.
This is not possible at the moment using TensorFlow's premade BoostedTreesRegressor or Classifier Estimators.

Tensorflow: Training a model in C++

Can I train a model in C++ with TensorFlow? I don't see any optimizers exposed in its C++ API. Are the optimizers written in Python? If not, how can I train a graph in C++? I'm able to import a Python-trained graph in C++, but I want to write the code fully in C++ (training and inference).
I have found an example training file from the official repository
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/cc/tutorials/example_trainer.cc
I believe this is only basic training of some sort, without access to all the optimizers that the Python API has. I will keep looking around for more info.
Auto-differentiation is currently not implemented in TensorFlow's C++ API, so training complex models in C++ is a huge task. They say they are working on it: https://github.com/tensorflow/tensorflow/issues/4130