
Random Forest and Ensemble Learning

Open jk1015 opened this issue 3 years ago • 3 comments

I've recently been looking into adding Random Forest to linfa. Since Ensemble Learning is on the roadmap anyway, I think the best way to do this would be to add Bootstrap Aggregation for any classifier rather than specialising the implementation to Decision Trees. I'm not entirely sure what the design should look like, though, especially since there don't seem to be any fixed conventions for implementing classifiers in linfa.

Would general bootstrap aggregation be a useful addition? If so, I'm interested in others' opinions on how this should interface with existing/future classifiers in linfa, along with any other design considerations.

jk1015 avatar Feb 16 '22 19:02 jk1015

In impl_dataset.rs we already have bootstrapping code that produces sub-samples from a dataset. We just need a generalized way of fitting classifiers over those sub-samples. We have the trait linfa::traits::Fit, which represents training a model from a set of hyperparameters, and linfa::traits::PredictInplace, which represents prediction using a trained model. You can define a new ensemble classifier that's generic over these traits, similar to how cross_validate is defined. Its Fit impl fits its "inner" classifier/regressor multiple times over the sub-samples, and its Predict impl averages/votes on the predictions made across its inner models.
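
For illustration, here's a minimal standalone sketch of that pattern. The `Fit` and `Predict` traits, `EnsembleParams`, and `EnsembleModel` below are hypothetical, simplified stand-ins for `linfa::traits::Fit` and `linfa::traits::PredictInplace` (the real traits are generic over dataset, record, and error types), and the index-resampling loop is a dependency-free placeholder for the bootstrap helpers in `impl_dataset.rs`:

```rust
use std::collections::HashMap;

/// Simplified stand-in for linfa::traits::Fit: hyperparameters that can be
/// fitted on data to produce a trained model.
trait Fit {
    type Model: Predict;
    fn fit(&self, records: &[Vec<f64>], targets: &[usize]) -> Self::Model;
}

/// Simplified stand-in for linfa::traits::PredictInplace: a trained model
/// that maps records to class labels.
trait Predict {
    fn predict(&self, records: &[Vec<f64>]) -> Vec<usize>;
}

/// Hyperparameters of the ensemble: the number of sub-models plus the
/// hyperparameters of the inner classifier it is generic over.
struct EnsembleParams<P> {
    n_models: usize,
    inner: P,
}

/// A trained ensemble is just the collection of trained inner models.
struct EnsembleModel<M> {
    models: Vec<M>,
}

impl<P: Fit> EnsembleParams<P> {
    /// Fit the inner classifier once per bootstrap sub-sample.
    fn fit(&self, records: &[Vec<f64>], targets: &[usize]) -> EnsembleModel<P::Model> {
        let models = (0..self.n_models)
            .map(|seed| {
                // Deterministic placeholder for drawing a bootstrap sub-sample;
                // a real implementation would resample with replacement using
                // the dataset's bootstrap helpers and a proper RNG.
                let idx: Vec<usize> = (0..records.len())
                    .map(|i| i.wrapping_mul(31).wrapping_add(seed.wrapping_mul(17)) % records.len())
                    .collect();
                let sub_records: Vec<Vec<f64>> = idx.iter().map(|&i| records[i].clone()).collect();
                let sub_targets: Vec<usize> = idx.iter().map(|&i| targets[i]).collect();
                self.inner.fit(&sub_records, &sub_targets)
            })
            .collect();
        EnsembleModel { models }
    }
}

impl<M: Predict> EnsembleModel<M> {
    /// Predict by majority vote across the inner models.
    fn predict(&self, records: &[Vec<f64>]) -> Vec<usize> {
        // Each inner model predicts over the full input once.
        let all_preds: Vec<Vec<usize>> = self.models.iter().map(|m| m.predict(records)).collect();
        // Majority vote per sample.
        (0..records.len())
            .map(|i| {
                let mut votes: HashMap<usize, usize> = HashMap::new();
                for preds in &all_preds {
                    *votes.entry(preds[i]).or_insert(0) += 1;
                }
                votes
                    .into_iter()
                    .max_by_key(|&(_, count)| count)
                    .map(|(label, _)| label)
                    .unwrap()
            })
            .collect()
    }
}
```

Because the ensemble is only generic over the fitting and prediction traits, Random Forest falls out as the special case where the inner classifier is a decision tree, and the same wrapper could bag any other classifier (or average predictions for a regressor).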

YuhanLiin avatar Feb 18 '22 04:02 YuhanLiin

Here is a WIP PR for RF

https://github.com/rust-ml/linfa/pull/43

EricTulowetzke avatar Aug 03 '22 05:08 EricTulowetzke

The work from that PR on ensemble learning ended up in #66, which didn't pan out for some reason. The current work is in #229.

YuhanLiin avatar Aug 07 '22 01:08 YuhanLiin