Abhimanyu Dayal
Written in the context of #1319: DFFML could build a centralised module for implementing datasets. That is, at present, in order to implement a new dataset, we need to write...
I noticed that mlpack doesn't currently have a binding for Gradient Boosting. To give a brief description of Gradient Boosting: it's an ensemble technique that combines weak learners such as...
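To make the idea behind the technique concrete, here is a minimal, self-contained sketch of gradient boosting for regression, using one-level decision stumps as the weak learners. This is purely illustrative pseudocode-made-runnable, not mlpack's implementation or API; all function names are made up.

```python
# Illustrative gradient boosting for regression with decision stumps
# as weak learners. Not mlpack's API; names are hypothetical.

def fit_stump(X, y):
    """Find the single-feature threshold split minimizing SSE."""
    best = None
    for j in range(len(X[0])):
        for t in sorted(set(row[j] for row in X)):
            left = [y[i] for i, row in enumerate(X) if row[j] <= t]
            right = [y[i] for i, row in enumerate(X) if row[j] > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((v - lm) ** 2 for v in left)
                   + sum((v - rm) ** 2 for v in right))
            if best is None or sse < best[0]:
                best = (sse, j, t, lm, rm)
    _, j, t, lm, rm = best
    return lambda row: lm if row[j] <= t else rm

def gradient_boost(X, y, rounds=20, lr=0.3):
    base = sum(y) / len(y)              # start from the mean prediction
    pred = [base] * len(X)
    stumps = []
    for _ in range(rounds):
        # For squared error, the negative gradient is just the residual.
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(X, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(row) for p, row in zip(pred, X)]
    return lambda row: base + lr * sum(s(row) for s in stumps)

X = [[0.0], [1.0], [2.0], [3.0]]
y = [0.0, 0.0, 1.0, 1.0]
model = gradient_boost(X, y)
```

Each round fits a new weak learner to the residuals of the current ensemble, so the combined model successively corrects its own errors.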
XGBoost (eXtreme Gradient Boosting) is an optimized implementation of Gradient Boosting. Working forward from PR #3735, where I implement Gradient Boosting.
Started working on new loss functions more specific to XGBoost, continuing from PR #3747.
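XGBoost-style losses are typically defined by the first and second derivatives (gradient and hessian) of the loss with respect to the raw score, since the split gain uses both. A minimal sketch for the logistic loss, with illustrative names that are not mlpack's:

```python
# Sketch of the gradient/hessian pair an XGBoost-style loss exposes.
# Logistic loss for binary classification; derivatives are taken with
# respect to the raw score (log-odds). Names are hypothetical.
import math

def sigmoid(score):
    return 1.0 / (1.0 + math.exp(-score))

def logistic_loss(y, score):
    p = sigmoid(score)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def logistic_grad(y, score):
    # First derivative: predicted probability minus the label.
    return sigmoid(score) - y

def logistic_hess(y, score):
    # Second derivative: p * (1 - p).
    p = sigmoid(score)
    return p * (1.0 - p)
```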
SSE_loss already existed in the xgboost directory. I moved it to the decision tree directory and included it as a gain function for split selection. All the xgboost loss functions are...
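For context on what "SSE as a gain function" means, here is a hedged sketch: a split's gain is the reduction in sum-of-squared-errors when a node's responses are partitioned into left and right children. Function names are illustrative, not the actual mlpack identifiers.

```python
# Illustrative SSE-based split gain, as used by regression trees.
# Not mlpack's actual implementation.

def sse(values):
    """Sum of squared errors around the mean of the node."""
    if not values:
        return 0.0
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def sse_gain(parent, left, right):
    # Positive gain means the split reduces squared error.
    return sse(parent) - sse(left) - sse(right)
```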
Continuing from PR #3725. I needed to close that one because of some branch issues.
Cherry-picked from PR #3736. After training, xgboost can calculate feature importance to quantify the contribution of each feature to the classification decision. XGBoost provides two main types of feature importance...
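The excerpt is truncated, but two common importance types in XGBoost are "weight" (how often a feature is split on across the ensemble) and "gain" (the total loss reduction those splits contribute). A sketch under that assumption, with a made-up split-record encoding:

```python
# Hedged sketch of two feature-importance types, assuming "weight"
# (split count) and "gain" (total gain). Each split record is a
# hypothetical (feature_index, split_gain) pair, not mlpack's layout.
from collections import defaultdict

def feature_importance(split_records):
    weight = defaultdict(int)    # times each feature is split on
    gain = defaultdict(float)    # total gain contributed per feature
    for feature, split_gain in split_records:
        weight[feature] += 1
        gain[feature] += split_gain
    return dict(weight), dict(gain)

# Example: feature 0 used in two splits, feature 2 in one.
splits = [(0, 1.2), (0, 0.3), (2, 0.9)]
w, g = feature_importance(splits)
```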
Adding a tree-pruning method to decision trees as part of the xgboost implementation (PR #3736).
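One common form of pruning in gradient-boosted trees is bottom-up post-pruning: collapse any split whose gain falls below a threshold (analogous to XGBoost's gamma parameter). A minimal sketch with a hypothetical dict-based node layout, not mlpack's tree structure:

```python
# Illustrative bottom-up pruning: remove splits whose gain is below
# a threshold, collapsing the node into a leaf. Node layout is
# hypothetical: leaves are {'value': v}; internal nodes also carry
# 'gain', 'left', and 'right'.

def prune(node, min_gain):
    if 'left' not in node:          # already a leaf
        return node
    node['left'] = prune(node['left'], min_gain)
    node['right'] = prune(node['right'], min_gain)
    # Collapse only when both children are leaves and the split's
    # gain does not justify keeping it.
    if ('left' not in node['left'] and 'left' not in node['right']
            and node['gain'] < min_gain):
        return {'value': node['value']}
    return node

tree = {'gain': 0.05, 'value': 0.5,
        'left': {'value': 0.0}, 'right': {'value': 1.0}}
pruned = prune(tree, min_gain=0.1)
```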