Minimal Variance Sampling in Stochastic Gradient Boosting

10/29/2019
by   Bulat Ibragimov, et al.

Stochastic Gradient Boosting (SGB) is a widely used approach to regularizing boosting models based on decision trees. It has been shown that, in many cases, random sampling at each iteration leads to better generalization performance and can also decrease learning time. Various non-uniform sampling schemes have been proposed, but it remains unclear which is the most effective. In this paper, we formulate randomization in SGB as an optimization problem over sampling probabilities: maximize the estimation accuracy of the split scoring used to train decision trees. This problem admits a closed-form, nearly optimal solution, which yields a new sampling technique we call Minimal Variance Sampling (MVS). The method both decreases the number of examples needed at each boosting iteration and significantly improves model quality compared to state-of-the-art sampling methods. The superiority of the algorithm was confirmed by introducing MVS as the new default subsampling option in CatBoost, a gradient boosting library achieving state-of-the-art quality on various machine learning tasks.
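The closed-form solution described in the abstract amounts to keeping each training example with a probability proportional to a regularized gradient score, capped at one, and reweighting kept examples by the inverse probability so that split-score estimates stay unbiased. The sketch below illustrates this idea with NumPy; the score `sqrt(g^2 + lam * h^2)`, the bisection for the threshold `mu`, and the function names are our own simplified rendering, not the paper's or CatBoost's actual implementation.

```python
import numpy as np

def mvs_probabilities(grads, hess, sample_rate, lam=0.1):
    """Per-example keep probabilities for an MVS-style sampler (sketch).

    Each example gets probability min(1, s_i / mu), where s_i is a
    regularized gradient score and mu is chosen so that the expected
    sample size equals sample_rate * n.
    """
    s = np.sqrt(np.asarray(grads) ** 2 + lam * np.asarray(hess) ** 2)
    n_target = sample_rate * s.size
    # f(mu) = sum(min(1, s/mu)) is decreasing in mu; bisect for the root.
    lo, hi = 0.0, s.sum() / n_target  # f(hi) <= n_target by construction
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        if np.minimum(1.0, s / mu).sum() > n_target:
            lo = mu
        else:
            hi = mu
    return np.minimum(1.0, s / mu)

def mvs_sample(grads, hess, sample_rate=0.5, lam=0.1, rng=None):
    """Draw a subsample; return kept indices and 1/p importance weights."""
    rng = np.random.default_rng() if rng is None else rng
    p = mvs_probabilities(grads, hess, sample_rate, lam)
    keep = rng.random(p.size) < p
    # Reweighting by 1/p keeps gradient statistics unbiased in expectation.
    return np.flatnonzero(keep), 1.0 / p[keep]
```

Note how an example with a very large gradient is kept deterministically (probability capped at 1), while low-gradient examples are kept rarely but with large compensating weights; this is what drives the variance reduction relative to uniform subsampling.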


Related research:

- Tree-Structured Boosting: Connections Between Gradient Boosted Stumps and Full Decision Trees (11/18/2017)
- Boosting as a Product of Experts (02/14/2012)
- Improving Data Quality with Training Dynamics of Gradient Boosting Decision Trees (10/20/2022)
- Accelerated Gradient Boosting (03/06/2018)
- Obtaining Calibrated Probabilities from Boosting (07/04/2012)
- Task-wise Split Gradient Boosting Trees for Multi-center Diabetes Prediction (08/16/2021)
- MixBoost: A Heterogeneous Boosting Machine (06/17/2020)
