Boulevard: Regularized Stochastic Gradient Boosted Trees and Their Limiting Distribution

06/26/2018
by   Yichen Zhou, et al.

This paper examines a novel gradient boosting framework for regression. We regularize gradient boosted trees by introducing subsampling and employing a modified shrinkage scheme so that at every boosting stage the estimate is an average of trees. The resulting algorithm, named Boulevard, is shown to converge as the number of trees grows. We also establish a central limit theorem for this limit, allowing a characterization of uncertainty for predictions. A simulation study and real-world examples provide support for both the predictive accuracy of the model and its limiting behavior.
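The averaging-style update described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: each boosting round fits a tree to residuals on a random subsample, and the ensemble is updated as a running average, F_m = ((m-1)/m) F_{m-1} + (lam/m) t_m. The learning rate `lam`, tree depth, and subsampling fraction here are illustrative choices, and the tree learner is scikit-learn's `DecisionTreeRegressor` rather than anything prescribed by the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy regression data (illustrative, not from the paper)
X = rng.uniform(-3, 3, size=(500, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.1, 500)

n_trees = 200      # number of boosting stages
lam = 0.8          # shrinkage parameter (illustrative)
subsample = 0.5    # fraction of points used per stage

F = np.zeros(len(y))   # current ensemble prediction
trees = []

for m in range(1, n_trees + 1):
    # subsampling: fit each tree on a random half of the data
    idx = rng.choice(len(y), size=int(subsample * len(y)), replace=False)
    resid = y[idx] - F[idx]   # negative gradient of squared loss
    t = DecisionTreeRegressor(max_depth=3, random_state=m).fit(X[idx], resid)
    trees.append(t)
    # averaging update: F_m = (m-1)/m * F_{m-1} + lam/m * t_m
    F = (m - 1) / m * F + lam / m * t.predict(X)

mse = np.mean((y - F) ** 2)
```

Note that, unlike standard additive boosting, the 1/m weighting means each new tree's contribution shrinks as the ensemble grows, which is what drives the convergence result described in the abstract.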


Related research

04/26/2021: Infinitesimal gradient boosting
We define infinitesimal gradient boosting as a limit of the popular tree...

03/06/2018: Accelerated Gradient Boosting
Gradient tree boosting is a prediction algorithm that sequentially produ...

06/18/2020: Uncertainty in Gradient Boosting via Ensembles
Gradient boosting is a powerful machine learning technique that is parti...

10/31/2017: TF Boosted Trees: A scalable TensorFlow based framework for gradient boosting
TF Boosted Trees (TFBT) is a new open-sourced framework for the distrib...

07/03/2020: Team voyTECH: User Activity Modeling with Boosting Trees
This paper describes our winning solution for the ECML-PKDD ChAT Discove...

10/24/2022: Data-IQ: Characterizing subgroups with heterogeneous outcomes in tabular data
High model performance, on average, can hide that models may systematica...

01/27/2021: Boost-S: Gradient Boosted Trees for Spatial Data and Its Application to FDG-PET Imaging Data
Boosting Trees are one of the most successful statistical learning appro...
