Fully-Corrective Gradient Boosting with Squared Hinge: Fast Learning Rates and Early Stopping

04/01/2020
by   Jinshan Zeng, et al.

Boosting is a well-known method for improving the accuracy of weak learners in machine learning. However, a theoretical generalization guarantee for it has been missing from the literature. In this paper, we propose an efficient boosting method with theoretical generalization guarantees for binary classification. The three key ingredients of the proposed method are: (a) a fully-corrective greedy (FCG) update in the boosting procedure, (b) a differentiable squared hinge (also called truncated quadratic) function as the loss, and (c) an efficient alternating direction method of multipliers (ADMM) algorithm for the associated FCG optimization. The squared hinge loss not only inherits the robustness of the well-known hinge loss for classification with outliers, but also brings benefits for computational implementation and theoretical justification. Under a sparseness assumption, we derive a fast learning rate of order O((m/log m)^{-1/4}) for the proposed method, where m is the sample size; this rate improves to O((m/log m)^{-1/2}) under an additional noise assumption. Both derived learning rates are the best among existing generalization results for boosting-type classification methods. Moreover, an efficient early stopping scheme is provided for the proposed method. A series of toy simulations and real-data experiments verify the developed theory and demonstrate the effectiveness of the proposed method.
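To make the abstract's three ingredients concrete, below is a minimal, hypothetical Python sketch of fully-corrective gradient boosting with the squared hinge loss phi(t) = max(0, 1 - t)^2. It is not the paper's implementation: it assumes decision stumps as the weak learners and replaces the paper's ADMM solver with plain gradient descent for the fully-corrective refit; all names (fcg_boost, stump_predictions, and so on) are invented for illustration.

```python
import numpy as np

def squared_hinge_grad(margin):
    # phi(t) = max(0, 1 - t)^2 is differentiable, unlike the plain hinge;
    # its derivative is phi'(t) = -2 * max(0, 1 - t)
    return -2.0 * np.maximum(0.0, 1.0 - margin)

def stump_predictions(X, feature, threshold):
    # decision stump: -1 if the feature is below the threshold, +1 otherwise
    return np.where(X[:, feature] <= threshold, -1.0, 1.0)

def fcg_boost(X, y, n_rounds=50, inner_steps=200, lr=0.01):
    """Fully-corrective greedy boosting sketch with squared hinge loss.

    Each round greedily picks the stump most correlated with the negative
    functional gradient, then re-optimizes ALL coefficients jointly (the
    fully-corrective step) -- here by gradient descent rather than the
    ADMM algorithm used in the paper.
    """
    m, d = X.shape
    stumps, coefs = [], np.zeros(0)
    F = np.zeros(m)  # current ensemble output on the training set
    for _ in range(n_rounds):
        # negative functional gradient of (1/m) sum phi(y_i F(x_i)) w.r.t. F
        g = -y * squared_hinge_grad(y * F)
        # greedy step: stump whose output best matches g
        best, best_score = None, -np.inf
        for j in range(d):
            for t in np.unique(X[:, j]):
                h = stump_predictions(X, j, t)
                score = abs(h @ g)
                if score > best_score:
                    best_score, best = score, (j, t)
        stumps.append(best)
        coefs = np.append(coefs, 0.0)
        # fully-corrective step: jointly refit every coefficient so far
        H = np.column_stack([stump_predictions(X, j, t) for j, t in stumps])
        for _ in range(inner_steps):
            margins = y * (H @ coefs)
            grad = H.T @ (y * squared_hinge_grad(margins)) / m
            coefs -= lr * grad
        F = H @ coefs
    return stumps, coefs

# usage on synthetic binary data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
stumps, coefs = fcg_boost(X, y, n_rounds=10)
```

The joint refit distinguishes the fully-corrective update from ordinary (stagewise) gradient boosting, which fixes earlier coefficients once chosen; the paper's early stopping scheme would cap n_rounds in a data-driven way.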


