Unified Robust Boosting

01/19/2021
by   Zhu Wang, et al.

Boosting is a popular machine learning algorithm for regression and classification problems. It combines a sequence of regression trees to obtain accurate predictions. In the presence of outliers, however, traditional boosting, which is based on optimizing convex loss functions, can give inferior results. In this article, a unified robust boosting method is proposed for more resistant estimation. The method combines a recently developed concave-convex family for robust estimation, composite optimization by conjugation operator, and functional descent boosting. As a result, an iteratively reweighted boosting algorithm can be conveniently constructed with existing software. Applications to robust regression, classification, and Poisson regression are demonstrated in the R package ccboost.
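The iteratively reweighted idea in the abstract can be illustrated with a minimal sketch. This is not the authors' ccboost implementation: the function names (irco_boost, fit_stump), the Welsch-type concave downweighting, and all parameter choices below are illustrative assumptions. An outer loop recomputes observation weights from the current residuals so that gross outliers are downweighted, and an inner loop runs ordinary weighted L2-boosting with regression stumps, mirroring how robust boosting can be built from existing convex-loss boosting software.

```python
import numpy as np

def fit_stump(x, r, w):
    """Weighted least-squares regression stump on a single feature."""
    best_sse, best = np.inf, None
    for t in np.unique(x)[:-1]:
        left = x <= t
        # skip splits where one side carries no weight at all
        if w[left].sum() == 0 or w[~left].sum() == 0:
            continue
        cl = np.average(r[left], weights=w[left])
        cr = np.average(r[~left], weights=w[~left])
        sse = np.sum(w * np.where(left, (r - cl) ** 2, (r - cr) ** 2))
        if sse < best_sse:
            best_sse, best = sse, (t, cl, cr)
    return best

def predict_stump(stump, x):
    t, cl, cr = stump
    return np.where(x <= t, cl, cr)

def irco_boost(x, y, outer=5, inner=50, nu=0.1, sigma=1.0):
    """Illustrative iteratively reweighted boosting for robust regression.

    Outer loop: recompute robustness weights from current residuals
    (a Welsch-type weight stands in for the derivative of a concave
    function in the concave-convex family). Inner loop: weighted
    L2-boosting with decision stumps and learning rate nu.
    """
    F = np.full_like(y, np.mean(y))
    for _ in range(outer):
        r = y - F
        # concave downweighting: large residuals get weight near zero
        w = np.exp(-(r ** 2) / (2 * sigma ** 2))
        # restart the boosting fit under the new weights
        F = np.full_like(y, np.average(y, weights=w))
        for _ in range(inner):
            s = fit_stump(x, y - F, w)
            F = F + nu * predict_stump(s, x)
    return F
```

On a toy step function with one gross outlier, the reweighting keeps the fitted values near the uncontaminated signal, whereas an unweighted squared-error fit would be pulled toward the outlier.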


Related research:

- Unified Robust Estimation via the COCO (10/06/2020)
- Boosting in the presence of outliers: adaptive classification with non-convex loss functions (10/05/2015)
- Functional Frank-Wolfe Boosting for General Loss Functions (10/09/2015)
- Robust Boosting for Regression Problems (02/06/2020)
- A Distributionally Robust Boosting Algorithm (05/20/2019)
- What killed the Convex Booster? (05/19/2022)
- AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods (07/04/2013)
