Functional Frank-Wolfe Boosting for General Loss Functions

10/09/2015
by Chu Wang, et al.

Boosting is a generic learning method for classification and regression. Yet, as the number of base hypotheses grows, boosting can suffer a deterioration of test performance. Overfitting is an important and ubiquitous phenomenon, especially in regression settings. To avoid overfitting, we consider using l_1 regularization. We propose a novel Frank-Wolfe-type boosting algorithm (FWBoost) that applies to general loss functions. With the exponential loss, the FWBoost algorithm can be rewritten as a variant of AdaBoost for binary classification. FWBoost algorithms have exactly the same form as existing boosting methods: they make calls to a base learning algorithm, differing only in the weight updates. This direct connection between boosting and Frank-Wolfe yields a new algorithm that is as practical as existing boosting methods but comes with new guarantees and rates of convergence. Experimental results show that the test performance of FWBoost does not degrade as the number of boosting rounds increases, which is consistent with the theoretical analysis.
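The abstract does not include pseudocode, so the following is only an illustrative sketch of the general idea it describes: a Frank-Wolfe step over an l_1-constrained ensemble, shown here for squared loss with a hypothetical decision-stump base learner. The function names, the stump learner, and the choice of squared loss are my assumptions for the sketch, not the authors' implementation.

```python
import numpy as np

def stump_fit(X, r):
    """Hypothetical base learner: return the +/-1-valued decision stump
    whose predictions correlate most strongly with the residual vector r."""
    best_score, best = -1.0, None
    for j in range(X.shape[1]):
        for thr in X[:, j]:
            h = np.where(X[:, j] <= thr, 1.0, -1.0)
            score = abs(r @ h)
            if score > best_score:
                best_score, best = score, (j, thr)
    j, thr = best
    return lambda Z: np.where(Z[:, j] <= thr, 1.0, -1.0)

def fwboost_sq(X, y, rounds=200, delta=5.0):
    """Sketch of a Frank-Wolfe-style boosting loop for squared loss.
    The ensemble is constrained to an l_1 ball of radius delta in
    function space, which provides the regularization the paper uses
    to combat overfitting."""
    preds = np.zeros(len(y))
    for t in range(rounds):
        residual = y - preds              # negative functional gradient
        h = stump_fit(X, residual)        # call to the base learner
        hx = h(X)
        # Linear minimization over the l_1 ball: the minimizing vertex
        # is +/- delta times the fitted base hypothesis.
        s = delta * np.sign(residual @ hx) * hx
        gamma = 2.0 / (t + 2.0)           # classic Frank-Wolfe step size
        preds = (1.0 - gamma) * preds + gamma * s
    return preds
```

As in the boosting methods the abstract compares against, each round makes one call to the base learner; only the update of the ensemble (a convex combination with the Frank-Wolfe step size, rather than an additive update) differs.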


Related research:

- 01/19/2021, Unified Robust Boosting: Boosting is a popular machine learning algorithm in regression and class...
- 08/09/2018, Gradient and Newton Boosting for Classification and Regression: Boosting algorithms enjoy large popularity due to their high predictive ...
- 09/28/2022, TRBoost: A Generic Gradient Boosting Machine based on Trust-region Method: A generic Gradient Boosting Machine called Trust-region Boosting (TRBoos...
- 02/14/2012, Boosting as a Product of Experts: In this paper, we derive a novel probabilistic model of boosting as a Pr...
- 09/16/2020, Kernel-based L_2-Boosting with Structure Constraints: Developing efficient kernel methods for regression is very popular in th...
- 05/16/2015, A New Perspective on Boosting in Linear Regression via Subgradient Optimization and Relatives: In this paper we analyze boosting algorithms in linear regression from a...
- 11/14/2020, MP-Boost: Minipatch Boosting via Adaptive Feature and Observation Sampling: Boosting methods are among the best general-purpose and off-the-shelf ma...
