Totally Corrective Boosting for Regularized Risk Minimization

08/30/2010
by Chunhua Shen, et al.

Considering the primal and dual problems together leads to important new insights into the characteristics of boosting algorithms. In this work, we propose a general framework that can be used to design new boosting algorithms. A wide variety of machine learning problems essentially amount to minimizing a regularized risk functional. We show that the proposed boosting framework, termed CGBoost, can accommodate various loss functions and different regularizers in a totally-corrective optimization fashion. We show that, by solving the primal rather than the dual, a large body of totally-corrective boosting algorithms can be solved efficiently, with no need for sophisticated convex optimization solvers. We also demonstrate that some boosting algorithms, such as AdaBoost, can be interpreted in our framework even though their optimization is not totally corrective. We empirically show that various boosting algorithms based on the proposed framework perform similarly on the UC Irvine machine learning datasets [1] used in our experiments.
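The totally-corrective idea is straightforward to illustrate in code. The sketch below is not the paper's CGBoost algorithm, but a minimal illustration under assumed choices (exponential loss, an L1 regularizer, and a general-purpose solver from scipy): at each round a weak learner is selected by its edge, playing the role of the column-generation step, and then the primal regularized risk is re-minimized over the weights of all weak learners selected so far. All function and parameter names are hypothetical.

import numpy as np
from scipy.optimize import minimize

def totally_corrective_boost(X, y, weak_learners, n_rounds=10, reg=0.1):
    """Sketch of totally-corrective boosting.
    y: labels in {-1, +1}; weak_learners: callables h(X) -> {-1, +1}."""
    H = []                                   # predictions of chosen weak learners
    chosen = []
    sample_w = np.full(len(y), 1.0 / len(y)) # uniform sample weights to start
    w = np.zeros(0)
    for _ in range(n_rounds):
        # select the weak learner with the largest edge under current sample weights
        edges = [np.sum(sample_w * y * h(X)) for h in weak_learners]
        j = int(np.argmax(edges))
        chosen.append(weak_learners[j])
        H.append(weak_learners[j](X))
        Hm = np.column_stack(H)

        # totally-corrective step: re-optimize ALL coefficients by minimizing
        # the L1-regularized exponential risk (the primal problem)
        def risk(w):
            margins = y * (Hm @ w)
            return np.mean(np.exp(-margins)) + reg * np.sum(w)

        res = minimize(risk, np.zeros(Hm.shape[1]),
                       bounds=[(0, None)] * Hm.shape[1])
        w = res.x

        # update sample weights from the gradient of the exponential loss
        sample_w = np.exp(-y * (Hm @ w))
        sample_w /= sample_w.sum()
    return chosen, w

By contrast, a stage-wise method such as AdaBoost would adjust only the newest coefficient in each round; the re-optimization over all coefficients is what makes the update totally corrective.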

