Boosting as a kernel-based method

08/08/2016
by Aleksandr Y. Aravkin, et al.

Boosting combines weak (biased) learners to obtain effective learning algorithms for classification and prediction. In this paper, we show a connection between boosting and kernel-based methods, highlighting both theoretical and practical applications. In the context of ℓ_2 boosting, we start with a weak linear learner defined by a kernel K. We show that boosting with this learner is equivalent to estimation with a special boosting kernel that depends on K, as well as on the regression matrix, noise variance, and hyperparameters. The number of boosting iterations is modeled as a continuous hyperparameter, and fit along with other parameters using standard techniques. We then generalize the boosting kernel to a broad new class of boosting approaches for more general weak learners, including those based on the ℓ_1, hinge and Vapnik losses. The approach allows fast hyperparameter tuning for this general class, and has a wide range of applications, including robust regression and classification. We illustrate some of these applications with numerical examples on synthetic and real data.
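The equivalence the abstract describes can be seen concretely for ℓ_2 boosting with a linear weak learner. The sketch below (an illustrative assumption, not the paper's exact construction) uses a heavily regularized ridge smoother S as the weak learner, runs m boosting iterations on the residuals, and verifies that the result matches the single closed-form smoother I - (I - S)^m — i.e., boosting acts like estimation with one combined "boosting" operator whose strength is governed by m:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
X = rng.standard_normal((n, 5))
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(n)

# Weak (biased) linear learner: a heavily regularized ridge smoother S.
lam = 50.0
S = X @ np.linalg.solve(X.T @ X + lam * np.eye(5), X.T)

# l2 boosting: repeatedly fit the weak learner to the current residual.
m = 10
fit = np.zeros(n)
for _ in range(m):
    fit += S @ (y - fit)

# Closed form: m boosting steps equal the single smoother I - (I - S)^m,
# since the residual contracts as (I - S)^m y.
I = np.eye(n)
S_boost = I - np.linalg.matrix_power(I - S, m)
print(np.allclose(fit, S_boost @ y))  # the two estimates coincide
```

Treating m as a continuous hyperparameter of S_boost, as the paper proposes, then lets the number of boosting iterations be tuned alongside λ and the noise variance by standard techniques.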

