KTBoost: Combined Kernel and Tree Boosting

02/11/2019
by Fabio Sigrist, et al.

In this article, we introduce a novel boosting algorithm called 'KTBoost', which combines kernel boosting and tree boosting. In each boosting iteration, the algorithm adds either a regression tree or a reproducing kernel Hilbert space (RKHS) regression function to the ensemble of base learners. Intuitively, the idea is that discontinuous trees and continuous RKHS regression functions complement each other, and that this combination allows for better learning of both continuous and discontinuous functions, as well as functions that exhibit parts with varying degrees of regularity. We empirically show that KTBoost outperforms both tree and kernel boosting in terms of predictive accuracy on a wide array of data sets.
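The per-iteration choice between a tree and an RKHS base learner can be made concrete with a small sketch. The following is a minimal illustration of the idea for squared-error loss, not the authors' implementation: each iteration fits both a shallow regression tree and a kernel ridge regressor to the current residuals and keeps whichever base learner reduces the training loss more. The hyperparameters (max_depth, alpha, gamma, learning_rate) are illustrative assumptions.

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.tree import DecisionTreeRegressor


    def ktboost_fit(X, y, n_iter=100, learning_rate=0.1):
        """Boosting for squared-error loss with tree and RKHS base learners."""
        pred = np.full(len(y), y.mean(), dtype=float)
        ensemble = [y.mean()]                  # index 0: initial constant prediction
        for _ in range(n_iter):
            residual = y - pred                # negative gradient of the squared loss
            tree = DecisionTreeRegressor(max_depth=3).fit(X, residual)
            kern = KernelRidge(alpha=1.0, kernel="rbf", gamma=0.1).fit(X, residual)
            # Keep whichever base learner fits the current residuals better.
            best = min((tree, kern),
                       key=lambda m: np.mean((residual - m.predict(X)) ** 2))
            pred += learning_rate * best.predict(X)
            ensemble.append(best)
        return ensemble


    def ktboost_predict(ensemble, X, learning_rate=0.1):
        pred = np.full(X.shape[0], ensemble[0], dtype=float)
        for model in ensemble[1:]:
            pred += learning_rate * model.predict(X)
        return pred

Selecting the better-fitting learner in each iteration is one simple way to let the discontinuous tree and the smooth RKHS function complement each other, as described in the abstract; usage would look like ensemble = ktboost_fit(X_train, y_train) followed by y_hat = ktboost_predict(ensemble, X_test).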

