A Bayesian Boosting Model

09/10/2012
by Alexander Lorbert, et al.

We offer a novel view of AdaBoost in a statistical setting. We propose a Bayesian model for binary classification in which label noise is modeled hierarchically. Using variational inference to optimize a dynamic evidence lower bound, we derive a new boosting-like algorithm called VIBoost. We show its close connections to AdaBoost and present experimental results on four datasets.
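The paper derives VIBoost from a Bayesian treatment, but its closest relative is classical AdaBoost: iteratively fit a weak learner to a weighted sample, then upweight the misclassified points. The paper's own algorithm is not reproduced here; as background, the following is a minimal AdaBoost sketch with decision stumps on a toy 1-D dataset (the dataset, the `stump` helper, and the candidate set are all illustrative assumptions, not from the paper):

```python
import math

# Toy 1-D dataset (illustrative only): labels in {-1, +1}.
X = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
y = [+1, +1, -1, -1, +1, +1]

def stump(threshold, polarity):
    # Weak learner: predicts `polarity` if x > threshold, else -polarity.
    return lambda x: polarity if x > threshold else -polarity

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n                      # start with uniform weights
    ensemble = []                          # list of (alpha, hypothesis) pairs
    candidates = [stump(t, p) for t in X for p in (+1, -1)]
    for _ in range(rounds):
        # Pick the stump with the lowest weighted training error.
        h, err = min(
            ((h, sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi))
             for h in candidates),
            key=lambda pair: pair[1],
        )
        if err >= 0.5:                     # no weak learner beats chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        ensemble.append((alpha, h))
        # Reweight: misclassified points gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    def predict(x):
        s = sum(a * h(x) for a, h in ensemble)
        return +1 if s >= 0 else -1
    return predict

clf = adaboost(X, y, rounds=3)
print([clf(x) for x in X])  # matches y: three stumps suffice on this toy set
```

VIBoost replaces this hand-crafted exponential reweighting with updates that fall out of optimizing a variational lower bound under the hierarchical label-noise model, which is what gives the algorithm its "boosting-like" character.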


