Statistical Guarantees and Algorithmic Convergence Issues of Variational Boosting

10/19/2020
by Biraj Subhra Guha et al.

We provide statistical guarantees for Bayesian variational boosting by proposing a novel small-bandwidth Gaussian mixture variational family. We employ a functional version of Frank-Wolfe optimization as our variational algorithm and study the frequentist properties of the iterative boosting updates. We draw comparisons with the recent boosting literature, describing how the choice of variational family and discrepancy measure affects both the convergence and the finite-sample statistical properties of the optimization routine. Specifically, we first establish stochastic boundedness of the boosting iterates with respect to the data-generating distribution. We then integrate this result into our algorithm to obtain an explicit convergence rate, concluding with a bound on the required number of boosting updates.
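To make the flavor of the algorithm concrete, the following is a minimal one-dimensional sketch of a Frank-Wolfe-style boosting update over a small-bandwidth Gaussian mixture family. It is not the paper's exact construction: the target density, the fixed bandwidth `sigma`, the grid search over candidate component means, and the classic `2/(t+2)` step size are all illustrative assumptions. At each iteration, the Frank-Wolfe linear subproblem is approximated by scoring each candidate component against the linearized KL objective (maximizing the expected value of `log p - log q` under the candidate), and the winner is mixed into the current approximation.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def mixture_pdf(x, weights, means, sigma):
    """Density of the current Gaussian mixture iterate q_t."""
    return sum(w * normal_pdf(x, m, sigma) for w, m in zip(weights, means))

def fw_boosting(log_p, n_iters=20, sigma=0.1, grid=np.linspace(-5, 5, 201)):
    """Greedy Frank-Wolfe boosting sketch (illustrative, not the paper's method).

    Each step selects the small-bandwidth Gaussian component that best
    decreases a grid-approximated, linearized KL surrogate, then forms the
    convex combination q_{t+1} = (1 - gamma_t) q_t + gamma_t s_t.
    """
    # Initialize with a single component at the (grid-approximate) mode of p.
    weights, means = [1.0], [grid[int(np.argmax(log_p(grid)))]]
    for t in range(1, n_iters):
        q = mixture_pdf(grid, weights, means, sigma) + 1e-12
        # Frank-Wolfe vertex: maximize the Riemann-sum approximation of
        # E_s[log p - log q] over candidate component means s = N(m, sigma^2).
        scores = [np.sum(normal_pdf(grid, m, sigma) * (log_p(grid) - np.log(q)))
                  for m in grid]
        new_mean = grid[int(np.argmax(scores))]
        gamma = 2.0 / (t + 2.0)  # standard Frank-Wolfe step size
        weights = [(1.0 - gamma) * w for w in weights] + [gamma]
        means.append(new_mean)
    return np.array(weights), np.array(means)
```

Because each update is a convex combination, the mixture weights sum to one at every iteration, and the number of components grows by exactly one per boosting step, which is why the required number of updates directly controls the complexity of the final approximation.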


