The Bayesian Bridge

09/11/2011
by Nicholas G. Polson et al.

We propose the Bayesian bridge estimator for regularized regression and classification. Two key mixture representations for the Bayesian bridge model are developed: (1) a scale mixture of normals with respect to an alpha-stable random variable; and (2) a mixture of Bartlett–Fejér kernels (or triangle densities) with respect to a two-component mixture of gamma random variables. Both lead to MCMC methods for posterior simulation, and these methods turn out to have complementary domains of maximum efficiency. The first representation is a well-known result due to West (1987) and is the better choice for collinear design matrices. The second representation is new and is more efficient for orthogonal problems, largely because it avoids the need to deal with exponentially tilted stable random variables. It also provides insight into the multimodality of the joint posterior distribution, a feature of the bridge model that is notably absent under ridge or lasso-type priors. We prove a theorem that extends this representation to a wider class of densities representable as scale mixtures of betas, and provide an explicit inversion formula for the mixing distribution. The connections with slice sampling and scale mixtures of normals are explored. On the practical side, we find that the Bayesian bridge model outperforms its classical cousin in estimation and prediction across a variety of data sets, both simulated and real. We also show that the MCMC algorithm for fitting the bridge model exhibits excellent mixing properties, particularly for the global scale parameter. This makes for a favorable contrast with analogous MCMC algorithms for other sparse Bayesian models. All methods described in this paper are implemented in the R package BayesBridge. An extensive set of simulation results is provided in two supplemental files.
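
To make the target distribution concrete, here is a minimal sketch in Python. It is not the authors' mixture-based Gibbs samplers and not the BayesBridge package: it runs a naive random-walk Metropolis chain on the bridge posterior, whose unnormalized log density is -||y - X*beta||^2 / (2*sigma^2) - nu * sum_j |beta_j|^alpha. The parameter names nu, alpha, and sigma2 follow standard bridge notation; the step size, iteration count, and toy data are illustrative assumptions.

import numpy as np

def bridge_log_post(beta, y, X, nu=1.0, alpha=0.5, sigma2=1.0):
    # Unnormalized log posterior: Gaussian likelihood plus the
    # exp(-nu * sum_j |beta_j|^alpha) bridge prior.
    resid = y - X @ beta
    return -0.5 * resid @ resid / sigma2 - nu * np.sum(np.abs(beta) ** alpha)

def rw_metropolis(y, X, n_iter=5000, step=0.05, seed=0):
    # Baseline random-walk Metropolis over beta; far less efficient than
    # the mixture-based Gibbs samplers the paper develops, but it targets
    # the same posterior and is easy to check.
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    beta = np.zeros(p)
    lp = bridge_log_post(beta, y, X)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        prop = beta + step * rng.standard_normal(p)
        lp_prop = bridge_log_post(prop, y, X)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            beta, lp = prop, lp_prop
        draws[t] = beta
    return draws

# Toy data: n = 100 observations, p = 10 predictors, sparse truth.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))
beta_true = np.r_[2.0, -1.5, np.zeros(8)]
y = X @ beta_true + rng.standard_normal(100)
draws = rw_metropolis(y, X)
print(draws[1000:].mean(axis=0).round(2))  # posterior means after burn-in

Note that with alpha < 1 the penalty |beta_j|^alpha is concave in |beta_j|, which is the source of the multimodal joint posteriors mentioned in the abstract; a single random-walk chain can stall in one mode, one practical reason the paper's mixture representations and the resulting Gibbs samplers matter.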

Related research

Sampling for Bayesian Mixture Models: MCMC with Polynomial-Time Mixing (12/11/2019)
We study the problem of sampling from the power posterior distribution i...

Bayesian estimation and prediction for mixtures (05/06/2020)
For two vast families of mixture distributions and a given prior, we pro...

EP-GIG Priors and Applications in Bayesian Sparse Learning (04/19/2012)
In this paper we propose a novel framework for the construction of spars...

An l_1-oracle inequality for the Lasso in mixture-of-experts regression models (09/22/2020)
Mixture-of-experts (MoE) models are a popular framework for modeling het...

A Bayesian Semiparametric Vector Multiplicative Error Model (07/09/2021)
Interactions among multiple time series of positive random variables are...

Bayesian Lasso: Concentration and MCMC Diagnosis (02/22/2018)
Using posterior distribution of Bayesian LASSO we construct a semi-norm ...

Improving Bridge estimators via f-GAN (06/14/2021)
Bridge sampling is a powerful Monte Carlo method for estimating ratios o...
