Random weighting to approximate posterior inference in LASSO regression

by Tun Lee Ng, et al.

We consider a general-purpose approximation approach to Bayesian inference in which repeated optimization of a randomized objective function provides surrogate samples from the joint posterior distribution. In the context of LASSO regression, we repeatedly assign independently drawn standard-exponential random weights to terms in the objective function, and optimize to obtain the surrogate samples. We establish the asymptotic properties of this method under different regularization parameters λ_n. In particular, if λ_n = o(√n), then the random-weighting (weighted bootstrap) samples are first-order equivalent to the Bayesian posterior samples. If λ_n = O(n^c) for some 1/2 < c < 1, then these samples achieve conditional model selection consistency. We also establish the asymptotic properties of the random-weighting method when weights are drawn from other distributions, and when weights are assigned to the LASSO penalty terms.
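The loss-weighting scheme described above can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the data are synthetic, the regularization level `alpha` is an arbitrary choice, and scikit-learn's `Lasso` is used as the optimizer. Weighting the squared-error term for observation i by w_i is equivalent to rescaling row i of (X, y) by √w_i, which lets an off-the-shelf LASSO solver handle each randomized objective.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic sparse-regression data (hypothetical example)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_normal(n)

def random_weighting_lasso(X, y, alpha=0.05, n_draws=200, rng=rng):
    """Surrogate posterior samples via repeated optimization of a
    randomly re-weighted LASSO objective: each draw assigns i.i.d.
    standard-exponential weights w_i to the squared-error terms,
    implemented by rescaling row i of (X, y) by sqrt(w_i)."""
    n = len(y)
    samples = []
    for _ in range(n_draws):
        w = rng.exponential(scale=1.0, size=n)  # Exp(1) weights
        s = np.sqrt(w)
        model = Lasso(alpha=alpha, max_iter=10_000)
        model.fit(X * s[:, None], y * s)
        samples.append(model.coef_.copy())
    return np.array(samples)

samples = random_weighting_lasso(X, y)
print(samples.mean(axis=0))  # surrogate posterior mean of the coefficients
```

Pointwise quantiles of `samples` (e.g. `np.quantile(samples, [0.025, 0.975], axis=0)`) then serve as approximate credible intervals, in the λ_n = o(√n) regime where the paper shows first-order equivalence to the posterior.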






