Random weighting to approximate posterior inference in LASSO regression

02/07/2020
by   Tun Lee Ng, et al.

We consider a general-purpose approximation approach to Bayesian inference in which repeated optimization of a randomized objective function provides surrogate samples from the joint posterior distribution. In the context of LASSO regression, we repeatedly assign independently drawn standard-exponential random weights to terms in the objective function and optimize to obtain the surrogate samples. We establish the asymptotic properties of this method under different regularization parameters λ_n. In particular, if λ_n = o(√n), then the random-weighting (weighted bootstrap) samples are first-order equivalent to the Bayesian posterior samples. If λ_n = O(n^c) for some 1/2 < c < 1, then these samples achieve conditional model selection consistency. We also establish the asymptotic properties of the random-weighting method when weights are drawn from other distributions, and when weights are assigned to the LASSO penalty terms as well.
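The core procedure can be sketched in a few lines: for each surrogate sample, draw i.i.d. standard-exponential weights (one per observation), refit the LASSO with those weights applied to the loss terms, and record the fitted coefficients. The sketch below is a minimal illustration of this idea using scikit-learn's `Lasso` with `sample_weight`; the simulated data, sample sizes, and regularization value are illustrative choices, not the paper's settings, and weights are placed only on the loss terms (not the penalty).

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated sparse regression problem: n observations, p predictors.
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]      # only three nonzero coefficients
y = X @ beta_true + rng.standard_normal(n)

B = 100          # number of surrogate posterior samples
lam = 0.1        # regularization strength (sklearn's `alpha`); illustrative
samples = np.empty((B, p))

for b in range(B):
    # Draw independent standard-exponential weights, one per loss term.
    w = rng.exponential(scale=1.0, size=n)
    # Optimize the randomized (weighted) LASSO objective.
    model = Lasso(alpha=lam, fit_intercept=False)
    model.fit(X, y, sample_weight=w)
    samples[b] = model.coef_

# Treat the B optimizers as surrogate draws from the posterior.
post_mean = samples.mean(axis=0)
post_sd = samples.std(axis=0)
```

The rows of `samples` play the role of posterior draws: their means, standard deviations, and quantiles give approximate Bayesian point estimates and uncertainty intervals for the regression coefficients.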


