On the Computational Complexity of High-Dimensional Bayesian Variable Selection

05/29/2015
by   Yun Yang, et al.

We study the computational complexity of Markov chain Monte Carlo (MCMC) methods for high-dimensional Bayesian linear regression under sparsity constraints. We first show that a Bayesian approach can achieve variable-selection consistency under relatively mild conditions on the design matrix. We then demonstrate that the statistical criterion of posterior concentration need not imply the computational desideratum of rapid mixing of the MCMC algorithm. By introducing a truncated sparsity prior for variable selection, we provide a set of conditions that guarantee both variable-selection consistency and rapid mixing of a particular Metropolis-Hastings algorithm. The mixing time is linear in the number of covariates up to a logarithmic factor. Our proof controls the spectral gap of the Markov chain by constructing a canonical path ensemble that is inspired by the steps taken by greedy algorithms for variable selection.
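The paper's focus is a Metropolis-Hastings sampler over sparse models. As a rough illustration only (not the authors' exact algorithm), the sketch below runs a symmetric random-walk Metropolis-Hastings chain over binary inclusion vectors, with a size cap standing in for the truncated sparsity prior; the scoring function `log_post` (a g-prior-style marginal likelihood plus a complexity penalty) and all parameter names are assumptions for the example.

```python
import numpy as np

def log_post(gamma, X, y, g=100.0, kappa=1.0):
    """Toy log-posterior for an inclusion vector gamma (assumed form:
    g-prior-style marginal likelihood plus a p^{-kappa*k} sparsity penalty)."""
    n, p = X.shape
    idx = np.flatnonzero(gamma)
    k = idx.size
    if k == 0:
        rss = y @ y
    else:
        Xg = X[:, idx]
        beta_hat, *_ = np.linalg.lstsq(Xg, y, rcond=None)
        resid = y - Xg @ beta_hat
        rss = resid @ resid + (beta_hat @ beta_hat) / g  # shrinkage term
    # Gaussian marginal-likelihood-style term plus model-size penalty
    return -0.5 * n * np.log(rss) - kappa * k * np.log(p)

def mh_variable_selection(X, y, n_iter=2000, s_max=10, seed=0):
    """Symmetric random-walk MH: propose flipping one coordinate at a time;
    proposals larger than s_max are rejected, mimicking a truncated prior
    that puts zero mass on models with more than s_max variables."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    gamma = np.zeros(p, dtype=int)
    cur = log_post(gamma, X, y)
    for _ in range(n_iter):
        j = rng.integers(p)
        prop = gamma.copy()
        prop[j] ^= 1  # flip inclusion indicator of coordinate j
        if prop.sum() > s_max:
            continue  # truncated prior: reject oversized models outright
        new = log_post(prop, X, y)
        if np.log(rng.random()) < new - cur:  # MH accept/reject
            gamma, cur = prop, new
    return gamma
```

On a well-conditioned design with strong signal, such a chain typically concentrates on the true support quickly; the paper's contribution is to make this rapid-mixing behavior rigorous by bounding the spectral gap.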


Related research

03/28/2019 — Approximate spectral gaps for Markov chains mixing times in high-dimension
  This paper introduces a concept of approximate spectral gap to analyze t...

11/06/2018 — Mixing Time of Metropolis-Hastings for Bayesian Community Detection
  We study the computational complexity of a Metropolis-Hastings algorithm...

05/12/2021 — Dimension-free Mixing for High-dimensional Bayesian Variable Selection
  Yang et al. (2016) proved that the symmetric random walk Metropolis–Hast...

10/31/2008 — Gibbs posterior for variable selection in high-dimensional classification and data mining
  In the popular approach of "Bayesian variable selection" (BVS), one uses...

01/11/2021 — Complexity analysis of Bayesian learning of high-dimensional DAG models and their equivalence classes
  We consider MCMC methods for learning equivalence classes of sparse Gaus...

11/22/2017 — PULasso: High-dimensional variable selection with presence-only data
  In various real-world problems, we are presented with positive and unlab...

03/05/2019 — Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models
  We introduce the spike-and-slab group lasso (SSGL) for Bayesian estimati...
