l_1-ball Prior: Uncertainty Quantification with Exact Zeros

06/02/2020
by Maoran Xu, et al.

Lasso and l_1-regularization play a dominating role in high-dimensional statistics and machine learning. Their most attractive property is that they produce a sparse parameter estimate containing exact zeros. For uncertainty quantification, popular Bayesian approaches choose a continuous prior that puts concentrated mass near zero; as a limitation, however, the resulting continuous posterior cannot be exactly sparse. This makes such a prior problematic for advanced models, such as change-point detection, linear trend filtering, and convex clustering, where zeros are crucial for dimension reduction. In this article, we propose a new class of priors obtained by projecting a continuous distribution onto the l_1-ball with radius r. This projection creates positive probability on the lower-dimensional boundary of the ball, where the random variable contains both continuous elements and exact zeros; meanwhile, assigning a prior to the radius r gives robustness to large signals. Compared with the spike-and-slab prior, our proposal offers substantial flexibility in the prior specification and adaptive shrinkage of small signals; in addition, it enjoys efficient optimization-based posterior estimation. In asymptotic theory, the prior attains the minimax optimal rate for posterior concentration around the truth; in practice, it enables a direct application of the rich class of l_1-tricks in Bayesian models. We demonstrate its potential in an application analyzing electroencephalogram time series from a human working memory study, using a non-parametric mixture model of linear trend filters.
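To make the core mechanism concrete, the following is a minimal Python sketch (my own illustration, not the authors' code): a draw from a continuous base distribution is mapped onto the l_1-ball of radius r via the standard sorting-based Euclidean projection, so that small-magnitude entries become exact zeros. The function name project_l1_ball, the Gaussian base draw, and the fixed radius r = 2.0 are assumptions for demonstration only; in the proposed model the radius itself receives a prior.

```python
import numpy as np

def project_l1_ball(beta, r):
    """Euclidean projection of beta onto the l_1-ball of radius r.

    Uses the sorting-based algorithm of Duchi et al. (2008): if beta lies
    outside the ball, soft-threshold its entries by the unique lambda for
    which the projected l_1 norm equals r. Entries whose magnitude falls
    below lambda are mapped to exactly zero.
    """
    abs_b = np.abs(beta)
    if abs_b.sum() <= r:
        return beta.copy()                       # already inside the ball
    u = np.sort(abs_b)[::-1]                     # magnitudes, descending
    css = np.cumsum(u)
    rho = np.nonzero(u - (css - r) / np.arange(1, len(u) + 1) > 0)[0][-1]
    lam = (css[rho] - r) / (rho + 1.0)           # soft-thresholding level
    return np.sign(beta) * np.maximum(abs_b - lam, 0.0)

# Hypothetical illustration: a continuous (Gaussian) draw has no zeros,
# but its projection onto the l_1-ball contains exact zeros.
rng = np.random.default_rng(0)
beta_raw = rng.normal(size=10)   # continuous draw from the base prior
theta = project_l1_ball(beta_raw, r=2.0)
print(theta)                     # several entries are exactly 0
```

Because the projection is a deterministic, almost-everywhere differentiable map, pushing a continuous prior through it yields a distribution with positive mass on sparse configurations while keeping posterior computation amenable to standard optimization and sampling tools.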

