
Generalized double Pareto shrinkage

04/05/2011
by Artin Armagan, et al. (Duke University; SAS)

We propose a generalized double Pareto prior for Bayesian shrinkage estimation and inference in linear models. The prior can be obtained via a scale mixture of Laplace or normal distributions, forming a bridge between the Laplace and normal-Jeffreys priors. While it has a spike at zero like the Laplace density, it also has Student's t-like tail behavior. Bayesian computation is straightforward via a simple Gibbs sampling algorithm. Because sparse estimation plays an important role in many problems, we investigate the properties of the maximum a posteriori estimator, reveal connections with some well-established regularization procedures, and present some asymptotic results. The performance of the prior is tested through simulations and an application.
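As a quick illustration of the scale-mixture construction described above, here is a minimal Python sketch (not the authors' code) that draws generalized double Pareto variates through a normal-mixture hierarchy and checks them against the closed-form density. The parameterization, with shape alpha and scale xi induced by lambda ~ Gamma(alpha, rate = alpha*xi), tau ~ Exponential(rate = lambda^2/2), theta ~ N(0, tau), is an assumption based on the standard generalized double Pareto hierarchy; the names sample_gdp and gdp_pdf are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gdp(n, alpha, xi, rng=rng):
    """Draw n generalized double Pareto variates via the hierarchy
        lambda_j ~ Gamma(alpha, rate = alpha * xi)
        tau_j    ~ Exponential(rate = lambda_j**2 / 2)
        theta_j  ~ Normal(0, tau_j),
    which marginalizes first to a Laplace(scale = 1/lambda_j)
    and then to GDP(xi, alpha)."""
    lam = rng.gamma(shape=alpha, scale=1.0 / (alpha * xi), size=n)
    tau = rng.exponential(scale=2.0 / lam**2)   # Exp(rate r) has scale 1/r
    return rng.normal(0.0, np.sqrt(tau))        # normal scale mixture

def gdp_pdf(theta, alpha, xi):
    """Closed-form density f(theta) = (2 xi)^-1 (1 + |theta|/(alpha xi))^-(alpha+1)."""
    return (1.0 + np.abs(theta) / (alpha * xi)) ** (-(alpha + 1.0)) / (2.0 * xi)

# Sanity check: a histogram of mixture draws should match the closed form.
alpha, xi = 3.0, 1.0
draws = sample_gdp(500_000, alpha, xi)
hist, edges = np.histogram(draws, bins=200, range=(-6.0, 6.0), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
print("max histogram-vs-pdf gap:", np.abs(hist - gdp_pdf(mids, alpha, xi)).max())
```

The conditionally Gaussian layer theta | tau ~ N(0, tau) is what makes the Gibbs sampler mentioned in the abstract simple: given the latent scales, the regression coefficients in a linear model have a standard Gaussian full conditional.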


Related research

Regularization of Bayesian shrinkage priors and inference via geometrically / uniformly ergodic Gibbs sampler (11/06/2019)
Use of continuous shrinkage priors — with a "spike" near zero and heavy-...

Bayesian Fusion Estimation via t-Shrinkage (12/27/2018)
Shrinkage prior has gained great successes in many data analysis, howeve...

Intuitive Joint Priors for Bayesian Linear Multilevel Models: The R2D2M2 prior (08/15/2022)
The training of high-dimensional regression models on comparably sparse ...

A Laplace Mixture Representation of the Horseshoe and Some Implications (09/09/2022)
The horseshoe prior, defined as a half Cauchy scale mixture of normal, p...

On parameters transformations for emulating sparse priors using variational-Laplace inference (03/06/2017)
So-called sparse estimators arise in the context of model fitting, when ...

Shrinkage with Robustness: Log-Adjusted Priors for Sparse Signals (01/23/2020)
We introduce a new class of distributions named log-adjusted shrinkage p...

Beta Rank Function: A Smooth Double-Pareto-Like Distribution (10/11/2019)
The Beta Rank Function (BRF) x(u) = A(1-u)^b / u^a, where u is the normaliz...