Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem

01/22/2016
by Alican Nalci, et al.

In this paper, we develop a Bayesian evidence maximization framework to solve the sparse non-negative least squares (S-NNLS) problem. We introduce a family of probability densities referred to as the Rectified Gaussian Scale Mixture (R-GSM) to model the sparsity-enforcing prior distribution for the signal of interest. Through a proper choice of the mixing density, the R-GSM prior encompasses a wide variety of heavy-tailed distributions, such as the rectified Laplacian and rectified Student-t distributions. Utilizing the hierarchical representation induced by the scale-mixture prior, an evidence maximization or Type II estimation method based on the expectation-maximization (EM) framework is developed to estimate the hyperparameters and to obtain a point estimate of the parameter of interest. In the proposed method, called rectified Sparse Bayesian Learning (R-SBL), we provide four alternative approaches for the E-step computation that trade off computational complexity against estimation quality: Markov chain Monte Carlo EM, linear minimum mean-square-error (LMMSE) estimation, approximate message passing (AMP), and a diagonal approximation. Through numerical experiments, we show that the proposed R-SBL method outperforms existing S-NNLS solvers in terms of both signal and support recovery.
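The hierarchical form of the R-GSM prior is easy to simulate: draw a scale from the mixing density, then draw the signal coefficient from a rectified (non-negatively truncated) Gaussian with that scale. The sketch below is only an illustration of this generative view, not the paper's inference algorithm; it assumes the rectified Gaussian is a zero-mean Gaussian truncated to x >= 0 and uses an exponential mixing density as a hypothetical choice that yields a rectified-Laplacian-like marginal.

```python
# Illustrative sketch (not the paper's R-SBL algorithm): sampling from an R-GSM prior.
# Assumption: rectified Gaussian = zero-mean Gaussian truncated to the non-negative
# orthant; mixing density over the scale gamma is exponential (hypothetical choice).
import numpy as np
from scipy.stats import truncnorm

def sample_rgsm(n_samples, mix_scale=1.0, seed=0):
    """Draw x ~ R-GSM by first drawing gamma from the mixing density,
    then x ~ N(0, gamma) truncated to x >= 0."""
    rng = np.random.default_rng(seed)
    gammas = rng.exponential(scale=mix_scale, size=n_samples)  # mixing density
    stds = np.sqrt(gammas)
    # truncnorm takes standardized bounds: a = (0 - loc)/scale = 0, b = +inf
    x = truncnorm.rvs(a=0.0, b=np.inf, loc=0.0, scale=stds, random_state=rng)
    return x

if __name__ == "__main__":
    x = sample_rgsm(10_000)
    print("all non-negative:", bool(np.all(x >= 0)))
    print("empirical mean:", float(x.mean()))
```

Swapping the exponential mixing density for a heavier-tailed one (e.g., inverse-gamma) would correspond to the rectified Student-t case mentioned in the abstract.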


