On parameters transformations for emulating sparse priors using variational-Laplace inference

03/06/2017
by Jean Daunizeau et al.

So-called sparse estimators arise in model fitting when one assumes a priori that only a few (unknown) model parameters deviate from zero. Sparsity constraints are useful when the estimation problem is under-determined, i.e. when the number of model parameters greatly exceeds the number of data points. Typically, such constraints are enforced by minimizing the L1 norm, which yields the so-called LASSO estimator. In this work, we propose a simple parameter transform that emulates sparse priors without sacrificing the simplicity and robustness of L2-norm regularization schemes. We show how L1 regularization can be obtained through a "sparsify" remapping of parameters under normal Bayesian priors, and we demonstrate the ensuing variational Laplace approach using Monte-Carlo simulations.
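The abstract does not spell out the remapping itself, but the general mechanism can be illustrated with a minimal sketch. Here the transform theta = sign(x) * x**2 is an assumed, illustrative choice (not necessarily the paper's exact transform): a plain Gaussian prior on x then induces, by change of variables, a Laplace-like prior on theta, so L2 regularization in x-space emulates L1-like shrinkage in theta-space.

```python
import numpy as np

def sparsify(x):
    """Illustrative 'sparsify' remapping (an assumption, not necessarily
    the paper's exact transform): theta = sign(x) * x**2.

    If x ~ N(0, 1), a change of variables gives
        log p(theta) = -|theta|/2 - (1/2) * log|theta| + const,
    so the induced prior on theta carries an L1-like (Laplace-like)
    penalty, while inference can run on x under an ordinary Gaussian prior.
    """
    return np.sign(x) * x**2

# Monte-Carlo check of the induced tail:
# P(|theta| > t) = P(x**2 > t) = 2 * (1 - Phi(sqrt(t))),
# which decays like exp(-t/2), i.e. Laplace-like rather than Gaussian.
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
theta = sparsify(x)
tail_frac = np.mean(np.abs(theta) > 1.0)  # theory: 2 * (1 - Phi(1)) ~ 0.317
```

Because the remapping is monotone and sign-preserving, a zero-mean Gaussian posterior over x that concentrates near zero translates into an estimate of theta that is pushed even harder toward zero, which is the sparsity-emulating effect the abstract describes.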


