Expectation Propagation for Neural Networks with Sparsity-promoting Priors

03/27/2013
by Pasi Jylänki et al.

We propose a novel approach for nonlinear regression using a two-layer neural network (NN) model structure with sparsity-favoring hierarchical priors on the network weights. We present an expectation propagation (EP) approach for approximate integration over the posterior distribution of the weights, the hierarchical scale parameters of the priors, and the residual scale. Using a factorized posterior approximation we derive a computationally efficient algorithm, whose complexity scales similarly to an ensemble of independent sparse linear models. The approach enables flexible definition of weight priors with different sparseness properties such as independent Laplace priors with a common scale parameter or Gaussian automatic relevance determination (ARD) priors with different relevance parameters for all inputs. The approach can be extended beyond standard activation functions and NN model structures to form flexible nonlinear predictors from multiple sparse linear models. The effects of the hierarchical priors and the predictive performance of the algorithm are assessed using both simulated and real-world data. Comparisons are made to two alternative models with ARD priors: a Gaussian process with a NN covariance function and marginal maximum a posteriori estimates of the relevance parameters, and a NN with Markov chain Monte Carlo integration over all the unknown model parameters.
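To make the model structure concrete, the sketch below samples from a two-layer NN regression model with Gaussian ARD priors on the input-layer weights, one relevance parameter per input, as described in the abstract. All dimensions, the gamma hyperprior on the relevances, and the tanh activation are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration (not from the paper)
n, d, K = 50, 10, 5  # observations, inputs, hidden units

# ARD prior: one relevance (prior variance) per input dimension, shared
# across hidden units; a small relevance shrinks that input's weights
# toward zero, effectively pruning the input.
relevance = rng.gamma(shape=2.0, scale=0.5, size=d)    # assumed hyperprior
W1 = rng.normal(0.0, np.sqrt(relevance), size=(K, d))  # input-layer weights
w2 = rng.normal(0.0, 1.0, size=K)                      # output-layer weights

X = rng.normal(size=(n, d))
sigma = 0.1  # residual scale

# Two-layer NN regression function: f(x) = w2^T tanh(W1 x), plus noise
f = np.tanh(X @ W1.T) @ w2
y = f + rng.normal(0.0, sigma, size=n)
```

EP then approximates the joint posterior over `W1`, `w2`, the relevance parameters, and `sigma`; swapping the Gaussian ARD prior on `W1` for independent Laplace priors with a common scale gives the other prior family mentioned above.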


