All You Need is a Good Functional Prior for Bayesian Deep Learning

11/25/2020
by   Ba-Hien Tran, et al.

The Bayesian treatment of neural networks dictates that a prior distribution be specified over their weight and bias parameters. This poses a challenge because modern neural networks are characterized by a very large number of parameters, and the choice of these priors has an uncontrolled effect on the induced functional prior, which is the distribution of the functions obtained by sampling the parameters from their prior distribution. We argue that this is a hugely limiting aspect of Bayesian deep learning, and this work tackles the limitation in a practical and effective way. Our proposal is to reason in terms of functional priors, which are easier to elicit, and to "tune" the priors of neural network parameters so that they reflect such functional priors. Gaussian processes offer a rigorous framework for defining prior distributions over functions, and we propose a novel and robust framework to match their prior with the functional prior of neural networks based on the minimization of their Wasserstein distance. We provide extensive experimental evidence that coupling these priors with scalable Markov chain Monte Carlo sampling offers systematically large performance improvements over alternative choices of priors and over state-of-the-art approximate Bayesian deep learning approaches. We consider this work a considerable step towards making a fully Bayesian treatment of neural networks, including convolutional neural networks, a concrete possibility, thereby addressing a long-standing challenge.
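As a rough illustration of the idea (not the authors' implementation), the sketch below tunes the weight and bias prior scales of a small one-hidden-layer Bayesian neural network so that functions drawn from its prior resemble draws from a Gaussian process prior at a fixed set of inputs. It uses a sliced-Wasserstein estimate as a simple differentiable stand-in for the Wasserstein objective described in the abstract; all names, hyperparameters, and architectural choices here are hypothetical.

```python
# Hypothetical sketch: match a BNN's functional prior to a GP prior by tuning
# the prior standard deviations of its weights and biases with a
# sliced-Wasserstein objective (a stand-in for the paper's Wasserstein distance).
import torch

torch.manual_seed(0)
M, S, H = 64, 256, 50                       # input points, prior samples, hidden units
X = torch.linspace(-3, 3, M).unsqueeze(1)   # [M, 1] measurement inputs

# --- GP functional prior: zero mean, RBF kernel ------------------------------
def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    d = (x - x.T) ** 2
    return variance * torch.exp(-0.5 * d / lengthscale ** 2)

K = rbf_kernel(X) + 1e-6 * torch.eye(M)
L = torch.linalg.cholesky(K)
gp_samples = (L @ torch.randn(M, S)).T      # [S, M] function draws from the GP prior

# --- BNN prior with learnable weight/bias scales ------------------------------
log_sw = torch.zeros(1, requires_grad=True)  # log std of weight prior (hypothetical)
log_sb = torch.zeros(1, requires_grad=True)  # log std of bias prior (hypothetical)

def sample_bnn_functions(n):
    """Draw n functions from the BNN prior via reparameterized weight samples."""
    sw, sb = log_sw.exp(), log_sb.exp()
    W1 = sw * torch.randn(n, H, 1)
    b1 = sb * torch.randn(n, H, 1)
    W2 = sw * torch.randn(n, 1, H) / H ** 0.5    # common 1/sqrt(width) scaling
    b2 = sb * torch.randn(n, 1, 1)
    h = torch.tanh(W1 @ X.T.expand(n, 1, M) + b1)   # [n, H, M]
    return (W2 @ h + b2).squeeze(1)                  # [n, M]

# --- Sliced-Wasserstein distance between two sets of function draws -----------
def sliced_wasserstein(a, b, n_proj=100):
    theta = torch.randn(a.shape[1], n_proj)
    theta = theta / theta.norm(dim=0, keepdim=True)
    pa = (a @ theta).sort(dim=0).values
    pb = (b @ theta).sort(dim=0).values
    return ((pa - pb) ** 2).mean()

opt = torch.optim.Adam([log_sw, log_sb], lr=0.05)
for step in range(500):
    opt.zero_grad()
    loss = sliced_wasserstein(sample_bnn_functions(S), gp_samples)
    loss.backward()
    opt.step()

print(f"tuned weight std: {log_sw.exp().item():.3f}, bias std: {log_sb.exp().item():.3f}")
```

In the pipeline described by the abstract, the tuned prior would then be held fixed and posterior inference over the network weights carried out with scalable Markov chain Monte Carlo sampling.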


Related research

- Priors in Bayesian Deep Learning: A Review (05/14/2021). While the choice of prior is one of the most critical parts of the Bayes...
- BNNpriors: A library for Bayesian neural network inference with different prior distributions (05/14/2021). Bayesian neural networks have shown great promise in many applications w...
- Predictive Complexity Priors (06/18/2020). Specifying a Bayesian prior is notoriously difficult for complex models ...
- Prior choice affects ability of Bayesian neural networks to identify unknowns (05/11/2020). Deep Bayesian neural networks (BNNs) are a powerful tool, though computa...
- Dimension-robust Function Space MCMC With Neural Network Priors (12/20/2020). This paper introduces a new prior on function spaces which scales more ...
- Bayesian Neural Network Priors Revisited (02/12/2021). Isotropic Gaussian priors are the de facto standard for modern Bayesian ...
- The Functional Neural Process (06/19/2019). We present a new family of exchangeable stochastic processes, the Functi...
