Hierarchical Gaussian Process Priors for Bayesian Neural Network Weights

02/10/2020
by Theofanis Karaletsos, et al.

Probabilistic neural networks are typically modeled with independent weight priors, which do not capture weight correlations in the prior and do not provide a parsimonious interface to express properties in function space. A desirable class of priors would represent weights compactly, capture correlations between weights, facilitate calibrated reasoning about uncertainty, and allow inclusion of prior knowledge about the function space such as periodicity or dependence on contexts such as inputs. To this end, this paper introduces two innovations: (i) a Gaussian process-based hierarchical model for network weights based on unit embeddings that can flexibly encode correlated weight structures, and (ii) input-dependent versions of these weight priors that can provide convenient ways to regularize the function space through the use of kernels defined on contextual inputs. We show these models provide desirable test-time uncertainty estimates on out-of-distribution data, demonstrate cases of modeling inductive biases for neural networks with kernels which help both interpolation and extrapolation from training data, and demonstrate competitive predictive performance on an active learning benchmark.
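As a rough illustration of idea (i), the sketch below (not the authors' code; the kernel choice, embedding sizes, and names such as rbf_kernel and sample_layer_weights are assumptions) draws a correlated weight matrix for one fully connected layer by placing a GP prior over weights, where each weight's GP input is the concatenation of the learned embeddings of the two units it connects:

```python
# Minimal sketch, assuming a squared-exponential kernel over unit embeddings.
# All names here are illustrative, not the paper's implementation.
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel evaluated between all rows of X."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sample_layer_weights(z_in, z_out, jitter=1e-6, rng=None):
    """Sample a weight matrix W (n_out x n_in) from a GP prior whose inputs
    are the concatenated embeddings of the connected units."""
    rng = np.random.default_rng() if rng is None else rng
    n_in, n_out = z_in.shape[0], z_out.shape[0]
    # One GP input per weight: [embedding of output unit, embedding of input unit].
    pairs = np.concatenate(
        [np.repeat(z_out, n_in, axis=0), np.tile(z_in, (n_out, 1))], axis=1
    )
    K = rbf_kernel(pairs) + jitter * np.eye(n_in * n_out)
    w = rng.multivariate_normal(np.zeros(n_in * n_out), K)
    return w.reshape(n_out, n_in)

# Units with similar embeddings receive correlated weights under this prior.
rng = np.random.default_rng(0)
z_in = rng.normal(size=(4, 2))    # embeddings for 4 input units
z_out = rng.normal(size=(3, 2))   # embeddings for 3 output units
W = sample_layer_weights(z_in, z_out, rng=rng)
print(W.shape)  # (3, 4)
```

The input-dependent variant (ii) would, under the same reading, additionally condition the kernel on contextual inputs, so that properties such as periodicity can be encoded directly in function space.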

Related research

07/14/2021 · Hybrid Bayesian Neural Networks with Functional Probabilistic Layers
Bayesian neural networks provide a direct and natural way to extend stan...

01/22/2021 · Bayesian hierarchical stacking
Stacking is a widely used model averaging technique that yields asymptot...

12/30/2017 · Learning Structural Weight Uncertainty for Sequential Decision-Making
Learning probability distributions on the weights of neural networks (NN...

03/27/2013 · Expectation Propagation for Neural Networks with Sparsity-promoting Priors
We propose a novel approach for nonlinear regression using a two-layer n...

07/24/2018 · Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors
Obtaining reliable uncertainty estimates of neural network predictions i...

11/06/2020 · Beyond Marginal Uncertainty: How Accurately can Bayesian Regression Models Estimate Posterior Predictive Correlations?
While uncertainty estimation is a well-studied topic in deep learning, m...

03/06/2020 · Weight Priors for Learning Identity Relations
Learning abstract and systematic relations has been an open issue in neu...
