Global inducing point variational posteriors for Bayesian neural networks and deep Gaussian processes

05/17/2020
by Sebastian W. Ober, et al.

Variational inference is a popular approach to reasoning about uncertainty in Bayesian neural networks (BNNs) and deep Gaussian processes (deep GPs). However, typical variational approximate posteriors for deep BNNs and GPs factorise across layers. This is a problematic assumption, because what matters in a deep BNN or GP is the input-output transformation defined by the full network, not the input-output transformation defined by any individual layer. We therefore propose an approximate posterior with dependencies across layers that seeks to jointly model the input-output transformation over the full network. Our approximate posterior is based on a "global" set of inducing points that are defined only at the input layer and propagated through the network. By showing that BNNs are a special case of deep GPs, we demonstrate that this approximate posterior can be used to infer both the weights of a BNN and the functions in a deep GP. Further, we consider a new correlated prior over the weights of a BNN, which in combination with global inducing points gives state-of-the-art performance for a variational Bayesian method on CIFAR-10 (86.7% accuracy), without data augmentation or posterior tempering.
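To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' code) of global inducing points in a fully-connected BNN: a single set of M inducing inputs is learned at the input layer and propagated through every layer alongside the data, and each layer's weight posterior is obtained by Bayesian linear regression from the propagated inducing inputs onto learned pseudo-targets. Because one weight sample per layer is shared by the data and the inducing points, the posterior at each layer depends on the samples drawn at earlier layers, which gives the cross-layer dependence described above. Class and parameter names (GlobalInducingLayer, pseudo_targets, log_prec) are placeholders, and the KL term and the correlated prior are omitted.

# Hypothetical sketch of global inducing points for a BNN (PyTorch).
# Not the authors' implementation; ELBO/KL terms are omitted for brevity.
import torch
import torch.nn as nn

class GlobalInducingLayer(nn.Module):
    def __init__(self, d_in, d_out, n_inducing, prior_var=1.0):
        super().__init__()
        self.prior_var = prior_var
        # Learned pseudo-targets and per-point log-precisions defining the
        # layer's inducing regression problem.
        self.pseudo_targets = nn.Parameter(torch.randn(n_inducing, d_out))
        self.log_prec = nn.Parameter(torch.zeros(n_inducing))

    def forward(self, x, u):
        # u: inducing inputs propagated from the previous layer, shape (M, d_in).
        # Conditional posterior over weights from Bayesian linear regression:
        #   pseudo_targets ~ N(u @ W, diag(1 / prec)),  W_ij ~ N(0, prior_var).
        prec = self.log_prec.exp()                                   # (M,)
        A = u.t() @ (prec[:, None] * u)                              # (d_in, d_in)
        S = torch.linalg.inv(A + torch.eye(u.shape[1]) / self.prior_var)
        S = 0.5 * (S + S.t())                                        # keep symmetric
        mean = S @ u.t() @ (prec[:, None] * self.pseudo_targets)     # (d_in, d_out)
        # One joint weight sample, shared by the data and the inducing points;
        # this sharing is what couples the layers.
        W = mean + torch.linalg.cholesky(S) @ torch.randn_like(mean)
        return x @ W, u @ W

class GlobalInducingBNN(nn.Module):
    def __init__(self, sizes, n_inducing=20):
        super().__init__()
        self.layers = nn.ModuleList(
            [GlobalInducingLayer(a, b, n_inducing)
             for a, b in zip(sizes[:-1], sizes[1:])]
        )
        # The single "global" set of inducing inputs, defined at the input layer.
        self.z = nn.Parameter(torch.randn(n_inducing, sizes[0]))

    def forward(self, x):
        u = self.z
        for i, layer in enumerate(self.layers):
            x, u = layer(x, u)
            if i < len(self.layers) - 1:
                x, u = torch.relu(x), torch.relu(u)
        return x

# Example: a 3-layer BNN for 10-dimensional inputs.
# net = GlobalInducingBNN([10, 50, 50, 1])
# y = net(torch.randn(32, 10))   # each call draws a fresh posterior sample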


Related research

Input Dependent Sparse Gaussian Processes (07/15/2021)
Gaussian Processes (GPs) are Bayesian models that provide uncertainty es...

Building Bayesian Neural Networks with Blocks: On Structure, Interpretability and Uncertainty (06/10/2018)
We provide simple schemes to build Bayesian Neural Networks (BNNs), bloc...

A Tutorial on Sparse Gaussian Processes and Variational Inference (12/27/2020)
Gaussian processes (GPs) provide a framework for Bayesian inference that...

Non-Parametric Variational Inference with Graph Convolutional Networks for Gaussian Processes (09/08/2018)
Inference for GP models with non-Gaussian noises is computationally expe...

Deep Neural Networks as Point Estimates for Deep Gaussian Processes (05/10/2021)
Deep Gaussian processes (DGPs) have struggled for relevance in applicati...

Efficient Deep Gaussian Process Models for Variable-Sized Input (05/16/2019)
Deep Gaussian processes (DGP) have appealing Bayesian properties, can ha...

Bayesian Layers: A Module for Neural Network Uncertainty (12/10/2018)
We describe Bayesian Layers, a module designed for fast experimentation ...
