Deep Neural Networks as Gaussian Processes

11/01/2017
by Jaehoon Lee et al.

A deep fully-connected neural network with an i.i.d. prior over its parameters is equivalent to a Gaussian process (GP) in the limit of infinite network width. This correspondence enables exact Bayesian inference for neural networks on regression tasks by means of straightforward matrix computations. For single hidden-layer networks, the covariance function of this GP has long been known. Recently, kernel functions for multi-layer random neural networks have been developed, but only outside of a Bayesian framework. As such, previous work has not identified the correspondence between using these kernels as the covariance function for a GP and performing fully Bayesian prediction with a deep neural network. In this work, we derive this correspondence and develop a computationally efficient pipeline to compute the covariance functions. We then use the resulting GP to perform Bayesian inference for deep neural networks on MNIST and CIFAR-10. We find that the GP-based predictions are competitive and can outperform neural networks trained with stochastic gradient descent. We observe that the trained neural network accuracy approaches that of the corresponding GP-based computation with increasing layer width, and that the GP uncertainty is strongly correlated with prediction error. We connect our observations to recent developments in the theory of signal propagation in random neural networks.
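
To make the correspondence concrete, below is a minimal NumPy sketch (not the authors' released pipeline) of the two ingredients the abstract describes: the layer-by-layer NNGP covariance recursion K^{l+1}(x, x') = sigma_b^2 + sigma_w^2 * E[phi(f(x)) phi(f(x'))], evaluated in closed form for a ReLU nonlinearity via the arc-cosine kernel, followed by exact GP regression with that covariance. The hyperparameter names and default values (sigma_w2, sigma_b2, depth, noise) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def nngp_kernel(X1, X2, depth=3, sigma_w2=1.6, sigma_b2=0.1):
    """NNGP covariance of an infinitely wide fully-connected ReLU network.

    Returns the cross-covariance K(X1, X2) together with the diagonal
    self-covariances of X1 and X2 after `depth` hidden layers.
    """
    d_in = X1.shape[1]
    # Layer 0: covariance induced by the input layer.
    K12 = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d_in
    k11 = sigma_b2 + sigma_w2 * np.sum(X1 * X1, axis=1) / d_in
    k22 = sigma_b2 + sigma_w2 * np.sum(X2 * X2, axis=1) / d_in
    for _ in range(depth):
        # E[relu(u) relu(v)] under the previous layer's Gaussian has a
        # closed form (the arc-cosine kernel of Cho & Saul, 2009).
        norm = np.sqrt(np.outer(k11, k22))
        cos_t = np.clip(K12 / norm, -1.0, 1.0)
        theta = np.arccos(cos_t)
        K12 = sigma_b2 + sigma_w2 / (2 * np.pi) * norm * (
            np.sin(theta) + (np.pi - theta) * cos_t)
        # On the diagonal theta = 0, so the update simplifies.
        k11 = sigma_b2 + 0.5 * sigma_w2 * k11
        k22 = sigma_b2 + 0.5 * sigma_w2 * k22
    return K12, k11, k22

def nngp_regression(X_train, y_train, X_test, noise=1e-2, **kernel_kwargs):
    """Exact Bayesian prediction: GP regression with the NNGP covariance."""
    K_tt, _, _ = nngp_kernel(X_train, X_train, **kernel_kwargs)
    K_st, k_ss, _ = nngp_kernel(X_test, X_train, **kernel_kwargs)
    L = np.linalg.cholesky(K_tt + noise * np.eye(len(X_train)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_st @ alpha                 # posterior mean at the test points
    v = np.linalg.solve(L, K_st.T)
    var = k_ss - np.sum(v * v, axis=0)  # posterior variance per test point
    return mean, var
```

For a classification benchmark such as MNIST, the paper casts the problem as regression onto class-label targets, so y_train above can be a matrix with one column per class; the returned posterior variance is the GP uncertainty that the abstract reports as correlated with prediction error.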

Related research

Deep Maxout Network Gaussian Process (08/08/2022)
Study of neural networks with infinite width is important for better und...

On the relationship between multitask neural networks and multitask Gaussian Processes (12/12/2019)
Despite the effectiveness of multitask deep neural network (MTDNN), ther...

Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes (10/11/2018)
There is a previously identified equivalence between wide fully connecte...

Richer priors for infinitely wide multi-layer perceptrons (11/29/2019)
It is well-known that the distribution over functions induced through a ...

Exact posterior distributions of wide Bayesian neural networks (06/18/2020)
Recent work has shown that the prior over functions induced by a deep Ba...

Attentive Gaussian processes for probabilistic time-series generation (02/10/2021)
The transduction of sequence has been mostly done by recurrent networks,...

Neural-net-induced Gaussian process regression for function approximation and PDE solution (06/22/2018)
Neural-net-induced Gaussian process (NNGP) regression inherits both the ...
