Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes

10/11/2018
by Roman Novak, et al.

There is a previously identified equivalence between wide fully connected neural networks (FCNs) and Gaussian processes (GPs). This equivalence enables, for instance, test set predictions that would have resulted from a fully Bayesian, infinitely wide trained FCN to be computed without ever instantiating the FCN, but by instead evaluating the corresponding GP. In this work, we derive an analogous equivalence for multi-layer convolutional neural networks (CNNs), both with and without pooling layers, and achieve state-of-the-art results on CIFAR10 for GPs without trainable kernels. We also introduce a Monte Carlo method to estimate the GP corresponding to a given neural network architecture, even in cases where the analytic form has too many terms to be computationally feasible. Surprisingly, in the absence of pooling layers, the GPs corresponding to CNNs with and without weight sharing are identical. As a consequence, translation equivariance in finite-channel CNNs trained with stochastic gradient descent (SGD) has no corresponding property in the Bayesian treatment of the infinite-channel limit, a qualitative difference between the two regimes that is not present in the FCN case. We confirm experimentally that, while in some scenarios the performance of SGD-trained finite CNNs approaches that of the corresponding GPs as the channel count increases, with careful tuning SGD-trained CNNs can significantly outperform their corresponding GPs, suggesting advantages from SGD training compared to fully Bayesian parameter estimation.
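
The Monte Carlo method mentioned in the abstract can be pictured as follows: draw many finite-channel random networks with the architecture of interest, average the outer products of their outputs over weight draws to obtain an empirical kernel, and use that kernel for GP inference. Below is a minimal, hedged sketch of that idea in NumPy; the one-layer 1D CNN, filter size, variance choices, and toy data are illustrative assumptions, not the paper's architecture or code.

```python
import numpy as np

def random_cnn_features(x, n_channels, rng, sigma_w=1.0, sigma_b=0.1):
    """One random draw of a 1-layer 1D CNN (ReLU) with a scalar linear readout.

    x: array of shape (n_samples, length) -- single-channel 1D inputs (assumed).
    Returns the readout output for each sample, shape (n_samples,).
    """
    n, length = x.shape
    k = 3                                        # assumed filter size
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))         # zero padding to keep length
    # Random conv filters and biases, scaled so the kernel stays O(1) in width.
    w = rng.normal(0.0, sigma_w / np.sqrt(k), size=(n_channels, k))
    b = rng.normal(0.0, sigma_b, size=(n_channels,))
    # Convolution via explicit patches: (n, length, k) x (c, k) -> (n, c, length).
    patches = np.stack([xp[:, i:i + length] for i in range(k)], axis=-1)
    conv = np.einsum('nlk,ck->ncl', patches, w) + b[None, :, None]
    h = np.maximum(conv, 0.0)                    # ReLU nonlinearity
    # Scalar linear readout over all channels and positions.
    v = rng.normal(0.0, 1.0 / np.sqrt(n_channels * length),
                   size=(n_channels, length))
    return np.einsum('ncl,cl->n', h, v)

def monte_carlo_nngp_kernel(x, n_draws=2000, n_channels=64, seed=0):
    """Estimate K(x, x') = E_theta[f(x) f(x')] by averaging over random nets."""
    rng = np.random.default_rng(seed)
    k = np.zeros((x.shape[0], x.shape[0]))
    for _ in range(n_draws):
        f = random_cnn_features(x, n_channels, rng)
        k += np.outer(f, f)
    return k / n_draws

# Toy usage: GP regression with the Monte Carlo kernel (synthetic data).
rng = np.random.default_rng(1)
x_train, y_train = rng.normal(size=(20, 16)), rng.normal(size=(20,))
x_test = rng.normal(size=(5, 16))
x_all = np.concatenate([x_train, x_test])
K = monte_carlo_nngp_kernel(x_all)
n = len(x_train)
K_tt, K_st = K[:n, :n], K[n:, :n]
noise = 1e-3                                      # assumed observation noise
mean_pred = K_st @ np.linalg.solve(K_tt + noise * np.eye(n), y_train)
```

The key point of the sketch is that each weight draw contributes a rank-one term f(x) f(x'), so the running average converges to the prior covariance of the network outputs, i.e. the NNGP kernel, regardless of how many terms its analytic form would have.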


Related research

11/01/2017 · Deep Neural Networks as Gaussian Processes
A deep fully-connected neural network with an i.i.d. prior over its para...

06/18/2020 · Infinite attention: NNGP and NTK for deep attention networks
There is a growing amount of literature on the relationship between wide...

05/26/2018 · Calibrating Deep Convolutional Gaussian Processes
The wide adoption of Convolutional Neural Networks (CNNs) in application...

09/17/2022 · Interrelation of equivariant Gaussian processes and convolutional neural networks
Currently there exists a rather promising new trend in machine learning (ML...

12/30/2019 · Disentangling trainability and generalization in deep learning
A fundamental goal in deep learning is the characterization of trainabil...

06/16/2019 · Finding the Needle in the Haystack with Convolutions: on the benefits of architectural bias
Despite the phenomenal success of deep neural networks in a broad range ...

02/07/2021 · Infinite-channel deep stable convolutional neural networks
The interplay between infinite-width neural networks (NNs) and classes o...
