
A Gaussian Process perspective on Convolutional Neural Networks

by   Anastasia Borovykh, et al.
University of Bologna

In this paper we cast the well-known convolutional neural network in a Gaussian process perspective. In this way we hope to gain additional insight into the performance of convolutional networks, in particular to understand under which circumstances they tend to perform well and which assumptions are implicitly made in the network. While for feedforward networks the convergence to Gaussian processes has been studied extensively, little is known about the conditions under which the output of a convolutional network approaches a multivariate normal distribution. In the convolutional network the sum is computed over variables that are not necessarily identically distributed, so the classical central limit theorem does not apply. Nevertheless, we can derive a Lyapunov-type bound on the distance between the Gaussian process and the convolutional network output, and use this bound to study the conditions under which the convolutional network behaves approximately like a Gaussian process, so that this behavior can, depending on the application, be either obtained or avoided.
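The mechanism described above can be illustrated with a small simulation. The sketch below is not from the paper; it is a minimal, hand-rolled example (all layer sizes, the kernel size of 3, and the function names are illustrative assumptions) showing that a unit summing over many channels of a random convolutional layer adds up dependent, non-identically distributed terms, yet its distribution over random weight draws still looks approximately Gaussian when the number of channels is large:

```python
import numpy as np

rng = np.random.default_rng(0)

def second_layer_preactivation(x, n_channels, kernel_size=3):
    """Sample random weights and return one second-layer pre-activation."""
    # Layer 1: n_channels random 1-D conv filters with N(0, 1/kernel_size)
    # weights, followed by ReLU. Overlapping windows make the resulting
    # features dependent and non-identically distributed across positions.
    W1 = rng.normal(0.0, np.sqrt(1.0 / kernel_size), (n_channels, kernel_size))
    windows = np.lib.stride_tricks.sliding_window_view(x, kernel_size)
    h = np.maximum(windows @ W1.T, 0.0)      # shape: (positions, channels)
    # Layer 2: a single output unit summing over all channels at one position,
    # with N(0, 1/n_channels) readout weights.
    w2 = rng.normal(0.0, np.sqrt(1.0 / n_channels), n_channels)
    return h[0] @ w2

x = rng.normal(size=32)                      # one fixed input signal
samples = np.array([second_layer_preactivation(x, 256) for _ in range(2000)])

# For a Gaussian, skewness and excess kurtosis are both 0; the empirical
# values over weight draws should be close to that when n_channels is large.
z = (samples - samples.mean()) / samples.std()
skewness = float(np.mean(z**3))
excess_kurtosis = float(np.mean(z**4) - 3.0)
print(f"skewness={skewness:.2f}, excess kurtosis={excess_kurtosis:.2f}")
```

Shrinking `n_channels` (e.g. to 2) pushes the empirical moments away from their Gaussian values, which is the regime in which, per the abstract, the Gaussian-process approximation can be deliberately avoided.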




Gaussian Process Behaviour in Wide Deep Neural Networks

Whilst deep neural networks have shown great empirical success, there is...

Large-width functional asymptotics for deep Gaussian neural networks

In this paper, we consider fully connected feed-forward deep neural netw...

Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes

Wide neural networks with random weights and biases are Gaussian process...

On the Spectral Bias of Convolutional Neural Tangent and Gaussian Process Kernels

We study the properties of various over-parametrized convolutional neura...

Deep Convolutional Networks as shallow Gaussian Processes

We show that the output of a (residual) convolutional neural network (CN...

A Modified Convolutional Network for Auto-encoding based on Pattern Theory Growth Function

This brief paper reports the shortcoming of a variant of convolutional n...

Quantitative Gaussian Approximation of Randomly Initialized Deep Neural Networks

Given any deep fully connected neural network, initialized with random G...