Avoiding pathologies in very deep networks

02/24/2014
by David Duvenaud, et al.

Choosing appropriate architectures and regularization strategies for deep networks is crucial to good predictive performance. To shed light on this problem, we analyze the analogous problem of constructing useful priors on compositions of functions. Specifically, we study the deep Gaussian process, a type of infinitely-wide, deep neural network. We show that in standard architectures, the representational capacity of the network tends to capture fewer degrees of freedom as the number of layers increases, retaining only a single degree of freedom in the limit. We propose an alternate network architecture which does not suffer from this pathology. We also examine deep covariance functions, obtained by composing infinitely many feature transforms. Lastly, we characterize the class of models obtained by performing dropout on Gaussian processes.
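As a rough illustration of the pathology described above, the sketch below composes one-dimensional GP draws layer by layer and compares the standard composition with an input-connected variant in which every layer also sees the original input (the fix the paper proposes). The squared-exponential kernel, lengthscale, depth, grid, and the "locally flat" diagnostic are arbitrary choices for illustration, not the paper's exact experiment.

```python
import numpy as np

def se_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between the rows of A (n x d) and B (m x d).
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

def sample_gp_layer(inputs, rng, jitter=1e-6):
    # Draw one GP function sample evaluated at `inputs` (n x d) by sampling
    # from the corresponding multivariate normal.
    n = inputs.shape[0]
    K = se_kernel(inputs, inputs) + jitter * np.eye(n)
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal(n)

rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 200)[:, None]   # 1-D input grid
depth = 10

# Standard composition: each layer sees only the previous layer's output.
# With depth, draws tend to become nearly flat over most of the input range.
h = x.copy()
for _ in range(depth):
    h = sample_gp_layer(h, rng)[:, None]
flat = np.mean(np.abs(np.diff(h[:, 0])) < 1e-3)
print(f"standard composition: {flat:.0%} of the grid is locally flat")

# Input-connected variant: each layer's kernel also sees the original input x,
# so draws typically keep varying with x instead of collapsing.
h = x.copy()
for _ in range(depth):
    h = sample_gp_layer(np.hstack([h, x]), rng)[:, None]
flat = np.mean(np.abs(np.diff(h[:, 0])) < 1e-3)
print(f"input-connected:      {flat:.0%} of the grid is locally flat")
```

Plotting the final h against x makes the contrast clearer: the standard composition produces nearly piecewise-constant draws, while the input-connected variant remains wiggly across the input range.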

Related research

03/14/2022 · On Connecting Deep Trigonometric Networks with Deep Gaussian Processes: Covariance, Expressivity, and Neural Tangent Kernel
Deep Gaussian Process as a Bayesian learning model is promising because ...

04/30/2018 · Gaussian Process Behaviour in Wide Deep Neural Networks
Whilst deep neural networks have shown great empirical success, there is...

08/29/2021 · Neural Network Gaussian Processes by Increasing Depth
Recent years have witnessed an increasing interest in the correspondence...

10/12/2019 · On the expected behaviour of noise regularised deep neural networks as Gaussian processes
Recent work has established the equivalence between deep neural networks...

06/02/2023 · Linked Deep Gaussian Process Emulation for Model Networks
Modern scientific problems are often multi-disciplinary and require inte...

05/27/2019 · Neural Stochastic Differential Equations
Deep neural networks whose parameters are distributed according to typic...
