Neural Network Gaussian Processes by Increasing Depth

08/29/2021
by Shao-Qun Zhang, et al.

Recent years have witnessed growing interest in the correspondence between infinitely wide networks and Gaussian processes. Despite the effectiveness and elegance of current neural network Gaussian process theory, to the best of our knowledge, all existing neural network Gaussian processes are induced by increasing width. In the era of deep learning, however, what concerns us more about a neural network is its depth and how depth affects the network's behavior. Motivated by a width-depth symmetry consideration, we use a shortcut network to show that increasing the depth of a neural network can also give rise to a Gaussian process, a valuable addition to the existing theory that helps reveal the true picture of deep learning. Beyond the proposed Gaussian process by depth, we theoretically characterize its uniform tightness and the smallest eigenvalue of its associated kernel. These characterizations not only enhance our understanding of the proposed depth-induced Gaussian processes, but also pave the way for future applications. Lastly, we evaluate the proposed Gaussian process in regression experiments on two real-world data sets.
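The abstract gives no formulas, but the regression experiments it mentions follow the standard Gaussian process workflow: assemble a kernel matrix from a covariance function (in the paper, the depth-induced kernel), then compute the exact GP posterior. The sketch below is a minimal illustration of that workflow, not the authors' code; the RBF kernel is a stand-in assumption for the unspecified depth-induced kernel, and the data, noise level, and helper name rbf are all hypothetical.

    import numpy as np

    def rbf(A, B, lengthscale=1.0):
        # Stand-in covariance; the paper's depth-induced kernel would go here.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale ** 2)

    rng = np.random.default_rng(0)
    X_train = rng.uniform(-3.0, 3.0, size=(30, 1))        # toy 1-D inputs
    y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(30)
    X_test = np.linspace(-3.0, 3.0, 100)[:, None]

    K = rbf(X_train, X_train) + 0.01 * np.eye(30)         # add observation-noise variance
    K_s = rbf(X_test, X_train)

    # Exact GP posterior: mean = K_s K^{-1} y, var = k(x, x) - K_s K^{-1} K_s^T.
    L = np.linalg.cholesky(K)                             # Cholesky for stable solves
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = rbf(X_test, X_test).diagonal() - (v ** 2).sum(axis=0)

    print(mean[:3], np.sqrt(var[:3]))                     # posterior mean and std at test points

Note that the smallest eigenvalue of the kernel matrix K, which the paper characterizes for its depth-induced kernel, governs how well-conditioned the Cholesky solve above is; that is one practical reason this quantity matters for applications.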


Related research

07/25/2021 · A brief note on understanding neural networks as Gaussian processes
As a generalization of the work in [Lee et al., 2017], this note briefly...

01/03/2020 · Wide Neural Networks with Bottlenecks are Deep Gaussian Processes
There is recently much work on the "wide limit" of neural networks, wher...

10/18/2022 · Locally Smoothed Gaussian Process Regression
We develop a novel framework to accelerate Gaussian process regression (...

11/30/2017 · How Deep Are Deep Gaussian Processes?
Recent research has shown the potential utility of probability distribut...

12/19/2022 · A note on the smallest eigenvalue of the empirical covariance of causal Gaussian processes
We present a simple proof for bounding the smallest eigenvalue of the em...

02/24/2014 · Avoiding pathologies in very deep networks
Choosing appropriate architectures and regularization strategies for dee...

06/11/2021 · The Limitations of Large Width in Neural Networks: A Deep Gaussian Process Perspective
Large width limits have been a recent focus of deep learning research: m...
