Characterizing Deep Gaussian Processes via Nonlinear Recurrence Systems

10/19/2020
by Anh Tong, et al.

Recent advances in Deep Gaussian Processes (DGPs) show that they can offer more expressive representations than traditional Gaussian Processes (GPs). However, DGPs suffer from a known pathology: their learning capacity degrades significantly as the number of layers increases. In this paper, we present a new analysis of DGPs that explains this issue by studying their corresponding nonlinear dynamic systems. Existing work reports the pathology only for the squared exponential kernel function; we extend the investigation to four common types of stationary kernels. We analytically derive the recurrence relations between layers, which yield a tighter bound and the rate of convergence of the dynamic systems. We demonstrate our findings with a number of experimental results.
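To make the pathology concrete, below is a minimal sketch (not the authors' implementation) that iterates the widely cited mean-field recurrence for the expected squared distance between two inputs propagated through stacked GP layers with a squared exponential kernel, d_{l+1} = 2*sigma^2*(1 - exp(-d_l/(2*ell^2))). The function names and parameter values (signal_var, lengthscale) are illustrative assumptions, not quantities from the paper.

# Sketch of the layer-wise recurrence for an SE-kernel DGP (illustrative parameters).
import numpy as np

def se_recurrence(d, signal_var=2.0, lengthscale=1.0):
    """One step of the expected-squared-distance recurrence for an SE-kernel GP layer:
    d_{l+1} = 2 * sigma^2 * (1 - exp(-d_l / (2 * ell^2)))."""
    return 2.0 * signal_var * (1.0 - np.exp(-d / (2.0 * lengthscale ** 2)))

def propagate(d0, n_layers=30, **kernel_params):
    """Iterate the recurrence through n_layers and return the whole trajectory."""
    traj = [d0]
    for _ in range(n_layers):
        traj.append(se_recurrence(traj[-1], **kernel_params))
    return np.array(traj)

if __name__ == "__main__":
    # Very different input distances collapse toward the same fixed point d*,
    # which solves d* = 2*sigma^2*(1 - exp(-d*/(2*ell^2))): after many layers the
    # network can no longer tell the original inputs apart.
    for d0 in (0.01, 1.0, 10.0):
        traj = propagate(d0, n_layers=30)
        print(f"d0 = {d0:6.2f}  ->  d after 30 layers = {traj[-1]:.4f}")

With these illustrative parameters every trajectory converges to the same fixed point regardless of the initial distance, which is the depth-wise collapse the abstract refers to; the paper derives such recurrences analytically for four stationary kernels and quantifies their convergence rates.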
