Characterizing Deep Gaussian Processes via Nonlinear Recurrence Systems

10/19/2020
by Anh Tong, et al.

Recent advances in Deep Gaussian Processes (DGPs) show that they can provide more expressive representations than traditional Gaussian Processes (GPs). However, DGPs suffer from a known pathology: their learning capacity degrades significantly as the number of layers increases. In this paper, we present a new analysis of DGPs that explains this issue by studying their corresponding nonlinear dynamical systems. Existing work reports the pathology only for the squared exponential kernel function; we extend the investigation to four common stationary kernel functions. We analytically derive the recurrence relations between layers, providing a tighter bound and the rate of convergence of the dynamical systems. We demonstrate our findings with a number of experimental results.
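As a minimal numerical sketch of the kind of recurrence the abstract describes, consider the squared exponential kernel case studied in prior work. For a zero-mean GP layer with kernel k(x, x') = σ² exp(−‖x − x'‖² / (2ℓ²)), the expected squared distance between two outputs, given squared distance r² at the previous layer, is 2σ²(1 − exp(−r² / (2ℓ²))). Iterating this scalar map layer by layer (all parameter values below are illustrative choices, not from the paper) shows input distances collapsing toward the fixed point at zero, which is the pathology in question:

```python
import numpy as np

def se_step(r2, sigma2=1.0, ell2=1.0):
    # One layer of the squared-distance recurrence under a squared
    # exponential kernel: E[(f(x) - f(x'))^2] = 2*sigma2*(1 - exp(-r2/(2*ell2))),
    # where r2 is the expected squared distance at the previous layer.
    return 2.0 * sigma2 * (1.0 - np.exp(-r2 / (2.0 * ell2)))

def iterate(r2_init, n_layers, sigma2=1.0, ell2=1.0):
    # Track the squared distance through n_layers applications of the map.
    trajectory = [r2_init]
    r2 = r2_init
    for _ in range(n_layers):
        r2 = se_step(r2, sigma2, ell2)
        trajectory.append(r2)
    return trajectory

traj = iterate(r2_init=4.0, n_layers=30)
print(traj[:5], "...", traj[-1])
```

With σ² = ℓ² = 1 the map satisfies f(r²) < r² for all r² > 0 while f'(0) = 1, so the iterates shrink monotonically toward zero but only sub-geometrically; characterizing exactly this rate (and its analogue for other stationary kernels) is what the paper's tighter bounds address.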
