Deep Gaussian Processes for Regression using Approximate Expectation Propagation

02/12/2016
by Thang D. Bui, et al.

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers. DGPs are nonparametric probabilistic models and as such are arguably more flexible, have a greater capacity to generalise, and provide better calibrated uncertainty estimates than alternative deep models. This paper develops a new approximate Bayesian learning scheme that enables DGPs to be applied to a range of medium to large scale regression problems for the first time. The new method uses an approximate Expectation Propagation procedure and a novel and efficient extension of the probabilistic backpropagation algorithm for learning. We evaluate the new method for non-linear regression on eleven real-world datasets, showing that it always outperforms GP regression and is almost always better than state-of-the-art deterministic and sampling-based approximate inference methods for Bayesian neural networks. As a by-product, this work provides a comprehensive analysis of six approximate Bayesian methods for training neural networks.
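The core idea behind a DGP is that it is a composition of GP layers: the output of one GP is fed as the input to the next. The sketch below illustrates this compositional prior (not the paper's approximate EP inference) by sampling a function from a two-layer DGP, where each layer uses a squared-exponential kernel; all function and parameter names here are illustrative.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp_layer(x, rng, jitter=1e-8):
    """Draw one function sample f ~ GP(0, k) evaluated at the inputs x."""
    K = rbf_kernel(x, x) + jitter * np.eye(len(x))  # jitter for stability
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 100)
h = sample_gp_layer(x, rng)   # hidden layer: h = f1(x)
y = sample_gp_layer(h, rng)   # output layer: y = f2(f1(x))
```

Because the second GP warps the output of the first, the composed sample `y` can exhibit non-stationary behaviour that a single GP with a stationary kernel cannot capture, which is the flexibility the abstract refers to.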


