Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation

11/11/2015
by   Thang D. Bui, et al.

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers. DGPs are probabilistic and non-parametric and as such are arguably more flexible, have a greater capacity to generalise, and provide better calibrated uncertainty estimates than alternative deep models. The focus of this paper is scalable approximate Bayesian learning of these networks. The paper develops a novel and efficient extension of probabilistic backpropagation, a state-of-the-art method for training Bayesian neural networks, that can be used to train DGPs. The new method leverages a recently proposed method for scaling Expectation Propagation, called stochastic Expectation Propagation. The method is able to automatically discover useful input warping, expansion or compression, and it is therefore a flexible form of Bayesian kernel design. We demonstrate the success of the new method for supervised learning on several real-world datasets, showing that it typically outperforms GP regression and is never much worse.
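To make the "multi-layer generalisation" concrete: a DGP prior composes GP layers, so the output of one GP serves as the input to the next. The sketch below samples from a two-layer DGP prior; it illustrates the model class only, not the paper's training method, and the RBF kernel and lengthscales are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel between two sets of 1-D inputs.
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp_layer(X, rng, lengthscale=1.0, jitter=1e-6):
    # Draw one function sample at the inputs X from a zero-mean GP prior.
    K = rbf_kernel(X, X, lengthscale) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal(len(X))

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 50)

# Two-layer DGP prior sample: the first layer warps the inputs
# (the "input warping" the abstract refers to), the second layer
# maps the warped inputs to the output.
h = sample_gp_layer(X, rng, lengthscale=2.0)  # hidden layer
f = sample_gp_layer(h, rng, lengthscale=1.0)  # output layer
print(f.shape)  # (50,)
```

Because the warping `h` is itself a random function, the composed prior over `f` is non-Gaussian and far richer than a single GP with a fixed kernel, which is why the paper frames DGP learning as a flexible form of Bayesian kernel design.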


Related research

02/12/2016  Deep Gaussian Processes for Regression using Approximate Expectation Propagation
02/18/2015  Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks
05/26/2020  Skew Gaussian Processes for Classification
02/24/2021  The Promises and Pitfalls of Deep Kernel Learning
06/29/2018  Bayesian Deep Learning on a Quantum Computer
12/11/2021  A Sparse Expansion For Deep Gaussian Processes
06/11/2015  Mondrian Forests for Large-Scale Regression when Uncertainty Matters
