A Framework for Interdomain and Multioutput Gaussian Processes

by Mark van der Wilk et al.

One obstacle to the use of Gaussian processes (GPs) in large-scale problems, and as a component in deep learning systems, is the need for bespoke derivations and implementations for small variations in the model or inference. To improve the utility of GPs, we need a modular system that allows rapid implementation and testing, as seen in the neural network community. We present a mathematical and software framework for scalable approximate inference in GPs, which combines interdomain approximations and multiple outputs. Our framework, implemented in GPflow, provides a unified interface for many existing multioutput models, as well as more recent convolutional structures. This simplifies the creation of deep models with GPs, and we hope this work will encourage more interest in this approach.
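One of the classical multioutput models that such a unified framework covers is the linear model of coregionalization (LMC), in which each output is a fixed linear combination of independent latent GPs. As a minimal illustrative sketch (plain NumPy, not the paper's GPflow implementation; the kernel, mixing matrix `W`, and lengthscales below are made-up example values), the joint prior covariance over all outputs can be assembled like this:

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def lmc_covariance(X, W, lengthscales):
    """Covariance of a multioutput GP under the linear model of
    coregionalization: f_p(x) = sum_l W[p, l] g_l(x), with independent
    latent GPs g_l. Returns an (P*N, P*N) matrix, output-major order."""
    N = len(X)
    P, L = W.shape
    K = np.zeros((P * N, P * N))
    for l in range(L):
        Kl = rbf(X, X, lengthscale=lengthscales[l])
        # The outer product W[:, l] W[:, l]^T couples the P outputs
        # through latent process l; kron places it over input blocks.
        K += np.kron(np.outer(W[:, l], W[:, l]), Kl)
    return K

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 20)
W = rng.standard_normal((3, 2))  # 3 outputs mixing 2 latent processes
K = lmc_covariance(X, W, lengthscales=[0.2, 0.5])
# Draw one correlated joint sample over all outputs (jittered Cholesky).
L_chol = np.linalg.cholesky(K + 1e-8 * np.eye(len(K)))
sample = L_chol @ rng.standard_normal(len(K))
```

The point of the framework is that such models need not be assembled by hand: the same interface dispatches to efficient, structure-aware covariance computations for independent, coregionalized, or convolutional output structures.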




Related Papers
Inter-domain Deep Gaussian Processes

Inter-domain Gaussian processes (GPs) allow for high flexibility and low...

Position Tracking using Likelihood Modeling of Channel Features with Gaussian Processes

Recent localization frameworks exploit spatial information of complex ch...

Scalable GAM using sparse variational Gaussian processes

Generalized additive models (GAMs) are a widely used class of models of ...

Low-Precision Arithmetic for Fast Gaussian Processes

Low-precision arithmetic has had a transformative effect on the training...

Learning in the Wild with Incremental Skeptical Gaussian Processes

The ability to learn from human supervision is fundamental for personal ...

Connections and Equivalences between the Nyström Method and Sparse Variational Gaussian Processes

We investigate the connections between sparse approximation methods for ...

When are Iterative Gaussian Processes Reliably Accurate?

While recent work on conjugate gradient methods and Lanczos decompositio...

Code Repositories


Gaussian processes in TensorFlow
