A Tutorial on Sparse Gaussian Processes and Variational Inference

12/27/2020
by Felix Leibfried, et al.

Gaussian processes (GPs) provide a framework for Bayesian inference that can offer principled uncertainty estimates for a large range of problems. For example, if we consider regression problems with Gaussian likelihoods, a GP model enjoys a posterior in closed form. However, identifying the posterior GP scales cubically with the number of training examples and requires storing all examples in memory. In order to overcome these obstacles, sparse GPs have been proposed that approximate the true posterior GP with pseudo-training examples. Importantly, the number of pseudo-training examples is user-defined and enables control over computational and memory complexity. In the general case, sparse GPs do not enjoy closed-form solutions and one has to resort to approximate inference. In this context, a convenient choice for approximate inference is variational inference (VI), where the problem of Bayesian inference is cast as an optimization problem: maximizing a lower bound of the log marginal likelihood. This paves the way for a powerful and versatile framework in which pseudo-training examples are treated as optimization arguments of the approximate posterior and are jointly identified together with hyperparameters of the generative model (i.e., prior and likelihood). The framework can naturally handle a wide scope of supervised learning problems, ranging from regression with heteroscedastic and non-Gaussian likelihoods to classification with discrete labels, as well as multi-label problems. The purpose of this tutorial is to provide access to the basic material for readers without prior knowledge of either GPs or VI. A proper exposition of the subject also enables access to more recent advances (such as importance-weighted VI, as well as interdomain, multioutput, and deep GPs) that can serve as an inspiration for new research ideas.
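To make two quantities from the abstract concrete, the sketch below (not taken from the tutorial itself) is a minimal NumPy illustration of the closed-form GP regression posterior, which scales cubically in the number of training points N, and of a sparse lower bound on the log marginal likelihood with M user-chosen pseudo-inputs, here the collapsed variational bound of Titsias (2009). The kernel choice, hyperparameter values, and helper names (`rbf`, `exact_gp_posterior`, `collapsed_elbo`) are assumptions for illustration only.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def exact_gp_posterior(X, y, Xs, noise_var=0.01):
    # Closed-form posterior of GP regression at test inputs Xs; O(N^3) in len(X).
    L = np.linalg.cholesky(rbf(X, X) + noise_var * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # (Knn + s^2 I)^-1 y
    Ksn = rbf(Xs, X)
    V = np.linalg.solve(L, Ksn.T)
    return Ksn @ alpha, rbf(Xs, Xs) - V.T @ V             # posterior mean, covariance

def collapsed_elbo(X, y, Z, noise_var=0.01):
    # Collapsed variational lower bound on log p(y) (Titsias, 2009):
    # log N(y | 0, Qnn + s^2 I) - tr(Knn - Qnn) / (2 s^2),  Qnn = Knm Kmm^-1 Kmn.
    Lm = np.linalg.cholesky(rbf(Z, Z) + 1e-6 * np.eye(len(Z)))  # jitter for stability
    A = np.linalg.solve(Lm, rbf(Z, X))                    # so that Qnn = A.T @ A
    Qnn = A.T @ A                                         # formed densely for clarity only
    Lc = np.linalg.cholesky(Qnn + noise_var * np.eye(len(X)))
    beta = np.linalg.solve(Lc, y)
    log_gauss = -0.5 * (len(X) * np.log(2.0 * np.pi)
                        + 2.0 * np.sum(np.log(np.diag(Lc)))
                        + beta @ beta)
    return log_gauss - 0.5 * (np.trace(rbf(X, X)) - np.trace(Qnn)) / noise_var

# Tiny usage example: M = 15 pseudo-inputs for N = 200 observations.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = X[rng.choice(len(X), size=15, replace=False)]
print(collapsed_elbo(X, y, Z))   # lower-bounds the exact log marginal likelihood
```

Maximizing such a bound with respect to the pseudo-inputs Z and the kernel and noise hyperparameters yields the kind of joint optimization the abstract describes; practical implementations avoid forming the N-by-N matrix Qnn and thereby reach O(NM^2) cost.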
