Understanding Probabilistic Sparse Gaussian Process Approximations

06/15/2016
by Matthias Bauer, et al.

Good sparse approximations are essential for practical inference in Gaussian processes, as the computational cost of exact methods is prohibitive for large datasets. The Fully Independent Training Conditional (FITC) and the Variational Free Energy (VFE) approximations are two recent popular methods. Despite superficial similarities, these approximations have surprisingly different theoretical properties and behave differently in practice. We thoroughly investigate the two methods for regression, both analytically and through illustrative examples, and draw conclusions to guide practical application.

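Both objectives can be written in a common form, which makes the comparison concrete. The following is a brief sketch in standard sparse-GP notation (the symbols below are conventional choices, not quoted from the abstract): let y be the N training targets, u the function values at M inducing inputs, K_ff and K_uu the corresponding kernel matrices, Q_ff = K_fu K_uu^{-1} K_uf the Nyström approximation, and sigma_n^2 the noise variance. Both FITC and VFE then minimise an objective of the form

\[
\mathcal{F} \;=\; \frac{N}{2}\log(2\pi) \;+\; \frac{1}{2}\log\left|Q_{ff} + G\right| \;+\; \frac{1}{2}\,\mathbf{y}^{\top}\left(Q_{ff} + G\right)^{-1}\mathbf{y} \;+\; \frac{1}{2\sigma_n^{2}}\,\mathrm{tr}(T),
\]

where FITC uses G = diag(K_ff - Q_ff) + sigma_n^2 I with T = 0, and VFE uses G = sigma_n^2 I with T = K_ff - Q_ff. The extra trace term is what makes the negative of the VFE objective a lower bound on the exact log marginal likelihood, whereas FITC instead modifies the model by adding heteroscedastic noise on the diagonal; either objective costs O(N M^2) per evaluation rather than the O(N^3) of exact GP regression.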

Related research

10/24/2017 · Multi-resolution approximations of Gaussian processes for large spatial datasets
08/27/2023 · Integrated Variational Fourier Features for Fast Spatial Modelling with Gaussian Processes
10/06/2020 · Recyclable Gaussian Processes
08/01/2020 · Convergence of Sparse Variational Inference in Gaussian Processes Regression
03/05/2020 · Knot Selection in Sparse Gaussian Processes with a Variational Objective
03/08/2019 · Rates of Convergence for Sparse Variational Gaussian Process Regression
02/22/2022 · Adaptive Cholesky Gaussian Processes
