Gaussian Processes for Big Data

09/26/2013
by James Hensman, et al.

We introduce stochastic variational inference for Gaussian process models. This enables the application of Gaussian process (GP) models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readily extended to models with non-Gaussian likelihoods and latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy problem and two real world data sets.
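The key property described in the abstract is that, given a variational posterior q(u) = N(m, S) over inducing variables, the lower bound decomposes into a sum over data points, so an unbiased estimate can be computed on a minibatch. The sketch below illustrates this estimator for GP regression in plain NumPy; the RBF kernel, function names, and all parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between two sets of inputs.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def minibatch_elbo(Xb, yb, Z, m, S, noise, N):
    """Unbiased minibatch estimate of the variational bound.

    q(u) = N(m, S) over inducing values at locations Z. The data term
    is scaled by N / |batch| so its expectation equals the full sum
    over all N points, enabling stochastic optimization.
    """
    M = Z.shape[0]
    Kmm = rbf(Z, Z) + 1e-6 * np.eye(M)          # inducing covariance (jittered)
    Kmn = rbf(Z, Xb)
    Kmm_inv = np.linalg.inv(Kmm)
    A = Kmm_inv @ Kmn                            # M x B projection
    mu = A.T @ m                                 # marginal means of q(f_i)
    kff = np.ones(Xb.shape[0])                   # RBF diagonal (variance = 1)
    var_f = kff - np.sum(Kmn * A, axis=0) + np.sum(A * (S @ A), axis=0)
    # Expected Gaussian log-likelihood, per data point
    ell = (-0.5 * np.log(2 * np.pi * noise)
           - 0.5 * ((yb - mu) ** 2 + var_f) / noise)
    # Closed-form KL(q(u) || p(u)) between the two Gaussians
    kl = 0.5 * (np.trace(Kmm_inv @ S) + m @ Kmm_inv @ m - M
                + np.linalg.slogdet(Kmm)[1] - np.linalg.slogdet(S)[1])
    return (N / Xb.shape[0]) * ell.sum() - kl
```

Because the estimate is unbiased and q(u) is a global variable shared across all points, the variational parameters (m, S), kernel hyperparameters, and inducing locations Z can all be updated with stochastic gradient steps on minibatches.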


Related research

02/06/2014: Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
Gaussian processes (GPs) are a powerful tool for probabilistic inference...

10/23/2019: Sparse Orthogonal Variational Inference for Gaussian Processes
We introduce a new interpretation of sparse variational approximations f...

11/02/2012: Deep Gaussian Processes
In this paper we introduce deep Gaussian process (GP) models. Deep GPs a...

11/05/2019: GP-ALPS: Automatic Latent Process Selection for Multi-Output Gaussian Process Models
A simple and widely adopted approach to extend Gaussian processes (GPs) ...

09/10/2022: Revisiting Active Sets for Gaussian Process Decoders
Decoders built on Gaussian processes (GPs) are enticing due to the margi...

05/27/2017: Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes
Often in machine learning, data are collected as a combination of multip...

02/06/2014: Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models - a Gentle Tutorial
In this tutorial we explain the inference procedures developed for the s...
