Memory-Based Dual Gaussian Processes for Sequential Learning

by Paul E. Chang, et al.

Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning. In such cases, errors can accumulate over time due to inaccuracies in the posterior, hyperparameters, and inducing points, making accurate learning difficult. Here, we present a method to keep all such errors in check using the recently proposed dual sparse variational GP. Our method enables accurate inference for generic likelihoods and improves learning by actively building and updating a memory of past data. We demonstrate its effectiveness in several applications involving Bayesian optimization, active learning, and continual learning.
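The core idea of sequential GP learning without revisiting past data is that the sparse posterior can be carried through additive sufficient statistics tied to the inducing points. Below is a minimal NumPy sketch of this idea for Gaussian-likelihood regression; it is not the authors' dual SVGP implementation, and the class and function names (`OnlineSparseGP`, `rbf`) are illustrative. Because the statistics are additive, streaming the data in batches gives exactly the same predictions as processing it all at once.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between two sets of points."""
    sqdist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

class OnlineSparseGP:
    """Sequential sparse GP regression via additive sufficient statistics.

    Maintains P = sum_i k(Z, x_i) k(Z, x_i)^T and c = sum_i k(Z, x_i) y_i,
    so new batches can be absorbed without storing past data.
    """
    def __init__(self, Z, noise=0.1):
        self.Z = Z                      # inducing inputs (M, d)
        self.noise = noise              # observation noise std
        M = Z.shape[0]
        self.P = np.zeros((M, M))       # accumulated K_zf K_fz
        self.c = np.zeros(M)            # accumulated K_zf y

    def update(self, X, y):
        """Absorb a new batch (X, y) into the sufficient statistics."""
        Kzf = rbf(self.Z, X)
        self.P += Kzf @ Kzf.T
        self.c += Kzf @ y

    def predict(self, Xs):
        """Posterior predictive mean at test inputs Xs."""
        Kzz = rbf(self.Z, self.Z)
        A = self.noise**2 * Kzz + self.P + 1e-8 * np.eye(len(self.Z))
        w = np.linalg.solve(A, self.c)
        return rbf(Xs, self.Z) @ w
```

Processing the stream in two batches is mathematically identical to a single batch here, which is the property that keeps errors from accumulating in the conjugate case; the paper's dual parameterization extends this style of update to non-Gaussian likelihoods.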




Related papers
Variational Auto-Regressive Gaussian Processes for Continual Learning

This paper proposes Variational Auto-Regressive Gaussian Process (VAR-GP...

Fantasizing with Dual GPs in Bayesian Optimization and Active Learning

Gaussian processes (GPs) are the main surrogate functions used for seque...

Functional Regularisation for Continual Learning using Gaussian Processes

We introduce a novel approach for supervised continual learning based on...

Continual Multi-task Gaussian Processes

We address the problem of continual learning in multi-task Gaussian proc...

Conditioning Sparse Variational Gaussian Processes for Online Decision-making

With a principled representation of uncertainty and closed form posterio...

Kernel Interpolation for Scalable Online Gaussian Processes

Gaussian processes (GPs) provide a gold standard for performance in onli...

Active learning for enumerating local minima based on Gaussian process derivatives

We study active learning (AL) based on Gaussian Processes (GPs) for effi...
