Continual Multi-task Gaussian Processes

10/31/2019
by Pablo Moreno-Muñoz, et al.

We address the problem of continual learning in multi-task Gaussian process (GP) models for handling sequential input-output observations. Our approach extends the prior-posterior recursion of online Bayesian inference, in which past posterior discoveries become future prior beliefs, to the infinite functional space setting of GPs. For scalability, we combine variational inference with a sparse approximation based on inducing inputs. As a consequence, we obtain tractable continual lower bounds in which two novel Kullback-Leibler (KL) divergences intervene naturally. The key technical property of our method is the recursive reconstruction of conditional GP priors conditioned on the variational parameters learned so far. To achieve this, we introduce a novel factorization of past variational distributions in which the GP predictive equations propagate the posterior uncertainty forward. We then show that GP models can be derived for many types of sequential observations, whether discrete or continuous, and remain amenable to stochastic optimization. The continual inference approach also applies to scenarios where multi-channel or heterogeneous observations may appear. Extensive experiments on a variety of synthetic and real-world datasets demonstrate that the method is fully scalable, performs reliably and is robust to error propagation in the uncertainty estimates.
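
As a schematic illustration, not the paper's exact derivation, the prior-posterior recursion described above and a simplified one-KL form of a continual sparse variational bound can be written (assuming inducing variables u, variational distributions q_t and a reconstructed conditional prior \tilde{p}) as

p(f \mid \mathcal{D}_{1:t}) \;\propto\; p(\mathcal{D}_t \mid f)\, p(f \mid \mathcal{D}_{1:t-1}),

\mathcal{L}_t \;=\; \mathbb{E}_{q_t(f)}\!\left[ \log p(\mathcal{D}_t \mid f) \right] \;-\; \mathrm{KL}\!\left[ q_t(u) \,\|\, \tilde{p}(u \mid q_{t-1}) \right],

where q_{t-1} is the variational distribution learned on the previous batch of data and \tilde{p}(u \mid q_{t-1}) is the conditional GP prior reconstructed from its parameters. The bound derived in the paper contains two KL divergences rather than the single one sketched here.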

research 01/31/2019
Functional Regularisation for Continual Learning using Gaussian Processes
We introduce a novel approach for supervised continual learning based on...

research 06/09/2020
Variational Auto-Regressive Gaussian Processes for Continual Learning
This paper proposes Variational Auto-Regressive Gaussian Process (VAR-GP...

research 03/06/2020
Rethinking Sparse Gaussian Processes: Bayesian Approaches to Inducing-Variable Approximations
Variational inference techniques based on inducing variables provide an...

research 06/06/2023
Memory-Based Dual Gaussian Processes for Sequential Learning
Sequential learning with Gaussian processes (GPs) is challenging when ac...

research 05/11/2022
Stochastic Variational Smoothed Model Checking
Model-checking for parametric stochastic models can be expressed as chec...

research 02/14/2022
Mapping Interstellar Dust with Gaussian Processes
Interstellar dust corrupts nearly every stellar observation, and account...

research 11/18/2016
A Generalized Stochastic Variational Bayesian Hyperparameter Learning Framework for Sparse Spectrum Gaussian Process Regression
While much research effort has been dedicated to scaling up sparse Gauss...
