
Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

We identify a new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process. Inference in this setting has either employed computationally intensive MCMC methods, or relied on factorisations of the variational posterior. As we demonstrate in our experiments, the factorisation between latent system states and transition function can lead to a miscalibrated posterior and to learning unnecessarily large noise terms. We eliminate this factorisation by explicitly modelling the dependence between state trajectories and the Gaussian process posterior. Samples of the latent states can then be tractably generated by conditioning on this representation. The method we obtain (VCDT: variationally coupled dynamics and trajectories) gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods. Code is available at: github.com/ialong/GPt.
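The key mechanism the abstract describes, generating latent-state samples by conditioning on a shared representation of the transition function rather than factorising states and function apart, can be illustrated with a toy sketch. The code below is not the paper's GPt implementation; it is a minimal numpy illustration in which each new state is drawn by conditioning the GP on the transition values already sampled along the current trajectory, so the state sequence and the transition function stay coupled. The kernel, noise scale, and one-dimensional state are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_trajectory(x0, T, process_noise=0.05, jitter=1e-8, seed=0):
    """Sample a latent trajectory from a GP state-space model by sequential
    conditioning: f(x_t) is drawn from the GP posterior given all previously
    sampled (state, transition-value) pairs, i.e. no mean-field factorisation
    between the trajectory and the transition function."""
    rng = np.random.default_rng(seed)
    xs = [x0]
    cond_x = np.empty(0)   # inputs where f has already been sampled
    cond_f = np.empty(0)   # the corresponding sampled values of f
    for _ in range(T):
        x = np.array([xs[-1]])
        if cond_x.size == 0:
            mean, var = np.zeros(1), rbf(x, x)[0]
        else:
            K = rbf(cond_x, cond_x) + jitter * np.eye(cond_x.size)
            k = rbf(cond_x, x)
            sol = np.linalg.solve(K, k)
            mean = sol.T @ cond_f
            var = rbf(x, x)[0] - (k * sol).sum(axis=0)
        # Draw f(x_t) jointly consistent with the trajectory so far.
        f = rng.normal(mean, np.sqrt(np.maximum(var, 0.0)))
        cond_x = np.append(cond_x, x)
        cond_f = np.append(cond_f, f)
        xs.append(f[0] + process_noise * rng.standard_normal())
    return np.array(xs)

traj = sample_trajectory(x0=0.5, T=20)
```

A mean-field scheme would instead sample each state from a marginal that ignores which transition-function draw produced the earlier states; the sequential conditioning above is what keeps the posterior over trajectories and the posterior over the function coupled.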

Related research

- Non-Factorised Variational Inference in Dynamical Systems (12/14/2018)
  We focus on variational inference in dynamical systems where the discret...

- Symplectic Gaussian Process Dynamics (02/02/2021)
  Dynamics model learning is challenging and at the same time an active fi...

- Variational Inference for Gaussian Process Models with Linear Complexity (11/28/2017)
  Large-scale Gaussian process inference has long faced practical challeng...

- Identification of Gaussian Process State Space Models (05/30/2017)
  The Gaussian process state space model (GPSSM) is a non-linear dynamical...

- Likelihood-Free Inference in State-Space Models with Unknown Dynamics (11/02/2021)
  We introduce a method for inferring and predicting latent states in the ...

- Modelling Non-Smooth Signals with Complex Spectral Structure (03/14/2022)
  The Gaussian Process Convolution Model (GPCM; Tobar et al., 2015a) is a ...

- Factorized Gaussian Process Variational Autoencoders (11/14/2020)
  Variational autoencoders often assume isotropic Gaussian priors and mean...