
Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

We identify a new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process. Inference in this setting has either employed computationally intensive MCMC methods, or relied on factorisations of the variational posterior. As we demonstrate in our experiments, the factorisation between latent system states and transition function can lead to a miscalibrated posterior and to learning unnecessarily large noise terms. We eliminate this factorisation by explicitly modelling the dependence between state trajectories and the Gaussian process posterior. Samples of the latent states can then be tractably generated by conditioning on this representation. The method we obtain (VCDT: variationally coupled dynamics and trajectories) gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods. Code is available at:
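The key idea above (sampling latent state trajectories by conditioning on a single representation of the Gaussian process posterior, rather than factorising states and dynamics) can be sketched as follows. This is a hypothetical, minimal NumPy illustration, not the authors' released code: for brevity it rolls the trajectory out through the conditional mean of the transition function given one sample of inducing outputs `u`, whereas the full scheme also propagates the conditional variance.

```python
# Sketch: non-mean-field sampling in a GP state-space model (GPSSM).
# We draw inducing outputs u once, then roll out a latent trajectory
# whose transitions are conditioned on that same GP sample, so states
# and transition function stay coupled (no mean-field factorisation).
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)

# Sparse GP: inducing inputs z and one sample of inducing outputs u.
# (Here u is drawn from the prior for simplicity; in the method it
# would come from the variational distribution q(u).)
z = np.linspace(-3.0, 3.0, 20)
Kzz = rbf(z, z) + 1e-6 * np.eye(len(z))   # jitter for stability
u = rng.multivariate_normal(np.zeros(len(z)), Kzz)

Kzz_inv_u = np.linalg.solve(Kzz, u)       # precompute K_zz^{-1} u

def f_given_u(x):
    """Conditional mean of the transition function at x, given u."""
    return rbf(np.atleast_1d(x), z) @ Kzz_inv_u

# Roll out: x_{t+1} = f(x_t) + process noise, with f tied to the single
# GP sample u throughout the trajectory.
T, noise_std = 50, 0.05
x = np.zeros(T)
for t in range(T - 1):
    x[t + 1] = f_given_u(x[t])[0] + noise_std * rng.standard_normal()
```

Because every transition in the rollout uses the same `u`, the sampled trajectory and the transition function are jointly consistent; a mean-field scheme would instead sample states independently of the GP posterior.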


Non-Factorised Variational Inference in Dynamical Systems

We focus on variational inference in dynamical systems where the discret...

Symplectic Gaussian Process Dynamics

Dynamics model learning is challenging and at the same time an active fi...

Variational Inference for Gaussian Process Models with Linear Complexity

Large-scale Gaussian process inference has long faced practical challeng...

Identification of Gaussian Process State Space Models

The Gaussian process state space model (GPSSM) is a non-linear dynamical...

Likelihood-Free Inference in State-Space Models with Unknown Dynamics

We introduce a method for inferring and predicting latent states in the ...

Modelling Non-Smooth Signals with Complex Spectral Structure

The Gaussian Process Convolution Model (GPCM; Tobar et al., 2015a) is a ...

Factorized Gaussian Process Variational Autoencoders

Variational autoencoders often assume isotropic Gaussian priors and mean...