Latent SDEs on Homogeneous Spaces

06/28/2023
by Sebastian Zeng et al.

We consider the problem of variational Bayesian inference in a latent variable model where a (possibly complex) observed stochastic process is governed by the solution of a latent stochastic differential equation (SDE). Motivated by the challenges that arise when trying to learn an (almost arbitrary) latent neural SDE from large-scale data, such as efficient gradient computation, we take a step back and study a specific subclass instead. In our case, the SDE evolves on a homogeneous latent space and is induced by stochastic dynamics of the corresponding (matrix) Lie group. In learning problems, SDEs on the unit n-sphere are arguably the most relevant incarnation of this setup. Notably, for variational inference, the sphere not only facilitates using a truly uninformative prior SDE, but we also obtain a particularly simple and intuitive expression for the Kullback-Leibler divergence between the approximate posterior and prior process in the evidence lower bound. Experiments demonstrate that a latent SDE of the proposed type can be learned efficiently by means of an existing one-step geometric Euler-Maruyama scheme. Despite restricting ourselves to a less diverse class of SDEs, we achieve competitive or even state-of-the-art performance on various time series interpolation and classification benchmarks.
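To make the geometric Euler-Maruyama idea concrete, here is a minimal sketch (not the paper's implementation) of one such scheme on the unit sphere. It assumes, purely for illustration, constant skew-symmetric drift and diffusion matrices in the Lie algebra so(n); in the latent SDE setting these would instead be produced by neural networks. Because each increment is skew-symmetric, its matrix exponential is orthogonal, so the state provably stays on the sphere:

```python
import numpy as np
from scipy.linalg import expm

def geometric_em_step(z, A, Bs, dt, rng):
    """One geometric Euler-Maruyama step on the unit sphere S^{n-1}.

    The increment Omega lies in so(n) (skew-symmetric matrices), so
    expm(Omega) is orthogonal and the norm of z is preserved exactly.
    """
    dW = rng.normal(scale=np.sqrt(dt), size=len(Bs))  # Brownian increments
    Omega = A * dt + sum(B * w for B, w in zip(Bs, dW))
    return expm(Omega) @ z

def skew(M):
    """Project a matrix onto so(n)."""
    return 0.5 * (M - M.T)

rng = np.random.default_rng(0)
n = 3
A = skew(rng.normal(size=(n, n)))                        # drift direction
Bs = [skew(rng.normal(size=(n, n))) for _ in range(2)]   # diffusion fields
z = np.array([1.0, 0.0, 0.0])                            # start on S^2
for _ in range(100):
    z = geometric_em_step(z, A, Bs, dt=0.01, rng=rng)
# z remains a unit vector up to floating-point error
```

The scheme is "one-step" in the sense that each update is a single Lie-group exponential applied to the current state, rather than an ambient-space Euler step followed by a projection back onto the manifold.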

