
Probabilistic Recurrent State-Space Models

by Andreas Doerr, et al.

State-space models (SSMs) are a highly expressive model class for learning patterns in time-series data and for system identification. Deterministic versions of SSMs (e.g., LSTMs) have proved extremely successful in modeling complex time-series data. Fully probabilistic SSMs, however, often prove hard to train, even on small problems. To overcome this limitation, we propose a scalable initialization and training algorithm based on doubly stochastic variational inference and Gaussian processes. In contrast to related approaches, the variational approximation we propose fully captures the temporal correlations of the latent state, allowing for robust training.
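The doubly stochastic rollout idea can be illustrated with a toy sketch. The example below is an assumption-laden simplification, not the paper's method: it uses a 1-D latent state and a random-Fourier-feature approximation of the GP transition (the paper uses sparse inducing-point GPs), and all parameter values are made up for illustration. The two sources of stochasticity are a sample of the variational GP weights (epistemic) and per-step process noise (aleatoric), and the latent trajectory is propagated step by step so its temporal correlations are preserved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy GP transition model via random Fourier features
# (an assumption; these approximate an RBF kernel).
num_features = 50
lengthscale, signal_var, noise_var = 1.0, 1.0, 0.01
omega = rng.normal(scale=1.0 / lengthscale, size=num_features)
phase = rng.uniform(0.0, 2 * np.pi, size=num_features)

def features(x):
    # Random Fourier feature map for a scalar input x.
    return np.sqrt(2.0 * signal_var / num_features) * np.cos(omega * x + phase)

# Illustrative variational posterior over feature weights, q(w) = N(mu, diag(s2)).
mu = 0.1 * rng.normal(size=num_features)
s2 = np.full(num_features, 0.05)

def sample_trajectory(x0, T):
    # Doubly stochastic rollout: one reparameterized sample of the GP
    # weights per trajectory, plus fresh process noise at every step.
    w = mu + np.sqrt(s2) * rng.normal(size=num_features)
    xs = [x0]
    for _ in range(T):
        f = features(xs[-1]) @ w          # GP transition mean for this sample
        xs.append(f + np.sqrt(noise_var) * rng.normal())
    return np.array(xs)

traj = sample_trajectory(0.0, 20)  # 21 states: x_0 through x_20
```

In a training loop, many such sampled trajectories would be scored against the observations and the evidence lower bound optimized by stochastic gradients; this sketch only shows the sampling side.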




Hidden Parameter Recurrent State Space Models For Changing Dynamics Scenarios

Recurrent State-space models (RSSMs) are highly expressive models for le...

Structured Variational Inference in Unstable Gaussian Process State Space Models

Gaussian processes are expressive, non-parametric statistical models tha...

Recurrent Neural Processes

We extend Neural Processes (NPs) to sequential data through Recurrent NP...

Scalable approximate inference for state space models with normalising flows

By exploiting mini-batch stochastic gradient optimisation, variational i...

A Variational Bayesian State-Space Approach to Online Passive-Aggressive Regression

Online Passive-Aggressive (PA) learning is a class of online margin-base...

Variational Inference for On-line Anomaly Detection in High-Dimensional Time Series

Approximate variational inference has shown to be a powerful tool for mo...

SeDMiD for Confusion Detection: Uncovering Mind State from Time Series Brain Wave Data

Understanding how brain functions has been an intriguing topic for years...