Multimodal Teacher Forcing for Reconstructing Nonlinear Dynamical Systems

12/15/2022
by Manuel Brenner et al.

Many, if not most, systems of interest in science are naturally described as nonlinear dynamical systems (DS). Empirically, we commonly access these systems through time series measurements, and we often have simultaneous time series from several data modalities, for instance event counts alongside a continuous signal. While there are by now many powerful machine learning (ML) tools for integrating different data modalities into predictive models, this problem has rarely been approached from the perspective of uncovering the underlying, data-generating DS (also known as DS reconstruction). Recently, sparse teacher forcing (TF) has been suggested as an efficient control-theoretic method for dealing with exploding loss gradients when training ML models on chaotic DS. Here we incorporate this idea into a novel recurrent neural network (RNN) training framework for DS reconstruction based on multimodal variational autoencoders (MVAEs). The forcing signal for the RNN is generated by the MVAE, which integrates the different simultaneously measured time series into a joint latent code optimal for DS reconstruction. We show that this training method achieves significantly better reconstructions on multimodal datasets generated from chaotic DS benchmarks than various alternative methods.
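To make the core mechanism concrete, here is a minimal sketch of sparse teacher forcing on a simple RNN rollout. All names and the toy tanh update are illustrative assumptions, not the paper's model: in the actual framework the forcing states come from the MVAE's joint latent code rather than, as here, from a given array of encoded states.

```python
import numpy as np

def rnn_step(z, W, b):
    """One latent-state update of a toy tanh RNN (stand-in for the trained model)."""
    return np.tanh(W @ z + b)

def sparse_tf_rollout(z_encoded, W, b, tau=5):
    """Roll the RNN forward, but every `tau` steps replace its latent state
    with the encoder-provided state (sparse teacher forcing). Between forcing
    points the model runs freely, which keeps loss gradients from exploding
    on chaotic dynamics while still training multi-step prediction.
    `z_encoded` has shape (T, d): one encoded latent state per time step."""
    T, d = z_encoded.shape
    z = z_encoded[0]                  # initialize from the first encoded state
    traj = [z]
    for t in range(1, T):
        z = rnn_step(z, W, b)         # free-running prediction
        if t % tau == 0:              # sparse forcing: reset to encoded state
            z = z_encoded[t]
        traj.append(z)
    return np.stack(traj)
```

The forcing interval `tau` trades off gradient control against autonomy: small `tau` keeps the rollout close to the data (stable gradients), while larger `tau` forces the model to sustain the dynamics on its own for longer stretches.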


Related research:

- Identifying nonlinear dynamical systems from multi-modal time series data (11/04/2021)
- Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems (07/06/2022)
- Generalized Teacher Forcing for Learning Chaotic Dynamics (06/07/2023)
- Lyapunov-Guided Embedding for Hyperparameter Selection in Recurrent Neural Networks (04/11/2022)
- Efficient learning of nonlinear prediction models with time-series privileged information (09/15/2022)
- Multi-Task Dynamical Systems (10/08/2022)
- Recurrences reveal shared causal drivers of complex time series (01/31/2023)
