Generalized Teacher Forcing for Learning Chaotic Dynamics

06/07/2023
by Florian Hess, et al.

Chaotic dynamical systems (DS) are ubiquitous in nature and society. Often we are interested in reconstructing such systems from observed time series for prediction or mechanistic insight, where by reconstruction we mean learning the geometrical and invariant temporal properties of the system in question (such as attractors). However, training reconstruction algorithms like recurrent neural networks (RNNs) on such systems with gradient-descent-based techniques faces severe challenges, mainly because the exponential divergence of trajectories in chaotic systems causes exploding gradients. Moreover, for (scientific) interpretability we want reconstructions that are as low-dimensional as possible, preferably in a model that is mathematically tractable. Here we report that a surprisingly simple modification of teacher forcing leads to provably strictly all-time bounded gradients when training on chaotic systems and, when paired with a simple architectural rearrangement of a tractable RNN design, piecewise-linear RNNs (PLRNNs), allows for faithful reconstruction in spaces of at most the dimensionality of the observed system. We show on several DS that with these amendments we can reconstruct DS better than current state-of-the-art (SOTA) algorithms, in much lower dimensions. The performance differences were particularly compelling on real-world data, with which most other methods struggled severely. This work thus leads to a simple yet powerful DS reconstruction algorithm that is at the same time highly interpretable.
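The abstract does not spell out the training rule, so the following PyTorch sketch illustrates one plausible reading of generalized teacher forcing for a PLRNN: the latent state fed into the next recurrence step is a convex combination of the model's own state and a state inferred from the observation, controlled by a factor alpha. The specific PLRNN equation, the pseudo-inverse state estimate, and all names and parameters below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of generalized teacher forcing (GTF) for a piecewise-linear RNN.
# The update rule, state-estimation step, and parameter choices are assumptions
# for illustration; consult the paper for the authors' exact formulation.
import torch
import torch.nn as nn


class PLRNN(nn.Module):
    """One common PLRNN form: z_t = A * z_{t-1} + W @ relu(z_{t-1}) + h."""

    def __init__(self, dim_z, dim_x):
        super().__init__()
        self.A = nn.Parameter(0.9 * torch.ones(dim_z))                 # diagonal linear part
        self.W = nn.Parameter(0.01 * torch.randn(dim_z, dim_z))        # nonlinear coupling
        self.h = nn.Parameter(torch.zeros(dim_z))                      # bias
        self.B = nn.Parameter(torch.randn(dim_x, dim_z) / dim_z**0.5)  # linear observation map

    def step(self, z):
        return self.A * z + torch.relu(z) @ self.W.T + self.h

    def observe(self, z):
        return z @ self.B.T


def gtf_loss(model, x, alpha):
    """Sequence loss with generalized teacher forcing.

    alpha = 0 recovers free-running (fully autonomous) training,
    alpha = 1 recovers classical teacher forcing; intermediate values
    interpolate between the model state and a data-inferred state.
    """
    # Data-inferred latent states via the pseudo-inverse of B (an assumption).
    B_pinv = torch.linalg.pinv(model.B)
    z_hat = x @ B_pinv.T                                   # shape (T, dim_z)

    z = z_hat[0]
    loss = 0.0
    for t in range(1, x.shape[0]):
        z = model.step((1 - alpha) * z + alpha * z_hat[t - 1])  # GTF interpolation
        loss = loss + ((model.observe(z) - x[t]) ** 2).mean()
    return loss / (x.shape[0] - 1)


# Usage sketch: x is an observed (T, dim_x) time series, e.g. from the Lorenz system.
# model = PLRNN(dim_z=3, dim_x=3)
# opt = torch.optim.Adam(model.parameters(), lr=1e-3)
# loss = gtf_loss(model, x, alpha=0.2); loss.backward(); opt.step()
```

How alpha is chosen or scheduled over training is not stated in the abstract and is left open in this sketch; intuitively, a nonzero alpha keeps the forced state close to the data and thereby limits how far backpropagated gradients can grow along diverging chaotic trajectories.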


Related research

07/06/2022 · Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems
In many scientific disciplines, we are interested in inferring the nonli...

10/14/2021 · How to train RNNs on chaotic data?
Recurrent neural networks (RNNs) are wide-spread machine learning tools ...

12/15/2022 · Multimodal Teacher Forcing for Reconstructing Nonlinear Dynamical Systems
Many, if not most, systems of interest in science are naturally describe...

04/11/2020 · On Error Correction Neural Networks for Economic Forecasting
Recurrent neural networks (RNNs) are more suitable for learning non-line...

12/23/2016 · A State Space Approach for Piecewise-Linear Recurrent Neural Networks for Reconstructing Nonlinear Dynamics from Neural Measurements
The computational properties of neural systems are often thought to be i...

05/18/2018 · Prediction in Projection: A new paradigm in delay-coordinate reconstruction
Delay-coordinate embedding is a powerful, time-tested mathematical frame...

11/02/2016 · Inferring Coupling of Distributed Dynamical Systems via Transfer Entropy
In this work, we are interested in structure learning for a set of spati...
