Inferring Dynamical Systems with Long-Range Dependencies through Line Attractor Regularization

10/08/2019
by Dominik Schmidt, et al.

Vanilla RNNs with ReLU activation have a simple structure that is amenable to systematic dynamical systems analysis and interpretation, but they suffer from the exploding and vanishing gradient problem. Recent attempts to retain this simplicity while alleviating the gradient problem rely on proper initialization schemes or on orthogonality/unitary constraints on the RNN's recurrence matrix; such constraints, however, limit the network's expressive power with regard to dynamical systems phenomena like chaos or multi-stability. Here, we instead suggest a regularization scheme that pushes part of the RNN's latent subspace toward a line attractor configuration, enabling long short-term memory and arbitrarily slow time scales. We show that our approach excels on a number of benchmarks, such as the sequential MNIST and multiplication problems, and enables reconstruction of dynamical systems that harbor widely different time scales.
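The abstract does not spell out the penalty, but the idea of pushing a latent subspace toward a line attractor can be sketched as an L2 regularizer on a ReLU RNN of the form z_t = A z_{t-1} + W relu(z_{t-1}) + h: for a chosen subset of "memory" units, pull the diagonal recurrence toward 1 and the off-diagonal weights and biases toward 0, so those units approximately hold their state. The function name, the exact penalty form, and the single weighting factor `tau` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def line_attractor_regularizer(A, W, h, n_reg, tau=1.0):
    """Hedged sketch of a line-attractor regularizer.

    For the first n_reg units of an RNN z_t = A z_{t-1} + W relu(z_{t-1}) + h:
      - push the diagonal of A toward 1   (unit integrates / holds its state)
      - push their rows of W toward 0     (decouple from the other units)
      - push their biases toward 0        (no constant drift)
    When all three penalties are zero, each regularized unit satisfies
    z_i(t) = z_i(t-1), i.e. a continuum of fixed points: a line attractor.
    """
    A_diag = np.diag(A)[:n_reg]
    loss = np.sum((A_diag - 1.0) ** 2)   # diagonal recurrence -> 1
    loss += np.sum(W[:n_reg, :] ** 2)    # off-diagonal coupling -> 0
    loss += np.sum(h[:n_reg] ** 2)       # bias -> 0
    return tau * loss
```

In training, this penalty would simply be added to the task loss; the unregularized units remain free to express fast or chaotic dynamics, which is how the scheme avoids the expressivity limits of global orthogonality constraints.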


