On Lyapunov Exponents for RNNs: Understanding Information Propagation Using Dynamical Systems Tools

06/25/2020
by Ryan Vogt, et al.

Recurrent neural networks (RNNs) have been successfully applied to a variety of problems involving sequential data, but their optimization is sensitive to parameter initialization, architecture, and optimizer hyperparameters. Considering RNNs as dynamical systems, natural quantities for capturing stability, i.e., the growth and decay over long iterates, are the Lyapunov exponents (LEs), which together form the Lyapunov spectrum. The LEs have a bearing on the stability of RNN training dynamics because the forward propagation of information is related to the backward propagation of error gradients. LEs measure the asymptotic rates of expansion and contraction of nonlinear system trajectories, and they generalize stability analysis to the time-varying attractors structuring the non-autonomous dynamics of data-driven RNNs. As a tool for understanding and exploiting the stability of training dynamics, the Lyapunov spectrum fills an existing gap between prescriptive mathematical approaches of limited scope and computationally expensive empirical approaches. To leverage this tool, we implement an efficient way to compute LEs for RNNs during training, discuss the aspects specific to standard RNN architectures driven by typical sequential datasets, and show that the Lyapunov spectrum can serve as a robust readout of training stability across hyperparameters. With this exposition-oriented contribution, we hope to draw attention to this understudied but theoretically grounded tool for understanding training stability in RNNs.
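The abstract refers to an efficient way of computing LEs for RNNs during training without reproducing it here. For orientation, below is a minimal NumPy sketch of the standard QR-based (Benettin-style) algorithm for estimating the Lyapunov spectrum of a driven vanilla tanh RNN: push an orthonormal frame through the step Jacobians, re-orthonormalize with a QR decomposition at each step, and average the logs of the diagonal of R. The function names, the toy RNN, and the burn-in parameter are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rnn_step(W_h, W_x, b, h, x):
    """One step of a vanilla tanh RNN: h' = tanh(W_h h + W_x x + b).

    Returns the next hidden state and the Jacobian dh'/dh, which for
    the tanh nonlinearity is diag(1 - h'^2) @ W_h.
    """
    h_next = np.tanh(W_h @ h + W_x @ x + b)
    jac = (1.0 - h_next ** 2)[:, None] * W_h  # row-scale W_h by 1 - h'^2
    return h_next, jac

def lyapunov_spectrum(W_h, W_x, b, inputs, k=None, burn_in=100, seed=0):
    """Estimate the k leading Lyapunov exponents along a driven trajectory.

    QR algorithm: evolve an orthonormal frame Q under the Jacobians,
    re-orthonormalize at every step, and average log|diag(R)|.
    Exponents are returned in nats per time step.
    """
    n = W_h.shape[0]
    k = k or n
    rng = np.random.default_rng(seed)
    h = np.zeros(n)
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    log_growth = np.zeros(k)
    steps = 0
    for t, x in enumerate(inputs):
        h, jac = rnn_step(W_h, W_x, b, h, x)
        Q, R = np.linalg.qr(jac @ Q)
        if t >= burn_in:  # discard the transient before averaging
            log_growth += np.log(np.abs(np.diag(R)))
            steps += 1
    return log_growth / steps

# Illustrative usage with random weights at gain g; larger g pushes the
# leading exponent toward zero and beyond (the edge of chaos).
if __name__ == "__main__":
    n, m, T, g = 64, 8, 2000, 1.5
    rng = np.random.default_rng(1)
    W_h = g * rng.standard_normal((n, n)) / np.sqrt(n)
    W_x = rng.standard_normal((n, m)) / np.sqrt(m)
    b = np.zeros(n)
    inputs = rng.standard_normal((T, m))
    les = lyapunov_spectrum(W_h, W_x, b, inputs, k=8)
    print("leading Lyapunov exponents:", np.round(les, 3))
```

In the paper's setting the spectrum would be recomputed as the weights change during training, so that its evolution serves as the stability readout; the sketch above evaluates it once for a fixed set of weights.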


Related research

04/11/2022: Lyapunov-Guided Embedding for Hyperparameter Selection in Recurrent Neural Networks
Recurrent Neural Networks (RNN) are ubiquitous computing systems for seq...

04/11/2020: Convex Sets of Robust Recurrent Neural Networks
Recurrent neural networks (RNNs) are a class of nonlinear dynamical syst...

10/14/2021: How to train RNNs on chaotic data?
Recurrent neural networks (RNNs) are wide-spread machine learning tools ...

07/29/2020: Theory of gating in recurrent neural networks
RNNs are popular dynamical models, used for processing sequential data. ...

06/12/2023: On the Dynamics of Learning Time-Aware Behavior with Recurrent Neural Networks
Recurrent Neural Networks (RNNs) have shown great success in modeling ti...

10/28/2020: The geometry of integration in text classification RNNs
Despite the widespread application of recurrent neural networks (RNNs) a...
