Gating creates slow modes and controls phase-space complexity in GRUs and LSTMs

01/31/2020
by Tankut Can et al.

Recurrent neural networks (RNNs) are powerful dynamical models for data with complex temporal structure. However, training RNNs has traditionally proved challenging due to exploding or vanishing gradients. Gated architectures such as LSTMs and GRUs (and their variants) significantly mitigate these training difficulties by introducing various types of gating units. While these gates empirically improve performance, how they influence the dynamics and trainability of GRUs and LSTMs is not well understood. Here, we study randomly initialized LSTMs and GRUs as dynamical systems and ask how their salient dynamical properties are shaped by the gates. We leverage tools from random matrix theory and mean-field theory to study the state-to-state Jacobians of GRUs and LSTMs. We show that the update gate in the GRU and the forget gate in the LSTM can lead to an accumulation of slow modes in the dynamics, and that the GRU update gate can poise the system at a marginally stable point. The reset gate in the GRU and the output and input gates in the LSTM control the spectral radius of the Jacobian, and the GRU reset gate also modulates the complexity of the landscape of fixed points. Furthermore, for the GRU we obtain a phase diagram describing the statistical properties of fixed points. Finally, we provide a preliminary comparison of training performance across the various dynamical regimes, to be investigated more fully elsewhere. The techniques introduced here can be generalized to other RNN architectures to elucidate how various architectural choices influence the dynamics, and potentially to discover novel architectures.
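To make the Jacobian-based analysis concrete, here is a minimal numerical sketch (not code from the paper): it builds an autonomous GRU with random Gaussian recurrent weights, relaxes it from a random initial state, and estimates the state-to-state Jacobian by finite differences to read off its spectral radius and the fraction of slow, near-unit-modulus modes. The network size N, the gain parameters g_z, g_r, g_h, the zero-input setting, and the 0.95 slow-mode threshold are all illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch: probe the state-to-state Jacobian of a randomly
# initialized, input-free GRU. All parameter values below are assumptions
# chosen for illustration, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)
N = 500                        # number of hidden units (assumed)
g_z, g_r, g_h = 1.0, 1.0, 1.5  # weight gains for update, reset, candidate (assumed)

# Random Gaussian recurrent weights with variance g^2 / N,
# a standard mean-field-style initialization.
Wz = rng.normal(0.0, g_z / np.sqrt(N), (N, N))
Wr = rng.normal(0.0, g_r / np.sqrt(N), (N, N))
Wh = rng.normal(0.0, g_h / np.sqrt(N), (N, N))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h):
    """One autonomous GRU update h_{t+1} = F(h_t) with zero external input."""
    z = sigmoid(Wz @ h)               # update gate
    r = sigmoid(Wr @ h)               # reset gate
    h_tilde = np.tanh(Wh @ (r * h))   # candidate state
    return (1.0 - z) * h + z * h_tilde

# Relax toward a late-time state from a random initial condition.
h = rng.normal(0.0, 1.0, N)
for _ in range(500):
    h = gru_step(h)

# Finite-difference estimate of the Jacobian dF/dh at the visited state.
eps = 1e-6
F0 = gru_step(h)
J = np.empty((N, N))
for j in range(N):
    dh = np.zeros(N)
    dh[j] = eps
    J[:, j] = (gru_step(h + dh) - F0) / eps

eigs = np.linalg.eigvals(J)
print("spectral radius of state-to-state Jacobian:", np.abs(eigs).max())
print("fraction of slow modes (|lambda| > 0.95):", np.mean(np.abs(eigs) > 0.95))
```

Sweeping the assumed gains (e.g. increasing g_h, or biasing the update gate toward saturation) and re-running this sketch is one way to see how the different gates move the spectral radius and the density of slow modes, in the spirit of the regimes described above.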

Related research:

- Dynamical Isometry and a Mean Field Theory of RNNs: Gating Enables Signal Propagation in Recurrent Neural Networks (06/14/2018)
- Theory of gating in recurrent neural networks (07/29/2020)
- Improving speech recognition by revising gated recurrent units (09/29/2017)
- Dynamical Isometry and a Mean Field Theory of LSTMs and GRUs (01/25/2019)
- Long Short-Term Memory as a Dynamically Computed Element-wise Weighted Sum (05/09/2018)
- Recurrent Additive Networks (05/21/2017)
- Understanding and Controlling Memory in Recurrent Neural Networks (02/19/2019)
