Dynamical Isometry and a Mean Field Theory of LSTMs and GRUs

01/25/2019
by Dar Gilboa, et al.

Training recurrent neural networks (RNNs) on long sequence tasks is plagued with difficulties arising from the exponential explosion or vanishing of signals as they propagate forward or backward through the network. Many techniques have been proposed to ameliorate these issues, including various algorithmic and architectural modifications. Two of the most successful RNN architectures, the LSTM and the GRU, do exhibit modest improvements over vanilla RNN cells, but they still suffer from instabilities when trained on very long sequences. In this work, we develop a mean field theory of signal propagation in LSTMs and GRUs that enables us to calculate the time scales for signal propagation as well as the spectral properties of the state-to-state Jacobians. By optimizing these quantities in terms of the initialization hyperparameters, we derive a novel initialization scheme that eliminates or reduces training instabilities. We demonstrate the efficacy of our initialization scheme on multiple sequence tasks, on which it enables successful training while a standard initialization either fails completely or is orders of magnitude slower. We also observe a beneficial effect on generalization performance using this new initialization.
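As a rough illustration of the "state-to-state Jacobian" quantity the abstract refers to, the sketch below is not the paper's derivation or its critical hyperparameters; it simply builds a randomly initialized PyTorch LSTMCell and measures the singular values of the Jacobian of one cell-state update. The width n, weight scale sigma_w, and forget-gate bias shift mu_f are illustrative knobs introduced here, not the paper's notation; dynamical isometry corresponds to this singular value spectrum concentrating near 1, and the paper's mean field analysis is what identifies the critical values of such hyperparameters.

```python
# Minimal sketch (assumptions noted above): probe dynamical isometry by
# measuring the singular values of an LSTM's state-to-state Jacobian at
# initialization. sigma_w and mu_f are illustrative, not the paper's values.
import torch

torch.manual_seed(0)
n = 128          # hidden width; mean field predictions hold as n grows large
sigma_w = 1.0    # weight standard deviation scale (illustrative)
mu_f = 1.0       # forget-gate bias shift; larger values slow forgetting

cell = torch.nn.LSTMCell(n, n)
with torch.no_grad():
    for w in (cell.weight_ih, cell.weight_hh):
        w.normal_(0.0, sigma_w / n ** 0.5)   # per-weight variance sigma_w^2 / n
    cell.bias_ih.zero_()
    cell.bias_hh.zero_()
    cell.bias_hh[n:2 * n] += mu_f            # PyTorch gate order: i, f, g, o

x = torch.randn(1, n)
h0, c0 = torch.randn(1, n), torch.randn(1, n)

def step_c(c):
    # One recurrent step; return the next cell state given the previous one.
    _, c_next = cell(x, (h0, c))
    return c_next

# Jacobian of the next cell state with respect to the previous cell state.
J = torch.autograd.functional.jacobian(step_c, c0).reshape(n, n)
svals = torch.linalg.svdvals(J)
print(f"singular values: mean={svals.mean().item():.3f}, "
      f"max={svals.max().item():.3f}")
# Dynamical isometry corresponds to this spectrum concentrating near 1;
# tuning sigma_w and mu_f toward critical values tightens the spectrum.
```

Running this for a few settings of sigma_w and mu_f shows how strongly the spectrum of the Jacobian depends on the initialization hyperparameters, which is the quantity the paper's initialization scheme is designed to control.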


Related research

06/14/2018 · Dynamical Isometry and a Mean Field Theory of RNNs: Gating Enables Signal Propagation in Recurrent Neural Networks
Recurrent neural networks have gained widespread use in modeling sequenc...

06/14/2018 · Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
In recent years, state-of-the-art methods in computer vision have utiliz...

01/31/2020 · Gating creates slow modes and controls phase-space complexity in GRUs and LSTMs
Recurrent neural networks (RNNs) are powerful dynamical models for data ...

06/03/2019 · A Mean Field Theory of Quantized Deep Networks: The Quantization-Depth Trade-Off
Reducing the precision of weights and activation functions in neural net...

11/15/2017 · Variational Bi-LSTMs
Recurrent neural networks like long short-term memory (LSTM) are importa...

12/24/2017 · Mean Field Residual Networks: On the Edge of Chaos
We study randomly initialized residual networks using mean field theory ...

02/01/2019 · Signal propagation in continuous approximations of binary neural networks
The training of stochastic neural network models with binary (±1) weight...
