RNNs Evolving on an Equilibrium Manifold: A Panacea for Vanishing and Exploding Gradients?

08/22/2019
by Anil Kag, et al.

Recurrent neural networks (RNNs) are particularly well-suited for modeling long-term dependencies in sequential data, but are notoriously hard to train because the error backpropagated in time either vanishes or explodes at an exponential rate. While a number of works attempt to mitigate this effect through gated recurrent units, well-chosen parametric constraints, and skip-connections, we develop a novel perspective: evolving the hidden state on the equilibrium manifold of an ordinary differential equation (ODE). We propose Equilibriated Recurrent Neural Networks (ERNNs), a family of recurrent models that evolve on this equilibrium manifold and thereby overcome gradient decay and explosion. We show that the equilibrium points are stable, so the discretized ODE converges quickly to its fixed points. Furthermore, ERNNs account for long-term dependencies and can efficiently recall informative aspects of data from the distant past. ERNNs achieve state-of-the-art accuracy on many challenging data sets, with 3-10x speedups and 1.5-3x model size reductions relative to vanilla RNNs at similar prediction cost.
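The core mechanism is easiest to see in code. Below is a minimal NumPy sketch of one recurrent step in this spirit: the new hidden state is driven toward an equilibrium of the ODE h'(s) = tanh(U h(s) + W x_t + b) - h(s) by a damped fixed-point (Euler) iteration started from the previous state. The names (ernn_step, U, W, b, gamma, n_iters, tol), the tanh nonlinearity, and the convergence settings are illustrative assumptions, not the paper's exact parameterization, which differs in how the equilibrium equation is formed around the previous state.

import numpy as np

def ernn_step(h_prev, x_t, U, W, b, gamma=0.5, n_iters=20, tol=1e-5):
    """One step of an equilibrium-style RNN cell (illustrative sketch, not
    the authors' exact formulation). The new hidden state is taken to be
    an equilibrium of the ODE h'(s) = tanh(U h + W x_t + b) - h, located
    by damped fixed-point (Euler) iteration from h_prev."""
    h = h_prev
    for _ in range(n_iters):
        # Damped Euler step toward the fixed point h* = tanh(U h* + W x_t + b).
        h_next = h + gamma * (np.tanh(U @ h + W @ x_t + b) - h)
        if np.linalg.norm(h_next - h) < tol:  # close enough to the equilibrium
            return h_next
        h = h_next
    return h

# Tiny usage example on a random sequence (hypothetical dimensions).
rng = np.random.default_rng(0)
d_h, d_x, T = 8, 4, 5
U = 0.9 * rng.standard_normal((d_h, d_h)) / np.sqrt(d_h)  # mild scaling helps contraction
W = rng.standard_normal((d_h, d_x)) / np.sqrt(d_x)
b = np.zeros(d_h)
h = np.zeros(d_h)
for t in range(T):
    h = ernn_step(h, rng.standard_normal(d_x), U, W, b)
print(h.shape)  # (8,)

The design intuition this sketch illustrates: because each step relaxes the state onto a stable fixed point instead of pushing it through an unconstrained affine map, repeated composition over time is better behaved, which is the stability argument the abstract appeals to for avoiding vanishing and exploding gradients.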


Related research

08/22/2019 · RNNs Evolving in Equilibrium: A Solution to the Vanishing and Exploding Gradients
Recurrent neural networks (RNNs) are particularly well-suited for modeli...

03/09/2021 · UnICORNN: A recurrent model for learning very long time dependencies
The design of recurrent neural networks (RNNs) to accurately process seq...

02/26/2019 · AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks
Recurrent neural networks have gained widespread use in modeling sequent...

04/30/2016 · Higher Order Recurrent Neural Networks
In this paper, we study novel neural network structures to better model ...

04/29/2019 · Learning Longer-term Dependencies via Grouped Distributor Unit
Learning long-term dependencies still remains difficult for recurrent ne...

10/28/2019 · On Generalization Bounds of a Family of Recurrent Neural Networks
Recurrent Neural Networks (RNNs) have been widely applied to sequential ...

02/29/2016 · Representation of linguistic form and function in recurrent neural networks
We present novel methods for analyzing the activation patterns of RNNs f...
