RNNs Evolving in Equilibrium: A Solution to the Vanishing and Exploding Gradients

08/22/2019
by Anil Kag, et al.

Recurrent neural networks (RNNs) are particularly well-suited for modeling long-term dependencies in sequential data, but are notoriously hard to train because the error backpropagated in time either vanishes or explodes at an exponential rate. While a number of works attempt to mitigate this effect through gated recurrent units, well-chosen parametric constraints, and skip connections, we develop a novel perspective that seeks to evolve the hidden state on the equilibrium manifold of an ordinary differential equation (ODE). We propose a family of novel RNNs, namely Equilibriated Recurrent Neural Networks (ERNNs), that overcome the gradient decay or explosion effect and lead to recurrent models evolving on the equilibrium manifold. We show that the equilibrium points are stable, leading to fast convergence of the discretized ODE to fixed points. Furthermore, ERNNs account for long-term dependencies and can efficiently recall informative aspects of data from the distant past. We show that ERNNs achieve state-of-the-art accuracy on many challenging data sets, with 3-10x speedups and 1.5-3x model size reductions relative to vanilla RNNs, at similar prediction cost.
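To make the equilibrium idea concrete, below is a minimal sketch in which each hidden state is taken to be a fixed point of a simple recurrent map h = tanh(W h + U x_t + b), found by warm-started fixed-point iteration. The function name, the tanh/weight parameterization, and the iteration and tolerance settings are illustrative assumptions for exposition, not the authors' exact ERNN formulation or discretization.

```python
import numpy as np

def equilibrium_rnn_step(h_prev, x_t, W, U, b, n_iters=20, tol=1e-5):
    """Illustrative equilibrium-style RNN step (assumed form, not the paper's exact cell).

    The new hidden state is sought as a fixed point of h = tanh(W h + U x_t + b),
    i.e. an equilibrium of the ODE dh/ds = tanh(W h + U x_t + b) - h.
    """
    h = h_prev.copy()  # warm-start the iteration from the previous hidden state
    for _ in range(n_iters):
        h_new = np.tanh(W @ h + U @ x_t + b)
        if np.linalg.norm(h_new - h) < tol:  # stop once the iteration has converged
            return h_new
        h = h_new
    return h

# Toy usage: run a short random sequence through the equilibrium step.
rng = np.random.default_rng(0)
d_h, d_x, T = 8, 4, 5
W = 0.5 * rng.standard_normal((d_h, d_h)) / np.sqrt(d_h)  # small recurrent weights aid convergence
U = rng.standard_normal((d_h, d_x)) / np.sqrt(d_x)
b = np.zeros(d_h)
h = np.zeros(d_h)
for t in range(T):
    x_t = rng.standard_normal(d_x)
    h = equilibrium_rnn_step(h, x_t, W, U, b)
print(h)
```

Warm-starting each iteration from the previous state and keeping the recurrent weights modest in norm are what make such fixed-point iterations converge in a few steps, which matches the intuition behind the fast convergence of the discretized ODE claimed above.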
