The trade-off between long-term memory and smoothness for recurrent networks

06/20/2019
by Antônio H. Ribeiro, et al.

Training recurrent neural networks (RNNs) that possess long-term memory is challenging. We provide insight into the trade-off between the smoothness of the cost function and the memory retention capabilities of the network. We express both aspects in terms of the Lipschitz constant of the dynamics modeled by the network, which allows us to distinguish three types of regions in the parameter space. In the first region, the network has trouble retaining long-term information, but the cost function is smooth and easy for gradient descent to navigate. In the second region, the amount of stored information grows with time, and the cost function is intricate and full of local minima. The third region lies between the other two, and there the RNN is able to retain long-term information. Based on these theoretical findings, we present the hypothesis that good parameter choices for the RNN are located between the well-behaved and the ill-behaved cost-function regions. The concepts presented in the paper are illustrated with both artificially generated and real examples.
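
The following is a minimal numpy sketch (ours, not the authors' code) of the intuition behind the three regions: a state transition h_{t+1} = tanh(W h_t) is Lipschitz with constant at most the spectral norm ||W||_2, since tanh is 1-Lipschitz elementwise, so rescaling the recurrent matrix moves the network between contractive, in-between, and expansive regimes. The matrix size, scales, and horizon below are illustrative assumptions.

```python
# Illustrative sketch only: how the spectral norm of the recurrent
# matrix (an upper bound on the Lipschitz constant of the dynamics)
# governs memory retention. Not the paper's experimental code.
import numpy as np

rng = np.random.default_rng(0)

def transition(h, W):
    # One RNN step: h_{t+1} = tanh(W h_t). Because |tanh'| <= 1,
    # this map is Lipschitz with constant at most ||W||_2.
    return np.tanh(W @ h)

def perturbation_gap(W, T=200, eps=1e-3):
    # Run two trajectories whose initial states differ by eps and
    # measure how far apart they end up: a proxy for how long the
    # network retains information about its initial condition.
    h = 0.1 * rng.standard_normal(W.shape[0])
    hp = h + eps * rng.standard_normal(W.shape[0])
    for _ in range(T):
        h, hp = transition(h, W), transition(hp, W)
    return np.linalg.norm(hp - h)

n = 32
A = rng.standard_normal((n, n)) / np.sqrt(n)
for scale in (0.7, 1.0, 1.3):  # contractive / edge / expansive dynamics
    W = scale * A / np.linalg.norm(A, 2)  # rescale so ||W||_2 == scale
    print(f"||W||_2 = {scale:.1f}: gap after 200 steps = {perturbation_gap(W):.2e}")
```

For ||W||_2 = 0.7 the gap shrinks toward zero (initial information is forgotten, but the cost surface is smooth); for 1.3 it grows until the nonlinearity saturates (information persists, at the price of a more intricate cost function); values near 1 correspond to the in-between region where the paper's hypothesis places good parameter choices.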

Related research

Eigenvalue Normalized Recurrent Neural Networks for Short Term Memory (11/18/2019)
Several variants of recurrent neural networks (RNNs) with orthogonal or ...

Warming-up recurrent neural networks to maximize reachable multi-stability greatly improves learning (06/02/2021)
Training recurrent neural networks is known to be difficult when time de...

Benefits of Depth for Long-Term Memory of Recurrent Networks (10/25/2017)
The key attribute that drives the unprecedented success of modern Recurr...

Facing off World Model Backbones: RNNs, Transformers, and S4 (07/05/2023)
World models are a fundamental component in model-based reinforcement le...

Can deep learning match the efficiency of human visual long-term memory in storing object details? (04/27/2022)
Humans have a remarkably large capacity to store detailed visual informa...

Long term dynamics of the subgradient method for Lipschitz path differentiable functions (05/29/2020)
We consider the long-term dynamics of the vanishing stepsize subgradient...

Applying Policy Iteration for Training Recurrent Neural Networks (10/02/2004)
Recurrent neural networks are often used for learning time-series data. ...
