Fading memory as inductive bias in residual recurrent networks

07/27/2023
by Igor Dubinin, et al.

Residual connections have been proposed as an architecture-based inductive bias to mitigate the problem of exploding and vanishing gradients and to increase task performance in both feed-forward and recurrent networks (RNNs) when trained with the backpropagation algorithm. Yet, little is known about how residual connections in RNNs influence their dynamics and fading memory properties. Here, we introduce weakly coupled residual recurrent networks (WCRNNs), in which residual connections result in well-defined Lyapunov exponents and allow for studying properties of fading memory. We investigate how the residual connections of WCRNNs influence their performance, network dynamics, and memory properties on a set of benchmark tasks. We show that several distinct forms of residual connections yield effective inductive biases that result in increased network expressivity. In particular, residual connections that (i) result in network dynamics in the proximity of the edge of chaos, (ii) allow networks to capitalize on characteristic spectral properties of the data, and (iii) result in heterogeneous memory properties are shown to increase practical expressivity. In addition, we demonstrate how our results can be extended to non-linear residuals and introduce a weakly coupled residual initialization scheme that can be used for Elman RNNs.
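As a rough illustration of the kind of update the abstract describes, below is a minimal sketch of an Elman-style RNN step with a residual branch that is only weakly coupled to the recurrent nonlinearity. The state-carry weight alpha, the coupling strength eps, and all function and variable names are illustrative assumptions for this sketch, not the paper's exact WCRNN formulation.

    import numpy as np

    def residual_rnn_step(h, x, W_hh, W_xh, alpha=0.99, eps=0.05):
        # Illustrative residual Elman-style update (not the paper's exact WCRNN):
        # the residual branch carries the previous state forward with weight `alpha`,
        # while the recurrent nonlinearity is only weakly coupled through `eps`.
        return alpha * h + eps * np.tanh(W_hh @ h + W_xh @ x)

    # Toy usage: a 32-unit hidden state driven by an 8-dimensional input sequence.
    rng = np.random.default_rng(0)
    n_hidden, n_input = 32, 8
    W_hh = rng.normal(scale=1.0 / np.sqrt(n_hidden), size=(n_hidden, n_hidden))
    W_xh = rng.normal(scale=1.0 / np.sqrt(n_input), size=(n_hidden, n_input))

    h = np.zeros(n_hidden)
    for t in range(100):
        x_t = rng.normal(size=n_input)
        h = residual_rnn_step(h, x_t, W_hh, W_xh)

With alpha close to 1 and eps small, the previous state dominates each update, which is one simple way to make the fading of memory slow and explicitly tunable; the paper's actual parameterization and initialization scheme should be taken from the full text.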


