Learning Longer-term Dependencies in RNNs with Auxiliary Losses

03/01/2018
by Trieu H. Trinh, et al.

Despite recent advances in training recurrent neural networks (RNNs), capturing long-term dependencies in sequences remains a fundamental challenge. Most approaches use backpropagation through time (BPTT), which is difficult to scale to very long sequences. This paper proposes a simple method that improves the ability to capture long-term dependencies in RNNs by adding an unsupervised auxiliary loss to the original objective. This auxiliary loss forces RNNs to either reconstruct previous events or predict next events in a sequence, making truncated backpropagation feasible for long sequences and also improving full BPTT. We evaluate our method on a variety of settings, including pixel-by-pixel image classification with sequence lengths up to 16,000, and a real document classification benchmark. Our results highlight the good performance and resource efficiency of this approach over competitive baselines, including other recurrent models and a comparably sized Transformer. Further analyses reveal beneficial effects of the auxiliary loss on optimization and regularization, as well as extreme cases where there is little to no backpropagation.
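To make the idea concrete, the sketch below shows one plausible way to wire an unsupervised auxiliary loss into an RNN classifier: alongside the main classification loss, an anchor position is sampled and the encoder's hidden state at that anchor seeds a small decoder that reconstructs the input segment just before the anchor. This is an illustrative PyTorch sketch, not the authors' implementation; the class name AuxLossRNN, the segment length aux_len, the single sampled anchor, and the teacher-forced one-step-ahead decoder are all assumptions, and details such as gradient truncation around the auxiliary segment are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AuxLossRNN(nn.Module):
    """RNN classifier with an unsupervised auxiliary reconstruction loss (sketch)."""

    def __init__(self, input_dim, hidden_dim, num_classes, aux_len=16):
        super().__init__()
        self.encoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)
        self.readout = nn.Linear(hidden_dim, input_dim)
        self.aux_len = aux_len  # length of the segment used for the auxiliary loss (assumed)

    def forward(self, x, labels):
        # x: (batch, seq_len, input_dim); labels: (batch,)
        outputs, _ = self.encoder(x)

        # Main supervised loss: classify the sequence from the final hidden state.
        logits = self.classifier(outputs[:, -1])
        main_loss = F.cross_entropy(logits, labels)

        # Auxiliary unsupervised loss: sample an anchor position and, seeded by the
        # encoder state at that anchor, reconstruct the preceding input segment via
        # teacher-forced one-step-ahead prediction.
        seq_len = x.size(1)
        anchor = int(torch.randint(self.aux_len, seq_len, (1,)))
        segment = x[:, anchor - self.aux_len:anchor]            # (batch, aux_len, input_dim)
        h0 = outputs[:, anchor - 1].unsqueeze(0).contiguous()   # encoder state at the anchor
        c0 = torch.zeros_like(h0)
        dec_out, _ = self.decoder(segment[:, :-1], (h0, c0))
        pred = self.readout(dec_out)
        aux_loss = F.mse_loss(pred, segment[:, 1:])

        return main_loss + aux_loss


# Hypothetical usage, e.g. an image fed to the RNN pixel by pixel:
model = AuxLossRNN(input_dim=1, hidden_dim=128, num_classes=10)
pixels = torch.randn(8, 784, 1)
labels = torch.randint(0, 10, (8,))
loss = model(pixels, labels)
loss.backward()
```

Under this reading, the auxiliary term pressures the hidden state at each sampled anchor to retain information about recent inputs, which is what makes truncated backpropagation over long sequences viable in the paper's setting.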

