Parallelizing Linear Recurrent Neural Nets Over Sequence Length

09/12/2017
by Eric Martin, et al.

Recurrent neural networks (RNNs) are widely used to model sequential data, but their non-linear dependencies between sequence elements prevent parallelizing training over the sequence length. We show that training of RNNs with only linear sequential dependencies can be parallelized over the sequence length using the parallel scan algorithm, leading to rapid training on long sequences even with small minibatch sizes. We abstract prior linear sequence models into a new framework of linear surrogate RNNs and develop a linear surrogate long short-term memory (LS-LSTM) powered by a parallel linear recurrence CUDA kernel we implemented. We evaluate the LS-LSTM on a long-sequence noisy autoregressive task and find that it achieves slightly superior train and test performance to a similar-sized LSTM in 4x less training time. We analyze latency and throughput of the LS-LSTM and find it reaches up to 175x the throughput of the LSTM in the small-minibatch, long-sequence regime.
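The key idea can be sketched in a few lines: a linear recurrence h_t = a_t * h_{t-1} + b_t is a composition of affine maps, and composing two such maps is an associative operation, which is exactly what a parallel scan requires. The sketch below (an assumption about the construction, not the paper's CUDA kernel; the scan is written serially here, but the associative `combine` operator is what lets it run in O(log T) depth on parallel hardware) verifies that the scan formulation matches the step-by-step recurrence.

```python
# Linear recurrence: h_t = a_t * h_{t-1} + b_t, with h_0 = 0.
# Each step is the affine map h -> a*h + b, written as the pair (a, b).
# Composing step (a1, b1) followed by (a2, b2) gives another affine map:
#   (a1, b1) then (a2, b2)  =  (a1 * a2, a2 * b1 + b2)
# Because this combine is associative, all prefixes (and hence all h_t)
# can be computed with a parallel scan instead of a sequential loop.

def combine(left, right):
    a1, b1 = left
    a2, b2 = right
    return (a1 * a2, a2 * b1 + b2)

def linear_recurrence_scan(a, b):
    """All states h_1..h_T via an inclusive scan with `combine`.

    Written serially for clarity; on a GPU the same combine operator
    would drive a tree-structured (Blelloch-style) parallel scan.
    """
    out, acc = [], None
    for step in zip(a, b):
        acc = step if acc is None else combine(acc, step)
        out.append(acc[1])  # with h_0 = 0, h_t is the bias of the composed map
    return out

def linear_recurrence_sequential(a, b):
    """Reference implementation: the ordinary sequential recurrence."""
    h, out = 0.0, []
    for a_t, b_t in zip(a, b):
        h = a_t * h + b_t
        out.append(h)
    return out

a = [0.5, 0.9, 0.1, 0.7]
b = [1.0, -2.0, 0.5, 3.0]
scan_states = linear_recurrence_scan(a, b)
seq_states = linear_recurrence_sequential(a, b)
assert all(abs(x - y) < 1e-12 for x, y in zip(scan_states, seq_states))
```

The non-linearity of a standard LSTM breaks this associativity, which is why the paper replaces the non-linear state transition with a linear surrogate before applying the scan.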

Related research

02/11/2017 | Parallel Long Short-Term Memory for Multi-stream Classification
Recently, machine learning methods have provided a broad spectrum of ori...

09/21/2023 | Parallelizing non-linear sequential models over the sequence length
Sequential models, such as Recurrent Neural Networks and Neural Ordinary...

03/07/2022 | Parallel Training of GRU Networks with a Multi-Grid Solver for Long Sequences
Parallelizing Gated Recurrent Unit (GRU) networks is a challenging task,...

03/10/2015 | Single stream parallelization of generalized LSTM-like RNNs on a GPU
Recurrent neural networks (RNNs) have shown outstanding performance on p...

10/17/2014 | Learning to Execute
Recurrent Neural Networks (RNNs) with Long Short-Term Memory units (LSTM...

09/09/2011 | Learning Sequence Neighbourhood Metrics
Recurrent neural networks (RNNs) in combination with a pooling operator ...

07/09/2018 | IGLOO: Slicing the Features Space to Represent Long Sequences
We introduce a new neural network architecture, IGLOO, which aims at pro...
