Fast-Slow Recurrent Neural Networks

05/24/2017
by Asier Mujika, et al.

Processing sequential data of variable length is a major challenge in a wide range of applications, such as speech recognition, language modeling, generative image modeling, and machine translation. Here, we address this challenge by proposing a novel recurrent neural network (RNN) architecture, the Fast-Slow RNN (FS-RNN). The FS-RNN incorporates the strengths of both multiscale RNNs and deep transition RNNs, as it processes sequential data on different timescales and learns complex transition functions from one time step to the next. We evaluate the FS-RNN on two character-level language modeling data sets, Penn Treebank and Hutter Prize Wikipedia, where we improve state-of-the-art results to 1.19 and 1.25 bits-per-character (BPC), respectively. In addition, an ensemble of two FS-RNNs achieves 1.20 BPC on Hutter Prize Wikipedia, outperforming the best known compression algorithm with respect to the BPC measure. We also present an empirical investigation of the learning and network dynamics of the FS-RNN, which explains the improved performance compared to other RNN architectures. Our approach is general, as any kind of RNN cell is a possible building block for the FS-RNN architecture, which can thus be flexibly applied to different tasks.
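The abstract describes the FS-RNN update rule: within each time step, a sequence of k fast cells is applied (the first reads the input, the second reads the slow state), while a single slow cell is updated once per time step from the first fast cell's state. The following minimal numpy sketch illustrates that data flow only; the plain tanh cells, layer sizes, and random initialization here are stand-ins of our choosing (the paper uses LSTM cells as the building blocks), not the authors' implementation.

```python
import numpy as np

class RNNCell:
    """Minimal tanh RNN cell, a stand-in for the LSTM cells used in the paper."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (hidden_size, input_size))
        self.U = rng.normal(0.0, 0.1, (hidden_size, hidden_size))
        self.b = np.zeros(hidden_size)

    def __call__(self, h, x):
        return np.tanh(self.W @ x + self.U @ h + self.b)

class FSRNN:
    """Sketch of the Fast-Slow RNN: k fast cells per time step, one slow cell."""
    def __init__(self, input_size, fast_size, slow_size, k=4):
        assert k >= 2
        self.fast = [RNNCell(input_size, fast_size, seed=0)]      # F1 reads x_t
        self.fast.append(RNNCell(slow_size, fast_size, seed=1))   # F2 reads slow state
        # F3..Fk receive no external input, only the fast hidden state
        # (a dummy zero input keeps the cell interface uniform here).
        self.fast += [RNNCell(1, fast_size, seed=2 + i) for i in range(k - 2)]
        self.slow = RNNCell(fast_size, slow_size, seed=99)

    def step(self, f, s, x):
        f = self.fast[0](f, x)        # fast cell F1 consumes the input
        s = self.slow(s, f)           # slow cell S updates once per time step
        f = self.fast[1](f, s)        # F2 feeds the slow state back in
        for cell in self.fast[2:]:    # F3..Fk deepen the transition function
            f = cell(f, np.zeros(1))
        return f, s

# Run the sketch over a short random sequence.
net = FSRNN(input_size=8, fast_size=16, slow_size=32, k=4)
f, s = np.zeros(16), np.zeros(32)
for t in range(5):
    f, s = net.step(f, s, np.random.default_rng(t).normal(size=8))
print(f.shape, s.shape)  # (16,) (32,)
```

Because the fast chain runs k times per input symbol while the slow cell runs once, the two parts operate on different timescales, which is the multiscale property the abstract refers to; the sequential fast cells provide the deep-transition property.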


