How to Construct Deep Recurrent Neural Networks

12/20/2013
by Razvan Pascanu, et al.

In this paper, we explore different ways to extend a recurrent neural network (RNN) to a deep RNN. We start by arguing that the concept of depth in an RNN is not as clear as it is in feedforward neural networks. By carefully analyzing the architecture of an RNN, however, we identify three points at which an RNN may be made deeper: (1) the input-to-hidden function, (2) the hidden-to-hidden transition, and (3) the hidden-to-output function. Based on this observation, we propose two novel deep RNN architectures that are orthogonal to the earlier approach of stacking multiple recurrent layers (Schmidhuber, 1992; El Hihi and Bengio, 1996). We provide an alternative interpretation of these deep RNNs using a novel framework based on neural operators. The proposed deep RNNs are empirically evaluated on polyphonic music prediction and language modeling. The experimental results support our claim that the proposed deep RNNs benefit from depth and outperform conventional, shallow RNNs.
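To make the second of these points concrete, here is a minimal NumPy sketch (my own illustration, not the authors' code; the dimensions, weight names, and the single intermediate layer are illustrative assumptions) contrasting a conventional shallow transition with a deep hidden-to-hidden transition of the kind the paper proposes:

import numpy as np

# Illustrative dimensions only
n_in, n_hid = 4, 8

rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden-to-hidden (shallow)
# Extra weights for a deep hidden-to-hidden transition
W_hz = rng.normal(scale=0.1, size=(n_hid, n_hid))
W_zh = rng.normal(scale=0.1, size=(n_hid, n_hid))

def shallow_step(h, x):
    # Conventional RNN: one nonlinearity between h_{t-1} and h_t
    return np.tanh(W_xh @ x + W_hh @ h)

def deep_transition_step(h, x):
    # Deep transition: an intermediate layer z turns the
    # hidden-to-hidden function into a small MLP
    z = np.tanh(W_xh @ x + W_hz @ h)
    return np.tanh(W_zh @ z)

h = np.zeros(n_hid)
x = rng.normal(size=n_in)
print(shallow_step(h, x).shape, deep_transition_step(h, x).shape)

In this sketch, deep_transition_step inserts an intermediate representation z between consecutive hidden states; the paper's other proposal analogously deepens the hidden-to-output function rather than the transition.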


Related research:

- Gated Feedback Recurrent Neural Networks (02/09/2015)
- Architectural Complexity Measures of Recurrent Neural Networks (02/26/2016)
- Bridging the Gaps Between Residual Learning, Recurrent Neural Networks and Visual Cortex (04/13/2016)
- Generalized Tensor Models for Recurrent Neural Networks (01/30/2019)
- Contextual Recurrent Neural Networks (02/09/2019)
- Deep RNN Framework for Visual Sequential Applications (11/25/2018)
- Equilibrated Recurrent Neural Network: Neuronal Time-Delayed Self-Feedback Improves Accuracy and Stability (03/02/2019)
