Going Wider: Recurrent Neural Network With Parallel Cells

by Danhao Zhu, et al.

Recurrent Neural Networks (RNNs) have been widely applied to sequence modeling. In an RNN, the hidden states at the current step are fully connected to those at the previous step, so the influence of less related features from the previous step may decrease the model's learning ability. We propose a simple technique called parallel cells (PCs) to enhance the learning ability of RNNs: in each layer, we run multiple small RNN cells rather than one single large cell. In this paper, we evaluate PCs on two tasks. On the language modeling task on PTB (Penn Tree Bank), our model outperforms state-of-the-art models, decreasing perplexity from 78.6 to 75.3. On a Chinese-English translation task, our model improves the BLEU score by 0.39 points over the baseline model.
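The core idea, replacing one large fully connected recurrent cell with several small cells whose hidden-to-hidden weights form a block-diagonal pattern, can be sketched as follows. This is a minimal illustrative implementation using a plain tanh RNN in NumPy, with hypothetical names and sizes; it is not the authors' code, and the paper applies the technique to standard gated cells as well.

```python
import numpy as np

class ParallelCellsRNN:
    """A single RNN layer built from `num_cells` small, independent cells.

    Each small cell reads the full input vector but recurs only over its
    own slice of the hidden state, so hidden-to-hidden connections are
    block-diagonal rather than fully connected (the parallel-cells idea).
    """

    def __init__(self, input_size, hidden_size, num_cells, seed=0):
        assert hidden_size % num_cells == 0, "hidden size must split evenly"
        rng = np.random.default_rng(seed)
        self.k = num_cells
        self.h_small = hidden_size // num_cells  # hidden size per small cell
        # Separate parameters per small cell.
        self.W_x = [rng.normal(0, 0.1, (input_size, self.h_small))
                    for _ in range(self.k)]
        self.W_h = [rng.normal(0, 0.1, (self.h_small, self.h_small))
                    for _ in range(self.k)]
        self.b = [np.zeros(self.h_small) for _ in range(self.k)]

    def forward(self, xs):
        """Run the layer over a sequence of input vectors `xs`.

        Returns the final hidden state: the k small states concatenated,
        so downstream layers see one vector of size `hidden_size`.
        """
        h = [np.zeros(self.h_small) for _ in range(self.k)]
        for x in xs:
            # Each cell updates only from its own previous slice.
            h = [np.tanh(x @ self.W_x[i] + h[i] @ self.W_h[i] + self.b[i])
                 for i in range(self.k)]
        return np.concatenate(h)

# Example: a layer with hidden size 12 split into 3 cells of size 4.
layer = ParallelCellsRNN(input_size=8, hidden_size=12, num_cells=3)
state = layer.forward([np.ones(8) for _ in range(5)])
```

With `num_cells=1` this reduces to an ordinary fully connected RNN layer; larger `num_cells` keeps the total hidden width while removing cross-cell recurrent connections, which is where the claimed gain in learning ability comes from.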
