Deep Gate Recurrent Neural Network

04/11/2016
by Yuan Gao, et al.

This paper introduces two recurrent neural network structures, the Simple Gated Unit (SGU) and the Deep Simple Gated Unit (DSGU), both general structures for learning long-term dependencies. Compared to the traditional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), both structures require fewer parameters and less computation time in sequence classification tasks. Unlike GRU and LSTM, which require more than one gate to control information flow in the network, SGU and DSGU use only one multiplicative gate to control the flow of information. We show that this difference can accelerate learning in tasks that require long-dependency information. We also show that DSGU is more numerically stable than SGU. In addition, we propose a standard way of representing the inner structure of an RNN, called the RNN Conventional Graph (RCG), which helps analyze the relationship between the input units and hidden units of an RNN.
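The abstract contrasts multi-gate units (LSTM, GRU) with a single multiplicative gate. The sketch below illustrates that general idea only; the cell name, weight names, and update equation are illustrative assumptions, not the paper's exact SGU/DSGU formulation, which is defined in the full text.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class SingleGateCell:
    """Illustrative recurrent cell with one multiplicative gate.

    Not the paper's exact SGU/DSGU equations: this only sketches
    gating a candidate state with a single sigmoid gate, versus the
    two gates of a GRU or the three gates of an LSTM.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # One gate and one candidate: four weight matrices in total,
        # versus six for a GRU or eight for an LSTM of the same size.
        self.W_g = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_g = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.W_c = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_c = rng.uniform(-s, s, (hidden_size, hidden_size))

    def step(self, x, h):
        g = sigmoid(self.W_g @ x + self.U_g @ h)  # single multiplicative gate
        c = np.tanh(self.W_c @ x + self.U_c @ h)  # candidate hidden state
        return g * c                              # gated update


cell = SingleGateCell(input_size=4, hidden_size=8)
h = np.zeros(8)
for t in range(10):
    h = cell.step(np.ones(4), h)
print(h.shape)  # (8,)
```

Because only one gate is computed per step, such a cell needs fewer parameters and fewer matrix products per time step than GRU or LSTM, which is the efficiency argument the abstract makes.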

Related research

- Gated Recurrent Neural Tensor Network (06/07/2017): Recurrent Neural Networks (RNNs), which are a powerful scheme for modeli...
- Learning Longer-term Dependencies via Grouped Distributor Unit (04/29/2019): Learning long-term dependencies still remains difficult for recurrent ne...
- Refined Gate: A Simple and Effective Gating Mechanism for Recurrent Units (02/26/2020): Recurrent neural network (RNN) has been widely studied in sequence learn...
- Recurrent Attention Unit (10/30/2018): Recurrent Neural Network (RNN) has been successfully applied in many seq...
- Reducing state updates via Gaussian-gated LSTMs (01/22/2019): Recurrent neural networks can be difficult to train on long sequence dat...
- EcoRNN: Fused LSTM RNN Implementation with Data Layout Optimization (05/22/2018): Long-Short-Term-Memory Recurrent Neural Network (LSTM RNN) is a state-of...
- EleAtt-RNN: Adding Attentiveness to Neurons in Recurrent Neural Networks (09/03/2019): Recurrent neural networks (RNNs) are capable of modeling temporal depend...
