Simplified Minimal Gated Unit Variations for Recurrent Neural Networks

01/12/2017
by Joel Heck, et al.

Recurrent neural networks with various types of hidden units have been used to solve a diverse range of problems involving sequence data. Two of the most recent proposals, gated recurrent units (GRU) and minimal gated units (MGU), have shown comparably promising results on public benchmark datasets. In this paper, we introduce three model variants of the minimal gated unit (MGU) which further simplify that design by reducing the number of parameters in the forget-gate dynamic equation. These three model variants, referred to simply as MGU1, MGU2, and MGU3, were tested on sequences generated from the MNIST dataset and from the Reuters Newswire Topics (RNT) dataset. The new models achieve accuracy similar to that of the MGU model while using fewer parameters, and thus lower training expense. One model variant, namely MGU2, performed better than MGU on the datasets considered, and thus may be used as an alternative to MGU or GRU in recurrent neural networks.
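The forget-gate simplifications described in the abstract can be sketched as a single recurrent step. The sketch below assumes the standard MGU formulation (Zhou et al., 2016), with each variant dropping terms from the forget-gate equation as the paper describes; the function name `mgu_step` and the parameter dictionary layout are illustrative, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mgu_step(x_t, h_prev, params, variant="MGU"):
    """One recurrent step of the minimal gated unit (MGU) or one of the
    simplified variants MGU1-MGU3, which progressively remove parameters
    from the forget-gate equation while keeping the state update intact."""
    Wf, Uf, bf = params["Wf"], params["Uf"], params["bf"]
    Wh, Uh, bh = params["Wh"], params["Uh"], params["bh"]

    # Forget gate: the only part that differs between the variants.
    if variant == "MGU":      # f = sigma(Wf x + Uf h + bf), full gate
        f = sigmoid(Wf @ x_t + Uf @ h_prev + bf)
    elif variant == "MGU1":   # drop the input term Wf x
        f = sigmoid(Uf @ h_prev + bf)
    elif variant == "MGU2":   # drop the input term and the bias
        f = sigmoid(Uf @ h_prev)
    elif variant == "MGU3":   # keep only the bias
        f = sigmoid(bf)
    else:
        raise ValueError(f"unknown variant: {variant}")

    # Candidate activation and gated state update, shared by all variants.
    h_tilde = np.tanh(Wh @ x_t + Uh @ (f * h_prev) + bh)
    return (1.0 - f) * h_prev + f * h_tilde
```

With hidden size n and input size m, the full MGU gate carries n*m + n*n + n parameters; MGU1 removes the n*m input weights, MGU2 additionally removes the n-element bias, and MGU3 keeps only the bias, which is the source of the lower training expense noted above.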


research
01/20/2017

Gate-Variants of Gated Recurrent Unit (GRU) Neural Networks

The paper evaluates three variants of the Gated Recurrent Unit (GRU) in ...
research
03/31/2016

Minimal Gated Unit for Recurrent Neural Networks

Recently recurrent neural networks (RNN) have been very successful in han...
research
08/02/2019

Falls Prediction in eldery people using Gated Recurrent Units

Falls prevention, especially in older people, becomes an increasingly im...
research
10/06/2017

Lattice Recurrent Unit: Improving Convergence and Statistical Efficiency for Sequence Modeling

Recurrent neural networks have shown remarkable success in modeling sequ...
research
05/08/2021

Pouring Dynamics Estimation Using Gated Recurrent Units

One of the most commonly performed manipulations in a human's daily life ...
research
06/03/2013

Riemannian metrics for neural networks II: recurrent networks and learning symbolic data sequences

Recurrent neural networks are powerful models for sequential data, able ...
research
03/22/2017

Gate Activation Signal Analysis for Gated Recurrent Neural Networks and Its Correlation with Phoneme Boundaries

In this paper we analyze the gate activation signals inside the gated re...
