Gate-Variants of Gated Recurrent Unit (GRU) Neural Networks

01/20/2017
by Rahul Dey, et al.

The paper evaluates three variants of the Gated Recurrent Unit (GRU) in recurrent neural networks (RNNs), obtained by reducing the number of parameters in the update and reset gates. We evaluate the three GRU variants on the MNIST and IMDB datasets and show that they perform as well as the original GRU model while reducing the computational expense.
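
The abstract does not spell out the gate equations or the exact parameter reductions, so the sketch below pairs a standard GRU cell with one hypothetical reduced-gate variant (update and reset gates driven by the previous hidden state and a bias only) to illustrate how dropping gate terms shrinks the parameter count. This is a minimal NumPy sketch under those assumptions; the function names and the specific reduction are illustrative, not the variant definitions from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, p):
    """One step of a standard GRU with full update (z) and reset (r) gates."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h_prev + p["bz"])   # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_prev + p["br"])   # reset gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev) + p["bh"])
    # One common convention; some formulations swap the roles of z and (1 - z).
    return (1.0 - z) * h_prev + z * h_tilde

def gru_reduced_step(x, h_prev, p):
    """Hypothetical reduced-gate variant: the gates omit the input term,
    removing the Wz and Wr matrices (input_dim * hidden_dim weights each)."""
    z = sigmoid(p["Uz"] @ h_prev + p["bz"])
    r = sigmoid(p["Ur"] @ h_prev + p["br"])
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev) + p["bh"])
    return (1.0 - z) * h_prev + z * h_tilde

# Tiny usage example with random parameters (illustrative shapes only).
rng = np.random.default_rng(0)
n_in, n_h = 4, 8
params = {k: rng.standard_normal((n_h, n_in)) * 0.1 for k in ("Wz", "Wr", "Wh")}
params.update({k: rng.standard_normal((n_h, n_h)) * 0.1 for k in ("Uz", "Ur", "Uh")})
params.update({k: np.zeros(n_h) for k in ("bz", "br", "bh")})

h = np.zeros(n_h)
x = rng.standard_normal(n_in)
print(gru_step(x, h, params).shape)          # (8,)
print(gru_reduced_step(x, h, params).shape)  # (8,)
```

For hidden size n and input size m, the two full gates cost 2(nm + n^2 + n) parameters, versus 2(n^2 + n) for this reduced form; the paper's variants cut gate parameters in a similar spirit, though the exact terms removed are not stated in this abstract.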

Related research

01/12/2017
Simplified Minimal Gated Unit Variations for Recurrent Neural Networks
Recurrent neural networks with various types of hidden units have been u...

03/03/2019
Understanding Feature Selection and Feature Memorization in Recurrent Neural Networks
In this paper, we propose a test, called Flagged-1-Bit (F1B) test, to st...

01/09/2019
Using stigmergy as a computational memory in the design of recurrent neural networks
In this paper, a novel architecture of Recurrent Neural Network (RNN) is...

11/24/2015
rnn : Recurrent Library for Torch
The rnn package provides components for implementing a wide range of Rec...

01/15/2022
Large-Scale Inventory Optimization: A Recurrent-Neural-Networks-Inspired Simulation Approach
Many large-scale production networks include thousands types of final pr...

10/30/2018
Recurrent Attention Unit
Recurrent Neural Network (RNN) has been successfully applied in many seq...

07/11/2018
Recurrent Neural Networks with Flexible Gates using Kernel Activation Functions
Gated recurrent neural networks have achieved remarkable results in the ...
