Refined Gate: A Simple and Effective Gating Mechanism for Recurrent Units

02/26/2020
by   Zhanzhan Cheng, et al.

Recurrent neural networks (RNNs) have been widely studied for sequence learning tasks, and the mainstream models (e.g., LSTM and GRU) rely on a gating mechanism that controls how information flows between hidden states. However, the vanilla gates in RNNs (e.g., the input gate in LSTM) suffer from gate undertraining, caused mainly by their saturating activation functions; this can prevent the gates from learning their intended roles and thus weaken performance. In this paper, we propose a new gating mechanism for general gated recurrent neural networks to address this issue. Specifically, the proposed gates, which we call refined gates, directly short-connect the extracted input features to the outputs of the vanilla gates. This refining mechanism enhances gradient back-propagation and extends the gating activation scope, and, although simple, can guide the RNN toward deeper minima. We verify the proposed gating mechanism on three popular types of gated RNNs: LSTM, GRU, and MGU. Extensive experiments on three synthetic tasks, three language-modeling tasks, and five scene text recognition benchmarks demonstrate the effectiveness of our method.
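The abstract's core idea, short-connecting input features to a saturating gate's output, can be sketched in a few lines. The exact formulation below is an assumption for illustration only (the paper does not specify it in the abstract): a vanilla sigmoid gate is summed with a hypothetical `tanh` projection of the input features (`Ws` is an illustrative extra weight matrix), then clipped back into the valid gating range.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def vanilla_gate(x, h, Wx, Wh, b):
    # Standard saturating gate, e.g. the input gate of an LSTM.
    return sigmoid(x @ Wx + h @ Wh + b)

def refined_gate(x, h, Wx, Wh, b, Ws):
    # Hypothetical refined gate: short-connect a projection of the
    # input features to the vanilla gate's output, then clip back
    # into [0, 1]. This is a sketch of the idea (a gradient shortcut
    # that widens the gate's activation scope), not the paper's
    # exact equation.
    g = vanilla_gate(x, h, Wx, Wh, b)
    shortcut = np.tanh(x @ Ws)  # projected input features
    return np.clip(g + shortcut, 0.0, 1.0)

rng = np.random.default_rng(0)
d_in, d_h = 4, 3
x = rng.normal(size=(2, d_in))   # batch of input features
h = rng.normal(size=(2, d_h))    # previous hidden states
Wx = rng.normal(size=(d_in, d_h))
Wh = rng.normal(size=(d_h, d_h))
b = np.zeros(d_h)
Ws = rng.normal(size=(d_in, d_h))

g = refined_gate(x, h, Wx, Wh, b, Ws)
print(g.shape)  # (2, 3)
```

Because the shortcut bypasses the sigmoid, gradients can reach the input weights without being squashed by the saturated gate, which is the intuition behind the claimed improvement in back-propagation.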

research
04/11/2016

Deep Gate Recurrent Neural Network

This paper introduces two recurrent neural network structures called Sim...
11/13/2019

Structured Sparsification of Gated Recurrent Neural Networks

Recently, a lot of techniques were developed to sparsify the weights of ...
03/03/2019

Understanding Feature Selection and Feature Memorization in Recurrent Neural Networks

In this paper, we propose a test, called Flagged-1-Bit (F1B) test, to st...
12/12/2018

Bayesian Sparsification of Gated Recurrent Neural Networks

Bayesian methods have been successfully applied to sparsify weights of n...
10/22/2019

Improving the Gating Mechanism of Recurrent Neural Networks

Gating mechanisms are widely used in neural network models, where they a...
05/21/2017

Recurrent Additive Networks

We introduce recurrent additive networks (RANs), a new gated RNN which i...
07/11/2018

Recurrent Neural Networks with Flexible Gates using Kernel Activation Functions

Gated recurrent neural networks have achieved remarkable results in the ...