Recurrent Neural Networks with Flexible Gates using Kernel Activation Functions

07/11/2018
by Simone Scardapane et al.

Gated recurrent neural networks have achieved remarkable results in the analysis of sequential data. Inside these networks, gates are used to control the flow of information, allowing the model to capture even very long-term dependencies in the data. In this paper, we investigate whether the original gate equation (a linear projection followed by an element-wise sigmoid) can be improved. In particular, we design a more flexible architecture, with a small number of adaptable parameters, that can model a wider range of gating functions than the classical one. To this end, we replace the sigmoid function in the standard gate with a non-parametric formulation that extends the recently proposed kernel activation function (KAF), augmented with a residual skip-connection. A set of experiments on sequential variants of the MNIST dataset shows that adopting this novel gate improves accuracy at a negligible computational cost, while requiring significantly fewer training iterations.
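The abstract describes the gate only at a high level, so below is a minimal PyTorch sketch of one plausible reading: the standard sigmoid gate augmented by a learnable KAF correction through a residual skip-connection. The class name KAFGate, the dictionary size D, the Gaussian-kernel bandwidth heuristic, and the zero initialization of the mixing coefficients are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class KAFGate(nn.Module):
    """Hypothetical flexible gate: sigmoid plus a kernel activation
    function (KAF) correction, combined via a residual skip-connection.
    Dictionary size D and the bandwidth heuristic are illustrative
    choices, not values from the paper."""

    def __init__(self, num_units, D=20, boundary=3.0):
        super().__init__()
        # Fixed dictionary of D points sampled uniformly in [-boundary, boundary].
        d = torch.linspace(-boundary, boundary, D)
        self.register_buffer("d", d)
        # Gaussian kernel bandwidth tied to the dictionary spacing
        # (a common heuristic; the exact rule here is an assumption).
        self.gamma = (1.0 / (6.0 * (d[1] - d[0]) ** 2)).item()
        # Learnable mixing coefficients, one set per gate unit,
        # initialized to zero so the gate starts as a plain sigmoid.
        self.alpha = nn.Parameter(torch.zeros(num_units, D))

    def forward(self, s):
        # s: (batch, num_units) gate pre-activation, i.e. the usual
        # linear projection of the input and previous hidden state.
        # Gaussian kernel expansion: (batch, num_units, D).
        K = torch.exp(-self.gamma * (s.unsqueeze(-1) - self.d) ** 2)
        kaf = (K * self.alpha).sum(dim=-1)
        # Residual skip-connection around the standard sigmoid gate.
        return torch.sigmoid(s) + kaf
```

In a GRU or LSTM cell, a module like this would replace the element-wise sigmoid applied to each gate's pre-activation; because the mixing coefficients start at zero, the cell initially behaves exactly like its standard counterpart and only departs from the sigmoid as training adapts the KAF term.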


Related research

03/22/2017 · Gate Activation Signal Analysis for Gated Recurrent Neural Networks and Its Correlation with Phoneme Boundaries
In this paper we analyze the gate activation signals inside the gated re...

01/20/2017 · Gate-Variants of Gated Recurrent Unit (GRU) Neural Networks
The paper evaluates three variants of the Gated Recurrent Unit (GRU) in ...

10/04/2022 · Fast Saturating Gate for Learning Long Time Scales with Recurrent Neural Networks
Gate functions in recurrent models, such as an LSTM and GRU, play a cent...

01/29/2019 · Multikernel activation functions: formulation and a case study
The design of activation functions is a growing research area in the fie...

02/26/2020 · Refined Gate: A Simple and Effective Gating Mechanism for Recurrent Units
Recurrent neural network (RNN) has been widely studied in sequence learn...

10/22/2019 · Improving the Gating Mechanism of Recurrent Neural Networks
Gating mechanisms are widely used in neural network models, where they a...

07/17/2018 · Learning Neuron Non-Linearities with Kernel-Based Deep Neural Networks
The effectiveness of deep neural architectures has been widely supported...
