Gated Recurrent Neural Tensor Network

06/07/2017
by Andros Tjandra, et al.

Recurrent Neural Networks (RNNs) are a powerful scheme for modeling temporal and sequential data, but they need to capture long-term dependencies in a dataset and to represent the hidden layers with operations expressive enough to extract more information from the inputs. For modeling long-term dependencies, a gating mechanism helps an RNN remember and forget previous information. Representing the hidden layers with more expressive operations (i.e., tensor products) helps an RNN learn a more complex relationship between the current input and the previous hidden-layer information. Both ideas can generally improve RNN performance. In this paper, we propose a novel RNN architecture that combines the gating mechanism and the tensor product in a single model. By combining these two concepts, our proposed models learn long-term dependencies through the gating units and obtain a more expressive, direct interaction between the input and hidden layers through a tensor product over 3-dimensional array (tensor) weight parameters. We take the Long Short-Term Memory (LSTM) RNN and the Gated Recurrent Unit (GRU) RNN and incorporate a tensor product into their formulations. The resulting models are called the Long Short-Term Memory Recurrent Neural Tensor Network (LSTMRNTN) and the Gated Recurrent Unit Recurrent Neural Tensor Network (GRURNTN). We conducted experiments on word-level and character-level language modeling tasks and found that our proposed models significantly outperform the baseline models.
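
To make the core idea concrete, below is a minimal NumPy sketch of a GRURNTN-style cell: a standard GRU in which the candidate state additionally receives a bilinear tensor product x^T T h between the current input and the reset-gated previous hidden state, where T is a 3-dimensional weight tensor. The class name, the exact placement of the tensor term, and the initialization scheme are illustrative assumptions based on the abstract above, not the paper's reference implementation.

```python
import numpy as np

def bilinear_tensor(x, h, T):
    """Bilinear tensor product: out[k] = x^T T[k] h for each slice T[k]."""
    return np.einsum('i,kij,j->k', x, T, h)

class GRURNTNCell:
    """Sketch of a GRU cell whose candidate state adds a tensor-product term.

    Hypothetical names and term placement, assumed from the abstract.
    """
    def __init__(self, n_in, n_h, seed=0):
        rng = np.random.default_rng(seed)
        init = lambda *shape: rng.normal(0.0, 0.1, shape)
        # Standard GRU weights: update gate (z), reset gate (r), candidate (c).
        self.Wz, self.Uz, self.bz = init(n_h, n_in), init(n_h, n_h), np.zeros(n_h)
        self.Wr, self.Ur, self.br = init(n_h, n_in), init(n_h, n_h), np.zeros(n_h)
        self.Wc, self.Uc, self.bc = init(n_h, n_in), init(n_h, n_h), np.zeros(n_h)
        # 3-dimensional tensor weight: a direct multiplicative input-hidden
        # interaction, one bilinear form per candidate unit.
        self.T = init(n_h, n_in, n_h)

    def step(self, x, h):
        sig = lambda a: 1.0 / (1.0 + np.exp(-a))
        z = sig(self.Wz @ x + self.Uz @ h + self.bz)   # update gate
        r = sig(self.Wr @ x + self.Ur @ h + self.br)   # reset gate
        # Candidate state: the usual affine GRU terms plus the bilinear term.
        c = np.tanh(self.Wc @ x + self.Uc @ (r * h)
                    + bilinear_tensor(x, r * h, self.T) + self.bc)
        return (1.0 - z) * h + z * c                   # gated interpolation

# Toy usage: run the cell over a short random input sequence.
cell = GRURNTNCell(n_in=8, n_h=16)
h = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(5, 8)):
    h = cell.step(x, h)
print(h.shape)  # (16,)
```

Each slice T[k] gives candidate unit k its own bilinear form over the input and hidden state, which is the "more expressive and direct interaction" the abstract refers to; the trade-off is n_h * n_in * n_h extra parameters in the tensor alone.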

Related research

Gated Feedback Recurrent Neural Networks (02/09/2015)
In this work, we propose a novel recurrent neural network (RNN) architec...

Deep Gate Recurrent Neural Network (04/11/2016)
This paper introduces two recurrent neural network structures called Sim...

A Clockwork RNN (02/14/2014)
Sequence prediction and classification are ubiquitous and challenging pr...

Learning Simpler Language Models with the Differential State Framework (03/26/2017)
Learning useful information across long time lags is a critical and diff...

Tensor train decompositions on recurrent networks (06/09/2020)
Recurrent neural networks (RNN) such as long-short-term memory (LSTM) ne...

Restricted Recurrent Neural Networks (08/21/2019)
Recurrent Neural Network (RNN) and its variations such as Long Short-Ter...

SiTGRU: Single-Tunnelled Gated Recurrent Unit for Abnormality Detection (03/30/2020)
Abnormality detection is a challenging task due to the dependence on a s...
