Assessing the Memory Ability of Recurrent Neural Networks

02/18/2020
by   Cheng Zhang, et al.

It is known that Recurrent Neural Networks (RNNs) can remember, in their hidden layers, part of the semantic information expressed by a sequence (e.g., a sentence) that is being processed. Different types of recurrent units have been designed to enable RNNs to remember information over longer time spans. However, the memory abilities of different recurrent units remain theoretically and empirically unclear, limiting the development of more effective and explainable RNNs. To tackle this problem, in this paper we identify and analyze the internal and external factors that affect the memory ability of RNNs, and propose a Semantic Euclidean Space to represent the semantics expressed by a sequence. Based on the Semantic Euclidean Space, a series of evaluation indicators are defined to measure the memory abilities of different recurrent units and analyze their limitations. These evaluation indicators also provide useful guidance for selecting suitable sequence lengths for different RNNs during training.
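The abstract's premise, that gated recurrent units retain information over longer time spans than plain recurrences, can be illustrated with a toy numerical sketch. The following is a hypothetical illustration, not the paper's method or its evaluation indicators: it contrasts how a signal injected at the first time step decays under a plain tanh recurrence versus an LSTM-style forget-gated update (the weight `w` and gate value `forget` are arbitrary choices for demonstration).

```python
import math

def vanilla_step(h, w=0.5):
    # Plain RNN update: the old state is squashed through tanh at every
    # step, so information from early time steps decays multiplicatively.
    return math.tanh(w * h)

def gated_step(h, forget=0.95):
    # Gated update with a forget gate close to 1: the state is carried
    # forward almost unchanged, so early information persists longer.
    return forget * h

h_rnn = h_gated = 1.0  # signal injected at t = 0
for _ in range(50):
    h_rnn = vanilla_step(h_rnn)
    h_gated = gated_step(h_gated)

print(f"vanilla RNN state after 50 steps: {h_rnn:.6g}")
print(f"gated unit state after 50 steps:  {h_gated:.6g}")
```

After 50 steps the vanilla state has collapsed to nearly zero, while the gated state still holds a clearly non-zero trace of the initial signal, which is the intuition behind designing recurrent units with longer memory spans.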


Related research

- 06/08/2016, Improving Recurrent Neural Networks For Sequence Labelling: In this paper we study different types of Recurrent Neural Networks (RNN...
- 05/31/2015, Recurrent Neural Networks with External Memory for Language Understanding: Recurrent Neural Networks (RNNs) have become increasingly popular for th...
- 10/25/2018, Reversible Recurrent Neural Networks: Recurrent neural networks (RNNs) provide state-of-the-art performance in...
- 02/19/2019, Understanding and Controlling Memory in Recurrent Neural Networks: To be effective in sequential data processing, Recurrent Neural Networks...
- 10/15/2020, RNNs can generate bounded hierarchical languages with optimal memory: Recurrent neural networks empirically generate natural language with hig...
- 06/02/2018, A Novel Framework for Recurrent Neural Networks with Enhancing Information Processing and Transmission between Units: This paper proposes a novel framework for recurrent neural networks (RNN...
- 04/13/2018, Neural Trajectory Analysis of Recurrent Neural Network In Handwriting Synthesis: Recurrent neural networks (RNNs) are capable of learning to generate hig...
