Learning to update Auto-associative Memory in Recurrent Neural Networks for Improving Sequence Memorization

09/19/2017
by   Wei Zhang, et al.

Learning to remember long sequences remains a challenging task for recurrent neural networks. Register memory and attention mechanisms have both been proposed to address the issue, but they either incur high computational cost to keep the memory differentiable, or bias RNN representation learning toward encoding short local contexts rather than long sequences. Associative memory, which studies how multiple patterns can be compressed into a fixed-size memory, has rarely been considered in recent years. Although some recent work introduces associative memory into RNNs and mimics the energy-decay process of Hopfield networks, it inherits the shortcomings of rule-based memory updates, and its memory capacity is limited. This paper proposes a method that learns the memory update rule jointly with the task objective, improving memory capacity for remembering long sequences. We also propose an architecture that uses multiple such associative memories to encode more complex inputs. Compared with other RNN architectures on several well-studied sequence learning tasks, we observe some interesting behaviors.
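To make the idea concrete, here is a minimal sketch of the contrast the abstract draws: a classical Hopfield-style associative memory writes a key/value pair with a fixed Hebbian outer-product rule, whereas a learned update replaces the fixed write strength with a parameterized gate that would be trained jointly with the task objective. All names, dimensions, and the sigmoid-gate form below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8                  # key/value dimension (illustrative)
M = np.zeros((d, d))   # fixed-size associative memory matrix

def write(M, k, v, W_g):
    """One memory write step.

    A classical Hebbian/Hopfield write is M += outer(v, k), a fixed rule.
    Here the write strength g comes from a parameterized gate; W_g is a
    stand-in for parameters that would be learned jointly with the task
    objective, as the abstract describes.
    """
    g = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([k, v]))))  # scalar gate in (0, 1)
    return (1.0 - g) * M + g * np.outer(v, k)

def read(M, k):
    """One-shot recall: retrieve the value bound to key k."""
    return M @ k

# Store one key/value pair and read it back.
k = rng.standard_normal(d)
k /= np.linalg.norm(k)            # unit-norm key so recall is exact here
v = rng.standard_normal(d)
W_g = rng.standard_normal(2 * d) * 0.1  # untrained gate parameters (illustrative)

M = write(M, k, v, W_g)
r = read(M, k)                    # proportional to v when M starts empty
```

With an empty memory and a unit-norm key, the readout is exactly the stored value scaled by the gate; the capacity question the paper studies arises once many patterns are superimposed in the same matrix and the (learned) update must trade off old and new content.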


