Associative Long Short-Term Memory

02/09/2016
by Ivo Danihelka, et al.

We investigate a new method to augment recurrent neural networks with extra memory without increasing the number of network parameters. The system has an associative memory based on complex-valued vectors and is closely related to Holographic Reduced Representations and Long Short-Term Memory networks. Holographic Reduced Representations have limited capacity: as they store more information, each retrieval becomes noisier due to interference. Our system, in contrast, creates redundant copies of stored information, which enables retrieval with reduced noise. Experiments demonstrate faster learning on multiple memorization tasks.
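The binding-and-redundancy mechanism the abstract describes can be sketched in NumPy: key-value pairs are bound by element-wise complex multiplication (the Holographic Reduced Representation style of storage), and several memory copies with independently permuted keys are averaged at retrieval so the interference noise partially cancels. This is a minimal toy illustration, not the paper's architecture; the dimensions, permutation scheme, and all names below are assumptions.

```python
import numpy as np

# Toy redundant-copy associative memory (illustrative assumptions only).
rng = np.random.default_rng(0)
D = 1024        # vector dimension
N_ITEMS = 20    # stored key-value pairs
N_COPIES = 4    # redundant memory copies

def random_phase(n):
    """Unit-modulus complex vector (random phases), used as a key."""
    return np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, n))

keys = [random_phase(D) for _ in range(N_ITEMS)]
vals = [rng.standard_normal(D) + 1j * rng.standard_normal(D)
        for _ in range(N_ITEMS)]

# Each copy permutes the key elements independently, so the interference
# noise is decorrelated across copies and averages away on retrieval.
perms = [rng.permutation(D) for _ in range(N_COPIES)]
memory = np.zeros((N_COPIES, D), dtype=complex)
for k, v in zip(keys, vals):
    for s in range(N_COPIES):
        memory[s] += k[perms[s]] * v   # bind: element-wise complex product

def retrieve(k, copies):
    """Unbind with the conjugate key in each copy, then average."""
    reads = [memory[s] * np.conj(k[perms[s]]) for s in copies]
    return np.mean(reads, axis=0)

def rel_error(est, v):
    return np.linalg.norm(est - v) / np.linalg.norm(v)

est_single = retrieve(keys[0], [0])            # one copy: noisier
est_avg = retrieve(keys[0], range(N_COPIES))   # all copies: cleaner
err_single = rel_error(est_single, vals[0])
err_avg = rel_error(est_avg, vals[0])

# Despite residual noise, a cleanup step over candidate values still
# identifies the correct item by cosine similarity.
def cos_sim(a, b):
    return abs(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

sims = [cos_sim(est_avg, v) for v in vals]
```

Because unbinding a unit-modulus key leaves the stored value intact while the cross-talk from other items behaves like independent noise in each permuted copy, averaging over copies shrinks the retrieval error, which is the intuition the abstract attributes to the redundant-copy design.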
