Distributed Memory based Self-Supervised Differentiable Neural Computer

07/21/2020
by   Taewon Park, et al.

A differentiable neural computer (DNC) is a memory-augmented neural network devised to solve a wide range of algorithmic and question-answering tasks, and it has shown promising performance in a variety of domains. However, its operations on a single memory are not sufficient to store and retrieve the diverse informative representations that many tasks require. Furthermore, the DNC does not explicitly treat memorization itself as a training objective, which inevitably leads to very slow learning. To address these issues, we propose a novel distributed-memory-based self-supervised DNC architecture for enhanced memory-augmented neural network performance. We introduce (i) a multiple distributed memory block mechanism that stores information independently in each memory block and uses the stored information cooperatively for diverse representation, and (ii) a self-supervised memory loss term that measures how well a given input is written to the memory. Our experiments on algorithmic and question-answering tasks show that the proposed model outperforms all other DNC variants by a large margin and matches the performance of other state-of-the-art memory-based network models.
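To make the two components concrete, the sketch below shows one possible way to combine content-based reads over several independent memory blocks and to attach a reconstruction-style self-supervised memory loss. The module name, tensor shapes, additive write rule, and the linear decoder are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed design, not the paper's code): several independent
# memory blocks, each addressed by content-based cosine attention, whose read
# vectors are combined cooperatively; a reconstruction loss checks whether the
# just-written input can be recovered from memory (self-supervised memory loss).
import torch
import torch.nn.functional as F

class MultiBlockMemory(torch.nn.Module):
    def __init__(self, n_blocks=4, n_slots=16, width=32, in_dim=64):
        super().__init__()
        self.n_blocks, self.n_slots, self.width = n_blocks, n_slots, width
        # separate write/read key projections per block so blocks can specialize
        self.write_proj = torch.nn.Linear(in_dim, n_blocks * width)
        self.read_key = torch.nn.Linear(in_dim, n_blocks * width)
        self.decode = torch.nn.Linear(n_blocks * width, in_dim)  # for the memory loss

    def forward(self, x, memory):
        # x: (batch, in_dim); memory: (batch, n_blocks, n_slots, width)
        B = x.size(0)
        w = self.write_proj(x).view(B, self.n_blocks, 1, self.width)
        k = self.read_key(x).view(B, self.n_blocks, 1, self.width)

        # content-based write addressing within each block
        write_sim = F.cosine_similarity(memory, w.expand(-1, -1, self.n_slots, -1), dim=-1)
        write_w = F.softmax(write_sim, dim=-1)                      # (B, blocks, slots)
        memory = memory + write_w.unsqueeze(-1) * w                 # additive write, per block

        # content-based read addressing, then cooperative combination of all blocks
        read_sim = F.cosine_similarity(memory, k.expand(-1, -1, self.n_slots, -1), dim=-1)
        read_w = F.softmax(read_sim, dim=-1)
        reads = (read_w.unsqueeze(-1) * memory).sum(dim=2)          # (B, blocks, width)
        combined = reads.reshape(B, -1)

        # self-supervised memory loss: can the stored content reconstruct the input?
        mem_loss = F.mse_loss(self.decode(combined), x)
        return combined, memory, mem_loss

# Usage with the illustrative default sizes:
# mem = torch.zeros(2, 4, 16, 32)              # (batch, blocks, slots, width)
# module = MultiBlockMemory()
# read_vec, mem, mem_loss = module(torch.randn(2, 64), mem)
# total_loss = task_loss + lambda_mem * mem_loss   # lambda_mem is a hypothetical weight
```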


