Learning to Rehearse in Long Sequence Memorization

06/02/2021
by Zhu Zhang, et al.

Existing reasoning tasks often assume that the input contents can always be accessed during reasoning, which requires unlimited storage resources and incurs severe time delays on long sequences. To achieve efficient reasoning on long sequences with limited storage resources, memory-augmented neural networks introduce a human-like write-read memory that compresses and memorizes the long input sequence in one pass, attempting to answer subsequent queries based only on the memory. However, they have two serious drawbacks: 1) they continually update the memory with current information and inevitably forget early contents; 2) they do not distinguish which information is important and treat all contents equally. In this paper, we propose the Rehearsal Memory (RM) to enhance long-sequence memorization through self-supervised rehearsal with a history sampler. To alleviate the gradual forgetting of early information, we design self-supervised rehearsal training with recollection and familiarity tasks. Further, we design a history sampler that selects informative fragments for rehearsal training, making the memory focus on the crucial information. We evaluate the performance of our rehearsal memory on the synthetic bAbI task and several downstream tasks, including text/video question answering and recommendation on long sequences.
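The design described in the abstract can be illustrated with a minimal sketch. This is an assumed, simplified rendering in PyTorch, not the paper's implementation: the class RehearsalMemorySketch, the helpers recollection_loss and sample_fragment, and all hyperparameters are hypothetical stand-ins. It only shows the overall shape of the idea: a fixed-size slot memory written in one pass over sequence chunks, plus a rehearsal loss computed on a history fragment chosen by a sampler.

# Minimal sketch (assumed design, not the authors' code): a fixed-size memory that
# compresses a long sequence in one pass, and a toy "recollection"-style rehearsal
# loss on a fragment chosen by a simple history sampler.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RehearsalMemorySketch(nn.Module):
    def __init__(self, dim: int = 64, num_slots: int = 8):
        super().__init__()
        # Learned initial memory slots; memory size stays constant
        # regardless of the input sequence length.
        self.init_memory = nn.Parameter(torch.randn(num_slots, dim) * 0.02)
        # Cross-attention used to write each incoming chunk into the slots.
        self.write_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.update = nn.GRUCell(dim, dim)

    def forward(self, chunks: torch.Tensor) -> torch.Tensor:
        """chunks: (batch, num_chunks, chunk_len, dim) -> memory (batch, num_slots, dim)."""
        b = chunks.size(0)
        memory = self.init_memory.unsqueeze(0).expand(b, -1, -1).contiguous()
        for t in range(chunks.size(1)):
            # Slots attend to the current chunk (write step), then are updated.
            read, _ = self.write_attn(memory, chunks[:, t], chunks[:, t])
            memory = self.update(
                read.reshape(-1, read.size(-1)),
                memory.reshape(-1, memory.size(-1)),
            ).view_as(memory)
        return memory


def recollection_loss(memory: torch.Tensor, fragment: torch.Tensor) -> torch.Tensor:
    """Toy rehearsal objective: the pooled memory should stay close to a sampled
    history fragment's representation (a stand-in for the recollection task)."""
    return F.mse_loss(memory.mean(dim=1), fragment.mean(dim=1))


def sample_fragment(chunks: torch.Tensor, scores: torch.Tensor) -> torch.Tensor:
    """History-sampler stand-in: pick the highest-scoring chunk per example."""
    idx = scores.argmax(dim=1)                        # (batch,)
    return chunks[torch.arange(chunks.size(0)), idx]  # (batch, chunk_len, dim)


if __name__ == "__main__":
    batch, num_chunks, chunk_len, dim = 2, 16, 10, 64
    chunks = torch.randn(batch, num_chunks, chunk_len, dim)
    model = RehearsalMemorySketch(dim=dim)
    memory = model(chunks)
    # Scores would come from a learned sampler; random scores here for illustration.
    fragment = sample_fragment(chunks, torch.rand(batch, num_chunks))
    loss = recollection_loss(memory, fragment)
    loss.backward()
    print(memory.shape, float(loss))

In the paper itself, the recollection and familiarity tasks and the history sampler are trained jointly with the memory; the MSE objective and argmax sampler above are only placeholders for those components.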
