How Relevant is Selective Memory Population in Lifelong Language Learning?

10/03/2022
by Vladimir Araujo, et al.

Lifelong language learning seeks to have models continuously learn multiple tasks in sequential order without suffering from catastrophic forgetting. State-of-the-art approaches rely on sparse experience replay as the primary mechanism to prevent forgetting. Experience replay typically uses a sampling method to populate a fixed-size memory; however, the effect of the chosen sampling strategy on model performance has not yet been studied. In this paper, we investigate how relevant selective memory population is to the lifelong learning process of text classification and question-answering tasks. We find that methods that randomly store a uniform number of samples from the entire data stream lead to high performance, especially at low memory sizes, which is consistent with findings in computer vision.
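The best-performing strategy described above, storing a uniform random sample of the entire data stream in a fixed-size memory, corresponds to classic reservoir sampling. Below is a minimal Python sketch of that idea; the function and parameter names (reservoir_sample, memory_size) are illustrative assumptions, not taken from the paper.

```python
import random

def reservoir_sample(stream, memory_size, seed=0):
    """Populate a fixed-size replay memory with a uniform random sample
    of the stream seen so far (classic reservoir sampling).

    Names here are illustrative, not from the paper.
    """
    rng = random.Random(seed)
    memory = []
    for i, example in enumerate(stream):
        if len(memory) < memory_size:
            # Fill the memory until it reaches capacity.
            memory.append(example)
        else:
            # Replace a stored example so that every item seen so far
            # remains in memory with probability memory_size / (i + 1).
            j = rng.randint(0, i)
            if j < memory_size:
                memory[j] = example
    return memory

# Toy usage: a stream of (text, label) pairs for a classification task.
stream = [(f"text {i}", i % 5) for i in range(10_000)]
memory = reservoir_sample(stream, memory_size=100)
```

Because each stored example is uniform over the whole stream, the memory needs no task boundaries or selection heuristics, which is what makes this baseline attractive at small memory sizes.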

