Episodic Memory in Lifelong Language Learning

We introduce a lifelong language learning setup in which a model must learn from a stream of text examples without any dataset identifier. We propose an episodic memory model that performs sparse experience replay and local adaptation to mitigate catastrophic forgetting in this setup. Experiments on text classification and question answering demonstrate the complementary benefits of sparse experience replay and local adaptation in allowing the model to continuously learn from new datasets. We also show that the space complexity of the episodic memory module can be reduced significantly (by 50-90%) by randomly choosing which examples to store in memory, with a minimal decrease in performance. We consider an episodic memory component a crucial building block of general linguistic intelligence and see our model as a first step in that direction.
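For concreteness, below is a minimal sketch of how sparse experience replay, random memory writes, and local adaptation could fit together in a training loop. This is an illustrative reading of the abstract, not the authors' implementation: the names (`EpisodicMemory`, `train_step`, `local_adaptation`), the toy PyTorch classifier, and the choice of nearest neighbours in raw input space are all assumptions, and the paper's key-based retrieval and its regularizer keeping adapted weights close to the base weights are omitted for brevity.

```python
# Illustrative sketch only; names and hyperparameters are assumptions,
# not taken from the paper's released code.
import copy
import random

import torch
import torch.nn as nn
import torch.nn.functional as F


class EpisodicMemory:
    """Stores seen (input, label) pairs, written at random."""

    def __init__(self, write_prob=0.5):
        # write_prob < 1.0 subsamples writes uniformly at random; per the
        # abstract, storing only a random 10-50% of examples (a 50-90%
        # space reduction) costs little performance.
        self.buffer = []
        self.write_prob = write_prob

    def write(self, x, y):
        if random.random() < self.write_prob:
            self.buffer.append((x, y))

    def sample(self, k):
        k = min(k, len(self.buffer))
        xs, ys = zip(*random.sample(self.buffer, k))
        return torch.stack(xs), torch.stack(ys)


def train_step(model, opt, x, y, memory, step, replay_every=100, replay_k=32):
    """One step on the stream, plus sparse experience replay."""
    opt.zero_grad()
    F.cross_entropy(model(x), y).backward()
    opt.step()
    for xi, yi in zip(x, y):
        memory.write(xi, yi)
    # "Sparse" replay: revisit stored examples only once every
    # replay_every steps rather than after every batch.
    if step % replay_every == 0 and memory.buffer:
        xr, yr = memory.sample(replay_k)
        opt.zero_grad()
        F.cross_entropy(model(xr), yr).backward()
        opt.step()


def local_adaptation(model, memory, x_test, k=8, steps=5, lr=1e-3):
    """Fine-tune a throwaway copy of the model on the k stored examples
    nearest to the query, then predict with the adapted copy."""
    if not memory.buffer:
        return model(x_test.unsqueeze(0))
    xs = torch.stack([m[0] for m in memory.buffer])
    ys = torch.stack([m[1] for m in memory.buffer])
    # Nearest neighbours in raw input space (an assumption; the paper
    # retrieves neighbours with learned key representations).
    dists = torch.cdist(x_test.unsqueeze(0), xs).squeeze(0)
    idx = dists.topk(min(k, len(xs)), largest=False).indices
    adapted = copy.deepcopy(model)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    for _ in range(steps):
        inner_opt.zero_grad()
        F.cross_entropy(adapted(xs[idx]), ys[idx]).backward()
        inner_opt.step()
    return adapted(x_test.unsqueeze(0))


# Toy usage on random data, standing in for a text-example stream.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
memory = EpisodicMemory(write_prob=0.3)
for step in range(1, 501):
    x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))
    train_step(model, opt, x, y, memory, step)
logits = local_adaptation(model, memory, torch.randn(16))
```

The `write_prob` argument is what enables the claimed space reduction: writes are subsampled uniformly at random, so the buffer grows proportionally more slowly than the stream, while `replay_every` controls how sparse the replay is.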

Related research

10/03/2022
How Relevant is Selective Memory Population in Lifelong Language Learning?
Lifelong language learning seeks to have models continuously learn multi...

10/06/2020
Efficient Meta Lifelong-Learning with Limited Memory
Current natural language processing models work well on a single task, y...

09/10/2020
Meta-Learning with Sparse Experience Replay for Lifelong Language Learning
Lifelong learning requires models that can continuously learn from seque...

03/02/2023
Can BERT Refrain from Forgetting on Sequential Tasks? A Probing Study
Large pre-trained language models help to achieve state of the art on a ...

09/10/2021
Saliency Guided Experience Packing for Replay in Continual Learning
Artificial learning systems aspire to mimic human intelligence by contin...

04/06/2021
Hypothesis-driven Stream Learning with Augmented Memory
Stream learning refers to the ability to acquire and transfer knowledge ...
