DRILL: Dynamic Representations for Imbalanced Lifelong Learning

05/18/2021
by   Kyra Ahrens, et al.

Continual or lifelong learning has been a long-standing challenge in machine learning, especially in natural language processing (NLP). Although state-of-the-art language models such as BERT have ushered in a new era in this field due to their outstanding performance in multitask learning scenarios, they suffer from catastrophic forgetting when exposed to a continuous stream of data with shifting distributions. In this paper, we introduce DRILL, a novel continual learning architecture for open-domain text classification. DRILL leverages a biologically inspired self-organizing neural architecture to selectively gate latent language representations from BERT in a task-incremental manner. Our experiments demonstrate that DRILL outperforms current methods in a realistic scenario of imbalanced, non-stationary data without prior knowledge of task boundaries. To the best of our knowledge, DRILL is the first of its kind to use a self-organizing neural architecture for open-domain lifelong learning in NLP.
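The core idea of routing latent representations through a self-organizing structure can be illustrated with a toy sketch. The snippet below is not the authors' implementation; it uses a minimal self-organizing map (SOM)-style layer with random vectors standing in for BERT embeddings, and all names (`SelfOrganizingGate`, `gate`, `update`) are hypothetical.

```python
import numpy as np

class SelfOrganizingGate:
    """Toy SOM-style layer that routes an input embedding to its closest
    prototype unit and gates the embedding along that unit's direction.
    Illustrative sketch only, with random stand-ins for BERT outputs."""

    def __init__(self, n_units=16, dim=768, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.prototypes = rng.normal(size=(n_units, dim))
        self.lr = lr

    def best_matching_unit(self, x):
        # Winner = prototype closest to the input in Euclidean distance.
        return int(np.argmin(np.linalg.norm(self.prototypes - x, axis=1)))

    def update(self, x):
        # Move only the winning prototype toward the input, so units
        # self-organize to cover different regions of the embedding space.
        bmu = self.best_matching_unit(x)
        self.prototypes[bmu] += self.lr * (x - self.prototypes[bmu])
        return bmu

    def gate(self, x):
        # Gate the latent representation: keep the component of x that is
        # aligned with the winning prototype's direction.
        bmu = self.best_matching_unit(x)
        w = self.prototypes[bmu]
        w = w / (np.linalg.norm(w) + 1e-8)
        return np.dot(x, w) * w

# Stand-in for a 768-d BERT [CLS] embedding (random here for illustration).
rng = np.random.default_rng(42)
embedding = rng.normal(size=768)

som_gate = SelfOrganizingGate()
unit = som_gate.update(embedding)   # index of the winning unit
gated = som_gate.gate(embedding)    # gated 768-d representation
```

Because each incoming sample updates only its winning unit, different tasks in a non-stationary stream tend to recruit different units, which is one intuition behind using self-organization to limit interference between tasks.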


Related research

- Continual Domain Adaptation for Machine Reading Comprehension (08/25/2020): Machine reading comprehension (MRC) has become a core component in a var...
- Online continual learning with no task boundaries (03/20/2019): Continual learning is the ability of an agent to learn online with a non...
- Continual Lifelong Learning in Natural Language Processing: A Survey (12/17/2020): Continual learning (CL) aims to enable information systems to learn from...
- Continual Learning of Natural Language Processing Tasks: A Survey (11/23/2022): Continual learning (CL) is an emerging learning paradigm that aims to em...
- Unsupervised Continual Learning and Self-Taught Associative Memory Hierarchies (04/03/2019): We first pose the Unsupervised Continual Learning (UCL) problem: learnin...
- Enabling Continual Learning with Differentiable Hebbian Plasticity (06/30/2020): Continual learning is the problem of sequentially learning new tasks or ...
- Pitfalls of Static Language Modelling (02/03/2021): Our world is open-ended, non-stationary and constantly evolving; thus wh...
