
Continual Lifelong Learning in Natural Language Processing: A Survey

by Magdalena Biesialska, et al.

Continual learning (CL) aims to enable information systems to learn from a continuous data stream across time. However, it is difficult for existing deep learning architectures to learn a new task without largely forgetting previously acquired knowledge. Furthermore, CL is particularly challenging for language learning, as natural language is ambiguous: it is discrete, compositional, and its meaning is context-dependent. In this work, we look at the problem of CL through the lens of various NLP tasks. Our survey discusses major challenges in CL and current methods applied in neural network models. We also provide a critical review of the existing CL evaluation methods and datasets in NLP. Finally, we present our outlook on future research directions.
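The forgetting problem the abstract refers to, usually called catastrophic forgetting, can be seen even in the smallest possible setting. The sketch below is purely illustrative and not from the survey: a single-parameter linear model is trained by SGD on task A, then on task B with a conflicting target, after which its error on task A is large again. The data, learning rate, and function names are all assumptions made for this toy demonstration.

```python
# Toy illustration (not from the survey) of catastrophic forgetting:
# one shared parameter w, trained sequentially on two conflicting tasks.

def sgd_fit(w, data, lr=0.1, epochs=50):
    """Fit y = w * x by stochastic gradient descent on squared error."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

def mse(w, data):
    """Mean squared error of y = w * x on a dataset."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

task_a = [(1.0, 2.0), (2.0, 4.0)]    # task A: target function y = 2x
task_b = [(1.0, -3.0), (2.0, -6.0)]  # task B: target function y = -3x

w = 0.0
w = sgd_fit(w, task_a)
loss_a_before = mse(w, task_a)  # near zero: task A has been learned
w = sgd_fit(w, task_b)
loss_a_after = mse(w, task_a)   # large: fitting task B overwrote task A

print(f"task A error before task B: {loss_a_before:.6f}")
print(f"task A error after  task B: {loss_a_after:.6f}")
```

Because both tasks share the single parameter, optimizing for task B necessarily destroys the task A solution; CL methods surveyed in the paper (regularization, rehearsal, architectural expansion) are different strategies for avoiding exactly this overwrite in high-dimensional models.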



