Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation

03/08/2022
by Chenze Shao, et al.

Neural networks tend to gradually forget previously learned knowledge when learning multiple tasks sequentially from dynamic data distributions. This problem, called catastrophic forgetting, is a fundamental challenge in the continual learning of neural networks. In this work, we observe that catastrophic forgetting not only occurs in continual learning but also affects traditional static training. Neural networks, especially neural machine translation models, suffer from catastrophic forgetting even when they learn from a static training set. Specifically, the final model pays imbalanced attention to training samples: recently exposed samples attract more attention than earlier ones. The underlying cause is that training samples do not receive balanced attention in each model update, so we name this problem imbalanced training. To alleviate this problem, we propose Complementary Online Knowledge Distillation (COKD), which uses dynamically updated teacher models trained on specific data orders to iteratively provide complementary knowledge to the student model. Experimental results on multiple machine translation tasks show that our method successfully alleviates the problem of imbalanced training and achieves substantial improvements over strong baseline systems.
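The abstract describes COKD only at a high level. Below is a minimal sketch of the underlying idea, not the authors' released code: a teacher is updated online on a complementary data order (here simply the reversed order), and the student adds a distillation term against the teacher's predictions, so the samples the student saw early are exactly those the teacher saw most recently. The toy classification setup, network sizes, mixing weight alpha, and temperature T are illustrative assumptions standing in for an NMT model and its training configuration.

    # Sketch of complementary online knowledge distillation on toy data (PyTorch).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    torch.manual_seed(0)

    # Toy classification batches standing in for translation training batches.
    X = torch.randn(512, 32)
    y = torch.randint(0, 10, (512,))
    batches = [(X[i:i + 64], y[i:i + 64]) for i in range(0, 512, 64)]

    def make_model():
        return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

    student, teacher = make_model(), make_model()
    opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
    opt_t = torch.optim.Adam(teacher.parameters(), lr=1e-3)

    alpha, T = 0.5, 2.0  # distillation weight and temperature (assumed values)

    for epoch in range(3):
        # Teacher is updated online on the reversed (complementary) order, so the
        # samples the student saw earliest are the teacher's most recent ones.
        for xb, yb in reversed(batches):
            opt_t.zero_grad()
            F.cross_entropy(teacher(xb), yb).backward()
            opt_t.step()

        # Student sees the original order; its loss mixes cross-entropy with a
        # KL distillation term against the frozen teacher predictions.
        for xb, yb in batches:
            opt_s.zero_grad()
            logits = student(xb)
            with torch.no_grad():
                t_logits = teacher(xb)
            ce = F.cross_entropy(logits, yb)
            kd = F.kl_div(
                F.log_softmax(logits / T, dim=-1),
                F.softmax(t_logits / T, dim=-1),
                reduction="batchmean",
            ) * (T * T)
            ((1 - alpha) * ce + alpha * kd).backward()
            opt_s.step()

The design intent, under these assumptions, is that the teacher's recency bias is the mirror image of the student's, so distillation supplies knowledge about exactly the samples the student has most thoroughly "forgotten".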


