Meta-Curriculum Learning for Domain Adaptation in Neural Machine Translation

by Runzhe Zhan et al.

Meta-learning has been shown to benefit low-resource neural machine translation (NMT). However, we find that meta-trained NMT fails to improve translation performance on domains unseen at the meta-training stage. In this paper, we aim to alleviate this issue by proposing a novel meta-curriculum learning method for domain adaptation in NMT. During meta-training, the NMT model first learns curricula of the commonalities shared across domains, which keeps it from falling into a bad local optimum early on, and then learns curricula of each domain's individualities, which improves its robustness when acquiring domain-specific knowledge. Experimental results on 10 different low-resource domains show that meta-curriculum learning improves translation performance on both familiar and unfamiliar domains. All the code and data are freely available at
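The two-phase schedule described above (domain-shared curricula first, domain-specific curricula later) can be sketched as a divergence-based ordering of training sentences. The concrete scoring used here, agreement between a general-domain and an in-domain unigram language model, is an illustrative assumption for the sketch, not the paper's exact implementation:

```python
# Sketch of a meta-curriculum ordering (assumption: divergence is measured
# as the gap between a general-domain LM and an in-domain LM; the paper's
# actual scoring may differ). Sentences both models treat alike are taken
# as domain-shared and scheduled first; sentences the models disagree on
# are taken as domain-specific and scheduled last.
import math
from collections import Counter


def make_unigram_nll(corpus):
    """Build a toy add-one-smoothed unigram model; returns avg. NLL per token."""
    counts = Counter(tok for sent in corpus for tok in sent.split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 slot for unseen tokens

    def nll(sentence):
        toks = sentence.split()
        return sum(-math.log((counts[t] + 1) / (total + vocab)) for t in toks) / len(toks)

    return nll


def curriculum_order(examples, general_nll, domain_nll):
    """Order examples from domain-shared to domain-specific.

    Divergence = |domain_nll(x) - general_nll(x)|: small when both models
    score a sentence similarly (shared features), large when only one
    domain explains it well (domain-specific features).
    """
    return sorted(examples, key=lambda x: abs(domain_nll(x) - general_nll(x)))


# Toy corpora standing in for a general-domain and a medical-domain dataset.
general = make_unigram_nll(["the cat sat", "the dog ran", "a cat ran"])
medical = make_unigram_nll(["the patient sat", "dosage of the drug", "patient dosage"])

batch = ["the cat ran", "patient dosage", "the patient ran"]
ordered = curriculum_order(batch, general, medical)
print(ordered)
# Shared-vocabulary sentence first, purely domain-specific sentence last.
```

Under this ordering, early meta-training updates come from sentences whose features transfer across domains, and only later updates come from sentences that force the model to encode knowledge specific to one domain.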



An Empirical Study of Domain Adaptation for Unsupervised Neural Machine Translation

Domain adaptation methods have been well-studied in supervised neural ma...

Meta-Learning for Few-Shot NMT Adaptation

We present META-MT, a meta-learning approach to adapt Neural Machine Tra...

Improving both domain robustness and domain adaptability in machine translation

We address two problems of domain adaptation in neural machine translati...

Meta-Learning for Low-Resource Unsupervised Neural Machine Translation

Unsupervised machine translation, which utilizes unpaired monolingual co...

Token-wise Curriculum Learning for Neural Machine Translation

Existing curriculum learning approaches to Neural Machine Translation (N...

Dynamic Curriculum Learning for Low-Resource Neural Machine Translation

Large amounts of data have made neural machine translation (NMT) a big su...

Self-Induced Curriculum Learning in Neural Machine Translation

Self-supervised neural machine translation (SS-NMT) learns how to extrac...