Curriculum Learning for Domain Adaptation in Neural Machine Translation

05/14/2019
by Xuan Zhang, et al.

We introduce a curriculum learning approach to adapt generic neural machine translation models to a specific domain. Samples are grouped by their similarities to the domain of interest and each group is fed to the training algorithm with a particular schedule. This approach is simple to implement on top of any neural framework or architecture, and consistently outperforms both unadapted and adapted baselines in experiments with two distinct domains and two language pairs.
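The abstract sketches the core idea: score each training sample by its similarity to the target domain, group samples by score, and feed the groups to training on a schedule. The Python sketch below is a minimal illustration of that idea, not the paper's exact procedure: the helper names (group_by_similarity, curriculum_batches), the phase-wise schedule that gradually admits less similar groups, and the placeholder scoring function are all assumptions introduced here for clarity.

```python
# Illustrative curriculum scheduling for domain adaptation (not the paper's exact recipe).
# Samples are ranked by a domain-similarity score, split into equal-sized groups,
# and groups are introduced into training from most to least in-domain.

import random
from typing import Callable, List, Sequence, Tuple

Sample = Tuple[str, str]  # (source sentence, target sentence)


def group_by_similarity(
    samples: Sequence[Sample],
    score_fn: Callable[[Sample], float],
    num_groups: int,
) -> List[List[Sample]]:
    """Rank samples by domain similarity (higher = more in-domain)
    and split them into `num_groups` contiguous shards."""
    ranked = sorted(samples, key=score_fn, reverse=True)
    shard_size = (len(ranked) + num_groups - 1) // num_groups
    return [ranked[i:i + shard_size] for i in range(0, len(ranked), shard_size)]


def curriculum_batches(
    groups: List[List[Sample]],
    batch_size: int,
    epochs_per_phase: int = 1,
):
    """Yield minibatches phase by phase: phase k draws from the union of the
    k most in-domain groups, so training revisits the most relevant samples
    while gradually exposing the model to less similar data."""
    for phase in range(1, len(groups) + 1):
        pool = [s for g in groups[:phase] for s in g]
        for _ in range(epochs_per_phase):
            random.shuffle(pool)
            for i in range(0, len(pool), batch_size):
                yield pool[i:i + batch_size]


if __name__ == "__main__":
    # Toy corpus and a stand-in scorer; a real scorer could be, for example,
    # a language-model-based domain-similarity measure.
    corpus = [(f"src sentence {i}", f"tgt sentence {i}") for i in range(100)]
    dummy_score = lambda sample: random.random()
    groups = group_by_similarity(corpus, dummy_score, num_groups=4)
    for batch in curriculum_batches(groups, batch_size=16):
        pass  # feed `batch` to the NMT training step of your framework
```

Because every phase re-samples from the most in-domain groups, the schedule keeps the relevant data weighted heavily while still covering the full corpus by the final phase; this is one plausible reading of "each group is fed to the training algorithm with a particular schedule."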

Related research

Domain specialization: a post-training domain adaptation for Neural Machine Translation (12/19/2016)
Domain adaptation is a key feature in Machine Translation. It generally ...

Curriculum Learning and Minibatch Bucketing in Neural Machine Translation (07/29/2017)
We examine the effects of particular orderings of sentence pairs on the ...

An Empirical Exploration of Curriculum Learning for Neural Machine Translation (11/02/2018)
Machine translation systems based on deep neural networks are expensive ...

Epi-Curriculum: Episodic Curriculum Learning for Low-Resource Domain Adaptation in Neural Machine Translation (09/06/2023)
Neural Machine Translation (NMT) models have become successful, but thei...

Freezing Subnetworks to Analyze Domain Adaptation in Neural Machine Translation (09/14/2018)
To better understand the effectiveness of continued training, we analyze...

Compact Personalized Models for Neural Machine Translation (11/05/2018)
We propose and compare methods for gradient-based domain adaptation of s...

Dynamically Composing Domain-Data Selection with Clean-Data Selection by "Co-Curricular Learning" for Neural Machine Translation (06/03/2019)
Noise and domain are important aspects of data quality for neural machin...
