An Empirical Exploration of Curriculum Learning for Neural Machine Translation

11/02/2018
by Xuan Zhang, et al.

Machine translation systems based on deep neural networks are expensive to train. Curriculum learning aims to address this issue by choosing the order in which samples are presented during training, with the goal of training better models faster. We adopt a probabilistic view of curriculum learning, which lets us flexibly evaluate the impact of curriculum design, and perform an extensive exploration on a German-English translation task. Results show that it is possible to improve convergence time with no loss in translation quality. However, results are highly sensitive to the choice of sample difficulty criterion, curriculum schedule, and other hyperparameters.
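
The abstract does not spell out a concrete sampling scheme, so the sketch below illustrates one simple way a difficulty-based curriculum could be realized: sentence pairs are binned into shards by a difficulty proxy (source length here), and each training phase samples only from the shards exposed so far. All function names, the length-based criterion, and the phased schedule are assumptions for illustration, not the paper's specific method.

```python
import random

# Minimal, illustrative sketch of curriculum-style sampling for NMT training
# data. The length-based difficulty proxy, shard layout, and function names
# are assumptions; the paper explores several difficulty criteria and
# schedules rather than prescribing this particular one.

def assign_shards(pairs, num_shards=4):
    """Rank sentence pairs by a difficulty proxy (source length) and split
    them into shards, easiest first."""
    ranked = sorted(pairs, key=lambda pair: len(pair[0].split()))
    shard_size = max(1, len(ranked) // num_shards)
    shards = [ranked[i * shard_size:(i + 1) * shard_size]
              for i in range(num_shards - 1)]
    shards.append(ranked[(num_shards - 1) * shard_size:])  # remainder joins the hardest shard
    return shards

def sample_batch(shards, phase, batch_size=32):
    """During phase k, sample uniformly from shards 0..k, so harder examples
    are only exposed as training progresses."""
    visible = [pair for shard in shards[:phase + 1] for pair in shard]
    return random.sample(visible, min(batch_size, len(visible)))

# Toy usage: two synthetic German-English pairs repeated to form a corpus.
corpus = [("ein kurzes Beispiel", "a short example"),
          ("dies ist ein deutlich laengerer deutscher Beispielsatz",
           "this is a considerably longer German example sentence")] * 50
shards = assign_shards(corpus)
for phase in range(len(shards)):
    batch = sample_batch(shards, phase)
    # train_step(batch)  # stand-in for one NMT optimizer update
```

In a real system the phase would advance on a fixed schedule (e.g., every N optimizer updates), and the uniform distribution over visible shards could be replaced by any probability distribution over difficulty bins, which is the kind of flexibility the probabilistic view of curriculum learning is meant to provide.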

Related research

- Curriculum Learning for Domain Adaptation in Neural Machine Translation (05/14/2019)
- Exploiting Curriculum Learning in Unsupervised Neural Machine Translation (09/23/2021)
- CLIP: Train Faster with Less Data (12/02/2022)
- Curriculum Learning for Recurrent Video Object Segmentation (08/15/2020)
- Evaluating Curriculum Learning Strategies in Neural Combinatorial Optimization (11/12/2020)
- A Strategy for Expert Recommendation From Open Data Available on the Lattes Platform (06/14/2019)
- Self-Guided Curriculum Learning for Neural Machine Translation (05/10/2021)
