Netmarble AI Center's WMT21 Automatic Post-Editing Shared Task Submission

09/14/2021
by Shinhyeok Oh, et al.

This paper describes Netmarble's submission to the WMT21 Automatic Post-Editing (APE) Shared Task for the English-German language pair. First, we propose a Curriculum Training Strategy for the training stages. We chose Facebook FAIR's WMT19 news translation model to leverage a large, powerful pre-trained neural network. We then post-train the translation model with different levels of data at each training stage. As the training stages progress, the system gradually learns to solve multiple tasks as extra information is added at each stage. We also show how to utilize large volumes of additional data for the APE task. For further improvement, we apply a Multi-Task Learning Strategy with Dynamic Weight Average (DWA) during the fine-tuning stage. To fine-tune on the limited APE corpus, we add related subtasks so that the model learns a unified representation. Finally, for better performance, we leverage external translations as augmented machine translation (MT) during post-training and fine-tuning. Experimental results show that our APE system significantly improves the provided MT results by -2.848 TER and +3.74 BLEU on the development dataset. It also demonstrates its effectiveness on the test dataset, whose quality is higher than that of the development dataset.
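Since Dynamic Weight Average (Liu et al., 2019) is central to the multi-task fine-tuning stage described above, a minimal sketch of how DWA task weights are typically computed is shown below. This is not Netmarble's code: the function name `dwa_weights`, the epoch-level loss-history format, and the temperature default of 2.0 (the value suggested by Liu et al.) are illustrative assumptions.

```python
import math

def dwa_weights(loss_history, temperature=2.0):
    """Compute Dynamic Weight Average task weights (Liu et al., 2019).

    loss_history: one list of per-epoch average losses per task,
                  e.g. [[ape_losses...], [subtask_losses...]].
    Returns K weights that sum to K (the number of tasks).
    """
    K = len(loss_history)
    # With fewer than two recorded epochs, fall back to uniform weights.
    if any(len(h) < 2 for h in loss_history):
        return [1.0] * K
    # w_k = L_k(t-1) / L_k(t-2): tasks whose loss is shrinking slowly
    # (ratio near or above 1) receive relatively more weight.
    ratios = [h[-1] / h[-2] for h in loss_history]
    exps = [math.exp(r / temperature) for r in ratios]
    total = sum(exps)
    return [K * e / total for e in exps]

# Example: two tasks with two epochs of recorded mean losses. The combined
# loss for the next epoch would then be sum(w_k * L_k) over the tasks.
history = [[1.20, 0.90], [0.80, 0.78]]
print(dwa_weights(history))  # the task decaying more slowly gets more weight
```

In a setting like the one the abstract describes, such weights would balance the APE objective against the added subtasks each epoch during fine-tuning; the exact subtasks and schedule are specified in the full paper, not here.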
