DiDi's Machine Translation System for WMT2020

by Tanfang Chen et al.

This paper describes DiDi AI Labs' submission to the WMT2020 news translation shared task. We participate in the Chinese->English translation direction. We use the Transformer as our baseline model and enhance it with several techniques, including data filtering, data selection, back-translation, fine-tuning, model ensembling, and re-ranking. As a result, our submission achieves a BLEU score of 36.6 on Chinese->English.
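Two of the listed techniques, model ensembling and re-ranking, can be illustrated by a minimal sketch: score each candidate translation under several models and re-rank by the averaged log-probability. This is a generic illustration of the idea, not the paper's exact scoring formula; the candidate strings and scores below are invented for the example.

```python
def ensemble_score(logprobs):
    """Average the per-model log-probabilities assigned to one candidate."""
    return sum(logprobs) / len(logprobs)

def rerank(candidates):
    """candidates: list of (translation, [logprob from model 1, model 2, ...]).

    Returns the translations sorted best-first by ensemble score.
    """
    return [text for text, scores in
            sorted(candidates, key=lambda c: ensemble_score(c[1]), reverse=True)]

# Hypothetical n-best list scored by two models.
candidates = [
    ("translation A", [-2.1, -1.9]),   # average log-prob: -2.00
    ("translation B", [-1.2, -1.5]),   # average log-prob: -1.35
]
print(rerank(candidates)[0])  # -> translation B
```

In practice the candidate list would come from beam search, and systems often combine left-to-right, right-to-left, and target-to-source models as re-ranking features rather than a plain average.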



