The NiuTrans Machine Translation Systems for WMT21

09/22/2021
by Shuhan Zhou, et al.

This paper describes the NiuTrans neural machine translation systems for the WMT 2021 news translation tasks. We submitted systems for nine language directions, including English↔{Chinese, Japanese, Russian, Icelandic} and English→Hausa. Our primary systems are built on several effective variants of the Transformer, e.g., Transformer-DLCL and ODE-Transformer. We also use back-translation, knowledge distillation, post-ensemble, and iterative fine-tuning to further enhance model performance.
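Of the techniques listed, post-ensemble combines systems at the output level rather than by averaging model parameters. As a minimal sketch (an illustration under assumed details, not the authors' exact scoring), one common formulation selects, from the hypotheses produced by several systems, the one that agrees most with the others on average:

```python
# Post-ensemble sketch (hypothetical scoring, not the paper's exact method):
# given one candidate translation per system, pick the candidate with the
# highest average token-set Jaccard similarity to all other candidates,
# i.e., the "consensus" hypothesis.

def jaccard(a: set, b: set) -> float:
    """Token-set Jaccard similarity between two hypotheses."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def post_ensemble(candidates: list[str]) -> str:
    """Select the candidate that agrees most with the other systems."""
    if len(candidates) < 2:
        return candidates[0]
    token_sets = [set(c.split()) for c in candidates]

    def avg_sim(i: int) -> float:
        # Mean similarity of candidate i to every other candidate.
        sims = [jaccard(token_sets[i], t)
                for j, t in enumerate(token_sets) if j != i]
        return sum(sims) / len(sims)

    best = max(range(len(candidates)), key=avg_sim)
    return candidates[best]

hyps = [
    "the cat sat on the mat",
    "the cat sat on a mat",
    "a dog ran in the park",
]
print(post_ensemble(hyps))  # -> "the cat sat on a mat"
```

The outlier hypothesis is penalized because it shares few tokens with the rest; in practice, systems use stronger sentence-similarity measures than token-set Jaccard, but the selection principle is the same.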


