Microsoft Research Asia's Systems for WMT19

by Yingce Xia, et al.

We, Microsoft Research Asia, made submissions to 11 language directions in the WMT19 news translation tasks. We won first place in 8 of the 11 directions and second place in the other three. Our basic systems are built on Transformer, back translation, and knowledge distillation. We integrate several of our recent techniques to enhance the baseline systems: multi-agent dual learning (MADL), masked sequence-to-sequence pre-training (MASS), neural architecture optimization (NAO), and soft contextual data augmentation (SCA).
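Back translation, one of the baseline techniques named above, augments training data by running monolingual target-language text through a reverse (target-to-source) model to produce synthetic parallel pairs. A minimal sketch of the idea, where `reverse_translate` is a hypothetical stand-in for a trained target-to-source NMT model (not part of the paper's code):

```python
def reverse_translate(target_sentence: str) -> str:
    # Hypothetical placeholder: a real system would decode this sentence
    # with a trained target->source Transformer model.
    return "<synthetic source for: " + target_sentence + ">"

def back_translate(monolingual_target: list) -> list:
    """Turn monolingual target-side text into synthetic parallel pairs.

    Each pair is (synthetic source, real target). The synthetic pairs are
    then mixed with genuine bitext when training the source->target model.
    """
    return [(reverse_translate(t), t) for t in monolingual_target]

# Example: German monolingual data used to augment an En->De system.
mono = ["Guten Morgen.", "Das Wetter ist schön."]
pairs = back_translate(mono)
for src, tgt in pairs:
    print(src, "|||", tgt)
```

The key property is that the target side of every synthetic pair is genuine human-written text, so the source->target model still learns to produce fluent output even though the synthetic sources are noisy.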




