Microsoft Translator at WMT 2019: Towards Large-Scale Document-Level Neural Machine Translation

07/14/2019
by Marcin Junczys-Dowmunt, et al.

This paper describes the Microsoft Translator submissions to the WMT19 news translation shared task for English-German. Our main focus is document-level neural machine translation with deep transformer models. We start with strong sentence-level baselines, trained on large-scale data created via data filtering and noisy back-translation, and find that back-translation seems to help mainly with translationese input. We explore fine-tuning techniques, deeper models, and different ensembling strategies to counter these effects. Using document boundaries present in the authentic and synthetic parallel data, we create sequences of up to 1000 subword segments and train transformer translation models on them. We experiment with data augmentation techniques for the smaller authentic data with document boundaries and for the larger authentic data without boundaries. We further explore multi-task training to incorporate document-level source-language monolingual data via the BERT objective on the encoder, as well as two-pass decoding for combinations of sentence-level and document-level systems. Based on preliminary human evaluation results, evaluators strongly prefer the document-level systems over our comparable sentence-level system. The document-level systems also seem to score higher than the human references in source-based direct assessment.
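To illustrate the document-level data construction mentioned in the abstract, the sketch below greedily concatenates consecutive sentence pairs of a document into training sequences capped at 1000 subword segments per side. This is a minimal illustration, not the paper's actual pipeline: the function names, data layout, and the `<sep>` boundary marker are assumptions made here for clarity.

```python
# Hypothetical sketch: building document-level training sequences from
# sentence pairs grouped by document boundaries. Each document is assumed
# to be a list of (source_tokens, target_tokens) pairs, already split into
# subword units. The 1000-subword budget follows the abstract; the
# separator token and helper names are illustrative only.

MAX_SUBWORDS = 1000
SEP = "<sep>"  # assumed marker between concatenated sentences

def build_document_sequences(document):
    """Greedily merge consecutive sentence pairs of one document into
    training examples whose source and target sides each stay within
    the subword budget."""
    sequences = []
    src_buf, tgt_buf = [], []
    for src_tokens, tgt_tokens in document:
        # Flush the buffer if adding this sentence would exceed the budget
        # on either side (the +1 accounts for the separator token).
        if src_buf and (len(src_buf) + len(src_tokens) + 1 > MAX_SUBWORDS
                        or len(tgt_buf) + len(tgt_tokens) + 1 > MAX_SUBWORDS):
            sequences.append((src_buf, tgt_buf))
            src_buf, tgt_buf = [], []
        if src_buf:
            src_buf.append(SEP)
            tgt_buf.append(SEP)
        src_buf.extend(src_tokens)
        tgt_buf.extend(tgt_tokens)
    if src_buf:
        sequences.append((src_buf, tgt_buf))
    return sequences
```

Under this scheme, sentences from the same document share one translation example, so the model can condition on preceding context; documents without boundary information would simply yield single-sentence sequences.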


