Learning Contextualized Sentence Representations for Document-Level Neural Machine Translation

03/30/2020
by Pei Zhang, et al.

Document-level machine translation incorporates inter-sentential dependencies into the translation of a source sentence. In this paper, we propose a new framework that models cross-sentence dependencies by training neural machine translation (NMT) to predict both the target translation and the surrounding sentences of a source sentence. By forcing the NMT model to predict the source context, we want the model to learn "contextualized" source sentence representations that capture document-level dependencies on the source side. We further propose two methods to learn and integrate such contextualized sentence embeddings into NMT: a joint training method that trains an NMT model together with the source context prediction model, and a pretraining and fine-tuning method that pretrains the source context prediction model on a large-scale monolingual document corpus and then fine-tunes it with the NMT model. Experiments on Chinese-English and English-German translation show that both methods substantially improve translation quality over a strong document-level Transformer baseline.
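The joint training method combines the usual translation objective with an auxiliary loss for predicting the surrounding source sentences. The following is a minimal sketch of such a combined objective, not the paper's implementation: the per-token cross-entropies and the interpolation weight `lam` are assumptions introduced here for illustration.

```python
import math

def cross_entropy(probs, target_idx):
    """Negative log-likelihood of the target token under a predicted distribution."""
    return -math.log(probs[target_idx])

def joint_loss(trans_probs, trans_target, ctx_probs_list, ctx_targets, lam=0.5):
    """Sketch of a joint objective: L = L_trans + lam * L_ctx.

    L_trans is the translation loss, L_ctx sums the losses for predicting
    tokens of the surrounding source sentences; `lam` is a hypothetical
    interpolation weight, not a value from the paper.
    """
    l_trans = cross_entropy(trans_probs, trans_target)
    l_ctx = sum(cross_entropy(p, t) for p, t in zip(ctx_probs_list, ctx_targets))
    return l_trans + lam * l_ctx

# Toy example over a 3-word vocabulary: one translation token and one
# context token, each with a predicted distribution and a gold index.
loss = joint_loss([0.7, 0.2, 0.1], 0, [[0.5, 0.3, 0.2]], [0], lam=0.5)
```

With `lam = 0` the objective reduces to plain sentence-level NMT training, which is one natural way to read the baseline in the experiments.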

Related research

- 03/11/2020: Capturing document context inside sentence-level neural machine translation models with self-training
- 07/14/2019: Microsoft Translator at WMT 2019: Towards Large-Scale Document-Level Neural Machine Translation
- 05/31/2021: Verdi: Quality Estimation and Error Detection for Bilingual
- 11/30/2017: Cache-based Document-level Neural Machine Translation
- 06/07/2021: Diverse Pretrained Context Encodings Improve Document Translation
- 06/12/2018: Fusing Recency into Neural Machine Translation with an Inter-Sentence Gate Model
- 02/11/2021: Towards Personalised and Document-level Machine Translation of Dialogue
