Capturing document context inside sentence-level neural machine translation models with self-training

03/11/2020
by Elman Mansimov, et al.

Neural machine translation (NMT) has arguably achieved human-level parity when trained and evaluated at the sentence level. Document-level neural machine translation has received less attention and lags behind its sentence-level counterpart. Most proposed document-level approaches investigate ways of conditioning the model on several source or target sentences to capture document context; these approaches require training a specialized NMT model from scratch on parallel document-level corpora. We propose an approach that does not require training a specialized model on parallel document-level corpora and is applied to a trained sentence-level NMT model at decoding time. We process the document from left to right multiple times and self-train the sentence-level model on pairs of source sentences and generated translations. Our approach reinforces the choices made by the model, making it more likely that the same choices will be made in other sentences of the document. We evaluate our approach on three document-level datasets: NIST Chinese-English, WMT'19 Chinese-English, and OpenSubtitles English-Russian. We demonstrate that our approach achieves a higher BLEU score and higher human preference than the baseline. Qualitative analysis shows that the choices made by the model are consistent across the document.
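As a rough illustration of the decode-time self-training loop described in the abstract, the sketch below shows one way such a procedure could look. The `SentenceNMT` wrapper and its `translate` and `train_step` methods are hypothetical names introduced here for illustration; they are not the authors' implementation or any specific library's API.

```python
# Minimal sketch of decode-time self-training over a document,
# assuming a hypothetical SentenceNMT model with translate() and
# train_step() methods (not from the paper or a real library).

from copy import deepcopy

def self_train_decode(model, document, num_passes=2, lr=1e-5):
    """Translate a document left to right several times, fine-tuning a
    copy of the sentence-level model on its own (source, translation)
    pairs so that lexical choices stay consistent across sentences."""
    adapted = deepcopy(model)      # leave the original model untouched
    translations = []
    for _ in range(num_passes):
        translations = []
        for src in document:       # process sentences left to right
            hyp = adapted.translate(src)
            translations.append(hyp)
            # Reinforce the choice just made: one gradient step on the
            # model's own output, treated as a pseudo-reference.
            adapted.train_step(src, hyp, learning_rate=lr)
    return translations
```

Adapting a copy of the model and discarding it after decoding keeps the procedure purely test-time: no parallel document-level corpus or specialized architecture is needed, matching the setup the abstract describes.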


