
Context-Adaptive Document-Level Neural Machine Translation

by   Linlin Zhang, et al.

Most existing document-level neural machine translation (NMT) models leverage a fixed number of previous source sentences, or all global source sentences, to address the context-independence problem of standard NMT. However, translating each source sentence benefits from a different amount of context, and inappropriate context may harm translation performance. In this work, we introduce a data-adaptive method that enables the model to adopt only the necessary and useful context. Specifically, we introduce a lightweight predictor into two document-level translation models to select the explicit context. Experiments demonstrate that the proposed approach significantly improves performance over previous methods, with gains of up to 1.99 BLEU points.
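To illustrate the idea of adaptive context selection, the sketch below shows a hypothetical lightweight predictor that decides, per source sentence, which preceding sentences to pass to a document-level model as context. The bag-of-words encoder, cosine score, threshold, and all function names here are illustrative stand-ins, not the learned components described in the paper.

```python
# Hypothetical sketch: a lightweight context selector that scans back over
# preceding sentences and keeps only those similar enough to the current one.
# The encoder and similarity score are toy placeholders for learned modules.
from collections import Counter
import math

def encode(sentence):
    """Bag-of-words vector (placeholder for a learned sentence encoder)."""
    return Counter(sentence.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_context(document, idx, max_ctx=3, threshold=0.2):
    """Return the previous sentences (in order) whose similarity to sentence
    `idx` exceeds `threshold`, scanning back at most `max_ctx` sentences.
    Adaptive: each sentence may receive a different amount of context."""
    current = encode(document[idx])
    context = []
    for j in range(idx - 1, max(idx - 1 - max_ctx, -1), -1):
        if cosine(current, encode(document[j])) >= threshold:
            context.append(document[j])
    return list(reversed(context))

doc = [
    "The cat sat on the mat.",
    "It was a sunny day.",
    "The cat then chased a mouse.",
]
print(select_context(doc, 2))  # -> ['The cat sat on the mat.']
print(select_context(doc, 1))  # -> []  (no useful context found)
```

Note that the two sentences receive different amounts of context, which is the core behavior the paper's predictor learns end-to-end rather than computing with a fixed heuristic as done here.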


Document-level Neural Machine Translation with Document Embeddings

Standard neural machine translation (NMT) is on the assumption of docume...

Dynamic Context Selection for Document-level Neural Machine Translation via Reinforcement Learning

Document-level neural machine translation has yielded attractive improve...

Capturing Longer Context for Document-level Neural Machine Translation: A Multi-resolutional Approach

Discourse context has been proven useful when translating documents. It ...

When and Why is Document-level Context Useful in Neural Machine Translation?

Document-level context has received lots of attention for compensating n...

Context in Neural Machine Translation: A Review of Models and Evaluations

This review paper discusses how context has been used in neural machine ...

Learning to Remember Translation History with a Continuous Cache

Existing neural machine translation (NMT) models generally translate sen...

SMDT: Selective Memory-Augmented Neural Document Translation

Existing document-level neural machine translation (NMT) models have suf...