When and Why is Document-level Context Useful in Neural Machine Translation?

10/01/2019
by Yunsu Kim, et al.

Document-level context has received much attention as a means of compensating for the limitations of neural machine translation (NMT) of isolated sentences. However, recent advances in document-level NMT focus on sophisticated integration of the context, explaining their improvements with only a few selected examples or targeted test sets. We extensively quantify the causes of a document-level model's improvements on general test sets, clarifying the limits of the usefulness of document-level context in NMT. We show that most of the improvements are not interpretable as utilizing the context. We also show that a minimal encoding is sufficient for context modeling and that very long context is not helpful for NMT.
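To make the "minimal encoding" idea concrete, here is an illustrative sketch (not the paper's actual implementation) of one of the simplest ways to integrate document-level context: mean-pool the embeddings of the context sentence into a single vector and prepend it to the source-sentence encoding, so a standard encoder-decoder can attend over it as one extra source position. The function name and toy dimensions are assumptions for illustration.

```python
import numpy as np

def encode_with_context(src_embeddings, ctx_embeddings):
    """Minimal context integration (illustrative sketch): mean-pool the
    context-sentence token embeddings into one vector and prepend it to
    the source encoding as an extra attendable position."""
    ctx_vector = ctx_embeddings.mean(axis=0, keepdims=True)  # shape (1, d)
    return np.concatenate([ctx_vector, src_embeddings], axis=0)

# toy example: 4 source tokens, 6 context tokens, embedding dimension 8
rng = np.random.default_rng(0)
src = rng.normal(size=(4, 8))
ctx = rng.normal(size=(6, 8))
extended = encode_with_context(src, ctx)
print(extended.shape)  # one extra position: (5, 8)
```

The point of such a sketch is that the context enters through a single pooled vector rather than a separate context encoder with its own attention layers, which is the kind of lightweight modeling the abstract argues can already be sufficient.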

