HanoiT: Enhancing Context-aware Translation via Selective Context

by   Jian Yang, et al.

Context-aware neural machine translation aims to use document-level context to improve translation quality. However, not all words in the context are helpful: irrelevant or trivial words may introduce noise and distract the model from learning the relationship between the current sentence and the auxiliary context. To mitigate this problem, we propose a novel end-to-end encoder-decoder model with a layer-wise selection mechanism to sift and refine the long document context. To verify the effectiveness of our method, extensive experiments and additional quantitative analyses are conducted on four document-level machine translation benchmarks. The experimental results demonstrate that, through the soft selection mechanism, our model significantly outperforms previous models on all datasets.
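The core idea of a soft selection mechanism can be illustrated with a small sketch: score each context token for relevance, squash the score through a sigmoid into a [0, 1] gate, and scale that token's representation accordingly, so irrelevant tokens are attenuated rather than hard-masked. This is an illustrative toy in numpy under assumed names (`soft_select_context`, a single learned scoring vector), not the authors' HanoiT implementation; in the paper the selection is applied layer-wise inside the encoder.

```python
import numpy as np

def soft_select_context(context_states, score_weights, score_bias=0.0):
    """Soft selection over context tokens (illustrative sketch).

    context_states: (seq_len, d_model) encoder states of the document context
    score_weights:  (d_model,) learned vector producing a relevance logit per token
    Returns the gated states and the per-token gates in [0, 1].
    """
    # Per-token relevance logits: (seq_len, d_model) @ (d_model,) -> (seq_len,)
    logits = context_states @ score_weights + score_bias
    # Sigmoid turns logits into soft gates; no token is discarded outright
    gates = 1.0 / (1.0 + np.exp(-logits))
    # Scale each token's representation by its gate before cross-attention
    return context_states * gates[:, None], gates

# Toy usage: 5 context tokens with 4-dimensional states
rng = np.random.default_rng(0)
states = rng.normal(size=(5, 4))
w = rng.normal(size=4)
selected, gates = soft_select_context(states, w)
```

Because the gate is differentiable, the scoring vector can be trained end-to-end with the translation loss, which is what makes the selection "soft" as opposed to pruning context tokens with a hard threshold.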

