Dynamic Context Selection for Document-level Neural Machine Translation via Reinforcement Learning

10/09/2020
by   Xiaomian Kang, et al.

Document-level neural machine translation has yielded attractive improvements. However, most existing methods simply use all context sentences within a fixed scope, neglecting the fact that different source sentences require different amounts of context. To address this problem, we propose an effective approach to select dynamic context so that the document-level translation model can exploit the more useful selected context sentences to produce better translations. Specifically, we introduce a selection module, independent of the translation module, that scores each candidate context sentence. We then propose two strategies to explicitly select a variable number of context sentences and feed them into the translation module. The two modules are trained end-to-end via reinforcement learning, and a novel reward is proposed to encourage the selection and utilization of dynamic context sentences. Experiments demonstrate that our approach selects adaptive context sentences for different source sentences and significantly improves the performance of document-level translation methods.
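To make the select-then-translate pipeline concrete, the PyTorch sketch below illustrates one plausible reading of the abstract, not the paper's actual implementation: a selector (the names ContextSelector, select_context, and reinforce_loss, the bilinear scorer, and the Bernoulli sampling are all illustrative assumptions) scores each candidate context sentence against the current source sentence, samples a variable-sized subset, and is updated with a REINFORCE-style policy gradient driven by a scalar reward.

```python
# Minimal sketch of dynamic context selection trained with a policy gradient.
# All module names and the scoring/sampling choices are assumptions for
# illustration; the reward is a stand-in for a translation-quality signal.
import torch
import torch.nn as nn


class ContextSelector(nn.Module):
    """Scores each candidate context sentence w.r.t. the current source sentence."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Bilinear(hidden_size, hidden_size, 1)

    def forward(self, src_repr: torch.Tensor, ctx_reprs: torch.Tensor) -> torch.Tensor:
        # src_repr: (hidden,), ctx_reprs: (num_candidates, hidden)
        src_expanded = src_repr.unsqueeze(0).expand_as(ctx_reprs)
        logits = self.scorer(src_expanded, ctx_reprs).squeeze(-1)
        return torch.sigmoid(logits)  # selection probability per candidate sentence


def select_context(probs: torch.Tensor):
    """Sample a variable number of context sentences (one Bernoulli draw each)."""
    dist = torch.distributions.Bernoulli(probs=probs)
    mask = dist.sample()                  # 0/1 decision per candidate sentence
    log_prob = dist.log_prob(mask).sum()  # log-probability of the chosen subset
    return mask.bool(), log_prob


def reinforce_loss(log_prob: torch.Tensor, reward: float, baseline: float) -> torch.Tensor:
    """Policy-gradient loss: reinforce subsets whose reward beats the baseline."""
    return -(reward - baseline) * log_prob


if __name__ == "__main__":
    hidden = 512
    selector = ContextSelector(hidden)
    src = torch.randn(hidden)            # encoding of the current source sentence
    candidates = torch.randn(6, hidden)  # encodings of 6 preceding sentences

    probs = selector(src, candidates)
    mask, log_prob = select_context(probs)

    # In the full system the selected sentences would be fed to the document-level
    # translation module, whose output quality would determine the reward.
    reward, baseline = 0.42, 0.30
    loss = reinforce_loss(log_prob, reward, baseline)
    loss.backward()
    print("selected context sentences:", mask.nonzero(as_tuple=True)[0].tolist())
```

In such a setup the translation module consumes only the selected sentences, so different source sentences naturally receive context of different sizes; how the reward and baseline are actually defined is specific to the paper and is omitted here.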

Related research

Context-Adaptive Document-Level Neural Machine Translation (04/16/2021)
Most existing document-level neural machine translation (NMT) models lev...

Capturing document context inside sentence-level neural machine translation models with self-training (03/11/2020)
Neural machine translation (NMT) has arguably achieved human level parit...

When and Why is Document-level Context Useful in Neural Machine Translation? (10/01/2019)
Document-level context has received lots of attention for compensating n...

Leveraging Discourse Rewards for Document-Level Neural Machine Translation (10/08/2020)
Document-level machine translation focuses on the translation of entire ...

Enhancing Context Modeling with a Query-Guided Capsule Network for Document-level Translation (09/02/2019)
Context modeling is essential to generate coherent and consistent transl...

Exploiting Curriculum Learning in Unsupervised Neural Machine Translation (09/23/2021)
Back-translation (BT) has become one of the de facto components in unsup...

Context-Aware Learning for Neural Machine Translation (03/12/2019)
Interest in larger-context neural machine translation, including documen...
