Modeling Past and Future for Neural Machine Translation

11/27/2017
by   Zaixiang Zheng, et al.

Existing neural machine translation systems do not explicitly model what has and has not been translated during decoding. To address this problem, we propose a novel mechanism that separates the source information into two parts: translated Past contents and untranslated Future contents, which are modeled by two additional recurrent layers. The Past and Future contents are fed to both the attention model and the decoder states, giving NMT systems knowledge of the translated and untranslated contents. Experimental results show that the proposed approach significantly improves translation performance on Chinese-English, German-English and English-German translation tasks. Specifically, the proposed model outperforms the conventional coverage model in both translation quality and alignment error rate.
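The mechanism described above can be sketched as a single decoding step in which two extra recurrent layers track the translated (Past) and untranslated (Future) source content, and the decoder conditions on both. This is a minimal illustrative sketch in PyTorch: the class name, GRU-based updates, and layer sizes are assumptions for exposition, not the paper's exact equations.

```python
import torch
import torch.nn as nn

class PastFutureDecoderStep(nn.Module):
    """Illustrative sketch: two additional RNN layers model Past
    (translated) and Future (untranslated) source content, and both
    are fed into the decoder state update (hypothetical structure)."""

    def __init__(self, hidden_size):
        super().__init__()
        # Past layer accumulates the source content consumed so far.
        self.past_rnn = nn.GRUCell(hidden_size, hidden_size)
        # Future layer updates the remaining, untranslated content.
        self.future_rnn = nn.GRUCell(hidden_size, hidden_size)
        # Decoder state update conditions on context, Past and Future.
        self.decoder_rnn = nn.GRUCell(3 * hidden_size, hidden_size)

    def forward(self, context, past, future, dec_state):
        # context: attention-weighted source vector at this step.
        new_past = self.past_rnn(context, past)        # add newly translated content
        new_future = self.future_rnn(context, future)  # drop it from the remainder
        dec_in = torch.cat([context, new_past, new_future], dim=-1)
        new_state = self.decoder_rnn(dec_in, dec_state)
        return new_past, new_future, new_state

# Usage: one decoding step with batch size 2 and hidden size 8.
step = PastFutureDecoderStep(8)
ctx = torch.randn(2, 8)
past = torch.zeros(2, 8)    # Past starts empty
future = torch.randn(2, 8)  # Future starts as a summary of the source
state = torch.zeros(2, 8)
past, future, state = step(ctx, past, future, state)
```

At each step the attention context is folded into the Past state and out of the Future state, so the attention model and decoder can distinguish covered from uncovered source content, which is what the coverage comparison in the abstract targets.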


