Modeling Coverage for Neural Machine Translation

01/19/2016
by Zhaopeng Tu et al.

The attention mechanism has advanced the state of the art in Neural Machine Translation (NMT) by jointly learning to align and translate. However, it tends to ignore past alignment information, which often leads to over-translation and under-translation. To address this problem, we propose coverage-based NMT. We maintain a coverage vector to keep track of the attention history; this vector is fed to the attention model to help adjust future attention, allowing the NMT system to pay more attention to untranslated source words. Experiments show that the proposed approach significantly improves both translation quality and alignment quality over standard attention-based NMT.
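To make the mechanism concrete, here is a minimal sketch of one decoding step of attention augmented with a linguistic coverage vector. It is an illustration under assumptions, not the paper's exact model: the parameter names (W_s, W_h, W_c, v) and the simple additive coverage update are hypothetical, and the paper additionally explores learned, neural coverage updates.

```python
import numpy as np

def coverage_attention(decoder_state, encoder_states, coverage,
                       W_s, W_h, W_c, v):
    """One decoding step of attention with a coverage vector (sketch).

    decoder_state:  (d,)    previous decoder hidden state s_{i-1}
    encoder_states: (T, d)  source annotations h_1 .. h_T
    coverage:       (T,)    attention accumulated so far on each source word
    W_s, W_h: (d, d); W_c: (d,); v: (d,)  hypothetical learned parameters
    """
    # Energy e_{i,j} = v^T tanh(W_s s_{i-1} + W_h h_j + C_{i-1,j} * W_c):
    # the coverage term lets the model discount already-attended words.
    energies = (encoder_states @ W_h.T
                + decoder_state @ W_s.T
                + np.outer(coverage, W_c))        # (T, d)
    scores = np.tanh(energies) @ v                # (T,)

    # Softmax over source positions gives the attention weights alpha_{i,j}.
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()

    # Linguistic coverage update: accumulate the attention paid to each
    # source word, so future steps can favor untranslated words.
    new_coverage = coverage + alpha

    context = alpha @ encoder_states              # context vector (d,)
    return context, alpha, new_coverage

# Toy usage with random parameters (coverage starts at zero).
d, T = 4, 6
rng = np.random.default_rng(0)
ctx, att, cov = coverage_attention(
    rng.standard_normal(d), rng.standard_normal((T, d)), np.zeros(T),
    rng.standard_normal((d, d)), rng.standard_normal((d, d)),
    rng.standard_normal(d), rng.standard_normal(d))
```

Because the coverage term shifts the attention energies based on how much attention each source position has already received, the softmax can move probability mass toward untranslated words, which is the intuition behind reducing over- and under-translation.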


Related research:

05/10/2016: Coverage Embedding Models for Neural Machine Translation
In this paper, we enhance the attention-based neural machine translation...

11/27/2017: Modeling Past and Future for Neural Machine Translation
Existing neural machine translation systems do not explicitly model what...

03/30/2018: Fine-Grained Attention Mechanism for Neural Machine Translation
Neural machine translation (NMT) has been a new paradigm in machine tran...

05/21/2018: Sparse and Constrained Attention for Neural Machine Translation
In NMT, words are sometimes dropped from the source or generated repeate...

06/25/2019: Saliency-driven Word Alignment Interpretation for Neural Machine Translation
Despite their original goal to jointly learn to align and translate, Neu...

09/11/2021: Modeling Concentrated Cross-Attention for Neural Machine Translation with Gaussian Mixture Model
Cross-attention is an important component of neural machine translation ...

07/06/2016: Guided Alignment Training for Topic-Aware Neural Machine Translation
In this paper, we propose an effective way for biasing the attention mec...
