Supervised Attentions for Neural Machine Translation

07/30/2016
by Haitao Mi, et al.

In this paper, we improve the attention (or alignment) accuracy of neural machine translation by utilizing the alignments of training sentence pairs. We simply compute the distance between the machine attention and the "true" alignments, and minimize this cost in the training procedure. Our experiments on a large-scale Chinese-to-English task show that our model significantly improves both translation and alignment quality over a large-vocabulary neural machine translation system, and even beats a state-of-the-art traditional syntax-based system.
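The abstract does not specify the exact distance metric used between the attention weights and the gold alignments, so the following is a minimal sketch of how such a joint objective might look, assuming a squared-distance penalty between the attention matrix and a row-normalized gold alignment matrix. The function name, the PyTorch framing, and the interpolation weight `lam` are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def supervised_attention_loss(log_probs, targets, attention,
                              gold_alignment, lam=1.0):
    """Joint NMT objective: translation loss plus attention supervision.

    log_probs:      (batch, tgt_len, vocab) decoder log-probabilities
    targets:        (batch, tgt_len) gold target token ids
    attention:      (batch, tgt_len, src_len) model attention weights
    gold_alignment: (batch, tgt_len, src_len) "true" alignment matrix,
                    e.g. derived from word alignments of the training
                    pairs, with each target row normalized to sum to 1
    lam:            weight of the alignment term (hypothetical setting)
    """
    # Standard translation loss: negative log-likelihood of the references.
    # nll_loss expects the class dimension second, hence the transpose.
    nll = F.nll_loss(log_probs.transpose(1, 2), targets)

    # Alignment loss: squared distance between the model's attention
    # distribution and the gold alignment, averaged over all positions.
    # (The abstract only says "distance"; squared distance is one
    # plausible choice.)
    align = ((attention - gold_alignment) ** 2).mean()

    # Minimize both terms jointly during training.
    return nll + lam * align
```

Under this formulation the alignment term acts as an auxiliary supervision signal: gradients from it push the attention weights toward the externally supplied alignments, while the translation term is trained as usual.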


