Interactive Attention for Neural Machine Translation

10/17/2016
by   Fandong Meng, et al.

Conventional attention-based Neural Machine Translation (NMT) performs dynamic alignment while generating the target sentence. By repeatedly reading the representation of the source sentence, which remains fixed after being produced by the encoder (Bahdanau et al., 2015), the attention mechanism has greatly advanced state-of-the-art NMT. In this paper, we propose a new attention mechanism, called INTERACTIVE ATTENTION, which models the interaction between the decoder and the representation of the source sentence during translation through both reading and writing operations. INTERACTIVE ATTENTION can keep track of the interaction history and thereby improve translation performance. Experiments on the NIST Chinese-English translation task show that INTERACTIVE ATTENTION achieves significant improvements over both the previous attention-based NMT baseline and state-of-the-art variants of attention-based NMT (i.e., the coverage models of Tu et al., 2016). Moreover, a neural machine translator with INTERACTIVE ATTENTION outperforms the open-source attention-based NMT system Groundhog by 4.22 BLEU points and the open-source phrase-based system Moses by 3.94 BLEU points on average across multiple test sets.
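The read-write loop described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the paper's exact formulation: the bilinear scoring function, the tanh write vector, and the attention-weighted additive update below are illustrative placeholders standing in for the model's actual read and write operations.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def interactive_attention_step(memory, query, W_score, W_write):
    """One decoding step of an interactive-attention sketch.

    memory  : (T, d) source annotations, re-read AND re-written each step
    query   : (d,)   current decoder state
    W_score : (d, d) scoring parameters (illustrative bilinear form)
    W_write : (d, d) write-vector parameters (illustrative)

    Returns (context, updated_memory); the updated memory is what the
    next decoding step reads, so the interaction history accumulates.
    """
    # Read: score each source position against the decoder state,
    # normalize to an attention distribution, and pool a context vector.
    scores = memory @ (W_score @ query)          # (T,)
    weights = softmax(scores)                    # attention distribution
    context = weights @ memory                   # (d,) read result

    # Write: nudge each source position toward a transform of the
    # decoder state, in proportion to its attention weight.
    update = np.tanh(W_write @ query)            # (d,) write vector
    updated_memory = memory + weights[:, None] * update

    return context, updated_memory
```

Calling this once per target word, feeding each step's `updated_memory` back in as the next step's `memory`, reproduces the key difference from conventional attention: the source representation evolves during decoding instead of staying fixed.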
