Learning When to Attend for Neural Machine Translation

05/31/2017
by Junhui Li, et al.

In the past few years, attention mechanisms have become an indispensable component of end-to-end neural machine translation models. However, previous attention models always refer to some source words when predicting a target word, which contradicts the fact that some target words have no corresponding source words. Motivated by this observation, we propose a novel attention model that can determine when the decoder should attend to source words and when it should not. Experimental results on NIST Chinese-English translation tasks show that the new model achieves an improvement of 0.8 BLEU over a state-of-the-art baseline.
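The page gives no implementation details beyond the abstract, so the following is only a minimal sketch of the general idea, assuming a PyTorch encoder-decoder with additive attention. The class name GatedAttention, the scalar sigmoid gate, and all layer sizes are illustrative assumptions, not the authors' actual architecture.

```python
# Minimal sketch (assumptions, not the authors' code): a scalar gate g in [0, 1]
# scales the attended source context, letting the decoder learn to "switch off"
# attention for target words that have no corresponding source words.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedAttention(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.score = nn.Linear(hidden_size * 2, 1)  # additive-style attention scorer
        self.gate = nn.Linear(hidden_size, 1)       # hypothetical "when to attend" gate

    def forward(self, dec_state: torch.Tensor, enc_states: torch.Tensor) -> torch.Tensor:
        # dec_state: (batch, hidden); enc_states: (batch, src_len, hidden)
        src_len = enc_states.size(1)
        expanded = dec_state.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.score(torch.cat([expanded, enc_states], dim=-1)).squeeze(-1)
        weights = F.softmax(scores, dim=-1)          # ordinary attention weights
        context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)
        g = torch.sigmoid(self.gate(dec_state))      # g near 0 means "do not attend"
        return g * context                           # gated source context

# Toy usage: batch of 2 decoder states over a 7-word source sentence.
att = GatedAttention(hidden_size=256)
ctx = att(torch.randn(2, 256), torch.randn(2, 7, 256))
print(ctx.shape)  # torch.Size([2, 256])
```

In this sketch, a gate saturating near zero means the decoder predicts the target word from its own state alone, which would correspond to the paper's case of target words with no source counterpart.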

Related research

08/30/2017 · Look-ahead Attention for Generation in Neural Machine Translation
The attention model has become a standard component in neural machine tr...

07/18/2016 · Neural Machine Translation with Recurrent Attention Modeling
Knowing which words have been attended to in previous time steps while g...

04/18/2022 · Dynamic Position Encoding for Transformers
Recurrent models have been dominating the field of neural machine transl...

08/22/2018 · Learning When to Concentrate or Divert Attention: Self-Adaptive Attention Temperature for Neural Machine Translation
Most of the Neural Machine Translation (NMT) models are based on the seq...

04/27/2017 · A GRU-Gated Attention Model for Neural Machine Translation
Neural machine translation (NMT) heavily relies on an attention network ...

05/21/2018 · Sparse and Constrained Attention for Neural Machine Translation
In NMT, words are sometimes dropped from the source or generated repeate...

09/11/2021 · Modeling Concentrated Cross-Attention for Neural Machine Translation with Gaussian Mixture Model
Cross-attention is an important component of neural machine translation ...
