An Empirical Study of Adequate Vision Span for Attention-Based Neural Machine Translation

12/19/2016
by Raphael Shu, et al.

Recently, the attention mechanism has played a key role in achieving high performance in Neural Machine Translation models. However, because it computes a score function over the encoder states at all positions at each decoding step, the attention model greatly increases the computational complexity. In this paper, we investigate the adequate vision span of attention models in the context of machine translation, by proposing a novel attention framework that is capable of reducing redundant score computation dynamically. The term "vision span" means a window of the encoder states considered by the attention model in one step. In our experiments, we found that the average window size of the vision span can be reduced by over 50% on English-Japanese and German-English translation tasks, indicating that the conventional attention mechanism performs a significant amount of redundant computation.
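To make the idea concrete, the sketch below contrasts conventional attention, which scores every encoder position at each decoding step, with a windowed variant that only scores positions inside a vision span. This is a minimal NumPy sketch assuming a dot-product score function and a fixed window center; in the paper's framework the window is adapted dynamically per step, and all names here are illustrative rather than the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def full_attention(decoder_state, encoder_states):
    # Conventional attention: score all n encoder positions (O(n) per step).
    scores = encoder_states @ decoder_state      # shape (n,)
    weights = softmax(scores)
    return weights @ encoder_states              # context vector, shape (d,)

def vision_span_attention(decoder_state, encoder_states, center, span):
    # Restrict scoring to a window (the "vision span") around a center position.
    # The center is fixed here; the paper predicts the window dynamically.
    lo = max(0, center - span)
    hi = min(len(encoder_states), center + span + 1)
    window = encoder_states[lo:hi]
    weights = softmax(window @ decoder_state)
    return weights @ window

# Toy usage: 20 encoder states of dimension 8.
rng = np.random.default_rng(0)
H = rng.standard_normal((20, 8))
s = rng.standard_normal(8)
c_full = full_attention(s, H)                               # scores 20 positions
c_span = vision_span_attention(s, H, center=10, span=3)     # scores only 7
```

With a span of 3 around each predicted center, only 7 of 20 positions are scored per step, which is the source of the reported savings in score computation.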


research
07/04/2017

An empirical study on the effectiveness of images in Multimodal Neural Machine Translation

In state-of-the-art Neural Machine Translation (NMT), an attention mecha...
research
08/22/2018

Learning When to Concentrate or Divert Attention: Self-Adaptive Attention Temperature for Neural Machine Translation

Most of the Neural Machine Translation (NMT) models are based on the seq...
research
11/25/2016

Neural Machine Translation with Latent Semantic of Image and Text

Although attention-based Neural Machine Translation have achieved great ...
research
07/29/2016

Recurrent Neural Machine Translation

The vanilla attention-based neural machine translation has achieved prom...
research
02/06/2018

Decoding-History-Based Adaptive Control of Attention for Neural Machine Translation

Attention-based sequence-to-sequence model has proved successful in Neur...
research
08/20/2017

Neural Machine Translation with Extended Context

We investigate the use of extended context in attention-based neural mac...
research
11/10/2019

Modelling Bahdanau Attention using Election methods aided by Q-Learning

Neural Machine Translation has lately gained a lot of "attention" with t...
