Temporal Attention Model for Neural Machine Translation

08/09/2016
by Baskaran Sankaran, et al.

Attention-based Neural Machine Translation (NMT) models suffer from attention deficiency issues, as observed in recent research. We propose a novel mechanism that addresses some of these limitations and improves NMT attention. Specifically, our approach memorizes the alignments temporally (within each sentence) and modulates the attention with the accumulated temporal memory as the decoder generates the candidate translation. We compare our approach against the baseline NMT model and two other related approaches that address this issue either explicitly or implicitly. Large-scale experiments on two language pairs show that our approach achieves consistent and robust gains over the baseline and the related NMT approaches. In some settings, our model further outperforms strong SMT baselines even without using ensembles.
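One plausible reading of the mechanism described above is that, at each decoder step, the raw alignment scores are divided by the accumulated (exponentiated) scores from previous steps before normalization, so that source positions already attended to are down-weighted. The sketch below illustrates this interpretation in NumPy; the function name and the exact modulation rule are assumptions for illustration, not the paper's verified implementation.

```python
import numpy as np

def temporal_attention(scores_history, scores_t):
    """Sketch of temporally modulated attention (assumed formulation).

    scores_history: list of arrays, each (src_len,), the raw alignment
        scores from previous decoder steps (the "temporal memory").
    scores_t: array (src_len,), raw alignment scores at the current step.
    Returns normalized attention weights (src_len,).
    """
    exp_t = np.exp(scores_t)
    if len(scores_history) == 0:
        # First step: no memory yet, reduces to ordinary softmax.
        modulated = exp_t
    else:
        # Accumulate exponentiated past scores per source position and
        # divide, penalizing positions that were attended to before.
        accum = np.exp(np.asarray(scores_history)).sum(axis=0)
        modulated = exp_t / accum
    return modulated / modulated.sum()
```

With this rule, a source word that dominated attention at an earlier step receives a smaller weight at the current step than a plain softmax would give it, which is one way to mitigate over-translation.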


Related research:

- Memory-augmented Chinese-Uyghur Neural Machine Translation (06/27/2017)
- Search Engine Guided Non-Parametric Neural Machine Translation (05/20/2017)
- Neural Machine Translation with Adequacy-Oriented Learning (11/21/2018)
- Stronger Baselines for Trustable Results in Neural Machine Translation (06/29/2017)
- Learning Confidence for Transformer-based Neural Machine Translation (03/22/2022)
- Retrosynthesis with Attention-Based NMT Model and Chemical Analysis of the "Wrong" Predictions (08/02/2019)
- Interrogating the Explanatory Power of Attention in Neural Machine Translation (09/30/2019)
