Global-Context Neural Machine Translation through Target-Side Attentive Residual Connections

09/14/2017
by Lesly Miculicich Werlen, et al.

Neural sequence-to-sequence models achieve remarkable performance not only in machine translation (MT) but also in other language processing tasks. One reason for their effectiveness is the ability of the decoder to capture contextual information through its recurrent layer. However, this sequential modeling over-emphasizes the local context, i.e., the previously translated word in the case of MT. As a result, the model ignores important information from the global context of the translation. In this paper, we address this limitation by introducing attentive residual connections from the previously translated words to the output of the decoder, which enable the learning of longer-range dependencies between words. The proposed model can emphasize any of the previously translated words, as opposed to only the last one, gaining access to the global context of the translated text. The model outperforms strong neural MT baselines on three language pairs, as well as a neural language modeling baseline. An analysis of the attention learned by the decoder confirms that it emphasizes a wide context and reveals a resemblance to syntax-like structures.
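The abstract describes the mechanism but not its implementation. Below is a minimal PyTorch sketch of the general idea: at each decoding step, the model attends over the embeddings of previously generated target words and adds the attended context to the decoder output as a residual connection. The class name, the single-head scaled dot-product scoring, and the linear projections are illustrative assumptions, not the paper's exact formulation, which may use a different attention variant or gating.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetSideAttentiveResidual(nn.Module):
    """Hypothetical sketch: attention over previously translated words,
    added as a residual connection to the decoder output."""

    def __init__(self, d_model):
        super().__init__()
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)

    def forward(self, dec_state, prev_embeds):
        # dec_state:   (batch, d_model)     decoder output at step t
        # prev_embeds: (batch, t, d_model)  embeddings of words y_1..y_t
        q = self.query(dec_state).unsqueeze(1)            # (batch, 1, d)
        k = self.key(prev_embeds)                         # (batch, t, d)
        scores = torch.bmm(q, k.transpose(1, 2))          # (batch, 1, t)
        alpha = F.softmax(scores / k.size(-1) ** 0.5, dim=-1)
        context = torch.bmm(alpha, prev_embeds).squeeze(1)  # (batch, d)
        # Residual connection: the attended global context is added to
        # the decoder state, so any previous word (not just the last
        # one) can influence the next prediction.
        return dec_state + context, alpha.squeeze(1)

# Example: one decoding step with 2 sentences, 5 previous words, d=512
attn = TargetSideAttentiveResidual(512)
dec_state = torch.randn(2, 512)
prev_embeds = torch.randn(2, 5, 512)
out, weights = attn(dec_state, prev_embeds)  # out: (2, 512), weights: (2, 5)
```

The returned attention weights are what the paper's analysis inspects: they show how widely the decoder spreads its attention over the translated prefix at each step.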
