
Effective Approaches to Attention-based Neural Machine Translation

08/17/2015
by Minh-Thang Luong, et al.
Stanford University

An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words and a local one that only looks at a subset of source words at a time. We demonstrate the effectiveness of both approaches over the WMT translation tasks between English and German in both directions. With local attention, we achieve a significant gain of 5.0 BLEU points over non-attentional systems which already incorporate known techniques such as dropout. Our ensemble model using different attention architectures has established a new state-of-the-art result in the WMT'15 English to German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.
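The two attention classes differ only in which encoder states feed the context vector: global attention normalizes a score over every source position, while local attention first predicts a center position p_t and restricts (and Gaussian-weights) the alignment to a window [p_t - D, p_t + D]. Below is a minimal PyTorch sketch of both, using the paper's "dot" score variant; the function names, tensor shapes, and default window size are our own illustration under those assumptions, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def global_attention(h_t, h_s):
    """Global attention: score ALL source states with the "dot" score.

    h_t: (batch, dim)          current decoder hidden state
    h_s: (batch, src_len, dim) all encoder hidden states
    Returns context vector c_t (batch, dim) and weights a_t (batch, src_len).
    """
    scores = torch.bmm(h_s, h_t.unsqueeze(2)).squeeze(2)  # (batch, src_len)
    a_t = F.softmax(scores, dim=1)                        # align over every source word
    c_t = torch.bmm(a_t.unsqueeze(1), h_s).squeeze(1)     # weighted sum of encoder states
    return c_t, a_t

def local_attention(h_t, h_s, W_p, v_p, D=5):
    """Local-p attention: attend only to the window [p_t - D, p_t + D].

    W_p (dim, dim) and v_p (dim,) parameterize the predicted center
    p_t = S * sigmoid(v_p^T tanh(W_p h_t)); weights near p_t are favored
    by a Gaussian with sigma = D / 2, as in the paper.
    """
    batch, src_len, dim = h_s.shape
    p_t = src_len * torch.sigmoid(torch.tanh(h_t @ W_p) @ v_p)     # (batch,)
    scores = torch.bmm(h_s, h_t.unsqueeze(2)).squeeze(2)           # dot score, as above
    # Mask positions outside the window, then normalize within it.
    pos = torch.arange(src_len, dtype=torch.float32).unsqueeze(0)  # (1, src_len)
    in_window = (pos - p_t.unsqueeze(1)).abs() <= D                # (batch, src_len)
    scores = scores.masked_fill(~in_window, float('-inf'))
    a_t = F.softmax(scores, dim=1)
    # Scale by exp(-(s - p_t)^2 / (2 * sigma^2)) to favor alignments near p_t.
    gauss = torch.exp(-((pos - p_t.unsqueeze(1)) ** 2) / (2 * (D / 2) ** 2))
    a_t = a_t * gauss
    c_t = torch.bmm(a_t.unsqueeze(1), h_s).squeeze(1)
    return c_t, a_t

# Toy shapes: batch of 2, source length 12, hidden size 8.
h_t = torch.randn(2, 8)
h_s = torch.randn(2, 12, 8)
W_p, v_p = torch.randn(8, 8), torch.randn(8)
c_global, _ = global_attention(h_t, h_s)
c_local, _ = local_attention(h_t, h_s, W_p, v_p, D=3)
```

In the full model, the context vector is then combined with the decoder state as h̃_t = tanh(W_c[c_t; h_t]) before the output softmax; that step is the same for both variants and is omitted here.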

Related Research
10/17/2016

Interactive Attention for Neural Machine Translation

Conventional attention-based Neural Machine Translation (NMT) conducts d...
08/03/2019

Invariance-based Adversarial Attack on Neural Machine Translation Systems

Recently, NLP models have been shown to be susceptible to adversarial at...
06/14/2016

Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation

Neural machine translation (NMT) aims at solving machine translation (MT...
09/30/2019

Interrogating the Explanatory Power of Attention in Neural Machine Translation

Attention models have become a crucial component in neural machine trans...
01/05/2016

Multi-Source Neural Translation

We build a multi-source machine translation model and train it to maximi...
04/28/2015

Lexical Translation Model Using a Deep Neural Network Architecture

In this paper we combine the advantages of a model using global source s...
09/16/2020

Graph-to-Sequence Neural Machine Translation

Neural machine translation (NMT) usually works in a seq2seq learning way...

Code Repositories

ABSA
Aspect Based Sentiment Analysis (ABSA)

seq2seqModel
PyTorch implementation of Sequence-to-Sequence Learning with Attentional Neural Networks

pytorch-seq2seq_with_attention
Implementation of the attention mechanisms from the paper