Modeling Recurrence for Transformer

04/05/2019
by Jie Hao, et al.

Recently, the Transformer model, which is based solely on attention mechanisms, has advanced the state of the art on various machine translation tasks. However, recent studies reveal that the lack of recurrence hinders further improvements in its translation capacity. In response to this problem, we propose to directly model recurrence for the Transformer with an additional recurrence encoder. In addition to the standard recurrent neural network, we introduce a novel attentive recurrent network to leverage the strengths of both attention and recurrent networks. Experimental results on the widely used WMT14 English-German and WMT17 Chinese-English translation tasks demonstrate the effectiveness of the proposed approach. Our studies also reveal that the proposed model benefits from a short-cut that bridges the source and target sequences with a single recurrent layer, which outperforms its deep counterpart.
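To make the idea of combining attention with recurrence concrete, the sketch below shows one plausible (illustrative, not the authors' exact architecture) "attentive recurrent network" block: at each step it attends over the source representations and feeds the attended context into a GRU cell, so the output carries both attention-based and recurrent information. The class name, head count, GRU choice, and tensor shapes are assumptions for the example.

```python
# Minimal sketch of an attention-plus-recurrence block, assuming PyTorch.
# Each query step attends over the source states, and the attended context
# drives a GRU cell, so the output mixes attention with recurrence.
import torch
import torch.nn as nn


class AttentiveRecurrentNetwork(nn.Module):
    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cell = nn.GRUCell(d_model, d_model)

    def forward(self, queries: torch.Tensor, source: torch.Tensor) -> torch.Tensor:
        # queries: (batch, tgt_len, d_model); source: (batch, src_len, d_model)
        batch, tgt_len, d_model = queries.shape
        state = queries.new_zeros(batch, d_model)
        outputs = []
        for t in range(tgt_len):
            q = queries[:, t : t + 1, :]                 # current query step
            context, _ = self.attn(q, source, source)    # attend over source states
            state = self.cell(context.squeeze(1), state) # recurrent state update
            outputs.append(state)
        return torch.stack(outputs, dim=1)               # (batch, tgt_len, d_model)


if __name__ == "__main__":
    arn = AttentiveRecurrentNetwork(d_model=512)
    src = torch.randn(2, 10, 512)   # toy source-side representations
    tgt = torch.randn(2, 7, 512)    # toy target-side queries
    print(arn(tgt, src).shape)      # torch.Size([2, 7, 512])
```

A single such layer bridging source and target would correspond to the short-cut configuration described in the abstract; stacking several of them would give its deep counterpart.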

