DTMT: A Novel Deep Transition Architecture for Neural Machine Translation

12/19/2018
by Fandong Meng, et al.

Recent years have witnessed rapid developments in Neural Machine Translation (NMT). Most recently, with advanced modeling and training techniques, RNN-based NMT (RNMT) has shown strong potential, even compared with the well-known Transformer (self-attentional) model. Although RNMT models can be made very deep by stacking layers, the transition depth between consecutive hidden states along the sequential axis remains shallow. In this paper, we further enhance RNN-based NMT by increasing the transition depth between consecutive hidden states, yielding a novel deep transition RNN-based architecture for Neural Machine Translation, named DTMT. This model enhances the hidden-to-hidden transition with multiple non-linear transformations, while maintaining a linear transformation path throughout the deep transition via a well-designed linear transformation mechanism that alleviates the vanishing gradient problem. Experiments show that with the specially designed deep transition modules, DTMT achieves remarkable improvements in translation quality. On the Chinese->English translation task, DTMT outperforms the Transformer by +2.09 BLEU points and achieves the best result ever reported on the same dataset. On the WMT14 English->German and English->French translation tasks, DTMT shows superior quality to state-of-the-art NMT systems, including the Transformer and RNMT+.
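To make the abstract's idea concrete, below is a minimal NumPy sketch of one deep-transition time step: a linear-transformation-enhanced GRU (L-GRU) reads the token input, and a stack of input-free transition GRUs (T-GRUs) deepens the hidden-to-hidden transition before the state is passed to the next time step. The exact gating, shapes, initialization, and transition depth here are illustrative assumptions for exposition, not the authors' released implementation.

```python
# Illustrative sketch of a deep-transition RNN step (not the paper's code).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glorot(rng, *shape):
    # Simple Glorot-uniform initializer for the toy parameters below.
    lim = np.sqrt(6.0 / sum(shape))
    return rng.uniform(-lim, lim, shape)

class LGRU:
    """First transition block: a GRU whose candidate state adds a gated
    *linear* path from the input, so gradients can bypass the tanh."""
    def __init__(self, d_in, d_h, rng):
        self.Wz, self.Uz = glorot(rng, d_in, d_h), glorot(rng, d_h, d_h)
        self.Wr, self.Ur = glorot(rng, d_in, d_h), glorot(rng, d_h, d_h)
        self.Wh, self.Uh = glorot(rng, d_in, d_h), glorot(rng, d_h, d_h)
        self.Wl, self.Ul = glorot(rng, d_in, d_h), glorot(rng, d_h, d_h)
        self.Hx = glorot(rng, d_in, d_h)  # linear transformation of x

    def step(self, x, h):
        z = sigmoid(x @ self.Wz + h @ self.Uz)  # update gate
        r = sigmoid(x @ self.Wr + h @ self.Ur)  # reset gate
        l = sigmoid(x @ self.Wl + h @ self.Ul)  # linear-path gate
        # Candidate = non-linear part + gated linear path from the input.
        h_tilde = np.tanh(x @ self.Wh + r * (h @ self.Uh)) + l * (x @ self.Hx)
        return (1.0 - z) * h + z * h_tilde

class TGRU:
    """Higher transition blocks: GRUs with no external input; they only
    deepen the transition between consecutive hidden states."""
    def __init__(self, d_h, rng):
        self.Uz = glorot(rng, d_h, d_h)
        self.Ur = glorot(rng, d_h, d_h)
        self.Uh = glorot(rng, d_h, d_h)

    def step(self, h):
        z = sigmoid(h @ self.Uz)
        r = sigmoid(h @ self.Ur)
        h_tilde = np.tanh(r * (h @ self.Uh))
        return (1.0 - z) * h + z * h_tilde

def deep_transition_step(x, h, lgru, tgrus):
    """One time step: L-GRU consumes the token embedding, then each T-GRU
    applies one more non-linear refinement of the hidden state."""
    h = lgru.step(x, h)
    for tgru in tgrus:
        h = tgru.step(h)
    return h

# Toy usage: transition depth 4 (1 L-GRU + 3 T-GRUs) over a 5-token sequence.
rng = np.random.default_rng(0)
d_in, d_h = 8, 16
lgru = LGRU(d_in, d_h, rng)
tgrus = [TGRU(d_h, rng) for _ in range(3)]
h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):
    h = deep_transition_step(x, h, lgru, tgrus)
print(h.shape)  # (16,)
```

The gated linear term `l * (x @ Hx)` in the L-GRU candidate is what keeps a linear transformation path through the deep transition, so the signal does not have to pass through a saturating non-linearity at every sub-step.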


Related Research

08/18/2020 · Very Deep Transformers for Neural Machine Translation
We explore the application of very deep Transformer models for Neural Ma...

07/24/2017 · Deep Architectures for Neural Machine Translation
It has been shown that increasing model depth improves the quality of ne...

10/08/2020 · Shallow-to-Deep Training for Neural Machine Translation
Deep encoders have been proven to be effective in improving neural machi...

04/26/2018 · The Best of Both Worlds: Combining Recent Advances in Neural Machine Translation
The past year has witnessed rapid advances in sequence-to-sequence (seq2...

06/14/2016 · Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation
Neural machine translation (NMT) aims at solving machine translation (MT...

08/22/2018 · Training Deeper Neural Machine Translation Models with Transparent Attention
While current state-of-the-art NMT models, such as RNN seq2seq and Trans...

05/02/2017 · Deep Neural Machine Translation with Linear Associative Unit
Deep Neural Networks (DNNs) have provably enhanced the state-of-the-art ...
