Towards Two-Dimensional Sequence to Sequence Model in Neural Machine Translation

10/09/2018
by Parnia Bahar, et al.

This work investigates an alternative model for neural machine translation (NMT) and proposes a novel architecture that employs a multi-dimensional long short-term memory (MDLSTM) for translation modeling. In state-of-the-art methods, source and target sentences are treated as one-dimensional sequences over time, whereas we view translation as a two-dimensional (2D) mapping and use an MDLSTM layer to define the correspondence between source and target words. We extend the current sequence-to-sequence backbone of NMT models to a 2D structure in which the source and target sentences are aligned with each other in a 2D grid. Our proposed topology shows consistent improvements over the attention-based sequence-to-sequence model on two WMT 2017 tasks, German↔English.
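The following is a minimal, illustrative PyTorch sketch (not the authors' code) of the core idea: a 2D-LSTM layer that scans a target-by-source grid, where the state at cell (i, j) depends on the states of its upper and left neighbours and on the concatenation of the i-th target and j-th source embeddings. The class name MDLSTM2D, the single-layer gate wiring (one forget gate per axis), and all dimensions are assumptions made for illustration, not the paper's exact configuration.

```python
import torch
import torch.nn as nn


class MDLSTM2D(nn.Module):
    """Toy two-dimensional LSTM over a (target x source) grid (illustrative only)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # One affine map produces the input gate, output gate, one forget gate
        # per axis (vertical and horizontal), and the cell proposal.
        self.gates = nn.Linear(input_size + 2 * hidden_size, 5 * hidden_size)

    def forward(self, src_emb: torch.Tensor, tgt_emb: torch.Tensor) -> torch.Tensor:
        # src_emb: (src_len, E) source embeddings, tgt_emb: (tgt_len, E) target embeddings.
        # Returns the (tgt_len, src_len, hidden_size) grid of hidden states.
        tgt_len, src_len, H = tgt_emb.size(0), src_emb.size(0), self.hidden_size
        # Grids padded with zero states along the top row and left column.
        h = [[src_emb.new_zeros(H) for _ in range(src_len + 1)] for _ in range(tgt_len + 1)]
        c = [[src_emb.new_zeros(H) for _ in range(src_len + 1)] for _ in range(tgt_len + 1)]
        for i in range(1, tgt_len + 1):        # scan target positions (rows)
            for j in range(1, src_len + 1):    # scan source positions (columns)
                x = torch.cat([src_emb[j - 1], tgt_emb[i - 1]], dim=-1)
                prev = torch.cat([h[i - 1][j], h[i][j - 1]], dim=-1)
                g = self.gates(torch.cat([x, prev], dim=-1))
                in_g, out_g, f_up, f_left, cell = g.chunk(5, dim=-1)
                c_new = (torch.sigmoid(f_up) * c[i - 1][j]
                         + torch.sigmoid(f_left) * c[i][j - 1]
                         + torch.sigmoid(in_g) * torch.tanh(cell))
                h[i][j] = torch.sigmoid(out_g) * torch.tanh(c_new)
                c[i][j] = c_new
        return torch.stack([torch.stack(row[1:]) for row in h[1:]])


# Example: a 7-word source and 5-word target sentence with 32-dim embeddings.
layer = MDLSTM2D(input_size=2 * 32, hidden_size=64)
grid = layer(torch.randn(7, 32), torch.randn(5, 32))
print(grid.shape)  # torch.Size([5, 7, 64])
```

In a full translation model, the state of each row after the last source position could feed a softmax over the target vocabulary to predict the next target word; the sketch above only returns the grid of hidden states.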
