Is Encoder-Decoder Redundant for Neural Machine Translation?

10/21/2022
by Yingbo Gao, et al.

The encoder-decoder architecture is widely adopted for sequence-to-sequence modeling tasks. For machine translation, despite the evolution from long short-term memory networks to Transformer networks and the introduction and development of the attention mechanism, the encoder-decoder architecture remains the de facto choice for state-of-the-art models. While the motivation for decoding information from some hidden space is straightforward, the strict separation of the encoding and decoding steps into a dedicated encoder and decoder is not strictly necessary. Compared to autoregressive language modeling in the target language, machine translation simply has an additional source sentence as context. Given that neural language models can already handle rather long contexts in the target language, it is natural to ask whether simply concatenating the source and target sentences and training a language model on the result would work for translation. In this work, we investigate this concept for machine translation. Specifically, we experiment with bilingual translation, translation with additional target-side monolingual data, and multilingual translation. In all cases, this alternative approach performs on par with the baseline encoder-decoder Transformer, suggesting that an encoder-decoder architecture may be redundant for neural machine translation.
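As a concrete illustration of the idea, here is a minimal sketch (not the authors' code) of translation-as-language-modeling: each training example concatenates the source and target sentences around a separator token, and a single decoder-only Transformer is trained with the usual next-token objective. The vocabulary size, the special tokens SEP and PAD, and the choice to compute the loss only over target positions are illustrative assumptions; positional encodings and batching are omitted for brevity.

```python
import torch
import torch.nn as nn

VOCAB, PAD, SEP = 1000, 0, 1  # hypothetical joint vocabulary size and special token ids

class CausalLM(nn.Module):
    """A decoder-only (causal) Transformer LM built from stock PyTorch parts."""
    def __init__(self, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.layers = nn.TransformerEncoder(layer, n_layers)
        self.proj = nn.Linear(d_model, VOCAB)

    def forward(self, ids):  # ids: (batch, time)
        t = ids.size(1)
        # Additive causal mask: position i may only attend to positions <= i,
        # which turns the encoder stack into a decoder-only LM.
        causal = torch.triu(torch.full((t, t), float("-inf")), diagonal=1)
        return self.proj(self.layers(self.embed(ids), mask=causal))

def make_example(src, tgt):
    """Concatenate source and target into one sequence: [src ; SEP ; tgt]."""
    ids = torch.tensor(src + [SEP] + tgt)
    inp, out = ids[:-1], ids[1:].clone()
    # Score only target-side predictions, so the source acts purely as
    # context (whether to also model the source is a design choice).
    out[: len(src)] = PAD
    return inp, out

model = CausalLM()
inp, out = make_example(src=[5, 6, 7], tgt=[8, 9, 10, 11])
logits = model(inp.unsqueeze(0))  # (1, time, VOCAB)
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), out.reshape(-1), ignore_index=PAD
)
loss.backward()
```

At inference time, translation reduces to ordinary language-model decoding: feed the source sentence followed by SEP and greedily or beam-search the continuation as the target sentence.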
