A Comparison of Transformer and Recurrent Neural Networks on Multilingual Neural Machine Translation

06/18/2018
by Surafel M. Lakew, et al.
Recently, neural machine translation (NMT) has been extended to the multilingual setting, i.e., handling more than one translation direction with a single system. Multilingual NMT has shown competitive performance against pure bilingual systems. Notably, in low-resource settings it has proved to work effectively and efficiently, thanks to the shared representation space enforced across languages, which induces a form of transfer learning. Furthermore, multilingual NMT enables so-called zero-shot inference across language pairs never seen at training time. Despite the increasing interest in this framework, an in-depth analysis of what a multilingual NMT model is and is not capable of is still missing. Motivated by this, our work (i) provides a quantitative and comparative analysis of the translations produced by bilingual, multilingual, and zero-shot systems; (ii) investigates the translation quality of the two currently dominant neural architectures in MT, the recurrent and the Transformer architectures; and (iii) quantitatively explores how the closeness between languages influences zero-shot translation. Our analysis leverages multiple professional post-edits of automatic translations produced by several different systems, and focuses both on standard automatic metrics (BLEU and TER) and on widely used error categories: lexical, morphological, and word-order errors.
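The evaluation described above relies on BLEU and TER. As a minimal, self-contained sketch of what corpus-level BLEU computes — clipped n-gram precisions combined with a brevity penalty — the following assumes whitespace-tokenized text and a single reference per sentence; actual evaluations typically use a standard toolkit such as sacrebleu:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # Multiset of n-grams in a token sequence.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU (0-100) with uniform n-gram weights, a single
    reference per hypothesis, and the standard brevity penalty."""
    matches = [0] * max_n   # clipped n-gram matches, per order
    totals = [0] * max_n    # hypothesis n-gram counts, per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            # Counter intersection clips each n-gram at its reference count.
            matches[n - 1] += sum((ngrams(h, n) & ngrams(r, n)).values())
            totals[n - 1] += max(len(h) - n + 1, 0)
    precisions = [m / t if t else 0.0 for m, t in zip(matches, totals)]
    if min(precisions) == 0:
        return 0.0  # any zero precision zeroes the geometric mean
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n) * 100
```

TER, the paper's second metric, is the minimum number of edits (including block shifts) needed to turn the hypothesis into the reference, normalized by reference length; it is not reproduced here since the shift search makes it considerably more involved.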
