Multilingual Machine Translation: Closing the Gap between Shared and Language-specific Encoder-Decoders

04/14/2020
by Carlos Escolano, et al.

State-of-the-art multilingual machine translation relies on a universal encoder-decoder, which requires retraining the entire system to add new languages. In this paper, we propose an alternative approach based on language-specific encoder-decoders, which can be more easily extended to new languages by learning only their corresponding modules. To encourage a common interlingua representation, we simultaneously train the N initial languages. Our experiments show that the proposed approach outperforms the universal encoder-decoder by 3.28 BLEU points on average, and that new languages can be added without retraining the remaining modules. All in all, our work closes the gap between shared and language-specific encoder-decoders, advancing toward modular multilingual machine translation systems that can be flexibly extended in lifelong learning settings.
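To make the modular setup concrete, the sketch below shows one way a system of language-specific encoders and decoders that share only a common hidden dimension could be wired up, and how a new language might be added by training only its own modules while the existing ones stay frozen. This is a minimal illustration in PyTorch, not the authors' implementation: the layer sizes, vocabulary sizes, language codes, and helper names (`make_encoder`, `translate_logits`) are assumptions for the example.

```python
# Minimal sketch (not the paper's released code) of a modular multilingual NMT
# system: one encoder and one decoder per language, coupled only through a
# shared hidden dimension that plays the role of the interlingua space.
import torch
import torch.nn as nn

HID = 512  # shared hidden ("interlingua") dimension assumed by every module

def make_encoder(vocab_size: int) -> nn.Module:
    # Language-specific encoder: own embeddings, own Transformer layers.
    layer = nn.TransformerEncoderLayer(d_model=HID, nhead=8, batch_first=True)
    return nn.Sequential(nn.Embedding(vocab_size, HID),
                         nn.TransformerEncoder(layer, num_layers=2))

class Decoder(nn.Module):
    # Language-specific decoder projecting back to its own vocabulary.
    def __init__(self, vocab_size: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, HID)
        layer = nn.TransformerDecoderLayer(d_model=HID, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.out = nn.Linear(HID, vocab_size)

    def forward(self, tgt_tokens, memory):
        return self.out(self.decoder(self.embed(tgt_tokens), memory))

# Joint training of the N initial languages pairs every encoder with every
# decoder, which pushes encoder outputs toward a shared representation.
# (Vocabulary sizes here are placeholders.)
encoders = nn.ModuleDict({lang: make_encoder(32000) for lang in ("en", "de", "fr")})
decoders = nn.ModuleDict({lang: Decoder(32000) for lang in ("en", "de", "fr")})

def translate_logits(src_lang, tgt_lang, src_tokens, tgt_tokens):
    memory = encoders[src_lang](src_tokens)        # source-language encoder
    return decoders[tgt_lang](tgt_tokens, memory)  # target-language decoder

# Adding a new language later: create its modules and train only those,
# freezing everything already learned.
encoders["ca"], decoders["ca"] = make_encoder(32000), Decoder(32000)
for lang in ("en", "de", "fr"):
    for p in list(encoders[lang].parameters()) + list(decoders[lang].parameters()):
        p.requires_grad_(False)

new_params = list(encoders["ca"].parameters()) + list(decoders["ca"].parameters())
optimizer = torch.optim.Adam(new_params, lr=1e-4)  # updates only the new modules
```

The key design point the sketch illustrates is that no parameters are shared across languages; the only contract between modules is the dimensionality of the encoder output, which is what lets a newly trained encoder or decoder plug into the existing system without retraining it.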

