From Bilingual to Multilingual Neural Machine Translation by Incremental Training

06/28/2019
by   Carlos Escolano, et al.

Current multilingual neural machine translation approaches rely on task-specific models, so adding a new language requires retraining the whole system. In this work, we propose a new training schedule, based on joint training and language-independent encoder/decoder modules, that allows the system to scale to more languages without modifying previously trained components and that enables zero-shot translation. This work in progress achieves results close to the state of the art on the WMT task.
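The incremental idea described in the abstract can be illustrated with a minimal sketch (all names here are hypothetical, not from the paper): each language gets its own encoder and decoder module, and when a new language is added, all previously trained modules are frozen so that only the new pair is trained. Because every encoder maps into a shared representation that every decoder can read, translation directions never seen in training become possible (zero-shot).

```python
# Minimal sketch (hypothetical names) of incremental multilingual training
# with language-specific encoder/decoder modules.

class Module:
    """Stand-in for a neural encoder or decoder module."""
    def __init__(self, name):
        self.name = name
        self.frozen = False  # newly added modules are trainable

def add_language(encoders, decoders, lang):
    """Freeze every existing module, then add a trainable pair for `lang`.
    Previously trained components are never modified again."""
    for m in list(encoders.values()) + list(decoders.values()):
        m.frozen = True
    encoders[lang] = Module(f"enc-{lang}")
    decoders[lang] = Module(f"dec-{lang}")

def translation_path(encoders, decoders, src, tgt):
    """Any encoder can be paired with any decoder through the shared
    intermediate representation -- this is what allows zero-shot pairs."""
    return (encoders[src].name, decoders[tgt].name)

encoders, decoders = {}, {}
add_language(encoders, decoders, "en")
add_language(encoders, decoders, "es")
add_language(encoders, decoders, "de")  # only the de modules train now
```

After the third call, the en and es modules are frozen while the de pair remains trainable, and `translation_path(encoders, decoders, "es", "de")` is a valid route even though es-de was never trained jointly.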


Related research

05/29/2020  Training Multilingual Machine Translation by Alternately Freezing Language-Specific Encoders-Decoders
We propose a modular architecture of language-specific encoder-decoders ...

10/19/2020  Revisiting Modularized Multilingual NMT to Meet Industrial Demands
The complete sharing of parameters for multilingual translation (1-1) ha...

09/17/2017  Unwritten Languages Demand Attention Too! Word Discovery with Encoder-Decoder Models
Word discovery is the task of extracting words from unsegmented text. In...

08/26/2018  Contextual Parameter Generation for Universal Neural Machine Translation
We propose a simple modification to existing neural machine translation ...

05/31/2021  Do Multilingual Neural Machine Translation Models Contain Language Pair Specific Attention Heads?
Recent studies on the analysis of the multilingual representations focus...

04/29/2022  How Robust is Neural Machine Translation to Language Imbalance in Multilingual Tokenizer Training?
A multilingual tokenizer is a fundamental component of multilingual neur...

07/28/2023  Multilingual Lexical Simplification via Paraphrase Generation
Lexical simplification (LS) methods based on pretrained language models ...
