On the Pareto Front of Multilingual Neural Machine Translation

04/06/2023
by Liang Chen, et al.

In this work, we study how the generalization performance of a given translation direction changes with its sampling ratio in Multilingual Neural Machine Translation (MNMT). By training over 200 multilingual models with various model sizes, sets of directions, and total numbers of tasks, we find that scalarization leads to a multitask trade-off front that deviates from the traditional Pareto front when the training corpus is imbalanced. That is, the performance of certain translation directions does not always improve as their weight in the multi-task optimization objective increases, which makes it challenging to improve the overall performance of all directions. Based on these observations, we propose the Double Power Law to predict the unique performance trade-off front in MNMT, which holds robustly across languages, levels of data adequacy, and numbers of tasks. Finally, we formulate sampling-ratio selection in MNMT as an optimization problem based on the Double Power Law; in our experiments, this approach achieves better performance than temperature searching and gradient manipulation methods while using up to half of the total training budget.
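The abstract does not give the exact functional form of the Double Power Law, so the sketch below uses an assumed illustrative form: per-direction loss as a sum of two power-law terms in the sampling ratio, one falling as the direction gets more weight and one rising as the rest of the mixture is starved. All coefficients are made up for illustration; the grid search stands in for the paper's ratio-selection optimization.

```python
import numpy as np

def double_power_law(p, a, alpha, b, beta, c):
    """Hypothetical predicted loss for one direction at sampling ratio p.

    loss = c + a * p^(-alpha) + b * (1 - p)^(-beta)

    The first term decreases as the direction receives more weight; the
    second increases as other directions are crowded out, so the curve can
    turn back up before p reaches 1 -- consistent with the observed
    deviation from a monotone trade-off front under data imbalance.
    """
    return c + a * p ** (-alpha) + b * (1.0 - p) ** (-beta)

def total_loss(p):
    # Two directions sharing one sampling budget: p and 1 - p.
    # Coefficients are illustrative, not fitted to real MNMT runs.
    low_resource = double_power_law(p, a=0.5, alpha=0.4, b=0.2, beta=0.3, c=1.0)
    high_resource = double_power_law(1.0 - p, a=0.3, alpha=0.3, b=0.1, beta=0.5, c=0.8)
    return low_resource + high_resource

# Grid search for the ratio minimizing the summed predicted loss.
grid = np.linspace(0.05, 0.95, 181)
best = grid[np.argmin([total_loss(p) for p in grid])]
print(f"best sampling ratio for the low-resource direction: {best:.3f}")
```

Because the fitted law is cheap to evaluate, the search over ratios needs no further model training, which is how a method like this can undercut temperature searching on total budget.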


Related research:

- How Robust is Neural Machine Translation to Language Imbalance in Multilingual Tokenizer Training? (04/29/2022)
- Towards Higher Pareto Frontier in Multilingual Machine Translation (05/25/2023)
- Contrastive Learning for Many-to-many Multilingual Neural Machine Translation (05/20/2021)
- Balancing Training for Multilingual Neural Machine Translation (04/14/2020)
- Improving Neural Machine Translation by Denoising Training (01/19/2022)
- Scaling Laws for Multilingual Neural Machine Translation (02/19/2023)
- Building Multilingual Machine Translation Systems That Serve Arbitrary X-Y Translations (06/30/2022)
