Demystify Optimization Challenges in Multilingual Transformers

04/15/2021
by Xian Li, et al.

Multilingual Transformers improve parameter efficiency and cross-lingual transfer, yet how to effectively train multilingual models has not been well studied. Using multilingual machine translation as a testbed, we study optimization challenges from the perspectives of loss landscape and parameter plasticity. We find that imbalanced training data induces task interference between high- and low-resource languages, characterized by nearly orthogonal gradients for major parameters and an optimization trajectory dominated mostly by the high-resource languages. We show that the local curvature of the loss surface affects the degree of interference, and that existing data-subsampling heuristics implicitly reduce this sharpness, although they still face a trade-off between high- and low-resource languages. We propose a principled multi-objective optimization algorithm, Curvature Aware Task Scaling (CATS), which improves both optimization and generalization, especially for low-resource languages. Experiments on the TED, WMT, and OPUS-100 benchmarks demonstrate that CATS advances the Pareto front of accuracy while remaining efficient enough to apply to massive multilingual settings at the scale of 100 languages.
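The two optimization diagnostics named in the abstract, near-orthogonal task gradients and loss-surface sharpness, can be probed with a few lines of PyTorch. The sketch below is not from the paper: it assumes a single multilingual model with two per-language-group losses (loss_hi from a high-resource batch, loss_lo from a low-resource batch) and a list of trainable parameters params. It reports the cosine similarity of the two gradients, where values near zero correspond to the orthogonality the abstract describes, plus a power-iteration estimate of the top Hessian eigenvalue, a common proxy for local sharpness.

import torch

def gradient_cosine(loss_hi, loss_lo, params):
    """Cosine similarity between the gradients of two task losses.

    Values near 0 indicate nearly orthogonal gradients, i.e. the
    high- and low-resource updates share almost no descent direction.
    """
    g_hi = torch.autograd.grad(loss_hi, params, retain_graph=True, allow_unused=True)
    g_lo = torch.autograd.grad(loss_lo, params, retain_graph=True, allow_unused=True)
    v_hi, v_lo = [], []
    for a, b in zip(g_hi, g_lo):
        if a is None or b is None:
            continue  # skip parameters unused by either loss
        v_hi.append(a.flatten())
        v_lo.append(b.flatten())
    return torch.nn.functional.cosine_similarity(torch.cat(v_hi), torch.cat(v_lo), dim=0)

def top_hessian_eigenvalue(loss, params, iters=20):
    """Power iteration on Hessian-vector products: the largest eigenvalue
    is a standard proxy for the local sharpness of the loss surface."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat_grad = torch.cat([g.flatten() for g in grads])
    v = torch.randn_like(flat_grad)
    v = v / v.norm()
    eigenvalue = flat_grad.new_zeros(())
    for _ in range(iters):
        hv = torch.autograd.grad(flat_grad @ v, params, retain_graph=True)
        hv = torch.cat([h.flatten() for h in hv]).detach()
        eigenvalue = v @ hv
        v = hv / (hv.norm() + 1e-12)
    return eigenvalue

In this hypothetical setup, params would be [p for p in model.parameters() if p.requires_grad], and the two losses would come from forward passes of the same multilingual model over high- and low-resource batches. The CATS update rule itself is not reproduced here, since the abstract does not spell it out.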

Related research

- Refining Low-Resource Unsupervised Translation by Language Disentanglement of Multilingual Model (05/31/2022)
- SMaLL-100: Introducing Shallow Multilingual Machine Translation Model for Low-Resource Languages (10/20/2022)
- Lessons from the Bible on Modern Topics: Low-Resource Multilingual Topic Model Evaluation (04/26/2018)
- Fixing MoE Over-Fitting on Low-Resource Languages in Multilingual Machine Translation (12/15/2022)
- On Negative Interference in Multilingual Models: Findings and A Meta-Learning Treatment (10/06/2020)
- Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages (10/07/2020)
- Bitext Mining Using Distilled Sentence Representations for Low-Resource Languages (05/25/2022)
