
Causes and Cures for Interference in Multilingual Translation

by Uri Shaham et al.

Multilingual machine translation models can benefit from synergy between different language pairs, but also suffer from interference. While there is a growing number of sophisticated methods that aim to eliminate interference, our understanding of interference as a phenomenon is still limited. This work identifies the main factors that contribute to interference in multilingual machine translation. Through systematic experimentation, we find that interference (or synergy) is primarily determined by model size, data size, and the proportion of each language pair within the total dataset. We observe that substantial interference occurs mainly when the model is very small relative to the available training data, and that using standard transformer configurations with fewer than one billion parameters largely alleviates interference and promotes synergy. Moreover, we show that tuning the sampling temperature to control the proportion of each language pair in the data is key to effectively balancing the amount of interference between low- and high-resource language pairs, and can lead to superior performance overall.
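The sampling temperature the abstract refers to is the standard scheme used in multilingual training: each language pair's sampling probability is proportional to its share of the data raised to the power 1/T, so T = 1 reproduces the natural data distribution while larger T flattens it and upsamples low-resource pairs. A minimal sketch, with hypothetical corpus sizes chosen purely for illustration:

```python
def temperature_sampling_probs(pair_sizes, temperature=1.0):
    """Map corpus sizes to sampling probabilities.

    Each pair's probability is proportional to (n_i / N) ** (1 / T):
    T = 1 matches the natural data proportions; higher T flattens the
    distribution, upsampling low-resource pairs at the expense of
    high-resource ones.
    """
    total = sum(pair_sizes.values())
    weights = {pair: (n / total) ** (1.0 / temperature)
               for pair, n in pair_sizes.items()}
    z = sum(weights.values())
    return {pair: w / z for pair, w in weights.items()}

# Hypothetical sentence-pair counts, for illustration only.
sizes = {"en-fr": 40_000_000, "en-de": 4_000_000, "en-sw": 400_000}

natural = temperature_sampling_probs(sizes, temperature=1.0)
flattened = temperature_sampling_probs(sizes, temperature=5.0)
```

With T = 5, the low-resource pair's share rises from under 1% of sampled batches to roughly 20%, while the relative ordering of pairs is preserved; this is the knob the paper tunes to trade interference between low- and high-resource pairs.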


Learning Language Specific Sub-network for Multilingual Machine Translation

Multilingual neural machine translation aims at learning a single transl...

Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism

We propose multi-way, multilingual neural machine translation. The propo...

Learning Policies for Multilingual Training of Neural Machine Translation Systems

Low-resource Multilingual Neural Machine Translation (MNMT) is typically...

Adaptive Sparse Transformer for Multilingual Translation

Multilingual machine translation has attracted much attention recently d...

Serial or Parallel? Plug-able Adapter for multilingual machine translation

Developing a unified multilingual translation model is a key topic in ma...

Shapley Head Pruning: Identifying and Removing Interference in Multilingual Transformers

Multilingual transformer-based models demonstrate remarkable zero and fe...

More Parameters? No Thanks!

This work studies the long-standing problems of model capacity and negat...