
More Parameters? No Thanks!

07/20/2021
by   Zeeshan Khan, et al.

This work studies the long-standing problems of model capacity and negative interference in multilingual neural machine translation (MNMT). Using network pruning techniques, we observe that pruning 50-70% of the parameters from a trained MNMT model results in only a 0.29-1.98 drop in BLEU score, suggesting that large redundancies exist even in MNMT models. These observations motivate us to reuse the redundant parameters to counter the interference problem efficiently. We propose a novel adaptation strategy in which we iteratively prune and retrain the redundant parameters of an MNMT model to improve bilingual representations while retaining multilinguality. Negative interference severely affects high-resource languages, and our method alleviates it without any additional adapter modules; hence, we call it a parameter-free adaptation strategy, paving the way for the efficient adaptation of MNMT. We demonstrate the effectiveness of our method on a 9-language MNMT model trained on TED talks, reporting an average improvement of +1.36 BLEU on high-resource pairs. Code will be released here.
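The abstract does not specify the pruning criterion or training interface, so the following is only a minimal sketch of the prune-and-retrain idea, assuming simple magnitude pruning and gradient masking. The model API (a Hugging Face-style `model(**batch).loss`), the helper names `redundancy_masks` and `adapt_language_pair`, and all hyperparameters (sparsity, learning rate) are illustrative assumptions, not the paper's configuration.

```python
import torch

def redundancy_masks(model, sparsity=0.5):
    """Mark the lowest-magnitude fraction of each weight matrix as redundant.

    Assumption: magnitude pruning stands in for whatever criterion the paper uses.
    """
    masks = {}
    for name, param in model.named_parameters():
        if param.dim() < 2:                      # skip biases/layer norms in this sketch
            continue
        k = max(1, int(sparsity * param.numel()))
        threshold = param.detach().abs().flatten().kthvalue(k).values
        masks[name] = param.detach().abs() <= threshold  # True = redundant slot
    return masks

def adapt_language_pair(model, masks, batches, lr=1e-4):
    """Retrain only the redundant slots on one language pair, freezing the
    surviving multilingual weights via a gradient mask."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    params = dict(model.named_parameters())
    for batch in batches:
        loss = model(**batch).loss               # assumes a HF-style seq2seq interface
        optimizer.zero_grad()
        loss.backward()
        for name, mask in masks.items():
            if params[name].grad is not None:
                params[name].grad.mul_(mask)     # zero grads on the kept multilingual weights
        optimizer.step()
```

Repeating the prune-then-adapt cycle, as the abstract describes, would refresh the bilingual representations inside the redundant slots while leaving the shared multilingual weights untouched, which is what avoids any added adapter parameters.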


05/19/2021

Learning Language Specific Sub-network for Multilingual Machine Translation

Multilingual neural machine translation aims at learning a single transl...
08/13/2018

Rapid Adaptation of Neural Machine Translation to New Languages

This paper examines the problem of adapting neural machine translation s...
10/06/2020

On Negative Interference in Multilingual Models: Findings and A Meta-Learning Treatment

Modern multilingual models are trained on concatenated text from multipl...
10/30/2019

Adapting Multilingual Neural Machine Translation to Unseen Languages

Multilingual Neural Machine Translation (MNMT) for low-resource language...
05/23/2022

The Importance of Being Parameters: An Intra-Distillation Method for Serious Gains

Recent model pruning methods have demonstrated the ability to remove red...
12/14/2022

Causes and Cures for Interference in Multilingual Translation

Multilingual machine translation models can benefit from synergy between...
04/20/2022

Does Interference Exist When Training a Once-For-All Network?

The Once-For-All (OFA) method offers an excellent pathway to deploy a tr...