Do Multilingual Neural Machine Translation Models Contain Language Pair Specific Attention Heads?

05/31/2021
by   Zae Myung Kim, et al.

Recent studies analyzing multilingual representations focus on identifying whether language-independent representations emerge, or whether a multilingual model partitions its weights among different languages. While most such work has been conducted in a "black-box" manner, this paper analyzes individual components of a multilingual neural machine translation (NMT) model. In particular, we look at the encoder self-attention and encoder-decoder attention heads (in a many-to-one NMT model) that are more specific to the translation of a certain language pair than others, by (1) employing metrics that quantify aspects of the attention weights, such as "variance" or "confidence", and (2) systematically ranking the importance of attention heads with respect to translation quality. Experimental results show that, surprisingly, the set of most important attention heads is very similar across language pairs, and that nearly one-third of the less important heads can be removed without greatly hurting translation quality.
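The abstract does not spell out the exact formulas behind the "variance" and "confidence" metrics. The sketch below shows one plausible reading, assuming "confidence" is the mean of a head's maximum attention weight per query position and "variance" is the variance of its attention distribution averaged over queries; the function names `head_confidence` and `head_variance` are hypothetical, not from the paper.

```python
import numpy as np

def head_confidence(attn):
    """Mean of the maximum attention weight per query position.

    attn: array of shape (num_heads, tgt_len, src_len), each row
    summing to 1. A high value means the head attends sharply to a
    single source token ("confident"); a low value means diffuse
    attention.
    """
    return attn.max(axis=-1).mean(axis=-1)  # shape: (num_heads,)

def head_variance(attn):
    """Variance of each head's attention weights, averaged over queries."""
    return attn.var(axis=-1).mean(axis=-1)  # shape: (num_heads,)

# Toy example: 2 heads, 3 query positions, 4 source tokens.
rng = np.random.default_rng(0)
logits = rng.normal(size=(2, 3, 4))
attn = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

conf = head_confidence(attn)   # one confidence score per head
var = head_variance(attn)      # one variance score per head
```

Under this reading, heads whose scores differ sharply depending on the source language of the input would be candidates for "language pair specific" heads, while heads with uniformly high importance across languages match the paper's main finding.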

