Alternative Input Signals Ease Transfer in Multilingual Machine Translation

10/15/2021
by Simeng Sun, et al.

Recent work in multilingual machine translation (MMT) has focused on the potential of positive transfer between languages, particularly cases where higher-resourced languages can benefit lower-resourced ones. While training an MMT model, the supervision signal learned from one language pair can be transferred to another via tokens shared by multiple source languages. However, transfer is inhibited when the token overlap among source languages is small, which manifests naturally when languages use different writing systems. In this paper, we tackle inhibited transfer by augmenting the training data with alternative signals that unify different writing systems, such as phonetic, romanized, and transliterated input. We test these signals on Indic and Turkic languages, two language families whose writing systems differ but whose languages still share common features. Our results indicate that a straightforward multi-source self-ensemble, training a model on a mixture of various signals and ensembling the outputs of the same model fed with different signals during inference, outperforms strong ensemble baselines by 1.3 BLEU points on both language families. Further, we find that incorporating alternative inputs via self-ensemble can be particularly effective when the training set is small, leading to +5 BLEU when only 5% of the training data is accessible. Finally, our analysis demonstrates that including alternative signals yields more consistent translations and more accurate named entities, which is crucial for the factuality of automated systems.
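To make the multi-source self-ensemble concrete, below is a minimal sketch of how the same model can be ensembled over several renderings of one source sentence at inference time. It assumes a single trained model exposed as a function from (source tokens, target prefix) to a next-token distribution; the stand-in model, vocabulary size, and the example token ids are hypothetical and are not the authors' implementation, which averages full sequence-to-sequence decoder distributions in an MMT system.

```python
# A minimal sketch of multi-source self-ensembling at decoding time.
# Assumption: one set of model parameters is reused for every input signal
# (original script, romanized, transliterated), and their per-step output
# distributions are averaged before picking the next token.
import numpy as np

VOCAB_SIZE = 8
EOS = 0

def model_next_token_probs(source_tokens, target_prefix):
    """Hypothetical stand-in for one decoder step of a trained MMT model."""
    seed = hash((tuple(source_tokens), tuple(target_prefix))) % (2**32)
    rng = np.random.default_rng(seed)
    logits = rng.normal(size=VOCAB_SIZE)
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

def self_ensemble_decode(source_variants, max_len=10):
    """Greedy decoding that averages the same model's next-token
    distributions over several input variants of one sentence."""
    prefix = []
    for _ in range(max_len):
        # One forward pass per input signal, identical parameters each time.
        dists = [model_next_token_probs(src, prefix) for src in source_variants]
        avg = np.mean(dists, axis=0)   # ensemble by averaging probabilities
        next_tok = int(avg.argmax())
        prefix.append(next_tok)
        if next_tok == EOS:
            break
    return prefix

# Hypothetical usage: the same sentence rendered as three input signals.
original       = [5, 3, 7]   # native-script token ids
romanized      = [2, 3, 6]   # romanized rendering
transliterated = [4, 1, 6]   # transliterated rendering
print(self_ensemble_decode([original, romanized, transliterated]))
```

The design choice here is that ensembling happens across input signals rather than across separately trained models, so a single model trained on the mixture of signals is sufficient.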


