
University of Cape Town's WMT22 System: Multilingual Machine Translation for Southern African Languages

10/21/2022
by Khalid N. Elmadani, et al.

This paper describes the University of Cape Town's submission to the constrained track of the WMT22 Shared Task: Large-Scale Machine Translation Evaluation for African Languages. Our system is a single multilingual translation model that translates between English and eight South / South East African languages, as well as between specific pairs of the African languages. We applied several techniques suited to low-resource machine translation (MT), including overlap BPE, back-translation, synthetic training data generation, and the addition of further translation directions during training. Our results demonstrate the value of these techniques, particularly for directions with very little or no bilingual training data.
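One of the techniques named above, back-translation, can be sketched briefly: a reverse-direction model translates monolingual target-side text back into the source language, and the resulting synthetic pairs are mixed into the parallel training data. This is a minimal illustration of the general idea, not the authors' exact pipeline; `reverse_translate` is a hypothetical stand-in for a real target-to-source NMT model's decode step.

```python
def reverse_translate(sentence: str) -> str:
    """Placeholder for a target->source NMT model; a real system would
    decode with a trained reverse-direction model here."""
    return f"<synthetic> {sentence}"


def back_translate(monolingual_target, parallel_pairs):
    """Augment parallel data with (synthetic_source, target) pairs
    produced from monolingual target-language text."""
    synthetic = [(reverse_translate(t), t) for t in monolingual_target]
    return parallel_pairs + synthetic


# Toy data: one genuine parallel pair plus monolingual target sentences.
parallel = [("Hello world", "Molo lizwe")]
mono = ["Molo lizwe", "Sawubona mhlaba"]
augmented = back_translate(mono, parallel)
```

The augmented corpus keeps authentic target-side sentences, which is why back-translation tends to help most when target-side monolingual data is plentiful but bilingual data is scarce, as in many of the low-resource directions this paper targets.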
