Geographical Distance Is The New Hyperparameter: A Case Study Of Finding The Optimal Pre-trained Language For English-isiZulu Machine Translation

05/17/2022
by Muhammad Umair Nasir, et al.

Stemming from the limited availability of datasets and textual resources for low-resource languages such as isiZulu, there is a significant need to harness knowledge from pre-trained models to improve low-resource machine translation. Moreover, a lack of techniques for handling the complexities of morphologically rich languages has compounded the unequal development of translation models, leaving many widely spoken African languages behind. This study explores the potential benefits of transfer learning in an English-isiZulu translation framework. The results indicate the value of transfer learning from closely related languages for enhancing the performance of low-resource translation models, thus providing a key strategy for low-resource translation going forward. We gathered results from eight different language corpora, including one multilingual corpus, and found that the isiXhosa-isiZulu model outperformed all others, achieving a BLEU score of 8.56 on the test set, 2.73 points higher than the model pre-trained on the multilingual corpus. We also derived a new coefficient, Nasir's Geographical Distance Coefficient (NGDC), which provides an easy way to select languages for pre-trained models; NGDC likewise indicated that isiXhosa should be selected as the pre-training language.
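The intuition behind a geographical-distance-based selection criterion can be illustrated with a minimal sketch: rank candidate pre-training languages by the great-circle distance between the region where each is spoken and the target language's home region, then pick the closest. The coordinates, the `CENTROIDS` table, and the `rank_by_distance` helper below are illustrative assumptions for this sketch, not the paper's actual NGDC formula, which is defined in the full text.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Rough centroids of the regions where each candidate language is spoken
# (illustrative coordinates chosen for this sketch, not taken from the paper).
CENTROIDS = {
    "isiXhosa": (-32.0, 27.0),   # Eastern Cape, South Africa
    "Shona":    (-19.0, 30.0),   # Zimbabwe
    "Swahili":  (-6.0, 35.0),    # Tanzania
    "French":   (46.0, 2.0),     # France
}
TARGET = (-28.5, 30.9)           # KwaZulu-Natal, home region of isiZulu

def rank_by_distance(candidates, target):
    """Rank candidate pre-training languages, nearest to the target first."""
    return sorted(candidates, key=lambda lang: haversine_km(*CENTROIDS[lang], *target))

print(rank_by_distance(list(CENTROIDS), TARGET))
# isiXhosa ranks first, matching the abstract's finding that the
# geographically (and linguistically) closest language transfers best.
```

Under this sketch, geographical proximity acts as a cheap proxy for linguistic relatedness: isiXhosa and isiZulu are neighbouring Nguni languages, which is consistent with isiXhosa winning both on BLEU and under NGDC.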


