Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation?

03/16/2022
by En-Shiun Annie Lee, et al.

What can pre-trained multilingual sequence-to-sequence models like mBART contribute to translating low-resource languages? We conduct a thorough empirical experiment in 10 languages to ascertain this, considering five factors: (1) the amount of fine-tuning data, (2) the noise in the fine-tuning data, (3) the amount of pre-training data in the model, (4) the impact of domain mismatch, and (5) language typology. In addition to yielding several heuristics, the experiments form a framework for evaluating the data sensitivities of machine translation systems. While mBART is robust to domain differences, its translations for unseen and typologically distant languages remain below 3.0 BLEU. In answer to our title's question, mBART is not a low-resource panacea; we therefore encourage shifting the emphasis from new models to new data.
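To make the setup concrete, below is a minimal sketch (not the authors' released code) of the recipe the paper studies: fine-tune a pre-trained multilingual sequence-to-sequence model on a parallel corpus, then score the translations with corpus-level BLEU. It uses Hugging Face Transformers with the public mBART-50 checkpoint and sacrebleu; the checkpoint, language codes, toy sentences, and hyperparameters are illustrative assumptions, and a French-English pair stands in for whatever low-resource pair is of interest.

    # Minimal sketch: fine-tune mBART on a parallel batch, then score with BLEU.
    # Checkpoint, language codes, sentences, and learning rate are assumptions.
    import torch
    import sacrebleu
    from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

    model_name = "facebook/mbart-large-50-many-to-many-mmt"
    tokenizer = MBart50TokenizerFast.from_pretrained(
        model_name, src_lang="fr_XX", tgt_lang="en_XX"  # stand-in language pair
    )
    model = MBartForConditionalGeneration.from_pretrained(model_name)
    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

    # Toy parallel batch; real fine-tuning iterates over the full corpus.
    src = ["Ceci est une phrase de test."]
    ref = ["This is a test sentence."]
    batch = tokenizer(src, text_target=ref, return_tensors="pt", padding=True)

    # One fine-tuning step on the cross-entropy loss.
    model.train()
    loss = model(**batch).loss
    loss.backward()
    optimizer.step()

    # Translate and score with corpus-level BLEU (the paper's metric).
    model.eval()
    with torch.no_grad():
        generated = model.generate(
            batch["input_ids"],
            attention_mask=batch["attention_mask"],
            forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
        )
    hyps = tokenizer.batch_decode(generated, skip_special_tokens=True)
    print(sacrebleu.corpus_bleu(hyps, [ref]).score)

The paper's five factors map directly onto knobs in this loop: the size and noise of the parallel corpus fed to fine-tuning, whether the source language appeared in the checkpoint's pre-training data, the domain of the corpus, and how typologically close the language pair is.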
