Learning to translate by learning to communicate

07/14/2022
by C. M. Downey et al.

We formulate and test a technique that uses Emergent Communication (EC) with a pretrained multilingual model to improve on modern Unsupervised NMT systems, especially for low-resource languages. It has been argued that the currently dominant paradigm in NLP of pretraining on text-only corpora will not yield robust natural language understanding systems; the need for grounded, goal-oriented, and interactive language learning has been highlighted instead. In our approach, we embed a modern multilingual model (mBART; Liu et al., 2020) into an EC image-reference game, in which the model is incentivized to use multilingual generations to accomplish a vision-grounded task, with the hypothesis that this will align multiple languages to a shared task space. We present two variants of EC Fine-Tuning (Steinert-Threlkeld et al., 2022), one of which outperforms a backtranslation-based baseline in 6/8 translation settings and proves especially beneficial for the very low-resource languages Nepali and Sinhala.
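The image-reference game at the core of this approach can be sketched in miniature: a sender observes a target image and emits a discrete message, a listener sees the message plus a set of candidate images and must pick the target, and the success signal drives fine-tuning. The toy sender/listener below are illustrative stand-ins (the paper uses neural models such as mBART); all names, feature vectors, and the discretization scheme are assumptions for the sketch, not the authors' implementation.

```python
# Minimal sketch of an Emergent-Communication image-reference game.
# Toy stand-in: real systems use neural sender/listener models (e.g. mBART);
# the feature vectors and discretization here are illustrative only.

VOCAB_SIZE = 8  # number of discrete symbols the sender may emit
MSG_LEN = 3     # fixed message length

def discretize(features):
    """Map continuous image features to discrete symbols."""
    return tuple(int(x * VOCAB_SIZE) % VOCAB_SIZE for x in features[:MSG_LEN])

def sender(target_features):
    """Sender: describe the target image as a discrete message."""
    return discretize(target_features)

def listener(message, candidates):
    """Listener: pick the candidate image that best matches the message."""
    def score(cand):
        symbols = discretize(cand)
        return -sum(abs(m - s) for m, s in zip(message, symbols))
    return max(range(len(candidates)), key=lambda i: score(candidates[i]))

# One round of the game: the sender sees only the target,
# the listener sees the message and all candidates.
candidates = [
    [0.1, 0.2, 0.3],
    [0.4, 0.5, 0.6],
    [0.9, 0.1, 0.7],  # target
    [0.2, 0.8, 0.4],
]
target_idx = 2
message = sender(candidates[target_idx])
guess = listener(message, candidates)
reward = 1.0 if guess == target_idx else 0.0  # success signal for fine-tuning
```

In the EC Fine-Tuning setting, the reward from rounds like this one is what incentivizes the pretrained model's multilingual generations to align to the shared, vision-grounded task space.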

