Unsupervised Machine Translation on Dravidian Languages

03/29/2021 · Sai Koneru, et al.

Unsupervised neural machine translation (UNMT) is especially beneficial for low-resource languages such as those in the Dravidian family. However, UNMT systems tend to fail in realistic scenarios involving actual low-resource languages. Recent work proposes utilizing auxiliary parallel data and has achieved state-of-the-art results. In this work, we focus on unsupervised translation between English and Kannada, a low-resource Dravidian language, and additionally utilize a limited amount of auxiliary data between English and other related Dravidian languages. We show that unifying the writing systems is essential for unsupervised translation between Dravidian languages. We explore several model architectures that use the auxiliary data to maximize knowledge sharing and enable UNMT for distant language pairs. Our experiments demonstrate that it is crucial to include auxiliary languages that are similar to the focal language, Kannada. Furthermore, we propose a metric to measure language similarity and show that it serves as a good indicator for selecting auxiliary languages.
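The abstract does not spell out how the writing systems are unified, but a common approach for Dravidian languages exploits the fact that the Brahmi-derived Unicode blocks (Devanagari, Tamil, Telugu, Kannada, Malayalam) are laid out in parallel, so a fixed codepoint offset maps each script onto a shared one. The sketch below is an illustrative assumption, not the paper's method; the block boundaries are from the Unicode standard, and Devanagari is chosen arbitrarily as the common script.

```python
# Hedged sketch: map Dravidian scripts onto a common script (Devanagari)
# by exploiting the parallel layout of Indic Unicode blocks.
# Block start points per the Unicode standard; each block spans 0x80 codepoints.
BLOCK_START = {
    "tamil": 0x0B80,
    "telugu": 0x0C00,
    "kannada": 0x0C80,
    "malayalam": 0x0D00,
}
DEVANAGARI_START = 0x0900

def to_common_script(text: str, source: str) -> str:
    """Shift characters in the source script's block onto Devanagari;
    leave everything else (spaces, Latin, digits) unchanged."""
    start = BLOCK_START[source]
    offset = start - DEVANAGARI_START
    return "".join(
        chr(ord(ch) - offset) if start <= ord(ch) < start + 0x80 else ch
        for ch in text
    )

# Example: the Kannada word "ಕನ್ನಡ" (Kannada) maps to Devanagari "कन्नड".
print(to_common_script("ಕನ್ನಡ", "kannada"))  # → कन्नड
```

After such a mapping, cognates across Kannada, Telugu, and Malayalam share surface forms, which lets a joint subword vocabulary and shared embeddings transfer knowledge between the languages.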
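The paper's own language-similarity metric is not reproduced here; as a rough illustration of the idea, one simple proxy is character n-gram overlap between (script-unified) corpora of two languages — a minimal sketch under that assumption:

```python
# Hedged sketch: a crude language-similarity proxy via character
# trigram overlap (Jaccard index). This is an illustrative stand-in,
# not the metric proposed in the paper.

def char_ngrams(text: str, n: int = 3) -> set:
    """All character n-grams occurring in the text."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def similarity(corpus_a: str, corpus_b: str, n: int = 3) -> float:
    """Jaccard overlap of character n-gram inventories, in [0, 1]."""
    ga, gb = char_ngrams(corpus_a, n), char_ngrams(corpus_b, n)
    union = ga | gb
    return len(ga & gb) / len(union) if union else 0.0

# Identical corpora score 1.0; unrelated strings score near 0.
print(similarity("kannada kannada", "kannada kannada"))  # → 1.0
```

Ranking candidate auxiliary languages by such a score against the focal language would then guide which auxiliary parallel data to include.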


Related research:

- Harnessing Multilinguality in Unsupervised Machine Translation for Rare Languages (09/23/2020)
- On the Copying Problem of Unsupervised NMT: A Training Schedule with a Language Discriminator Loss (05/26/2023)
- Unsupervised Bilingual Lexicon Induction Across Writing Systems (01/31/2020)
- Unsupervised Pivot Translation for Distant Languages (06/06/2019)
- Universal Neural Machine Translation for Extremely Low Resource Languages (02/15/2018)
- The Low-Resource Double Bind: An Empirical Study of Pruning for Low-Resource Machine Translation (10/06/2021)
- Logographic Subword Model for Neural Machine Translation (09/07/2018)
