A Theory of Unsupervised Translation Motivated by Understanding Animal Communication

11/20/2022
by Shafi Goldwasser, et al.

Recent years have seen breakthroughs in neural language models that capture nuances of language, culture, and knowledge. Neural networks can translate between languages, in some cases even between two languages with little or no access to parallel translations, in what is known as Unsupervised Machine Translation (UMT). Given this progress, it is intriguing to ask whether machine learning tools can ultimately enable the understanding of animal communication, particularly that of highly intelligent animals. Our work is motivated by an ambitious interdisciplinary initiative, Project CETI, which is collecting a large corpus of sperm whale communications for machine analysis. We propose a theoretical framework for analyzing UMT when no parallel data are available and when it cannot be assumed that the source and target corpora address related subject domains or possess similar linguistic structure. The framework requires access to a prior probability distribution that assigns non-zero probability to possible translations. We instantiate our framework with two models of language. Our analysis suggests that the accuracy of translation depends on the complexity of the source language and the amount of “common ground” between the source language and the target prior. We also prove upper bounds on the amount of data required from the source language in the unsupervised setting, as a function of the amount of data required in a hypothetical supervised setting. Surprisingly, our bounds suggest that the amount of source data required for unsupervised translation is comparable to that of the supervised setting. For one of the language models we analyze, we also prove a nearly matching lower bound. Our analysis is purely information-theoretic: it can inform how much source data needs to be collected, but it does not yield a computationally efficient translation procedure.
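To give a flavor of the framework's central objective, the toy sketch below selects, from a tiny hypothesis class of translators, the one whose translated corpus is most plausible under a target-language prior. It is a minimal illustration, not the paper's construction: the corpus, symbols, prior probabilities, and brute-force search are all illustrative assumptions (the paper's analysis is information-theoretic and does not claim an efficient procedure).

```python
# Toy sketch: unsupervised translation as prior-likelihood maximization.
# All names and numbers below are hypothetical, for illustration only.
import math
from itertools import permutations

# Hypothetical source corpus over a two-symbol alphabet.
source_corpus = ["ab", "ba", "aab"]

# Hypothetical target-language prior: a unigram model over {x, y}
# that assigns non-zero probability to every candidate translation.
def target_log_prior(text: str) -> float:
    p = {"x": 0.7, "y": 0.3}
    return sum(math.log(p[c]) for c in text)

# Hypothesis class: all bijections from source symbols to target symbols.
source_symbols = "ab"
target_symbols = "xy"

def score(mapping: dict) -> float:
    """Total log prior probability of the corpus translated via `mapping`."""
    return sum(
        target_log_prior("".join(mapping[c] for c in sentence))
        for sentence in source_corpus
    )

# Pick the translator whose output the target prior finds most plausible.
best = max(
    ({s: t for s, t in zip(source_symbols, perm)}
     for perm in permutations(target_symbols)),
    key=score,
)
print(best)  # {'a': 'x', 'b': 'y'}: the more frequent source symbol maps
             # to the more probable target symbol.
```

Even in this toy setting, the abstract's two themes are visible: a richer source language enlarges the hypothesis class that must be searched and distinguished, and the translator can only be identified insofar as the target prior shares “common ground” with the statistics of the source corpus.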

Related research

Reference Language based Unsupervised Neural Machine Translation (04/05/2020)
Exploiting common language as an auxiliary for better translation has a ...

Simultaneous Multi-Pivot Neural Machine Translation (04/15/2021)
Parallel corpora are indispensable for training neural machine translati...

An Unsupervised Method for Building Sentence Simplification Corpora in Multiple Languages (09/01/2021)
The availability of parallel sentence simplification (SS) is scarce for ...

Learning Translation Quality Evaluation on Low Resource Languages from Large Language Models (02/07/2023)
Learned metrics such as BLEURT have in recent years become widely employ...

When and Why is Unsupervised Neural Machine Translation Useless? (04/22/2020)
This paper studies the practicality of the current state-of-the-art unsu...

Emerging Language Spaces Learned From Massively Multilingual Corpora (02/01/2018)
Translations capture important information about languages that can be u...

A Relaxed Matching Procedure for Unsupervised BLI (10/14/2020)
Recently unsupervised Bilingual Lexicon Induction (BLI) without any para...
