Emerging Language Spaces Learned From Massively Multilingual Corpora

02/01/2018
by Jörg Tiedemann, et al.

Translations capture important information about languages that can serve as implicit supervision for learning linguistic properties and semantic representations. In an information-centric view, a translated text can be regarded as a semantic mirror of the original, and the systematic variation we observe across languages can be used to disambiguate a given expression through the linguistic signal grounded in translation. Parallel corpora consisting of massive amounts of human translations with wide linguistic variation can thus be used to increase abstraction, and we propose highly multilingual machine translation models as a way of finding language-independent meaning representations. Our initial experiments show that neural machine translation models can indeed learn in such a setup, and that the learning algorithm picks up information about the relations between languages in order to optimize transfer learning with shared parameters. The model creates a continuous language space that represents relationships as geometric distances, which we can visualize to illustrate how languages cluster according to language families and groups. Does this open the door to data-driven language typology, with promising models and techniques for empirical cross-linguistic research?
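The abstract's central claim is that a multilingual NMT model with shared parameters induces a continuous language space in which geometric distance reflects language relatedness. As a minimal sketch of how such a space can be inspected, the snippet below clusters and projects per-language embedding vectors. It assumes those vectors have already been extracted from a trained model (for example, the learned language-token embeddings); the lang_vectors below are random placeholders rather than the paper's actual representations, and the language list is purely illustrative.

# Minimal sketch: inspecting a learned "language space".
# Assumes lang_vectors holds one embedding per language, e.g. the trained
# language-token embeddings of a multilingual NMT model; here they are
# random placeholders for illustration only.
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist
from sklearn.decomposition import PCA

languages = ["de", "nl", "sv", "da", "fr", "es", "pt", "fi", "et", "hu"]
rng = np.random.default_rng(0)
lang_vectors = rng.normal(size=(len(languages), 512))  # placeholder data

# Geometric distances between languages in the learned space.
distances = pdist(lang_vectors, metric="cosine")

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Hierarchical clustering: with real embeddings, related languages
# should merge early, reproducing family structure.
dendrogram(linkage(distances, method="average"), labels=languages, ax=ax1)
ax1.set_title("Language clustering")

# 2D projection of the same vectors for a scatter-plot view of the space.
coords = PCA(n_components=2).fit_transform(lang_vectors)
ax2.scatter(coords[:, 0], coords[:, 1])
for (x, y), lang in zip(coords, languages):
    ax2.annotate(lang, (x, y))
ax2.set_title("PCA projection of the language space")

plt.tight_layout()
plt.show()

With real embeddings, the dendrogram would be expected to merge related languages early (the Germanic, Romance, and Uralic groups in this illustrative list), mirroring the family clusters the authors report visualizing.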


Related research

Bridging linguistic typology and multilingual machine translation with multi-view language representations (04/30/2020)
Sparse language vectors from linguistic typology databases and learned e...

Informative Language Representation Learning for Massively Multilingual Neural Machine Translation (09/04/2022)
In a multilingual neural machine translation model that fully shares par...

What do Language Representations Really Represent? (01/09/2019)
A neural language model trained on a text corpus can be used to induce d...

Complete Multilingual Neural Machine Translation (10/20/2020)
Multilingual Neural Machine Translation (MNMT) models are commonly train...

On the Linguistic Representational Power of Neural Machine Translation Models (11/01/2019)
Despite the recent success of deep neural networks in natural language p...

Human Languages with Greater Information Density Increase Communication Speed, but Decrease Conversation Breadth (12/15/2021)
Language is the primary medium through which human information is commun...

A Theory of Unsupervised Translation Motivated by Understanding Animal Communication (11/20/2022)
Recent years have seen breakthroughs in neural language models that capt...
