
Inducing Language-Agnostic Multilingual Representations

08/20/2020
by Wei Zhao, et al.

Multilingual representations have the potential to make cross-lingual systems available to the vast majority of languages in the world. However, they currently require large pretraining corpora or assume access to typologically similar languages. In this work, we address these obstacles by removing language identity signals from multilingual embeddings. We examine three approaches: 1) re-aligning the vector spaces of target languages (all together) to a pivot source language; 2) removing language-specific means and variances, which yields more discriminative embeddings as a by-product; and 3) normalizing input texts by removing morphological contractions and reordering sentences, thus yielding language-agnostic representations. We evaluate on XNLI and on reference-free MT evaluation tasks of varying difficulty across 19 selected languages. Our experiments demonstrate the language-agnostic behavior of our multilingual representations, which show the potential for zero-shot cross-lingual transfer to distant and low-resource languages, and decrease the performance gap by 8.9 points (M-BERT) and 18.2 points (XLM-R) on average across all tasks and languages. We make our code and models available.
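To make approaches 1) and 2) concrete, the snippet below is a minimal sketch of the two core operations on sentence embeddings: an orthogonal (Procrustes) re-alignment of one target language's vectors toward a pivot language, and per-language mean and variance removal. It assumes embeddings are available as (n, d) NumPy arrays; the function names, the use of Procrustes as the alignment step, and the toy data are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch, not the authors' released code. Assumes sentence
# embeddings are (n, d) NumPy arrays, grouped by language.
import numpy as np

def procrustes_rotation(X_src, X_tgt):
    """Approach 1 (illustrative): orthogonal map W minimizing
    ||X_src @ W - X_tgt||_F, in closed form via SVD. X_src and X_tgt
    hold embeddings of parallel sentences in the target language and
    the pivot source language, respectively."""
    U, _, Vt = np.linalg.svd(X_src.T @ X_tgt)
    return U @ Vt  # (d, d) orthogonal matrix

def remove_language_stats(X):
    """Approach 2 (illustrative): subtract the language-specific mean
    and divide by the per-dimension standard deviation, so language
    identity is no longer encoded in the first two moments."""
    mu = X.mean(axis=0, keepdims=True)
    sigma = X.std(axis=0, keepdims=True) + 1e-8  # avoid division by zero
    return (X - mu) / sigma

# Toy usage: a pivot language ("en") and a target ("de") whose vectors
# are a rotated and shifted copy of the pivot's.
rng = np.random.default_rng(0)
en = rng.normal(size=(200, 768))
Q = np.linalg.qr(rng.normal(size=(768, 768)))[0]  # random rotation
de = en @ Q + 0.3                                 # rotated + shifted

W = procrustes_rotation(de, en)          # align de -> en
de_aligned = remove_language_stats(de @ W)
en_normed = remove_language_stats(en)
```

In the paper's actual setup, the target languages are re-aligned all together toward a single pivot source language; the sketch only shows the core linear-algebra step for one language pair.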

