Language Embeddings for Typology and Cross-lingual Transfer Learning

06/03/2021
by Dian Yu, et al.

Cross-lingual language tasks typically require a substantial amount of annotated data or parallel translation data. We explore whether language representations that capture relationships among languages can be learned and subsequently leveraged in cross-lingual tasks without the use of parallel data. We generate dense embeddings for 29 languages using a denoising autoencoder, and evaluate the embeddings using the World Atlas of Language Structures (WALS) and two extrinsic tasks in a zero-shot setting: cross-lingual dependency parsing and cross-lingual natural language inference.
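The abstract only sketches the method, so the following is a minimal, hypothetical illustration of the core idea: training a denoising autoencoder over one feature vector per language and taking the bottleneck activations as dense language embeddings. The feature construction, dimensions, noise model, and training loop below are assumptions for illustration, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: learn a dense embedding per language with a
# denoising autoencoder. Input features and sizes are assumptions.
NUM_LANGUAGES = 29   # from the abstract
FEATURE_DIM = 512    # assumed per-language input feature size
EMBED_DIM = 32       # assumed bottleneck (embedding) size

class DenoisingAutoencoder(nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.Tanh())
        self.decoder = nn.Linear(hid_dim, in_dim)

    def forward(self, x, noise_std=0.1):
        # Corrupt the input with Gaussian noise, then reconstruct it;
        # the bottleneck activation serves as the language embedding.
        corrupted = x + noise_std * torch.randn_like(x)
        z = self.encoder(corrupted)
        return self.decoder(z), z

# One feature vector per language (random placeholder data here).
features = torch.randn(NUM_LANGUAGES, FEATURE_DIM)
model = DenoisingAutoencoder(FEATURE_DIM, EMBED_DIM)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    recon, _ = model(features)
    loss = nn.functional.mse_loss(recon, features)
    optim.zero_grad()
    loss.backward()
    optim.step()

# Extract the final embeddings without input corruption.
with torch.no_grad():
    _, language_embeddings = model(features, noise_std=0.0)
# language_embeddings: (29, EMBED_DIM) dense vectors, one per language,
# which could then be probed against WALS features or reused in the
# zero-shot parsing and NLI transfer settings the abstract describes.
```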


Related research:

04/08/2022 · Marvelous Agglutinative Language Effect on Cross Lingual Transfer Learning
As for multilingual language models, it is important to select languages...

05/04/2022 · Cross-lingual Word Embeddings in Hyperbolic Space
Cross-lingual word embeddings can be applied to several natural language...

10/30/2018 · Learning Cross-Lingual Sentence Representations via a Multi-task Dual-Encoder Model
Neural language models have been shown to achieve an impressive level of...

09/18/2017 · Limitations of Cross-Lingual Learning from Image Search
Cross-lingual representation learning is an important step in making NLP...

11/30/2022 · Domain Mismatch Doesn't Always Prevent Cross-Lingual Transfer Learning
Cross-lingual transfer learning without labeled target language data or ...

05/11/2021 · Backretrieval: An Image-Pivoted Evaluation Metric for Cross-Lingual Text Representations Without Parallel Corpora
Cross-lingual text representations have gained popularity lately and act...

10/24/2019 · Cross-Lingual Vision-Language Navigation
Vision-Language Navigation (VLN) is the task where an agent is commanded...
