Scalable Cross-Lingual Transfer of Neural Sentence Embeddings

04/11/2019
by   Hanan Aldarmaki, et al.

We develop and investigate several cross-lingual alignment approaches for neural sentence embedding models, such as the supervised natural-language-inference classifier InferSent and sequential encoder-decoder models. We evaluate three alignment frameworks applied to these models: joint modeling, representation transfer learning, and sentence mapping, using parallel text to guide the alignment. Our results support representation transfer as a scalable approach for modular cross-lingual alignment of neural sentence embeddings: it outperforms joint models in both intrinsic and extrinsic evaluations, particularly with smaller sets of parallel data.
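To illustrate the sentence-mapping framework mentioned above, here is a minimal sketch of one common realization: learning a linear map from source-language sentence embeddings to target-language embeddings of parallel sentences via least squares. The synthetic data, dimensions, and variable names below are assumptions for illustration; in practice the embeddings would come from a pretrained encoder such as InferSent, and the paper's actual procedure may differ.

```python
import numpy as np

# Toy setup (hypothetical): rows are embeddings of parallel sentences,
# X in the source language and Y in the target language.
rng = np.random.default_rng(0)
n, d = 200, 8                                # pairs, embedding dim (toy sizes)
X = rng.normal(size=(n, d))                  # source-language embeddings
W_true = rng.normal(size=(d, d))             # unknown ground-truth alignment
Y = X @ W_true + 0.01 * rng.normal(size=(n, d))  # noisy target embeddings

# Sentence mapping: learn W minimizing ||X W - Y||_F by ordinary least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Mapped source embeddings now live in the target embedding space,
# so monolingual models trained on the target side can be reused.
X_aligned = X @ W
rel_error = np.linalg.norm(X_aligned - Y) / np.linalg.norm(Y)
print(f"relative alignment error: {rel_error:.4f}")
```

With enough parallel pairs the residual error is driven down to the noise level, which is why such mappings need relatively large parallel sets; the abstract's finding is that representation transfer remains competitive when parallel data is scarce.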

