Simple and Effective Paraphrastic Similarity from Parallel Translations

09/30/2019
by John Wieting, et al.

We present a model and methodology for learning paraphrastic sentence embeddings directly from bitext, removing the time-consuming intermediate step of creating paraphrase corpora. Further, we show that the resulting model can be applied to cross-lingual tasks where it both outperforms and is orders of magnitude faster than more complex state-of-the-art baselines.
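To make "learning paraphrastic sentence embeddings directly from bitext" concrete, below is a minimal PyTorch sketch of one common recipe in this line of work: represent each sentence as the average of its (sub)word embeddings and train with a margin loss that pulls aligned translation pairs together while pushing them away from the hardest in-batch negative. The class and function names, embedding dimension, margin value, and negative-sampling scheme are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (illustrative assumptions, not the paper's released code):
# sentence embeddings from bitext via embedding averaging + margin loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AvgEmbedder(nn.Module):
    """Embed a sentence as the average of its (sub)word embeddings."""
    def __init__(self, vocab_size, dim=300):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim, padding_idx=0)

    def forward(self, token_ids, mask):
        # token_ids: (batch, seq_len) int tensor; mask: (batch, seq_len) float tensor
        vecs = self.emb(token_ids) * mask.unsqueeze(-1)   # zero out padding positions
        return vecs.sum(1) / mask.sum(1, keepdim=True).clamp(min=1)

def margin_loss(src_vec, trg_vec, margin=0.4):
    """Score aligned translation pairs at least `margin` higher (in cosine
    similarity) than the hardest non-aligned sentence in the batch."""
    src = F.normalize(src_vec, dim=-1)
    trg = F.normalize(trg_vec, dim=-1)
    sims = src @ trg.t()                                   # pairwise cosine similarities
    pos = sims.diag()                                      # similarities of aligned pairs
    eye = torch.eye(sims.size(0), dtype=torch.bool, device=sims.device)
    neg = sims.masked_fill(eye, float("-inf")).max(dim=1).values  # hardest in-batch negative
    return F.relu(margin - pos + neg).mean()

# Usage sketch: for each minibatch of aligned (source, translation) sentences
# drawn from the bitext, encode both sides and take a gradient step.
#   src_vec = model(src_ids, src_mask)
#   trg_vec = model(trg_ids, trg_mask)
#   loss = margin_loss(src_vec, trg_vec)
#   loss.backward(); optimizer.step()
```

At test time, the similarity between two sentences is simply the cosine between their averaged embeddings, which is what makes this family of models fast enough for large-scale cross-lingual use compared with heavier encoder baselines.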


Related research

- 05/04/2022 · Cross-lingual Word Embeddings in Hyperbolic Space
  Cross-lingual word embeddings can be applied to several natural language...
- 05/23/2023 · Linear Cross-Lingual Mapping of Sentence Embeddings
  Semantics of a sentence is defined with much less ambiguity than semanti...
- 04/11/2019 · Scalable Cross-Lingual Transfer of Neural Sentence Embeddings
  We develop and investigate several cross-lingual alignment approaches fo...
- 04/03/2023 · Simple Yet Effective Neural Ranking and Reranking Baselines for Cross-Lingual Information Retrieval
  The advent of multilingual language models has generated a resurgence of...
- 01/29/2020 · ABSent: Cross-Lingual Sentence Representation Mapping with Bidirectional GANs
  A number of cross-lingual transfer learning approaches based on neural n...
- 04/12/2018 · Learning Multilingual Embeddings for Cross-Lingual Information Retrieval in the Presence of Topically Aligned Corpora
  Cross-lingual information retrieval is a challenging task in the absence...
- 11/27/2019 · word2word: A Collection of Bilingual Lexicons for 3,564 Language Pairs
  We present word2word, a publicly available dataset and an open-source Py...
