Learning Geometric Word Meta-Embeddings

04/20/2020
by Pratik Jawanpuria, et al.

We propose a geometric framework for learning meta-embeddings of words from different embedding sources. Our framework transforms the embeddings into a common latent space in which, for example, simple averaging of a given word's different embeddings becomes more effective. The proposed latent space arises from two geometric transformations: an orthogonal rotation and a Mahalanobis metric scaling. Empirical results on several word similarity and word analogy benchmarks illustrate the efficacy of the proposed framework.
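The abstract's recipe (rotate the source embeddings into a shared space, rescale with a metric, then average) can be sketched in a few lines. The snippet below is only an illustration under simplifying assumptions, not the authors' actual optimization: it aligns one hypothetical embedding matrix to another with a closed-form orthogonal Procrustes rotation and uses a per-dimension standardization as a crude stand-in for the learned Mahalanobis metric. All names (meta_embed, X, Y) are hypothetical.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes


def meta_embed(X, Y):
    """Combine two embedding matrices whose rows index the same words.

    Sketch only: rotate Y onto X with an orthogonal map, apply a simple
    per-dimension scaling (a stand-in for a learned Mahalanobis metric),
    and average the aligned embeddings.
    """
    # Closed-form orthogonal rotation aligning Y to X (Procrustes solution).
    R, _ = orthogonal_procrustes(Y, X)
    Y_aligned = Y @ R

    # Per-dimension standardization before averaging.
    def scale(Z):
        return (Z - Z.mean(axis=0)) / (Z.std(axis=0) + 1e-8)

    return 0.5 * (scale(X) + scale(Y_aligned))


# Usage with random stand-ins for two embedding sources (e.g. GloVe, word2vec).
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 300))  # source 1: 1000 words, 300 dims
Y = rng.standard_normal((1000, 300))  # source 2: same vocabulary, same order
meta = meta_embed(X, Y)
print(meta.shape)  # (1000, 300)
```

In practice the rotation and the metric would be learned jointly on the actual embedding matrices rather than fixed in closed form as above.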


research
08/27/2018

Learning Multilingual Word Embeddings in Latent Metric Space: A Geometric Approach

We propose a novel geometric approach for learning bilingual mappings gi...
research
12/02/2020

On Extending NLP Techniques from the Categorical to the Latent Space: KL Divergence, Zipf's Law, and Similarity Search

Despite the recent successes of deep learning in natural language proces...
research
09/02/2019

Rotate King to get Queen: Word Relationships as Orthogonal Transformations in Embedding Space

A notable property of word embeddings is that word relationships can exi...
research
05/16/2020

Geodesics in fibered latent spaces: A geometric approach to learning correspondences between conditions

This work introduces a geometric framework and a novel network architect...
research
06/04/2018

Absolute Orientation for Word Embedding Alignment

We propose a new technique to align word embeddings which are derived fr...
research
08/31/2019

Rethinking travel behavior modeling representations through embeddings

This paper introduces the concept of travel behavior embeddings, a metho...
