From Hyperbolic Geometry Back to Word Embeddings

04/26/2022
by Sultan Nurmukhamedov, et al.

We choose random points in the hyperbolic disc and claim that these points already constitute word representations. What remains to be uncovered is which point corresponds to which word of the human language of interest. This correspondence can be approximately established using pointwise mutual information between words together with recent embedding-alignment techniques.
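The starting point of the abstract, random points in the hyperbolic disc, can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, the hyperbolic-radius cap, and the use of NumPy are assumptions. To sample uniformly with respect to the hyperbolic area measure of the Poincaré disc (rather than the Euclidean one), one inverts the radial CDF, since a hyperbolic disc of radius d has area 2π(cosh d − 1):

```python
import numpy as np

def sample_poincare_disc(n, max_hyp_radius=5.0, seed=0):
    """Sample n points uniformly (w.r.t. hyperbolic area) inside a
    ball of hyperbolic radius max_hyp_radius in the Poincare disc."""
    rng = np.random.default_rng(seed)
    # Invert the radial CDF: area of a hyperbolic disc of radius d
    # is 2*pi*(cosh d - 1), so F(d) = (cosh d - 1)/(cosh R - 1).
    u = rng.random(n)
    d = np.arccosh(1.0 + u * (np.cosh(max_hyp_radius) - 1.0))
    # Convert hyperbolic distance from the origin to the Euclidean
    # radius of the Poincare model: d = 2 * artanh(r).
    r = np.tanh(d / 2.0)
    theta = rng.random(n) * 2.0 * np.pi
    return np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

points = sample_poincare_disc(1000)
```

All sampled points lie strictly inside the unit disc, as required by the Poincaré model; the correspondence between these points and actual words is then the alignment problem the paper addresses.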


