Hyperbolic Centroid Calculations for Text Classification

11/08/2022
by Aydın Gerek et al.

A recent development in NLP is the construction of hyperbolic word embeddings. Unlike their Euclidean counterparts, hyperbolic embeddings are represented not by vectors but by points in hyperbolic space. This makes the most common scheme for constructing document representations, namely averaging word vectors, meaningless in the hyperbolic setting. We reinterpret the vector mean as the centroid of the points represented by the vectors, and investigate various hyperbolic centroid schemes and their effectiveness at text classification.
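The centroid reinterpretation can be made concrete. Below is a minimal sketch of one common closed-form hyperbolic centroid, the Einstein midpoint, for points in the Poincaré ball; this is an illustration of the general idea, not necessarily the exact scheme the paper evaluates:

```python
import numpy as np

def poincare_to_klein(p):
    # Map points from the Poincaré ball to the Klein ball: k = 2p / (1 + ||p||^2).
    return 2.0 * p / (1.0 + np.sum(p ** 2, axis=-1, keepdims=True))

def klein_to_poincare(k):
    # Inverse map: p = k / (1 + sqrt(1 - ||k||^2)).
    return k / (1.0 + np.sqrt(1.0 - np.sum(k ** 2, axis=-1, keepdims=True)))

def einstein_midpoint(points):
    """Centroid of Poincaré-ball points via the Einstein midpoint.

    points: (n, d) array, each row strictly inside the unit ball.
    The midpoint is a Lorentz-factor-weighted mean taken in the Klein model.
    """
    k = poincare_to_klein(points)
    gamma = 1.0 / np.sqrt(1.0 - np.sum(k ** 2, axis=-1))   # Lorentz factors
    mid = (gamma[:, None] * k).sum(axis=0) / gamma.sum()   # weighted Klein mean
    return klein_to_poincare(mid)
```

For example, two symmetric points such as `[0.5, 0]` and `[-0.5, 0]` have their Einstein midpoint at the origin, matching the Euclidean intuition while staying inside the ball for any valid inputs.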

Related research

08/30/2018  Skip-gram word embeddings in hyperbolic space
Embeddings of tree-like graphs in hyperbolic space were recently shown t...

04/26/2022  From Hyperbolic Geometry Back to Word Embeddings
We choose random points in the hyperbolic disc and claim that these poin...

02/27/2020  Binarized PMI Matrix: Bridging Word Embeddings and Hyperbolic Spaces
We show analytically that removing sigmoid transformation in the SGNS ob...

10/15/2020  A Theory of Hyperbolic Prototype Learning
We introduce Hyperbolic Prototype Learning, a type of supervised learnin...

05/04/2022  Hyperbolic Relevance Matching for Neural Keyphrase Extraction
Keyphrase extraction is a fundamental task in natural language processin...

01/03/2022  On Automating Triangle Constructions in Absolute and Hyperbolic Geometry
We describe first steps towards a system for automated triangle construc...

05/26/2018  Stable Geodesic Update on Hyperbolic Space and its Application to Poincare Embeddings
A hyperbolic space has been shown to be more capable of modeling complex...
