Representation Tradeoffs for Hyperbolic Embeddings

04/10/2018
by Christopher De Sa, et al.

Hyperbolic embeddings offer excellent quality with few dimensions when embedding hierarchical data structures like synonym or type hierarchies. Given a tree, we give a combinatorial construction that embeds the tree in hyperbolic space with arbitrarily low distortion without using optimization. On WordNet, our combinatorial embedding obtains a mean-average-precision of 0.989 with only two dimensions, while Nickel et al.'s recent construction obtains 0.87 using 200 dimensions. We provide upper and lower bounds that allow us to characterize the precision-dimensionality tradeoff inherent in any hyperbolic embedding. To embed general metric spaces, we propose a hyperbolic generalization of multidimensional scaling (h-MDS). We show how to perform exact recovery of hyperbolic points from distances, provide a perturbation analysis, and give a recovery result that allows us to reduce dimensionality. The h-MDS approach offers consistently low distortion even with few dimensions across several datasets. Finally, we extract lessons from the algorithms and theory above to design a PyTorch-based implementation that can handle incomplete information and is scalable.
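The combinatorial construction exploits a basic property of hyperbolic space: scaling all edge lengths up makes geodesic distances between tree nodes approach their tree (path) distances, driving distortion toward zero without any optimization. A minimal sketch of this effect in the 2-D Poincaré disk, for a star tree with a root and two children (illustrative only, not the paper's implementation; all names here are our own):

```python
import numpy as np

def poincare_dist(u, v):
    """Hyperbolic distance between two points inside the unit (Poincare) disk."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    num = 2.0 * np.sum((u - v) ** 2)
    den = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + num / den)

# Place the root at the origin and each child at Euclidean radius tanh(s/2),
# which sits at hyperbolic distance exactly s from the origin.
s = 5.0                      # chosen edge length (the "scale" of the embedding)
r = np.tanh(s / 2.0)
root = np.zeros(2)
child_a = np.array([r, 0.0])
child_b = np.array([-r, 0.0])

# Root-to-child distance matches the edge length s by construction.
print(poincare_dist(root, child_a))
# Child-to-child geodesic distance approaches the tree distance 2s as s grows,
# which is why scaling up shrinks the embedding's distortion.
print(poincare_dist(child_a, child_b))
```

With `s = 5`, the child-to-child distance already agrees with the tree distance `2s = 10` to within about 4e-4; increasing `s` tightens this further, at the cost of pushing points toward the disk boundary (the precision-dimensionality tradeoff the paper characterizes).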


Related research

Tree! I am no Tree! I am a Low Dimensional Hyperbolic Embedding (05/08/2020)
Given data, finding a faithful low-dimensional hyperbolic embedding of t...

Multidimensional Scaling in the Poincare Disk (05/26/2011)
Multidimensional scaling (MDS) is a class of projective algorithms tradi...

Browser-based Hyperbolic Visualization of Graphs (05/16/2022)
Hyperbolic geometry offers a natural focus + context for data visualizat...

Comparing Euclidean and Hyperbolic Embeddings on the WordNet Nouns Hypernymy Graph (09/15/2021)
Nickel and Kiela (2017) present a new method for embedding tree nodes in...

Capacity Bounds for Hyperbolic Neural Network Representations of Latent Tree Structures (08/18/2023)
We study the representation capacity of deep hyperbolic neural networks ...

Shadow Cones: Unveiling Partial Orders in Hyperbolic Space (05/24/2023)
Hyperbolic space has been shown to produce superior low-dimensional embe...

Supervising Embedding Algorithms Using the Stress (07/14/2022)
While classical scaling, just like principal component analysis, is para...
