
Embedding Node Structural Role Identity into Hyperbolic Space
Recently, there has been an interest in embedding networks in hyperbolic...

Hydra: A Method for Strain-Minimizing Hyperbolic Embedding
We introduce hydra (hyperbolic distance recovery and approximation), a n...

Geometry of Comparisons
Many data analysis problems can be cast as distance geometry problems in...

Music Recommendations in Hyperbolic Space: An Application of Empirical Bayes and Hierarchical Poincaré Embeddings
Matrix Factorization (MF) is a common method for generating recommendati...

Less but Better: Generalization Enhancement of Ordinal Embedding via Distributional Margin
In the absence of prior knowledge, ordinal embedding methods obtain new ...

Fine-Grained Entity Typing in Hyperbolic Space
How can we represent hierarchical information present in large type inve...

A Quantized Representation of Intertemporal Choice in the Brain
Value [4][5] is typically modeled using a continuous representation (i.e...
Generalization Error Bound for Hyperbolic Ordinal Embedding
Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space so that they agree as well as possible with given constraints of the form "entity i is more similar to entity j than to entity k." It has been experimentally shown that HOE can effectively represent hierarchical data, such as knowledge bases and citation networks, owing to hyperbolic space's exponential growth property. However, its theoretical analysis has been limited to ideal, noiseless settings, and the generalization error incurred in exchange for hyperbolic space's exponential representation ability has not been bounded. The difficulty is that existing derivations of generalization error bounds for ordinal embedding, which rely on the Gramian matrix, do not carry over to HOE, since hyperbolic space is not an inner-product space. In this paper, through a novel characterization of HOE with decomposed Lorentz Gramian matrices, we provide the first generalization error bound for HOE, which is at most exponential with respect to the embedding space's radius. Our comparison between the bounds for HOE and for Euclidean ordinal embedding shows that HOE's generalization error is a reasonable cost for its exponential representation ability.
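To make the setting concrete, the sketch below shows how a triplet constraint "i is more similar to j than to k" is evaluated under the hyperbolic distance in the Lorentz (hyperboloid) model, whose Lorentzian inner product underlies the Lorentz Gramian matrices mentioned in the abstract. This is a minimal illustration with NumPy; the function names are our own, not from the paper.

```python
import numpy as np

def lift_to_hyperboloid(v):
    """Lift spatial coordinates v in R^n onto the hyperboloid
    {x : -x_0^2 + x_1^2 + ... + x_n^2 = -1, x_0 > 0} (Lorentz model)."""
    v = np.asarray(v, dtype=float)
    x0 = np.sqrt(1.0 + v @ v)
    return np.concatenate(([x0], v))

def lorentz_inner(x, y):
    """Lorentzian inner product <x, y>_L = -x_0 y_0 + sum_{i>=1} x_i y_i."""
    return -x[0] * y[0] + x[1:] @ y[1:]

def hyperbolic_distance(x, y):
    """Geodesic distance on the hyperboloid: d(x, y) = arccosh(-<x, y>_L)."""
    # Clamp for numerical safety: -<x, y>_L >= 1 holds in exact arithmetic.
    return np.arccosh(max(-lorentz_inner(x, y), 1.0))

def triplet_satisfied(xi, xj, xk):
    """Check the ordinal constraint 'i is more similar to j than to k'."""
    return hyperbolic_distance(xi, xj) < hyperbolic_distance(xi, xk)
```

An HOE method would adjust the points so that as many such triplet constraints as possible are satisfied; the paper's bound controls how well constraint satisfaction on a sample generalizes to unseen triplets.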