Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design

12/03/2021
by Xiran Fan, et al.

Hyperbolic neural networks have gained popularity recently due to their ability to represent hierarchical data sets effectively and efficiently. The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space, which is a homogeneous Riemannian manifold of the Lorentz group. Most existing methods (with some exceptions) use local linearization to define a variety of operations paralleling those used in traditional deep neural networks in Euclidean spaces. In this paper, we present a novel fully hyperbolic neural network that uses the concept of projections (embeddings) followed by intrinsic aggregation and a nonlinearity, all within the hyperbolic space. The novelty lies in the projection, which is designed to map data onto a lower-dimensional embedded hyperbolic space and hence leads to a nested hyperbolic space representation that is independently useful for dimensionality reduction. The main theoretical contribution is a proof that the proposed embedding is isometric and equivariant under Lorentz transformations. The projection is computationally efficient, since it can be expressed by simple linear operations, and the aforementioned equivariance property allows for weight sharing. The nested hyperbolic space representation is the core component of our network; we therefore first compare it with other dimensionality reduction methods such as tangent PCA, principal geodesic analysis (PGA), and HoroPCA. Building on this equivariant embedding, we develop a novel fully hyperbolic graph convolutional neural network architecture to learn the parameters of the projection. Finally, we present experiments demonstrating the comparative performance of our network on several publicly available data sets.
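To make the setting concrete, here is a minimal sketch of the Lorentz (hyperboloid) model the abstract refers to, together with one simple way to project a point of H^n onto a nested lower-dimensional hyperboloid H^m. This is an illustrative toy projection (truncate the trailing spatial coordinates, then rescale back onto the hyperboloid), not the paper's learned isometric embedding; the function names are our own.

```python
import numpy as np

def lorentz_inner(u, v):
    # Minkowski inner product <u, v> = -u0*v0 + sum_i ui*vi.
    # Points on the hyperboloid H^n in R^{n+1} satisfy <x, x> = -1, x0 > 0.
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def project_to_nested(x, m):
    # Illustrative projection H^n -> H^m: keep the first m+1 coordinates
    # and rescale so the result again satisfies <y, y> = -1.
    # (Since dropping spatial coordinates only decreases |spatial|^2,
    # the kept vector always has <y, y> <= -1, so the rescale is valid.)
    y = x[: m + 1].copy()
    s = -lorentz_inner(y, y)  # positive by the argument above
    return y / np.sqrt(s)

# Build a point on H^2 in R^3: x0 = sqrt(1 + |spatial|^2).
spatial = np.array([0.3, -0.4])
x = np.concatenate(([np.sqrt(1.0 + spatial @ spatial)], spatial))

y = project_to_nested(x, 1)
# y lies on the nested hyperboloid H^1: <y, y> = -1 up to floating point.
```

The paper's projection is likewise expressible with simple linear operations, but its parameters are learned and the map is constructed to be isometric and Lorentz-equivariant, which this naive truncation is not.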

