Towards Scalable Hyperbolic Neural Networks using Taylor Series Approximations

06/07/2022
by   Nurendra Choudhary, et al.

Hyperbolic networks have shown prominent improvements over their Euclidean counterparts in several areas involving hierarchical datasets across domains such as computer vision, graph analysis, and natural language processing. However, their adoption in practice remains restricted due to (i) non-scalability on accelerated deep learning hardware, (ii) vanishing gradients due to the closure of hyperbolic space, and (iii) information loss due to frequent mapping between the local tangent space and the fully hyperbolic space. To tackle these issues, we propose approximating hyperbolic operators using Taylor series expansions, which allows us to reformulate the computationally expensive hyperbolic tangent and cosine functions into more efficient polynomial equivalents. This lets us retain the benefit of preserving the hierarchical anatomy of hyperbolic space while maintaining scalability on current accelerated deep learning infrastructure. The polynomial formulation also enables us to leverage advancements in Euclidean networks, such as gradient clipping and ReLU activation, to avoid vanishing gradients and to remove errors caused by frequent switching between tangent space and hyperbolic space. Our empirical evaluation on standard benchmarks in graph analysis and computer vision shows that our polynomial formulation is as scalable as Euclidean architectures, in both memory and time complexity, while providing results as effective as hyperbolic models. Moreover, our formulation shows a considerable improvement over its baselines owing to our handling of vanishing gradients and information loss.
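To make the core idea concrete, the sketch below illustrates how an expensive hyperbolic operator can be replaced by a truncated Taylor (Maclaurin) polynomial. This is a minimal illustration, not the authors' implementation: the function names (`taylor_tanh`, `exp_map_zero_approx`), the truncation order, and the use of the Poincaré-ball exponential map at the origin are assumptions made for the example.

```python
import torch

# Maclaurin coefficients of tanh(x) ~= x - x^3/3 + 2x^5/15 - 17x^7/315,
# accurate near the origin (|x| < pi/2). The truncation order is illustrative.
_TANH_COEFFS = [1.0, -1.0 / 3.0, 2.0 / 15.0, -17.0 / 315.0]


def taylor_tanh(x: torch.Tensor) -> torch.Tensor:
    """Polynomial approximation of tanh via a truncated Maclaurin series."""
    result = torch.zeros_like(x)
    power = x  # holds x^(2k+1) for successive odd powers
    for coeff in _TANH_COEFFS:
        result = result + coeff * power
        power = power * x * x
    return result


def exp_map_zero_approx(v: torch.Tensor, c: float = 1.0) -> torch.Tensor:
    """Approximate exponential map at the origin of the Poincaré ball,
    exp_0(v) = tanh(sqrt(c) * ||v||) * v / (sqrt(c) * ||v||),
    with tanh replaced by its Taylor polynomial (hypothetical sketch)."""
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(1e-7)
    return taylor_tanh(sqrt_c * norm) * v / (sqrt_c * norm)


if __name__ == "__main__":
    v = 0.1 * torch.randn(4, 8)
    print("max abs error vs torch.tanh:",
          (torch.tanh(v) - taylor_tanh(v)).abs().max().item())
    print("mapped points shape:", exp_map_zero_approx(v).shape)
```

Because the approximation is a plain polynomial in the input, it runs as ordinary dense tensor arithmetic on accelerators and composes directly with Euclidean tools such as gradient clipping and ReLU, which is the scalability argument made in the abstract.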


