HyLa: Hyperbolic Laplacian Features For Graph Learning

02/14/2022
by Tao Yu, et al.

Due to its geometric properties, hyperbolic space can support high-fidelity embeddings of tree- and graph-structured data. For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks: e.g., hyperbolic graph convolutional networks (GCNs) can outperform vanilla GCNs. However, existing hyperbolic networks are computationally expensive and can be numerically unstable, and these shortcomings prevent them from scaling to large graphs. In this paper, we propose HyLa, a completely different approach to using hyperbolic space in graph learning: HyLa maps once from a learned hyperbolic-space embedding to Euclidean space via the eigenfunctions of the Laplacian operator in hyperbolic space. Our method is inspired by the random Fourier feature methodology, which uses the eigenfunctions of the Laplacian in Euclidean space. We evaluate HyLa on downstream tasks including node classification and text classification, where HyLa shows significant improvements over hyperbolic GCN and other baselines.
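To make the idea concrete, the snippet below is a minimal NumPy sketch of hyperbolic random features in the spirit described above. It is an illustration only, not the paper's reference implementation: the function name hyla_features, the sampling distributions for the eigenvalues, boundary directions and phases, and the exact feature formula are assumptions made for this example, built by analogy with Euclidean random Fourier features but using a Busemann-style inner product in the Poincaré ball.

import numpy as np

def hyla_features(x, num_features=1000, scale=0.1, seed=0):
    """Hypothetical sketch of HyLa-style random Laplacian features.

    x : (N, d) array of points in the Poincare ball (||x|| < 1), e.g. a
        learned hyperbolic embedding of graph nodes.
    Returns an (N, num_features) array of Euclidean features.

    Assumptions (not taken from the abstract above): eigenvalue parameters
    `lam` are Gaussian with width `scale`, boundary directions `omega` are
    uniform on the unit sphere, and phases `b` are uniform in [0, 2*pi).
    """
    rng = np.random.default_rng(seed)
    n, d = x.shape

    lam = rng.normal(0.0, scale, size=num_features)        # eigenvalue samples
    omega = rng.normal(size=(num_features, d))
    omega /= np.linalg.norm(omega, axis=1, keepdims=True)   # ideal boundary points
    b = rng.uniform(0.0, 2 * np.pi, size=num_features)      # random phases

    # Poincare-ball analogue of the Euclidean inner product <w, x>:
    # the Busemann-type quantity log((1 - |x|^2) / |x - omega|^2).
    sq_norm = np.sum(x ** 2, axis=1, keepdims=True)                       # (N, 1)
    sq_dist = np.sum((x[:, None, :] - omega[None, :, :]) ** 2, axis=-1)   # (N, F)
    busemann = np.log((1.0 - sq_norm) / sq_dist)

    # Eigenfunction-inspired feature: a radial growth term times an
    # oscillation, analogous to cos(w^T x + b) in random Fourier features.
    feats = np.exp((d - 1) / 2.0 * busemann) * np.cos(lam * busemann + b)
    return feats / np.sqrt(num_features)

# Usage: features for 5 nodes embedded in a 2-D Poincare ball; the resulting
# Euclidean features can then be fed to any standard (e.g. linear or GCN) model.
x = np.random.default_rng(1).uniform(-0.3, 0.3, size=(5, 2))
print(hyla_features(x, num_features=8).shape)  # (5, 8)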

Related research

10/28/2019 · Hyperbolic Graph Convolutional Neural Networks
Graph convolutional neural networks (GCNs) embed nodes in a graph into E...

04/14/2021 · A Hyperbolic-to-Hyperbolic Graph Convolutional Network
Hyperbolic graph convolutional networks (GCNs) demonstrate powerful repr...

05/29/2017 · Neural Embeddings of Graphs in Hyperbolic Space
Neural embeddings have been used with great success in Natural Language ...

08/29/2023 · Hyperbolic Convolutional Neural Networks
Deep Learning is mostly responsible for the surge of interest in Artific...

07/06/2022 · Text Enriched Sparse Hyperbolic Graph Convolutional Networks
Heterogeneous networks, which connect informative nodes containing text ...

05/21/2021 · Graph Convolutional Networks in Feature Space for Image Deblurring and Super-resolution
Graph convolutional networks (GCNs) have achieved great success in deali...

06/07/2022 · Towards Scalable Hyperbolic Neural Networks using Taylor Series Approximations
Hyperbolic networks have shown prominent improvements over their Euclide...
