Position-based Hash Embeddings For Scaling Graph Neural Networks

08/31/2021
by Maria Kalantzi, et al.

Graph Neural Networks (GNNs) bring the power of deep representation learning to graph and relational data and achieve state-of-the-art performance in many applications. GNNs compute node representations by taking into account the topology of each node's ego-network and the features of the ego-network's nodes. When nodes lack high-quality features, GNNs learn an embedding layer to compute node embeddings and use them as input features. However, the size of the embedding layer is linear in the product of the number of nodes in the graph and the dimensionality of the embedding, and it does not scale to big data and graphs with hundreds of millions of nodes. To reduce the memory associated with this embedding layer, hashing-based approaches, commonly used in applications like NLP and recommender systems, can potentially be used. However, a direct application of these ideas fails to exploit the fact that in many real-world graphs, nodes that are topologically close tend to be related to each other (homophily) and as such have similar representations. In this work, we present approaches that take advantage of a node's position in the graph to dramatically reduce the memory required, with minimal, if any, degradation in the quality of the resulting GNN model. Our approaches decompose a node's embedding into two components: a position-specific component and a node-specific component. The position-specific component models homophily and the node-specific component models node-to-node variation. Extensive experiments using different datasets and GNN models show that our methods reduce the memory requirements by 88% while achieving, in all cases, better classification accuracy than other competing approaches, including the full embeddings.
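To make the decomposition concrete, here is a minimal PyTorch sketch of the idea as the abstract describes it; it is not the paper's implementation. It assumes each node has been assigned a coarse position id (for example, a cluster or partition id from a graph partitioner), and all names here (PositionalHashEmbedding, num_hash_buckets, the hash constants) are illustrative.

```python
import torch
import torch.nn as nn

class PositionalHashEmbedding(nn.Module):
    """Illustrative sketch (not the paper's exact method): a node's
    embedding is the sum of a position-specific row shared by
    topologically close nodes and a node-specific component looked up
    from a small hashed table."""

    def __init__(self, num_positions, num_hash_buckets, dim,
                 num_hashes=2, seed=0):
        super().__init__()
        # Position-specific table: one row per graph "position"
        # (e.g., a partition id), modeling homophily.
        self.position_table = nn.Embedding(num_positions, dim)
        # Small shared table addressed by hash functions; captures
        # node-to-node variation at a fraction of full-table memory.
        self.node_table = nn.Embedding(num_hash_buckets, dim)
        self.num_hash_buckets = num_hash_buckets
        # Random per-hash salts emulate independent hash functions.
        g = torch.Generator().manual_seed(seed)
        self.register_buffer(
            "hash_salt",
            torch.randint(0, 2**31 - 1, (num_hashes,), generator=g))

    def forward(self, node_ids, position_ids):
        # node_ids, position_ids: LongTensors of shape (batch,)
        pos_part = self.position_table(position_ids)
        node_part = 0
        for salt in self.hash_salt:
            # Simple multiplicative hash into the shared bucket table.
            buckets = (node_ids * 2654435761 + salt) % self.num_hash_buckets
            node_part = node_part + self.node_table(buckets)
        return pos_part + node_part
```

As a back-of-the-envelope illustration of why this saves memory (numbers chosen for the example, not taken from the paper): a full embedding table for 100M nodes at dimension 128 in fp32 takes about 51 GB, whereas 10K position rows plus 1M hashed buckets at the same dimension take roughly 0.5 GB, a reduction of about 99%.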

Related research:

Memory-Based Graph Networks (02/21/2020)
Graph neural networks (GNNs) are a class of deep models that operate on ...

Learning to Hash with Graph Neural Networks for Recommender Systems (03/04/2020)
Graph representation learning has attracted much attention in supporting...

Nimble GNN Embedding with Tensor-Train Decomposition (06/21/2022)
This paper describes a new method for representing embedding tables of g...

Hebbian Graph Embeddings (08/21/2019)
Representation learning has recently been successfully used to create ve...

Boosting Heterogeneous Catalyst Discovery by Structurally Constrained Deep Learning Models (07/11/2022)
The discovery of new catalysts is one of the significant topics of compu...

Graph Entropy Guided Node Embedding Dimension Selection for Graph Neural Networks (05/07/2021)
Graph representation learning has achieved great success in many areas, ...

MONET: Debiasing Graph Embeddings via the Metadata-Orthogonal Training Unit (09/25/2019)
Are Graph Neural Networks (GNNs) fair? In many real world graphs, the fo...
