Nimble GNN Embedding with Tensor-Train Decomposition

06/21/2022
by Chunxing Yin, et al.

This paper describes a new method for representing the embedding tables of graph neural networks (GNNs) more compactly via tensor-train (TT) decomposition. We consider the scenario where (a) the graph data lack node features, so embeddings must be learned during training; and (b) we wish to exploit GPU platforms, where smaller tables reduce host-to-GPU communication even on large-memory GPUs. TT enables a compact parameterization of the embedding, rendering it small enough to fit entirely on modern GPUs even for massive graphs. Combined with judicious schemes for initialization and hierarchical graph partitioning, this approach can reduce the size of node embedding vectors by 1,659 to 81,362 times on large publicly available benchmark datasets, achieving comparable or better accuracy and significant speedups on multi-GPU systems. In some cases, our model without explicit input node features can even match the accuracy of models that use node features.
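To illustrate the core idea, the following is a minimal NumPy sketch of a TT-parameterized embedding table. All sizes, ranks, and names here are illustrative assumptions, not the paper's actual configuration: a 1,000,000 x 64 table is factored as 100*100*100 nodes by 4*4*4 dimensions, and each row is materialized on demand by contracting three small TT cores.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical factorization of a 1,000,000 x 64 embedding table:
# N = 100*100*100 nodes, D = 4*4*4 dimensions, TT-ranks (1, 8, 8, 1).
n = [100, 100, 100]   # factors of the number of nodes
d = [4, 4, 4]         # factors of the embedding dimension
ranks = [1, 8, 8, 1]  # TT-ranks (boundary ranks are 1)

# TT core k has shape (r_{k-1}, n_k, d_k, r_k); together the cores
# implicitly define the full N x D table without ever storing it.
cores = [rng.standard_normal((ranks[k], n[k], d[k], ranks[k + 1])) * 0.1
         for k in range(3)]

def tt_embedding(idx):
    """Materialize the embedding row for node `idx` from the TT cores."""
    # Decompose the flat node index into mixed-radix digits (i1, i2, i3).
    i3 = idx % n[2]
    i2 = (idx // n[2]) % n[1]
    i1 = idx // (n[1] * n[2])
    # Contract the index-selected slices of each core in sequence.
    out = cores[0][0, i1, :, :]                                    # (d1, r1)
    out = np.einsum('ar,rbs->abs', out, cores[1][:, i2, :, :])     # (d1, d2, r2)
    out = np.einsum('abr,rcs->abcs', out, cores[2][:, i3, :, :])   # (d1, d2, d3, 1)
    return out.reshape(-1)                                         # (D,)

vec = tt_embedding(123456)
print(vec.shape)  # (64,)

# Compression ratio: three small cores versus the dense table.
dense_params = 100**3 * 64
tt_params = sum(c.size for c in cores)
print(dense_params / tt_params)  # 2000.0
```

Even at these toy sizes, the TT form stores 32,000 parameters in place of 64 million, which is why the full parameterization can reside on a single GPU; the paper's reported compression ratios come from much larger tables and carefully chosen factorizations.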


Related research:

- Position-based Hash Embeddings For Scaling Graph Neural Networks (08/31/2021): Graph Neural Networks (GNNs) bring the power of deep representation lear...
- Efficient Scaling of Dynamic Graph Neural Networks (09/16/2021): We present distributed algorithms for training dynamic Graph Neural Netw...
- Staleness-Alleviated Distributed GNN Training via Online Dynamic-Embedding Prediction (08/25/2023): Despite the recent success of Graph Neural Networks (GNNs), it remains c...
- A Distributed Multi-GPU System for Large-Scale Node Embedding at Tencent (05/28/2020): Scaling node embedding systems to efficiently process networks in real-w...
- Communication-Free Distributed GNN Training with Vertex Cut (08/06/2023): Training Graph Neural Networks (GNNs) on real-world graphs consisting of...
- DistTGL: Distributed Memory-Based Temporal Graph Neural Network Training (07/14/2023): Memory-based Temporal Graph Neural Networks are powerful tools in dynami...
- TT-Rec: Tensor Train Compression for Deep Learning Recommendation Models (01/25/2021): The memory capacity of embedding tables in deep learning recommendation ...
