Accelerating SpMM Kernel with Cache-First Edge Sampling for Graph Neural Networks

04/21/2021
by Chien-Yu Lin, et al.

Graph neural networks (GNNs), an emerging deep learning model class, can extract meaningful representations from highly expressive graph-structured data and are therefore gaining popularity for a wider range of applications. However, current GNNs suffer from the poor performance of their sparse-dense matrix multiplication (SpMM) operator, even when using powerful GPUs. Our analysis shows that SpMM accounts for up to 95% of the inference time of popular GNN models on NVIDIA's advanced V100 GPU. This SpMM performance bottleneck hinders GNNs' applicability to large-scale problems and the development of more sophisticated GNN models. To address this inference time bottleneck, we introduce ES-SpMM, a cache-first edge sampling mechanism and co-designed SpMM kernel. ES-SpMM uses edge sampling to downsize the graph so that it fits into the GPU's shared memory, which reduces the computation cost and improves SpMM's cache locality. To evaluate ES-SpMM's performance, we integrated it with a popular GNN framework, DGL, and tested it using representative GNN models and datasets. Our results show that ES-SpMM outperforms the highly optimized cuSPARSE SpMM kernel by up to 4.35x with no accuracy loss and by 45.3x with less than a 1% accuracy loss.
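The core idea described in the abstract, capping each node's neighbor list so the sampled graph fits a fixed budget before running SpMM, can be illustrated with a minimal CPU-side sketch. This is not the paper's GPU kernel or its exact sampling policy; the function name `cache_first_edge_sample`, the uniform per-row edge cap, and the use of SciPy/NumPy are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp


def cache_first_edge_sample(adj: sp.csr_matrix, max_edges_per_row: int,
                            seed: int = 0) -> sp.csr_matrix:
    """Downsample a CSR adjacency matrix so no row keeps more than
    `max_edges_per_row` edges (a stand-in for a shared-memory budget)."""
    rng = np.random.default_rng(seed)
    indptr, indices, data = adj.indptr, adj.indices, adj.data
    new_indices, new_data, new_indptr = [], [], [0]
    for row in range(adj.shape[0]):
        start, end = indptr[row], indptr[row + 1]
        degree = end - start
        if degree > max_edges_per_row:
            # Randomly keep a fixed number of edges for high-degree rows.
            keep = rng.choice(degree, size=max_edges_per_row, replace=False)
            keep.sort()
        else:
            keep = np.arange(degree)
        new_indices.append(indices[start:end][keep])
        new_data.append(data[start:end][keep])
        new_indptr.append(new_indptr[-1] + len(keep))
    return sp.csr_matrix(
        (np.concatenate(new_data), np.concatenate(new_indices),
         np.array(new_indptr)),
        shape=adj.shape,
    )


# SpMM on the downsized graph: sparse adjacency times dense node features.
adj = sp.random(1000, 1000, density=0.01, format="csr")
feats = np.random.rand(1000, 64).astype(np.float32)
out = cache_first_edge_sample(adj, max_edges_per_row=32) @ feats
```

Because every row's neighbor list is bounded after sampling, the working set per row is fixed and small, which is what lets the actual kernel stage the sampled graph in shared memory and improve cache locality.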


