Parallel and Distributed Graph Neural Networks: An In-Depth Concurrency Analysis

05/19/2022
by Maciej Besta, et al.

Graph neural networks (GNNs) are among the most powerful tools in deep learning. They routinely solve complex problems on unstructured networks, such as node classification, graph classification, or link prediction, with high accuracy. However, both GNN inference and training are computationally complex, and they uniquely combine irregular graph processing with dense and regular computations. This complexity makes it very challenging to execute GNNs efficiently on modern massively parallel architectures. To alleviate this, we first design a taxonomy of parallelism in GNNs, considering data and model parallelism, and different forms of pipelining. Then, we use this taxonomy to investigate the amount of parallelism in numerous GNN models, GNN-driven machine learning tasks, software frameworks, and hardware accelerators. We use the work-depth model, and we also assess communication volume and synchronization. We specifically focus on the sparsity/density of the associated tensors, in order to understand how to effectively apply techniques such as vectorization. We also formally analyze GNN pipelining, and we generalize the established Message-Passing class of GNN models to cover arbitrary pipeline depths, facilitating future optimizations. Finally, we investigate different forms of asynchronicity, charting a path toward future asynchronous parallel GNN pipelines. The outcomes of our analysis are synthesized in a set of insights that help to maximize GNN performance, and a comprehensive list of challenges and opportunities for further research into efficient GNN computations. Our work will help to advance the design of future GNNs.
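The abstract refers to the Message-Passing class of GNN models and to the mix of sparse, irregular graph operations and dense, regular tensor operations that the analysis builds on. As a minimal sketch (not code from the paper; the function name, GCN-style normalization, and matrix shapes are illustrative assumptions), one message-passing layer can be written as a sparse aggregation (SpMM over the adjacency matrix) followed by a dense feature transformation (GEMM), which is exactly the sparsity/density split that determines where vectorization and data parallelism pay off:

# Illustrative sketch of a single message-passing GNN layer (not from the paper).
# Sparse aggregation (SpMM) handles the irregular graph structure; the dense
# transformation (GEMM) is a regular computation that vectorizes well.
import numpy as np
import scipy.sparse as sp

def message_passing_layer(A, H, W):
    """Compute H' = ReLU(A_hat @ H @ W) for a GCN-style layer.

    A : sparse N x N adjacency matrix (irregular, graph-dependent)
    H : dense  N x d_in node-feature matrix
    W : dense  d_in x d_out weight matrix
    """
    # Symmetric normalization A_hat = D^{-1/2} (A + I) D^{-1/2}.
    n = A.shape[0]
    A_hat = A + sp.identity(n, format="csr")
    deg = np.asarray(A_hat.sum(axis=1)).ravel()
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(deg))
    A_hat = d_inv_sqrt @ A_hat @ d_inv_sqrt

    M = A_hat @ H              # sparse x dense (SpMM): aggregate neighbor messages
    Z = M @ W                  # dense x dense (GEMM): transform aggregated features
    return np.maximum(Z, 0.0)  # element-wise ReLU update

# Tiny usage example: a 4-node path graph with 8-dimensional features.
rows, cols = [0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]
A = sp.csr_matrix((np.ones(6), (rows, cols)), shape=(4, 4))
H, W = np.random.rand(4, 8), np.random.rand(8, 16)
print(message_passing_layer(A, H, W).shape)  # (4, 16)

In work-depth terms, and under the assumptions of this sketch, the sparse aggregation contributes roughly O(m * d_in) work for a graph with m edges, while the dense transformation contributes O(n * d_in * d_out) work; the paper's analysis of parallelism, communication volume, and pipelining is carried out over per-layer operators of this kind.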

