GraphTensor: Comprehensive GNN-Acceleration Framework for Efficient Parallel Processing of Massive Datasets

05/27/2023
by Junhyeok Jang, et al.

We present GraphTensor, a comprehensive open-source framework that supports efficient parallel neural network processing on large graphs. GraphTensor offers a set of easy-to-use programming primitives that account for both graph and neural network execution behaviors, from the beginning (graph sampling) to the end (dense data processing). Our framework runs diverse graph neural network (GNN) models in a destination-centric, feature-wise manner, which can significantly shorten training execution times on a GPU. In addition, GraphTensor rearranges multiple GNN kernels based on their system hyperparameters in a self-governing manner, further reducing the processing dimensionality and latency. From an end-to-end execution viewpoint, GraphTensor significantly shortens the service-level GNN latency by applying pipeline parallelism to graph dataset preprocessing. Our evaluation shows that GraphTensor delivers 1.4x better training performance than emerging GNN frameworks when executing large-scale, real-world graph workloads. For end-to-end services, GraphTensor reduces the training latency of an advanced version of these GNN frameworks (optimized for multi-threaded graph sampling) by 2.4x, on average.
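
The destination-centric, feature-wise execution described above can be pictured as gathering the sampled neighbors of each destination node and reducing them across the full feature dimension in one vectorized pass on the GPU. The following is a minimal PyTorch sketch of such a mean-aggregation step; it is not GraphTensor's actual API, and the CSR-style block layout (nbr_ids, nbr_offsets) and the function name are assumptions made for illustration.

    import torch

    def dst_centric_mean_aggregate(feat, nbr_ids, nbr_offsets):
        # Illustrative sketch only; names and layout are assumed, not GraphTensor's API.
        # feat:        [num_nodes, F] node feature matrix resident on the GPU
        # nbr_ids:     flattened indices of the sampled neighbors of all destinations
        # nbr_offsets: [num_dst + 1] CSR-style offsets into nbr_ids
        # returns:     [num_dst, F] aggregated features, one row per destination node
        num_dst = nbr_offsets.numel() - 1
        counts = nbr_offsets[1:] - nbr_offsets[:-1]   # neighbors per destination
        # Map every sampled neighbor back to the destination it belongs to.
        dst_index = torch.repeat_interleave(
            torch.arange(num_dst, device=feat.device), counts)
        # Gather neighbor features and reduce them per destination; the whole
        # feature dimension is processed at once, i.e., feature-wise execution.
        gathered = feat[nbr_ids]                      # [num_edges, F]
        out = torch.zeros(num_dst, feat.size(1),
                          device=feat.device, dtype=feat.dtype)
        out.index_add_(0, dst_index, gathered)
        return out / counts.clamp(min=1).unsqueeze(1).to(out.dtype)

Organizing the reduction around destinations, as in this sketch, is the execution style the abstract credits (together with kernel rearrangement and pipelined preprocessing) for the reported GPU training speedups.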
