MLPInit: Embarrassingly Simple GNN Training Acceleration with MLP Initialization

09/30/2022
by   Xiaotian Han, et al.

Training graph neural networks (GNNs) on large graphs is complex and extremely time-consuming. This is attributed to overheads caused by sparse matrix multiplication, which are sidestepped when training multi-layer perceptrons (MLPs) on node features alone. MLPs, by ignoring graph context, are simple and fast for graph data, but they usually sacrifice prediction accuracy, limiting their applications on graphs. We observe that for most message passing-based GNNs, we can trivially derive an analog MLP (we call this a PeerMLP) whose weights can be made identical, making us curious about how GNNs perform when using weights from a fully trained PeerMLP. Surprisingly, we find that GNNs initialized with such weights significantly outperform their PeerMLPs on graph data, motivating us to use PeerMLP training as a precursor initialization step for GNN training. To this end, we propose an embarrassingly simple, yet hugely effective, initialization method for GNN training acceleration, called MLPInit. Our extensive experiments on multiple large-scale graph datasets with diverse GNN architectures validate that MLPInit can accelerate the training of GNNs (up to 33x speedup on OGB-products) and often improve prediction performance (e.g., up to 7.97% improvement across 7 datasets for node classification, and up to 17.81% improvement across 4 datasets for link prediction in terms of Hits@10). Most importantly, MLPInit is extremely simple to implement and can be flexibly used as a plug-and-play initialization method for message passing-based GNNs.
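The core observation behind MLPInit can be sketched in a few lines: a GCN-style layer and its PeerMLP layer share identically shaped weight matrices, differing only in whether a neighbor-aggregation step is applied, so weights trained as an MLP can be copied directly into the GNN as an initialization. The following is a minimal NumPy sketch under toy shapes, not the paper's implementation; the names `mlp_layer`, `gcn_layer`, and `A_hat` are illustrative.

```python
import numpy as np

def mlp_layer(X, W):
    # PeerMLP layer: uses only node features, no graph structure
    return np.maximum(X @ W, 0.0)

def gcn_layer(A_hat, X, W):
    # GNN layer: same weight matrix W, plus neighbor aggregation via A_hat
    return np.maximum(A_hat @ X @ W, 0.0)

# Toy graph: 3 nodes, 2 features; A_hat stands in for a normalized adjacency
rng = np.random.default_rng(0)
A_hat = np.array([[0.5, 0.5, 0.0],
                  [0.25, 0.5, 0.25],
                  [0.0, 0.5, 0.5]])
X = rng.standard_normal((3, 2))

# Pretend these are the PeerMLP's weights after cheap MLP training
W_mlp = rng.standard_normal((2, 4))

# MLPInit: the trained PeerMLP weights initialize the GNN directly,
# since the two layers' weight matrices have identical shapes
W_gnn = W_mlp.copy()
out = gcn_layer(A_hat, X, W_gnn)
print(out.shape)  # (3, 4)
```

Because the weight shapes match exactly, the copy step is a plain assignment; GNN training then continues from this initialization instead of from random weights.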
