Graph Representation Learning with Individualization and Refinement

03/17/2022
by   Mohammed Haroon Dupty, et al.

Graph Neural Networks (GNNs) have emerged as prominent models for representation learning on graph-structured data. GNNs follow a message-passing approach analogous to the 1-dimensional Weisfeiler-Lehman (1-WL) test for graph isomorphism and are consequently limited by the distinguishing power of 1-WL. More expressive higher-order GNNs, which operate on k-tuples of nodes, require increased computational resources to process higher-order tensors. Instead of the WL approach, in this work we follow the classical approach of Individualization and Refinement (IR), the technique used by most practical isomorphism solvers. Individualization refers to artificially distinguishing a node in the graph, and refinement is the propagation of this information to other nodes through message passing. To keep the complexity manageable, we learn to adaptively select which nodes to individualize and to aggregate the resulting graphs after refinement. Our technique lets us learn richer node embeddings while keeping the computational cost under control. Theoretically, we show that our procedure is more expressive than the 1-WL test. Experiments show that our method outperforms prominent 1-WL GNN models as well as competitive higher-order baselines on several benchmark synthetic and real datasets. Furthermore, our method opens new doors for exploring the paradigm of learning on graph structures with individualization and refinement.
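To make the individualization-and-refinement idea from the abstract concrete, here is a minimal sketch (not the paper's learned model): classic 1-WL colour refinement on an adjacency list, where one node is "individualized" by receiving a unique initial colour before refinement propagates that distinction. All function names here are illustrative.

```python
from collections import Counter

def refine(adj, colors):
    """1-WL colour refinement: repeatedly recolour each node by its own
    colour together with the multiset of its neighbours' colours,
    until the colouring is stable."""
    n = len(adj)
    for _ in range(n):  # refinement stabilizes in at most n rounds
        sigs = {
            v: (colors[v],
                tuple(sorted(Counter(colors[u] for u in adj[v]).items())))
            for v in range(n)
        }
        # Canonicalize signatures back to small integer colours
        canon = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        relabeled = {v: canon[sigs[v]] for v in range(n)}
        if relabeled == colors:
            break
        colors = relabeled
    return colors

def individualize_and_refine(adj, v):
    """Individualization: artificially distinguish node v with a unique
    colour, then let refinement propagate the distinction."""
    colors = {u: 0 for u in range(len(adj))}
    colors[v] = 1
    return refine(adj, colors)

# Example: on a 6-cycle, plain 1-WL leaves every node the same colour,
# but individualizing node 0 breaks the symmetry.
cycle = [[(i - 1) % 6, (i + 1) % 6] for i in range(6)]
plain = refine(cycle, {u: 0 for u in range(6)})          # 1 colour class
ir = individualize_and_refine(cycle, 0)                  # 4 colour classes
```

This illustrates why IR yields richer node features than plain message passing: on highly symmetric graphs, 1-WL assigns identical colours to structurally distinct positions, whereas individualizing even a single node lets refinement separate them.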


research
08/02/2023

Factor Graph Neural Networks

In recent years, we have witnessed a surge of Graph Neural Networks (GNN...
research
10/02/2020

The Surprising Power of Graph Neural Networks with Random Node Initialization

Graph neural networks (GNNs) are effective models for representation lea...
research
07/29/2023

Graph Condensation for Inductive Node Representation Learning

Graph neural networks (GNNs) encounter significant computational challen...
research
06/19/2023

P-tensors: a General Formalism for Constructing Higher Order Message Passing Networks

Several recent papers have shown that higher order graph neural...
research
12/08/2021

Trainability for Universal GNNs Through Surgical Randomness

Message passing neural networks (MPNN) have provable limitations, which ...
research
12/06/2020

Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results

While message passing based Graph Neural Networks (GNNs) have become inc...
research
04/20/2022

Simplicial Attention Networks

Graph representation learning methods have mostly been limited to the mo...
