NCGNN: Node-level Capsule Graph Neural Network

12/07/2020
by Rui Yang, et al.

Message passing has evolved into an effective tool for designing Graph Neural Networks (GNNs). However, most existing works naively sum or average all the neighboring features to update node representations, which suffers from the following limitations: (1) lack of interpretability to identify crucial node features for the GNN's prediction; (2) the over-smoothing issue, where repeated averaging aggregates excessive noise, making the features of nodes in different classes over-mixed and thus indistinguishable. In this paper, we propose the Node-level Capsule Graph Neural Network (NCGNN) to address these issues with an improved message passing scheme. Specifically, NCGNN represents each node as a group of capsules, in which each capsule extracts distinctive features of its corresponding node. For each node-level capsule, a novel dynamic routing procedure is developed to adaptively select appropriate capsules for aggregation from a subgraph identified by the designed graph filter. Consequently, since only the advantageous capsules are aggregated and harmful noise is restrained, the over-mixing of features from interacting nodes in different classes tends to be avoided, relieving the over-smoothing issue. Furthermore, since the graph filter and the dynamic routing identify a subgraph and a subset of node features that are most influential for the model's prediction, NCGNN is inherently interpretable and does not require complex post-hoc explanation methods. Extensive experiments on six node classification benchmarks demonstrate that NCGNN can well address the over-smoothing issue and outperforms the state of the art by producing better node embeddings for classification.
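The abstract only sketches the routing mechanism, so here is a minimal, illustrative PyTorch sketch of dynamic routing over node-level capsules. It is a reconstruction under our own assumptions, not the authors' code: the function names, the softmax-over-neighbors weighting, and the fixed number of routing iterations are ours, and the paper's learned graph filter is stood in for by an explicit list of neighbor indices.

```python
# Hypothetical sketch of node-level capsule routing; names and design
# choices are illustrative assumptions, not the authors' implementation.
import torch
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Standard capsule squashing: shrink short vectors toward zero,
    keep long vectors just under unit length."""
    sq_norm = (s * s).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)


def route_node_capsules(u, neighbors, num_iters=3):
    """Aggregate neighbor capsules into a target node's capsules via
    dynamic routing.

    u:         (N, C, D) capsules for all N nodes (C capsules of dim D)
    neighbors: indices of the subgraph selected for the target node
               (playing the role of the paper's graph filter)
    Returns:   (C, D) updated capsules for the target node.
    """
    msgs = u[neighbors]                      # (k, C, D) candidate capsules
    b = torch.zeros(msgs.shape[0], msgs.shape[1])  # routing logits

    for _ in range(num_iters):
        c = F.softmax(b, dim=0)              # agreement weights per capsule
        s = (c.unsqueeze(-1) * msgs).sum(0)  # weighted sum -> (C, D)
        v = squash(s)                        # squashed output capsules
        # Raise logits for capsules that agree with the output, so
        # advantageous capsules dominate and noisy ones are suppressed.
        b = b + (msgs * v.unsqueeze(0)).sum(-1)

    return v


# Toy usage: 5 nodes, each with 4 capsules of dimension 8.
u = torch.randn(5, 4, 8)
out = route_node_capsules(u, neighbors=[1, 2, 4])
print(out.shape)  # torch.Size([4, 8])
```

In this sketch the per-iteration agreement weights perform the adaptive selection the abstract describes: capsules whose messages align with the emerging output are up-weighted, which is how the over-mixing of features from different classes would be restrained.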

Related research

02/03/2023 · Ordered GNN: Ordering Message Passing to Deal with Heterophily and Over-smoothing
Most graph neural networks follow the message passing mechanism. However...

03/20/2022 · LEReg: Empower Graph Neural Networks with Local Energy Regularization
Researches on analyzing graphs with Graph Neural Networks (GNNs) have be...

10/17/2022 · AMPNet: Attention as Message Passing for Graph Neural Networks
Feature-level interactions between nodes can carry crucial information f...

08/28/2020 · Graph Convolutional Neural Networks with Node Transition Probability-based Message Passing and DropNode Regularization
Graph convolutional neural networks (GCNNs) have received much attention...

06/03/2023 · Message-passing selection: Towards interpretable GNNs for graph classification
In this paper, we strive to develop an interpretable GNNs' inference par...

05/24/2022 · Not too little, not too much: a theoretical analysis of graph (over)smoothing
We analyze graph smoothing with mean aggregation, where each node succes...

12/16/2020 · Hierarchical Graph Capsule Network
Graph Neural Networks (GNNs) draw their strength from explicitly modelin...
