Related research:
- Hierarchical Message-Passing Graph Neural Networks
- Graph Convolutional Neural Networks with Node Transition Probability-based Message Passing and DropNode Regularization
- Graph Neural Networks with High-order Feature Interactions
- Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View
- Complete the Missing Half: Augmenting Aggregation Filtering with Diversification for Graph Convolutional Networks
- Hierarchical Graph Capsule Network
- Conv-MPN: Convolutional Message Passing Neural Network for Structured Outdoor Architecture Reconstruction
NCGNN: Node-level Capsule Graph Neural Network
Message passing has evolved into an effective tool for designing Graph Neural Networks (GNNs). However, most existing works naively sum or average all neighboring features to update node representations, which suffers from two limitations: (1) a lack of interpretability, since it is hard to identify which node features are crucial to the GNN's prediction; (2) over-smoothing, where repeated averaging aggregates excessive noise, so the features of nodes in different classes become over-mixed and thus indistinguishable. In this paper, we propose the Node-level Capsule Graph Neural Network (NCGNN) to address these issues with an improved message passing scheme. Specifically, NCGNN represents each node as a group of capsules, in which each capsule extracts distinctive features of its corresponding node. For each node-level capsule, a novel dynamic routing procedure is developed to adaptively select appropriate capsules for aggregation from the subgraph identified by the designed graph filter. Consequently, because only advantageous capsules are aggregated and harmful noise is restrained, the features of interacting nodes in different classes are less likely to be over-mixed, which relieves the over-smoothing issue. Furthermore, since the graph filter and the dynamic routing identify the subgraph and the subset of node features that are most influential for the model's prediction, NCGNN is inherently interpretable and does not rely on complex post-hoc explanation methods. Extensive experiments on six node classification benchmarks demonstrate that NCGNN effectively addresses the over-smoothing issue and outperforms the state of the art by producing better node embeddings for classification.
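Since the abstract only sketches the mechanism, the following is a minimal, illustrative PyTorch sketch of capsule-style message passing with dynamic routing in the spirit of NCGNN. It is not the authors' implementation: the capsule count, capsule dimension, squash nonlinearity, number of routing iterations, and the use of the plain 1-hop neighborhood in place of the paper's learned graph filter are all assumptions made for the sake of a short, runnable example.

```python
# Illustrative sketch only -- NOT the authors' NCGNN code. The dense
# adjacency, hyperparameters, and 1-hop "graph filter" are assumptions.
import torch
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Standard capsule squash: keeps direction, maps norm into [0, 1)."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)


class CapsuleMessagePassing(torch.nn.Module):
    def __init__(self, in_dim, num_caps=4, cap_dim=8, num_iters=3):
        super().__init__()
        self.num_caps, self.cap_dim, self.num_iters = num_caps, cap_dim, num_iters
        # Project raw node features into K capsules per node.
        self.to_caps = torch.nn.Linear(in_dim, num_caps * cap_dim)
        # One linear transform per (input capsule, output capsule) pair.
        self.W = torch.nn.Parameter(
            torch.randn(num_caps, num_caps, cap_dim, cap_dim) * 0.1)

    def forward(self, x, adj):
        # x: [N, in_dim]; adj: [N, N] dense 0/1 adjacency (self-loops included).
        N, K, D = x.size(0), self.num_caps, self.cap_dim
        caps = squash(self.to_caps(x).view(N, K, D))            # [N, K, D]
        # Prediction vectors u_hat[s, k_in, k_out] from every source node.
        u_hat = torch.einsum('nkd,kjde->nkje', caps, self.W)    # [N, K, K, D]
        # Routing logits per (target, source, k_in, k_out), masked to edges.
        fmask = adj[:, :, None, None]                           # [N, N, 1, 1]
        b = torch.zeros(N, N, K, K, device=x.device)
        for _ in range(self.num_iters):
            # Coupling coefficients: each source capsule distributes its
            # vote across the target's output capsules; non-edges get zero.
            c = F.softmax(b, dim=-1) * fmask                    # [N, N, K, K]
            # Aggregate neighbor predictions into target capsules and squash.
            s = torch.einsum('tskj,skjd->tjd', c, u_hat)        # [N, K, D]
            v = squash(s)
            # Agreement update: capsules whose predictions agree with the
            # output get larger coupling next iteration; noise is down-weighted.
            b = b + torch.einsum('skjd,tjd->tskj', u_hat, v) * fmask
        return v.reshape(N, K * D)


# Toy usage: 5 nodes, 16 input features, symmetric adjacency with self-loops.
x = torch.randn(5, 16)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t() + torch.eye(5)) > 0).float()
out = CapsuleMessagePassing(in_dim=16)(x, adj)  # [5, 32] node embeddings
```

In this sketch, the agreement update is what plays the role of "selecting appropriate capsules": neighbor capsules whose predictions disagree with the emerging output capsule receive smaller coupling coefficients on the next iteration, so their potentially noisy contribution is down-weighted instead of being blindly averaged in.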