
Graph Decipher: A transparent dual-attention graph neural network to understand the message-passing mechanism for the node classification

01/04/2022
by Yan Pang et al.
University of Colorado Denver

Graph neural networks can be applied effectively to solve real-world problems across widely diverse fields. Their success is linked to the message-passing mechanism on the graph; however, the message-aggregating behavior is still not entirely clear in most algorithms. To improve transparency, we propose a new network called Graph Decipher that investigates the message-passing mechanism by prioritizing its two main components, the graph structure and the node attributes, at the graph, feature, and global levels under the node classification task. The computation burden then becomes the most significant issue, because the relevance of both the graph structure and the node attributes must be computed over the whole graph. To address this issue, graph feature filters extract only the relevant representative node attributes, allowing the calculations to be performed in a category-oriented manner. Experiments on seven datasets show that Graph Decipher achieves state-of-the-art performance while imposing a substantially lower computation burden on the node classification task. Additionally, because the algorithm can explore representative node attributes by category, it is also used to alleviate the imbalanced node classification problem on multi-class graph datasets.
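The abstract refers to two ingredients: aggregating messages over the graph structure and filtering node attributes so that only a relevant, category-specific subset contributes. The snippet below is a minimal, generic sketch in plain PyTorch of that combination, not the authors' Graph Decipher implementation; the class name MaskedMessagePassing, the per-class feature_mask parameter, and the mean-aggregation rule are illustrative assumptions rather than details taken from the paper.

import torch
import torch.nn as nn

class MaskedMessagePassing(nn.Module):
    """Toy message-passing layer with a soft per-class attribute filter."""

    def __init__(self, in_dim: int, out_dim: int, num_classes: int):
        super().__init__()
        # One learnable attribute mask per class (hypothetical "feature filter").
        self.feature_mask = nn.Parameter(torch.zeros(num_classes, in_dim))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor, class_idx: int) -> torch.Tensor:
        # x:   (num_nodes, in_dim) node attribute matrix
        # adj: (num_nodes, num_nodes) dense adjacency (small graphs only)
        mask = torch.sigmoid(self.feature_mask[class_idx])  # soft attribute filter in (0, 1)
        x = x * mask                                         # keep mostly the "relevant" attributes
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        msg = adj @ x / deg                                  # mean-aggregate neighbor messages
        return torch.relu(self.lin(msg))

# Toy usage on a 4-node path-like graph with 8 attributes and 3 classes.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
x = torch.randn(4, 8)
layer = MaskedMessagePassing(in_dim=8, out_dim=16, num_classes=3)
out = layer(x, adj, class_idx=0)  # embeddings conditioned on the class-0 feature filter
print(out.shape)                  # torch.Size([4, 16])

Inspecting torch.sigmoid(layer.feature_mask) after training would show which attributes each class-specific filter keeps, which is the kind of per-category attribute relevance the abstract describes, though the actual mechanism in the paper may differ.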
