Unifying gradient regularization for Heterogeneous Graph Neural Networks

05/25/2023
by Xiao Yang, et al.

Heterogeneous Graph Neural Networks (HGNNs) are a class of powerful deep learning methods widely used to learn representations of heterogeneous graphs. Despite the rapid development of HGNNs, they still face challenges such as over-smoothing and non-robustness. Previous studies have shown that these problems can be mitigated by gradient regularization methods. However, existing gradient regularization methods focus on either graph topology or node features; there is no unified approach that integrates both, which severely limits the effectiveness of regularization. In addition, incorporating gradient regularization into HGNNs can introduce problems of its own, such as an unstable training process, increased model complexity, and insufficient coverage of regularized information. Furthermore, a complete theoretical analysis of the effects of gradient regularization on HGNNs is still lacking. In this paper, we propose a novel gradient regularization method called Grug, which iteratively applies regularization to the gradients generated by both the propagated messages and the node features during the message-passing process. Grug provides a unified framework integrating graph topology and node features, on the basis of which we conduct a detailed theoretical analysis of its effectiveness. Specifically, the analysis establishes four advantages of Grug: 1) it decreases sample variance during training (stability); 2) it enhances the generalization of the model (universality); 3) it reduces model complexity (simplicity); 4) it improves the integrity and diversity of graph-information utilization (diversity). As a result, Grug has the potential to surpass the theoretical upper bounds set by DropMessage (AAAI-23 Distinguished Papers). In addition, we evaluate Grug on five public real-world datasets with two downstream tasks...
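To make the core idea concrete, below is a minimal, self-contained PyTorch sketch of gradient regularization in the spirit of Grug: it penalizes the gradients of the task loss with respect to both the propagated messages and the raw node features within a message-passing step. This is an illustrative reading of the abstract, not the authors' implementation; the one-layer mean-aggregation model, the function name grug_style_loss, and the penalty weight lam are assumptions introduced for the example.

```python
# A minimal sketch of gradient regularization during message passing,
# in the spirit of Grug as described in the abstract. This is NOT the
# authors' implementation: the one-layer mean-aggregation model, the
# name grug_style_loss, and the penalty weight lam are illustrative
# assumptions.
import torch
import torch.nn.functional as F
from torch import nn


class OneLayerGNN(nn.Module):
    """A single message-passing layer with the propagation step kept
    explicit so that its gradient can be regularized."""

    def __init__(self, in_dim: int, n_classes: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, n_classes)

    def forward(self, x, adj):
        messages = adj @ x                  # propagated messages
        logits = self.lin(F.relu(messages))
        return logits, messages


def grug_style_loss(model, x, adj, y, lam=1e-3):
    """Task loss plus a penalty on the gradients of the loss w.r.t.
    both the propagated messages and the raw node features."""
    x = x.detach().requires_grad_(True)
    logits, messages = model(x, adj)
    task_loss = F.cross_entropy(logits, y)
    # create_graph=True keeps these gradients differentiable, so the
    # penalty itself is optimized by backpropagation.
    g_msg, g_feat = torch.autograd.grad(
        task_loss, (messages, x), create_graph=True
    )
    penalty = g_msg.pow(2).mean() + g_feat.pow(2).mean()
    return task_loss + lam * penalty


# Toy usage: 6 nodes, 4 features, 3 classes, row-normalized adjacency.
n, d, c = 6, 4, 3
x = torch.randn(n, d)
adj = torch.rand(n, n)
adj = adj / adj.sum(dim=1, keepdim=True)
y = torch.randint(0, c, (n,))
model = OneLayerGNN(d, c)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(5):
    opt.zero_grad()
    loss = grug_style_loss(model, x, adj, y)
    loss.backward()
    opt.step()
```

Note that the gradient with respect to x in this sketch includes the path through the messages; the abstract's "unified framework" is precisely about treating these two gradient sources, topology-driven messages and node features, within a single regularization scheme.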


Related research

04/21/2022 · DropMessage: Unifying Random Dropping for Graph Neural Networks
Graph Neural Networks (GNNs) are powerful tools for graph representation...

02/06/2023 · On Over-Squashing in Message Passing Neural Networks: The Impact of Width, Depth, and Topology
Message Passing Neural Networks (MPNNs) are instances of Graph Neural Ne...

04/03/2021 · Topological Regularization for Graph Neural Networks Augmentation
The complexity and non-Euclidean structure of graph data hinder the deve...

04/21/2022 · Detecting Topology Attacks against Graph Neural Networks
Graph neural networks (GNNs) have been widely used in many real applicat...

06/09/2021 · Scaling Up Graph Neural Networks Via Graph Coarsening
Scalability of graph neural networks remains one of the major challenges...

11/13/2021 · Learning to Evolve on Dynamic Graphs
Representation learning in dynamic graphs is a challenging problem becau...

02/18/2022 · Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate Heterogeneous Graph Neural Network
Heterogeneous Graph Neural Network (HGNN) has been successfully employed...
