Preventing Over-Smoothing for Hypergraph Neural Networks

03/31/2022 · by Guanzi Chen, et al.

In recent years, hypergraph learning has attracted great attention due to its capacity to represent complex, high-order relationships. However, current neural network approaches designed for hypergraphs are mostly shallow, which limits their ability to extract information from high-order neighbors. In this paper, we show both theoretically and empirically that the performance of hypergraph neural networks does not improve as the number of layers increases, a phenomenon known as the over-smoothing problem. To tackle this issue, we develop a new deep hypergraph convolutional network called Deep-HGCN, which can maintain the heterogeneity of node representations in deep layers. Specifically, we prove that a k-layer Deep-HGCN simulates a polynomial filter of order k with arbitrary coefficients, which relieves the over-smoothing problem. Experimental results on various datasets demonstrate the superior performance of the proposed model compared with state-of-the-art hypergraph learning approaches.
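The ideas in the abstract can be illustrated with a small numerical sketch. The code below is not the paper's implementation; it is a hypothetical layer combining the standard HGNN propagation operator Θ = Dv^(-1/2) H W De^(-1) Hᵀ Dv^(-1/2) with an initial-residual term (weight α) and an identity mapping (weight β), the usual recipe for keeping node representations distinct in deep stacks. The function names and the α/β parametrization are assumptions for illustration only.

```python
import numpy as np

def hypergraph_adjacency(H, edge_weights=None):
    """Normalized hypergraph operator Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.

    H is the |V| x |E| incidence matrix (H[v, e] = 1 if node v is in hyperedge e).
    """
    n, m = H.shape
    w = np.ones(m) if edge_weights is None else np.asarray(edge_weights, float)
    Dv = H @ w                                   # node degrees
    De = H.sum(axis=0)                           # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(Dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(De, 1e-12))
    return Dv_inv_sqrt @ H @ np.diag(w) @ De_inv @ H.T @ Dv_inv_sqrt

def deep_layer(X, X0, Theta, W, alpha=0.1, beta=0.5):
    """Hypothetical deep hypergraph layer with initial residual (alpha)
    and identity mapping (beta); a sketch, not the paper's Deep-HGCN."""
    P = (1 - alpha) * (Theta @ X) + alpha * X0   # re-inject the input features
    d = W.shape[0]
    return np.maximum(0.0, P @ ((1 - beta) * np.eye(d) + beta * W))  # ReLU

# Toy hypergraph: 4 nodes, 2 hyperedges.
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 1]], dtype=float)
Theta = hypergraph_adjacency(H)
rng = np.random.default_rng(0)
X0 = rng.standard_normal((4, 8))                 # initial node features
W = 0.1 * rng.standard_normal((8, 8))
X = X0
for _ in range(16):                              # a deep stack of layers
    X = deep_layer(X, X0, Theta, W)
```

Without the α-residual term, repeatedly applying Θ drives all node rows toward a common vector (over-smoothing); the residual keeps them anchored to the input features even at depth 16.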

Related research

- Hypergraph Convolution and Hypergraph Attention (01/23/2019): Recently, graph neural networks have attracted great attention and achie...
- Deep Hypergraph Structure Learning (08/26/2022): Learning on high-order correlation has shown superiority in data represe...
- SAHDL: Sparse Attention Hypergraph Regularized Dictionary Learning (10/23/2020): In recent years, the attention mechanism contributes significantly to hy...
- Residual Enhanced Multi-Hypergraph Neural Network (05/02/2021): Hypergraphs are a generalized data structure of graphs to model higher-o...
- Hypergraph Neural Networks (09/25/2018): In this paper, we present a hypergraph neural networks (HGNN) framework ...
- Scientific Paper Classification Based on Graph Neural Network with Hypergraph Self-attention Mechanism (10/07/2022): The number of scientific papers has increased rapidly in recent years. H...
- Hyper-GST: Predict Metro Passenger Flow Incorporating GraphSAGE, Hypergraph, Social-meaningful Edge Weights and Temporal Exploitation (11/09/2022): Predicting metro passenger flow precisely is of great importance for dyn...
