HINormer: Representation Learning On Heterogeneous Information Networks with Graph Transformer

02/22/2023
by Qiheng Mao, et al.

Recent studies have highlighted the limitations of message-passing graph neural networks (GNNs), e.g., limited model expressiveness, over-smoothing, and over-squashing. To alleviate these issues, Graph Transformers (GTs) have been proposed, which work in a paradigm that extends message passing to a larger coverage, even across the whole graph. Hinging on a global-range attention mechanism, GTs have shown strong capability for representation learning on homogeneous graphs. However, the investigation of GTs on heterogeneous information networks (HINs) remains under-explored. In particular, owing to their heterogeneity, HINs exhibit distinct data characteristics and thus require different treatment. To bridge this gap, in this paper we investigate representation learning on HINs with Graph Transformers and propose a novel model named HINormer, which capitalizes on a larger-range aggregation mechanism for node representation learning. Assisted by two major modules, namely a local structure encoder and a heterogeneous relation encoder, HINormer captures both the structural and heterogeneous information of nodes in HINs to produce comprehensive node representations. Extensive experiments on four HIN benchmark datasets demonstrate that the proposed model outperforms the state-of-the-art.
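
To make the described architecture concrete, below is a minimal PyTorch sketch of how a local structure encoder and a heterogeneous relation encoder could feed a global-attention Transformer for node classification. This is an assumption-based toy, not the authors' implementation: the class names (HINormerSketch, LocalStructureEncoder, HeterogeneousRelationEncoder), the adjacency-power aggregation, and the type-embedding smoothing are illustrative choices, and details such as how node sequences are constructed differ in the paper.

```python
# Hypothetical sketch of a HINormer-style model (not the authors' code):
# structure-aware and relation-aware node features are summed, then a
# Transformer applies attention across all nodes at once.
import torch
import torch.nn as nn


class LocalStructureEncoder(nn.Module):
    """Aggregates multi-hop neighborhood features via normalized adjacency powers."""

    def __init__(self, in_dim, hid_dim, num_hops=2):
        super().__init__()
        self.num_hops = num_hops
        self.proj = nn.Linear(in_dim * (num_hops + 1), hid_dim)

    def forward(self, x, adj_norm):
        # x: [N, in_dim], adj_norm: row-normalized adjacency [N, N]
        feats = [x]
        h = x
        for _ in range(self.num_hops):
            h = adj_norm @ h          # propagate features one hop further
            feats.append(h)
        return self.proj(torch.cat(feats, dim=-1))


class HeterogeneousRelationEncoder(nn.Module):
    """Embeds node types and smooths them over neighborhoods to expose heterogeneity."""

    def __init__(self, num_types, hid_dim, num_hops=2):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, hid_dim)
        self.num_hops = num_hops

    def forward(self, node_types, adj_norm):
        t = self.type_emb(node_types)  # [N, hid_dim]
        for _ in range(self.num_hops):
            t = adj_norm @ t           # mix type signals over neighbors
        return t


class HINormerSketch(nn.Module):
    """Global-attention Transformer over structure- and relation-aware node features."""

    def __init__(self, in_dim, hid_dim, num_types, num_classes,
                 num_layers=2, num_heads=4):
        super().__init__()
        self.struct_enc = LocalStructureEncoder(in_dim, hid_dim)
        self.rel_enc = HeterogeneousRelationEncoder(num_types, hid_dim)
        layer = nn.TransformerEncoderLayer(
            d_model=hid_dim, nhead=num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(hid_dim, num_classes)

    def forward(self, x, node_types, adj_norm):
        h = self.struct_enc(x, adj_norm) + self.rel_enc(node_types, adj_norm)
        # Treat all nodes as one sequence so attention spans the whole graph.
        h = self.transformer(h.unsqueeze(0)).squeeze(0)
        return self.classifier(h)


if __name__ == "__main__":
    N, in_dim = 6, 8
    adj = torch.rand(N, N).round()                       # random toy graph
    adj_norm = adj / adj.sum(dim=1, keepdim=True).clamp(min=1)
    x = torch.randn(N, in_dim)
    node_types = torch.randint(0, 3, (N,))
    model = HINormerSketch(in_dim, hid_dim=16, num_types=3, num_classes=4)
    print(model(x, node_types, adj_norm).shape)          # torch.Size([6, 4])
```

Attending over all nodes in one sequence, as in this toy, scales quadratically with graph size; graph Transformer methods of this kind therefore typically restrict attention to sampled node sequences or contexts in practice.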
