Heterformer: A Transformer Architecture for Node Representation Learning on Heterogeneous Text-Rich Networks

05/20/2022
by   Bowen Jin, et al.

We study node representation learning on heterogeneous text-rich networks, where nodes and edges are multi-typed and some types of nodes are associated with text information. Although recent studies on graph neural networks (GNNs) and pretrained language models (PLMs) have demonstrated their power in encoding network and text signals, respectively, little attention has been paid to carefully coupling these two types of models on heterogeneous text-rich networks. Specifically, existing GNNs rarely model the text in each node in a contextualized way, and existing PLMs can hardly be applied to characterize graph structures due to their sequential architecture. In this paper, we propose Heterformer, a heterogeneous GNN-nested transformer that blends GNNs and PLMs into a unified model. Different from previous "cascaded architectures" that directly add GNN layers on top of a PLM, Heterformer alternately stacks two modules - a graph-attention-based neighbor aggregation module and a transformer-based text and neighbor joint encoding module - to facilitate thorough mutual enhancement between network and text signals. Meanwhile, Heterformer is capable of characterizing network heterogeneity and nodes without text information. Comprehensive experiments on three large-scale datasets from different domains demonstrate the superiority of Heterformer over state-of-the-art baselines in link prediction, transductive/inductive node classification, node clustering, and semantics-based retrieval.
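To make the alternation concrete, here is a minimal sketch of one such block in PyTorch: a graph-attention step aggregates neighbor embeddings into a single vector, which is then prepended as a virtual token so a standard transformer layer can jointly encode network and text signals. All names, dimensions, and design details here are assumptions for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class HeterformerLayerSketch(nn.Module):
    """Hypothetical sketch of one GNN-nested transformer block:
    (1) graph-attention neighbor aggregation, then
    (2) transformer-based joint encoding of text tokens and the
    aggregated neighborhood. Details are assumed, not from the paper."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # (1) Neighbor aggregation: the node's summary token attends
        # over the embeddings of its (possibly multi-typed) neighbors.
        self.neighbor_attn = nn.MultiheadAttention(
            d_model, n_heads, batch_first=True
        )
        # (2) Joint encoding over [virtual neighbor token; text tokens],
        # letting self-attention mix network and text signals in-block.
        self.encoder = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )

    def forward(self, token_states: torch.Tensor, neighbor_embs: torch.Tensor):
        # token_states:  (batch, seq_len, d_model) hidden states of node text
        # neighbor_embs: (batch, n_neighbors, d_model) neighbor embeddings
        cls = token_states[:, :1, :]  # node summary = first ([CLS]-like) token
        agg, _ = self.neighbor_attn(cls, neighbor_embs, neighbor_embs)
        # Prepend the aggregated neighborhood as a virtual token.
        joint = torch.cat([agg, token_states], dim=1)
        out = self.encoder(joint)
        # Return updated token states and the node's summary embedding.
        return out[:, 1:, :], out[:, 1, :]
```

Stacking several such blocks, with fresh neighbor aggregation before each joint-encoding step, approximates the alternation the abstract describes, as opposed to a cascaded design that only applies GNN layers after the PLM has finished encoding the text.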

