Relphormer: Relational Graph Transformer for Knowledge Graph Representation

05/22/2022
by Zhen Bi, et al.

Transformers have achieved remarkable performance across a wide range of fields, including natural language processing, computer vision, and graph mining. However, in knowledge graph representation, where the translational distance paradigm dominates, vanilla Transformer architectures have not yielded promising improvements. Vanilla Transformers struggle to capture the intrinsic semantic and structural information of knowledge graphs, and they can hardly scale to long-distance neighbors because self-attention grows quadratically with sequence length. To this end, we propose a new variant of the Transformer for knowledge graph representation, dubbed Relphormer. Specifically, we introduce Triple2Seq, which dynamically samples contextualized sub-graph sequences as the Transformer's input to alleviate the scalability issue. We then propose a novel structure-enhanced self-attention mechanism to encode relational information while preserving the global semantic information among sub-graphs. Moreover, we propose masked knowledge modeling as a new paradigm for knowledge graph representation learning that unifies different link prediction tasks. Experimental results show that our approach outperforms baseline methods on benchmark datasets.
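For intuition, the sketch below illustrates one plausible reading of structure-enhanced self-attention: a bias derived from the sub-graph's structural (e.g., adjacency) matrix is added to the scaled dot-product attention logits before the softmax. This is a minimal PyTorch sketch, not the authors' released implementation; the class name, the `structure` argument, and the learnable per-head `bias_scale` weighting are illustrative assumptions.

```python
# Minimal sketch (an assumption, not the paper's code) of self-attention
# with an additive structural bias on the attention logits.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructureEnhancedSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)
        # Learnable per-head weight on the structural bias (our assumption).
        self.bias_scale = nn.Parameter(torch.ones(num_heads))

    def forward(self, x: torch.Tensor, structure: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) tokens of a sampled sub-graph sequence
        # structure: (batch, seq_len, seq_len) structural matrix, e.g. adjacency
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product logits plus the per-head structural bias.
        logits = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
        logits = logits + self.bias_scale.view(1, -1, 1, 1) * structure.unsqueeze(1)
        attn = F.softmax(logits, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)

# Example: a sub-graph sequence of 5 tokens with a binary adjacency bias.
layer = StructureEnhancedSelfAttention(dim=64, num_heads=4)
x = torch.randn(2, 5, 64)
adj = torch.randint(0, 2, (2, 5, 5)).float()
out = layer(x, adj)  # (2, 5, 64)
```

Under this reading, the structural bias steers attention toward structurally connected tokens of the sampled sub-graph, while the dot-product term carries the semantic signal.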

Related research

10/25/2021 · Gophormer: Ego-Graph Transformer for Node Classification
08/28/2020 · HittER: Hierarchical Transformers for Knowledge Graph Embeddings
02/18/2022 · Unleashing the Power of Transformer for Graphs
05/20/2022 · A Unified and Biologically-Plausible Relational Graph Representation of Vision Transformers
12/17/2020 · A Generalization of Transformer Networks to Graphs
07/10/2021 · PatentMiner: Patent Vacancy Mining via Context-enhanced and Knowledge-guided Graph Attention
06/09/2021 · Do Transformers Really Perform Bad for Graph Representation?
