Semi-supervised Training for Knowledge Base Graph Self-attention Networks on Link Prediction

09/03/2022
by Shuanglong Yao, et al.

The task of link prediction aims to address the incompleteness of knowledge bases that arises because facts are difficult to collect from the real world. GCN-based models, with their sophisticated structure, are widely applied to link prediction, but they suffer from two problems in their structure and training process: 1) the transformation methods of the GCN layers in GCN-based knowledge representation models have become increasingly complex; 2) because the knowledge graph collection process is incomplete, many uncollected true facts are hidden among the labeled negative samples. This paper therefore investigates the characteristics of the information aggregation coefficients (self-attention) of adjacent nodes and redesigns the self-attention mechanism of the GAT structure. In addition, inspired by human thinking habits, we design a semi-supervised self-training method applied over pre-trained models. Experimental results on the benchmark datasets FB15k-237 and WN18RR show that the proposed self-attention mechanism and semi-supervised self-training method effectively improve performance on the link prediction task; on FB15k-237, for example, the proposed method improves Hits@1 by about 30%.
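To make the semi-supervised self-training idea concrete, the sketch below promotes confidently scored "negative" triples to pseudo-positives and fine-tunes a pre-trained link-prediction scorer on the relabeled data. This is a minimal sketch, not the authors' released code: the TinyScorer model (a DistMult-style stand-in for the paper's GAT-based model), the 0.9 confidence threshold, the round and epoch counts, and the (head, relation, tail) index format are illustrative assumptions.

```python
# Hedged sketch of semi-supervised self-training over a pre-trained link-prediction scorer.
# Model, threshold, and data layout are assumptions for illustration only.
import torch
import torch.nn as nn

class TinyScorer(nn.Module):
    """Minimal DistMult-style scorer standing in for a pre-trained GAT-based model."""
    def __init__(self, n_ent, n_rel, dim=32):
        super().__init__()
        self.ent = nn.Embedding(n_ent, dim)
        self.rel = nn.Embedding(n_rel, dim)

    def forward(self, h, r, t):
        # Higher score -> the triple (h, r, t) is judged more plausible.
        return (self.ent(h) * self.rel(r) * self.ent(t)).sum(-1)

def self_train(model, positives, negatives, threshold=0.9, rounds=3, epochs=50):
    """Promote high-confidence sampled negatives to pseudo-positives, then fine-tune."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    bce = nn.BCEWithLogitsLoss()
    pos, neg = positives.clone(), negatives.clone()
    for _ in range(rounds):
        with torch.no_grad():
            # Confidence that each labeled "negative" is actually an uncollected true fact.
            conf = torch.sigmoid(model(*neg.T))
        keep = conf < threshold
        pos = torch.cat([pos, neg[~keep]])   # pseudo-label confident negatives as positives
        neg = neg[keep]
        triples = torch.cat([pos, neg])
        labels = torch.cat([torch.ones(len(pos)), torch.zeros(len(neg))])
        for _ in range(epochs):              # fine-tune on the relabeled data
            opt.zero_grad()
            loss = bce(model(*triples.T), labels)
            loss.backward()
            opt.step()
    return model

# Toy usage with random (head, relation, tail) index triples.
model = TinyScorer(n_ent=50, n_rel=10)
positives = torch.randint(0, 50, (200, 3)); positives[:, 1] %= 10
negatives = torch.randint(0, 50, (400, 3)); negatives[:, 1] %= 10
self_train(model, positives, negatives)
```

In the paper's setting, the scorer would be the pre-trained model with the redesigned GAT self-attention, and the promoted triples correspond to uncollected true facts hidden among the labeled negative samples.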

Related research

08/23/2020  TSAM: Temporal Link Prediction in Directed Networks based on Self-Attention Mechanism
The development of graph neural networks (GCN) makes it possible to lear...

02/27/2020  DSSLP: A Distributed Framework for Semi-supervised Link Prediction
Link prediction is widely used in a variety of industrial applications, ...

12/20/2021  Self-attention Presents Low-dimensional Knowledge Graph Embeddings for Link Prediction
Recently, link prediction problem, also known as knowledge graph complet...

05/11/2023  HAHE: Hierarchical Attention for Hyper-Relational Knowledge Graphs in Global and Local Level
Link Prediction on Hyper-relational Knowledge Graphs (HKG) is a worthwhi...

11/11/2020  VStreamDRLS: Dynamic Graph Representation Learning with Self-Attention for Enterprise Distributed Video Streaming Solutions
Live video streaming has become a mainstay as a standard communication s...

12/13/2021  Split GCN: Effective Interactive Annotation for Segmentation of Disconnected Instance
Annotating object boundaries by humans demands high costs. Recently, pol...

02/22/2023  Do We Really Need Complicated Model Architectures For Temporal Networks?
Recurrent neural network (RNN) and self-attention mechanism (SAM) are th...
