Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction

05/22/2023
by Adrian Kochsiek, et al.

We propose KGT5-context, a simple sequence-to-sequence model for link prediction (LP) in knowledge graphs (KGs). Our work builds on KGT5, a recent LP model that exploits the textual features of the KG, has a small model size, and is scalable. To reach good predictive performance, however, KGT5 relies on an ensemble with a knowledge graph embedding (KGE) model, which itself is excessively large and costly to use. In this short paper, we show empirically that adding contextual information, i.e., information about the direct neighborhood of a query vertex, alleviates the need for a separate KGE model to obtain good performance. The resulting KGT5-context model achieves state-of-the-art performance in our experimental study while significantly reducing model size.
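The key idea, verbalizing a link-prediction query together with the one-hop neighborhood of the query vertex and letting a sequence-to-sequence model decode the answer entity as text, can be illustrated with a minimal sketch. The prompt format, the entity and relation strings, the `max_neighbors` cap, and the `t5-small` checkpoint below are illustrative assumptions, not the exact serialization or model used by KGT5-context.

```python
# Hedged sketch: verbalize a LP query plus neighborhood context for a T5-style model.
# Prompt layout, toy facts, and checkpoint are assumptions for illustration only.
from transformers import T5ForConditionalGeneration, T5Tokenizer


def verbalize_query(subject, relation, neighbors, max_neighbors=50):
    """Build a textual input of the form
    'predict tail: <subject> <relation> | context: <r1>: <o1> | <r2>: <o2> ...'."""
    context = " | ".join(f"{r}: {o}" for r, o in neighbors[:max_neighbors])
    return f"predict tail: {subject} {relation} | context: {context}"


# Toy one-hop neighborhood of the query subject (assumed data).
neighbors = [
    ("occupation", "theoretical physicist"),
    ("award received", "Nobel Prize in Physics"),
    ("educated at", "ETH Zurich"),
]
query_text = verbalize_query("Albert Einstein", "place of birth", neighbors)

# Stand-in checkpoint; the actual model would be fine-tuned on verbalized KG triples.
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

inputs = tokenizer(query_text, return_tensors="pt", truncation=True)
# Decode several candidate answer entities directly as text.
outputs = model.generate(
    **inputs, do_sample=True, num_return_sequences=4, max_new_tokens=16
)
print([tokenizer.decode(o, skip_special_tokens=True) for o in outputs])
```

Capping the number of verbalized neighbors keeps the input within the encoder's length budget; the cap of 50 above is an arbitrary choice for the sketch.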
