Decompressing Knowledge Graph Representations for Link Prediction

by Xiang Kong, et al.

This paper studies the problem of predicting missing relationships between entities in knowledge graphs by learning their representations. Most existing link prediction models employ simple but intuitive scoring functions and relatively small embedding sizes so that they can be applied to large-scale knowledge graphs. However, these properties also restrict their ability to learn expressive and robust features. Therefore, diverging from most prior work, which focuses on designing new objective functions, we propose DeCom, a simple but effective mechanism that boosts the performance of existing link predictors such as DistMult and ComplEx by extracting more expressive features while preventing overfitting, at the cost of only a few extra parameters. Specifically, embeddings of entities and relationships are first decompressed into a more expressive and robust space by decompression functions, and knowledge graph embedding models are then trained in this new feature space. Experimental results on several benchmark knowledge graphs and advanced link prediction systems demonstrate the generality and effectiveness of our method. In particular, RESCAL + DeCom achieves state-of-the-art performance on the FB15k-237 benchmark across all evaluation metrics. In addition, we show that, compared with DeCom, explicitly increasing the embedding size significantly increases the number of parameters without yielding a comparable performance improvement.
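The mechanism described in the abstract, small base embeddings expanded by a learned decompression function before scoring, can be sketched as follows. This is an illustrative PyTorch sketch, not the authors' implementation: the class, parameter names, and the choice of a single linear-plus-ReLU decompression layer are assumptions; DistMult is used as the base scorer.

```python
import torch
import torch.nn as nn

class DeComDistMult(nn.Module):
    """Illustrative sketch: DistMult scored in a 'decompressed' space.

    Entities and relations keep small base embeddings (base_dim); a
    shared decompression function with only a few extra parameters
    maps them to a larger feature space (decom_dim) before scoring.
    """

    def __init__(self, n_entities, n_relations, base_dim=100, decom_dim=400):
        super().__init__()
        self.ent = nn.Embedding(n_entities, base_dim)
        self.rel = nn.Embedding(n_relations, base_dim)
        # Hypothetical decompression functions: shared across all
        # entities/relations, so the parameter overhead is small.
        self.decom_e = nn.Sequential(nn.Linear(base_dim, decom_dim), nn.ReLU())
        self.decom_r = nn.Sequential(nn.Linear(base_dim, decom_dim), nn.ReLU())

    def score(self, h, r, t):
        # DistMult-style trilinear score, computed on the decompressed
        # features: <f(e_h), g(w_r), f(e_t)>.
        eh = self.decom_e(self.ent(h))
        wr = self.decom_r(self.rel(r))
        et = self.decom_e(self.ent(t))
        return (eh * wr * et).sum(dim=-1)

model = DeComDistMult(n_entities=10, n_relations=5)
scores = model.score(torch.tensor([0, 1]), torch.tensor([0, 1]), torch.tensor([2, 3]))
```

Under this reading, replacing the scorer with RESCAL or ComplEx only changes the final trilinear product, while the decompression layers stay the same, which is what makes the mechanism applicable to multiple existing link predictors.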


MDistMult: A Multiple Scoring Functions Model for Link Prediction on Antiviral Drugs Knowledge Graph

Knowledge graphs (KGs) on COVID-19 have been constructed to accelerate t...

Augmenting Knowledge Graphs for Better Link Prediction

Embedding methods have demonstrated robust performance on the task of li...

NePTuNe: Neural Powered Tucker Network for Knowledge Graph Completion

Knowledge graphs link entities through relations to provide a structured...

Self-attention Presents Low-dimensional Knowledge Graph Embeddings for Link Prediction

Recently, link prediction problem, also known as knowledge graph complet...

Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction

We propose KGT5-context, a simple sequence-to-sequence model for link pr...

Embedding Cardinality Constraints in Neural Link Predictors

Neural link predictors learn distributed representations of entities and...

Runtime Performances Benchmark for Knowledge Graph Embedding Methods

This paper wants to focus on providing a characterization of the runtime...
