On the Knowledge Graph Completion Using Translation Based Embedding: The Loss Is as Important as the Score

09/02/2019
by   Mojtaba Nayyeri, et al.

Knowledge graphs (KGs) represent the world's facts in structured form. KG completion exploits the existing facts in a KG to discover new ones. The translation-based embedding model (TransE) is a prominent formulation for KG completion. Despite its efficiency in memory and time, TransE suffers from several limitations in encoding relation patterns such as many-to-many, symmetric, and reflexive relations. To tackle this problem, most attempts have revolved around revising the score function of TransE, i.e., proposing more complicated score functions such as TransA, TransD, TransG, TransH, TransR, etc., to mitigate these limitations. In this paper, we tackle the problem from a different perspective. We theoretically investigate the main limitations of TransE in the light of the loss function rather than the score function. To the best of our knowledge, this has not yet been investigated comprehensively. We show that with a proper selection of the loss function for training the TransE model, the main limitations of the model are mitigated. This is explained by setting an upper bound on the scores of positive samples, which defines the region of truth (i.e., the region in which a triple is considered positive by the model). Our theoretical proofs, together with experimental results, fill the gap between the capability of the translation-based class of embedding models and the loss function. The theory emphasizes the importance of the choice of loss function for training the models. Our experimental evaluation of different loss functions used for training the models supports our theoretical proofs and confirms the importance of the loss function for performance.
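To make the distinction concrete, the following is a minimal NumPy sketch (not the authors' code) of the TransE score together with two training losses: the standard margin ranking loss, which constrains only the gap between positive and negative scores, and a limit-based loss that additionally places an explicit upper bound (gamma1 below) on the scores of positive triples, the kind of bound that defines the region of truth discussed above. Function names and hyperparameter values are illustrative.

    import numpy as np

    def transe_score(h, r, t):
        # TransE plausibility score: distance ||h + r - t||_2 (lower = more plausible)
        return np.linalg.norm(h + r - t, axis=-1)

    def margin_ranking_loss(pos_score, neg_score, gamma=1.0):
        # Standard margin ranking loss: only the difference between positive and
        # negative scores is penalized, so positive scores have no explicit upper bound.
        return np.maximum(0.0, gamma + pos_score - neg_score)

    def limit_based_loss(pos_score, neg_score, gamma1=0.5, gamma2=1.5, lam=1.0):
        # Limit-based loss: positive scores are pushed below the upper bound gamma1
        # (the "region of truth"), while negative scores are pushed above gamma2.
        return np.maximum(0.0, pos_score - gamma1) + lam * np.maximum(0.0, gamma2 - neg_score)

    # Toy usage with random 50-dimensional embeddings and one corrupted tail.
    rng = np.random.default_rng(0)
    h, r, t = rng.normal(size=(3, 50))
    t_corrupted = rng.normal(size=50)
    pos, neg = transe_score(h, r, t), transe_score(h, r, t_corrupted)
    print(margin_ranking_loss(pos, neg), limit_based_loss(pos, neg))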

Related research

07/09/2019 · Adaptive Margin Ranking Loss for Knowledge Graph Embeddings via a Correntropy Objective Function
Translation-based embedding models have gained significant attention in ...

12/04/2015 · Locally Adaptive Translation for Knowledge Graph Embedding
Knowledge graph embedding aims to represent entities and relations in a ...

04/12/2020 · Exploring Effects of Random Walk Based Minibatch Selection Policy on Knowledge Graph Completion
In this paper, we have explored the effects of different minibatch sampl...

07/06/2019 · Diachronic Embedding for Temporal Knowledge Graph Completion
Knowledge graphs (KGs) typically contain temporal facts indicating relat...

06/15/2023 · Relation-Aware Network with Attention-Based Loss for Few-Shot Knowledge Graph Completion
Few-shot knowledge graph completion (FKGC) task aims to predict unseen f...

03/01/2023 · Enhancing Knowledge Graph Embedding Models with Semantic-driven Loss Functions
Knowledge graph embedding models (KGEMs) are used for various tasks rela...

01/07/2022 · Stay Positive: Knowledge Graph Embedding Without Negative Sampling
Knowledge graphs (KGs) are typically incomplete and we often wish to inf...
