RatE: Relation-Adaptive Translating Embedding for Knowledge Graph Completion
Many graph embedding approaches have been proposed for knowledge graph completion via link prediction. Among them, translating embedding approaches enjoy the advantages of a lightweight structure, high efficiency, and good interpretability. Especially when extended to complex vector space, they can handle various relation patterns, including symmetry, antisymmetry, inversion, and composition. However, previous translating embedding approaches defined in complex vector space suffer from two main issues: 1) the representing and modeling capacities of the model are limited by a translation function that uses the rigid multiplication of two complex numbers; and 2) the embedding ambiguity caused by one-to-many relations is not explicitly alleviated. In this paper, we propose a relation-adaptive translation function built upon a novel weighted product in complex space, where the weights are learnable, relation-specific, and independent of the embedding size. The translation function requires only eight additional scalar parameters per relation, but improves expressive power and alleviates the embedding ambiguity problem. Based on this function, we then present our Relation-adaptive translating Embedding (RatE) approach to score each graph triple. Moreover, a novel negative sampling method is proposed that utilizes both prior knowledge and self-adversarial learning for effective optimization. Experiments verify that RatE achieves state-of-the-art performance on four link prediction benchmarks.
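As a rough illustration of the idea described above, the weighted product can be sketched in NumPy. This is an assumption-laden reading of the abstract, not the paper's exact formulation: the standard complex product (a+bi)(c+di) = (ac−bd) + (ad+bc)i involves four bilinear terms, and we assume the relation-adaptive version reweights those four terms separately in the real and imaginary parts, which accounts for the "eight additional scalar parameters per relation". The function and parameter names here are hypothetical.

```python
import numpy as np

def weighted_complex_product(h_re, h_im, r_re, r_im, w):
    """Hypothetical sketch of a relation-adaptive weighted product in
    complex space. The four bilinear terms ac, bd, ad, bc of the standard
    complex product are reweighted by eight learnable, relation-specific
    scalars w[0..7] (four for the real part, four for the imaginary part).
    The paper's exact form may differ."""
    ac, bd = h_re * r_re, h_im * r_im
    ad, bc = h_re * r_im, h_im * r_re
    out_re = w[0] * ac + w[1] * bd + w[2] * ad + w[3] * bc
    out_im = w[4] * ac + w[5] * bd + w[6] * ad + w[7] * bc
    return out_re, out_im

def rate_score(head, rel, tail, w):
    """Translation-style score: L1 distance between the relation-adapted
    head embedding and the tail embedding (lower = more plausible).
    An illustrative scoring choice, not necessarily the paper's."""
    p_re, p_im = weighted_complex_product(head[0], head[1], rel[0], rel[1], w)
    return float(np.sum(np.abs(p_re - tail[0]) + np.abs(p_im - tail[1])))
```

Note that the rigid complex multiplication the abstract criticizes is recovered as the special case `w = [1, -1, 0, 0, 0, 0, 1, 1]`, so the weighted product strictly generalizes it.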