A Relational Tucker Decomposition for Multi-Relational Link Prediction

02/03/2019
by Yanjie Wang, et al.

We propose the Relational Tucker3 (RT) decomposition for multi-relational link prediction in knowledge graphs. We show that many existing knowledge graph embedding models are special cases of the RT decomposition with certain predefined sparsity patterns in its components. In contrast to these prior models, RT decouples the sizes of entity and relation embeddings, allows parameter sharing across relations, and does not make use of a predefined sparsity pattern. We use the RT decomposition as a tool to explore whether it is possible and beneficial to automatically learn sparsity patterns, and whether dense models can outperform sparse models (using the same number of parameters). Our experiments indicate that, depending on the dataset, both questions can be answered affirmatively.
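As a rough illustration of the factorization structure described above, the following NumPy sketch scores a (subject, relation, object) triple by contracting a dense core tensor with separately sized entity and relation embeddings. All names and dimensions (d_e, d_r, score_triple, etc.) are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of a Tucker3-style triple scorer for an
# entity x relation x entity tensor: X ~ G x_1 E x_2 R x_3 E.
# Shapes and variable names are hypothetical, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_entities, n_relations = 100, 10
d_e, d_r = 16, 8  # entity and relation embedding sizes may differ (decoupled)

E = rng.standard_normal((n_entities, d_e))   # entity embeddings (shared by subjects and objects)
R = rng.standard_normal((n_relations, d_r))  # relation embeddings
G = rng.standard_normal((d_e, d_r, d_e))     # dense core tensor, shared across all relations

def score_triple(s: int, r: int, o: int) -> float:
    """Score (subject, relation, object) via the Tucker3 contraction."""
    return float(np.einsum("a,abc,b,c->", E[s], G, R[r], E[o]))

print(score_triple(0, 3, 42))
```

Because the core tensor G is shared across relations, relation-specific mixing matrices need not be stored separately; fixing a sparsity pattern in G (or in the factor matrices) recovers several existing embedding models as special cases.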
