Multi-teacher Knowledge Distillation for Knowledge Graph Completion

10/14/2020
by   Kai Wang, et al.

Link prediction based on knowledge graph embedding (KGE) aims to predict new triples to complete knowledge graphs (KGs) automatically. However, recent KGE models tend to improve performance by excessively increasing vector dimensions, which incurs enormous training costs and storage overhead in practical applications. To address this problem, we first theoretically analyze the capacity of low-dimensional space for KG embeddings based on the principle of minimum entropy. Then, we propose MulDE, a novel knowledge distillation framework for knowledge graph embedding that utilizes multiple low-dimensional KGE models as teachers. Under a novel iterative distillation strategy, the MulDE model produces soft labels adaptively according to training epochs and student performance. The experimental results show that MulDE can effectively improve the performance and training speed of low-dimensional KGE models. The distilled 32-dimensional models are competitive with state-of-the-art (SotA) high-dimensional methods on several commonly used datasets.
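The multi-teacher soft-labeling idea can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names, the temperature, and the fixed teacher weights are assumptions for the example, whereas MulDE determines the blend adaptively from the training epoch and the student's performance.

```python
import numpy as np

def softmax(scores, temperature=2.0):
    """Temperature-scaled softmax over candidate-entity scores."""
    z = scores / temperature
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def multi_teacher_soft_labels(teacher_scores, weights, temperature=2.0):
    """Blend several teachers' score vectors into one soft-label
    distribution. The weights here are plain inputs; in MulDE they
    would be chosen adaptively during training."""
    probs = np.stack([softmax(s, temperature) for s in teacher_scores])
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()          # normalize teacher weights
    return w @ probs         # weighted mixture of teacher distributions

def distill_loss(student_scores, soft_labels, temperature=2.0):
    """Cross-entropy between the blended soft labels and the
    student's own distribution over candidate entities."""
    p = softmax(student_scores, temperature)
    return -float(np.sum(soft_labels * np.log(p + 1e-12)))

# Toy example: three low-dimensional teachers scoring five candidates.
teachers = [np.array([2.0, 1.0, 0.1, -1.0, 0.5]),
            np.array([1.5, 1.2, 0.0, -0.5, 0.3]),
            np.array([2.2, 0.8, 0.2, -1.2, 0.6])]
labels = multi_teacher_soft_labels(teachers, weights=[0.5, 0.2, 0.3])
student = np.array([1.0, 0.9, 0.1, -0.8, 0.4])
loss = distill_loss(student, labels)
```

Minimizing such a loss pushes the low-dimensional student toward the teachers' consensus ranking of candidate entities rather than toward hard one-hot targets.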


Related research

06/07/2022  Improving Knowledge Graph Embedding via Iterative Self-Semantic Knowledge Distillation
01/03/2022  Swift and Sure: Hardness-aware Contrastive Learning for Low-dimensional Knowledge Graph Embeddings
09/13/2020  DistilE: Distiling Knowledge Graph Embeddings for Faster and Cheaper Reasoning
04/11/2021  Multiple Run Ensemble Learning with Low-Dimensional Knowledge Graph Embeddings
05/25/2023  Collective Knowledge Graph Completion with Mutual Knowledge Distillation
07/12/2021  Technical Report of Team GraphMIRAcles in the WikiKG90M-LSC Track of OGB-LSC @ KDD Cup 2021
03/22/2023  From Wide to Deep: Dimension Lifting Network for Parameter-efficient Knowledge Graph Embedding
