MLMLM: Link Prediction with Mean Likelihood Masked Language Model

09/15/2020
by Louis Clouâtre, et al.

Knowledge Bases (KBs) are easy to query, verifiable, and interpretable, but they scale only with human effort and curated high-quality data. Masked Language Models (MLMs), such as BERT, scale with computing power as well as with unstructured raw text data, but the knowledge contained within those models is not directly interpretable. We propose to perform link prediction with MLMs to address both the scalability issues of KBs and the interpretability issues of MLMs. To do so, we introduce MLMLM, the Mean Likelihood Masked Language Model, an approach that compares the mean likelihood of generating the different candidate entities to perform link prediction in a tractable manner. We obtain state-of-the-art (SotA) results on the WN18RR dataset and the best non-entity-embedding-based results on the FB15k-237 dataset. We also obtain convincing results on link prediction for previously unseen entities, making MLMLM a suitable approach for introducing new entities into a KB.
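The scoring idea in the abstract can be sketched as follows: each candidate entity's name is tokenized and placed in the masked slot, the MLM assigns a log-probability to each of its tokens, and candidates are ranked by the *mean* (not the sum) of those log-probabilities, so that entities whose names tokenize to different lengths are compared on an equal footing. The sketch below illustrates only that ranking step with made-up numbers; the function names and toy probabilities are illustrative assumptions, not the paper's implementation.

```python
import math

def mean_log_likelihood(token_log_probs):
    # Mean rather than sum, so multi-token entity names are not
    # penalized simply for being longer.
    return sum(token_log_probs) / len(token_log_probs)

def rank_entities(candidate_log_probs):
    # candidate_log_probs: dict mapping entity name -> list of
    # per-token log-probabilities the MLM assigned when the entity's
    # tokens filled the masked slot (hypothetical inputs here).
    scores = {e: mean_log_likelihood(lp)
              for e, lp in candidate_log_probs.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy numbers (not from the paper): "Paris" is one confident token,
# "New York City" is three moderately likely tokens.
toy = {
    "Paris": [math.log(0.6)],
    "New York City": [math.log(0.2), math.log(0.3), math.log(0.25)],
    "Berlin": [math.log(0.05)],
}
print(rank_entities(toy))  # "Paris" ranks first
```

In a real pipeline the per-token log-probabilities would come from a pretrained MLM such as BERT scoring the masked entity slot of a (head, relation, ?) query; the mean keeps the comparison tractable across a large entity vocabulary.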
