CEAR: Cross-Entity Aware Reranker for Knowledge Base Completion

04/18/2021
by Keshav Kolluru, et al.

Pre-trained language models (LMs) like BERT have been shown to store factual knowledge about the world. This knowledge can be used to augment the information present in Knowledge Bases, which tend to be incomplete. However, prior attempts at using BERT for the task of Knowledge Base Completion (KBC) resulted in performance worse than embedding-based techniques that rely only on the graph structure. In this work we develop a novel model, Cross-Entity Aware Reranker (CEAR), that uses BERT to re-rank the output of existing KBC models with cross-entity attention. Unlike prior work that scores each entity independently, CEAR uses BERT to score the entities together, which is effective for exploiting its factual knowledge. CEAR establishes a new state-of-the-art performance with 42.6 HITS@1 on FB15k-237 (a 32.7% improvement) and a 5.3 pt improvement in HITS@1 for Open Link Prediction.
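The key distinction the abstract draws is between scoring each candidate entity independently and scoring all candidates together, so that attention can compare them against one another. The toy sketch below illustrates that idea with plain dot-product attention over a candidate set; it is not the authors' implementation, and all names, shapes, and the scoring scheme are illustrative assumptions (CEAR itself uses BERT over the candidates).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def rerank_jointly(query_vec, candidate_vecs):
    """Score candidates together: each candidate attends to the others,
    so its score depends on the whole candidate set, not just the query.
    (Illustrative stand-in for cross-entity attention, not CEAR itself.)"""
    d = candidate_vecs.shape[1]
    # Cross-entity attention: every candidate attends over all candidates.
    attn = softmax(candidate_vecs @ candidate_vecs.T / np.sqrt(d))
    contextual = attn @ candidate_vecs      # context-aware candidate vectors
    scores = contextual @ query_vec         # match each candidate to the query
    return np.argsort(-scores)              # best-first ranking of candidate indices

rng = np.random.default_rng(0)
query = rng.normal(size=8)
cands = rng.normal(size=(5, 8))             # e.g. top-5 outputs of a base KBC model
order = rerank_jointly(query, cands)
print(order.tolist())
```

An independent scorer would compute `candidate_vecs @ query_vec` directly; the joint version differs precisely because each candidate's representation is mixed with the others before scoring.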


Related research:

- 03/23/2017, "An overview of embedding models of entities and relationships for knowledge base completion": Knowledge bases of real-world facts about entities and their relationshi...
- 06/21/2016, "Neighborhood Mixture Model for Knowledge Base Completion": Knowledge bases are useful resources for many natural language processin...
- 03/03/2021, "OAG-BERT: Pre-train Heterogeneous Entity-augmented Academic Language Model": To enrich language models with domain knowledge is crucial but difficult...
- 12/06/2017, "A Novel Embedding Model for Knowledge Base Completion Based on Convolutional Neural Network": We introduce a novel embedding method for knowledge base completion task...
- 10/12/2022, "Focusing on Context is NICE: Improving Overshadowed Entity Disambiguation": Entity disambiguation (ED) is the task of mapping an ambiguous entity me...
- 11/12/2021, "Time in a Box: Advancing Knowledge Graph Completion with Temporal Scopes": Almost all statements in knowledge bases have a temporal scope during wh...
- 09/15/2020, "MLMLM: Link Prediction with Mean Likelihood Masked Language Model": Knowledge Bases (KBs) are easy to query, verifiable, and interpretable. ...
