KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation

11/13/2019
by Xiaozhi Wang, et al.

Pre-trained language representation models (PLMs) learn effective language representations from large-scale unlabeled corpora. Knowledge embedding (KE) algorithms encode the entities and relations in knowledge graphs into informative embeddings, which support knowledge graph completion and provide external knowledge for various NLP applications. In this paper, we propose a unified model for Knowledge Embedding and Pre-trained LanguagE Representation (KEPLER), which not only better integrates factual knowledge into PLMs but also effectively learns knowledge graph embeddings. KEPLER uses a PLM to encode the textual descriptions of entities as their entity embeddings, and then jointly learns the knowledge embeddings and language representations. Experimental results on NLP tasks such as relation extraction and entity typing show that KEPLER achieves results comparable to state-of-the-art knowledge-enhanced PLMs without any additional inference overhead. Furthermore, we construct Wikidata5M, a new large-scale knowledge graph dataset with aligned text descriptions, to evaluate KE methods in both the traditional transductive setting and the more challenging inductive setting, which requires models to predict embeddings for unseen entities. Experiments demonstrate that KEPLER achieves good results in both settings.
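The joint-training idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: a toy mean-pooling encoder stands in for the Transformer PLM, and a margin-based TransE-style loss stands in for the paper's negative-sampling KE loss. All names and values here (token table, triple token IDs, margin) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a PLM encoder: the mean of lookup embeddings.
# KEPLER instead feeds the description through a Transformer and
# takes the representation of the start token.
VOCAB, DIM = 100, 16
token_emb = rng.normal(size=(VOCAB, DIM))

def encode_description(token_ids):
    """Entity embedding = encoding of the entity's textual description."""
    return token_emb[token_ids].mean(axis=0)

def transe_score(h, r, t):
    """Negative distance: higher means the triple (h, r, t) is more plausible."""
    return -float(np.linalg.norm(h + r - t))

# Hypothetical triple: entities come from descriptions, relations
# from a learned embedding table (a random vector here).
head = encode_description([1, 2, 3])
tail = encode_description([4, 5, 6])
relation = rng.normal(size=DIM)

# Margin-based KE loss with one corrupted-tail negative sample.
neg_tail = encode_description([7, 8, 9])
margin = 1.0
ke_loss = max(0.0, margin
              - transe_score(head, relation, tail)
              + transe_score(head, relation, neg_tail))

# KEPLER optimizes KE loss + masked-language-model loss jointly,
# backpropagating both through the shared encoder.
mlm_loss = 0.0  # placeholder; a real MLM loss comes from the PLM
total_loss = ke_loss + mlm_loss
```

Because entity embeddings are computed from text rather than stored in a lookup table, the same encoder can embed entities never seen during training, which is what makes the inductive setting possible.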

Related research:
- CoLAKE: Contextualized Language and Knowledge Embedding (10/01/2020)
- Learning to Borrow – Relation Representation for Without-Mention Entity-Pairs for Knowledge Graph Completion (04/27/2022)
- Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations (03/17/2022)
- Inductive Learning on Commonsense Knowledge Graph Completion (09/19/2020)
- Repurposing Knowledge Graph Embeddings for Triple Representation via Weak Supervision (08/22/2022)
- Entity Context Graph: Learning Entity Representations from Semi-Structured Textual Sources on the Web (03/29/2021)
- Representing Knowledge by Spans: A Knowledge-Enhanced Model for Information Extraction (08/20/2022)
