JAKET: Joint Pre-training of Knowledge Graph and Language Understanding

10/02/2020
by   Donghan Yu, et al.

Knowledge graphs (KGs) contain rich information about world knowledge, entities, and relations, so they can serve as valuable supplements to existing pre-trained language models. However, it remains a challenge to efficiently integrate information from KGs into language modeling, and, conversely, understanding a knowledge graph requires related textual context. We propose JAKET, a novel joint pre-training framework that models both the knowledge graph and language. The knowledge module and the language module provide essential information to each other: the knowledge module produces embeddings for entities in text, while the language module generates context-aware initial embeddings for entities and relations in the graph. This design enables the pre-trained model to easily adapt to unseen knowledge graphs in new domains. Experimental results on several knowledge-aware NLP tasks show that the proposed framework achieves superior performance by effectively leveraging knowledge in language understanding.
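The mutual-assist design described in the abstract can be sketched as a toy loop. This is a minimal illustration under assumed names, not the authors' implementation: a stand-in "language module" turns mention contexts into initial entity vectors, and a stand-in "knowledge module" refines them by averaging over graph neighbors (a crude proxy for a GNN layer) before they would be fed back to the text side.

```python
# Hypothetical simplification of JAKET's two-module interaction.
# All function names and the embedding scheme are assumptions for
# illustration; the real framework uses pre-trained transformers and
# a graph neural network.

from collections import defaultdict


def language_module(mention_contexts, dim=4):
    """Toy 'context-aware' initial embeddings: one vector per entity,
    built by hashing the tokens of its mention context into `dim` buckets."""
    embeddings = {}
    for entity, context in mention_contexts.items():
        vec = [0.0] * dim
        for token in context.split():
            vec[hash(token) % dim] += 1.0  # deterministic within one run
        embeddings[entity] = vec
    return embeddings


def knowledge_module(entity_emb, edges, steps=1):
    """Refine entity embeddings by averaging each entity with its graph
    neighbors (relations are ignored here for brevity)."""
    neighbors = defaultdict(list)
    for head, _rel, tail in edges:
        neighbors[head].append(tail)
        neighbors[tail].append(head)
    emb = dict(entity_emb)
    for _ in range(steps):
        new_emb = {}
        for ent, vec in emb.items():
            agg = list(vec)
            for nb in neighbors[ent]:
                agg = [a + b for a, b in zip(agg, emb[nb])]
            denom = 1 + len(neighbors[ent])
            new_emb[ent] = [a / denom for a in agg]
        emb = new_emb
    return emb


# Usage: the language side initializes the graph, the knowledge side
# returns refined entity embeddings that would augment the text tokens.
contexts = {"Paris": "capital of France", "France": "country in Europe"}
edges = [("Paris", "capital_of", "France")]
initial = language_module(contexts)
refined = knowledge_module(initial, edges)
```

The circular dependency the paper addresses is visible even in this sketch: `knowledge_module` needs initial embeddings, which only `language_module` can supply, while the text side in turn benefits from the graph-refined vectors.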


Related research

12/09/2020: Fusing Context Into Knowledge Graph for Commonsense Reasoning
Commonsense reasoning requires a model to make presumptions about world ...

10/11/2022: Revisiting and Advancing Chinese Natural Language Understanding with Accelerated Heterogeneous Knowledge Pre-training
Recently, knowledge-enhanced pre-trained language models (KEPLMs) improv...

12/02/2021: DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding
Knowledge-Enhanced Pre-trained Language Models (KEPLMs) are pre-trained ...

03/17/2022: Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations
With the emerging research effort to integrate structured and unstructur...

08/16/2019: Learning Conceptual-Contextual Embeddings for Medical Text
External knowledge is often useful for natural language understanding ta...

08/25/2019: Unsupervised Construction of Knowledge Graphs From Text and Code
The scientific literature is a rich source of information for data minin...

09/29/2020: Contextual Knowledge Selection and Embedding towards Enhanced Pre-Trained Language Models
Several recent efforts have been devoted to enhancing pre-trained langua...
