Inductive Learning on Commonsense Knowledge Graph Completion

09/19/2020
by Bin Wang, et al.

A commonsense knowledge graph (CKG) is a special type of knowledge graph (KG) whose entities are composed of free-form text. Most existing CKG completion methods, however, assume that all entities are present at training time. Although this setting is standard for conventional KG completion, it is limiting for CKG completion: at test time, entities in CKGs can be unseen because they may carry unseen text/names, and entities may be disconnected from the training graph, since CKGs are generally very sparse. Here, we propose to study the inductive learning setting for CKG completion, where unseen entities may appear at test time. We develop a novel learning framework named InductivE. Unlike previous approaches, InductivE ensures inductive learning capability by computing entity embeddings directly from raw entity attributes/text. InductivE consists of a free-text encoder, a graph encoder, and a KG completion decoder. Specifically, the free-text encoder first extracts a textual representation of each entity based on a pre-trained language model and word embeddings. The graph encoder is a gated relational graph convolutional network that learns more informative entity representations from a densified graph. We densify CKGs by adding edges among semantically related entities, which provides more supporting information for unseen entities and improves the generalization of their embeddings. Finally, InductivE employs Conv-TransE as the CKG completion decoder. Experimental results show that InductivE significantly outperforms state-of-the-art baselines in both the standard and inductive settings on the ATOMIC and ConceptNet benchmarks. InductivE performs especially well in inductive scenarios, where it achieves above 48% improvement over existing methods.
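To make the described pipeline concrete, here is a minimal, hypothetical PyTorch sketch of the three components the abstract names: a free-text entity encoder, a gated relational graph convolution over a densified graph, and a Conv-TransE-style decoder. All class and function names (FreeTextEncoder, GatedRGCNLayer, densify, ConvTransEDecoder) and hyperparameters are illustrative assumptions, not the authors' released code; in particular, the text encoder below stands in for a pre-trained language model with a simple embedding bag.

```python
# Hypothetical sketch of an InductivE-style pipeline; names and shapes
# are assumptions for illustration, not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FreeTextEncoder(nn.Module):
    """Maps each entity's free-form text to a fixed-size vector.
    A mean embedding bag stands in for pooled pre-trained LM states
    combined with word embeddings, as described in the abstract."""
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim, mode="mean")

    def forward(self, token_ids, offsets):
        return self.embed(token_ids, offsets)  # (num_entities, dim)

def densify(entity_vecs, k=5, threshold=0.9):
    """Add synthetic 'similar-to' edges between entities whose text
    embeddings are close, so sparse or unseen entities gain neighbors."""
    z = F.normalize(entity_vecs, dim=-1)
    sims = z @ z.T
    sims.fill_diagonal_(-1.0)                  # no self-edges
    vals, idx = sims.topk(k, dim=-1)
    src = torch.arange(z.size(0)).repeat_interleave(k)
    dst = idx.reshape(-1)
    keep = vals.reshape(-1) > threshold
    return torch.stack([src[keep], dst[keep]]) # (2, num_new_edges)

class GatedRGCNLayer(nn.Module):
    """One relational graph-conv layer with an update gate, so an entity
    can fall back on its textual features when its neighborhood is
    uninformative -- the unseen-entity case."""
    def __init__(self, dim, num_rels):
        super().__init__()
        self.rel_weights = nn.Parameter(torch.randn(num_rels, dim, dim) * 0.01)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, h, adj_per_rel):
        # adj_per_rel: per-relation sparse, row-normalized (N, N) adjacencies
        msg = torch.zeros_like(h)
        for r, adj in enumerate(adj_per_rel):
            msg = msg + torch.sparse.mm(adj, h @ self.rel_weights[r])
        g = torch.sigmoid(self.gate(torch.cat([h, msg], dim=-1)))
        return g * torch.tanh(msg) + (1 - g) * h

class ConvTransEDecoder(nn.Module):
    """Conv-TransE-style scorer: 1D-convolve the stacked (head, relation)
    embeddings, project back to dim, and score against all entities."""
    def __init__(self, dim, num_rels, channels=32, kernel=3):
        super().__init__()
        self.rel_embed = nn.Embedding(num_rels, dim)
        self.conv = nn.Conv1d(2, channels, kernel, padding=kernel // 2)
        self.proj = nn.Linear(channels * dim, dim)

    def forward(self, ent_vecs, heads, rels):
        stacked = torch.stack([ent_vecs[heads], self.rel_embed(rels)], dim=1)
        x = F.relu(self.conv(stacked)).flatten(1)  # (B, channels * dim)
        q = self.proj(x)                           # (B, dim)
        return q @ ent_vecs.T                      # (B, num_entities) scores
```

Under these assumptions, the pieces explain why the approach is inductive: an unseen entity still gets an embedding from its raw text alone, the densification step gives it synthetic neighbors so the graph encoder has messages to aggregate, and the gate lets the layer interpolate between graph evidence and the textual fallback.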
