Inductive Learning on Commonsense Knowledge Graph Completion

09/19/2020
by Bin Wang, et al.

A commonsense knowledge graph (CKG) is a special type of knowledge graph (KG) in which entities are composed of free-form text. Most existing CKG completion methods, however, focus on the setting where all entities are present at training time. Although this setting is standard for conventional KG completion, it is limiting for CKG completion: at test time, entities can be unseen because they carry unseen text/names, and they may be disconnected from the training graph, since CKGs are generally very sparse. Here, we propose to study the inductive learning setting for CKG completion, where unseen entities may appear at test time. We develop a novel learning framework named InductivE. Different from previous approaches, InductivE ensures inductive learning capability by computing entity embeddings directly from raw entity attributes/text. InductivE consists of a free-text encoder, a graph encoder, and a KG completion decoder. Specifically, the free-text encoder first extracts a textual representation of each entity based on a pre-trained language model and word embeddings. The graph encoder is a gated relational graph convolutional network that learns more informative entity representations from a densified graph. We densify the CKG by adding edges among semantically related entities, which provides more supporting information for unseen entities and leads to better generalization of their embeddings. Finally, InductivE employs Conv-TransE as the CKG completion decoder. Experimental results show that InductivE significantly outperforms state-of-the-art baselines in both standard and inductive settings on the ATOMIC and ConceptNet benchmarks. InductivE performs especially well in inductive scenarios, where it achieves above 48% improvement over present methods.
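The graph-densification step lends itself to a short sketch. Below is a minimal, hedged illustration in Python/NumPy of how similarity edges might be added: entity texts are first embedded (the paper uses features from word embeddings and a pre-trained language model; random vectors stand in here), and each entity is linked to its most similar neighbors under a synthetic similarity relation. The function name `densify_graph`, the relation label `"sim"`, and the `k`/`sim_threshold` parameters are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def densify_graph(entity_vecs, triples, k=5, sim_threshold=0.5):
    """Sketch of similarity-based graph densification.

    entity_vecs: (num_entities, dim) array of text embeddings per entity.
    triples: list of (head_id, relation, tail_id) edges.
    Returns the original triples plus synthetic "sim" edges linking each
    entity to its top-k most similar entities above sim_threshold.
    """
    # Normalize rows so dot products become cosine similarities.
    norms = np.linalg.norm(entity_vecs, axis=1, keepdims=True)
    unit = entity_vecs / np.clip(norms, 1e-12, None)
    sim = unit @ unit.T
    np.fill_diagonal(sim, -1.0)  # exclude self-links

    extra = []
    for i in range(sim.shape[0]):
        # Link entity i to its k nearest neighbors in embedding space.
        for j in np.argsort(sim[i])[::-1][:k]:
            if sim[i, j] >= sim_threshold:
                extra.append((i, "sim", int(j)))
    return triples + extra

# Toy usage: random vectors stand in for real text embeddings.
vecs = np.random.rand(100, 300)
triples = [(0, "xIntent", 1)]
print(len(densify_graph(vecs, triples)))
```

Because unseen test entities arrive with text but few or no graph neighbors, edges added this way give the gated R-GCN encoder supporting neighbors to aggregate over, which is the intuition behind the densification step described above.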

Code Repositories

InductivE

Code for the paper: Inductive Learning on Commonsense Knowledge Graph Completion