Exploiting Structural and Semantic Context for Commonsense Knowledge Base Completion

10/07/2019
by   Chaitanya Malaviya, et al.

Automatic KB completion for commonsense knowledge graphs (e.g., ATOMIC and ConceptNet) poses unique challenges compared to much-studied conventional knowledge bases (e.g., Freebase). Commonsense knowledge graphs use free-form text to represent nodes, resulting in orders of magnitude more nodes than conventional KBs (18x more nodes in ATOMIC than in Freebase (FB15K-237)). Importantly, this implies significantly sparser graph structures - a major challenge for existing KB completion methods, which assume densely connected graphs over a relatively small set of nodes. In this paper, we present novel KB completion models that address these challenges by exploiting the structural and semantic context of nodes. Specifically, we investigate two key ideas: (1) learning from local graph structure, using graph convolutional networks and automatic graph densification, and (2) transfer learning from pre-trained language models to knowledge graphs for enhanced contextual representation of knowledge. We describe our method for incorporating information from both these sources in a joint model and provide the first empirical results for KB completion on ATOMIC and evaluation with ranking metrics on ConceptNet. Our results demonstrate the effectiveness of language model representations in boosting link prediction performance and the advantages of learning from local graph structure (+1.5 points in MRR for ConceptNet) when training on subgraphs for computational efficiency. Further analysis of model predictions sheds light on the types of commonsense knowledge that language models capture well.
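The abstract reports link-prediction gains in MRR (mean reciprocal rank), the standard ranking metric for KB completion. As background, here is a minimal sketch of how MRR is computed: for each test triple, every candidate entity is scored, the gold entity's rank is recorded, and reciprocal ranks are averaged. The toy scores and entity names below are illustrative only, not from the paper or its models.

```python
def rank_of_target(scores, target):
    """1-based rank of the gold entity among all scored candidates
    (higher score = better; rank 1 means the gold entity scored highest)."""
    target_score = scores[target]
    return 1 + sum(1 for s in scores.values() if s > target_score)

def mean_reciprocal_rank(ranks):
    """MRR: average of 1/rank over all test queries."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Toy example: candidate tail-entity scores from some link-prediction
# model for two queries; the gold tails are "b" and "c" respectively.
q1 = {"a": 0.1, "b": 0.9, "c": 0.3}   # gold "b" ranks 1st
q2 = {"a": 0.8, "b": 0.5, "c": 0.6}   # gold "c" ranks 2nd
ranks = [rank_of_target(q1, "b"), rank_of_target(q2, "c")]
print(mean_reciprocal_rank(ranks))  # (1/1 + 1/2) / 2 = 0.75
```

A +1.5-point MRR improvement, as reported for ConceptNet, means the gold entity is ranked closer to the top on average across the test set.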


Related research

06/12/2019
COMET: Commonsense Transformers for Automatic Knowledge Graph Construction
We present the first comprehensive study on automatic knowledge base con...

10/14/2021
Symbolic Knowledge Distillation: from General Language Models to Commonsense Models
The common practice for training commonsense models has gone from human-...

01/03/2023
A Survey On Few-shot Knowledge Graph Completion with Structural and Commonsense Knowledge
Knowledge graphs (KG) have served as the key component of various natura...

05/10/2023
CADGE: Context-Aware Dialogue Generation Enhanced with Graph-Structured Knowledge Aggregation
Commonsense knowledge is crucial to many natural language processing tas...

10/27/2020
DualTKB: A Dual Learning Bridge between Text and Knowledge Base
In this work, we present a dual learning approach for unsupervised text ...

03/06/2020
On the Role of Conceptualization in Commonsense Knowledge Graph Construction
Commonsense knowledge graphs (CKG) like Atomic and ASER are substantiall...

02/10/2023
Adversarial Transformer Language Models for Contextual Commonsense Inference
Contextualized or discourse-aware commonsense inference is the task of g...
