Learning Contextualized Knowledge Structures for Commonsense Reasoning

10/24/2020
by   Jun Yan, et al.

Recently, neural-symbolic architectures have achieved success on commonsense reasoning by effectively encoding relational structures retrieved from external knowledge graphs (KGs), obtaining state-of-the-art results on tasks such as (commonsense) question answering and natural language inference. However, these methods rely on high-quality, contextualized knowledge structures (i.e., fact triples) retrieved at the pre-processing stage, and they overlook the challenges posed by the incompleteness of a KG, the limited expressiveness of its relations, and retrieved facts that are irrelevant to the reasoning context. In this paper, we present a novel neural-symbolic model, named Hybrid Graph Network (HGN), which jointly generates feature representations for new triples (as a complement to the existing edges in the KG), determines the relevance of the triples to the reasoning context, and learns graph module parameters for encoding the relational information. Our model learns a compact graph structure (comprising both extracted and generated edges) by filtering out edges that are unhelpful to the reasoning process. We show marked improvements on three commonsense reasoning benchmarks and demonstrate the superiority of the learned graph structures through user studies.
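To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of a context-gated graph layer in the spirit of the abstract: edges from both extracted KG triples and generated candidate triples receive a learned relevance weight conditioned on an encoding of the reasoning context, and low-relevance edges are softly filtered during message passing. All module and parameter names (e.g., HybridGraphLayer, edge_scorer) are invented for illustration; this is not the authors' released implementation.

    # Hypothetical sketch: context-conditioned edge relevance gating over a
    # hybrid graph whose edges come from extracted KG triples and generated triples.
    import torch
    import torch.nn as nn

    class HybridGraphLayer(nn.Module):
        def __init__(self, node_dim: int, edge_dim: int, ctx_dim: int):
            super().__init__()
            # Scores how relevant each edge is to the reasoning context
            # (e.g., an encoding of the question and candidate answer).
            self.edge_scorer = nn.Sequential(
                nn.Linear(2 * node_dim + edge_dim + ctx_dim, node_dim),
                nn.ReLU(),
                nn.Linear(node_dim, 1),
            )
            # Transforms messages passed along retained edges.
            self.message = nn.Linear(2 * node_dim + edge_dim, node_dim)
            self.update = nn.GRUCell(node_dim, node_dim)

        def forward(self, node_feats, edge_index, edge_feats, context):
            # node_feats: (N, node_dim); edge_index: (2, E); edge_feats: (E, edge_dim)
            # context: (ctx_dim,) encoding of the reasoning context.
            src, dst = edge_index
            ctx = context.unsqueeze(0).expand(edge_feats.size(0), -1)
            edge_input = torch.cat(
                [node_feats[src], node_feats[dst], edge_feats, ctx], dim=-1)

            # Relevance gate in [0, 1]; edges with low relevance contribute little,
            # which softly filters edges unhelpful to the reasoning process.
            gate = torch.sigmoid(self.edge_scorer(edge_input))  # (E, 1)

            msg = self.message(
                torch.cat([node_feats[src], node_feats[dst], edge_feats], dim=-1))
            msg = gate * msg

            # Aggregate gated messages into destination nodes, then update node states.
            agg = torch.zeros_like(node_feats)
            agg.index_add_(0, dst, msg)
            return self.update(agg, node_feats), gate

    if __name__ == "__main__":
        layer = HybridGraphLayer(node_dim=64, edge_dim=32, ctx_dim=128)
        nodes = torch.randn(5, 64)
        edges = torch.randint(0, 5, (2, 8))   # mix of extracted and generated edges
        edge_feats = torch.randn(8, 32)       # generated edges would carry generated features
        context = torch.randn(128)
        new_nodes, relevance = layer(nodes, edges, edge_feats, context)
        print(new_nodes.shape, relevance.squeeze(-1))

In the actual model, the features of generated edges would presumably come from a learned triple generator rather than random tensors, and the relevance gates would be trained end-to-end through the downstream reasoning objective rather than with direct supervision.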


research · 05/14/2021 · Neural-Symbolic Commonsense Reasoner with Relation Predictors
Commonsense reasoning aims to incorporate sets of commonsense facts, ret...

research · 09/04/2019 · KagNet: Knowledge-Aware Graph Networks for Commonsense Reasoning
Commonsense reasoning aims to empower machines with the human ability to...

research · 12/09/2020 · Fusing Context Into Knowledge Graph for Commonsense Reasoning
Commonsense reasoning requires a model to make presumptions about world ...

research · 04/30/2020 · Dynamic Language Binding in Relational Visual Reasoning
We present Language-binding Object Graph Network, the first neural reaso...

research · 06/12/2022 · CoSe-Co: Text Conditioned Generative CommonSense Contextualizer
Pre-trained Language Models (PTLMs) have been shown to perform well on n...

research · 12/06/2021 · JointLK: Joint Reasoning with Language Models and Knowledge Graphs for Commonsense Question Answering
Existing KG-augmented models for question answering primarily focus on d...

research · 02/23/2022 · Commonsense Reasoning for Identifying and Understanding the Implicit Need of Help and Synthesizing Assistive Actions
Human-Robot Interaction (HRI) is an emerging subfield of service robotic...
