CoSe-Co: Text Conditioned Generative CommonSense Contextualizer

06/12/2022
by   Rachit Bansal, et al.

Pre-trained Language Models (PTLMs) have been shown to perform well on natural language tasks. Many prior works have leveraged structured commonsense, present in the form of entities linked through labeled relations in Knowledge Graphs (KGs), to assist PTLMs. Retrieval approaches use the KG as a separate static module, which limits coverage since KGs contain finite knowledge. Generative methods train PTLMs on KG triples to improve the scale at which knowledge can be obtained. However, training on symbolic KG entities limits their applicability in tasks involving natural language text, where they ignore the overall context. To mitigate this, we propose a CommonSense Contextualizer (CoSe-Co) conditioned on sentences as input, making it generically usable in tasks for generating knowledge relevant to the overall context of the input text. To train CoSe-Co, we propose a novel dataset comprising sentence and commonsense knowledge pairs. The knowledge inferred by CoSe-Co is diverse and contains novel entities not present in the underlying KG. We augment generated knowledge in Multi-Choice QA and Open-ended CommonSense Reasoning tasks, leading to improvements over the current best methods on the CSQA, ARC, QASC and OBQA datasets. We also demonstrate its applicability in improving the performance of a baseline model for the paraphrase generation task.
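The knowledge-augmentation pipeline the abstract describes can be sketched as follows. Note this is a minimal illustration, not the paper's implementation: `generate_knowledge` is a hypothetical stub standing in for CoSe-Co (in the paper, a PTLM fine-tuned on sentence–commonsense pairs that decodes knowledge conditioned on the whole sentence rather than isolated KG entities), and the `[SEP]`-joined prompt format is an assumption.

```python
def generate_knowledge(sentence, k=2):
    # Hypothetical stub for CoSe-Co: the real model is a generative PTLM
    # fine-tuned on (sentence, commonsense knowledge) pairs, so it can emit
    # knowledge relevant to the full sentence context, including entities
    # that never appear in the underlying KG.
    return [f"commonsense fact {i} for: {sentence}" for i in range(k)]

def augment_question(question, choices):
    # Knowledge-augmented Multi-Choice QA: generated commonsense is
    # prepended to the question, and each answer choice is appended,
    # producing one scoring input per choice for a downstream reader.
    knowledge = " ".join(generate_knowledge(question))
    return [f"{knowledge} [SEP] {question} [SEP] {c}" for c in choices]

inputs = augment_question(
    "Where would you put a plant to get sun?",
    ["closet", "windowsill", "basement"],
)
```

A downstream QA model would then score each augmented input and pick the highest-scoring choice; the same sentence-conditioned generation step is what lets the approach transfer to other text tasks such as paraphrase generation.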

