Teaching Pretrained Models with Commonsense Reasoning: A Preliminary KB-Based Approach

09/20/2019
by   Shiyang Li, et al.

Recently, pretrained language models (e.g., BERT) have achieved great success on many downstream natural language understanding tasks and exhibit a certain level of commonsense reasoning ability. However, their performance on commonsense tasks remains far below that of humans. As a preliminary attempt, we propose a simple yet effective method to teach pretrained models commonsense reasoning by leveraging the structured knowledge in ConceptNet, the largest commonsense knowledge base (KB). Specifically, the structured knowledge in the KB allows us to construct various logical forms and then generate multiple-choice questions that require commonsense logical reasoning. Experimental results demonstrate that, when fine-tuned on these training examples, pretrained models consistently improve on tasks that require commonsense reasoning, especially in the few-shot learning setting. We also analyze which logical relations are most relevant to commonsense reasoning.
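The construction described above can be sketched as follows: a ConceptNet-style triple (head, relation, tail) is rendered through a relation-specific template, the true tail becomes the correct answer, and tails from other triples serve as distractors. Note this is a minimal illustration under assumed triples, templates, and a simple distractor-sampling scheme; it is not the paper's actual generation pipeline.

```python
import random

# A few ConceptNet-style triples (head, relation, tail).
# Illustrative only; the paper draws on the full ConceptNet KB.
TRIPLES = [
    ("bird", "CapableOf", "fly"),
    ("fish", "CapableOf", "swim"),
    ("knife", "UsedFor", "cutting"),
    ("ice", "HasProperty", "cold"),
]

# Hypothetical surface templates, one per relation type.
TEMPLATES = {
    "CapableOf": "A {head} is capable of ___.",
    "UsedFor": "A {head} is used for ___.",
    "HasProperty": "A {head} is ___.",
}

def make_question(triple, all_triples, num_choices=3, rng=None):
    """Turn one triple into a multiple-choice question by masking the
    tail entity and sampling distractor tails from the other triples."""
    rng = rng or random.Random(0)
    head, rel, tail = triple
    distractors = [t for (_, _, t) in all_triples if t != tail]
    choices = rng.sample(distractors, num_choices - 1) + [tail]
    rng.shuffle(choices)
    return {
        "question": TEMPLATES[rel].format(head=head),
        "choices": choices,
        "answer": choices.index(tail),  # index of the true tail
    }
```

Questions built this way can then be used as additional training examples for a pretrained model, which is the refinement step the abstract refers to.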


Related research

- Commonsense Knowledge-Augmented Pretrained Language Models for Causal Reasoning Classification (12/16/2021)
  Commonsense knowledge can be leveraged for identifying causal relations ...
- Possible Stories: Evaluating Situated Commonsense Reasoning under Multiple Possible Scenarios (09/16/2022)
  The possible consequences for the same context may vary depending on the...
- Precise Task Formalization Matters in Winograd Schema Evaluations (10/08/2020)
  Performance on the Winograd Schema Challenge (WSC), a respected English ...
- Things not Written in Text: Exploring Spatial Commonsense from Visual Signals (03/15/2022)
  Spatial commonsense, the knowledge about spatial position and relationsh...
- A Data-Driven Study of Commonsense Knowledge using the ConceptNet Knowledge Base (11/28/2020)
  Acquiring commonsense knowledge and reasoning is recognized as an import...
- Do PLMs Know and Understand Ontological Knowledge? (09/12/2023)
  Ontological knowledge, which comprises classes and properties and their ...
- Cracking the Contextual Commonsense Code: Understanding Commonsense Reasoning Aptitude of Deep Contextual Representations (10/02/2019)
  Pretrained deep contextual representations have advanced the state-of-th...
