Commonsense Knowledge-Augmented Pretrained Language Models for Causal Reasoning Classification

12/16/2021
by   Pedram Hosseini, et al.

Commonsense knowledge can be leveraged to identify causal relations in text. In this work, we verbalize triples from ATOMIC2020, a wide-coverage commonsense reasoning knowledge graph, into natural-language text and continually pretrain a BERT language model on the result. We then evaluate the resulting model on commonsense causal reasoning questions. Our results show that the continually pretrained model, augmented with commonsense reasoning knowledge, outperforms our baseline on two commonsense causal reasoning benchmarks, COPA and BCOPA-CE, without any further modification to the base model or the use of quality-enhanced data for fine-tuning.
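As a rough illustration of the recipe described in the abstract, the sketch below verbalizes ATOMIC2020-style triples with hand-written templates and runs one epoch of masked-language-model continual pretraining on bert-base-uncased using Hugging Face transformers. The relation templates, the example triple, and the output directory are illustrative assumptions, not the paper's exact relation-to-template mapping or training configuration.

```python
# Minimal sketch of the verbalize-then-continually-pretrain recipe.
# Templates, triples, and paths are illustrative assumptions.
# Requires: pip install transformers datasets torch
from transformers import (BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Hypothetical templates mapping ATOMIC2020 relations to connective phrases.
TEMPLATES = {
    "xEffect": "{head}. As a result, PersonX {tail}.",
    "xWant":   "{head}. As a result, PersonX wants {tail}.",
    "Causes":  "{head} causes {tail}.",
}

def verbalize(head, relation, tail):
    """Render one (head, relation, tail) triple as a natural-language sentence."""
    return TEMPLATES[relation].format(head=head, tail=tail)

triples = [("PersonX pays the bill", "xEffect", "gets a receipt")]
sentences = [verbalize(*t) for t in triples]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

encodings = tokenizer(sentences, truncation=True, padding=True)
dataset = [{"input_ids": ids, "attention_mask": mask}
           for ids, mask in zip(encodings["input_ids"],
                                encodings["attention_mask"])]

# Standard masked-language-modeling objective for continual pretraining.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm_probability=0.15)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-atomic2020",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
    data_collator=collator,
)
trainer.train()
```

In the paper, the continually pretrained model is subsequently fine-tuned and evaluated on two-choice causal questions (COPA and BCOPA-CE); that downstream step is omitted from this sketch.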


Related research

09/20/2019 · Teaching Pretrained Models with Commonsense Reasoning: A Preliminary KB-Based Approach
Recently, pretrained language models (e.g., BERT) have achieved great su...

03/21/2022 · Relevant CommonSense Subgraphs for "What if..." Procedural Reasoning
We study the challenge of learning causal reasoning over procedural text...

10/30/2021 · Automatic Knowledge Augmentation for Generative Commonsense Reasoning
Generative commonsense reasoning is the capability of a language model t...

06/02/2021 · COM2SENSE: A Commonsense Reasoning Benchmark with Complementary Sentences
Commonsense reasoning is intuitive for humans but has been a long-term c...

01/02/2021 · KM-BART: Knowledge Enhanced Multimodal BART for Visual Commonsense Generation
We present Knowledge Enhanced Multimodal BART (KM-BART), which is a Tran...

10/15/2020 · Natural Language Rationales with Full-Stack Visual Reasoning: From Pixels to Semantic Frames to Commonsense Graphs
Natural language rationales could provide intuitive, higher-level explan...

04/19/2021 · BERTić – The Transformer Language Model for Bosnian, Croatian, Montenegrin and Serbian
In this paper we describe a transformer model pre-trained on 8 billion t...
