Paragraph-Level Commonsense Transformers with Recurrent Memory

10/04/2020
by Saadia Gabriel, et al.

Human understanding of narrative texts requires making commonsense inferences beyond what is stated explicitly in the text. A recent model, COMeT, can generate such inferences along several dimensions, such as pre- and post-conditions, motivations, and mental states of the participants. However, COMeT was trained on short phrases and is therefore discourse-agnostic: when presented with each sentence of a multi-sentence narrative in isolation, it may generate inferences that are inconsistent with the rest of the narrative. We present the task of discourse-aware commonsense inference. Given a sentence within a narrative, the goal is to generate commonsense inferences along predefined dimensions while maintaining coherence with the rest of the narrative. Because such large-scale paragraph-level annotation is costly and difficult to obtain, we use available sentence-level annotations to efficiently and automatically construct a distantly supervised corpus. Using this corpus, we train PARA-COMeT, a discourse-aware model that incorporates paragraph-level information to generate coherent commonsense inferences from narratives. PARA-COMeT captures both semantic knowledge, pertaining to prior world knowledge, and episodic knowledge, involving how current events relate to prior and future events in a narrative. Our results confirm that PARA-COMeT outperforms the sentence-level baselines, particularly in generating inferences that are both coherent and novel.
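The distant-supervision step described above can be sketched in a few lines. This is an illustrative outline, not the paper's code: a sentence-level model (such as COMeT) proposes inferences per sentence, and a coherence filter keeps only those consistent with the surrounding narrative. All function names, the word-overlap heuristic, and the toy data below are assumptions for illustration; the actual system uses learned models.

```python
# Illustrative sketch of distantly supervised corpus construction for
# discourse-aware commonsense inference. Hypothetical names throughout.

def build_distant_corpus(narratives, infer_fn, coherence_fn, threshold=0.0):
    """Collect (narrative, sentence index, relation, inference) records,
    keeping only inferences that score above the coherence threshold
    against the rest of the narrative."""
    corpus = []
    for narrative in narratives:
        for i, sentence in enumerate(narrative):
            # Context is the narrative minus the target sentence.
            context = [s for j, s in enumerate(narrative) if j != i]
            for relation, inference in infer_fn(sentence):
                if coherence_fn(inference, context) > threshold:
                    corpus.append({
                        "narrative": narrative,
                        "sentence_idx": i,
                        "relation": relation,
                        "inference": inference,
                    })
    return corpus

# Toy stand-ins: a real pipeline would call a sentence-level COMeT-style
# model and a learned (e.g. NLI-based) coherence scorer here.
def toy_infer(sentence):
    return [("xWant", "to eat"), ("xNeed", "to find food")]

def toy_coherence(inference, context):
    # Crude word-overlap heuristic between inference and narrative context.
    ctx_words = set(w for s in context for w in s.lower().split())
    inf_words = set(inference.lower().split())
    return len(ctx_words & inf_words) / max(len(inf_words), 1)

narrative = ["Sam was hungry.", "He wanted to find food quickly."]
corpus = build_distant_corpus([narrative], toy_infer, toy_coherence)
```

In this toy run, inferences for the first sentence overlap with the second sentence and survive the filter, while the same inferences attached to the second sentence do not, illustrating how narrative context prunes inconsistent sentence-level outputs.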


