Paragraph-Level Commonsense Transformers with Recurrent Memory

by Saadia Gabriel, et al.

Human understanding of narrative texts requires making commonsense inferences beyond what is explicitly stated in the text. A recent model, COMeT, can generate such inferences along several dimensions, such as pre- and post-conditions, motivations, and mental states of the participants. However, COMeT was trained on short phrases and is therefore discourse-agnostic. When presented with each sentence of a multi-sentence narrative, it might generate inferences that are inconsistent with the rest of the narrative. We present the task of discourse-aware commonsense inference. Given a sentence within a narrative, the goal is to generate commonsense inferences along predefined dimensions while maintaining coherence with the rest of the narrative. Because large-scale paragraph-level annotation is costly and hard to obtain, we use available sentence-level annotations to efficiently and automatically construct a distantly supervised corpus. Using this corpus, we train PARA-COMeT, a discourse-aware model that incorporates paragraph-level information to generate coherent commonsense inferences from narratives. PARA-COMeT captures both semantic knowledge pertaining to prior world knowledge, and episodic knowledge involving how current events relate to prior and future events in a narrative. Our results confirm that PARA-COMeT outperforms the sentence-level baselines, particularly in generating inferences that are both coherent and novel.
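The distant-supervision idea described in the abstract can be sketched as a simple pipeline: generate candidate inferences for each sentence with a sentence-level model, then keep only those that remain consistent with the full narrative. The sketch below stubs out both model calls (the generator and the coherence scorer are placeholders, not the paper's actual implementation; dimension names like `xWant` follow the ATOMIC convention the paper builds on), since the point is the filtering structure, not the models.

```python
# Hypothetical sketch of the distantly supervised corpus construction:
# per-sentence inferences are generated, then filtered for coherence
# against the whole narrative. Both model calls are stand-ins.

def sentence_level_inferences(sentence):
    """Stand-in for a COMeT-style generator: (dimension, inference) pairs."""
    return [
        ("xWant", f"to act on '{sentence}'"),
        ("xEffect", f"a consequence of '{sentence}'"),
    ]

def coherence_score(inference, narrative):
    """Stand-in for a narrative-consistency scorer (e.g. an NLI model
    or language-model likelihood conditioned on the full story)."""
    return 1.0 if inference else 0.0

def build_distant_corpus(narrative, threshold=0.5):
    """Keep only inferences whose coherence with the narrative clears
    the threshold; each kept example records its source sentence."""
    corpus = []
    for i, sentence in enumerate(narrative):
        for dim, inf in sentence_level_inferences(sentence):
            if coherence_score(inf, narrative) >= threshold:
                corpus.append(
                    {"sentence_idx": i, "dimension": dim, "inference": inf}
                )
    return corpus

story = ["Mia packed her bags.", "She drove to the airport."]
examples = build_distant_corpus(story)
```

In the actual pipeline the scorer would discard inferences that contradict later sentences (e.g. a motivation inconsistent with how the story ends), which is what makes the resulting corpus discourse-aware.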
