Back to the Future: Unsupervised Backprop-based Decoding for Counterfactual and Abductive Commonsense Reasoning

10/12/2020
by   Lianhui Qin, et al.
Abductive and counterfactual reasoning, core abilities of everyday human cognition, require reasoning about what might have happened at time t while conditioning on multiple contexts from the relative past and future. However, simultaneously incorporating past and future contexts with generative language models (LMs) is challenging, as they are trained either to condition only on the past context or to perform narrowly scoped text infilling. In this paper, we propose DeLorean, a new unsupervised decoding algorithm that can flexibly incorporate both past and future contexts using only off-the-shelf, left-to-right language models and no supervision. The key intuition of our algorithm is to incorporate the future through back-propagation, during which we update only the internal representation of the output while keeping the model parameters fixed. By alternating between forward and backward propagation, DeLorean decodes an output representation that reflects both the left and right contexts. We demonstrate that our approach is general and applicable to two nonmonotonic reasoning tasks: abductive text generation and counterfactual story revision, where DeLorean outperforms a range of unsupervised and some supervised methods in both automatic and human evaluation.
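The alternating forward/backward scheme in the abstract can be illustrated with a toy sketch. In the real algorithm, the backward step backpropagates the (frozen) LM's likelihood of the future context into the soft logits of the output; here, as a stand-in assumption, the future context's pull is modeled by a simple quadratic loss toward a `future_target` vector (whose gradient is analytic), and the frozen LM's forward proposal is a fixed `forward_logits` vector. Names and the mixing rule are illustrative, not the paper's implementation.

```python
import numpy as np

def delorean_sketch(forward_logits, future_target, steps=10, lr=0.5, mix=0.5):
    """Toy sketch of DeLorean-style alternating decoding.

    forward_logits: what a left-to-right LM would propose from the past
                    context (here: a fixed vector standing in for the LM)
    future_target:  a stand-in for the representation the future context
                    prefers; the real algorithm obtains this gradient by
                    backpropagating the LM likelihood of the future text
    """
    y = forward_logits.copy()  # soft representation of the output
    losses = []
    for _ in range(steps):
        # Backward pass: gradient of the future-context loss w.r.t. y.
        # Loss = 0.5 * ||y - future_target||^2, so grad = y - future_target.
        grad = y - future_target
        y_backward = y - lr * grad  # nudge y toward the future context
        # Forward pass: the frozen stand-in LM re-proposes its logits from
        # the past context; mix them with the backward-updated logits.
        y = mix * forward_logits + (1 - mix) * y_backward
        losses.append(0.5 * float(np.sum((y - future_target) ** 2)))
    return y, losses
```

With this toy loss, each iteration is a contraction toward a fixed point between the forward proposal and the future target, so the future-context loss decreases monotonically, mirroring how the alternation balances the two contexts.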

Related research

10/16/2020
Reflective Decoding: Unsupervised Paraphrasing and Abductive Reasoning
Pretrained Language Models (LMs) generate text with remarkable quality, ...

09/09/2019
Counterfactual Story Reasoning and Generation
Counterfactual reasoning requires predicting how alternative events, con...

09/16/2022
Possible Stories: Evaluating Situated Commonsense Reasoning under Multiple Possible Scenarios
The possible consequences for the same context may vary depending on the...

09/17/2023
Contrastive Decoding Improves Reasoning in Large Language Models
We demonstrate that Contrastive Decoding – a simple, computationally lig...

04/02/2021
Sketch and Customize: A Counterfactual Story Generator
Recent text generation models are easy to generate relevant and fluent t...

07/28/2020
BUT-FIT at SemEval-2020 Task 5: Automatic detection of counterfactual statements with deep pre-trained language representation models
This paper describes BUT-FIT's submission at SemEval-2020 Task 5: Modell...