multiPRover: Generating Multiple Proofs for Improved Interpretability in Rule Reasoning

by Swarnadeep Saha, et al.

We focus on a type of linguistic formal reasoning where the goal is to reason over explicit knowledge in the form of natural language facts and rules (Clark et al., 2020). A recent work, PRover (Saha et al., 2020), performs such reasoning by answering a question and also generating a proof graph that explains the answer. However, compositional reasoning is not always unique, and there may be multiple ways of reaching the correct answer. Thus, in our work, we address the new and challenging problem of generating multiple proof graphs for reasoning over natural language rule-bases. Each proof provides a different rationale for the answer, thereby improving the interpretability of such reasoning systems. In order to jointly learn from all proof graphs and exploit the correlations between multiple proofs for a question, we pose this task as a set generation problem over structured output spaces, where each proof is represented as a directed graph. We propose two variants of a proof-set generation model, multiPRover. Our first model, Multilabel-multiPRover, generates a set of proofs via multi-label classification and implicit conditioning between the proofs, while the second model, Iterative-multiPRover, generates proofs iteratively by explicitly conditioning on the previously generated proofs. Experiments on multiple synthetic, zero-shot, and human-paraphrased datasets reveal that both multiPRover models significantly outperform PRover on datasets containing multiple gold proofs. Iterative-multiPRover obtains state-of-the-art proof F1 in zero-shot scenarios where all examples have single correct proofs. It also generalizes better to questions requiring higher depths of reasoning, where multiple proofs are more frequent. Our code and models are publicly available at
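The iterative variant described above can be caricatured as a decoding loop: at each step, emit the best proof graph not yet generated, so that each new proof is conditioned on the set produced so far. The sketch below is a toy illustration only, not the paper's actual model: the real Iterative-multiPRover scores node and edge assignments with a trained transformer, whereas here a hand-coded score table and the candidate proof graphs are hypothetical stand-ins.

```python
# Toy sketch of an iterative proof-set decoding loop (assumption: a real
# system would score candidate proof graphs with a learned model; here a
# fixed score table stands in for it).

def generate_proof_set(candidates, score, max_proofs=3):
    """Iteratively emit proofs, conditioning on those already generated
    by excluding them from subsequent steps (a crude stand-in for the
    explicit conditioning used in Iterative-multiPRover)."""
    generated = []
    for _ in range(max_proofs):
        remaining = [p for p in candidates if p not in generated]
        if not remaining:
            break
        # Pick the highest-scoring proof graph not yet emitted.
        generated.append(max(remaining, key=score))
    return generated

# Each proof graph is a frozenset of directed edges (premise -> conclusion).
# Fact/rule names are illustrative, not from any real dataset.
candidates = [
    frozenset({("fact1", "rule1"), ("rule1", "answer")}),
    frozenset({("fact2", "rule2"), ("rule2", "answer")}),
    frozenset({("fact1", "rule3"), ("rule3", "answer")}),
]
scores = {candidates[0]: 0.9, candidates[1]: 0.7, candidates[2]: 0.4}

proofs = generate_proof_set(candidates, scores.get)
print(len(proofs))        # 3
print(proofs[0] == candidates[0])  # True: highest-scoring proof comes first
```

The key design point this mirrors is that the proofs form a set, not independent predictions: each decoding step sees (here, simply excludes) the proofs already produced, which is what distinguishes the iterative model from the multi-label one.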



Related Research

- Probabilistic Graph Reasoning for Natural Proof Generation
- Generating Natural Language Proofs with Verifier-Guided Search
- Interpretable Proof Generation via Iterative Backward Reasoning
- Coqatoo: Generating Natural Language Versions of Coq Proofs
- Semantic Graphs for Generating Deep Questions
- A Spectrum of Applications of Automated Reasoning
- Surface Form Competition: Why the Highest Probability Answer Isn't Always Right
