
HyperTree Proof Search for Neural Theorem Proving

05/23/2022
by   Guillaume Lample, et al.

We propose an online training procedure for a transformer-based automated theorem prover. Our approach leverages a new search algorithm, HyperTree Proof Search (HTPS), inspired by the recent success of AlphaZero. Our model learns from previous proof searches through online training, allowing it to generalize to domains far from the training distribution. We report detailed ablations of our pipeline's main components by studying performance on three environments of increasing complexity. In particular, we show that with HTPS alone, a model trained on annotated proofs manages to prove 65.4% of a held-out set of Metamath theorems, significantly outperforming the previous state of the art of 56.5% by GPT-f. Online training on these unproved theorems increases accuracy to 82.6%. With a similar computational budget, we improve the state of the art on the Lean-based miniF2F-curriculum dataset from 31% to 42% proving accuracy.
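The abstract describes searching over a hypertree: applying a tactic to a goal yields a set of subgoals that must *all* be proved (an AND node), while each goal may be attacked by several candidate tactics, any one of which suffices (an OR node). The following is a minimal, illustrative sketch of that AND/OR structure, assuming a toy hand-written policy in place of the paper's learned transformer; all names here are hypothetical and none of this reflects the authors' actual implementation:

```python
# Toy sketch of AND/OR proof search in the spirit of HTPS.
# The "policy" below is a hypothetical stand-in for a learned model:
# it maps a goal to candidate tactics, each tactic being
# (name, list_of_subgoals).
TOY_POLICY = {
    "a+b=b+a": [("comm", [])],                 # tactic with no subgoals closes the goal
    "(a+b)+c=a+(b+c)": [("assoc", [])],
    "goal": [
        ("split", ["a+b=b+a", "(a+b)+c=a+(b+c)"]),  # AND: both subgoals needed
        ("bad_tactic", ["unprovable"]),              # OR: alternative attempt
    ],
}

def prove(goal, policy, depth=8):
    """Return a proof tree (goal, tactic, subproofs), or None on failure."""
    if depth == 0:
        return None
    for tactic, subgoals in policy.get(goal, []):   # OR over tactics
        subproofs = []
        for sg in subgoals:                         # AND over subgoals
            sub = prove(sg, policy, depth - 1)
            if sub is None:
                break                               # this tactic fails
            subproofs.append(sub)
        else:
            return (goal, tactic, subproofs)        # all subgoals closed
    return None

proof = prove("goal", TOY_POLICY)
```

This depth-first sketch omits what makes HTPS interesting (visit counts, value estimates guiding which leaf to expand next, and online training on search outcomes), but it shows the hypertree success condition the paper builds on.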


Proof Simplification and Automated Theorem Proving (08/09/2018)
The proofs first generated by automated theorem provers are far from opt...

Proving Theorems using Incremental Learning and Hindsight Experience Replay (12/20/2021)
Traditional automated theorem provers for first-order logic depend on sp...

ENIGMA: Efficient Learning-based Inference Guiding Machine (01/23/2017)
ENIGMA is a learning-based method for guiding given clause selection in ...

Learning to Prove with Tactics (04/02/2018)
We implement an automated tactical prover TacticToe on top of the HOL4 in...

Towards Finding Longer Proofs (05/30/2019)
We present a reinforcement learning (RL) based guidance system for autom...

Deep Network Guided Proof Search (01/24/2017)
Deep learning techniques lie at the heart of several significant AI adva...

CD Tools – Condensed Detachment and Structure Generating Theorem Proving (System Description) (07/18/2022)
CD Tools is a Prolog library for experimenting with condensed detachment...