
HyperTree Proof Search for Neural Theorem Proving

by Guillaume Lample et al.

We propose an online training procedure for a transformer-based automated theorem prover. Our approach leverages a new search algorithm, HyperTree Proof Search (HTPS), inspired by the recent success of AlphaZero. Our model learns from previous proof searches through online training, allowing it to generalize to domains far from the training distribution. We report detailed ablations of our pipeline's main components by studying performance on three environments of increasing complexity. In particular, we show that with HTPS alone, a model trained on annotated proofs manages to prove 65.4% of a held-out set of Metamath theorems, significantly outperforming the previous state of the art of 56.5%. Online training on these unproved theorems increases accuracy to 82.6%. With a similar computational budget, we improve the state of the art on the Lean-based miniF2F-curriculum dataset from 31%…
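The abstract's core idea can be illustrated with a toy sketch. In tactic-based proving, applying a tactic to a goal produces a set of subgoals that must all be proved (an AND node), while different candidate tactics for the same goal are alternatives (an OR node); HTPS searches this AND/OR hypertree guided by a learned policy. The snippet below is a minimal, hypothetical sketch of that structure only, not the paper's algorithm: it does a greedy depth-first search ordered by tactic priors, with none of HTPS's visit counts, value backups, or online training. The names `prove` and `expand` and the prior values are illustrative assumptions.

```python
def prove(goal, expand, proved=None, failed=None, depth=8):
    """Try to prove `goal` in an AND/OR search (toy sketch, not HTPS itself).

    expand(goal) -> list of (tactic, subgoals, prior) triples; an empty
    subgoal list means the tactic closes the goal outright. A goal is
    proved if SOME tactic (OR) proves ALL of its subgoals (AND).
    Returns a proof tree dict, or None if no proof is found.
    """
    if proved is None:
        proved = {}
    if failed is None:
        failed = set()
    if goal in proved:
        return proved[goal]
    if goal in failed or depth == 0:
        return None
    # Greedily try tactics in order of the (hypothetical) model prior.
    for tactic, subgoals, prior in sorted(expand(goal), key=lambda t: -t[2]):
        subproofs = [prove(g, expand, proved, failed, depth - 1) for g in subgoals]
        if all(p is not None for p in subproofs):
            proved[goal] = {"goal": goal, "tactic": tactic, "children": subproofs}
            return proved[goal]
    failed.add(goal)  # memoize dead ends so they are not re-explored
    return None
```

For example, with a toy environment where goal `"G"` has a high-prior tactic whose subgoals include an unprovable goal and a lower-prior tactic that succeeds, the search falls back from the first alternative to the second, which is exactly the OR-branching the real algorithm must manage at scale.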




Proof Simplification and Automated Theorem Proving

The proofs first generated by automated theorem provers are far from opt...

Proving Theorems using Incremental Learning and Hindsight Experience Replay

Traditional automated theorem provers for first-order logic depend on sp...

ENIGMA: Efficient Learning-based Inference Guiding Machine

ENIGMA is a learning-based method for guiding given clause selection in ...

Learning to Prove with Tactics

We implement an automated tactical prover TacticToe on top of the HOL4 in...

Towards Finding Longer Proofs

We present a reinforcement learning (RL) based guidance system for autom...

Deep Network Guided Proof Search

Deep learning techniques lie at the heart of several significant AI adva...

CD Tools – Condensed Detachment and Structure Generating Theorem Proving (System Description)

CD Tools is a Prolog library for experimenting with condensed detachment...