Training a First-Order Theorem Prover from Synthetic Data

03/05/2021
by Vlad Firoiu et al.

A major challenge in applying machine learning to automated theorem proving is the scarcity of training data, which is a key ingredient in training successful deep learning models. To tackle this problem, we propose an approach that relies on training purely with synthetically generated theorems, without any human data aside from axioms. We use these theorems to train a neurally-guided saturation-based prover. Our neural prover outperforms the state-of-the-art E-prover on this synthetic data in both time and search steps, and shows significant transfer to the unseen human-written theorems from the TPTP library, where it solves 72% of first-order problems without equality.
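The "neurally-guided saturation-based prover" in the abstract refers to the standard given-clause saturation loop, with the learned model replacing the hand-coded clause-selection heuristic. The sketch below illustrates that loop on propositional clauses (literals as signed integers, binary resolution standing in for full first-order resolution with unification); the `score` callable is a hypothetical stand-in for the paper's neural heuristic, and all names are illustrative, not taken from the authors' code.

```python
import heapq
from itertools import count

def resolve(c1, c2):
    """Binary resolution between two clauses (frozensets of signed ints,
    where negation is arithmetic negation). A stand-in for first-order
    resolution with unification."""
    for lit in c1:
        if -lit in c2:
            yield frozenset((c1 - {lit}) | (c2 - {-lit}))

def saturate(axioms, negated_goal, score, max_steps=10_000):
    """Given-clause saturation loop. `score` plays the role of the
    learned heuristic: lower scores are selected first."""
    tie = count()  # tiebreaker so the heap never compares clauses
    unprocessed = []
    for clause in list(axioms) + list(negated_goal):
        heapq.heappush(unprocessed, (score(clause), next(tie), clause))
    processed = []
    for _ in range(max_steps):
        if not unprocessed:
            return "saturated"  # search space exhausted, no proof
        _, _, given = heapq.heappop(unprocessed)
        if given == frozenset():
            return "proved"  # empty clause: contradiction derived
        # Generate all resolvents of the given clause with processed clauses.
        for other in processed:
            for resolvent in resolve(given, other):
                heapq.heappush(
                    unprocessed, (score(resolvent), next(tie), resolvent)
                )
        processed.append(given)
    return "timeout"
```

For example, with axioms `p` and `p -> q` and negated goal `~q`, the loop derives the empty clause even with a trivial heuristic such as clause length: `saturate([frozenset({1}), frozenset({-1, 2})], [frozenset({-2})], len)` returns `"proved"`. The paper's contribution is training `score` on synthetically generated theorems rather than hand-tuning it.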

Related research:

- Learning to Prove from Synthetic Theorems (06/19/2020)
- Learning to Prove Theorems by Learning to Generate Theorems (02/17/2020)
- Gym-saturation: an OpenAI Gym environment for saturation provers (03/09/2022)
- Proving Theorems using Incremental Learning and Hindsight Experience Replay (12/20/2021)
- Learning to Find Proofs and Theorems by Learning to Refine Search Strategies (05/27/2022)
- In Search of Life: Learning from Synthetic Data to Detect Vital Signs in Videos (04/16/2020)
- Vampire With a Brain Is a Good ITP Hammer (02/06/2021)
