Guiding Theorem Proving by Recurrent Neural Networks

05/20/2019
by Bartosz Piotrowski, et al.

We describe two theorem proving tasks -- premise selection and internal guidance -- for which machine learning has recently been used with some success. We argue, however, that the existing methods do not correspond to the way humans approach these tasks. In particular, the existing methods so far lack the notion of a state that is updated each time a choice in the reasoning process is made. To address this, we propose an analogy with tasks such as machine translation, where stateful architectures such as recurrent neural networks have recently been very successful. We then develop and publish a series of sequence-to-sequence data sets that correspond to the theorem proving tasks under several encodings, and provide the first experimental evaluation of the performance of recurrent neural networks on such tasks.
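To make the analogy concrete, here is a minimal sketch of the idea of a state updated after every reasoning step. All names (`conjecture_tokens`, `target_premises`, `rnn_step`, `embed`) and the toy embedding are hypothetical illustrations, not the paper's actual encoding or model; a real system would use a learned RNN such as an LSTM.

```python
import math

# Hypothetical framing of premise selection as a sequence-to-sequence
# task: the conjecture is tokenized into an input sequence, and the
# target output is a sequence of premise names.
conjecture_tokens = ["!", "[", "X", "]", ":", "p", "(", "X", ")"]
target_premises = ["axiom_p_all", "modus_ponens"]  # illustrative only

def embed(token, dim=4):
    # Deterministic toy embedding; a real model learns its embeddings.
    return [((sum(map(ord, token)) * (i + 3)) % 100) / 100.0
            for i in range(dim)]

def rnn_step(state, token_vec, w_state=0.5, w_input=0.5):
    """One Elman-style recurrent update: the hidden state is revised
    after every token, mirroring the point that each choice in the
    reasoning process should update a state."""
    return [math.tanh(w_state * s + w_input * x)
            for s, x in zip(state, token_vec)]

# Fold the whole conjecture into a single fixed-size hidden state.
state = [0.0] * 4
for tok in conjecture_tokens:
    state = rnn_step(state, embed(tok))

# The final state summarizes the entire input sequence and would be
# decoded, token by token, into the target premise sequence.
print(len(state))
```

The contrast with stateless methods is in the loop: every token changes `state`, so later choices are conditioned on everything seen so far, which is exactly what encoder-decoder translation models exploit.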


