
Neural Networks as Artificial Specifications

09/15/2018
by I. S. W. B. Prasetya, et al.
Utrecht University

In theory, a neural network can be trained to act as an artificial specification for a program by showing it samples of the program's executions. In practice, the training turns out to be very hard: programs often operate on discrete domains in which patterns are difficult to discern, and earlier experiments reported too many false positives. This paper revisits an experiment by Vanmali et al., investigating several aspects left unexplored in the original work: the impact of using different learning modes, aggressiveness levels, and abstraction functions. The results are quite promising.
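
To make the idea concrete, the sketch below illustrates one way a network could be used as an artificial specification. It is not the paper's implementation: the example program, the abstraction (feature) function, and the use of scikit-learn's MLPClassifier are all illustrative assumptions. The network is trained on input/output samples observed from a trusted program version and then acts as a pseudo-oracle that flags outputs it disagrees with.

```python
# Minimal sketch (assumptions, not the paper's code): train a small neural
# network on observed executions of a program, then use it as a pseudo-oracle.
import numpy as np
from sklearn.neural_network import MLPClassifier

def program(x: int) -> int:
    # Program under test (illustrative): classify the sign of an integer.
    return 0 if x < 0 else 1

def abstract(x: int) -> list[float]:
    # Hypothetical abstraction function: map the discrete input to a few
    # numeric features the network can learn patterns from.
    return [float(np.sign(x)), float(min(abs(x), 100)) / 100.0]

# Collect execution samples from the (assumed correct) program version.
inputs = np.random.randint(-1000, 1000, size=500)
X = np.array([abstract(x) for x in inputs])
y = np.array([program(x) for x in inputs])

# Train the network to imitate the observed input/output behaviour.
oracle = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
oracle.fit(X, y)

def check(x: int, actual_output: int) -> bool:
    # Use the trained net as an artificial specification: a disagreement
    # between its prediction and the actual output is reported as suspect.
    return bool(oracle.predict([abstract(x)])[0] == actual_output)

print(check(-5, program(-5)))  # behaviour matches the learned specification
print(check(-5, 1))            # an incorrect output should be flagged (False)
```

In this framing, the learning mode, how aggressively disagreements are reported, and the choice of abstraction function are exactly the knobs the paper studies; the encoding above is only one plausible choice.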


Related research:

05/30/2021 · A Rice's Theorem for Abstract Semantics
Classical results in computability theory, notably Rice's theorem, focus...

05/18/2017 · Verifying Programs via Intermediate Interpretation
We explore an approach to verification of programs via program transform...

11/11/2020 · GRCNN: Graph Recognition Convolutional Neural Network for Synthesizing Programs from Flow Charts
Program synthesis is the task to automatically generate programs based o...

12/02/2016 · Probabilistic Neural Programs
We present probabilistic neural programs, a framework for program induct...

09/27/2021 · Self-Replicating Neural Programs
In this work, a neural network is trained to replicate the code that tra...

04/13/2022 · Lessons learned from replicating a study on information-retrieval based test case prioritization
Objective: In this study, we aim to replicate an artefact-based study on...