SuperCoder: Program Learning Under Noisy Conditions From Superposition of States

12/07/2020
by Ali Davody, et al.

We propose a new method for program learning in a Domain Specific Language (DSL) that is based on gradient descent, with no direct search. The first component of our method is a probabilistic representation of the DSL variables. At each timestep in the program sequence, different DSL functions are applied to the DSL variables with certain probabilities, leading to different possible outcomes. Rather than handling each of these outcomes separately, whose number grows exponentially with the number of timesteps, we collect them into a superposition of variables that captures the information in a single, fuzzy state. At the final timestep, this state is compared with the ground-truth output through a loss function. The second component of our method is an attention-based recurrent neural network, which provides a suitable initialization point for the gradient descent that optimizes the probabilistic representation. The method we have developed surpasses the state of the art for synthesising long programs and is able to learn programs under noise.
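
The abstract describes the mechanism only at a high level, so the following is a minimal sketch of the superposition idea (not the authors' code). It assumes a toy DSL with just two functions, increment and double, values restricted to integers modulo V, a cross-entropy loss against the ground-truth output, and plain gradient descent from a fixed starting point in place of the attention-based RNN initialization; all names and hyperparameters are illustrative.

```python
# Minimal sketch of program learning via a superposition of states (JAX).
# Assumptions (not from the paper): values live in {0, ..., V-1}, the toy DSL
# has two functions (increment and double, both mod V), and the per-timestep
# function choice is a learned softmax that is fit by plain gradient descent.

import jax
import jax.numpy as jnp

V = 8          # size of the toy value domain
T = 3          # program length (number of timesteps)

def increment(p):
    # Shift probability mass: mass at value v moves to (v + 1) mod V.
    return jnp.roll(p, 1)

def double(p):
    # Push mass from value v to (2 * v) mod V via a scatter-add.
    idx = (2 * jnp.arange(V)) % V
    return jnp.zeros(V).at[idx].add(p)

FUNCS = [increment, double]

def run_program(logits, p0):
    # logits: (T, len(FUNCS)) unnormalized function choices per timestep.
    p = p0
    for t in range(T):
        w = jax.nn.softmax(logits[t])
        # Superposition: mix the outcomes of all DSL functions into one fuzzy state
        # instead of branching into exponentially many separate executions.
        p = sum(w[k] * f(p) for k, f in enumerate(FUNCS))
    return p

def loss(logits, p0, target):
    # Cross-entropy between the final fuzzy state and the ground-truth value.
    p_final = run_program(logits, p0)
    return -jnp.log(p_final[target] + 1e-9)

# Toy task: input value 1, desired output 6 (e.g. increment, increment, double).
p0 = jnp.zeros(V).at[1].set(1.0)
target = 6
logits = jnp.zeros((T, len(FUNCS)))   # the paper's RNN would supply a better init here

grad_fn = jax.jit(jax.grad(loss))
for step in range(300):
    logits = logits - 0.5 * grad_fn(logits, p0, target)

print(jax.nn.softmax(logits, axis=-1))  # per-timestep function probabilities
```

In this sketch, the trained softmax weights tend to concentrate on one function per timestep, and reading off the argmax at each step yields a candidate discrete program; the paper's contribution is making this optimization work for long programs and noisy outputs by combining the superposition representation with a learned initialization.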


