Neural Attribute Machines for Program Generation

05/25/2017
by Matthew Amodio et al.

Recurrent neural networks have achieved remarkable success at generating sequences with complex structure, thanks to advances that include richer embeddings of the input and cures for vanishing gradients. Trained only on sequences from a known grammar, though, they can still struggle to learn the rules and constraints of that grammar. Neural Attribute Machines (NAMs) are equipped with a logical machine that represents the underlying grammar, which is used to teach the constraints to the neural machine by (i) augmenting the input sequence, and (ii) optimizing a custom loss function. Unlike traditional RNNs, NAMs are exposed to the grammar itself, as well as to samples from the language of the grammar. During generation, NAMs make significantly fewer violations of the constraints of the underlying grammar than RNNs trained only on samples from the language of the grammar.
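The abstract names two mechanisms, input augmentation and a custom loss, without spelling out the formulation. The PyTorch sketch below is a minimal illustration of one plausible reading, not the paper's actual method: it assumes the grammar's attribute machine supplies, at each step, a boolean mask of tokens that are valid in the current derivation state; the mask is appended to the input features, and the loss penalizes any probability mass placed on masked-out tokens. All names here (ConstraintAwareLoss, augment_input, valid_mask, penalty_weight) are hypothetical.

```python
import torch
import torch.nn as nn


def augment_input(token_embedding, valid_mask):
    # (i) Augment the input: append the grammar's per-token validity
    # bits as extra features so the network can condition on the
    # constraints of the current derivation state.
    return torch.cat([token_embedding, valid_mask.float()], dim=-1)


class ConstraintAwareLoss(nn.Module):
    """Cross-entropy plus a penalty on the probability mass assigned to
    tokens the grammar rules out (hypothetical sketch, not the paper's
    exact objective)."""

    def __init__(self, penalty_weight: float = 1.0):
        super().__init__()
        self.ce = nn.CrossEntropyLoss()
        self.penalty_weight = penalty_weight

    def forward(self, logits, targets, valid_mask):
        # logits:     (batch, vocab) raw next-token scores
        # targets:    (batch,)       gold next tokens
        # valid_mask: (batch, vocab) True where the grammar permits
        #             the token in the current derivation state
        ce_loss = self.ce(logits, targets)
        probs = torch.softmax(logits, dim=-1)
        # (ii) Custom loss: penalize mass on grammar-invalid tokens.
        invalid_mass = (probs * (~valid_mask).float()).sum(dim=-1)
        return ce_loss + self.penalty_weight * invalid_mass.mean()


# Toy usage with random tensors standing in for a real batch:
batch, vocab = 4, 50
logits = torch.randn(batch, vocab)
targets = torch.randint(0, vocab, (batch,))
valid_mask = torch.rand(batch, vocab) > 0.3
loss = ConstraintAwareLoss(penalty_weight=0.5)(logits, targets, valid_mask)
```

Under this reading, the penalty term pushes the model's distribution toward the grammar's feasible set during training, while the augmented input lets it learn which constraints are active at each step.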


