
A Syntactic Neural Model for General-Purpose Code Generation

by Pengcheng Yin et al.

We consider the problem of parsing natural language descriptions into source code written in a general-purpose programming language like Python. Existing data-driven methods treat this problem as a language generation task without considering the underlying syntax of the target programming language. Informed by previous work in semantic parsing, in this paper we propose a novel neural architecture powered by a grammar model to explicitly capture the target syntax as prior knowledge. Experiments find this an effective way to scale up to generation of complex programs from natural language descriptions, achieving state-of-the-art results that well outperform previous code generation and semantic parsing approaches.
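The key idea above is that the decoder emits a syntax tree under the grammar of the target language rather than a flat token sequence, so every output is syntactically well-formed by construction. As a hedged illustration (not the paper's neural model), the sketch below uses Python's standard `ast` module to build the tree for `sorted(my_list, reverse=True)` node by node, mirroring how each decoding step applies one production of the Python grammar, and then obtains surface code deterministically by unparsing:

```python
import ast

# Illustrative only: each node construction stands in for one
# grammar-constrained decoding step (Expr -> Call, Call -> func,
# args, keywords, ...). The names `sorted` and `my_list` are just
# example identifiers.
call = ast.Call(
    func=ast.Name(id="sorted", ctx=ast.Load()),
    args=[ast.Name(id="my_list", ctx=ast.Load())],
    keywords=[ast.keyword(arg="reverse", value=ast.Constant(value=True))],
)
tree = ast.Module(body=[ast.Expr(value=call)], type_ignores=[])
ast.fix_missing_locations(tree)

# Because generation targets the AST, the result is guaranteed to
# parse; the surface program is recovered by unparsing the tree.
code = ast.unparse(tree)
print(code)
```

A token-level sequence model has no such guarantee and can emit unbalanced brackets or invalid keywords; constraining decoding to grammar productions rules those errors out.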




Related Papers

Program Synthesis and Semantic Parsing with Learned Code Idioms

Program synthesis of general-purpose source code from natural language s...

Incorporating External Knowledge through Pre-training for Natural Language to Code Generation

Open-domain code generation aims to generate code in a general-purpose p...

CODEP: Grammatical Seq2Seq Model for General-Purpose Code Generation

General-purpose code generation (GPCG) aims to automatically convert the...

TreeGen: A Tree-Based Transformer Architecture for Code Generation

A code generation system generates programming language code based on an...

A Grammar-Based Structural CNN Decoder for Code Generation

Code generation maps a program description to executable source code in ...

Learning Programmatic Idioms for Scalable Semantic Parsing

Programmers typically organize executable source code using high-level c...

Teach me how to Label: Labeling Functions from Natural Language with Text-to-text Transformers

Annotated data has become the most important bottleneck in training accu...

Code Repositories


A syntactic neural model for parsing natural language to executable code
