Latent Attention For If-Then Program Synthesis

11/07/2016
by Xinyun Chen, et al.

Automatic translation from natural language descriptions into programs is a longstanding challenging problem. In this work, we consider a simple yet important sub-problem: translation from textual descriptions to If-Then programs. We devise a novel neural network architecture for this task which we train end-to-end. Specifically, we introduce Latent Attention, which computes multiplicative weights for the words in the description in a two-stage process with the goal of better leveraging the natural language structures that indicate the relevant parts for predicting program elements. Our architecture reduces the error rate by 28.57% compared to prior art. We also propose a one-shot learning scenario of If-Then program synthesis and simulate it with our existing dataset. We demonstrate a variation on the training procedure for this scenario that outperforms the original procedure, significantly closing the gap to the model trained with all data.

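To make the two-stage idea concrete, the following is a minimal, illustrative sketch in plain NumPy of a latent-attention-style layer: a first softmax scores how much each word of the description matters in general, a second stage redistributes that weight across words, and the two are combined multiplicatively before a weighted sum of word embeddings feeds a classifier over program elements (e.g. trigger channels). This is not the authors' exact architecture; all function names, parameter shapes, and the token-token scoring used in the second stage are assumptions made for illustration.

import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def latent_attention_sketch(tokens, emb_attn, emb_latent, emb_out, u, W_out):
    """Two-stage multiplicative attention over a tokenized description.

    Illustrative sketch only; parameter names and shapes are assumptions.
    tokens     : array of token ids for the description (length n)
    emb_attn   : (V, d) embeddings used for the first-stage (latent) scores
    emb_latent : (V, d) embeddings used for the second-stage (active) scores
    emb_out    : (V, d) embeddings mixed into the output representation
    u          : (d,)   trainable vector scoring each token's general importance
    W_out      : (d, C) classifier weights over C program elements
    """
    X_a = emb_attn[tokens]     # (n, d)
    X_l = emb_latent[tokens]   # (n, d)
    X_o = emb_out[tokens]      # (n, d)

    # Stage 1: latent attention -- how much each word matters in general.
    latent = softmax(X_a @ u)                  # (n,)

    # Stage 2: active attention -- each word redistributes weight over words,
    # here via a simple token-token compatibility score (an assumption).
    active = softmax(X_l @ X_l.T, axis=-1)     # (n, n), rows sum to 1

    # Multiplicative combination of the two stages gives final per-word weights.
    weights = active.T @ latent                # (n,)
    weights = weights / weights.sum()

    # Weighted sum of output embeddings, then a softmax over program elements.
    o = weights @ X_o                          # (d,)
    return softmax(o @ W_out)                  # (C,)

# Tiny usage example with random parameters.
rng = np.random.default_rng(0)
V, d, C, n = 50, 16, 10, 6
probs = latent_attention_sketch(
    tokens=rng.integers(0, V, size=n),
    emb_attn=rng.normal(size=(V, d)),
    emb_latent=rng.normal(size=(V, d)),
    emb_out=rng.normal(size=(V, d)),
    u=rng.normal(size=d),
    W_out=rng.normal(size=(d, C)),
)
print(probs.shape, probs.sum())  # (10,) with probabilities summing to ~1.0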