Learning Structural Edits via Incremental Tree Transformations

01/28/2021
by Ziyu Yao, et al.

While most neural generative models generate outputs in a single pass, the human creative process is usually one of iterative building and refinement. Recent work has proposed models of editing processes, but these mostly focus on editing sequential data and/or only model a single editing pass. In this paper, we present a generic model for incremental editing of structured data (i.e., "structural edits"). In particular, we focus on tree-structured data, taking abstract syntax trees of computer programs as our canonical example. Our editor learns to iteratively generate tree edits (e.g., deleting or adding a subtree) and applies them to the partially edited data, so that the entire editing process can be formulated as consecutive, incremental tree transformations. To show the unique benefits of modeling tree edits directly, we further propose a novel edit encoder for learning to represent edits, as well as an imitation learning method that makes the editor more robust. We evaluate our proposed editor on two source code edit datasets, where results show that, with the proposed edit encoder, our editor significantly improves accuracy over previous approaches that generate the edited program directly in one pass. Finally, we demonstrate that training our editor to imitate experts and correct its mistakes dynamically can further improve its performance.
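The core idea of applying tree edits incrementally to a partially edited AST can be illustrated with a small sketch. The Python below is a simplified, hypothetical illustration only: the Node class, the Delete/Add operation names, and the apply_edit function are assumptions for exposition, not the paper's actual data structures, edit grammar, or model.

```python
# Minimal sketch of incremental tree edits on a toy AST.
# All names here are illustrative assumptions, not the paper's implementation.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)

    def __str__(self):
        if not self.children:
            return self.label
        return f"({self.label} {' '.join(str(c) for c in self.children)})"


@dataclass
class Delete:
    """Remove the child subtree at `index` under the node at `parent_path`."""
    parent_path: List[int]
    index: int


@dataclass
class Add:
    """Insert `subtree` at `index` under the node at `parent_path`."""
    parent_path: List[int]
    index: int
    subtree: Node


def locate(root: Node, path: List[int]) -> Node:
    # Follow a list of child indices from the root to the target node.
    node = root
    for i in path:
        node = node.children[i]
    return node


def apply_edit(root: Node, edit) -> Node:
    """Apply one edit in place and return the (partially edited) tree."""
    parent = locate(root, edit.parent_path)
    if isinstance(edit, Delete):
        parent.children.pop(edit.index)
    elif isinstance(edit, Add):
        parent.children.insert(edit.index, edit.subtree)
    return root


if __name__ == "__main__":
    # AST for `x + 1`
    tree = Node("+", [Node("x"), Node("1")])

    # Edit `x + 1` into `x + (y * 1)` as two consecutive tree transformations.
    edits = [
        Delete(parent_path=[], index=1),                      # drop the leaf `1`
        Add(parent_path=[], index=1,
            subtree=Node("*", [Node("y"), Node("1")])),       # add subtree `y * 1`
    ]
    for e in edits:
        tree = apply_edit(tree, e)
        print(tree)  # prints each intermediate, partially edited tree
```

In this sketch, each step operates on the output of the previous one, which is the sense in which the editing process is a sequence of incremental tree transformations; the paper's editor learns to generate such edit sequences rather than producing the final tree in one pass.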


Related research

Learning to Model Editing Processes (05/24/2022)
Most existing sequence generation models produce outputs in one pass, us...

A Structural Model for Contextual Code Changes (05/27/2020)
We address the problem of predicting edit completions based on a learned...

Coeditor: Leveraging Contextual Changes for Multi-round Code Auto-editing (05/29/2023)
Developers often dedicate significant time to maintaining and refactorin...

An Imitation Learning Curriculum for Text Editing with Non-Autoregressive Models (03/17/2022)
We propose a framework for training non-autoregressive sequence-to-seque...

Assessing the Effectiveness of Syntactic Structure to Learn Code Edit Representations (06/11/2021)
In recent times, it has been shown that one can use code as data to aid ...

Fix Bugs with Transformer through a Neural-Symbolic Edit Grammar (04/13/2022)
We introduce NSEdit (neural-symbolic edit), a novel Transformer-based co...

PhotoApp: Photorealistic Appearance Editing of Head Portraits (03/13/2021)
Photorealistic editing of portraits is a challenging task as humans are ...
