Dynamically Relative Position Encoding-Based Transformer for Automatic Code Edit

05/26/2022
by   Shiyi Qi, et al.

Adapting Deep Learning (DL) techniques to automate non-trivial coding activities, such as code documentation and defect detection, has been studied intensively in recent years. Learning to predict code changes is one of the most popular and essential of these investigations. Prior studies have shown that DL techniques such as Neural Machine Translation (NMT) can be used to learn meaningful code changes, including bug fixes and code refactorings. However, NMT models may encounter a bottleneck when modeling long sequences and are therefore limited in accurately predicting code changes. In this work, we design a Transformer-based approach, since the Transformer has proven effective at capturing long-term dependencies. Specifically, we propose a novel model named DTrans. To better incorporate the local structure of code, i.e., statement-level information in this paper, DTrans equips the multi-head attention of the Transformer with dynamically relative position encoding. Experiments on benchmark datasets demonstrate that DTrans generates patches more accurately than state-of-the-art methods, improving performance by 5.45%-46.57% in terms of the exact-match metric across different datasets. Moreover, DTrans locates the lines to change with 1.75%-24.21% higher accuracy than existing methods.
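The mechanism described above is a relative position encoding that incorporates statement-level structure into multi-head attention. Below is a minimal, single-head sketch of that general idea in PyTorch, assuming a Shaw-style learned relative-position bias computed over statement distances and a `stmt_ids` tensor mapping each token to its enclosing statement; the class name, parameters, and exact formulation are illustrative assumptions, not the authors' released implementation of DTrans.

```python
# Hedged sketch: single-head attention with a learned relative-position bias,
# where relative distance is measured between the statements that tokens
# belong to (an assumed reading of "dynamically relative position encoding").
import torch
import torch.nn as nn
import torch.nn.functional as F

class StatementRelativeAttention(nn.Module):
    def __init__(self, d_model: int, max_rel_dist: int = 16):
        super().__init__()
        self.d_model = d_model
        self.max_rel_dist = max_rel_dist
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # One learned key embedding per clipped relative distance (Shaw et al. style).
        self.rel_key = nn.Embedding(2 * max_rel_dist + 1, d_model)

    def forward(self, x: torch.Tensor, stmt_ids: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, d_model) token representations
        # stmt_ids: (seq_len,) index of the statement each token belongs to
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Relative distance at the statement level, clipped to a fixed window
        # and shifted into the embedding index range [0, 2 * max_rel_dist].
        rel = stmt_ids.unsqueeze(1) - stmt_ids.unsqueeze(0)          # (L, L)
        rel = rel.clamp(-self.max_rel_dist, self.max_rel_dist) + self.max_rel_dist
        rel_k = self.rel_key(rel)                                    # (L, L, d_model)
        # Content score plus relative-position score.
        scores = q @ k.transpose(0, 1)                               # (L, L)
        scores = scores + torch.einsum('id,ijd->ij', q, rel_k)
        scores = scores / self.d_model ** 0.5
        attn = F.softmax(scores, dim=-1)
        return attn @ v

# Toy usage: 6 tokens spread over 3 statements.
x = torch.randn(6, 32)
stmt_ids = torch.tensor([0, 0, 1, 1, 2, 2])
out = StatementRelativeAttention(32)(x, stmt_ids)
print(out.shape)  # torch.Size([6, 32])
```

Measuring distances between statements rather than individual tokens is what lets the bias reflect the local, statement-level structure of code; the multi-head and dynamic aspects of the actual DTrans model are not reproduced here.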

