Relevance Transformer: Generating Concise Code Snippets with Relevance Feedback

07/06/2020
by Carlos Gemmell, et al.

Tools capable of automatic code generation have the potential to augment programmers' capabilities. While straightforward code retrieval is incorporated into many IDEs, explicit code generation is an emerging area. Code generation is currently approached as a Machine Translation task, with Recurrent Neural Network (RNN) based encoder-decoder architectures trained on code-description pairs. In this work we introduce and study modern Transformer architectures for this task. We further propose a new model, the Relevance Transformer, that incorporates external knowledge using pseudo-relevance feedback. The Relevance Transformer biases the decoding process toward similarity with existing retrieved code while enforcing diversity. We perform experiments on multiple standard benchmark datasets for code generation, including Django, Hearthstone, and CoNaLa. The results show improvements over state-of-the-art methods based on BLEU evaluation. The Relevance Transformer demonstrates the potential of Transformer-based architectures for code generation and introduces a method of incorporating pseudo-relevance feedback during inference.
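The core idea of biasing decoding with pseudo-relevance feedback can be sketched in a few lines. The snippet below is an illustrative toy, not the paper's actual model: the function names (`relevance_bonus`, `biased_decode_step`), the log-frequency bonus, and the repetition penalty are assumptions chosen to convey the mechanism, namely boosting tokens that appear in retrieved code while penalizing tokens already generated to enforce diversity.

```python
import math
from collections import Counter

def relevance_bonus(retrieved_snippets, vocab, weight=0.5):
    """Per-token bonus from pseudo-relevant retrieved code.

    Tokens appearing in the retrieved snippets receive a boost
    proportional to the log of their frequency (an assumed scheme
    for illustration only).
    """
    counts = Counter(tok for snip in retrieved_snippets for tok in snip.split())
    return {tok: weight * math.log(1 + counts.get(tok, 0)) for tok in vocab}

def biased_decode_step(logits, bonus, generated, diversity_penalty=1.0):
    """One greedy decoding step over a token->logit dict.

    Score = model logit + relevance bonus - penalty for each prior
    occurrence of the token in the generated prefix (diversity term).
    """
    scored = {
        tok: score + bonus.get(tok, 0.0) - diversity_penalty * generated.count(tok)
        for tok, score in logits.items()
    }
    return max(scored, key=scored.get)

# Toy usage: "x" appears twice in retrieved code, so it can overtake
# a token with a slightly higher raw logit.
vocab = ["print", "(", ")", "x", "return"]
bonus = relevance_bonus(["print ( x )", "return x"], vocab)
logits = {"return": 1.1, "x": 1.0}
print(biased_decode_step(logits, bonus, generated=[]))     # retrieval-boosted pick
print(biased_decode_step(logits, bonus, generated=["x"]))  # diversity penalty flips it
```

The diversity term here is a simple repetition penalty; the paper's mechanism operates inside a Transformer decoder, but the scoring shape (model score plus retrieval-derived bias minus a redundancy penalty) is the same at a high level.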


