A Convolutional Attention Network for Extreme Summarization of Source Code

02/09/2016
by Miltiadis Allamanis, et al.

Attention mechanisms in neural networks have proved useful for problems in which the input and output do not have fixed dimension. Often there exist features that are locally translation-invariant and would be valuable for directing the model's attention, but previous attentional architectures are not constructed to learn such features specifically. We introduce an attentional neural network that employs convolution on the input tokens to detect local time-invariant and long-range topical attention features in a context-dependent way. We apply this architecture to the problem of extreme summarization of source code snippets into short, descriptive, function name-like summaries. Using those features, the model sequentially generates a summary by marginalizing over two attention mechanisms: one that predicts the next summary token based on the attention weights of the input tokens, and another that can copy a code token as-is directly into the summary. We demonstrate our convolutional attention neural network's performance on 10 popular Java projects, showing that it achieves better performance than previous attentional mechanisms.
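The sketch below illustrates the idea described in the abstract: convolutions over the input code tokens produce attention features, which feed a generation attention, a copy attention, and a gate that mixes the two distributions. It is a minimal PyTorch illustration under assumed shapes and names (ConvAttention, conv_dim, kernel width, the prev_state argument), not the authors' original implementation.

```python
# Minimal sketch of a single decoding step of a convolutional attention model
# with a copy mechanism. All class, argument, and dimension names are
# illustrative assumptions, not the paper's original (Theano-based) code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, conv_dim=32, kernel=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Convolutions over the input token sequence detect local,
        # translation-invariant attention features.
        self.conv1 = nn.Conv1d(embed_dim, conv_dim, kernel, padding=kernel // 2)
        self.conv2 = nn.Conv1d(conv_dim, conv_dim, kernel, padding=kernel // 2)
        # Two attention heads: one for generating the next summary token
        # and one for copying an input code token verbatim.
        self.attn_gen = nn.Conv1d(conv_dim, 1, kernel, padding=kernel // 2)
        self.attn_copy = nn.Conv1d(conv_dim, 1, kernel, padding=kernel // 2)
        # Scalar gate deciding how much probability mass goes to copying.
        self.copy_gate = nn.Linear(conv_dim, 1)
        self.out = nn.Linear(embed_dim, vocab_size)

    def forward(self, code_tokens, prev_state):
        # code_tokens: (batch, seq_len) token ids
        # prev_state:  (batch, conv_dim) decoder state from previously
        #              generated summary tokens (e.g. a GRU state)
        E = self.embed(code_tokens).transpose(1, 2)        # (batch, embed, seq)
        h = F.relu(self.conv1(E))
        # Condition the convolutional features on the decoder state,
        # making the attention context-dependent.
        h = h * prev_state.unsqueeze(-1)
        h = F.relu(self.conv2(h))
        alpha = F.softmax(self.attn_gen(h).squeeze(1), dim=-1)   # generation attention
        kappa = F.softmax(self.attn_copy(h).squeeze(1), dim=-1)  # copy attention
        lam = torch.sigmoid(self.copy_gate(h.mean(dim=-1)))      # (batch, 1)
        # Attention-weighted context over input embeddings feeds the vocab softmax.
        context = torch.bmm(alpha.unsqueeze(1), E.transpose(1, 2)).squeeze(1)
        gen_probs = F.softmax(self.out(context), dim=-1)
        # Scatter the copy attention onto the vocabulary, then marginalize
        # over the generate-vs-copy choice.
        copy_probs = torch.zeros_like(gen_probs).scatter_add_(1, code_tokens, kappa)
        return (1 - lam) * gen_probs + lam * copy_probs


if __name__ == "__main__":
    model = ConvAttention(vocab_size=1000)
    tokens = torch.randint(0, 1000, (2, 30))  # batch of 2 snippets, 30 tokens each
    state = torch.ones(2, 32)                 # placeholder decoder state
    probs = model(tokens, state)
    print(probs.shape, probs.sum(dim=-1))     # (2, 1000); each row sums to ~1
```

Mixing the generation and copy distributions through a learned gate is what lets the model emit rare identifiers (e.g. project-specific names) that never appear in the summary vocabulary.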


Related research

A Transformer-based Approach for Source Code Summarization (05/01/2020)
Generating a readable summary that describes the functionality of a prog...

Tram: A Token-level Retrieval-augmented Mechanism for Source Code Summarization (05/18/2023)
Automatically generating human-readable text describing the functionalit...

Towards Modeling Human Attention from Eye Movements for Neural Source Code Summarization (05/16/2023)
Neural source code summarization is the task of generating natural langu...

AST-MHSA: Code Summarization using Multi-Head Self-Attention (08/10/2023)
Code summarization aims to generate concise natural language description...

A Topic Guided Pointer-Generator Model for Generating Natural Language Code Summaries (07/04/2021)
Code summarization is the task of generating natural language descriptio...

Not All Attention Is Needed: Gated Attention Network for Sequence Data (12/01/2019)
Although deep neural networks generally have fixed network structures, t...

Topic-Aware Evaluation and Transformer Methods for Topic-Controllable Summarization (06/09/2022)
Topic-controllable summarization is an emerging research area with a wid...
