Code Completion with Neural Attention and Pointer Networks

11/27/2017
by Jian Li, et al.

Intelligent code completion has become an essential tool for accelerating modern software development. To facilitate effective code completion for dynamically-typed programming languages, we apply neural language models learned from large codebases and investigate the effectiveness of the attention mechanism on the code completion task. However, standard neural language models, even with an attention mechanism, cannot correctly predict out-of-vocabulary (OoV) words, which restricts code completion performance. In this paper, inspired by the prevalence of locally repeated terms in program source code and by the recently proposed pointer networks, which can reproduce words from the local context, we propose a pointer mixture network for better predicting OoV words in code completion. Based on the context, the pointer mixture network learns either to generate a within-vocabulary word through an RNN component or to copy an OoV word from the local context through a pointer component. Experiments on two benchmark datasets demonstrate the effectiveness of our attention mechanism and pointer mixture network on the code completion task.
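As a rough illustration of the idea summarized above, the sketch below shows how such a generate-or-copy mixture might be wired up in PyTorch. It is a minimal sketch, not the authors' implementation: the class name PointerMixture, the attention window size, and all layer dimensions are illustrative assumptions.

```python
# Minimal sketch of a pointer mixture model (assumed PyTorch setup; names
# and hyperparameters are illustrative, not taken from the paper).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PointerMixture(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, attn_window=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn_w = nn.Linear(hidden_dim, hidden_dim, bias=False)  # scores past hidden states
        self.vocab_out = nn.Linear(hidden_dim, vocab_size)           # RNN (generation) component
        self.switch = nn.Linear(hidden_dim, 1)                       # gate mixing the two components
        self.attn_window = attn_window

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) indices of the local code context
        emb = self.embed(token_ids)
        states, _ = self.lstm(emb)                       # (batch, seq_len, hidden)
        query = states[:, -1]                            # last hidden state as the query
        memory = states[:, -self.attn_window:]           # attention window over the local context

        # Attention scores over the window -> pointer distribution over positions
        scores = torch.bmm(self.attn_w(memory), query.unsqueeze(2)).squeeze(2)
        pointer_dist = F.softmax(scores, dim=1)          # (batch, window)

        # Within-vocabulary distribution from the RNN component
        vocab_dist = F.softmax(self.vocab_out(query), dim=1)  # (batch, vocab_size)

        # Learned switch decides whether to generate from the vocabulary or copy
        p_gen = torch.sigmoid(self.switch(query))        # (batch, 1)
        return p_gen * vocab_dist, (1 - p_gen) * pointer_dist
```

The pointer distribution is defined over positions in the local context rather than over the vocabulary, so an OoV token can still be predicted by assigning probability mass to the positions where that token occurred earlier in the window.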


Related research

09/25/2019 - Improve Language Modelling for Code Completion through Statement Level Language Model based on Statement Embedding Generated by BiLSTM
Language models such as RNN, LSTM or other variants have been widely use...

04/14/2020 - Code Completion using Neural Attention and Byte Pair Encoding
In this paper, we aim to do code completion based on implementing a Neur...

08/17/2022 - CCTEST: Testing and Repairing Code Completion Systems
Code completion, a highly valuable topic in the software development dom...

06/01/2023 - Better Context Makes Better Code Language Models: A Case Study on Function Call Argument Completion
Pretrained code language models have enabled great progress towards prog...

09/13/2022 - Learning to Prevent Profitless Neural Code Completion
Currently, large pre-trained models are widely applied in neural code co...

09/16/2019 - A Self-Attentional Neural Architecture for Code Completion with Multi-Task Learning
Code completion, one of the most useful features in the integrated devel...

07/05/2020 - Automatically Generating Codes from Graphical Screenshots Based on Deep Autocoder
During software front-end development, the work to convert Graphical Use...
