On-the-fly Improving Performance of Deep Code Models via Input Denoising

08/19/2023
by Zhao Tian, et al.

Deep learning has been widely adopted to tackle various code-based tasks by building deep code models on large corpora of code snippets. While these deep code models have achieved great success, even state-of-the-art models suffer from noise present in inputs, leading to erroneous predictions. Although it is possible to enhance models through retraining or fine-tuning, this is not a once-and-for-all approach and incurs significant overhead. In particular, such techniques cannot improve the performance of (deployed) models on the fly. Input denoising techniques exist in other domains (such as image processing), but because code input is discrete and must strictly abide by complex syntactic and semantic constraints, those techniques are largely inapplicable to code. In this work, we propose the first input denoising technique (i.e., CodeDenoise) for deep code models. Its key idea is to localize noisy identifiers in (likely) mispredicted inputs and denoise such inputs by cleansing the located identifiers. It does not need to retrain or reconstruct the model; it only cleanses inputs on the fly to improve performance. Our experiments on 18 deep code models (i.e., three pre-trained models with six code-based datasets) demonstrate the effectiveness and efficiency of CodeDenoise. For example, on average, CodeDenoise successfully denoises 21.91% of mispredicted inputs and improves model accuracy by 2.04%, with only 0.48 second spent on each input, substantially outperforming the widely-used fine-tuning strategy.
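The abstract only sketches the workflow (localize noisy identifiers in a likely mispredicted input, cleanse them, and re-query the already-deployed model), so the minimal Python sketch below illustrates that loop. It is not the CodeDenoise implementation: the model.confidence API, the placeholder renaming, and the regex-based identifier extraction are hypothetical stand-ins for the paper's noise-localization and identifier-cleansing components.

```python
import re

# Naive identifier pattern: does not distinguish keywords, strings, or comments.
IDENT = re.compile(r"[A-Za-z_]\w*")

def denoise(code: str, model, placeholder: str = "cleansed_var") -> str:
    """Cleanse suspicious identifiers in a (likely) mispredicted input so the
    deployed model becomes more confident, without retraining the model.
    `model` is a hypothetical object exposing confidence(code) -> float."""
    best_code, best_conf = code, model.confidence(code)
    # Stand-in for the paper's noise-localization step: rename each identifier
    # in isolation and keep the single renaming (if any) that raises the
    # model's prediction confidence the most.
    for name in set(IDENT.findall(code)):
        cleansed = re.sub(rf"\b{re.escape(name)}\b", placeholder, code)
        conf = model.confidence(cleansed)
        if conf > best_conf:
            best_code, best_conf = cleansed, conf
    return best_code
```

Because the loop only edits the input and compares model confidence, it can run at inference time on a deployed model, which is the property the abstract emphasizes over retraining or fine-tuning.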

Related research

- 01/06/2023  Adversarial Attacks on Neural Models of Code via Code Difference Reduction
  Deep learning has been widely used to solve various code-based tasks by ...
- 12/12/2022  Parameter-Efficient Finetuning of Transformers for Source Code
  Pretrained Transformers achieve state-of-the-art performance in various ...
- 03/07/2022  Input-Tuning: Adapting Unfamiliar Inputs to Frozen Pretrained Models
  Recently the prompt-tuning paradigm has attracted significant attention....
- 02/01/2023  CoderEval: A Benchmark of Pragmatic Code Generation with Generative Pre-trained Models
  Code generation models based on the pre-training and fine-tuning paradig...
- 10/30/2019  Ordering Matters: Word Ordering Aware Unsupervised NMT
  Denoising-based Unsupervised Neural Machine Translation (U-NMT) models t...
- 11/24/2021  Supervised Neural Discrete Universal Denoiser for Adaptive Denoising
  We improve the recently developed Neural DUDE, a neural network-based ad...
- 01/18/2022  Using Pre-Trained Models to Boost Code Review Automation
  Code review is a practice widely adopted in open source and industrial p...