Making the most of small Software Engineering datasets with modern machine learning

06/29/2021
by Julian Aron Prenner et al.

This paper provides a starting point for Software Engineering (SE) researchers and practitioners faced with the problem of training machine learning models on small datasets. Due to the high costs associated with labelling data, Software Engineering has many small (< 1 000 samples) and medium-sized (< 100 000 samples) datasets. While deep learning has set the state of the art in many machine learning tasks, it has only recently proven effective on small datasets, primarily thanks to pre-training, a semi-supervised learning technique that leverages abundant unlabelled data alongside scarce labelled data. In this work, we evaluate pre-trained Transformer models on a selection of 13 smaller datasets from the SE literature, covering both source code and natural language. Our results suggest that pre-trained Transformers are competitive with, and in some cases superior to, previous models, especially for tasks involving natural language; for source code tasks, in particular on very small datasets, traditional machine learning methods often have the edge. In addition, we experiment with several techniques that ought to aid training on small datasets, including active learning, data augmentation, soft labels, self-training and intermediate-task fine-tuning, and issue recommendations on when they are effective. We also release all the data, scripts, and, most importantly, pre-trained models for the community to reuse on their own datasets.
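
The sketch below illustrates the basic recipe the paper evaluates: fine-tuning a pre-trained Transformer on a small labelled dataset. It is a minimal, illustrative sketch using the Hugging Face transformers and datasets libraries; the microsoft/codebert-base checkpoint, the toy two-example dataset, and all hyperparameters are assumptions for the example, not the paper's actual configuration.

# Illustrative sketch only: fine-tuning a pre-trained Transformer on a
# small labelled SE dataset with Hugging Face `transformers`. The
# checkpoint, hyperparameters, and toy dataset are assumptions, not the
# paper's exact setup.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "microsoft/codebert-base"  # assumed checkpoint; any pre-trained encoder works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Placeholder standing in for one of the small (< 1 000 sample) labelled datasets.
small_dataset = Dataset.from_dict({
    "text": ["if (x = 1) { ... }", "for (i = 0; i < n; i++) sum += a[i];"],
    "label": [1, 0],  # e.g. 1 = defective, 0 = clean
})

def tokenize(batch):
    # Truncate and pad each code or text snippet to a fixed length.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

train_set = small_dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="out",
    num_train_epochs=10,              # small datasets tolerate more epochs
    per_device_train_batch_size=16,
    learning_rate=2e-5,               # a typical fine-tuning rate
)
Trainer(model=model, args=args, train_dataset=train_set).train()

With only a handful of labelled examples, run-to-run variance is high, so common precautions include averaging results over several random seeds and using a low learning rate, as above.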

Related research:

05/24/2022
Deep Learning Meets Software Engineering: A Survey on Pre-Trained Models of Source Code
Recent years have seen the successful application of deep learning to so...

04/06/2021
CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing
Currently, a growing number of mature natural language processing applic...

02/08/2023
Automating Code-Related Tasks Through Transformers: The Impact of Pre-training
Transformers have gained popularity in the software engineering (SE) lit...

08/06/2021
Distilling Transformers for Neural Cross-Domain Search
Pre-trained transformers have recently clinched top spots in the gamut o...

06/15/2022
NatGen: Generative pre-training by "Naturalizing" source code
Pre-trained Generative Language models (e.g. PLBART, CodeT5, SPT-Code) f...

12/03/2021
Multilingual training for Software Engineering
Well-trained machine-learning models, which leverage large amounts of op...

05/26/2023
ChatGPT: A Study on its Utility for Ubiquitous Software Engineering Tasks
ChatGPT (Chat Generative Pre-trained Transformer) is a chatbot launched ...
