patch2vec: Distributed Representation of Code Changes

11/18/2019
by Rocío Cabrera Lozoya, et al.

Deep learning methods, which have found successful applications in fields such as image classification and natural language processing, have recently been applied to source code analysis as well, owing to the enormous amount of freely available source code (e.g., from open-source software repositories). In this work, we elaborate upon a state-of-the-art approach to the representation of source code that uses information about its syntactic structure, and we adapt it to represent source code changes (i.e., commits). We use this representation to classify security-relevant commits. Because our method uses transfer learning (that is, we train a network on a "pretext task" for which abundant labeled data is available, and then we use that network for the target task of commit classification, for which fewer labeled instances are available), we study the impact of pre-training the network on two different pretext tasks versus using a randomly initialized model. Our results indicate that representations leveraging the structural information obtained through code syntax outperform token-based representations. Furthermore, pre-training on a smaller dataset (>10^4 samples) with a pretext task closely related to the target task outperformed pre-training on a much larger dataset (>10^6 samples) with a pretext task only loosely related to it.
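To make the setup concrete, below is a minimal PyTorch sketch of the kind of pipeline the abstract describes: a code2vec-style encoder that embeds a bag of AST path-contexts and pools them with soft attention, pre-trained on a pretext task and then reused to classify commits from the pre- and post-change code. All names, layer sizes, vocabularies, and the two-version commit encoding are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch of a code2vec-style encoder with transfer learning
    # for commit classification. Shapes, names, and sizes are assumptions.
    import torch
    import torch.nn as nn

    class PathContextEncoder(nn.Module):
        """Embeds a bag of AST path-contexts (start token, path, end token)
        and pools them with soft attention, in the style of code2vec."""
        def __init__(self, n_tokens, n_paths, dim=128):
            super().__init__()
            self.tok = nn.Embedding(n_tokens, dim)
            self.path = nn.Embedding(n_paths, dim)
            self.fc = nn.Linear(3 * dim, dim)
            self.attn = nn.Linear(dim, 1)

        def forward(self, starts, paths, ends):
            # starts/paths/ends: (batch, n_contexts) index tensors
            ctx = torch.tanh(self.fc(torch.cat(
                [self.tok(starts), self.path(paths), self.tok(ends)], dim=-1)))
            w = torch.softmax(self.attn(ctx), dim=1)   # attention over contexts
            return (w * ctx).sum(dim=1)                # (batch, dim) code vector

    class CommitClassifier(nn.Module):
        """Encodes the pre- and post-commit versions of the changed code with
        a shared encoder and classifies the commit (e.g., security-relevant)."""
        def __init__(self, encoder, dim=128, n_classes=2):
            super().__init__()
            self.encoder = encoder                     # shared, possibly pre-trained
            self.head = nn.Linear(2 * dim, n_classes)

        def forward(self, pre, post):
            # pre/post: (starts, paths, ends) tuples for each code version
            v_pre = self.encoder(*pre)
            v_post = self.encoder(*post)
            return self.head(torch.cat([v_pre, v_post], dim=-1))

    # Transfer learning: first train the encoder on an abundant pretext task
    # (e.g., label prediction over >10^6 code samples), then reuse its weights.
    encoder = PathContextEncoder(n_tokens=10_000, n_paths=5_000)
    pretext_head = nn.Linear(128, 1_000)               # hypothetical pretext label space
    # ... pretext-task training loop would go here ...

    # Fine-tune on the (smaller) labeled commit-classification dataset.
    model = CommitClassifier(encoder)
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)

Encoding both code versions with the same encoder and concatenating the two code vectors is one simple way to lift a method-level representation to a commit-level one; the paper's actual adaptation may differ.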

