CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing

by Ahmed Elnaggar, et al.

A growing number of mature natural language processing applications now make people's lives more convenient. Such applications are built from source code, the language of software engineering. However, applications that understand source code in order to ease the software engineering process remain under-researched. At the same time, the transformer model, especially in combination with transfer learning, has proven to be a powerful technique for natural language processing tasks. These breakthroughs point to a promising direction for processing source code and cracking software engineering tasks. This paper describes CodeTrans, an encoder-decoder transformer model for the software engineering domain, and explores the effectiveness of encoder-decoder transformer models on six software engineering tasks comprising thirteen sub-tasks. Moreover, we investigate the effect of different training strategies, including single-task learning, transfer learning, multi-task learning, and multi-task learning with fine-tuning. CodeTrans outperforms the state-of-the-art models on all tasks. To expedite future work in the software engineering domain, we have published our pre-trained CodeTrans models.
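The published pre-trained checkpoints can be loaded with the Hugging Face `transformers` library. A minimal sketch follows; the model ID below is an assumption (one of the CodeTrans checkpoints released under the SEBIS namespace on the Hugging Face Hub), so substitute the checkpoint for the task and model size you need.

```python
# Sketch: generating a natural-language summary for a code snippet with a
# CodeTrans checkpoint. The model ID is an assumption based on the SEBIS
# namespace on the Hugging Face Hub; swap in the checkpoint for your task.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

def summarize_code(
    code: str,
    model_id: str = "SEBIS/code_trans_t5_small_code_documentation_generation_python",
) -> str:
    """Load a seq2seq CodeTrans model and summarize one code snippet."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(code, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_length=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(summarize_code("def add(a, b):\n    return a + b"))
```

Note that the released models are task-specific: each of the thirteen sub-tasks (e.g. code documentation generation per language, commit message generation) has its own checkpoint, so the loading code stays the same while only the model ID changes.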




