TreeBERT: A Tree-Based Pre-Trained Model for Programming Language

05/26/2021
by Xue Jiang, et al.

Source code can be parsed into an abstract syntax tree (AST) according to the syntax rules of its programming language. However, little pre-training work has incorporated this tree structure into the learning process. In this paper, we present TreeBERT, a tree-based pre-trained model for improving programming language-oriented generation tasks. To exploit tree structure, TreeBERT represents the AST corresponding to the code as a set of composition paths and introduces a node position embedding. The model is trained with a hybrid objective combining tree masked language modeling (TMLM) and node order prediction (NOP). TMLM uses a novel masking strategy, designed around the characteristics of trees, to help the model understand the AST and infer its missing semantics. With NOP, TreeBERT captures syntactic structure by learning the order constraints on nodes in the AST. We pre-trained TreeBERT on datasets covering multiple programming languages. On code summarization and code documentation tasks, TreeBERT outperforms other pre-trained models as well as state-of-the-art models designed specifically for these tasks. Furthermore, TreeBERT performs well when transferred to programming languages unseen during pre-training.
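To make the "composition path" idea concrete, the sketch below linearizes an AST into root-to-terminal paths using Python's built-in ast module. This is only an illustration of the general technique, not TreeBERT's implementation: the paper defines its own path extraction, node position embedding, and masking strategy, and the token_of helper here is a hypothetical simplification.

```python
import ast

def token_of(node):
    """Return a terminal token carried by `node` (identifier, constant,
    function name, ...), or None if it carries none."""
    for field in ("id", "arg", "name", "attr", "value"):
        v = getattr(node, field, None)
        if v is not None and not isinstance(v, (ast.AST, list)):
            return str(v)
    return None

def ast_paths(source):
    """Yield root-to-terminal paths through the AST of `source` as
    lists of node-type names, ending in a surface token when present."""
    def walk(node, prefix):
        prefix = prefix + [type(node).__name__]
        token = token_of(node)
        # Load/Store/Del context markers add no information here.
        children = [c for c in ast.iter_child_nodes(node)
                    if not isinstance(c, ast.expr_context)]
        if token is not None:
            yield prefix + [token]
        elif not children:
            yield prefix
        for child in children:
            yield from walk(child, prefix)

    yield from walk(ast.parse(source), [])

for path in ast_paths("def add(a, b):\n    return a + b"):
    print(" -> ".join(path))
```

On this snippet the sketch prints paths such as Module -> FunctionDef -> add and Module -> FunctionDef -> Return -> BinOp -> Name -> a, i.e. a set of paths that together cover the tree; TreeBERT-style objectives would then mask and predict elements along such paths.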


Related research

09/17/2020 · GraphCodeBERT: Pre-training Code Representations with Data Flow
Pre-trained models for programming language have achieved dramatic empir...

02/14/2022 · What Do They Capture? – A Structural Analysis of Pre-Trained Language Models for Source Code
Recently, many pre-trained language models for source code have been pro...

03/10/2023 · Model-Agnostic Syntactical Information for Pre-Trained Programming Language Models
Pre-trained Programming Language Models (PPLMs) achieved many recent sta...

01/20/2022 · AstBERT: Enabling Language Model for Code Understanding with Abstract Syntax Tree
Using a pre-trained language model (i.e. BERT) to apprehend source codes...

09/30/2019 · Structural Language Models of Code
We address the problem of any-code completion - generating a missing pie...

05/27/2022 · Understanding Long Programming Languages with Structure-Aware Sparse Attention
Programming-based Pre-trained Language Models (PPLMs) such as CodeBERT h...

03/14/2021 · Improving Code Summarization with Block-wise Abstract Syntax Tree Splitting
Automatic code summarization frees software developers from the heavy bu...
