Hyperparameter Optimization for AST Differencing

11/20/2020
by Matias Martinez et al.

Computing the differences between two versions of the same program is an essential task for software development and software evolution research. AST differencing is the most advanced way of doing so, and an active research area. Yet, AST differencing still relies on default configurations or manual tweaking. In this paper we present a novel approach named DAT for hyperparameter optimization of AST differencing. We thoroughly state the problem of hyperparameter configuration for AST differencing. We show that our data-driven approach to hyperoptimizing AST differencing systems increases edit-script quality in up to 53% of cases.
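The abstract describes a data-driven search over an AST differ's configuration space, scored by the quality of the resulting edit scripts. As a minimal sketch of that idea, the snippet below runs a random search over a hypothetical parameter grid and keeps the configuration yielding the shortest total edit script. The parameter names (`min_height`, `sim_threshold`, `max_size`) and the cost function are illustrative assumptions, not the actual DAT tooling or GumTree options; a real setup would invoke the differ on a corpus of file-pair revisions.

```python
import random

# Hypothetical parameter space for an AST diff tool.
# These names are illustrative, not the actual DAT/GumTree options.
SEARCH_SPACE = {
    "min_height": [1, 2, 3],
    "sim_threshold": [0.2, 0.4, 0.5, 0.6],
    "max_size": [100, 500, 1000],
}

def edit_script_length(config, file_pair):
    """Stand-in for running the differ and counting edit actions.

    A real implementation would diff the two file versions with the
    given configuration; here we fake a deterministic cost so the
    example is self-contained and runnable.
    """
    old, new = file_pair
    return abs(len(old) - len(new)) + config["min_height"] + int(10 * config["sim_threshold"])

def random_search(corpus, trials=50, seed=0):
    """Return the sampled configuration with the smallest total cost."""
    rng = random.Random(seed)
    best_cfg, best_cost = None, float("inf")
    for _ in range(trials):
        cfg = {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}
        cost = sum(edit_script_length(cfg, pair) for pair in corpus)
        if cost < best_cost:
            best_cfg, best_cost = cfg, cost
    return best_cfg, best_cost

# Toy "corpus" of before/after file pairs.
corpus = [("old code", "new code!"), ("foo()", "foo(x)")]
best, cost = random_search(corpus)
```

Random search is only one option; the same loop structure accommodates grid search or a Bayesian optimizer, with the edit-script length (or a finer-grained quality metric) as the objective.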

