Bad Smells in Software Analytics Papers

03/14/2018
by Rahul Krishna, et al.

CONTEXT: There has been rapid growth in the use of data analytics to underpin evidence-based software engineering. However, the combination of complex techniques, diverse reporting standards, and complex underlying phenomena is causing some concern as to the reliability of studies. OBJECTIVE: Our goal is to provide guidance for producers and consumers of software analytics studies (computational experiments and correlation studies). METHOD: We propose using "bad smells", i.e. surface indications of deeper problems, a notion popular in the agile software community, and consider how they may manifest in software analytics studies. RESULTS: We provide a list of 11 "bad smells" in decreasing order of severity and illustrate their impact with examples. CONCLUSIONS: We should encourage more debate on what constitutes a "valid" study (so we expect our list to mature over time).
