Agile software development has become an extremely popular way of managing projects, and studies have shown the effectiveness of such approaches. Many different measurement models for agility have been proposed, but Sidky’s Agile Adoption Framework has been argued to be the most complete to date. Many companies have implemented an agile approach but lack a way to evaluate the level of implementation and need contextually specific tools. Quantitative feedback from the teams could therefore be a useful indication of agility that can be used to study (and improve) agile teams and also to compare them. This paper presents preliminary results from such a tool.
Two aspects characterize the tool and how we use it here. First, we measure the present level of agility, not agile potential; to this end, the items were rephrased in the present tense so that they ask about the current situation. Second, the tool lets all group members fill out the survey, which allows a statistical confidence interval to be attached to the result. One validation study has been conducted testing the tool in this way. The authors report that the feedback was considered useful by the team participating in the pretest, but that the measurement itself shows problems with validity. Despite that, and since the tool was considered useful by the team itself in the focus group, we believe our tool can be used to guide improvement efforts. Sidky also validated the content of the tool by letting practitioners evaluate the items and their connection to what they consider agility to be.
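The idea of attaching a confidence interval to group answers can be illustrated as follows. This is only a minimal sketch using a normal-approximation interval around the team mean; the exact statistics used by the tool are not specified here, and the function name is ours:

```python
import math

def mean_confidence_interval(scores, z=1.96):
    """Approximate 95% CI (z=1.96) for the mean of the team members' scores.

    scores: Likert answers from all group members for one item or practice.
    Uses the normal approximation with the sample standard deviation.
    """
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)  # sample variance
    half_width = z * math.sqrt(var / n)
    return mean - half_width, mean + half_width
```

With few respondents the interval is wide, which is exactly the point: a team of three agreeing on “4” is weaker evidence than a team of ten doing so.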
Sidky’s framework is divided into “agile levels”, “principles”, “practices and concepts”, and “items or indicators”.
To assess the results, the Agile Adoption Framework includes four steps. First, a weight is calculated for each item (a total weight of 1 is divided by the number of items when all items are considered equally important). Second, weighted intervals are computed: the answer to each item is turned into a pessimistic and an optimistic result, and the Likert scale is mapped onto a percentage.
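The weighting and interval steps above can be sketched in code. This is an illustrative reading, not the framework’s reference implementation: we assume equal item weights, a 5-point Likert scale mapped onto 0–100%, and a one-step band between the pessimistic and optimistic value of each answer:

```python
# Sketch of a weighted pessimistic/optimistic interval for one agile practice.
# Assumptions (ours, not necessarily Sidky's exact scheme): equal item weights,
# a 5-point Likert scale, and each answer credited either its full Likert step
# (optimistic) or only up to the previous step (pessimistic).

def practice_interval(answers, scale_max=5):
    """answers: Likert responses (1..scale_max) for the items of one practice.
    Returns (pessimistic, optimistic) scores as percentages."""
    n = len(answers)
    weight = 1.0 / n               # equal importance: total weight 1 split evenly
    step = 100.0 / scale_max       # one Likert step expressed as a percentage
    pessimistic = optimistic = 0.0
    for a in answers:
        optimistic += weight * (a * step)        # full credit for the answer's step
        pessimistic += weight * ((a - 1) * step) # credit only up to the previous step
    return pessimistic, optimistic
```

For example, answers of 4, 5, and 3 on a 5-point scale yield an interval of roughly 60–80%, making explicit that a single Likert answer is a coarse measurement.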
To clarify, this means that the practices “Reflect and tune process”, “Collaborative planning”, “Collaborative teams”, “Empowered and motivated teams”, “Working standards/procedures”, “Knowledge sharing tools”, “Task volunteering”, and “Customer commitment” are assessed in the tool. The actual items (or indicators) have been published previously. The tool consists of one survey for managers and one for developers. To assess an agile practice, the proposed analysis method uses answers from both surveys. Some items are also used to assess more than one practice.
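The relationship between practices and survey items can be pictured as a simple mapping. All item identifiers below are made up for illustration; the point is only that each practice draws on items from both surveys and that an item may feed into several practices:

```python
# Illustrative mapping (item ids "m1", "d1", ... are hypothetical): each agile
# practice is assessed from items in both the manager and developer surveys.
PRACTICE_ITEMS = {
    "Collaborative planning": {"manager": ["m1", "m2"], "developer": ["d1"]},
    "Task volunteering":      {"manager": ["m3"],       "developer": ["d2", "d3"]},
    # an item such as "d1" could also appear under another practice,
    # since some items are used to assess more than one practice.
}

def answers_for_practice(practice, manager_answers, developer_answers):
    """Collect the Likert answers relevant to one practice from both surveys."""
    items = PRACTICE_ITEMS[practice]
    return ([manager_answers[i] for i in items["manager"]] +
            [developer_answers[i] for i in items["developer"]])
```

The collected answers for a practice can then be fed into the weighting and interval computation described above.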
Below is a description of what the agile characteristics set out to determine; this list is needed to interpret what the different scores mean in the feedback table presented later.
2 Feedback to Companies
So far, we have tested the tool on seven teams at three multinational companies: two US-based and one European-based. Below are the results and recommendations that were given to each team based on the survey results. Only the results and feedback for one team at Company A are shown in this paper as an example, but all teams were given feedback in the same way.
Results and Recommendations for Company A — One team
Company A could focus on the following aspects to improve their agility (they are explained in more detail below): (1) Collaborative Planning with regard to the Power Distance between management and group members. (2) Task volunteering. (3) Management’s buy-in of reflecting and tuning the process after each iteration or release.
Table 1 shows the results for the team. Collaborative planning and its aspect of power distance received a lower score, which means the team would benefit from a flatter and less hierarchical organizational structure when planning projects. Having management plan and set up projects without participation from team members results in less accurate and suboptimal goals; the team members are the experts on how much work the team can do in a given period of time.
At Team A, “task volunteering” received a lower score from both managers and developers. Task volunteering is a key to success, since developers need to take full responsibility for their current tasks and deliverables together with the team as a whole. Team members are also often better at judging their own capability and at making time estimates for tasks.
The last aspect for Team A concerned management’s buy-in of “Reflect and Tune Process”. This means that management is not willing to reflect on and tune the process after each iteration or release. Continuous improvement is key to achieving the highest possible performance. The management of Team A would benefit from a workshop on how agile principles are supposed to work in teams, especially on how the process needs fine-tuning and adaptation to each team and context. A practice that works well in one team might need to be adapted to a new context while the reasoning behind it is kept; that is, the agile principle behind an implementation can be preserved, but the actual implementation must be reflected on and tuned in the new context. Hopefully this will guide them toward accepting and seeing the benefits of reflecting on and tuning the process, and of task volunteering.
3 Conclusions and Future Work
This paper has presented how the Agile Adoption Framework can be used to assess agility and pinpoint focus areas for companies that want to improve. Management generally found it useful to get data on possible focus points for improvement. Agility implies a set of principles that need to be followed in order to achieve the proposed responsiveness to change. We believe our tool could be useful as one step in the agile process assessment.
-  R. M. Fontana, S. Reinehr, and A. Malucelli. Maturing in agile: What is it about? In G. Cantone and M. Marchesi, editors, Agile Processes in Software Engineering and Extreme Programming (XP2014), volume 179 of Lecture Notes in Business Information Processing, pages 94–109. Springer, 2014.
-  L. Gren, R. Torkar, and R. Feldt. The prospects of a quantitative measurement of agility: A validation study on an agile maturity model. Journal of Systems and Software, 107:38–49, 2015.
-  O. Ozcan-Top and O. Demirors. Assessment of agile maturity models: A multiple case study. In T. Woronowicz, T. Rout, R. O’Connor, and A. Dorling, editors, Software Process Improvement and Capability Determination, volume 349 of Communications in Computer and Information Science, pages 130–141. Springer Berlin Heidelberg, 2013.
-  P. Serrador and J. K. Pinto. Does Agile work? A quantitative analysis of agile project success. International Journal of Project Management, 33(5):1040–1051, July 2015.
-  A. Sidky. A structured approach to adopting agile practices: The agile adoption framework. PhD thesis, Virginia Polytechnic Institute and State University, 2007.