Using the agile adoption framework to assess agility and guide improvements

04/05/2019
by Lucas Gren, et al.
Göteborgs universitet

Agility implies a set of principles that need to be followed in order to achieve the proposed responsiveness to change. This paper presents how the Agile Adoption Framework can be used to assess agility and pinpoint focus areas for companies that want to improve. Management generally found it useful to get data on possible focus points for improvement in their agile transformation. We believe the suggested tool could be useful as one step in agile process assessment.


1 Introduction

Agile software development has become an extremely popular way of managing projects, and studies have shown the effectiveness of such approaches [4]. Many different measurement models for agility have been proposed, but according to [3], Sidky’s Agile Adoption Framework [5] is the most complete to date. Many companies have implemented an agile approach but lack a way to evaluate the level of implementation, and need contextually specific tools [1]. However, quantitative feedback from the teams could be a useful indication of agility that can be used to study (and improve) agile teams and also to compare them. This paper presents preliminary results from such a tool.

Sidky’s [5] framework assesses the level of agility an organization is ready to implement and recommends which agile methods to adopt. There are two main differences between Sidky’s [5] tool and how we use it here. First, we measure the present level of agility rather than agile potential; to do so, the items in our tool were changed in tense so that they ask about the current situation. Second, our version lets all group members fill out the survey and attaches a statistical confidence interval to the result.

One validation study has tested the tool in this way [2]. The authors report that the team participating in the pretest found the feedback useful, but that the measurement itself shows problems with validity. Despite this, and since the team itself considered the tool useful in the focus group, we believe our tool can be used to guide improvement efforts. Sidky [5] also validated the content of the tool by letting practitioners evaluate the items and their connection to what they consider agility to be.
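The exact form of the confidence interval is not specified in this paper; as a minimal sketch, assuming a normal approximation over a team’s answers to a single 1–5 Likert item, the computation could look as follows:

```python
import math
import statistics

def likert_confidence_interval(answers, z=1.96):
    """Approximate 95% confidence interval for a team's mean Likert
    score (normal approximation; a sketch only -- the paper does not
    specify the exact interval construction used)."""
    n = len(answers)
    mean = statistics.mean(answers)
    if n < 2:
        return (mean, mean)  # no spread estimable from one answer
    se = statistics.stdev(answers) / math.sqrt(n)
    return (mean - z * se, mean + z * se)

# Example: seven team members answer one item on a 1-5 scale.
low, high = likert_confidence_interval([4, 5, 3, 4, 4, 5, 3])
```

With more team members answering, the interval narrows, which is the point of surveying the whole group rather than a single representative.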

Sidky’s framework is divided into “agile levels”, “principles”, “practices and concepts”, and “items or indicators”.

To assess the results, the Agile Adoption Framework includes four steps. First, a weight is calculated for each item (if all items are considered equally important, the weight of 1 is divided by the number of items). Second, weighted intervals are computed: the answer to each item is taken and a pessimistic and an optimistic result are calculated for it, with the Likert scale mapped onto percentages.
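The two computation steps above can be sketched as follows, assuming equal item weights and a 5-point Likert scale divided into equal 20% bands; the exact percentage mapping used in [5] may differ:

```python
def weighted_interval(answers, scale=5):
    """Sketch of the weighted-interval computation, assuming equal
    item weights and a Likert scale split into equal percentage
    bands (an assumption; [5] may map the scale differently).
    An answer k on a 1..scale item contributes the band
    [(k-1)*band, k*band] as its pessimistic/optimistic bounds."""
    weight = 1.0 / len(answers)   # step 1: weight of 1 split over all items
    band = 100.0 / scale          # width of one Likert band, in percent
    pessimistic = sum(weight * (a - 1) * band for a in answers)
    optimistic = sum(weight * a * band for a in answers)
    return pessimistic, optimistic

# Three items answered 4, 5, and 3 on a 5-point scale:
lo, hi = weighted_interval([4, 5, 3])
```

The aggregate result is thus reported as a percentage range rather than a single score, which keeps the uncertainty of the Likert answers visible in the feedback.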

To clarify, this means that the practices “Reflect and tune process”, “Collaborative planning”, “Collaborative teams”, “Empowered and motivated teams”, “Working standards/procedures”, “Knowledge sharing tools”, “Task volunteering”, and “Customer commitment” are assessed in the tool. The actual items (or indicators) are published in [2]. The tool consists of one survey for managers and one survey for developers. To assess an agile practice the analysis method proposed uses answers from both surveys. Some of the items are also used to assess more than one practice.
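To illustrate how a practice score could draw on items from both surveys, here is a hypothetical sketch; the item IDs, answers, and the practice-to-item mapping are invented for illustration, and the real items are published in [2]:

```python
from statistics import mean

# Hypothetical answers (1-5 Likert); real items are published in [2].
manager_answers = {"m1": 4, "m2": 5, "m7": 3}
developer_answers = {"d3": 4, "d4": 2, "d9": 5}

# A practice draws on items from both surveys; note that one item
# (here "m7") can contribute to more than one practice.
practice_items = {
    "Collaborative planning": (["m1", "m7"], ["d3", "d4"]),
    "Task volunteering": (["m2", "m7"], ["d9"]),
}

def practice_score(practice):
    """Average the manager and developer answers mapped to a practice
    (a simple mean is an assumption; the framework's actual
    aggregation uses the weighted intervals described above)."""
    mgr_items, dev_items = practice_items[practice]
    answers = [manager_answers[i] for i in mgr_items] + \
              [developer_answers[i] for i in dev_items]
    return mean(answers)
```

The key structural point is the many-to-many mapping: each practice aggregates items from both the manager and the developer survey, and a single item can feed several practices.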

Below is a description of what the agile characteristics set out to determine. This list is needed to interpret what the different scores mean in the feedback table presented later.

Table 1: The Result for Team A (the “notes below table” can be found in Section 1 of this paper).

(1) Whether or not a collaborative or a command-control relation exists between managers and subordinates. The management style is an indication of whether or not management trusts the developers and vice versa.
(2) Whether or not management is supportive of or resistive to having a collaborative environment.
(3) Whether or not management can be open with customers and developers, i.e., no politics and secrets.
(4) Whether or not people are intimidated or afraid to give honest feedback and to participate in the presence of their managers.
(5) Whether or not the developers are willing to plan in a collaborative environment.
(6) Whether or not the organization does basic planning for its projects.
(7) Whether or not any levels of interaction exist between people, thus laying a foundation for more team work.
(8) Whether or not people believe in group work and helping others or are just concerned about themselves.
(9) Whether or not people are willing to work in teams.
(10) Whether or not people recognize that their input is valuable in group work.
(11) Whether or not the developers see the benefit of and are willing to apply coding standards.
(12) Whether or not developers believe in and can see the benefits of having project information communicated to the whole team.
(13) Whether or not managers believe in and can see the benefits of having project information communicated to the whole team.
(14) Whether or not management will be willing to buy into and can see benefits from employees volunteering for tasks instead of being assigned.
(15) Whether or not developers are willing to see the benefits from volunteering for tasks.
(16) Whether or not management empowers teams with decision-making authority.
(17) Whether or not people are treated in a way that motivates them.
(18) Whether or not managers trust and believe in the technical team in order to truly empower them.
(19) Whether or not developers are willing to commit to reflecting about and tuning the process after each iteration or release.
(20) Whether or not management is willing to commit to reflecting about and tuning the process after each iteration or release.
(21) Whether or not the organization can handle process change in the middle of a project.

2 Feedback to Companies

So far, we have tested the tool on seven teams in three multinational companies: two US-based and one Europe-based. Below are the results and recommendations that were given to each team based on the survey results. As an example, only the result and feedback for one team at Company A are shown in this paper, but all teams were given feedback in the same way.

Results and Recommendations for Company A — One team

Company A could focus on the following aspects to improve their agility (they are explained in more detail below): (1) Collaborative Planning with regard to the Power Distance between management and group members. (2) Task volunteering. (3) Management’s buy-in of reflecting and tuning the process after each iteration or release.

Table 1 shows the results for the team. Collaborative planning, and its aspect of power distance, got a lower score, which means the team would benefit from a flatter and less hierarchical organizational structure when planning projects. Having management plan and set up projects without participation from team members will result in less accurate and suboptimal goals. The team members are the experts on how much work the team can do in a given period of time.

At Team A, “task volunteering” got a lower score from both managers and developers. Task volunteering is also a key to success, since developers need to take full responsibility, together with the team as a whole, for their current tasks and deliverables. Team members are often also better at judging their capacity and making time estimates for tasks.

The last aspect for Team A concerned management’s buy-in of “Reflect and Tune Process”: management is not willing to reflect on and tune the process after each iteration or release. Continuous improvement is key to achieving the highest possible performance. The management of Team A would benefit from a workshop on how agile principles are supposed to work in teams, especially on how the process needs fine-tuning and adaptation to each team or context. A practice that works well in one team might need to be adapted to a new context while the reason behind it is kept; that is, the agile principle behind an implementation can be kept, but the actual implementation must be tuned to, and reflected on in, the new context. Hopefully this will guide them toward accepting and seeing the benefits of reflecting on and tuning the processes, and of task volunteering.

3 Conclusions and Future Work

This paper has presented how the Agile Adoption Framework can be used to assess agility and pinpoint focus areas for companies that want to improve. Management generally found it useful to get data on possible focus points for improvement. Agility implies a set of principles that need to be followed in order to achieve the proposed responsiveness to change. We believe our tool could be useful as one step in agile process assessment.

References

  • [1] R. M. Fontana, S. Reinehr, and A. Malucelli. Maturing in agile: What is it about? In G. Cantone and M. Marchesi, editors, Agile Processes in Software Engineering and Extreme Programming (XP2014), volume 179 of Lecture Notes in Business Information Processing, pages 94–109. Springer, 2014.
  • [2] L. Gren, R. Torkar, and R. Feldt. The prospects of a quantitative measurement of agility: A validation study on an agile maturity model. Journal of Systems and Software, 107:38–49, 2015.
  • [3] O. Ozcan-Top and O. Demirors. Assessment of agile maturity models: A multiple case study. In T. Woronowicz, T. Rout, R. O’Connor, and A. Dorling, editors, Software Process Improvement and Capability Determination, volume 349 of Communications in Computer and Information Science, pages 130–141. Springer Berlin Heidelberg, 2013.
  • [4] P. Serrador and J. K. Pinto. Does Agile work? A quantitative analysis of agile project success. International Journal of Project Management, 33(5):1040–1051, July 2015.
  • [5] A. Sidky. A structured approach to adopting agile practices: The agile adoption framework. PhD thesis, Virginia Polytechnic Institute and State University, 2007.