Thursday, September 6, 2018

Agile Metrics Proposal

The Chief Product Officer in my company asked me what metrics can be used to measure value delivery in an Agile project. His initial suggestions leaned toward measuring the time logged by individuals, the number of story points delivered, the number of user stories completed, and so on. I felt that such an individual-centric, metric-driven mindset is an anti-pattern to self-organization and to the working agreements within a team. Recalling the quote attributed to Einstein that not everything that can be counted counts, I emphasized that we should measure what matters.

Agile projects are predominantly characterized by a self-organized team iteratively delivering incremental value to the customer. So, it is important to measure the value delivered through iterations and releases rather than individual contributions. I proposed four categories of metrics so that we understand the rationale behind each one and do not use the numbers against the team.

Efficiency: This category is tactical in nature and aims at measuring how well the team is delivering value to the customer. Some example measures include the following (a short calculation sketch follows the list):

  1. Team Velocity - Work done by the team over sprints
  2. Cycle Time - Elapsed time from conception of an item to delivered value
  3. Burn Rate - Rate at which releases are consuming backlog
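
To make these concrete, here is a minimal Python sketch of how the three measures could be computed from simple sprint records. The sample numbers, dates, and field names are hypothetical, not taken from any particular tool:

    from datetime import date
    from statistics import mean

    # Hypothetical sprint records: story points completed in each sprint.
    sprint_points_done = [21, 18, 25, 23]

    # Team Velocity: average work completed per sprint.
    velocity = mean(sprint_points_done)

    # Cycle Time: elapsed days from conception of an item to delivered value.
    items = [
        {"conceived": date(2018, 8, 1), "delivered": date(2018, 8, 20)},
        {"conceived": date(2018, 8, 5), "delivered": date(2018, 8, 30)},
    ]
    avg_cycle_time = mean((i["delivered"] - i["conceived"]).days for i in items)

    # Burn Rate: share of the release backlog consumed per sprint.
    release_backlog_points = 200
    burn_rate = velocity / release_backlog_points

    print(f"Velocity: {velocity:.1f} points/sprint")
    print(f"Average cycle time: {avg_cycle_time:.1f} days")
    print(f"Burn rate: {burn_rate:.1%} of the release backlog per sprint")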

Collaboration: This category focuses on self-organization. With the growing popularity of virtual and remote teams, it is important to measure the collaboration among team members. Some measures include the following (see the sketch after the list):

  1. Number of Retrospective Items in Backlog - How many of the improvement items identified in retrospectives are actually added to the backlog?
  2. Process Improvement Rate - What types of processes are improved, as evidenced by training, documentation, and continuous improvement items?
  3. Backlog Growth Rate - How much and what types of stories (Customer Value, Business Value, Technical Value, Process Value) are being added to the backlog and refined?
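
Backlog growth rate, for instance, could be tracked by tagging each new story with a value type and comparing it against the backlog size at the start of the sprint. The record format and tags below are assumptions for illustration:

    from collections import Counter

    # Hypothetical stories added to the backlog during one sprint,
    # each tagged with a value type.
    new_stories = [
        {"id": "S-101", "value_type": "Customer Value"},
        {"id": "S-102", "value_type": "Technical Value"},
        {"id": "S-103", "value_type": "Customer Value"},
        {"id": "S-104", "value_type": "Process Value"},  # e.g. a retrospective item
    ]
    backlog_size_at_sprint_start = 80

    # Growth rate: new stories relative to the existing backlog.
    growth_rate = len(new_stories) / backlog_size_at_sprint_start

    # Breakdown by value type shows what kind of work is being added.
    by_type = Counter(s["value_type"] for s in new_stories)

    print(f"Backlog growth this sprint: {growth_rate:.1%}")
    for value_type, count in by_type.items():
        print(f"  {value_type}: {count}")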

Effectiveness: This category is strategic and focuses on business value. Some measures include the following (a variance sketch follows the list):

  1. Velocity Variance - Planned (Committed) to Actual (Delivered)
  2. Value Delivery - Measure Value in terms of Customer Satisfaction and Business Benefit
  3. Technical Debt - The quantified value of work deliberately deferred in order to deliver customer value stories sooner
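
Velocity variance is the simplest of these to compute. A minimal sketch, assuming committed and delivered points are recorded per sprint (the numbers are made up):

    # Hypothetical committed vs. delivered story points per sprint.
    committed = [25, 22, 30, 24]
    delivered = [21, 22, 26, 23]

    for sprint, (plan, actual) in enumerate(zip(committed, delivered), start=1):
        variance = (actual - plan) / plan
        print(f"Sprint {sprint}: committed {plan}, delivered {actual}, "
              f"variance {variance:+.1%}")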

Quality: This category aims at how good the quality of the product is as sprints evolve, and it should be evaluated at multiple levels for continuous improvement and optimization. Here, we can go into more detail. Some thoughts include the following (a coverage sketch follows the list):

  1. Automated Test Cases: Number of test cases that are automated
  2. Defect Detection Rate: Number of incidents detected in a release
  3. Test Case Reusability: Number of modular test cases reused within the test steps
  4. Test Case Coverage: Number of requirements that are covered by test cases excluding summary use cases
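
As an illustration, test case coverage and the automation percentage could be derived from a requirements-to-tests mapping; the identifiers below are hypothetical:

    # Hypothetical mapping of requirements to the test cases that cover them.
    coverage_map = {
        "REQ-1": ["TC-1", "TC-2"],
        "REQ-2": ["TC-3"],
        "REQ-3": [],  # not yet covered
        "REQ-4": ["TC-4"],
    }
    automated = {"TC-1", "TC-3", "TC-4"}

    all_tests = {tc for tcs in coverage_map.values() for tc in tcs}
    covered = [req for req, tcs in coverage_map.items() if tcs]

    print(f"Requirement coverage: {len(covered)}/{len(coverage_map)} "
          f"({len(covered) / len(coverage_map):.0%})")
    print(f"Automation: {len(automated & all_tests)}/{len(all_tests)} "
          f"test cases automated ({len(automated & all_tests) / len(all_tests):.0%})")
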
These quality-specific metrics can be further broken down into additional areas for ongoing optimization and improvement, such as the following:

Metrics at the Test Case Level

    1. Test cases created for each user story
    2. Test cases reworked (based on clarification of requirements)
    3. Defects reported against the requirements (called defect density; see the sketch below)
    4. Defects reported against a test case that require the test case itself to be modified
    5. Test cases executed, broken down by passed, failed, blocked, etc.
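
For example, defect density and an execution-status breakdown could be computed from records like the hypothetical ones below:

    from collections import Counter

    # Hypothetical execution results and per-requirement defect counts.
    executions = [
        {"test_case": "TC-1", "status": "passed"},
        {"test_case": "TC-2", "status": "failed"},
        {"test_case": "TC-3", "status": "blocked"},
        {"test_case": "TC-4", "status": "passed"},
    ]
    defects_per_requirement = {"REQ-1": 3, "REQ-2": 0, "REQ-3": 1}

    status_counts = Counter(e["status"] for e in executions)
    defect_density = sum(defects_per_requirement.values()) / len(defects_per_requirement)

    print(f"Execution results: {dict(status_counts)}")
    print(f"Defect density: {defect_density:.2f} defects per requirement")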

Metrics at the Defect Level

    1. Defects by Severity (Show stoppers that impact the release or client)
    2. Escaped Defects (an indication that internal testing failed to catch them; see the sketch below)
    3. Defect Reopen Rate
    4. Defects identified by internal users versus those identified by clients
    5. Number of exploratory or unscripted testing sessions in an iteration or release
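
As a last hypothetical sketch, escaped defects and the defect reopen rate can both be derived from a simple defect log, assuming each record notes where the defect was found and how many times it was reopened:

    # Hypothetical defect log for one release.
    defects = [
        {"id": "D-1", "severity": "critical", "found_in": "production", "reopens": 1},
        {"id": "D-2", "severity": "major", "found_in": "sprint", "reopens": 0},
        {"id": "D-3", "severity": "minor", "found_in": "sprint", "reopens": 2},
        {"id": "D-4", "severity": "major", "found_in": "production", "reopens": 0},
    ]

    escaped = [d for d in defects if d["found_in"] == "production"]
    reopened = [d for d in defects if d["reopens"] > 0]

    print(f"Escaped defects: {len(escaped)} of {len(defects)}")
    print(f"Defect reopen rate: {len(reopened) / len(defects):.0%}")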

What are your thoughts? What would you remove or add? 
