by Srinivas Saripalli

Everyone wanted to know what it was that I measured for each developer. Truthfully, I was surprised by the inquiry.
When you move to agile metrics, you want to do four things:

  1. Move to a more holistic, whole-team view.
  2. Move from measuring functional teams, such as a test team, to measuring the executing agile team as a whole.
  3. Have fewer metrics, focused on four distinct areas of interest:
    1. Predictability
    2. Value
    3. Quality
    4. The overall health of the team
  4. Prefer metrics that are results-based rather than input-based. Instead of caring about “planned velocity”, care about “resulting velocity”.

Trending matters much more than any specific data point. With that in mind, here are my ideas regarding agile metrics.



I’ll categorise each of the metrics into three groups:

Input: something defined at the start of the pipeline, for example, planned test cases.

Output: the end result of a pipeline. Velocity is a useful output metric.

Outcome: metrics geared towards customer testing and experiments. For instance, we’d measure a new release’s feature set by using a survey.


Most of these are crazy ideas, but here are 22 tips on what to measure in agile teams. Anything additional would also be interesting, for example, individual test cases or functional test coverage run per day for each plan.

  1. Features removed per release, per quarter, or possibly as a percentage of backlog features de-scoped or simplified. These demonstrate how open we are to subtracting as well as adding. (Value, Output)
  2. Stop-line events for a release or across an organisation, which could include CI/CD stops and other “process” stops. Clearly a quality-centric perspective of the organisation. (Quality, Outcome)
  3. Root-cause discovery sessions conducted by each team per release; review corrective actions and identify patterns across teams. Continual quality improvement is the focus here, and there are numerous related issues that could be examined, too. (Quality, Outcome)
  4. Number of retrospective items resolved by the team. A way to look at retrospectives without diminishing their confidentiality or integrity would be interesting. Will this work? (Quality, Output)
  5. Zero bugs – in a sprint, or during a release? Here “zero bugs” refers to brand-new bugs. Are teams living their agile values? (Quality, Output)
  6. Number of reworked user stories, which may include maintaining a “Cost of Rework” factor. There is a healthy range here; beyond it, things become stagnant and unhealthy. How will we know? Set a decrease target of over 20%, and clarify what goes into the “technical debt” bucket. (Quality, Output)
  7. Perhaps a rolling average in story points. Release-level predictability for each team is fascinating. Use caution when aggregating results across teams (the entire organisation). No individual team-member velocity measurement! (Predictability, Output)
  8. Time-stamp the type of work that flows through your teams, recording throughput per story size. You’ll then have a range and mean for story throughput, which can be used to predict release-level commitments. Use caution when aggregating results across teams (the entire organisation). (Predictability, Output)
  9. Consider the average variance across teams: is the variance trending lower? The less specific version of this: examine raw story-delivery variance. (Predictability, Output)
  10. Test automation coverage (including UI level, component/middle tier, and unit level). Or, instead of coverage, you could show the planned-versus-running automation ratio. Trending this would be interesting. (Quality, Output)
  11. The amount of each sprint dedicated to automation investments, and the percentage of each sprint spent on continuous integration and continuous deployment activities. Again, this focuses on long-term trends.
  12. The team’s happiness factor. Monitoring and improvement trends here would be excellent.
  13. Agile team member training budget. Simple. Trend it year over year.
  14. Customers’ actual use of delivered features. Implement the appropriate measures to collect and analyse usage data.
  15. Customer surveys to identify value provided (actual, not wishful). Maybe use a Net Promoter Score?
  16. Ratio of dedicated Scrum Masters vs. multi-tasking Scrum Masters (dual or more roles). Establish a relative investment ratio for agile transformation-core roles. Extend this to coaches and Product Owners, too.
  17. Number of experimentation failures, risk-taking failures, or risk-tolerant failures. I may be seeking something beyond zero here. This could also be measured cross-team or organisationally. (Health, Output)
  18. Measure commitment levels to date-driven targets/expectations. This shows the level of organisational planning and who is signing up for the plan’s credibility and feasibility.
  19. New test cases added each quarter/release, and test cases retired per release or quarter. Both focus on keeping testing relevant in real time. (Quality, Outcome)
  20. How many members of senior leadership attend team sprint reviews. Rework fuelled by their feedback, and sheer attendance, can both be measured. Ultimately, it’s a matter of who’s present for the review: were they engaged? (Value, Outcome)
  21. Team churn, measuring changes (impacts) to a team within a sprint. Somewhere I created an index of sorts using a formula, but I cannot remember it. This is one of the best waste metrics, because churn slows the team down.
  22. Backlog grooming and look-ahead as an essential team activity, including milestones where you groom the work for the next release. Track grooming and story-analysis productivity over time. “Crisp” grooming of stories at a 5–8 minute pace requires an egg timer.
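Several of the items above boil down to simple arithmetic. As an illustrative sketch only (the function names and all sample numbers are invented for this example), the rolling-average velocity from item 7, the story-throughput range and mean from item 8, and the Net Promoter Score suggested in item 15 could be computed like this:

```python
from statistics import mean

def rolling_average(velocities, window=3):
    """Trailing rolling average of team velocity in story points (item 7)."""
    if len(velocities) < window:
        return mean(velocities)
    return mean(velocities[-window:])

def throughput_stats(stories_per_sprint):
    """Range (min, max) and mean of story throughput (item 8)."""
    return min(stories_per_sprint), max(stories_per_sprint), mean(stories_per_sprint)

def net_promoter_score(scores):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6), item 15."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Invented sample data, purely for illustration.
velocities = [21, 25, 19, 30, 27, 24]   # story points completed per sprint
print(rolling_average(velocities))       # trailing 3-sprint average
print(throughput_stats([8, 12, 10, 9, 11]))
print(net_promoter_score([10, 9, 8, 7, 6, 10, 4, 9]))
```

Note that, as item 7 warns, you would trend these per team rather than aggregating them into a single organisation-wide number.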

Summing It Up

Recall the article’s title: these metrics are a bit strange, if not outright silly. Some, or even none, of them may be a good fit for your projects, teams, and work.

That said, I wouldn’t have written them down if I didn’t believe they had “potential”. Regardless, I’d still like to hear from you on their quality and agility.

  • What can or would you recommend?
  • Any others?
  • How do you feel about input, output, and outcome metrics? Do you care?
  • Let’s boil it down to the top 5 measurements of high-performance agile teams.

Interesting tools to try for agile metrics

Some interesting agile tools worth trying are Plutora and Atlassian Jira.
