Agile Workshop: Agile Metrics


Transcript of Agile Workshop: Agile Metrics

Page 1: Agile Workshop: Agile Metrics

Agile Metrics

Santhana Krishnan

Mar 20, 2009

Page 2: Agile Workshop: Agile Metrics

Contents

• What Is a Metric?
• A Good Agile Metric or Diagnostic…
• Metric / Diagnostic Evaluation Checklist
• Metrics and Examples
• References

Page 3: Agile Workshop: Agile Metrics

What Is a Metric?

• Quantitative measures of performance or production used to indicate progress or achievement against strategic goals. (www.s3co.com/glossary/)

• A measurable element of a service, process or function. The real value of metrics is seen in their change over time. Reliance on a single metric is not advised, especially if it has the potential to affect user behaviour in an undesirable way. (www.infodiv.unimelb.edu.au/knowledgebase/itservices/a-z/m.html)

Page 4: Agile Workshop: Agile Metrics

The Iron Triangle

[Diagram: the Iron Triangle, relating Scope, Time, Cost, and Quality]

Page 5: Agile Workshop: Agile Metrics

A Good Agile Metric or Diagnostic… (1 of 3)

• Affirms and reinforces Agile principles: Supports the customer-intimate and value-focused traits that reinforce Agile principles. This requires that people who understand Agile participate in metrics design. The truism "you get what you measure" reminds us that counterproductive behaviors may ensue if you reinforce the wrong things (e.g., overtime, % utilization, paperwork).

• Follows trends, not numbers: Measure "one level up" to ensure you measure aggregated information, not sub-optimized parts of a whole. Aggregate above the individual team level for upper-management use. To promote process health, do not track at levels more granular than "a team" and "an iteration".

• Belongs to a small set of metrics and diagnostics: A "just enough" metrics approach is recommended: too much information can obscure important trends.

Page 6: Agile Workshop: Agile Metrics

A Good Agile Metric or Diagnostic… (2 of 3)

• Measures outcome, not output: In an Agile environment where simplicity, or "maximizing the amount of work not done," is promoted, the most spectacular outcome might be achieved by reducing planned output while maximizing delivered value. Outcomes are measured in terms of delivered customer value.

• Is easy to collect: For team-level diagnostics the ideal is "one button" automation, where data is drawn from operational tools (e.g., the Product Backlog, acceptance test tools, code analyzers); a minimal sketch follows below. For management use, avoid rework (e.g., PowerPoint decks) and manipulation of lower-level data; aggregation is preferable.

• Reveals, rather than conceals, its context and significant variables: Should be visibly accompanied by notes on significant influencing factors, to discourage false assumptions and facilitate improvement.
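To illustrate the "one button" ideal above, here is a minimal sketch that sums completed story points per iteration straight from a hypothetical CSV export of the Product Backlog; the file layout and the 'iteration', 'status', and 'points' column names are assumptions, not anything the slides specify.

```python
import csv

def points_completed_per_iteration(path):
    """Sum completed story points per iteration from a backlog export.

    Assumes a CSV with 'iteration', 'status', and 'points' columns,
    one row per story, as a backlog tool might export.
    """
    totals = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["status"].lower() == "done":
                it = int(row["iteration"])
                totals[it] = totals.get(it, 0) + float(row["points"])
    return dict(sorted(totals.items()))

# Velocity history straight from the operational tool, with no rework:
# print(points_completed_per_iteration("backlog_export.csv"))
```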

Page 7: Agile Workshop: Agile Metrics

A Good Agile Metric or Diagnostic… (3 of 3)

• Provides fuel for meaningful conversation: Face-to-face conversation is a very useful tool for process improvement. A measurement isolated from its context loses its meaning. Note: it's a good sign when people talk about what they've learned by using a metric or diagnostic.

• Provides feedback on a frequent and regular basis: To amplify learning and accelerate process improvement, metrics should preferably be available at each iteration retrospective and at key periodic management meetings.

• May measure Value (Product) or Process: Depending on where problems lie, diagnostics may measure anything suspected of inhibiting effectiveness. Consider the appropriate audience for each metric, and document its context and assumptions to encourage proper use of its content. And remember: you get what you measure!

• Encourages "good enough" quality: The definition of what's "good enough" in a given context must come from that context's Business Customer or their proxy, not the developers.

Page 8: Agile Workshop: Agile Metrics

Metric / Diagnostic Evaluation Checklist

• Name: Should be well chosen to avoid ambiguity, confusion, or oversimplification.
• Question: It should answer a specific, clear question for a particular role or group. If there are multiple questions, design other metrics.
• Basis of Measurement: Clearly state what is being measured, including units. Labeling of graph axes must be clear rather than brief.
• Assumptions: Should be identified to ensure clear understanding of the data represented.
• Level and Usage: Indicate intended usages at various levels of the organization. Indicate limits on usage, if any.
• Expected Trend: The designers of the metric should have some idea of what they expect to see happen. Once the metric is proven, document common trends.
• When to Use It: What prompted creation or use of this metric? How has it historically been used?
• When to Stop Using It: When will it outlive its usefulness, become misleading, or become extra baggage? Design this in from the start.
• How to Game It: Think through the natural ways people will warp behavior or information to yield more 'favorable' outcomes.
• Warnings: Recommend balancing metrics, limits on use, and dangers of improper use.
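The checklist lends itself to a fill-in-before-adoption template. A minimal sketch of one way to capture it in code, with Velocity (from the next slides) as the worked example; the field values are paraphrases for illustration, not additions to the checklist itself.

```python
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    """One field per item in the evaluation checklist above."""
    name: str                  # well chosen to avoid ambiguity
    question: str              # the one question this metric answers, and for whom
    basis_of_measurement: str  # what is measured, including units
    assumptions: str           # what must hold for the data to mean anything
    level_and_usage: str       # intended audience, and limits on usage
    expected_trend: str        # what the designers expect to see
    when_to_use: str
    when_to_stop_using: str    # design the exit in from the start
    how_to_game_it: str        # natural ways people will warp behavior
    warnings: str              # balancing metrics, dangers of improper use

velocity = MetricDefinition(
    name="Velocity",
    question="How much software can my team deliver per iteration?",
    basis_of_measurement="Story points per iteration",
    assumptions="The team delivers working software every iteration",
    level_and_usage="Team/project level only; not for comparing teams",
    expected_trend="Rises as the team stabilizes, then plateaus",
    when_to_use="Once work has started",
    when_to_stop_using="When velocity has become stable and 'known'",
    how_to_game_it="Inflate estimates so more points appear to be done",
    warnings="Velocity is not value; never compare across teams",
)
```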

Page 9: Agile Workshop: Agile Metrics

Metrics and Examples

Page 10: Agile Workshop: Agile Metrics

Velocity

• Name: Velocity
• Question: How much software can my team deliver per iteration?
• Basis of Measurement: Story points or "ideal engineering hours".
• Assumptions: The team is delivering working software every iteration.
• Level and Usage: Velocity is most useful at the project level. It allows the team to forecast how much work it can expect to complete, based on prior efforts (a forecasting sketch follows below).
• Expected Trend: Velocity can be affected by many things: changing team members, obstacles, toolsets, and the difficulty of a feature or the amount of learning required will all lower the team's velocity. Barring unexpected obstacles, a stable team on the same project with the required resources will generally gain velocity during the course of the project, then plateau.
• When to Use It: Velocity is a very useful metric for the team and should be used throughout the project once work has started.
• When to Stop Using It: In a longer project where the team, resources, and technology are all stable, velocity will also become stable. The team may suspend collecting velocity once it is "known."
• How to Game It: Velocity is only meaningful to the exact team providing the data; each team will estimate its work differently from other teams.
• Warnings: Velocity is not the same as value. A team with excellent velocity could spend months quickly and effectively delivering software that has no investment potential. Comparing the velocity of different teams is problematic (see above) and should be avoided: this diagnostic is a barometer for the team itself, as a unit. Individual team-member velocities are also problematic: velocity should be measured at the team level, since the team is the unit that must self-organize to produce value.
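To make the forecasting usage above concrete, here is a minimal sketch (not from the original slides) that treats velocity as the mean of recent iterations' completed points and uses it to estimate the iterations remaining; the three-iteration window and the sample history are assumptions.

```python
import math

def forecast_velocity(completed_points, window=3):
    """Forecast next-iteration velocity as the mean of the last
    `window` iterations' completed story points."""
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

def iterations_remaining(points_left, completed_points, window=3):
    """Rough forecast of how many iterations the remaining scope needs."""
    velocity = forecast_velocity(completed_points, window)
    # Round up: a partial iteration still has to happen.
    return math.ceil(points_left / velocity)

history = [14, 18, 17, 21, 20]               # points completed per iteration
print(round(forecast_velocity(history), 1))  # 19.3
print(iterations_remaining(58, history))     # 3 iterations for 58 points left
```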

Page 11: Agile Workshop: Agile Metrics

Velocity Chart – an Example

Page 12: Agile Workshop: Agile Metrics

Scope Burn Up – an Example
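A scope burn-up chart plots the cumulative points completed against the total scope at each iteration, so scope growth and progress stay visible on one picture. A minimal sketch of the underlying data, with hypothetical numbers:

```python
from itertools import accumulate

completed = [12, 15, 14, 18, 16]         # points completed each iteration
total_scope = [120, 120, 135, 135, 140]  # total scope at end of each iteration

burned_up = list(accumulate(completed))  # the cumulative "completed" line
for i, (done, scope) in enumerate(zip(burned_up, total_scope), start=1):
    print(f"Iteration {i:2}: {done:3} of {scope} points complete")
```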

Page 13: Agile Workshop: Agile Metrics

Story Rate

[Chart: Average Story Points per Day per Developer, by iteration (1–24); the Story Rate series ranges from about 0.0 to 2.1 points, plotted with a "3 per. Mov. Avg. (Story Rate)" trend line]
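The chart's legend refers to a 3-period moving average of the story rate. A minimal sketch of both calculations; the iteration length and team size are assumptions, chosen so the first few rates mirror the chart's early values:

```python
def story_rate(points, days, developers):
    """Average story points per day per developer for one iteration."""
    return points / (days * developers)

def moving_average(values, period=3):
    """Trailing moving average, like the chart's '3 per. Mov. Avg.' line."""
    return [sum(values[i - period + 1:i + 1]) / period
            for i in range(period - 1, len(values))]

# Hypothetical inputs: 10-day iterations, 5 developers.
points_per_iteration = [5, 55, 20, 30, 20, 65, 40, 50]
rates = [story_rate(p, days=10, developers=5) for p in points_per_iteration]
print(rates)                  # [0.1, 1.1, 0.4, 0.6, 0.4, 1.3, 0.8, 1.0]
print(moving_average(rates))  # smooths iteration-to-iteration noise
```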

Page 14: Agile Workshop: Agile Metrics

Cost per point
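Cost per point divides what an iteration cost by the story points it delivered, giving a unit cost whose trend (not any single number) is the signal. A minimal sketch, assuming a burdened team cost per iteration:

```python
def cost_per_point(iteration_cost, points_delivered):
    """Unit cost of a story point for one iteration."""
    if points_delivered == 0:
        raise ValueError("no points delivered; cost per point is undefined")
    return iteration_cost / points_delivered

costs = [40_000, 40_000, 42_000]  # burdened team cost per iteration (assumed)
points = [16, 20, 21]             # story points delivered per iteration
trend = [cost_per_point(c, p) for c, p in zip(costs, points)]
print([round(t) for t in trend])  # [2500, 2000, 2000]: watch the trend over time
```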

Page 15: Agile Workshop: Agile Metrics

Finger Chart
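The slides do not define the finger chart, but it is commonly a cumulative-flow-style view of how many stories sit in each workflow state per iteration, each state's band forming a "finger". Under that assumption, a minimal sketch of the per-state counts behind such a chart:

```python
from collections import Counter

# Hypothetical snapshot: story -> current state at the end of an iteration.
stories = {
    "S-101": "done", "S-102": "done", "S-103": "testing",
    "S-104": "development", "S-105": "development", "S-106": "analysis",
}

STATES = ["analysis", "development", "testing", "done"]
counts = Counter(stories.values())
for state in STATES:
    # Taking one such snapshot per iteration yields the chart's bands.
    print(f"{state:12} {'#' * counts[state]}")
```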

Page 16: Agile Workshop: Agile Metrics

Worm Chart

Page 17: Agile Workshop: Agile Metrics

QA Metrics – A Sample

Summary – Selenium Tests

Tests: 52 (677 assertions) | Failures: 1 | Errors: 0 | Success rate: 98% | Time: 39 min

Code Coverage

                  Iteration 17   Iteration 16
Total Coverage    90.3%          88.9%
Code Coverage     88.5%          86.6%

Bugs

               Fixed   Carried over
Iteration 17   15      5
To Date        207     5
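The success rate in the summary above is simply passes over total tests. A minimal sketch that reproduces the summary line from raw counts, using the slide's own numbers:

```python
def test_summary(tests, failures, errors, seconds):
    """Format a test-run summary line like the Selenium table above."""
    passed = tests - failures - errors
    rate = 100.0 * passed / tests
    return (f"{tests} tests, {failures} failures, {errors} errors, "
            f"{rate:.0f}% success in {seconds / 60:.0f} min")

print(test_summary(tests=52, failures=1, errors=0, seconds=39 * 60))
# 52 tests, 1 failures, 0 errors, 98% success in 39 min
```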

Page 18: Agile Workshop: Agile Metrics

More Metrics

Build:• Frequency of builds • Average duration of builds• Number of broken builds per

iteration • Average duration of broken build • Number of builds per iteration

Development:• Cyclometric complexity measures • Distribution of method and class

lengths • Rate of change of source (loc

in/out) • Proportion of source code that is

test code

Tests:• Unit tests per story• Functional tests per story• Defects carried over per iteration• Defects per story

Scope:• Scope change (stories removed or added

from scope due to redundancy or rewrite) per iteration

• Scope changes not caused by additional stories per iteration

• User Stories carried forward (hangover) per iteration

• No of stories held in Analysis, Development, Testing per iteration
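The "proportion of source code that is test code" metric above can be collected with a few lines. A minimal sketch; the .py extension and the src/ and tests/ directory layout are assumptions:

```python
import os

def count_loc(root):
    """Count non-blank lines in .py files under `root` (naive LOC)."""
    total = 0
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.endswith(".py"):
                with open(os.path.join(dirpath, name)) as f:
                    total += sum(1 for line in f if line.strip())
    return total

# Assumed layout: production code under src/, test code under tests/.
src, tests = count_loc("src"), count_loc("tests")
if src + tests:
    print(f"test code proportion: {tests / (src + tests):.0%}")
```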

Page 19: Agile Workshop: Agile Metrics

Warning!

• One can get completely lost in too many metrics.
• A "just enough" metrics approach is recommended: too much information can obscure important trends.
• "If you can't measure it, you can't manage it." – Peter Drucker

Keep it simple and practical: measure the metrics you choose, and act on them.

Page 20: Agile Workshop: Agile Metrics

References

• Hartmann, Deborah (Agile Process Coach, [email protected]) and Dymond, Robin (Agile Management Consultant, [email protected]). "Appropriate Agile Measurement: Using Metrics and Diagnostics to Deliver Business Value."
• Cohn, Mike. Agile Estimating and Planning. Prentice Hall, 2006.
