Agile Metrics That Matter


Forrester Research, Inc., 60 Acorn Park Drive, Cambridge, MA 02140 USA

Tel: +1 617.613.6000 | Fax: +1 617.613.5000 | www.forrester.com

Agile Metrics That Matter
by Diego Lo Giudice, September 9, 2013

For: Application Development & Delivery Professionals

Key Takeaways

The Precision Of Upfront Software Development Estimation And Planning Is A False Assumption
Project managers and dev teams are forced to accurately forecast how long a development effort will really take and then draw up a concrete plan. Then project governance and metrics must respect the forecast and follow the plan in excruciating detail! In truth, we can’t forecast software with high precision, and changes will happen in due course.

Business Value Is Easy To Define, But Hard To Relate To Operational Delivery Metrics
Measuring business value metrics is an emerging art. Few firms define or track business-specific value metrics to help understand whether software development creates value for customers and the business and to prescribe corrective actions if it does not. Establishing that relationship in all contexts, from business to development teams, is not easy.

Agile Teams Remix Old Metrics With New Ones
The “iron triangle” (cost, scope, and deadline) won’t go away, but Agile teams take a more flexible approach to those metrics, fixing costs and deadlines but keeping scope variable. They focus on metrics like cycle time, technical debt, and mean time to repair that shift things toward speed to delivery, keep quality high, and improve processes.

Agile Teams Should Define And Balance Metrics Based On Complexity
With metrics, one size doesn’t fit all. We’re constantly asked which metrics count; it turns out that the ability to use certain metrics depends on the complexity (people, practice, and technology) of your project. Use Forrester’s framework to determine what that complexity is and which metrics really count for you.

© 2013, Forrester Research, Inc. All rights reserved. Unauthorized reproduction is strictly prohibited. Information is based on best available resources. Opinions reflect judgment at the time and are subject to change. Forrester®, Technographics®, Forrester Wave, RoleView, TechRadar, and Total Economic Impact are trademarks of Forrester Research, Inc. All other trademarks are the property of their respective companies. To purchase reprints of this document, please email [email protected]. For additional information, go to www.forrester.com.


Why Read This Report

Tomorrow’s great application development and delivery (AD&D) leaders will be those who focus on delivering constant value and incremental improvement to their businesses. As changes to development processes in general and Agile in particular enable faster development of modern applications, measuring the value that teams deliver is becoming more important. Traditionally, application leaders have managed on the basis of cost, scope, and effort, but these basic measures aren’t as useful in an Agile context. This report exposes the metrics that successful Agile teams use to measure their progress and explores why traditional measurement approaches often lead development teams astray. To succeed in a modern application world, AD&D leaders need to understand that a one-size-fits-all approach to metrics does not work; instead, they need to gauge the complexity of their projects, the availability of best practices, and the skill levels of their development teams in order to successfully select the right metrics to measure project progress.

Table Of Contents

Modern Application Development Requires A Modern Set Of Metrics

Determine Your Metrics According To Project Complexity

Recommendations: Define, Adapt, And Combine Your Metrics To Continuously Improve

What It Means: Link Metrics To Clear Goals And Manage The Life Cycle

Supplemental Material

Notes & Resources

Forrester interviewed a number of vendors, industry experts, and clients for this research, including 407 ETR, Ci&T, CollabNet, Dominion Digital, Evolve Beyond, IBM, Rally Software, Scrum.org, Serena Software, Tom Gilb & Kai Gilb, XebiaLabs, and a global automotive manufacturer.

Related Research Documents

Navigating The Agile Testing Tool Landscape, July 18, 2013

Use A Metrics Framework To Drive BPM Excellence, September 21, 2012

Justify Agile With Shorter, Faster Development, February 8, 2012

Agile Metrics That Matter
Benchmarks: The Agile And Lean Playbook
by Diego Lo Giudice with Jeffrey S. Hammond, Kurt Bittner, and Rowan Curran


Modern Application Development Requires A Modern Set Of Metrics

Application software development is evolving into a strategic business process for many enterprises that develop customer-facing systems of engagement.1 In this new role, AD&D pros have a great opportunity to shift conversations with business stakeholders toward value delivered instead of zero-sum haggling over cost, scope, and deadlines. But that’s easier said than done. We’ve spoken with a number of experts and practitioners, and one theme ran through all of our interviews: Measuring the value of software development and relating it to business impact and/or business value is really hard.

“Our Agility Path program helps clients track business metrics such as revenue per employee along with Agile operational IT metrics. You need to have a big human brain sitting between business and IT to link those metrics and reflect how changes on the IT side affect business value. Agility Path will make that link easier.” (Ken Schwaber, president, Scrum.org)

“In the context of our business transformation to introduce more automation and digital equipment on our highways, we adopted Agile everywhere we could and linked it to our enterprise architecture team’s new efforts. While it’s easy to see that teams are very busy and show just how busy they are, it’s hard to quantify how much value they are delivering.” (Keith Mann, group architect, 407 ETR)

Measuring the value delivered by development teams isn’t exactly a new concept. Tom and Kai Gilb have been working on the EVO (as in “evolutionary”) method for project management (PM) since the early 1980s and were among the first Agilists to define a PM method that focused on value for stakeholders in addition to delivering working software.2 New thought leaders are building on EVO; while case studies prove the benefits of focusing on value delivery, low adoption of EVO by development teams shows that most mainstream Agile teams aren’t thinking along these lines yet.3 The reasons for this adoption gap lie in different concepts of “value delivery” — what the business considers to be valuable is not always what Agile developers focus on when they think about delivering value. All too often, developers like to concentrate on stuffing as much new code and as many new features as possible into every release without regard to the level of business value generated.

Traditional Development Metrics Assume Slow, Controlled, Predictable Change

Software development leaders have historically focused on a core set of metrics that measure various aspects of what’s come to be known as the “iron triangle”: deadline, cost, and scope, with quality occasionally added to the mix. Many application delivery organizations still base their measurement programs on these metrics. The main goal in using these traditional metrics is to manage projects against an estimate that sets the deadline and the scope (or alternatively, the cost) in stone. The assumption is that project managers and development teams can accurately forecast how long any given development effort really takes. But the truth is that:


■ Software development is not a predictable process. We often see naive comparisons between software development and cooking or a factory assembly line. But we know that we can follow a cooking recipe step by step and get a predictable outcome, and industrial processes are scripted to the smallest degree. With software development, it’s more like ordering the salmon and getting calamari instead. In modern application development, the outcome is not known beforehand and might be different from the planned result because the ingredients keep changing! (Think Iron Chef, with surprise ingredients at the start that may even change partway into the process.)

“Most engineering disciplines operate under relatively manageable uncertainty. By contrast, software development and delivery are dominated by human creativity, market change, complexity, and much higher levels of uncertainty.” (Walker Royce, chief software economist, IBM)4

■ You can’t always forecast deadline, scope, and effort with precision. Barry Boehm noted that the further a project progresses, the more accurate the estimates for the remaining effort and time become.5 NASA came to the same conclusion: At the beginning of the project life cycle — before gathering requirements — estimations have an uncertainty factor of 4x. This means that the actual duration can be either four times or one-quarter of the initial estimations (see Figure 1).6 There is an exception to this: small, repeatable projects whose requirements don’t change. A short worked example of this uncertainty cone follows this list.

■ One size of development metrics can’t fit all. Over the next few years, we expect that enterprises will use at least two different development approaches, depending on whether they deliver fast-changing systems of engagement, more stable systems of record, or software-critical systems of operation. In addition, enterprises must consider the size, scale, and complexity of the organizational structures within their software delivery organization when selecting metrics that matter. It’s no surprise that different organizations will focus on different metrics — the ones that work for them.

“Given the level of complexity and size of our organization, we are happy when we can prove at the PMO level that we are delivering the features the business requires on time and on budget; that’s value for the business. We aim for more, but are not that mature yet.” (Program manager, global automotive manufacturer)

■ Measuring on false assumptions creates mistrust and skepticism. Unfortunately, many stakeholders demand precision and detail early on because it gives them the illusory comfort of progress. As the gap between reported and true progress reveals itself, stakeholder trust erodes.

“We’ve tried to estimate budgets with a precise process but always ended up with different (actual) numbers and lost trust that IT could ever fix this. We now review and adjust estimates as the projects unfold and are having a better conversation with IT.” (Business manager, large EMEA asset management firm)
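To make the 4x factor concrete, here is a minimal Python sketch (our illustration, not part of the report). Only the 4x pre-requirements value comes from the NASA source cited above; the narrowing factors for later phases are invented placeholders for the example.

# Illustrative sketch: bracketing a software estimate with an uncertainty factor.
# The 4x pre-requirements factor is cited in the report; the narrowing
# factors for later phases are hypothetical placeholders.

def estimate_bounds(estimate_months: float, factor: float) -> tuple[float, float]:
    """Return the (low, high) range implied by an uncertainty factor.

    A factor of 4 means the actual duration may be anywhere from
    one-quarter to four times the current estimate.
    """
    return estimate_months / factor, estimate_months * factor

phases = [
    ("Before requirements", 4.0),  # cited in the report
    ("Requirements agreed", 2.0),  # assumed narrowing
    ("Design complete", 1.5),      # assumed narrowing
    ("Code complete", 1.1),        # assumed narrowing
]

initial_estimate = 6.0  # months
for phase, factor in phases:
    low, high = estimate_bounds(initial_estimate, factor)
    print(f"{phase:<22} {low:4.1f} to {high:4.1f} months")

Run against a six-month estimate, the first line prints a 1.5-to-24-month spread, which is exactly why early “precise” plans mislead stakeholders.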


Figure 1 Uncertainty Turns Into Certainty In Different Ways For Different Roles

Source: Karmona Pragmatic Blog (http://blog.karmona.com); Forrester Research, Inc.

False Precision Distracts Teams From Delivering Value

When development teams start with a false level of planning precision, it’s easy for the results to go awry. One reason: project managers and app-dev leaders create and use metrics like “percentage of project budget spent” and “percentage of scope delivered versus estimated” to determine whether projects are on track. And most project governance revolves around measuring variance from plans in excruciating detail — especially when projects use waterfall processes. As a result:

■ Fixed costs, deadlines, and effort sacrifice real business needs. When the most important metrics are a fixed date of delivery and a fixed scope or size, teams tend to take shortcuts as deadlines get closer. They ignore defined requirements, sacrifice testing time, and focus on delivering as much working code as possible to meet a hard deadline. The alternative — making late scope changes and missing cost and deadline objectives — is also unpalatable. Unfortunately, 70% of respondents to our Q1 2013 Application Life-Cycle Management Online Survey claim that, when projects fail, it’s due to new and changing requirements. If requirements change, “percentage complete” and “percentage of budget spent” metrics have little true value.

■ Rigid plans discourage or defer change even when it’s the right thing. Change management in traditional PM methodologies is a formal process that requires several steps, including (but not always limited to) a change request, an estimation of effort, and approval by a change review board. Traditional PM methodologies like PMI’s allow for change, but developers cannot flexibly accommodate it in short cycles. Stakeholders cannot quickly reprioritize requirements, because they’re not directly involved in what’s going on. The net result is that change usually gets pushed to future project work or releases; in the meantime, the company implements the “wrong thing.”

■ Dotting the Is and crossing the Ts becomes a primary goal . . . Instead of focusing on delivering a working minimum viable product (MVP) with core business features, traditional processes use metrics that measure if everyone is following the plan that was laid out at the start. Because the team assumes that the initial plan was correct, the value of completing tasks is flat and checking boxes equals progress.

■ . . . so teams manage to the letter of the law instead of the spirit of the law. All too often, when teams focus on the process instead of the result, they end up managing to metrics in unforeseen ways. The team can meet the metric, but the result may be the opposite of what’s intended (see Figure 2)!

“We have found that measuring throughput and output are extremely dangerous, because they incentivize the wrong behavior time and time again. For example, measuring the number of features delivered leads to more features being delivered. The additional features create waste — or wasted effort — and may not lead to better business outcomes.” (Gabrielle Benefield, CEO, Evolve Beyond)

Figure 2 Traditional Metrics Can Lead To Misbehavior

Metric: Number of features delivered
Potential misbehavior: Might not lead to a better business outcome; might lead to unnecessary development as well as higher maintenance costs because of a greater probability of defects; delays time-to-market for the necessary features.

Metric: Number of defects fixed; number of lines of code delivered; number of function points delivered
Potential misbehavior: Creation of unnecessary, low-quality code and higher maintenance (external provider doing both); more lines of code written necessitates more testing; no innovation or architecture improvement.

Metric: Number of incidents handled
Potential misbehavior: Teams focus on patching instead of deep refactoring; systematic faults go unaddressed.

Metric: Reduction in the number of open bugs
Potential misbehavior: Defects spike: bugs are retriaged but not fixed; system degradation and higher incident rates.

Source: Forrester Research, Inc.


Agile Projects Use Different Measures

What have we learned from the first 50 to 60 years of measuring software development projects? For many organizations, it seems like the answer is “Not much.” Changing deeply rooted metrics that were created to support waterfall projects is like convincing Italians to replace fusilli and linguine with hamburgers and ribs. But take heart! An increasing number of organizations are thinking differently: They accept that, no matter how good a team is, it just can’t estimate all development projects with the same precision. That’s especially true when new technologies, like mobile apps or scale-out public cloud infrastructure, are involved.

These teams are also changing their measurement practices to reflect the belief that it’s a waste of time to spend weeks or months specifying requirements up front. No matter how good the team is, requirements will change frequently during the initial stages of development. Many of these teams are the ones adopting Agile practices most aggressively; in the process, they are changing the focus of the metrics they use to measure success. In our interviews, inquiries, and discussions, we’ve found an emerging set of metrics grouped around measuring progress, quality, efficiency, and the realization of value and benefits (see Figure 3). Within those four categories, we see that:

■ Every Agile project measures velocity. Teams usually define velocity as the number of story points from the backlog delivered per sprint.7 Properly sizing user stories and tasks is the key to effectively determining velocity, but there’s no standard sizing approach.8 Most Agile teams calculate story points through planning poker, using a Fibonacci series for story size.9 Others use “ideal days,” development hours, “T-shirt sizes,” or more complex, formal approaches like function points or estimated lines of code. But be careful when measuring velocity; as with any metric, it can create its own measurement antipattern.10 (A short computational sketch of velocity and related metrics follows at the end of this list.)

“Velocity is great when teams use it right: It can give useful information on capacity once the team becomes consistent, and when peaks happen, it gives an indication of potential problems like bugs or too much rework. On the other hand, velocity is based on team subjective sizing methods; when velocity is used to compare the productivity of different teams, it can become a dysfunctional metric.” (Larry Maccherone, VP of analytics, Rally)

■ Agile projects extend quality metrics in new ways. A good sign of an effective Agile team is a high ratio of automated test coverage. This requires an increased focus on quality metrics, especially those that relate to the level of test automation. Typical metrics that we see used are regression testing automation coverage, the number of automation execution failures, or the frequency of failure of commit tests.11 The latter will help you understand if your automated tests are doing a good job of capturing defects during feature development. The number of defects in production, and how that number trends over time, are also good overall indicators of quality.

■ Using burndown charts creates a proxy for progress. Burndown charts track how fast the team is delivering user stories and tasks from the backlog. When product owners attach value information to the user stories, burnup charts show how the software development team is actually delivering value as it progresses through sprints and releases (as long as the team only measures working software delivered). Burndown and burnup charts are effective visualization tools showing how activities, quality, efficiency, and value are trending as teams get close to delivery dates.

■ Tracking technical debt creates a basis for tradeoffs. As projects progress, development teams discover defects, improper designs, new requirements, and places where they can improve the code. Collectively, these create technical debt that the team needs to address. Measuring the amount of accumulated technical debt gives teams a good indication of when they need to refocus and turn from adding new features to refactoring existing ones. An item of technical debt should include the sizing and potential cost of the fix or the potential value of retiring it. If it’s placed on the backlog, the product owner will prioritize it just like any other user story or epic, depending on its importance and granularity. Measuring technical debt encourages a more mature conversation with business stakeholders when problems occur.

■ Measuring business value metrics is an emerging art. Thought leaders in some of the organizations we spoke with are busy tracking business-specific value metrics as part of their efforts to measure value delivery (see Figure 4). While the metrics are specific to the circumstance, the common goal is to define metrics that help understand if software development is creating value for the business and its customers and, if not, what corrective actions the firm needs to take.

“We are now able to measure our project’s contribution to the value of processing an insurance claim — one of our highest digital value streams. We now know the unitary cost of each claim and are focusing on reducing it.” (Leader of the project management office (PMO), large insurance company)

One possible source of business value metrics is the cycle time for various business processes, used as a way to measure a team’s improvement efforts. Cycle time is a Lean concept that works like a chronometer, registering the time that it takes to get from point A in a value stream or process to point B. An example of a cycle time metric for product development might be from concept to cash. But it’s not always easy to do; it might be clear when a business makes the first penny out of its value stream, but not necessarily when the idea was first conceptualized — so an alternative metric in this case might be cycle time from build to deploy (efficiency).
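Most of the operational metrics above reduce to simple arithmetic over timestamped work items. The following minimal Python sketch is our own illustration (the data shapes and field names are assumptions, not from the report); it computes velocity per sprint, a burndown series, and a cycle time.

# Illustrative sketch of three metrics discussed above: velocity, burndown,
# and cycle time. Data shapes and sample values are invented for the example.
from datetime import datetime

# Each completed story: (sprint number, story points).
completed = [(1, 3), (1, 5), (2, 8), (2, 2), (2, 3), (3, 5)]

def velocity_per_sprint(done: list[tuple[int, int]]) -> dict[int, int]:
    """Story points delivered per sprint (the usual velocity definition)."""
    out: dict[int, int] = {}
    for sprint, points in done:
        out[sprint] = out.get(sprint, 0) + points
    return out

def burndown(total_points: int, done: list[tuple[int, int]]) -> dict[int, int]:
    """Points remaining in the backlog at the end of each sprint."""
    remaining, chart = total_points, {}
    for sprint in sorted({s for s, _ in done}):
        remaining -= sum(p for s, p in done if s == sprint)
        chart[sprint] = remaining
    return chart

def cycle_time_days(start: str, end: str) -> int:
    """Elapsed days from point A to point B in a value stream."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).days

print(velocity_per_sprint(completed))               # {1: 8, 2: 13, 3: 5}
print(burndown(40, completed))                      # {1: 32, 2: 19, 3: 14}
print(cycle_time_days("2013-06-01", "2013-08-15"))  # 75 concept-to-cash days

The point of the sketch is that the hard part is not the arithmetic but capturing the events (story completion, value stream start and end) consistently.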


Figure 3 Agile Teams’ Metrics Commonly Span Progress, Quality, And Efficiency — But Rarely Value

Progress: Velocity; capacity; cycle time (plan to test); accumulated flow; burndown; technical debt; build rates; delivery frequency; critical practices employed; uncertainty reduction (reduction of cost-to-complete variance)

Quality: Customer-reported defects; failure rates; customer experience; defect density; number of defects per sprint per release; number of completed passed tests; modularity (scrap trend); adaptability (rework trend); regression testing automation coverage; number of automation execution failures; number of commit test failures

Efficiency: Risk; mean time to repair; cycle time (frequency of production deployment per sprint); cumulative flow

Value/benefit realization: ROI, TEI, EBITD; revenue per employee; sales, market size; customer satisfaction; cycle time (concept to market); number of successful transactions; earned value; repeat visits; cost; business agility; renewal rates; customer usage; validated learning

Source: Forrester Research, Inc.


Figure 4 What’s Cooking? Value And Business Metrics In Agile

Approach: EVO (Evolutionary Method for Project Management, Tom Gilb and Kai Gilb, www.gilb.com)
Description: Focuses on delivering real, measurable stakeholder value incrementally; based on quantified objectives, fast frequent iterations, value delivery to stakeholders, measurement, and learning. Core to EVO: Stakeholders are clearly identified and stable, and product values are defined.
Type of metrics: Defines value decision tables where business goals, stakeholder value, and product values are specified, measured, and improved.

Approach: The Lean Startup (Eric Ries, www.theleanstartup.com)
Description: The Lean Startup assumption is that an organization dedicated to creating something new works under conditions of extreme uncertainty (true for a startup but also for a Fortune 500 company). Validated learning is a rapid scientific experimentation approach (build, measure, learn). The Lean Startup measures progress by how much uncertainty gets reduced as the product is developed and how much you’ve really learned about what the customer really wants.
Type of metrics: Ries’ key economic metric is validated learning. It measures actual progress without “resorting to vanity metrics.”

Approach: Agility Path (Ken Schwaber, www.scrum.org)
Description: Agility Path is a continuous, two-step feedback loop. The first step is gathering and analyzing the key business and process data needed to assess the current state of a company in each of its critical function areas. The next step is using this data to identify where improvements are most needed to have the most immediate and positive impact on the company’s performance. Key to this process is breaking down large, usually systemic problem areas into manageable chunks that can be swiftly, effectively, and quantifiably addressed and tracked using metrics.
Type of metrics: These are broken down into enterprise metrics that reflect the business value a company generates and foundational metrics that measure value achieved by the software development functions and reflect the quality of the software, the presence of waste in the code base, and speed to market. The enterprise metrics include revenue per employee, cost/revenue ratio of relevant domains, employee satisfaction, customer satisfaction, and investment in agility. The foundation metrics include release frequency, release stabilization, functional usage (unused code), and maintenance budget.

Approach: Bayesian analytics (IBM; “Filling in the blanks: The math behind Nate Silver’s ‘The Signal and the Noise’” and “Economic Governance of Software Delivery”)
Description: IBM is developing an analytic solution to predict the likelihood of an Agile project meeting its desired outcome and timely delivery of agreed-upon scope and quality. The tool combines a variety of predictive analytic techniques, including Monte Carlo simulation, Bayesian probability, elicitation of expert opinion, and machine learning using project execution data.
Type of metrics: ROI, ROAE, NPV; measure of mass (or use case points); change throughput (work items over time/changes over time); backlog trends.

Source: Forrester Research, Inc.


Figure 4 What’s Cooking? Value And Business Metrics In Agile (Cont.)

Approach: Value Engineering (Ci&T)
Description: Value Engineering focuses on understanding and stating the key business drivers that support the value proposition of the product being built and prioritizing the list of features (the backlog) in terms of the features’ individual contributions to those business drivers. The process creates objectivity in the discussion of the value that project features contribute, helping PMOs optimize the backlog for business value and facilitating convergence in multistakeholder initiatives.
Type of metrics: Planned value/effort ratio (e.g., 30% of the effort delivers 60% of the business value); adaptability (the percentage of features not present in the initial backlog).

Source: Forrester Research, Inc.

Determine Your Metrics According To Project Complexity

Inquiries from our clients and the interviews for this research make it clear that, when it comes to metrics, one size does not fit all. In a very complex, fast-changing problem domain, it will be hard to establish a clear value stream and align all of the project resources with it. In such a situation, determining metrics like the unitary cost of a business process (e.g., the cost of processing a claim) or the cycle time from concept to cash will be difficult. In contrast, if you work at an organization where transparency and trust predominate and there’s a culture of developer discipline, then more ambitious and effective metrics are possible. Before you start determining which metrics matter to you, make sure that you have a clear idea of what goal you want to accomplish.12 To select the right metrics, you should first determine your project’s complexity profile and then identify metrics that align with the profile.

Determine Your Project’s Complexity Profile . . .

Complexity theories have been around for years, but David Snowden has created a framework that we think works especially well when thinking about software development. The Cynefin framework defines four types of complexity domains: simple, complicated, complex, and chaotic (see Figure 5).13 We’ve used the domains in the Cynefin framework to help define and select metrics appropriate for each problem domain you’re likely to encounter. Our approach takes into consideration three key factors: the level of certainty or uncertainty of requirements, team cohesiveness (how long the team has worked together and whether teams are working in a cross-functional or siloed organization), and the project team’s technology capabilities.14 To assess your project’s relative level of complexity, answer the questions in our assessment tool and determine (see Figure 6):

■ How well do you know the requirements? How much uncertainty is there around your business requirements? If you’re building a new mobile app or modernizing an existing application, the answer is probably “A lot.” If you’re adding a new feature to a well-running system or you’re emulating a competitor’s capability, then your challenge might be more straightforward. If you have an exact idea of what you need to build and best practices to implement your requirements are in place, it’s more likely that traditional development metrics and processes will work.

■ How cohesive is your project team? Here you need to assess whether project team members have worked together on multiple sprints or if the team was just assembled for the purposes of the new project. An additional complexity to consider is whether the team is actually a cross-functional team fully dedicated to the same project or is instead composed of team members that belong to different organizational silos (e.g., development, testing center of excellence, PMO) and are only virtually assembled for the purposes of this project. Teams that work together over time tend to develop high-trust relationships and establish a track record of successful estimation and delivery. If they don’t, they tend to get reassigned to other projects or filtered out of the organization.

■ How well does the project team understand the technologies it uses? When it comes to assessing technology complexity, there are three fundamental aspects to consider: the level of architecture complexity; the level of integration and dependency the application will need to run; and the newness of the technology itself, including how much exposure to and experience with the technology existing team members have, such as using mobile tools, using SOA principles, and dealing with complex APIs.

You can extend Forrester’s Cynefin-based assessment tool to include additional assessment drivers beyond project complexity. Keep in mind that the metrics you select will very much rely on the cultural context of your organization. We recommend that you prioritize metrics that promote transparency and trust in the way they are used.


Figure 5 The Cynefin Framework

Simple (known): sense-categorize-respond
Complicated (mostly known): sense-analyze-respond
Complex (unknown): probe-sense-respond
Chaotic (turbulent, unconnected): act-sense-respond
Disorder (center)

Source: Cognitive Edge; Forrester Research, Inc.


Figure 6 Assess Your Cynefin Complexity

Requirements uncertainty (true or false?)
• Requirements are totally unknown and keep changing.
• Requirements are mostly unknown.
• Requirements are mostly known.
• Requirements are well-known and repeat themselves.

Project team context (true or false?)
• The team consists largely or completely of new members and is not cross-functional.
• The team consists largely or completely of new members but is cross-functional.
• The team consists partially of new members and is not cross-functional.
• The team consists partially of new members and is cross-functional.
• The team does not have new members (>8 sprints or 2 releases together) but is not cross-functional.
• The team does not have new members (>8 sprints or 2 releases together) and is cross-functional.

Technology capabilities (true or false?)
• The team has not mastered the technology, and the technology is complex (major integration issues, complex app landscape).
• The team has not mastered the technology, but the technology is simple (web app, little integration, simple app landscape).
• The team has mastered the technology, and the technology is complex (major integration issues, complex app landscape).
• The team has mastered the technology, and the technology is simple (web app, little integration, simple app landscape).

Note: Only one “True” response is permitted in each of the three groups; each item within a group is weighted differently. The tool calculates the final score based on which items receive “True” responses; the final score determines whether the project is simple, complicated, complex, or chaotic. The spreadsheet associated with this figure is interactive.

Source: Forrester Research, Inc.
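The note above describes the scoring mechanics without publishing the weights. As a rough illustration only, here is a minimal Python sketch of how such a weighted, one-answer-per-group assessment could map to a Cynefin domain; all weights and thresholds below are invented assumptions, not Forrester's.

# Hypothetical re-creation of the Figure 6 scoring logic. Weights and
# thresholds are invented; only the structure (one "True" per group,
# weighted items, score -> domain) follows the figure's note.

REQUIREMENTS = {  # higher = more uncertain
    "totally unknown and changing": 4,
    "mostly unknown": 3,
    "mostly known": 2,
    "well-known and repeating": 1,
}
TEAM = {  # higher = less cohesive
    "new members, not cross-functional": 4,
    "new members, cross-functional": 3,
    "partially new, not cross-functional": 3,
    "partially new, cross-functional": 2,
    "stable (>8 sprints), not cross-functional": 2,
    "stable (>8 sprints), cross-functional": 1,
}
TECHNOLOGY = {  # higher = less mastered / more complex
    "not mastered, complex": 4,
    "not mastered, simple": 3,
    "mastered, complex": 2,
    "mastered, simple": 1,
}

def cynefin_domain(req: str, team: str, tech: str) -> str:
    """Map one answer per group to a domain via a total score of 3..12."""
    score = REQUIREMENTS[req] + TEAM[team] + TECHNOLOGY[tech]
    if score <= 4:
        return "simple"
    if score <= 7:
        return "complicated"
    if score <= 10:
        return "complex"
    return "chaotic"

print(cynefin_domain("mostly unknown",
                     "partially new, cross-functional",
                     "mastered, complex"))  # -> "complicated"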

. . . And Then Identify Metrics That Align With The Profile

Once you’ve determined the complexity of your project and mapped it to the domains of the Cynefin framework, it will be easier to select the best metrics to measure success, progress, and necessary improvements (see Figure 7). As your projects move from simple to complex, the use of value-based and progressive metrics will vary quite significantly. On the other hand, we find that quality and efficiency metrics remain relatively stable across all Cynefin domains:


■ For simple projects, concentrate on business value metrics. When projects are simple and straightforward, it’s not too difficult to compute business value metrics, such as ROI, the number of new customers, or percentage increase in sales, and use them to measure the project. Simple projects usually have known or stable requirements, and Agile development techniques are not as valuable in simple projects as they are in the complicated, complex, and chaotic domains. In fact, traditional metrics like the number of function points developed or number of completed features over time will work just fine. Another useful measure would be to examine whether the team is completing the “right” features over time by analyzing the number of completed features versus the number of new clients or increase in revenue.

■ Complicated project measures shift from business value to progressive quality and efficiency. Complicated projects are good candidates for Agile delivery tactics, because requirements are less stable and uncertainty is higher. You might still be able to use some value metrics, like cycle time from ideation to cash generation on a specific project or revenue per employee, if your company has a culture of tight financial governance, but it won’t be as straightforward and easy as it is with simple projects. In the complicated domain, velocity trends over multiple sprints can be correlated to customer satisfaction or sales increases, but the relationship might not be easy to create. From a quality perspective, complicated projects should focus on automation coverage while maintaining a focus on the mean time to repair (MTTR) for efficiency metrics.

■ Complex projects should employ strongly progressive metrics. Focus on metrics that measure the health of your projects as well as measuring improvement toward both business and technical goals. While value metrics might be hard to correlate to more technical metrics, you can still offer transparency to show how predictability is improving and how you are truly listening to business and client requests by measuring validated learning. Because changes are anticipated in due course on Agile projects, you can use the rate of incoming changes and reaction to them to track and prove business agility. Technical debt is also a useful metric to share with the business for complex projects.

■ Chaotic projects should focus on reducing risk. Projects in the chaotic domain are really experiments whose goal is figuring out potential causes and effects. A focus on validated learning as described in the “Lean Startup” approach is a good way to connect with the business and your end customers through an act-sense-respond approach. Attack the riskiest project items first and focus on feedback-oriented metrics like customer usage profiles, adoption rates, and the results of multivariate testing.


Figure 7 Select The Metrics That Match Your Profile

Simple
Value: Business value metrics like ROI, percentage increase in sales, percentage increase in customers, unitary costs, etc., in addition to customer satisfaction.
Quality: Number of defects in production; percentage of test cases covered.
Progressive: Sizing (FP, SLOC) over man-hours or days; number of completed features.
Operational: MTTR.

Complicated
Value: Revenue per employee; cycle time (concept to market); customer satisfaction.
Quality: Number of defects in production; defects in development; percentage of automation; percentage of test cases covered.
Progressive: Velocity over sprints; cycle time (build to deploy).
Operational: MTTR; risk.

Complex
Value: Predictability; customer satisfaction; customer usage; validated learning.
Quality: Number of defects in production; percentage of automation; percentage of automated regression tests; number of automation execution failures; number of commit test failures.
Progressive: Risk reduction; velocity over sprints; cycle time (build to deploy); technical debt; burndown.
Operational: MTTR; cycle time (frequency of production deployment per sprint); cumulative flow; risk.

Chaotic
Value*: Customer adoption rates; customer usage profiles; validated learning; sales percentage; number of customers; innovation accounting; multivariate testing.
Quality*: Number of defects in production.
Progressive*: Risk reduction (e.g., rate of change, team cohesiveness).
Operational: Measure efficiency only once risk is manageable.

*The focus should be on emergent, feedback-based metrics.

Source: Forrester Research, Inc.
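Figure 7 translates naturally into a lookup table that a team can extend with its own metrics. The following minimal Python sketch is our own encoding of the figure (abbreviated lists; key names are ours, not Forrester's).

# The Figure 7 mapping as a lookup table (abbreviated; see the figure for
# the full lists). A starting point for a team's own metrics catalog.
METRICS_BY_DOMAIN: dict[str, dict[str, list[str]]] = {
    "simple": {
        "value": ["ROI", "% increase in sales", "customer satisfaction"],
        "quality": ["defects in production", "% test cases covered"],
        "progressive": ["size (FP, SLOC) over effort", "completed features"],
        "operational": ["MTTR"],
    },
    "complicated": {
        "value": ["revenue per employee", "cycle time (concept to market)"],
        "quality": ["defects in production", "% automation"],
        "progressive": ["velocity over sprints", "cycle time (build to deploy)"],
        "operational": ["MTTR", "risk"],
    },
    "complex": {
        "value": ["predictability", "customer usage", "validated learning"],
        "quality": ["% automation", "commit test failures"],
        "progressive": ["risk reduction", "technical debt", "burndown"],
        "operational": ["MTTR", "deployment frequency", "cumulative flow"],
    },
    "chaotic": {
        "value": ["adoption rates", "validated learning", "multivariate testing"],
        "quality": ["defects in production"],
        "progressive": ["risk reduction"],
        "operational": ["defer until risk is manageable"],
    },
}

print(METRICS_BY_DOMAIN["complicated"]["progressive"])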

Recommendations

Define, Adapt, And Combine Your Metrics To Continuously Improve

Metrics and measurement programs cannot be the same for all types of projects and all types of organizations. As the complexity of your development environment evolves:

■ Shift from “iron triangle” metrics to value or benefit realization metrics. The introduction of value metrics is where Agile and Lean thinking, and many of the thought leaders interviewed for this research, are heading. Work to connect the dots between business value metrics and IT project efficiency or progressive metrics. If you have simple projects, start there; experiment and learn how to use value metrics to measure how development affects your business bottom line. Consider using multivariate testing to connect business value to development when more traditional approaches don’t work.15


■ Use quality metrics to measure if you’re doing the right things as well as doing things right. You will always have to include this category of metrics for any project, no matter what its complexity. Quality metrics address functional quality, which helps you answer the question: “Am I doing the right things for the business?” Example metrics include customer experience and adaptability (rework trends). Quality metrics must also address technical quality, so look to measures like defects in production, defect density, and failure rates. Agile teams should track quality metrics like the number of defects per sprint and release and, more importantly, metrics on automation trends (for test cases as well as process automation).

■ Use progress metrics to keep track of project health. Lean and Agile thinking is all about maximizing value for the business; progress metrics should continuously track the advancement toward that goal. If you are a mature Agile team, focus on cycle time. Cycles tracked should focus on throughput in different segments of the life cycle; from build to deploy is a common one when continuous deployment is part of your daily practice. Other useful metrics are the level of technical debt and reduced risk (and/or increased predictability).

■ Optimize your efficiency tracking metrics. Efficiency metrics are useful when you’ve reached a stable project state. MTTR is one of the key metrics in this category; it’s particularly helpful if you have a strong root-cause analysis process and act quickly to fix identified problems. MTTR is also an indicator of good and bad technical and architectural quality. For more advanced Lean teams, cumulative flow is an effective measure used to track the throughput efficiency of value streams.
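As a concrete illustration of the MTTR computation mentioned in the last bullet, here is a minimal Python sketch; the data shape and sample incidents are invented for the example.

# Illustrative MTTR computation: mean hours from incident open to resolution.
from datetime import datetime

incidents = [  # (opened, resolved) timestamps; invented sample data
    ("2013-09-01 08:00", "2013-09-01 12:00"),
    ("2013-09-03 09:30", "2013-09-03 10:30"),
    ("2013-09-05 14:00", "2013-09-06 14:00"),
]

def mttr_hours(pairs: list[tuple[str, str]]) -> float:
    """Mean time to repair, in hours, over a list of open/close timestamps."""
    fmt = "%Y-%m-%d %H:%M"
    total = sum(
        (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()
        for start, end in pairs
    )
    return total / len(pairs) / 3600.0

print(f"MTTR: {mttr_hours(incidents):.1f} hours")  # (4 + 1 + 24) / 3 = 9.7

A steadily falling MTTR, paired with root-cause analysis, is the signal the bullet above describes; a rising one often points back to technical or architectural quality problems.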

What It Means

Link Metrics To Clear Goals And Manage The Life Cycle

Metrics have a life cycle. Once you’ve defined metrics, you will have to adapt and combine them, because their impact will change as the complexity of your projects changes. Use and extend our Cynefin metrics framework to manage the life cycle of your metrics. Make sure that you always link metrics to clear goals (business and IT) and that they help you drive process improvement and become a better application development team for your business.

Supplemental Material

Online Resource

The online version of Figure 6 is an interactive tool to determine your project complexity.


Companies Interviewed For This Report

407 ETR

A global automotive manufacturer

Ci&T

CollabNet

Dominion Digital

Evolve Beyond

IBM

Rally Software

Scrum.org

Serena Software

Tom Gilb & Kai Gilb

XebiaLabs

Endnotes

1 Systems of engagement are software applications that help decision-makers make decisions on the fly, support clients in executing important personal and business tasks anytime and anywhere, and provide crucial in-the-moment information when needed. See the November 16, 2012, “Great Mobile Experiences Are Built On Systems Of Engagement” report.

2 Tom and Kai Gilb have a rich repertoire of books, blogs, and papers. Source: Tom Gilb & Kai Gilb (http://www.gilb.com).

3 Gabrielle and Robert Benefield are defining the “outcome delivery” method on the roots of EVO. Source: Evolving Systems (http://www.evolving.com/).

4 Walker Royce has written on the key role of uncertainty in software development and how to manage it. Source: Walker Royce, “Measuring Agility and Architectural Integrity,” ISCAS, March 2011 (http://walkerroyce.com/PDF/Measuring_Agility.pdf).

5 Source: Barry W. Boehm, Software Engineering Economics, Prentice Hall, 1981.

6 In the late 1980s, NASA developed a handbook of software development practices. Many of their conclusions hold true today. Source: “Manager’s Handbook for Software Development,” NASA, November 1990 (http://homepages.inf.ed.ac.uk/dts/pm/Papers/nasa-manage.pdf).

7 “Story point” is an arbitrary measure used by Scrum teams to gauge the effort required to implement a story. A “backlog” is a list of features or technical tasks that the team maintains and which, at a given moment, are known to be necessary and sufficient to complete a project or a release. In the Scrum framework, all activities needed to implement entries from the Scrum product backlog are performed within sprints (also called “iterations”). Sprints are always short: normally about two to four weeks. Source: “What is a story point?” AgileFaq, November 13, 2007 (http://agilefaq.wordpress.com/2007/11/13/what-is-a-story-point/).

8 It is critical for application development professionals to be able to effectively and objectively answer the question “How big is your software project?” in order to provide effective metrics, improve estimation practices, target improvement initiatives, and refine governance and architectural processes. For more, see the July 27, 2009, “Software Size Matters, And You Should Measure It” report.

9 One commonly used method during the estimation process is to play “planning poker” (also called “Scrum poker”). When using planning poker, influences between the participants are minimized, producing a more accurate estimate. Source: “Scrum Effort Estimations — Planning Poker,” International Scrum Institute (http://www.scrum-institute.org/Effort_Estimations_Planning_Poker.php).

10 Baselining and benchmarking at the organizational level might be useful in transformation programs, but you must track a statistically relevant number of projects.

11 Commit tests are a basic set of tests that are run against each code commit to the mainline source trunk.

12 The best way to link metrics to clear objectives and goals is through the goal question metric framework. For more, see the August 26, 2011, “Introducing ADAM: Forrester’s Application Development Assessment Methodology” report.

13 For details on complexity theories and the origins of the Cynefin framework, check out the Cognitive Edge website. Source: Cognitive Edge (http://cognitive-edge.com/library/more/articles/summary-article-on-cynefin-origins/).

14 Many of David Snowden’s research papers and blogs make reference to complexity drivers mostly influenced by the people factor as well as the uncertainty around requirements. For software projects, we think that how well (or poorly) teams master technology also makes a big difference.

15 When there isn’t an easy connection between business metrics and operational metrics, the hypothesis development approach is the best way to try to establish the correlation, and tactics like multivariate testing help to do so. This is a widely used approach in mobile development. Source: Avinash Kaushik, “Experimentation and Testing: A Primer,” Occam’s Razor, May 22, 2006 (http://www.kaushik.net/avinash/experimentation-and-testing-a-primer/).

Forrester Research, Inc. (Nasdaq: FORR) is an independent research company that provides pragmatic and forward-thinking advice to global leaders in business and technology. Forrester works with professionals in 13 key roles at major companies providing proprietary research, customer insight, consulting, events, and peer-to-peer executive programs. For more than 29 years, Forrester has been making IT, marketing, and technology industry leaders successful every day. For more information, visit www.forrester.com.


Forrester Focuses On Application Development & Delivery Professionals

Responsible for leading the development and delivery of applications that support your company’s business strategies, you also choose technology and architecture while managing people, skills, practices, and organization to maximize value. Forrester’s subject-matter expertise and deep understanding of your role will help you create forward-thinking strategies; weigh opportunity against risk; justify decisions; and optimize your individual, team, and corporate performance.

Andrea Davies, client persona representing Application Development & Delivery Professionals

About Forrester

A global research and advisory firm, Forrester inspires leaders, informs better decisions, and helps the world’s top companies turn the complexity of change into business advantage. Our research-based insight and objective advice enable IT professionals to lead more successfully within IT and extend their impact beyond the traditional IT organization. Tailored to your individual role, our resources allow you to focus on important business issues (margin, speed, growth) first, technology second.

For More Information

To find out how Forrester Research can help you be successful every day, please contact the office nearest you, or visit us at www.forrester.com. For a complete list of worldwide locations, visit www.forrester.com/about.

Client Support

For information on hard-copy or electronic reprints, please contact Client Support at +1 866.367.7378, +1 617.613.5730, or [email protected]. We offer quantity discounts and special pricing for academic and nonprofit institutions.