Quality Index: A Composite Metric for the Voice of Testing


Transcript of Quality Index: A Composite Metric for the Voice of Testing

Page 1: Quality Index: A Composite Metric for the Voice of Testing

©2015, Cognizant Technology Solutions

Quality Index: A Composite Metric for the Voice of Testing

October 1, 2015

Nirav Patel – Walgreens Boots Alliance, [email protected]

Sutharson Veeravalli – Cognizant Technology Solutions, [email protected]

Page 2: Quality Index: A Composite Metric for the Voice of Testing


Speaker Bios

A director in Cognizant's Quality Engineering & Assurance (QE&A) practice, Sutharson Veeravalli manages large delivery organizations and specializes in implementing transformational initiatives that enhance the delivery ecosystem. Sutharson has twenty years of experience spanning the banking and retail domains, process and domain consulting, large system delivery, product management, and QA transformation for clients. He brings diverse experience leading multiple transformational initiatives that have delivered multimillion-dollar savings to clients.

Nirav Patel, the QA manager of QAE for the pharmacy, retail, and health and wellness divisions at Walgreens, has been in QA for more than fifteen years. Specializing in metrics-driven governance and financial control, Nirav drives the concept-to-implementation process for transformational initiatives across his portfolio. He is an active advocate of quality engineering to drive consistency and efficiency within the practice.


Page 3: Quality Index: A Composite Metric for the Voice of Testing


The curious case of information overload...

GUARD [on screen]: I've sealed the compartment. It can't get out; that lock's got a billion combinations.
DOCTOR: A Dalek's a genius. It can calculate a thousand billion combinations in one second flat.
DOCTOR: It's not just energy. That Dalek just absorbed the entire internet. It knows everything.
– Doctor Who, 2005 series, Episode 6

Blessed are the ones who have Daleks as their project managers... the rest of us still have to deal with information overload when managing projects.

How many times have you felt...

• Overwhelmed by the amount of data and metrics produced by the teams, yet unable to get a holistic view of project health?

• Unsure whether in-progress and completed programs are on track to deliver the intended business outcomes?

• Limited in conveying project/program status to non-technical business and senior management stakeholders?

• Challenged in articulating the quality assurance view to external stakeholders?


Page 4: Quality Index: A Composite Metric for the Voice of Testing


Why won't traditional metrics and articulation solve our challenges?

Business expectations
• A simple, non-technical view of program/project status
• Standardized, contextual, and actionable information
• Information aligned to the outcomes to be delivered

QA challenges
• Delivery metrics require some understanding of the QA domain
• A plethora of metrics that is too difficult to summarize
• The voice of QA is limited and mostly overridden by program pressures
• Lack of a business-outcome view; the focus is more on tactical delivery
• QA is expected to restrict itself to test execution and achieving the end date

Solution
Synergize bottom-up metrics to create a multi-dimensional view that uses a simple composite metric to succinctly summarize status against expected outcomes, strengthening the "Voice of QA."

A standardized, business-outcome-aligned metrics framework that uses the "Quality Index" as the voice of QA.


Page 5: Quality Index: A Composite Metric for the Voice of Testing


Multi-dimensional Metrics Framework

Single view of quality
• Quality Index based reporting
• A unified view of quality depicting the quality level against benchmark

Business outcome aligned
• At the highest level, metrics are aligned to business outcomes: execution predictability, product stability, learning & skills, and financial control

Broader measures
• Measures are not restricted to a single phase
• A specific measure (called a quality gauge) is defined for each phase of the SDLC

Contextual insights
• Metrics are collated contextually per stakeholder
• Level 1: QA stakeholders
• Level 2: SDLC stakeholders
• Level 3: Business stakeholders

Objective: create a unified metrics framework to
• Standardize metrics, data sources, tools, and the process to capture data
• Provide unified reporting for multi-level (L1, L2, and L3), contextual management status
• Facilitate a single view of quality through the Quality Index (QI) at project, program, and release level, plus a centralized dashboard
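To make the framework concrete, the following is a minimal data-model sketch in Python of how metrics, measures, quality gauges, and the three reporting levels could be organized. The class and field names are illustrative assumptions, not taken from the deck.

```python
# Illustrative sketch only: class and field names are assumptions, not from the deck.
from dataclasses import dataclass, field
from enum import Enum


class ReportingLevel(Enum):
    L1_QA = 1        # Level 1: QA stakeholders
    L2_SDLC = 2      # Level 2: SDLC stakeholders
    L3_BUSINESS = 3  # Level 3: business stakeholders


@dataclass
class Metric:
    name: str
    benchmark: float           # expected benchmark value for this metric
    weightage: float           # contribution to its parent measure, 0..1
    value: float | None = None # latest measured value, if captured


@dataclass
class Measure:
    """A business-outcome-aligned measure, e.g. 'Execution Predictability'."""
    name: str
    metrics: list[Metric] = field(default_factory=list)


@dataclass
class QualityGauge:
    """A phase-specific set of measures, e.g. for 'Test Execution & Closure'."""
    sdlc_phase: str
    measures: list[Measure] = field(default_factory=list)
    audience: ReportingLevel = ReportingLevel.L1_QA
```

One design point: keeping benchmark and weightage on the metric itself is what allows the same data to be rolled up into a project-, release-, or program-level Quality Index later, without restating the targets at each level.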


Page 6: Quality Index: A Composite Metric for the Voice of Testing


Broader Measures & Contextual Insights

Each phase of the SDLC (Requirements Analysis, Design & Build, Test Design, Test Execution & Closure) carries its own quality gauge: requirements maturity, build maturity, upstream quality, QA efficiency, and QA quality. Metrics are then surfaced for two audiences, SDLC stakeholders and QA stakeholders. The metrics shown on the slide include:

• Process adherence
• Automation/tooling percentage improvement
• Defects found in DIT
• Build success rate
• Deployment time
• Requirements traceability
• Review efficiency
• Test design productivity (manual and automated)
• Performance test design productivity
• Rework effort (due to QA)
• Build-to-QA spend ratio
• Requirements implementation index (based on static testing defects)
• Test execution productivity
• Performance test execution productivity
• Test efficiency = (1 - defect rejection rate)
• Requirement stability index (RSI)
• Defect reopen rate
• Environment downtime
• Schedule variance
• Business time to market
• QA committed schedule


Page 7: Quality Index: A Composite Metric for the Voice of Testing


Business Outcome Aligned Metrics with QI

Each business outcome area carries its own Quality Index, and benefits are quantified through a single unit of measure: "Y-o-Y improvement on cost/TCP."

Execution Predictability
• Requirements implementation index
• Requirement stability index (RSI)
• Defect reopen rate
• Test environment downtime
• Schedule variance
• Effort variation (due to QA)
• Risk-aligned test coverage
• Maturity of build organization

Production Stability
• Requirements traceability
• UT/DIT coverage
• UT/DIT test results
• Defect leakage to production
• Application defect density

Learning & Skills
• Team competency index
• Review efficiency
• Innovation index

Financial Control
• Test design productivity (manual, automation, performance)
• Test execution productivity (manual, automation, performance)
• Rework effort (due to QA)
• Test effectiveness
• Build-to-QA ratio
• % CAPEX reduction through virtualization
• Automation coverage

What is the improvement?


Page 8: Quality Index: A Composite Metric for the Voice of Testing


Quality Index – Voice of QA

Quality Index (QI) is the composite metric used to summarize the quality of each measure. A QI of 1 indicates the measure is in line with the expected benchmark value.

How is QI calculated?
• A measure (e.g., execution predictability) is linked to its influencing metrics.
• Every metric is measured against its expected benchmark.
• The QI is 1 when the metric value equals the expected benchmark level.
• A RAG status is assigned based on the QI value: QI >= 1 is Green, 0.8 <= QI < 1 is Amber, and QI < 0.8 is Red.
• The individual metric-level QIs are rolled up to the measure level by assigning a weightage to each metric.

How is QI rolled up? The Quality Index is calculated at the project level and rolled up through the business: projects (Project 1 ... Project n) roll up to releases (Release 1 ... Release n), which in turn roll up to lines of business (LOB 1, LOB 2) and the program.

How is QI represented? QI is shown as a dial for the current value along with a trend graph.
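Here is a minimal sketch of the calculation and roll-up just described. The lower-is-better handling, the cap on QI, and the simple averaging from projects to releases to the program are assumptions; the deck only states that QI equals 1 at benchmark and gives the RAG thresholds.

```python
# Sketch only: the 'lower_is_better' handling, the 1.5 cap, and the simple
# averaging roll-up are assumptions; the deck defines QI = 1 at benchmark.

def metric_qi(value: float, benchmark: float,
              lower_is_better: bool = True, cap: float = 1.5) -> float:
    """QI is 1 when the metric meets its benchmark, above 1 when it beats it."""
    if lower_is_better:
        qi = cap if value == 0 else benchmark / value
    else:
        qi = value / benchmark
    return min(qi, cap)


def rag(qi: float) -> str:
    """RAG status per the slide: >= 1 Green, 0.8 to 1 Amber, < 0.8 Red."""
    if qi >= 1.0:
        return "Green"
    return "Amber" if qi >= 0.8 else "Red"


def rollup(child_qis: list[float]) -> float:
    """Project -> release -> LOB/program roll-up (a simple average here)."""
    return sum(child_qis) / len(child_qis)


# Example: three project QIs roll up to a release, two releases to a program.
release_1 = rollup([0.95, 1.10, 0.78])
program = rollup([release_1, 1.02])
print(rag(release_1), rag(program))
```

A weighted roll-up (for example, by project effort or spend) would be an equally plausible choice; the deck does not specify how the averaging across projects and releases is done.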

Page 9: Quality Index: A Composite Metric for the Voice of Testing


Example: Calculating Execution Predictability

Execution Predictability - this view helps analyze, evaluate, and infer the execution predictability of a project from the QAE point of view. It shows whether the execution measures/metrics are on track and supports course correction when there are deviations. The metrics captured in this view, each with a formula, source tool, weightage, and benchmark, are:

• % Rework effort due to change in requirements - measures the effort spent reworking test artifacts: (Total rework effort across all phases / Actual effort expended for the project) * 100. Tools #1 and #2, %. Weightage 15%, benchmark 4%.
• Defect reopen rate - measures the quality of defect fixes: (Number of defects reopened post retest / Number of valid defects logged) * 100. Tool #1, %. Weightage 20%, benchmark 5%.
• Test environment downtime - measures the availability of test environments to the test teams: (Person days lost due to environment downtime / (Total planned test execution person days + Planned test data setup person days)) * 100. Tool #1, %. Weightage 25%, benchmark 8%.
• Schedule variance - measures the deviation of the actual schedule against the planned schedule: ((Actual end date - Planned end date) / (Planned end date - Planned start date)) * 100. Tool #2, %. Weightage 15%, benchmark 5%.
• Requirement stability index (RSI) - measures the stability of requirements: (Number of original requirements + Number of requirements changed + Number of requirements added + Number of requirements deleted) / Number of original requirements. Tool #1, ratio. Weightage 10%, benchmark 1.
• Effort variance - measures the deviation of the actual effort expended against the estimated effort: ((Actual effort - Estimated effort) / Estimated effort) * 100. Tool #2, %. Weightage 15%, benchmark 5%.

The measure-level value is computed in three steps:

Step 1: Compute the base Quality Index. The metric value is compared against its benchmark. For example, if "% Rework" is 9% and the benchmark is 4%, then QI = (4% / 9%) * 100% = 44%.

Step 2: Compute the proportionally weighted index. Multiply the value from Step 1 by the metric's weightage: 44% * 15% = 6.6%.

Step 3: Compute Execution Predictability as the sum of the proportionally weighted indices of all its influencing metrics.
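A short sketch of the three steps, reproducing the slide's "% Rework effort" figures. Treating the metric as lower-is-better (benchmark divided by actual) is an assumption, but it is consistent with the worked numbers above.

```python
# Worked example from the slide: % Rework = 9%, benchmark = 4%, weightage = 15%.
rework_actual = 0.09
rework_benchmark = 0.04
rework_weightage = 0.15

# Step 1: base Quality Index (benchmark / actual for a lower-is-better metric).
base_qi = rework_benchmark / rework_actual      # ~0.444, i.e. ~44%

# Step 2: proportionally weighted index = base QI * weightage.
weighted_index = base_qi * rework_weightage     # ~0.067 (the slide rounds 44% * 15% to 6.6%)

# Step 3: Execution Predictability = sum of the weighted indices of all six
# influencing metrics; only the rework term is shown here.
print(f"Base QI: {base_qi:.1%}  Weighted index: {weighted_index:.1%}")
```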


Page 10: Quality Index: A Composite Metric for the Voice of Testing


Typical QI based dashboard


Page 11: Quality Index: A Composite Metric for the Voice of Testing


In Summary

The business value of QA can be demonstrated through the Quality Index, a multi-dimensional metrics framework, outcome alignment, and a quality dashboard:

• Provide contextual, actionable, and relevant information to stakeholders.
• Align messaging to business outcomes rather than delivery status.
• Maintain a single view of quality using a composite metric such as the Quality Index.
• Scale the framework beyond delivery into all QA goals.



Page 14: Quality Index: A Composite Metric for the Voice of Testing


Outcome-Based View

Expected outcome -> Measure -> Influencing metrics:

• Programs and projects need to be monitored for on-time delivery -> Execution Predictability: % rework effort due to change in requirements; defect reopen rate; test environment downtime; schedule variance; requirement stability index (RSI); % effort variation.

• Measure the quality of the product under test by BSS QAE -> Product Stability - Inflight: requirements traceability; application defect density.

• Measure the quality of the product deployed post the BSS QAE phase -> Product Stability - Post QA: UAT defect leakage (%); production defect leakage (%).
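Expressed as a plain configuration structure, the same outcome-to-measure-to-metrics mapping might look like the sketch below; the dictionary layout and key names are assumptions for illustration only.

```python
# Sketch: the mapping above as a plain config dict; layout and keys are assumptions.
OUTCOME_VIEW = {
    "On-time delivery of programs and projects": {
        "measure": "Execution Predictability",
        "influencing_metrics": [
            "% Rework effort due to change in requirements",
            "Defect reopen rate",
            "Test environment downtime",
            "Schedule variance",
            "Requirement stability index (RSI)",
            "% Effort variation",
        ],
    },
    "Quality of the product under test by BSS QAE": {
        "measure": "Product Stability - Inflight",
        "influencing_metrics": [
            "Requirements traceability",
            "Application defect density",
        ],
    },
    "Quality of the product deployed post the BSS QAE phase": {
        "measure": "Product Stability - Post QA",
        "influencing_metrics": [
            "UAT defect leakage (%)",
            "Production defect leakage (%)",
        ],
    },
}
```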


Page 15: Quality Index: A Composite Metric for the Voice of Testing


QI – Calculation Sample

Each influencing metric below shows its values across Releases 1 through 4, its weightage, its benchmark, and the resulting metric-level Quality Index.

Execution Predictability (measure-level Quality Index: 1.04)
• % Rework effort due to change in requirements: 0.3%, 1.0%, 9.0%, 1.0%; weightage 15%; benchmark 4%; metric-level QI 150.0%
• Defect reopen rate: -, 10.0%, 12.0%, 11.0%; weightage 20%; benchmark 5%; metric-level QI 0.0%
• Test environment downtime: 12.0%, 9.0%, 9.0%, 3.0%; weightage 25%; benchmark 8%; metric-level QI 66.7%
• Schedule variance: 3.0%, 10.0%, -, -; weightage 15%; benchmark 5%; metric-level QI 150.0%
• Requirement stability index (RSI): -, 1.10, 1.07, 1.30; weightage 10%; benchmark 1; metric-level QI 0.0%
• % Effort variation: -0.4%, 56.0%, -6.0%, -17.0%; weightage 15%; benchmark 5%; metric-level QI 122.0%

Production Stability (measure-level Quality Index: 1.21)
• Inflight - Requirements traceability: 100.0%, 100.0%, 100.0%, 100.0%; weightage 5%; benchmark 100%; metric-level QI 100.0%
• Inflight - Application defect density: 14.0%, 12.0%, 8.0%, 11.0%; weightage 20%; benchmark 1%; metric-level QI 70.0%
• Post QA - UAT defect leakage (%): 1.0%, 0.2%, 1.0%, 0.1%; weightage 35%; benchmark 3%; metric-level QI 150.0%
• Post QA - Production defect leakage (%): 2.0%, 0.8%, 1.0%, -; weightage 40%; benchmark 2%; metric-level QI 100.0%


Page 16: Quality Index: A Composite Metric for the Voice of Testing


A View for Agile

Execution Predictability
• Requirements implementation index
• Requirement stability index (RSI)
• Defect reopen rate
• Test environment downtime
• Schedule variance
• Effort variation
• Risk-aligned test coverage
• Maturity of dev organization
• Defect density (Agile)
• % Scope creep
• Testing efficiency at the sprint level

Financial Control
• Test design productivity (manual, automation, performance)
• Test execution productivity (manual, automation, performance)
• Rework effort (due to QA)
• Test effectiveness
• Build-to-QA ratio
• Offshore/nearshore leverage
• % CAPEX reduction through virtualization
• Sprint velocity and velocity rate
• Story point productivity

Production Stability
• Requirements traceability
• UT/DIT coverage
• UT/DIT test results
• Defect leakage to production
• Application defect density
• Testing efficiency at the sprint level

Learning & Skills
• Team competency index
• Review efficiency
• Innovation index
