Gary A. Gack, MBA, Six Sigma Black Belt, ASQ Certified Software Quality Engineer
Owner, Process-Fusion.net
© 2011 Process-Fusion.net
Modeling and Managing Software Productivity & Quality … balancing Efficiency and Effectiveness
Softec 2011, Kuala Lumpur, Malaysia
Agenda
• Measuring Efficiency (Productivity): the Cost of Quality Framework
• Measuring Effectiveness (Quality): Defect Containment
• Modeling & Managing Efficiency and Effectiveness: Why Modeling? Scenarios Considered; Effectiveness Results; Efficiency Results
“An Apple a Day … “
What is “Efficient” (Productive)? How can it be measured?
A Lean Perspective
The Cost of Quality Framework
A software process is “productive” (efficient) if, relative to an alternative …
• It produces an equivalent or better result at lower cost.
• For example, if defect-finding strategy “A” finds the same number of defects as strategy “B” (i.e., the two strategies are equally effective), but does so at lower cost, then strategy “A” is more efficient than strategy “B”: A is more “productive” than B.
“Productive”? What does that mean?
Feigenbaum’s Cost of (Poor) Quality Framework
[Diagram: Appraisal = finding defects (“pre-release”); Internal Failure = fixing defects (“pre-release”); External Failure = fixing defects (“post-release”)]
“Lean” Meets Software Development
Taiichi Ohno’s 7 Wastes, with a Software/IT Translation:
• Defects → Rework: missing, wrong, extra (avoidable)
• Overproduction → Low-value “features”, unused “hooks”
• Inventory → Unassigned backlog: requirements, designs
• Extra processing → Unused documentation
• Unnecessary motion → Task switching, concurrent assignments
• Waiting → Delays for approvals, decisions, resources
• Transportation → Handoffs
What % of Spend is “Value Added”? (i.e., creating new features & functions)
[Chart: % Non-Value Added = (Prevention + Appraisal) + (Rework) = ??]
Total Cost =
Value Added: new features & functions
+ Finding & fixing defects - “internal” (pre-delivery) and “external” (post-delivery)
+ Prevention: training, process improvement efforts
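The total-cost breakdown above can be turned into a tiny calculation. All figures below are hypothetical, chosen only to illustrate the arithmetic; the non-value-added share uses the approximation NVA ~= Appraisal + Rework.

```python
# Hypothetical effort figures (hours or dollars; the units don't matter),
# chosen only to illustrate the Cost of Quality arithmetic.
value_added = 300_000   # building new features & functions
appraisal   = 120_000   # finding defects: reviews, inspections, testing
rework      = 180_000   # fixing defects, internal and external
prevention  = 30_000    # training, process improvement efforts

total = value_added + appraisal + rework + prevention
nva = appraisal + rework   # approximation: NVA ~= Appraisal + Rework

print(f"Total cost: {total:,}")
print(f"Non-value-added share: {nva / total:.0%}")
```

With these (illustrative) numbers, nearly half of total spend is devoted to finding and fixing defects rather than creating new functionality.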
Software Industry Cost of Quality
[Chart, source: Capers Jones: “A” = 42%; “B” = 20%; “C” (effort devoted to “de-scoped” features) = 10% ??; A + B + C >= 70%]
To improve Efficiency (productivity), reduce NVA
• NVA ~= Appraisal + Rework
• (Optimization = what, when, how)
Key “Take-away”:
What is “Effectiveness”? How can it be measured?
A Quality Perspective
Defect Containment
Delivered software is “effective” if:
(1) It serves a valid organizational purpose. Efforts are made to quantify this aspect of effectiveness with return-on-investment estimates, yet it is essentially a subjective evaluation.
(2) It is acceptably defect free. The term “defect” in this context is intended to be broadly construed:
• e.g., a missed or incorrect requirement is a defect; a user-unfriendly design is a defect.
• Hence, once a project has been initiated, the effectiveness of the software process used to execute the project is appropriately measured by defect containment, i.e., the percentage of defects removed before the software product is delivered to the customer.
“Effective”? What does that mean?
• “Total Containment Effectiveness” (TCE) = % of defects found before release
– e.g., 80 defects found in test, 20 found by customers = 80% TCE
– Measure customer defects over agreed time frames (3/6/12 months)
• Defect Containment “Efficiency” considers cost
• Phase/Iteration (“step”) Appraisal Containment = % of defects present found by a specific appraisal event
– e.g., of 50 defects present in requirements, 40 found by inspection = 80% “step” containment
– Defects present can be estimated and/or evaluated in retrospect by identifying “origin”
Defect Containment Defined
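The two measures defined above reduce to simple ratios. A minimal sketch, using the slide’s own example numbers (the function names are mine, for illustration only):

```python
def total_containment_effectiveness(found_pre_release, found_post_release):
    """TCE: share of all known defects found before release."""
    return found_pre_release / (found_pre_release + found_post_release)

def step_containment(defects_present, defects_found):
    """Share of the defects present that one appraisal event found."""
    return defects_found / defects_present

# The slide's examples:
print(total_containment_effectiveness(80, 20))  # 80 in test, 20 by customers
print(step_containment(50, 40))                 # 40 of 50 requirements defects
```

Both examples work out to 0.8, i.e., 80% containment.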
Defect Insertion and Removal Benchmarks
Capers Jones, Applied Software Measurement, 3rd Ed.
Do You Know Your Numbers?
Phase Introduced Defects per Function Point (MIS)
Requirements .75 (.84)
Design 1.50 (1.69)
Code 1.75 (1.97)
Documents .50 ( - )
Bad Fixes .50
Appraisal Method % Removed (MIS)
Unit Test 25%
Function Test 30%
Integration Test 30%
System Test 35%
Acceptance Test 25%
Inspections 60-90%
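A rough sketch of how the two benchmark tables combine: for a hypothetical 1,000-function-point MIS project, insert defects at the benchmark rates, then apply each test stage’s removal percentage in sequence (ignoring, for simplicity, bad fixes re-inserted during testing and any inspections):

```python
function_points = 1_000   # hypothetical project size

inserted_per_fp = {       # defects per function point (MIS values above)
    "requirements": 0.75, "design": 1.50, "code": 1.75,
    "documents": 0.50, "bad fixes": 0.50,
}
removal_rates = [         # % of defects present removed at each step (MIS)
    ("unit test", 0.25), ("function test", 0.30),
    ("integration test", 0.30), ("system test", 0.35),
    ("acceptance test", 0.25),
]

inserted = function_points * sum(inserted_per_fp.values())  # 5,000 defects
remaining = inserted
for step, rate in removal_rates:
    removed = remaining * rate
    remaining -= removed
    print(f"{step:16s} removes {removed:6.0f}; {remaining:6.0f} remain")

tce = 1 - remaining / inserted
print(f"Delivered defects: {remaining:.0f} (TCE = {tce:.0%})")
```

With test-only appraisal at benchmark rates, roughly 900 of the 5,000 inserted defects still reach the customer (TCE around 82%), which is why the following slides argue for earlier, inspection-based removal.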
Leading Indicators Provide CONTROL
Modeling & Managing Software Process Efficiency
and Effectiveness
• In many software groups finding and fixing defects consumes 50-70% of total cost
– Best-practice groups reduce that by at least 50%
• Models allow you to think through the consequences of alternative strategies … quickly, at very low cost
• Models allow you to forecast both quality and financial consequences of alternatives
– Creating a business case in the process
– Creating a basis for “quality adjusted” status evaluation
• Modeling motivates measurement and “management by fact”
Why Modeling?
• Predict (1) delivered quality and (2) total non-value-added effort (cost)
• Predict defect “insertion”
– Focus attention on defects, which account for the largest share of total development cost.
– Enable early monitoring of the relationship between defects likely to be present and those actually found; provide early awareness.
• Estimate the effort needed to execute the volume of appraisal necessary to find the number of defects we forecast to remove.
– A ‘sanity check’ on the planned level of appraisal effort, i.e., is it actually plausible to remove an acceptable volume of defects with the level of effort planned?
• Forecast both “pre-release” (before delivery) and “post-release” (after delivery) NVA effort.
– When delivered quality is poor, post-release defect repair costs can be 50% of the original project budget.
Model Objectives
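The last objective can be illustrated with a toy comparison of pre- vs. post-release find-and-fix effort at two containment levels. The defect count and hours-per-defect figures below are assumptions chosen only to show the shape of the trade-off, not benchmarks:

```python
# Toy comparison: how total find-and-fix effort shifts with containment.
# All figures are hypothetical assumptions for illustration.
total_defects = 5_000
hours_pre, hours_post = 4, 20   # assumed find+fix hours per defect

for tce in (0.80, 0.95):
    pre = total_defects * tce * hours_pre           # found before release
    post = total_defects * (1 - tce) * hours_post   # escaped to customers
    print(f"TCE {tce:.0%}: pre-release {pre:,.0f} h, "
          f"post-release {post:,.0f} h, total {pre + post:,.0f} h")
```

Because each escaped defect is assumed to cost several times more to fix after release, the higher-containment scenario spends more before release yet less in total.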
• Don’t focus on the parameter values I have used
– The thought process is the important part
– Actual values vary considerably from place to place
• Where available I have used industry benchmarks
– All benchmarks conceal large variation
– Where benchmarks are not available I’ve used experience as a guide
– Your local values may well be quite different
• Use models such as these to do “what if” analysis
– “Simulate” a range of assumptions
– Best/worst/most likely values
IMPORTANT Caveats
Scenarios Evaluated
Simulation Results - Containment
I will be happy to provide a copy of the model and related articles: [email protected]
Simulation Results: Non-Value-Added
[Chart distinguishes “find” vs. “fix” effort]
Cost of Quality Revisited
WHEN you invest matters more than how much
• Formal inspections, conducted in accordance with IEEE Std. 1028-2008, are always efficient & effective … better than any form of testing
• Maximum benefits come when applied to requirements, architecture, and design
• YOU can both reduce cost (improve productivity) and deliver better quality
“An apple a day …”
Thank You!
terima kasih (Malay: thank you)
謝謝 (Chinese: thank you)