SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART


Transcript of SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

Page 1: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

SOFTWARE QUALITY IN 2011:
A SURVEY OF THE STATE OF THE ART

Capers Jones, President
Capers Jones & Associates LLC

Copyright © 2011 by Capers Jones. All Rights Reserved.
http://www.spr.com
Capers.Jones3@gmail.com
August 31, 2011

SOURCES OF QUALITY DATA

• Data collected from 1984 through 2011
• About 675 companies (150 clients in Fortune 500 set)
• About 35 government/military groups
• About 13,500 total projects
• New data = about 50-75 projects per month
• Data collected from 24 countries
• Observations during more than 15 lawsuits

Page 2: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

BASIC DEFINITIONS OF SOFTWARE QUALITY

• Functional Software Quality
  Software that combines low defect rates and high levels of user
  satisfaction. The software should also meet all user requirements
  and adhere to international standards.

• Structural Software Quality
  Software that exhibits a robust architecture and can operate in a
  multi-tier environment without failures or degraded performance.
  Software has low cyclomatic complexity levels.

• Aesthetic Software Quality
  Software with elegant and easy-to-use commands and interfaces,
  attractive screens, and well-formatted outputs.

ECONOMIC DEFINITIONS OF SOFTWARE QUALITY

• "Technical debt"
  The assertion (by Ward Cunningham in 1992) that quick and careless
  development with poor quality leads to many years of expensive
  maintenance and enhancements.

• Cost of Quality (COQ)
  The overall costs of prevention, appraisal, internal failures, and
  external failures. For software these mean defect prevention,
  pre-test defect removal, testing, and post-release defect repairs.
  (Consequential damages are usually not counted.)

• Total Cost of Ownership (TCO)
  The sum of development + enhancement + maintenance + support from
  day 1 until the application is retired.
  (Recalculation at 5-year intervals is recommended.)

Page 3: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

SOFTWARE QUALITY HAZARDS IN FIVE INDUSTRIES

INDUSTRY                  HAZARD

Airlines                  Safety hazards
                          Air traffic control problems
                          Flight schedule confusion
                          Navigation equipment failures
                          Maintenance schedules thrown off
                          Delay in opening Denver airport
                          Passengers booked into non-existent seats
                          Passengers misidentified as terror suspects

Finance                   Financial transaction hazards
                          Interest calculations in error
                          Account balances thrown off
                          Credit card charges in error
                          Funds transfer thrown off
                          Mortgage/loan interest payments in error
                          Hacking and identity theft due to software security flaws
                          Denial of service attacks due to software security flaws

Page 4: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

SOFTWARE QUALITY HAZARDS IN FIVE INDUSTRIES (cont.)

INDUSTRY                  HAZARD

Health Care               Safety hazards
                          Patient monitoring devices malfunction
                          Operating room schedules thrown off
                          Medical instruments malfunction
                          Prescription refill problems
                          Hazardous drug interactions
                          Billing problems
                          Medical records stolen or released by accident

State, Local Governments  Local economic hazards
                          School taxes miscalculated
                          Jury records thrown off
                          Real-estate transactions misfiled
                          Divorce, marriage records misfiled
                          Alimony, child support payment records lost
                          Death records filed for wrong people
                          Traffic light synchronization thrown off
                          Errors in property tax assessments

Page 5: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

SOFTWARE QUALITY HAZARDS IN FIVE INDUSTRIES (cont.)

INDUSTRY                  HAZARD

Commercial Software       Schedule delays
                          Cost overruns
                          High recall and maintenance costs
                          Extra and costly venture funding rounds
                          Equity dilution for shareholders
                          Loss of customer confidence
                          Opportunities for fast-followers and new competitors

SOFTWARE QUALITY HAZARDS IN ALL INDUSTRIES

1. Software is blamed for more major business problems than any other
   man-made product.

2. Poor software quality has become one of the most expensive topics in
   human history: > $150 billion per year in the U.S.; > $500 billion
   per year worldwide.

3. Projects cancelled due to poor quality are > 15% more costly than
   successful projects of the same size and type.

4. Software executives, managers, and technical personnel are regarded
   by many CEOs as a painful necessity rather than top professionals.

5. Improving software quality is a key topic for all industries.

Page 6: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

FUNDAMENTAL SOFTWARE QUALITY METRICS

• Defect Potentials
  – Sum of requirements errors, design errors, code errors, document
    errors, bad fix errors, test plan errors, and test case errors

• Defect Detection Efficiency (DDE)
  – Percent of defects detected before release

• Defect Removal Efficiency (DRE)
  – Percent of defects removed before release

• Defect Severity Levels (valid unique defects)
  – Severity 1 = Total stoppage
  – Severity 2 = Major error
  – Severity 3 = Minor error
  – Severity 4 = Cosmetic error

• Standard Cost of Quality
  – Prevention
  – Appraisal
  – Internal failures
  – External failures

FUNDAMENTAL SOFTWARE QUALITY METRICS (cont.)

• Revised Software Cost of Quality
  – Defect Prevention
  – Pre-Test Defect Removal (inspections, static analysis)
  – Testing Defect Removal
  – Post-Release Defect Removal

• Error-Prone Module Effort
  – Identification
  – Removal or redevelopment
  – Repairs and rework

Page 7: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

QUALITY MEASUREMENT PROBLEMS

• Cost per defect penalizes quality!
  (The buggiest software has the lowest cost per defect!)

• Lines of code penalize high-level languages!

• Lines of code ignore non-coding defects!

• Most companies don't measure all defects!

• The most common omissions are requirement bugs, design bugs, and
  bugs found by desk checks and unit testing. Real bugs can outnumber
  measured bugs by more than 5 to 1!

COST PER DEFECT PENALIZES QUALITY

                          Case A          Case B
                          High quality    Low quality

Defects found             50              500
Test case creation        $10,000         $10,000
Test case execution       $10,000         $10,000
Defect repairs            $10,000         $70,000

TOTAL                     $30,000         $90,000

Cost per Defect           $600            $180

$ Cost savings            $60,000         $0.00
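The arithmetic above can be sketched in a few lines. This is an illustrative model (function name and structure are assumptions, not from the slides): test preparation and execution behave like fixed costs, so dividing by a smaller defect count makes the high-quality project look more expensive per defect.

```python
# Why "cost per defect" penalizes quality: fixed test costs dominate,
# so fewer defects found means a higher cost-per-defect figure.

def cost_per_defect(defects_found, repair_cost,
                    test_creation=10_000, test_execution=10_000):
    """Return (total removal cost, cost per defect found)."""
    total = test_creation + test_execution + repair_cost
    return total, total / defects_found

total_a, cpd_a = cost_per_defect(50, 10_000)    # Case A: high quality
total_b, cpd_b = cost_per_defect(500, 70_000)   # Case B: low quality

print(total_a, cpd_a)  # 30000 600.0 -> high quality looks "expensive"
print(total_b, cpd_b)  # 90000 180.0 -> low quality looks "cheap"
```

Case A spends $60,000 less in total, yet its cost per defect is more than three times higher, which is exactly the distortion the slide warns about.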

Page 8: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

A BASIC LAW OF MANUFACTURING ECONOMICS

"If a manufacturing cycle has a high proportion of fixed costs
and there is a decline in the number of units produced,
the cost per unit will go up."

1. As quality improves, the number of defects goes down.

2. Test preparation and test execution act like fixed costs.

3. Therefore the "cost per defect" must go up.

4. Late defects must cost more than early defects.

5. Defects in high-quality software cost more than in bad-quality software.

LINES OF CODE HARM HIGH-LEVEL LANGUAGES

                          Case A      Case B
                          Java        C

KLOC                      50          125
Function points           1,000       1,000
Code defects found        500         1,250
Defects per KLOC          10.00       10.00
Defects per FP            0.50        1.25
Defect repairs            $70,000     $175,000

$ per KLOC                $1,400      $1,400
$ per Defect              $140        $140
$ per Function Point      $70         $175

$ cost savings            $105,000    $0.00
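The same repair data normalized three ways makes the point concrete. A minimal sketch (the `normalize` helper is an assumption for illustration): per-KLOC and per-defect costs are identical for both languages, and only the per-function-point view exposes the high-level language's real saving.

```python
# LOC-based metrics hide the benefit of high-level languages; the
# function point denominator is the same for both, so it reveals it.

cases = {
    "Java": {"kloc": 50,  "fp": 1000, "defects": 500,  "repair": 70_000},
    "C":    {"kloc": 125, "fp": 1000, "defects": 1250, "repair": 175_000},
}

def normalize(c):
    return {
        "per_kloc":   c["repair"] / c["kloc"],     # identical for both
        "per_defect": c["repair"] / c["defects"],  # identical for both
        "per_fp":     c["repair"] / c["fp"],       # shows the difference
    }

for lang, data in cases.items():
    print(lang, normalize(data))
```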

Page 9: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

A BASIC LAW OF MANUFACTURING ECONOMICS

"If a manufacturing cycle has a high proportion of fixed costs
and there is a decline in the number of units produced,
the cost per unit will go up."

1) As language levels go up, the number of lines of code
   produced comes down.

2) The costs of requirements, architecture, design, and
   documentation act as fixed costs.

3) Therefore the "cost per line of code" must go up.

4) Cost per line of code penalizes languages in direct
   proportion to their level.

U.S. AVERAGES FOR SOFTWARE QUALITY

(Data expressed in terms of defects per function point)

                    Defect      Removal        Delivered
Defect Origins      Potential   Efficiency     Defects

Requirements        1.00        77%            0.23
Design              1.25        85%            0.19
Coding              1.75        95%            0.09
Documents           0.60        80%            0.12
Bad Fixes           0.40        70%            0.12

TOTAL               5.00        85%            0.75

(Function points show all defect sources - not just coding defects)
(Code defects = 35% of total defects)

Page 10: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

BEST IN CLASS SOFTWARE QUALITY

(Data expressed in terms of defects per function point)

                    Defect      Removal        Delivered
Defect Origins      Potential   Efficiency     Defects

Requirements        0.40        85%            0.08
Design              0.60        97%            0.02
Coding              1.00        99%            0.01
Documents           0.40        98%            0.01
Bad Fixes           0.10        95%            0.01

TOTAL               2.50        96%            0.13

OBSERVATIONS

(Most often found in systems software > SEI CMM Level 3 or in TSP projects)

POOR SOFTWARE QUALITY - MALPRACTICE

(Data expressed in terms of defects per function point)

                    Defect      Removal        Delivered
Defect Origins      Potential   Efficiency     Defects

Requirements        1.50        50%            0.75
Design              2.20        50%            1.10
Coding              2.50        80%            0.50
Documents           1.00        70%            0.30
Bad Fixes           0.80        50%            0.40

TOTAL               8.00        62%            3.05

OBSERVATIONS

(Most often found in large waterfall projects > 10,000 function points.)

Page 11: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

GOOD QUALITY RESULTS: > 90% SUCCESS RATE

• Formal Inspections (Requirements, Design, and Code)
• Static analysis (for about 25 languages out of 2,500 in all)
• Joint Application Design (JAD)
• Functional quality metrics using function points
• Structural quality metrics such as cyclomatic complexity
• Defect Detection Efficiency (DDE) measurements
• Defect Removal Efficiency (DRE) measurements
• Automated defect tracking tools
• Active Quality Assurance (> 3% SQA staff)
• Utilization of effective methods (Agile, XP, RUP, TSP, etc.)
• Mathematical test case design based on design of experiments
• Quality estimation tools
• Testing specialists (certified)
• Root-Cause Analysis
• Quality Function Deployment (QFD)

MIXED QUALITY RESULTS: < 50% SUCCESS RATE

• CMMI level 3 or higher (some overlap among CMMI levels:
  best CMMI 1 groups better than worst CMMI 3 groups)
• ISO and IEEE quality standards (prevent low quality;
  little benefit for high-quality teams)
• Six-Sigma methods (unless tailored for software projects)
• Independent Verification & Validation (IV & V)
• Quality circles in the United States (more success in Japan)
• Clean-room methods for rapidly changing requirements
• Kaizen (moving from Japan to U.S. and elsewhere)
• Cost of quality without software modifications

Page 12: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

POOR QUALITY RESULTS: < 25% SUCCESS RATE

• Testing as only form of defect removal
• Informal testing and uncertified test personnel
• Testing only by developers; no test specialists
• Passive Quality Assurance (< 3% QA staff)
• Token Quality Assurance (< 1% QA staff)
• LOC metrics for quality (omit non-code defects)
• Cost per defect metric (penalizes quality)
• Failure to estimate quality or risks early
• Quality measurement "leakage" such as unit test bugs

SOFTWARE QUALITY OBSERVATIONS

Quality measurements have found:

• Individual programmers -- less than 50% efficient in finding
  bugs in their own software

• Normal test steps -- often less than 75% efficient
  (1 of 4 bugs remain)

• Design reviews and code inspections -- often more than 65%
  efficient; have topped 90%

• Static analysis -- often more than 65% efficient;
  has topped 95%

• Inspections, static analysis, and testing combined lower
  costs and schedules by > 20% and lower total cost of
  ownership (TCO) by > 45%.

Page 13: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

TOTAL SOFTWARE DEFECTS IN RANK ORDER

Defect Origins                     Defects per Function Point

 1. Data defects                    2.50 *
 2. Code defects                    1.75
 3. Test case defects               1.65 *
 4. Web site defects                1.40 *
 5. Design defects                  1.25 **
 6. Requirement defects             1.00 **
 7. Structural defects              0.70 **
 8. Document defects                0.60 **
 9. Bad-fix defects                 0.40 **
10. Requirement creep defects       0.30 **
11. Security defects                0.25 **
12. Architecture defects            0.20 *

TOTAL DEFECTS                      12.00

*  NOTE 1: Usually not measured due to lack of size metrics
** NOTE 2: Often omitted from defect measurements

ORIGINS OF HIGH-SEVERITY SOFTWARE DEFECTS

Defect Origins                     Percent of Severity 1 and 2 Defects

 1. Design defects                  17.00%
 2. Code defects                    15.00%
 3. Structural defects              13.00%
 4. Data defects                    11.00%
 5. Requirements creep defects      10.00%
 6. Requirements defects             9.00%
 7. Web site defects                 8.00%
 8. Security defects                 7.00%
 9. Bad fix defects                  4.00%
10. Test case defects                2.00%
11. Document defects                 2.00%
12. Architecture defects             2.00%

TOTAL DEFECTS                      100.00%

Severity 1 = total stoppage; Severity 2 = major defects

Page 14: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

ORIGINS OF LOW-SEVERITY SOFTWARE DEFECTS

Defect Origins                     Percent of Severity 3 and 4 Defects

 1. Code defects                    35.00%
 2. Data defects                    20.00%
 3. Web site defects                10.00%
 4. Design defects                   7.00%
 5. Structural defects               6.00%
 6. Requirements defects             4.00%
 7. Requirements creep defects       4.00%
 8. Security defects                 4.00%
 9. Bad fix defects                  4.00%
10. Test case defects                3.00%
11. Document defects                 2.00%
12. Architecture defects             1.00%

TOTAL DEFECTS                      100.00%

Severity 3 = minor defects; Severity 4 = cosmetic defects

ORIGINS OF DUPLICATE DEFECTS

Defect Origins                     Percent of Duplicate Defects
                                   (Many reports of the same bugs)

 1. Code defects                    30.00%
 2. Structural defects              20.00%
 3. Data defects                    20.00%
 4. Web site defects                10.00%
 5. Security defects                 4.00%
 6. Requirements defects             3.00%
 7. Design defects                   3.00%
 8. Bad fix defects                  3.00%
 9. Requirements creep defects       2.00%
10. Test case defects                2.00%
11. Document defects                 2.00%
12. Architecture defects             1.00%

TOTAL DEFECTS                      100.00%

Duplicate = Multiple reports of the same bug (> 10,000 can occur)

Page 15: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

ORIGINS OF INVALID DEFECTS

Defect Origins                     Percent of Invalid Defects
                                   (Defects not caused by the software itself)

 1. Data defects                    25.00%
 2. Structural defects              20.00%
 3. Web site defects                13.00%
 4. User errors                     12.00%
 5. Document defects                10.00%
 6. External software               10.00%
 7. Requirements creep defects       3.00%
 8. Requirements defects             1.00%
 9. Code defects                     1.00%
10. Test case defects                1.00%
11. Security defects                 1.00%
12. Design defects                   1.00%
13. Bad fix defects                  1.00%
14. Architecture defects             1.00%

TOTAL DEFECTS                      100.00%

Invalid = Defects caused by platforms or external software applications

WORK HOURS AND COSTS FOR DEFECT REPAIRS

Defect Origins                     Work Hours    Costs ($75 per hour)

 1. Security defects                10.00         $750.00
 2. Design defects                   8.50         $637.50
 3. Requirements creep defects       8.00         $600.00
 4. Requirements defects             7.50         $562.50
 5. Structural defects               7.25         $543.75
 6. Architecture defects             7.00         $525.00
 7. Data defects                     6.50         $487.50
 8. Bad fix defects                  6.00         $450.00
 9. Web site defects                 5.50         $412.50
10. Invalid defects                  4.75         $356.25
11. Test case defects                4.00         $300.00
12. Code defects                     3.00         $225.00
13. Document defects                 1.75         $131.25
14. Duplicate defects                1.00         $75.00

AVERAGES                             5.77         $432.69

Maximum can be > 10 times greater

Page 16: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

DEFECT DAMAGES AND RECOVERY COSTS

Defect Origins                     Recovery Costs

 1. Security defects               $200,000,000
 2. Design defects                 $175,000,000
 3. Requirements defects           $150,000,000
 4. Data defects                   $125,000,000
 5. Code defects                   $100,000,000
 6. Structural defects              $95,000,000
 7. Requirements creep defects      $90,000,000
 8. Web site defects                $80,000,000
 9. Architecture defects            $80,000,000
10. Bad fix defects                 $60,000,000
11. Test case defects               $50,000,000
12. Document defects                $25,000,000

AVERAGES                           $102,500,000

Defect recovery costs for major applications in large companies
and government agencies

WORK HOURS AND COSTS BY SEVERITY

Defect Severity                    Work Hours    Costs ($75 per hour)

Severity 1 (total stoppage)          6.00         $450.00
Severity 2 (major errors)            9.00         $675.00
Severity 3 (minor errors)            3.00         $225.00
Severity 4 (cosmetic errors)         1.00         $75.00

Abeyant defects (special case)      40.00         $3,000.00
Invalid defects                      4.75         $356.25
Duplicate defects                    1.00         $75.00

Maximum can be > 10 times greater

Page 17: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

DEFECT REPAIRS BY APPLICATION SIZE

Function     Sev 1     Sev 2     Sev 3     Sev 4     AVERAGE
Points       Hours     Hours     Hours     Hours     HOURS

10            2.00      3.00      1.50      0.50      1.75
100           4.00      6.00      2.00      0.50      3.13
1000          6.00      9.00      3.00      1.00      4.75
10000         8.00     12.00      4.00      1.50      6.38
100000       18.00     24.00      6.00      2.00     12.50

Function     Sev 1     Sev 2     Sev 3     Sev 4     AVERAGE
Points       $         $         $         $         COSTS

10           $150      $225      $112      $38       $132
100          $300      $450      $150      $38       $234
1000         $450      $675      $225      $75       $356
10000        $600      $900      $300      $113      $478
100000       $1,350    $1,800    $450      $150      $938

DEFECT REPORTS IN FIRST YEAR OF USAGE

Function
Points          10       100      1000     10,000   100,000
Users

1               55%      27%      12%       3%       1%
10              65%      35%      17%       7%       3%
100             75%      42%      20%      10%       7%
1000            85%      50%      27%      12%      10%
10,000          95%      75%      35%      20%      12%
100,000         99%      87%      45%      35%      20%
1,000,000      100%      96%      77%      45%      32%
10,000,000     100%     100%      90%      65%      45%

Page 18: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

ELAPSED TIME IN DAYS FOR DEFECT RESOLUTION

Removal method    Stat.    Unit    Inspect.    Funct.    Sys.    Maint.
                  Analy.   Test                Test      Test

Preparation       1        2       5           6         8       7
Execution         1        1       2           4         6       3
Repair            1        1       1           1         2       2
Validate          1        1       1           2         4       5
Integrate         1        1       1           2         4       6
Distribute        1        1       1           2         3       7

TOTAL DAYS        6        7       11          17        27      30

Defect repairs take < 12% of elapsed time

SOFTWARE DEFECT SEVERITY CATEGORIES

Severity 1:    TOTAL FAILURE             1% at release
Severity 2:    MAJOR PROBLEMS           20% at release
Severity 3:    MINOR PROBLEMS           35% at release
Severity 4:    COSMETIC ERRORS          44% at release

STRUCTURAL     MULTI-TIER DEFECTS       15% of reports
INVALID        USER OR SYSTEM ERRORS    15% of reports
DUPLICATE      MULTIPLE REPORTS         30% of reports
ABEYANT        CAN'T RECREATE ERROR      5% of reports

Page 19: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

HOW QUALITY AFFECTS SOFTWARE COSTS

[Chart: COST over TIME across Requirements, Design, Coding, Testing, and
Maintenance, contrasting a "Healthy" cost curve with a "Pathological" one
driven by technical debt.]

Poor quality is cheaper until the end of the coding phase.
After that, high quality is cheaper.

U.S. SOFTWARE QUALITY AVERAGES CIRCA 2011

(Defects per Function Point)

                      System     Commercial  Information  Military   Outsource
                      Software   Software    Software     Software   Software

Defect Potentials     6.0        5.0         4.5          7.0        5.2

Defect Removal        94%        90%         73%          96%        92%
Efficiency

Delivered Defects     0.36       0.50        1.22         0.28       0.42

First Year
Discovery Rate        65%        70%         30%          75%        60%

First Year
Reported Defects      0.23       0.35        0.36         0.21       0.25
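The rows of this table chain together arithmetically: delivered defects come from the potential and the removal efficiency, and first-year reported defects come from the delivered defects and the discovery rate. A sketch of that chain using the System Software column (variable names are assumptions):

```python
# delivered = potential * (1 - DRE)
# first-year reported = delivered * first-year discovery rate

potential = 6.0          # defect potential per function point
dre = 0.94               # defect removal efficiency
discovery_rate = 0.65    # fraction of delivered defects found in year 1

delivered = potential * (1 - dre)
first_year_reported = delivered * discovery_rate

print(round(delivered, 2))            # 0.36
print(round(first_year_reported, 2))  # 0.23
```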

Page 20: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

U.S. SOFTWARE QUALITY AVERAGES CIRCA 2011 (cont.)

(Defects per Function Point)

                      Web        Embedded   SEI-CMM 3   SEI-CMM 1   Overall
                      Software   Software   Software    Software    Average

Defect Potentials     4.0        5.5        5.0         5.75        5.1

Defect Removal        72%        95%        95%         83%         86.7%
Efficiency

Delivered Defects     1.12       0.30       0.25        0.90        0.68

First Year
Discovery Rate        95%        90%        60%         35%         64.4%

First Year
Reported Defects      1.06       0.25       0.15        0.34        0.42

SOFTWARE SIZE VS DEFECT REMOVAL EFFICIENCY

(Data expressed in terms of defects per function point)

            Defect      Defect        Delivered    1st Year     1st Year
Size        Potential   Removal       Defects      Discovery    Reported
                        Efficiency                 Rate         Defects

1           1.85        95.00%        0.09         90.00%       0.08
10          2.45        92.00%        0.20         80.00%       0.16
100         3.68        90.00%        0.37         70.00%       0.26
1000        5.00        85.00%        0.75         50.00%       0.38
10000       7.60        78.00%        1.67         40.00%       0.67
100000      9.55        75.00%        2.39         30.00%       0.72

AVERAGE     5.02        85.83%        0.91         60.00%       0.38

Page 21: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

DEFECTS AND SOFTWARE METHODOLOGIES

(Data expressed in terms of defects per function point
for projects nominally 1000 function points in size)

                                  Defect      Removal       Delivered
Software methods                  Potential   Efficiency    Defects

Waterfall                         5.50        80%           1.10
Iterative                         4.75        87%           0.62
Object-Oriented                   4.50        88%           0.54
Rational Unified Process (RUP)    4.25        92%           0.34
Agile                             4.00        90%           0.40
PSP and TSP                       3.50        96%           0.14
85% Certified reuse               1.75        99%           0.02

DEFECTS AND SOFTWARE METHODOLOGIES (cont.)

(Data expressed in terms of defects per function point
for projects nominally 10,000 function points in size)

                                  Defect      Removal       Delivered
Software methods                  Potential   Efficiency    Defects

Waterfall                         7.00        75%           1.75
Iterative                         6.25        82%           1.13
Object-Oriented                   5.75        85%           0.86
Rational Unified Process (RUP)    5.50        90%           0.55
Agile                             5.50        87%           0.72
PSP and TSP                       5.00        94%           0.30
85% Certified reuse               2.25        96%           0.09

Page 22: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

MAJOR SOFTWARE QUALITY ZONES

[Chart: Defects per Function Point (0-10) plotted against Defect Removal
Efficiency (50%-100%), showing three zones: Malpractice (high defect
potentials, low removal efficiency), U.S. Average, and Best in Class
(low potentials, high efficiency).]

INDUSTRY-WIDE DEFECT CAUSES

Ranked in order of effort required to fix the defects:

1. Requirements problems (omissions; changes; errors)
2. Design problems (omissions; changes; errors)
3. Security flaws and vulnerabilities
4. Interface problems between modules
5. Logic, branching, and structural problems
6. Memory allocation problems
7. Testing omissions and poor coverage
8. Test case errors
9. Stress/performance problems
10. Bad fixes/regressions

Page 23: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

OPTIMIZING QUALITY AND PRODUCTIVITY

Projects that achieve 95% cumulative Defect Removal Efficiency will find:

1) Minimum schedules
2) Maximum productivity
3) High levels of user and team satisfaction
4) Low levels of delivered defects
5) Low levels of maintenance costs
6) Low risk of litigation

INDUSTRY DATA ON DEFECT ORIGINS

Because defect removal is such a major cost element, studying defect
origins is a valuable undertaking.

IBM Corporation (MVS)          SPR Corporation (client studies)

 45% Design errors              20% Requirements errors
 25% Coding errors              30% Design errors
 20% Bad fixes                  35% Coding errors
  5% Documentation errors       10% Bad fixes
  5% Administrative errors       5% Documentation errors
100%                           100%

TRW Corporation        Mitre Corporation      Nippon Electric Corp.

 60% Design errors      64% Design errors      60% Design errors
 40% Coding errors      36% Coding errors      40% Coding errors
100%                   100%                   100%

Page 24: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

SOFTWARE QUALITY AND PRODUCTIVITY

• The most effective way of improving software productivity and
  shortening project schedules is to reduce defect levels.

• Defect reduction can occur through:

  1. Defect prevention technologies
     - Structured design and JAD
     - Structured code
     - Use of inspections, static analysis
     - Reuse of certified components

  2. Defect removal technologies
     - Design inspections
     - Code inspections, static analysis
     - Formal testing using mathematical test case design

DEFECT REMOVAL EFFICIENCY EXAMPLE

DEVELOPMENT DEFECTS REMOVED
  Inspections            350
  Static analysis        300
  Testing                250
  Subtotal               900

USER-REPORTED DEFECTS IN FIRST 90 DAYS
  Valid unique defects   100

TOTAL DEFECT VOLUME
  Defect totals          1000

REMOVAL EFFICIENCY
  Dev. (900) / Total (1000) = 90%
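The example above reduces to a one-line formula. A minimal sketch (the function name is an assumption): cumulative DRE is the share of all known defects that were removed before release, with the total conventionally closed out at 90 days of use.

```python
# DRE = defects removed before release / (removed + user-reported in 90 days)

def defect_removal_efficiency(removed_by_stage, user_reported):
    """Cumulative DRE given per-stage removal counts and 90-day field defects."""
    removed = sum(removed_by_stage.values())
    return removed / (removed + user_reported)

dre = defect_removal_efficiency(
    {"inspections": 350, "static analysis": 300, "testing": 250},
    user_reported=100,
)
print(f"{dre:.0%}")  # 90%
```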

Page 25: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

RANGES OF DEFECT REMOVAL EFFICIENCY

                                           Lowest    Median    Highest

 1  Requirements review (informal)         20%       30%       50%
 2  Top-level design reviews (informal)    30%       40%       60%
 3  Detailed functional design inspection  30%       65%       85%
 4  Detailed logic design inspection       35%       65%       75%
 5  Code inspection or static analysis     35%       60%       90%
 6  Unit tests                             10%       25%       50%
 7  New function tests                     20%       35%       65%
 8  Integration tests                      25%       45%       60%
 9  System test                            25%       50%       65%
10  External Beta tests                    15%       40%       75%

CUMULATIVE EFFICIENCY                      75%       98%       99.99%
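The cumulative row can be related to the per-stage rates. Under the simplifying assumption (mine, not the slides') that each stage independently removes its median share of the defects still present, cumulative efficiency is one minus the product of the per-stage escape rates; this naive model gives about 99.8%, a bit above the observed 98% median because real stages find overlapping defect sets.

```python
# Cumulative DRE if each stage were an independent filter:
# 1 - prod(1 - e_i) over the median per-stage efficiencies.
from math import prod

median_efficiency = [0.30, 0.40, 0.65, 0.65, 0.60,   # reviews/inspections
                     0.25, 0.35, 0.45, 0.50, 0.40]   # test stages

cumulative = 1 - prod(1 - e for e in median_efficiency)
print(f"{cumulative:.1%}")  # 99.8% under the independence assumption
```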

NORMAL DEFECT ORIGIN/DISCOVERY GAPS

[Chart: defect origins (Requirements, Design, Coding, Documentation,
Testing, Maintenance) plotted against the phases in which the defects are
discovered; without inspections, discovery lags far behind origin, creating
a "Zone of Chaos."]

Page 26: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

DEFECT ORIGINS/DISCOVERY WITH INSPECTIONS

[Chart: the same origin-versus-discovery plot with inspections in place;
discovery tracks origin much more closely, narrowing the gap.]

DISTRIBUTION OF 1500 SOFTWARE PROJECTS BY
DEFECT REMOVAL EFFICIENCY LEVEL

Defect Removal Efficiency
Level (Percent)               Number of Projects    Percent of Projects

> 99                                6                   0.40%
95 - 99                           104                   6.93%
90 - 95                           263                  17.53%
85 - 90                           559                  37.26%
80 - 85                           408                  27.20%
< 80                              161                  10.73%

Total                           1,500                 100.00%

Page 27: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

CONCLUSIONS ON SOFTWARE QUALITY

• No single quality method is adequate by itself.

• Formal inspections and static analysis are the most efficient.

• Inspections + static analysis + testing are > 97% efficient.

• Defect prevention + removal is the best overall strategy.

• Quality Function Deployment and Six-Sigma prevent defects.

• Higher CMMI levels, TSP, RUP, Agile, and XP are effective.

• Quality excellence has an ROI > $15 for each $1 spent.

• High quality benefits schedules, productivity, and users.

REFERENCES ON SOFTWARE QUALITY

Black, Rex; Managing the Testing Process; Microsoft Press, 1999.

Crosby, Philip B.; Quality is Free; New American Library, Mentor Books, 1979.

Gack, Gary; Managing the Black Hole; Business Expert Publishing, 2009.

Gilb, Tom & Graham, Dorothy; Software Inspections; Addison Wesley, 1983.

Jones, Capers & Bonsignour, Olivier; The Economics of Software Quality;
Addison Wesley, 2011.

Jones, Capers; Software Engineering Best Practices; McGraw Hill, 2010.

Jones, Capers; Applied Software Measurement; McGraw Hill, 2008.

Jones, Capers; Estimating Software Costs; McGraw Hill, 2007.

Jones, Capers; Assessments, Benchmarks, and Best Practices;
Addison Wesley, 2000.

Page 28: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

REFERENCES ON SOFTWARE QUALITY (cont.)

Kan, Steve; Metrics and Models in Software Quality Engineering;
Addison Wesley, 2003.

McConnell, Steve; Code Complete 2; Microsoft Press, 2004.

Pirsig, Robert; Zen and the Art of Motorcycle Maintenance; Bantam, 1984.

Radice, Ron; High-quality, Low-cost Software Inspections;
Paradoxican Publishing, 2002.

Wiegers, Karl; Peer Reviews in Software; Addison Wesley, 2002.

REFERENCES ON SOFTWARE QUALITY (cont.)

www.ASQ.org (American Society for Quality)
www.IFPUG.org (International Function Point Users Group)
www.ISBSG.org (International Software Benchmarking Standards Group)
www.ISO.org (International Organization for Standardization)
www.ITMPI.org (Information Technology Metrics and Productivity Institute)
www.PMI.org (Project Management Institute)
www.processfusion.net (Process Fusion)
www.SEI.CMU.edu (Software Engineering Institute)
www.SPR.com (Software Productivity Research LLC)
www.SSQ.org (Society for Software Quality)

Page 29: SOFTWARE QUALITY IN 2011 : A SURVEY OF THE STATE OF THE ART

REFERENCES ON SOFTWARE QUALITY (cont.)

www.SEMAT.org (Software Engineering Methods and Theory)
www.CISQ.org (Consortium for IT Software Quality)
www.SANS.org (SANS Institute listing of software defects)
www.eoqsg.org (European Institute for Software Quality)
www.galorath.com (Galorath Associates)
www.associationforsoftwaretesting.org (Association for Software Testing)
www.qualityassuranceinstitute.com (Quality Assurance Institute (QAI))