Software Metrics
1
SEA Side Software Engineering Annotations
Annotation11: Software Metrics
A one-hour presentation to inform you of new techniques and practices in software development.
Professor Sara Stoecklin, Director of Software Engineering, Panama City
Florida State University – Computer Science
[email protected]
850-522-2023 Ext. 182
2
Express in Numbers
Measurement provides a mechanism for objective evaluation
3
Software Crisis
• According to American Programmer, 31.1% of computer software projects are canceled before they are completed.
• 52.7% overrun their initial cost estimates by 189%.
• 94% of project start-ups are restarts of previously failed projects.
Solution? A systematic approach to software development and measurement.
4
Software Metrics
• Software metrics refers to a broad range of quantitative measurements for computer software that enable us to:
– improve the software process continuously
– assist in quality control and productivity
– assess the quality of technical products
– assist in tactical decision-making
5
Measure, Metrics, Indicators
• Measure – provides a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process.
• Metric – relates the individual measures in some way.
• Indicator – a combination of metrics that provides insight into the software process, project, or product itself.
6
What Should Be Measured?
[Diagram: process metrics, project metrics, and product metrics all feed into measurement. What do we use as a basis? Size? Function?]
7
Metrics of Process Improvement
• Focus on Manageable Repeatable Process
• Use of Statistical SQA on Process
• Defect Removal Efficiency
8
Statistical Software Process Improvement
All errors and defects are categorized by origin
The cost to correct each error and defect is recorded
No. of errors and defects in each category is counted and ranked in descending order
The overall cost in each category is computed
Resultant data are analyzed and the “culprit” category is uncovered
Plans are developed to eliminate the errors
9
Causes and Origin of Defects
Logic: 20%
Software Interface: 6%
Hardware Interface: 8%
User Interface: 12%
Data Handling: 11%
Error Checking: 11%
Standards: 7%
Specification: 25%
10
Metrics of Project Management
• Budget
• Schedule/Resource Management
• Risk Management
• Project goals met or exceeded
• Customer satisfaction
11
Metrics of the Software Product
• Focus on Deliverable Quality
• Analysis Products
• Design Product Complexity – algorithmic, architectural, data flow
• Code Products
• Production System
12
How Is Quality Measured?
• Analysis Metrics
– Function-based Metrics: Function Points (Albrecht), Feature Points (C. Jones)
– Bang Metric (DeMarco): Functional Primitives, Data Elements, Objects, Relationships, States, Transitions, External Manual Primitives, Input Data Elements, Output Data Elements, Persistent Data Elements, Data Tokens, Relationship Connections
13
Source Lines of Code (SLOC)
• Measures the number of physical lines of active code
• In general, the higher the SLOC in a module, the less understandable and maintainable the module is
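As a minimal sketch, assuming C-style '//' line comments (a real counter would also handle block comments and language-specific syntax), a physical-SLOC counter might look like this in Python:

    def count_sloc(path):
        """Count physical lines of active code: non-blank lines
        that are not pure comment lines ('//' comments assumed)."""
        sloc = 0
        with open(path) as f:
            for line in f:
                stripped = line.strip()
                if stripped and not stripped.startswith("//"):
                    sloc += 1
        return sloc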
14
Function Oriented Metric - Function Points
• Function Points measure “how big” the program is, independent of its actual physical size
• It is a weighted count of several features of the program
• Critics claim FP makes no sense with respect to the representational theory of measurement
• Nevertheless, some firms and institutions take FP very seriously
15
Analyzing the Information Domain

measurement parameter       count    weighting factor            count-total
                                     simple / avg. / complex
number of user inputs       ___   ×  3 / 4 / 6                 = ___
number of user outputs      ___   ×  4 / 5 / 7                 = ___
number of user inquiries    ___   ×  3 / 4 / 6                 = ___
number of files             ___   ×  7 / 10 / 15               = ___
number of ext. interfaces   ___   ×  5 / 7 / 10                = ___

function points = count-total × complexity multiplier

Assuming all inputs have the same weight, all outputs the same weight, and so on, the complete formula for the Unadjusted Function Points is:

UFP = Wi·Inputs + Wo·Outputs + Win·Inquiries + Wif·ExternalInterfaces + We·InternalFiles
16
Taking Complexity into Account
Factors Fi are rated on a scale of 0 (not important) to 5 (very important):

data communications, distributed functions, heavily used configuration, transaction rate, on-line data entry, end user efficiency, on-line update, complex processing, installation ease, operational ease, multiple sites, facilitate change

Formula: FP = UFP × CM, where the complexity multiplier is CM = 0.65 + 0.01 × ΣFi
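As a rough sketch of the whole computation, using the average weights from the information-domain table above and the standard 0.65/0.01 adjustment constants (the example counts below are invented):

    # Average weights from the information-domain table (slide 15).
    WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4,
               "files": 10, "interfaces": 7}

    def unadjusted_fp(counts):
        """UFP = sum over the five domain values of count * weight."""
        return sum(WEIGHTS[k] * counts[k] for k in WEIGHTS)

    def adjusted_fp(ufp, factors):
        """FP = UFP * CM with CM = 0.65 + 0.01 * sum(Fi); each Fi is a
        0-5 rating of one complexity factor (14 in the standard model)."""
        return ufp * (0.65 + 0.01 * sum(factors))

    # Hypothetical system: 40 inputs, 25 outputs, 12 inquiries,
    # 8 files, 4 external interfaces, all 14 factors rated 3.
    counts = {"inputs": 40, "outputs": 25, "inquiries": 12,
              "files": 8, "interfaces": 4}
    ufp = unadjusted_fp(counts)        # 160 + 125 + 48 + 80 + 28 = 441
    print(adjusted_fp(ufp, [3] * 14))  # 441 * (0.65 + 0.42) ≈ 471.9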
17
Typical Function-Oriented Metrics
• errors per FP
• defects per FP
• $ per FP
• pages of documentation per FP
• FP per person-month
18
LOC vs. FP
• The relationship between lines of code and function points depends upon the programming language that is used to implement the software and the quality of the design
• Empirical studies show an approximate relationship between LOC and FP
19
Language                      LOC/FP (average)
Assembly language             320
C                             128
COBOL, FORTRAN                106
C++                           64
Visual Basic                  32
Smalltalk                     22
SQL                           12
Graphical languages (icons)   4
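To read the table: a system sized at 100 FP would take roughly 12,800 LOC in C (100 × 128) but only about 6,400 LOC in C++ (100 × 64), which is why raw LOC productivity comparisons across languages can be misleading.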
20
How Is Quality Measured?
• Design Metrics
– Structural Complexity: fan-in, fan-out, morphology
– System Complexity
– Data Complexity
– Component Metrics: Size, Modularity, Localization, Encapsulation, Information Hiding, Inheritance, Abstraction, Complexity, Coupling, Cohesion, Polymorphism
• Implementation Metrics: Size, Complexity, Efficiency, etc.
21
Comment Percentage (CP)
• Number of commented lines of code divided by the number of non-blank lines of code
• Usually 20% indicates adequate commenting for C or Fortran code
• The higher the CP value, the more maintainable the module is
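A minimal sketch of the CP calculation, assuming C99-style '//' comment markers only (block comments would need extra handling); the file name in the usage line is hypothetical:

    def comment_percentage(lines):
        """CP = commented lines / non-blank lines of code."""
        non_blank = [ln for ln in lines if ln.strip()]
        commented = [ln for ln in non_blank if "//" in ln]
        return len(commented) / len(non_blank) if non_blank else 0.0

    # Usage: aim for roughly 20% in C code.
    print(f"CP = {comment_percentage(open('module.c').readlines()):.0%}")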
22
Size Oriented Metric - Fan In and Fan Out
• The Fan In of a module is the amount of information that “enters” the module
• The Fan Out of a module is the amount of information that “exits” a module
• We assume all pieces of information have the same size
• Fan In and Fan Out can be computed for functions, modules, objects, and also non-code components
• Goal - Low Fan Out for ease of maintenance.
23
Size Oriented Metric - Halstead Software Science
Primitive Measures:
• number of distinct operators
• number of distinct operands
• total number of operator occurrences
• total number of operand occurrences

Used to derive:
• maintenance effort of software
• testing time required for software
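The slide names only the four primitives; a sketch of how derived measures follow from them, using the standard Halstead definitions of volume, difficulty, and effort (the counts in the example are made up):

    import math

    def halstead(n1, n2, N1, N2):
        """n1/n2 = distinct operators/operands,
        N1/N2 = total operator/operand occurrences."""
        vocabulary = n1 + n2
        length = N1 + N2
        volume = length * math.log2(vocabulary)   # program "size"
        difficulty = (n1 / 2) * (N2 / n2)         # error proneness
        effort = difficulty * volume              # proxy for effort
        return volume, difficulty, effort

    # Made-up counts for a small function:
    print(halstead(n1=10, n2=7, N1=22, N2=15))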
24
Flow Graph

[Diagram: predicate node a branches to X() and Y(), which rejoin below]

if (a) {
    X();
} else {
    Y();
}

• V(G) = E - N + 2
• where E = number of edges
• and N = number of nodes
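Worked through for the fragment above: the flow graph has 4 nodes (the predicate a, X, Y, and the join point below them) and 4 edges, so V(G) = 4 - 4 + 2 = 2. Equivalently, one predicate node plus one gives 2.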
25
McCabe's Metric

• The smaller the V(G), the simpler the module.
• Modules with V(G) greater than 10 become hard to manage.
• A high cyclomatic complexity indicates that the code may be of low quality and difficult to test and maintain
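For structured code, V(G) also equals the number of decision points plus one, so a crude estimator can simply count branching constructs. A sketch only; a real tool builds the actual flow graph:

    import re

    # Branching constructs in C-like source; '&&' and '||' add
    # extra decision points, 'else' does not.
    DECISIONS = re.compile(r"\b(?:if|for|while|case)\b|&&|\|\|")

    def cyclomatic_estimate(source):
        return len(DECISIONS.findall(source)) + 1

    print(cyclomatic_estimate("if (a) { X(); } else { Y(); }"))  # -> 2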
26
Chidamber and Kemerer Metrics
• Weighted methods per class (WMC)
• Depth of inheritance tree (DIT)
• Number of children (NOC)
• Coupling between object classes (CBO)
• Response for class (RFC)
• Lack of cohesion metric (LCOM)
27
Weighted methods per class (WMC)
• ci is the complexity of each method Mi of the class
– Often, only public methods are considered
• Complexity may be the McCabe complexity of the method
• Smaller values are better
• Perhaps the average complexity per method is a better metric?
WMC = Σ ci, summed over the n methods Mi of the class
The number of methods and complexity of methods involved is a direct predictor of how much time and effort is required to develop and maintain the class.
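A small sketch of the computation; the class and its per-method complexities are invented, and in practice each ci would come from a complexity tool such as a cyclomatic-complexity analyzer:

    # Hypothetical per-method cyclomatic complexities for one class.
    method_complexity = {"open": 2, "read": 4, "write": 6, "close": 1}

    wmc = sum(method_complexity.values())   # WMC = sum of ci = 13
    avg = wmc / len(method_complexity)      # average per method = 3.25
    print(wmc, avg)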
28
Depth of inheritance tree (DIT)
• For the system under examination, consider the hierarchy of classes
• DIT is the length of the maximum path from the node to the root of the tree
• Relates to the scope of the properties
– How many ancestor classes can potentially affect a class
• Smaller values are better
29
Number of children (NOC)
• For any class in the inheritance tree, NOC is the number of immediate children of the class– The number of direct subclasses
• How would you interpret this number?
• A moderate value indicates scope for reuse and high values may indicate an inappropriate abstraction in the design
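Both metrics can be read directly off a class hierarchy. A Python-flavored sketch, treating object as the root of the tree; the example classes are invented:

    def dit(cls):
        """Depth of inheritance tree: edges on the longest path
        from cls up to the root (object, not counted itself)."""
        if cls is object:
            return 0
        return 1 + max(dit(base) for base in cls.__bases__)

    def noc(cls):
        """Number of children: immediate subclasses only."""
        return len(cls.__subclasses__())

    class Shape: pass
    class Polygon(Shape): pass
    class Triangle(Polygon): pass
    class Circle(Shape): pass

    print(dit(Triangle))  # 3: Triangle -> Polygon -> Shape -> object
    print(noc(Shape))     # 2: Polygon and Circle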
30
Coupling between object classes (CBO)
• For a class, C, the CBO metric is the number of other classes to which the class is coupled
• A class, X, is coupled to class C if
– X operates on (affects) C, or
– C operates on X
• Excessive coupling indicates weakness of class encapsulation and may inhibit reuse
• High coupling also indicates that more faults may be introduced due to inter-class activities
31
Response for class (RFC)
• Mci = the number of methods called in response to a message that invokes method Mi
– Fully nested set of calls
• Smaller numbers are better
– Larger numbers indicate increased complexity and debugging difficulties

RFC = Σ Mci, summed over the n methods of the class
If a large number of methods can be invoked in response to a message, the testing and debugging of the class becomes more complicated
32
Lack of cohesion metric (LCOM)
• Number of methods in a class that reference a specific instance variable
• A measure of the “tightness” of the code
• If a method references many instance variables, then it is more complex, and less cohesive
• The larger the number of similar methods in a class, the more cohesive the class is
• Cohesiveness of methods within a class is desirable, since it promotes encapsulation
33
Testing Metrics
• Metrics that predict the likely number of tests required during various testing phases
• Metrics that focus on test coverage for a given component
34
Views on SE Measurement

[Slides 34-36: illustrations of different views on software engineering measurement; images not preserved in this transcript]
37
12 Steps to Useful Software Metrics
Step 1 - Identify Metrics Customers
Step 2 - Target Goals
Step 3 - Ask Questions
Step 4 - Select Metrics
Step 5 - Standardize Definitions
Step 6 - Choose a Model
Step 7 - Establish Counting Criteria
Step 8 - Decide On Decision Criteria
Step 9 - Define Reporting Mechanisms
Step 10 - Determine Additional Qualifiers
Step 11 - Collect Data
Step 12 - Consider Human Factors
38
Step 1 - Identify Metrics Customers
Who needs the information?
Who’s going to use the metrics?
If the metric does not have a customer -- do not use it.
39
Step 2 - Target Goals
Organizational goals
– Be the low cost provider
– Meet projected revenue targets
Project goals
– Deliver the product by June 1st
– Finish the project within budget
Task goals (entry & exit criteria)
– Effectively inspect software module ABC
– Obtain 100% statement coverage during testing
40
Step 3 - Ask Questions
Goal: Maintain a high level of customer satisfaction
• What is our current level of customer satisfaction?
• What attributes of our products and services are most important to our customers?
• How do we compare with our competition?
41
Step 4 - Select Metrics

Select metrics that provide information to help answer the questions
• Be practical, realistic, pragmatic
• Consider current engineering environment
• Start with the possible
Metrics don’t solve problems -- people solve problems
Metrics provide information so people can make better decisions
42
Selecting Metrics
Goal: Ensure all known defects are corrected before shipment
43
Metrics Objective Statement Template
To [understand | evaluate | control | predict] the [attribute] of the [entity] in order to [goal(s)]

Example - Metric: % defects corrected
To evaluate the % of defects found & corrected during testing in order to ensure all known defects are corrected before shipment
44
Step 5 - Standardize Definitions
[Illustration: a developer and a user interpreting the same term differently]
45
Step 6 - Choose a Model

Measurement models for code inspection metrics:

• Primitive Measurements:
– Lines of Code Inspected = loc
– Hours Spent Preparing = prep_hrs
– Hours Spent Inspecting = in_hrs
– Discovered Defects = defects
• Other Measurements:
– Preparation Rate = loc / prep_hrs
– Inspection Rate = loc / in_hrs
– Defect Detection Rate = defects / (prep_hrs + in_hrs)
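Plugging invented numbers for one inspection into these models shows how the derived measurements fall out of the primitives:

    # Example primitives for a single code inspection (made-up values).
    loc, prep_hrs, in_hrs, defects = 350, 2.5, 2.0, 9

    preparation_rate = loc / prep_hrs                # 140 LOC/hour
    inspection_rate = loc / in_hrs                   # 175 LOC/hour
    detection_rate = defects / (prep_hrs + in_hrs)   # 2.0 defects/hour
    print(preparation_rate, inspection_rate, detection_rate)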
46
Step 7 - Establish Counting Criteria
Lines of Code
• Variations in counting
• No industry accepted standard
• SEI guideline - check sheets for criteria
• Advice: use a tool
47
Counting Criteria - Effort
What is a Software Project?
• When does it start / stop?
• What activities does it include?
• Who works on it?
48
Step 8 - Decide On Decision Criteria
Establish Baselines
• Current value
– Problem report backlog
– Defect prone modules
• Statistical analysis (mean & distribution)
– Defect density
– Fix response time
– Cycle time
– Variance from budget (e.g., cost, schedule)
49
Step 9 - Define Reporting Mechanisms

          Open   Fixed   Resolved
Jan-97     23     13        3
Feb-97     27     24       11
Mar-97     18     26       15
Apr-97     12     18       27

[Sample report formats built from such data: monthly trend lines and quarterly bar charts]
50
Step 10 - Determine Additional Qualifiers
A good metric is a generic metric
Additional qualifiers:
• Provide demographic information
• Allow detailed analysis at multiple levels
• Define additional data requirements
51
Additional Qualifier Example
Metric: software defect arrival rate
• Release / product / product line
• Module / program / subsystem
• Reporting customer / customer group
• Root cause
• Phase found / phase introduced
• Severity
52
Step 11 - Collect Data

What data to collect?
• Metric primitives
• Additional qualifiers
Who should collect the data?
• The data owner
– Direct access to source of data
– Responsible for generating data
– Owners more likely to detect anomalies
– Eliminates double data entry
53
Examples of Data Ownership

Owner                                   Examples of Data Owned
Management                              Schedule; Budget
Engineers                               Time spent per task; Inspection data including defects found; Root cause of defects
Testers                                 Test cases planned / executed / passed; Problems; Test coverage
Configuration management specialists    Lines of code; Modules changed
Users                                   Problems; Operation hours
54
Step 12 – Consider Human Factors
The People Side of the Metrics Equation
• How measures affect people
• How people affect measures
“Don’t underestimate the intelligence of your engineers. For any one metric you can come up with, they will find at least two ways to beat it.” [unknown]
55
Don’t
Measure individuals
Use metrics as a “stick”
Ignore the data
Use only one metric
[Graphic: the cost / quality / schedule triangle; no single metric covers all three]
56
Do

Select metrics based on goals [Basili-88]

[Diagram: each goal spawns questions, and each question is answered by one or more metrics (Goal 1, Goal 2 → Question 1-4 → Metric 1-5)]

Focus on processes, products & services

Provide feedback to the data providers

Obtain “buy-in”
57
References

• Chidamber, S. R. & Kemerer, C. F., “A Metrics Suite for Object Oriented Design”, IEEE Transactions on Software Engineering, Vol. 20, No. 6, June 1994.
• Hitz, M. & Montazeri, B., “Chidamber and Kemerer's Metrics Suite: A Measurement Theory Perspective”, IEEE Transactions on Software Engineering, Vol. 22, No. 4, April 1996.
• Lacovara, R. C. & Stark, G. E., “A Short Guide to Complexity Analysis, Interpretation and Application”, May 17, 1994. http://members.aol.com/GEShome/complexity/Comp.html
• Tang, M., Kao, M. & Chen, M., “An Empirical Study on Object-Oriented Metrics”, IEEE Transactions on Software Engineering, 0-7695-0403-5, 1999.
• Tegarden, D., Sheetz, S. & Monarchi, D., “Effectiveness of Traditional Software Metrics for Object-Oriented Systems”, Proceedings of the 25th Hawaii International Conference on System Sciences, January 1992, pp. 359-368.
• “Principal Components of Orthogonal Object-Oriented Metrics”, http://satc.gsfc.nasa.gov/support/OSMASAS_SEP01/Principal_Components_of_Orthogonal_Object_Oriented_Metrics.htm
• Behforooz, A. & Hudson, F., Software Engineering Fundamentals, Oxford University Press, 1996. Chapter 18: Software Quality and Quality Assurance.
• Pressman, R., Software Engineering: A Practitioner's Approach, McGraw-Hill, 1997.
• IEEE Standard for a Software Quality Metrics Methodology (IEEE Std 1061).
• Henderson-Sellers, B., Object-Oriented Metrics, Prentice-Hall, 1996.