Advanced Software Engineering Lecture 4: Process & Project Metrics.
Advanced Software Engineering
Lecture 4: Process & Project Metrics
Today’s Topics
• Measurements, measures, metrics & indicators
• Size-oriented metrics (LOC)
• Function-oriented metrics (FP)
• Measuring quality
Measurement
Measurement is:
• “A prerequisite for all science” (Lord Kelvin)
• A hallmark of all engineering disciplines
• The foundation for a quantitative, scientific approach to software
Why Do We Measure?
To characterize: “How can we characterize the robustness of our system?”
To evaluate: “How can we demonstrate that we meet our robustness goals?”
Why Do We Measure? [2]
To predict: “How will the software behave if the number of concurrent transactions is doubled?”
To improve: “What is the gap between current and desired performance?”
[From Park, Goethert & Florac, 1996]
Software Metrics
Information from measurement can be used for:
• Continuous improvement of a process
• Estimation
• Quality control
• Productivity assessment
Reflective Practice
“Those who ignore the past are doomed to repeat it...”
• Flaws in a product or process will be a source of continual productivity loss
• Good engineers rarely make the same mistake twice
• Measurement is a key enabler for CMM process evolution
Measurements, Measures, and Indicators
measurement = action taken: “Count the avg. defects / 1K LOC.”
measure = a measurement result: “1.2 defects / 1K LOC”
indicator = comparison (trend): “v1.1 has 0.5 fewer defects / 1K LOC”
Process Indicators
• Assess the paradigm
• Assess ongoing SE tasks
• Assess work products
• Evaluate milestones
• Adapt / change the life-cycle model (or individual process components)
“From project to project, what works and what doesn’t?”
Project Indicators
• Track task status
• Track potential risk factors
• Evaluate quality control efforts
Impact:
• Uncover problem areas early
• Adjust workflow / tasks / resources
“How can I assess the health of my current project?”
Outcome Measures
• Errors noted before delivery
• Errors noted after delivery
• Work products delivered
• Human effort expended
• Calendar time elapsed
• Schedule conformance
• Effort on “umbrella” activities
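The first two outcome measures, errors found before delivery versus defects found after, are commonly combined into a single ratio called defect removal efficiency (DRE = E / (E + D)). A minimal sketch in Python; the function name and example counts are mine:

```python
def defect_removal_efficiency(errors_before: int, defects_after: int) -> float:
    """DRE = E / (E + D), where E = errors found before delivery and
    D = defects found after delivery. A DRE of 1.0 means every problem
    was caught before the product shipped."""
    total = errors_before + defects_after
    if total == 0:
        return 1.0  # nothing found anywhere: vacuously perfect
    return errors_before / total

# e.g. 90 errors caught in review and test, 10 defects reported in the field:
print(defect_removal_efficiency(90, 10))  # 0.9
```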
Private vs. Public Measurement
• Per individual & per module
• Sensitive data: internal to the project
• “Where can we improve?” vs. “Who’s doing poorly?”
• PSP (Personal Software Process) [Humphrey ‘95]: “Software improvement must begin at the individual level”
Statistical Software Process Improvement (SSPI)
• Categorize errors by origin (specification, logic, etc.)
• Estimate cost to correct
• Sort according to frequency
• Estimate cost of each error type
• Find highest-cost problems
• Prioritize debugging efforts
[From SEPA 5/e]
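The SSPI steps above amount to a simple Pareto-style ranking: multiply each error category's frequency by its correction cost, then sort. A small sketch in Python; the categories, counts, and costs below are illustrative, not from the lecture:

```python
# SSPI sketch: rank error categories by total correction cost.
# All frequencies and unit costs here are made-up illustration data.
error_log = [
    {"origin": "specification", "count": 25, "cost_to_correct": 500.0},
    {"origin": "logic",         "count": 40, "cost_to_correct": 120.0},
    {"origin": "data handling", "count": 10, "cost_to_correct": 300.0},
]

# Estimate the total cost of each error type.
for e in error_log:
    e["total_cost"] = e["count"] * e["cost_to_correct"]

# Highest-cost problem classes first: these get debugging priority.
by_cost = sorted(error_log, key=lambda e: e["total_cost"], reverse=True)
for e in by_cost:
    print(f'{e["origin"]:14s} {e["count"]:3d} errors  ${e["total_cost"]:,.2f}')
```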
Defects and Their Origin
Error Analysis
A “fishbone” diagram is used to analyze the causes of defects. The analysis can be used to derive indicators for future improvements.
[Grady ‘92]
[From SEPA 5/e]
Fishbone Diagram
Direct Measures
• Cost
• Effort applied
• LOC (lines of code)
• Execution speed
• Memory size
• Defects reported
Indirect Measures
• Functionality
• Quality
• Complexity
• Efficiency
• Reliability
• Maintainability
Size-Oriented Metrics
• Errors per KLOC
• Defects per KLOC
• $ per KLOC
• Documentation pages per KLOC
• Errors per person-month
• LOC per person-month
• $ per documentation page
Size-Oriented Metrics
[From SEPA 5/e]
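The per-KLOC metrics listed above are simple normalizations: divide a raw project count by the size in thousands of lines of code. A sketch in Python; the project numbers are invented for illustration:

```python
# Size-oriented metrics are ratios normalized by KLOC (thousands of
# lines of code). All project numbers below are made up.
loc = 12_100            # lines of code delivered
errors = 134            # errors found before delivery
defects = 29            # defects found after delivery
cost_dollars = 168_000  # total project cost
doc_pages = 365         # pages of documentation produced

kloc = loc / 1000
print(f"errors per KLOC:  {errors / kloc:.2f}")
print(f"defects per KLOC: {defects / kloc:.2f}")
print(f"$ per KLOC:       {cost_dollars / kloc:,.2f}")
print(f"pages per KLOC:   {doc_pages / kloc:.2f}")
```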
Is LOC a Good Measure?
Lines of code are easily counted, but…
• LOC is not necessarily related to quality
• Programming languages differ widely in LOC per functional requirement
• LOC is difficult to estimate in advance
• There is more to SE than writing code!
What LOC Can’t Measure...
• People factors (team size, skill)
• Problem factors (complexity, change)
• Process factors (techniques, tools)
• Product factors (reliability, performance)
• Resource factors (people, tools)
Function-Oriented Metrics
The function points (FP) metric analyzes the information domain and SW complexity:
• Number of inputs & outputs
• Number of user queries / commands
• Number of files
• Number of external interfaces
[Albrecht, 1979]
“Measure functionality, not LOC”
Weighting Factors
[From SEPA 5/e]
Set of 14 questions, each answered on a scale from 0 to 5; e.g.:
• “Does the system require reliable backup and recovery?”
• “Are there distributed processing functions?”
(See the full list, page 91, SEPA 5/e)
Complexity Adjustment Values
Computing Function Points:
FP = count total × (0.65 + 0.01 × CAV)
where:
• count total = the weighting-factor count total (sum of the weighted information-domain counts)
• CAV = the complexity adjustment value total (sum of the 14 ratings)
• 0.65 and 0.01 = empirically determined constants
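The formula can be computed directly once the domain counts are weighted and the 14 adjustment questions are answered. A sketch in Python; the counts and ratings are invented, and the weights shown are the commonly published "average"-complexity weights (real counting rates each item simple / average / complex):

```python
# Function-point sketch. Weights are the commonly published "average"
# complexity weights; counts and ratings below are illustrative only.
counts  = {"inputs": 24, "outputs": 16, "inquiries": 22,
           "files": 4, "interfaces": 2}
weights = {"inputs": 4, "outputs": 5, "inquiries": 4,
           "files": 10, "interfaces": 7}

# Weighting-factor count total: sum of weighted information-domain counts.
count_total = sum(counts[k] * weights[k] for k in counts)

# Complexity adjustment value: 14 answers, each rated 0..5.
cav = sum([3, 2, 4, 0, 1, 5, 3, 2, 2, 4, 1, 0, 3, 2])

fp = count_total * (0.65 + 0.01 * cav)
print(round(fp, 2))
```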
An abstract, relative measure (not concrete or absolute!)
PRO: Useful way to compare the estimated effort on two different systems, or for projects over time
CON: Must be tuned for each organization and domain; e.g.:
• feature points [Jones, 1991]: algorithmic complexity
• 3D function points [Whitmire, 1995]: data, function, control
Measures with Function Points
• Errors per FP
• Defects per FP
• $ per FP
• Pages of documentation per FP
• FP per person-month
Lines of Code Per Function Point
[from SEPA 5/e]
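The table behind this slide gives the average number of lines of code needed to implement one function point in various languages, which lets an early FP estimate be converted into a rough LOC estimate. A sketch in Python; the factors below are indicative averages often quoted in the literature, not the lecture's exact table:

```python
# Back-of-envelope LOC estimate from a function-point count.
# LOC-per-FP factors are indicative published averages; treat them
# as rough planning numbers, not exact values.
loc_per_fp = {"assembly": 320, "C": 128, "C++": 64, "Smalltalk": 22}

fp_estimate = 310  # illustrative FP count for a planned system
for lang, factor in loc_per_fp.items():
    print(f"{lang:10s} ~{fp_estimate * factor:,} LOC")
```

Note how the same functionality costs roughly 15× more lines in assembly than in Smalltalk, which is exactly why raw LOC comparisons across languages mislead.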
Metrics for Software Quality
Software is only as good as the quality of...
• the requirements description
• the design of the solution
• the code / program produced
• the tests used to find errors
QA = life-cycle task, not just a “finishing” activity
QA must address process, too!
Measuring Quality
Correctness
• defects per KLOC?
• errors per inputs processed?
Maintainability
• mean time to change (MTTC)?
(Note: changes to software vary widely in scope and complexity!)
Integrity
• assess threats to, and security of, the software
Measuring Quality (2)
Usability
• physical / intellectual skill required?
• slope of the learning curve?
• net increase in productivity?
• subjective assessment from users?
[From SEPA 5/e]
Integrating Metrics in the Process
Summary
• Measurements, measures, metrics & indicators
• Size-oriented metrics (LOC)
• Function-oriented metrics (FP)
• Measuring quality
• Individual methods must be adjusted and tuned for a given software domain!