
(1)

Introduction to Software Review

Philip Johnson
Collaborative Software Development Laboratory
Information and Computer Sciences
University of Hawaii


(2)

Objectives

• Understand the motivation for technical review.
• Become acquainted with "best practices" for "industrial strength" formal technical review.
• Begin the journey toward "optimal" review in your own practice.


(3)

Optimal Review

For me, an "optimal" review will:
• find all important defects
• give all reviewers deep insight into the code
• enable the author to improve the code
• all with the least possible effort

In other words:
• optimal quality improvement
• optimal knowledge acquisition
• with the lowest possible cost

Normally, though, it's "pick any two."


(4)

Families of Review Methods

Method Family: Walkthroughs
• Typical Goals: minimal overhead; developer training; quick turnaround.
• Typical Attributes: little/no preparation; no formal process; no measurement.

Method Family: Technical Reviews
• Typical Goals: defect discovery; ambiguity resolution; training.
• Typical Attributes: some formal process; multiple stages; wide range of discussion.

Method Family: Inspections
• Typical Goals: detect and remove all defects efficiently and effectively.
• Typical Attributes: very formal process; measurement; verification.


(5)

Methods vs. Optimality

Walkthroughs:
• Cost: very low
• Knowledge transfer: undependable
• Quality improvement: undependable

Technical reviews:
• Cost: moderate
• Knowledge transfer: better
• Quality improvement: at least some

Inspection:
• Cost: high (and not necessarily "least possible")
• Knowledge transfer: almost guaranteed
• Quality improvement: almost guaranteed


(6)

What reviews do that testing doesn't

1. Reviews improve schedule predictability.

2. Reviews reduce rework.
• Rework accounts for 44% of development cost!
• Requirements (1%), Design (12%), Coding (12%), Testing (19%)

3. Reviews are pro-active tests.
• They find errors that testing cannot reveal, such as errors of omission (e.g., an unimplemented requirement).

4. Reviews are training.
• Domain, corporate standards, group.

[Figure: two development timelines. Without reviews: Req → Design → Code → Test. With reviews: a review (R) follows each of Req, Design, and Code before Test.]


(7)

Industry experience

Aetna Insurance Company:
• FTR found 82% of errors, 25% cost reduction.

Bell-Northern Research:
• Inspection cost: 1 hour per defect.
• Testing cost: 2-4 hours per defect.
• Post-release cost: 33 hours per defect.

Hewlett-Packard:
• Estimated inspection savings (1993): $21,454,000.

IBM (using Cleanroom):
• C system software.
• No errors from time of first compile.


(8)

There are many different kinds of review

[Figure: a taxonomy of formal technical review (FTR) methods, organized along two dimensions: development method (non-Cleanroom vs. Cleanroom) and support (manual vs. tool-based). Manual, non-Cleanroom methods: Walkthrough (Yourdon89), Code Reading (McConnell93), Code Inspection (Fagan76), Inspection (Gilb93), 2-Person Inspection (Bisant89), N-Fold Inspection (Martin90), Active Design Reviews (Parnas85), Phased Inspection (Knight93), Software Review (Humphrey90). Cleanroom: Verification-based Inspection (Dyer92). Tool-based: ICICLE (Brothers90), Scrutiny (Gintell93), CAIS (Mashayekhi94), FTArm (Johnson94), TekInspect.]


(9)

Inspection: the most formal review

• Planning: choose team, materials, dates.
• Orientation: present product, process, goals.
• Preparation: check product, note issues.
• Review Meeting: consolidate issues.
• Rework: correct defects.
• Verify: verify product/process quality.
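To make the flow concrete, here is a minimal sketch (an illustration, not part of the slides or any standard tool) that models the six phases and their objectives as an ordered structure; all names in it are hypothetical.

```python
# A minimal sketch (not from the slides): the six inspection phases in
# order, each paired with the objective the slides assign to it.
INSPECTION_PHASES = [
    ("Planning",       "Choose team, materials, dates."),
    ("Orientation",    "Present product, process, goals."),
    ("Preparation",    "Check product, note issues."),
    ("Review Meeting", "Consolidate issues."),
    ("Rework",         "Correct defects."),
    ("Verify",         "Verify product/process quality."),
]

def next_phase(current):
    """Return the phase that follows `current`, or None after Verify."""
    names = [name for name, _ in INSPECTION_PHASES]
    i = names.index(current)
    return names[i + 1] if i + 1 < len(names) else None

if __name__ == "__main__":
    for name, objective in INSPECTION_PHASES:
        print(f"{name}: {objective}")
    assert next_phase("Rework") == "Verify"
```

The point of the ordering is that each phase's output feeds the next; the per-phase slides that follow expand each objective.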


(10)

Planning

Objectives:
• Gather review package: work product, checklists, references, and data sheets.
• Form inspection team.
• Determine dates for meetings.



(11)

Orientation

Objectives:
• Author provides overview.
• Reviewers obtain review package.
• Preparation goals established.
• Reviewers commit to participate.



(12)

Preparation

Objectives:
• Find the maximum number of non-minor issues.



(13)

Example Issue Classification

Critical:
• Defects that may cause the system to hang, crash, produce incorrect results or behavior, or corrupt user data. No known work-arounds.

Severe:
• Defects that cause incorrect results or behavior with known work-arounds. Large and/or important areas of the system are affected.

Moderate:
• Defects that affect limited areas of functionality and can either be worked around or ignored.

Minor:
• Defects that can be overlooked with no loss of functionality.
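As a sketch of how this classification might be encoded on an issue data sheet (the four levels are from the slide; the class and helper names are illustrative, not from any standard tool):

```python
from enum import IntEnum

# Hypothetical encoding of the slide's issue classification.
class Severity(IntEnum):
    MINOR = 1     # can be overlooked with no loss of functionality
    MODERATE = 2  # limited functionality affected; work around or ignore
    SEVERE = 3    # incorrect results/behavior, but work-arounds exist
    CRITICAL = 4  # hang, crash, wrong results, or corrupted data; no work-around

def is_non_minor(severity):
    """Preparation and the review meeting focus on non-minor issues."""
    return severity > Severity.MINOR

print(is_non_minor(Severity.MODERATE))  # True
```

Ordering the levels numerically makes the "non-minor" filter used in the Preparation and Review Meeting phases a simple comparison.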


(14)

Review Meeting

Objectives:
• Create consolidated, comprehensive listing of non-minor issues.
• Provide opportunity for group synergy.
• Improve reviewing skill by observing others.
• Create shared knowledge of work product.
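A minimal sketch of what "consolidating" can mean mechanically (an illustration under assumed field names, not a prescribed procedure): merge the per-reviewer issue lists and drop duplicate reports against the same location.

```python
# Illustrative sketch: merge issue lists from several reviewers into one
# consolidated list, treating issues at the same location with the same
# summary as duplicates. The dictionary field names are hypothetical.
def consolidate(issue_lists):
    seen = set()
    consolidated = []
    for issues in issue_lists:
        for issue in issues:
            key = (issue["location"], issue["summary"].strip().lower())
            if key not in seen:
                seen.add(key)
                consolidated.append(issue)
    return consolidated

alice = [{"location": "Stack.java:42", "summary": "Off-by-one in pop()"}]
bob   = [{"location": "Stack.java:42", "summary": "off-by-one in pop()"},
         {"location": "Stack.java:10", "summary": "Missing null check"}]
print(len(consolidate([alice, bob])))  # 2: the duplicate report is merged
```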



(15)

Rework

Objectives:
• Assess each issue, determine if it is a defect, and remove it if necessary.
• Produce a written disposition of each non-minor issue.
• Resolve minor issues as necessary.



(16)

Verify

Objectives:
• Assess the (reworked) work product quality.
• Assess the inspection process.
• Pass or fail the work product.



(17)

ICS Software Engineering

Technical Reviews


(18)

Goals for our Technical Reviews

1. Learn how to obtain useful feedback from classmates about your software system.

2. Learn how to critically read and evaluate code written by another developer.

3. Learn the difference between low-impact, "waste of time" reviews and high-impact, "would have never noticed this myself" reviews.

4. Learn that reviewing other code can be an effective way to improve your own coding skills.


(19)

Initial approach: Checklist-based technical review

Use a checklist to:
• focus reviewer attention.
• specify the concerns of the review.

Write down reviewer comments to:
• provide a clear record of reviewer feedback.
• assess coverage of the review.

Go over reviewer comments with the author to:
• clarify the meaning of comments.
• assess validity.
• provide an opportunity for new issues to arise.
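To make this concrete, here is a hedged sketch of how a checklist and its comment record might be represented; the checklist items and all names are examples only, not the course's actual checklist.

```python
# Illustrative sketch of a checklist-based review data sheet. A real
# review would use the checklist distributed with the review package.
CHECKLIST = [
    "Does every public method have documentation?",
    "Are error conditions handled, not silently ignored?",
    "Do names follow the project's coding standard?",
]

def record_comment(comments, item_index, location, text):
    """Append a reviewer comment tied to a checklist item and location."""
    comments.append({
        "checklist_item": CHECKLIST[item_index],
        "location": location,
        "comment": text,
    })

def coverage(comments):
    """Fraction of checklist items that received at least one comment."""
    covered = {c["checklist_item"] for c in comments}
    return len(covered) / len(CHECKLIST)

comments = []
record_comment(comments, 0, "Queue.java:17", "enqueue() has no doc comment.")
print(f"Coverage: {coverage(comments):.0%}")  # Coverage: 33%
```

Tying each comment to a checklist item is what lets the author and reviewer assess coverage of the review, the second objective above.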
