COOUG Presentation © 2002 Houman Younessi
Slide number 1
Defect Management of Object-oriented Software
A presentation at the May 2002 meeting of the
Connecticut Object-oriented Users Group
Presented by:
Houman Younessi, PhD
Professor of Computer Science and Software Engineering
Rensselaer Polytechnic Institute, Hartford Graduate Campus
Email: [email protected]
Slide number 2
The main objective of Software Engineering is to build artifacts that:
• Are of the highest possible quality, whilst expending
• The least amount of resources
To achieve such goals, we must ensure:
• Fitness-for-purpose
• Fitness-of-form
• Efficacy of means
Slide number 3
Any action or omission that detracts from or stands in the way of ensuring these characteristics must therefore be managed. Such actions or omissions lead to the injection of what we can term a DEFECT.
Software engineering, therefore, may in its entirety be deemed the purposeful process of managing defects during the construction of software.
Software Engineering IS Defect Management
Slide number 4
Defects may be of two types:
Defects of Omission
Defects of Commission
A proper SE process must manage both.
Slide number 5
Some common terms:
Failure: An instance of the recognition that the final software product does not meet quality expectations.
Fault: An incorrect state entered by a program executable or an incorrect transformation undergone.
Defect: An imperfection in the software engineering work-product that requires rectification.
Bug: A defect that causes the generation of a fault.
Error: A commission or omission at any stage of the software process by a software engineer that results in a defect.
Slide number 6
Failure can be:
Operational, or
Structural
Operational failures detract from functionality, reliability and usability of the system
Structural failures detract from maintainability, reusability and efficiency of the software
Slide number 7
Thus, we can term a defect that causes:
A failure in the functionality of the system as a defect of functionality or an F-type defect
A failure in the reliability of the system as a defect of reliability or an R-type defect
A failure in the usability of the system as a defect of usability or a U-type defect
A failure in the maintainability of the system as a defect of maintainability or an M-type defect
and so on….
Slide number 8
An ounce of prevention….
Corrective techniques: seek, find, and remove extant defects in the work-product.
Preventive techniques: remove the opportunity for an omission or commission that would lead to a defect.
Slide number 9
Our objective therefore is to introduce both corrective and preventive techniques of defect management that might be useful in the object-oriented paradigm.
But before we can do so, we must investigate some relevant peculiarities of the object paradigm.
Slide number 10
Into the quagmire…..
Data from several hundred projects indicates that the object-oriented approach has, on average, a higher defect potential, and that it is harder to identify and remove defects from object-oriented work-products than from traditional work-products (Jones, 1997).
Sheppard and Cartwright (1997) also report that object-oriented systems in general score lower in testability than traditional systems.
Slide number 11
Jones’ assertions, whilst interesting and counter-intuitive, might be attributed to differences in:
How we identify a defect in each paradigm
What we call a defect in each paradigm
The level of maturity of the overall process or organization utilizing each paradigm
The experience level of the practitioners
Sheppard’s assertion seems to be fundamentally traceable to the characteristics of object-orientation itself and is logically intuitive and acceptable.
Slide number 12
Problems arise from issues of:
Observability
Context-based testing
Integration
Genericity
Inheritance
Polymorphism
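As an illustration of the last of these, dynamic binding means that one textual call site must be tested against every concrete class that can stand behind it. A minimal sketch, using a hypothetical Shape hierarchy (not from the slides):

```python
# Hypothetical classes illustrating why polymorphism multiplies test obligations.
class Shape:
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side * self.side

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius
    def area(self):
        return 3.14159 * self.radius ** 2

def total_area(shapes):
    # One call site, but area() is bound at run time, so adequate testing
    # must exercise every concrete subclass that can appear in the list.
    return sum(s.area() for s in shapes)
```

The single statement in total_area therefore carries one test obligation per concrete subclass, which is one reason testability scores lower for object-oriented systems.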
Slide number 13
Developing low defect requirements:
Comprehend the Essence and Context of the Problem Situation or Issue at Hand
Construct a Common Dictionary
Identify Stakeholders
Clients
Actors
Owners
Identify a Focus (a Goal)
Identify High-level Usecases
Slide number 14
Developing low defect requirements:
Draw Support Diagrams
Combine and Reconcile to Arrive at Consensus
Specify Quality and Acceptance Goals (Quality Matrix)
Analyze User Requirements
Feasibility Analysis
Consistency Analysis
Clarity Analysis
Reconcile Requirements
Document Requirements
Slide number 15
Document Requirements
Usecases
State Behavior Definition
Graphical Models
UML Diagrams
Formal Models
Object Z, OVDM, FOOM,…
Slide number 16
Identifying and removing requirements defects:
Examine Conformance to Standards
Validate Models (Graphical)
CRC
Validate Models (Formal)
Inspect Requirements Documents
Slide number 17
Preventing design defects:
Design for:
Functionality
Reliability
Usability
Maintainability
Efficiency
Basic Elements of Good Design:
Modularity
Abstraction
Minimality of Interactions
Multiple Levels of Granularity
Formality
Anticipation of Invalid States
Redundancy
Genericity
Slide number 18
Preventing design defects:
Take an Architectural Approach
Review and Validate Your Design Documents
Slide number 19
Design defect identification:
Simulate Your Designs
CRC
Inspect Your Designs
Slide number 20
Program defect identification:
Direct Defect Identification:
Reviews
Walk-throughs
Inspections
Failure Detection:
Sub-domain Testing
Statistically Based Testing
Slide number 21
Program defect identification:
Testing in the Object-oriented Paradigm:
Class Testing
Class Hierarchy Testing
Method Testing
Module Testing (including Regression Testing)
System Testing
Slide number 22
Program defect identification:
Testing Base classes:
Test for Correct Generation of Instances
Test for Correct Attribute Values
Test if Routines Correctly Alter the Representation of the Corresponding Object
Use Assertions as a Basis for Testing
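A minimal sketch of these four checks, using a hypothetical Account base class (plain assertion style rather than any particular test framework):

```python
class Account:
    """Hypothetical base class used only to illustrate the checks above."""
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        assert amount > 0            # precondition assertion
        self.balance += amount
        assert self.balance >= 0     # class invariant as a basis for testing

# Test for correct generation of instances
a = Account()
assert isinstance(a, Account)

# Test for correct attribute values after construction
assert a.balance == 0

# Test that routines correctly alter the representation of the object
a.deposit(50)
assert a.balance == 50
```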
Slide number 23
Program defect identification:
Integration Strategies:
Collective Integration
Top-down Integration
Usage-based (or bottom-up) Integration
Layered Integration
Progressive or collaborative Integration
Slide number 24
Example Technique: Couple or Binary Testing
Assume: a client class C(n1, n2, n3, m1, m2, m3, m4, m5) and a server class S(q1, q2, q3, p1, p2, p3, p4, p5), where n and q denote attributes and m and p denote routines.
Let us now assume that an instance of S (say s:S) serves an instance of C (say c:C)
We further assume that both S and C have been unit tested adequately.
Slide number 25
We now construct an n × m matrix representing the interactions between the attributes and routines of object c:C.
ATTR \ ROUTINES   m1   m2   m3   m4   m5
n1
n2
n3
Slide number 26
Similarly, we construct a q × p matrix representing the interactions between the attributes and routines of object s:S.
ATTR \ ROUTINES   p1   p2   p3   p4   p5
q1
q2
q3
Slide number 27
We then proceed to construct an m × p matrix to represent the inter-relationship of the ROUTINES of c:C and s:S. If a routine of object c:C calls a routine of object s:S, the intersecting cell is marked.
ROUTINES-S \ ROUTINES-C   m1   m2   m3   m4   m5
p1                         X
p2
p3
p4                              X
p5
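One possible encoding of these three matrices (Python dicts of sets; the names are illustrative, and only the cells marked in the worked example are filled in, the rest are left empty):

```python
# n x m matrix of c:C: which routines impact each attribute.
# The worked slice gives only the n1 row: m1, m2 and m4 impact n1.
impacts_C = {
    "n1": {"m1", "m2", "m4"},
    "n2": set(),
    "n3": set(),
}

# q x p matrix of s:S: no cells are specified in the example.
impacts_S = {"q1": set(), "q2": set(), "q3": set()}

# m x p matrix: marked where a routine of c:C calls a routine of s:S.
# The worked example marks m1 -> p1 and m2 -> p4; m4 calls nothing.
calls = {
    "m1": {"p1"},
    "m2": {"p4"},
    "m3": set(),
    "m4": set(),
    "m5": set(),
}
```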
Slide number 28
We can now test for the impact on a single attribute of each and every routine that can potentially impact it (a slice).
Thus, for each cell in the n × m matrix of c:C, if a routine m impacts an attribute n, we can define at least one test that involves the pair.
ATTR \ ROUTINES   m1   m2   m3   m4   m5
n1                 X    X         X
n2
n3
Therefore, for example, for n1 the test sequences would be:
m1,m2,m4
m1,m4,m2
m2,m1,m4
m2,m4,m1
m4,m1,m2
m4,m2,m1
Note:
m3 and m5 do not participate
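In code, the slice's test sequences are simply the orderings of the routines marked as impacting n1 (a sketch, assuming the marked cells above):

```python
from itertools import permutations

# Routines marked as impacting attribute n1 (m3 and m5 do not participate).
impacting = ["m1", "m2", "m4"]

# One test sequence per ordering of the impacting routines: 3! = 6 in total.
sequences = list(permutations(impacting))

assert len(sequences) == 6
assert ("m1", "m2", "m4") in sequences
```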
Slide number 29
We now use the matrix of relationships between c:C and s:S to reflect the impact of the client on the server. For slice n1, the sequences map as follows, where ∅ stands for null participation (m4 does not call any server routine p):

m1,m2,m4  →  m1(p1), m2(p4), ∅  →  m1(p1), m2(p4)
m1,m4,m2  →  m1(p1), ∅, m2(p4)  →  m1(p1), m2(p4)
m2,m1,m4  →  m2(p4), m1(p1), ∅  →  m2(p4), m1(p1)
m2,m4,m1  →  m2(p4), ∅, m1(p1)  →  m2(p4), m1(p1)
m4,m1,m2  →  ∅, m1(p1), m2(p4)  →  m1(p1), m2(p4)
m4,m2,m1  →  ∅, m2(p4), m1(p1)  →  m2(p4), m1(p1)

Distilling by removing redundant sequences, we get:
m1(p1), m2(p4) and m2(p4), m1(p1), which implies test sequences
m1,m2 and m2,m1 for slice n1.
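The distillation step can be sketched directly (a Python sketch; the call-matrix cells m1 → p1, m2 → p4, and m4 → nothing are taken from the worked example):

```python
from itertools import permutations

# m x p call matrix restricted to the routines in slice n1.
calls = {"m1": ("p1",), "m2": ("p4",), "m4": ()}  # m4: null participation

distilled = set()
for seq in permutations(["m1", "m2", "m4"]):
    # Map each routine to the server routines it calls; null entries vanish.
    mapped = tuple(p for m in seq for p in calls[m])
    distilled.add(mapped)

# Two distinct server-call orders survive: (p1, p4) and (p4, p1),
# i.e. test sequences m1,m2 and m2,m1 for slice n1.
assert distilled == {("p1", "p4"), ("p4", "p1")}
```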
Slide number 30
References:
Jones, C.; “The Economics of Object-oriented Software”; Software Productivity Research; Burlington, MA; 1997.
Sheppard, M.; Cartwright, M.; “An Empirical Study of Object-oriented Metrics”; Tech. Report TR 97/01; Department of Computing, Bournemouth University, U.K.; 1997.