These slides are designed to accompany Software Engineering: A Practitioner’s Approach, 7/e
(McGraw-Hill, 2009). Slides copyright 2009 by Roger Pressman.
Chapter 17
Slides in this presentation were taken from two sources:
Software Engineering: A Practitioner's Approach, 7/e by Roger S. Pressman
Slides copyright © 1996, 2001, 2005, 2009 by Roger S. Pressman
Object-Oriented Software Engineering: Using UML, Patterns,
and Java, 2/e, Chapters 4 and 5 by Bernd Bruegge and Allen H. Dutoit
The footer information identifies the source for each slide.
Software Testing Strategies
Software Testing
Testing is the process of exercising
a program with the specific intent of
finding errors prior to delivery to the
end user.
What Testing Shows
errors
requirements conformance
performance
an indication of quality
Strategic Approach
To perform effective testing, you should conduct effective technical reviews. By doing this, many errors will be eliminated before testing commences.
Testing begins at the component level and works "outward" toward the integration of the entire computer-based system.
Different testing techniques are appropriate for different software engineering approaches and at different points in time.
Testing is conducted by the developer of the software and (for large projects) an independent test group.
Testing and debugging are different activities, but debugging must be accommodated in any testing strategy.
V & V
Verification refers to the set of tasks that ensure
that software correctly implements a specific
function.
Validation refers to a different set of tasks that
ensure that the software that has been built is
traceable to customer requirements. Boehm
[Boe81] states this another way:
Verification: "Are we building the product right?"
Validation: "Are we building the right product?"
Who Tests the Software?
developer:
  understands the system
  but will test "gently"
  and is driven by "delivery"
independent tester:
  must learn about the system
  but will attempt to break it
  and is driven by quality
Testing Strategy
[Figure: development activities paired with their tests: system engineering with system test, analysis modeling with validation test, design modeling with integration test, and code generation with unit test]
Bernd Bruegge & Allen H. Dutoit Object-Oriented Software Engineering: Using UML, Patterns, and Java 9
Testing Activities
[Figure: each piece of subsystem code is unit tested (all tests by the developer) to yield a tested subsystem; integration testing, guided by the system design document, combines the tested subsystems into integrated subsystems; functional testing, guided by the requirements analysis document and the user manual, yields a functioning system]
Testing Activities (continued)
[Figure: the functioning system is performance tested against the global requirements (tests by the developer), yielding a validated system; acceptance testing against the client's understanding of the requirements (tests by the client) yields an accepted system; installation testing in the user environment (tests by the user) yields a usable system and, finally, a system in use]
Types of Testing
♦ Unit Testing: individual subsystems; carried out by developers. Goal: confirm that each subsystem is correctly coded and carries out the intended functionality
♦ Integration Testing: groups of subsystems (collections of classes) and eventually the entire system; carried out by developers. Goal: test the interfaces among the subsystems
System Testing
♦ System Testing: the entire system; carried out by developers. Goal: determine if the system meets the requirements (functional and global)
♦ Acceptance Testing: evaluates the system delivered by the developers; carried out by the client, and may involve executing typical transactions on site on a trial basis. Goal: demonstrate that the system meets customer requirements and is ready to use
♦ Implementation (coding) and testing go hand in hand
Testing Strategy
We begin by "testing-in-the-small" and move
toward "testing-in-the-large"
For conventional software
The module (component) is our initial focus
Integration of modules follows
For OO software
our focus when “testing in the small” changes from
an individual module (the conventional view) to an
OO class that encompasses attributes and
operations and implies communication and
collaboration
Strategic Issues
Specify product requirements in a quantifiable manner long before testing commences.
State testing objectives explicitly.
Understand the users of the software and develop a profile for each user category.
Develop a testing plan that emphasizes “rapid cycle testing.”
Build "robust" software that is designed to test itself.
Use effective technical reviews as a filter prior to testing.
Conduct technical reviews to assess the test strategy and test cases themselves.
Develop a continuous improvement approach for the testing process.
Unit Testing
[Figure: the software engineer develops test cases, applies them to the module to be tested, and examines the results]
Unit Testing
interface
local data structures
boundary conditions
independent paths
error handling paths
[Figure: test cases are applied to the module to be tested]
Unit Test Environment
[Figure: a driver invokes the module under test, with stubs standing in for its subordinate modules; test cases exercise the interface, local data structures, boundary conditions, independent paths, and error handling paths, and the results are collected]
Unit Testing
♦ Informal: incremental coding
♦ Static Analysis:
  Hand execution: reading the source code
  Walk-through (informal presentation to others)
  Code inspection (formal presentation to others)
  Automated tools checking for syntactic and semantic errors, and departure from coding standards
♦ Dynamic Analysis:
  Black-box testing (test the input/output behavior)
  White-box testing (test the internal logic of the subsystem or object)
  Data-structure based testing (data types determine test cases)
Black-box Testing
♦ Focus: I/O behavior. If, for any given input, we can predict the output, then the module passes the test.
  It is almost always impossible to generate all possible inputs ("test cases")
♦ Goal: Reduce the number of test cases by equivalence partitioning:
  Divide input conditions into equivalence classes
  Choose test cases for each equivalence class. (Example: if an object is supposed to accept a negative number, testing one negative number is enough)
Black-box Testing (Continued)
♦ Selection of equivalence classes (no rules, only guidelines):
  Input is valid across a range of values. Select test cases from 3 equivalence classes:
    Below the range
    Within the range
    Above the range
  Input is valid if it is from a discrete set. Select test cases from 2 equivalence classes:
    Valid discrete value
    Invalid discrete value
♦ Another way to select only a limited number of test cases: get knowledge about the inner workings of the unit being tested => white-box testing
White-box Testing
♦ Focus: Thoroughness (Coverage). Every statement in the component is executed at least once.
♦ Four types of white-box testing: statement testing, loop testing, path testing, branch testing
White-box Testing (Continued)
♦ Statement Testing (Algebraic Testing): test single statements (choice of operators in polynomials, etc.)
♦ Loop Testing:
  Cause execution of the loop to be skipped completely (exception: repeat loops)
  Loop to be executed exactly once
  Loop to be executed more than once
♦ Path Testing: make sure all paths in the program are executed
♦ Branch Testing (Conditional Testing): make sure that each possible outcome from a condition is tested at least once, e.g.:
    if (i == TRUE) printf("YES\n"); else printf("NO\n");
    Test cases: 1) i = TRUE; 2) i = FALSE
White-box Testing Example

FindMean(float Mean, FILE ScoreFile)
{
    SumOfScores = 0.0;
    NumberOfScores = 0;
    Mean = 0;

    /* Read in and sum the scores */
    Read(ScoreFile, Score);
    while (!EOF(ScoreFile)) {
        if (Score > 0.0) {
            SumOfScores = SumOfScores + Score;
            NumberOfScores++;
        }
        Read(ScoreFile, Score);
    }

    /* Compute the mean and print the result */
    if (NumberOfScores > 0) {
        Mean = SumOfScores / NumberOfScores;
        printf("The mean score is %f\n", Mean);
    } else
        printf("No scores found in file\n");
}
White-box Testing Example: Determining the Paths

FindMean(FILE ScoreFile)
{
    float SumOfScores = 0.0;
    int NumberOfScores = 0;
    float Mean = 0.0;
    float Score;

    Read(ScoreFile, Score);
    while (!EOF(ScoreFile)) {
        if (Score > 0.0) {
            SumOfScores = SumOfScores + Score;
            NumberOfScores++;
        }
        Read(ScoreFile, Score);
    }
    /* Compute the mean and print the result */
    if (NumberOfScores > 0) {
        Mean = SumOfScores / NumberOfScores;
        printf("The mean score is %f\n", Mean);
    } else
        printf("No scores found in file\n");
}

(On the slide, the statements are numbered 1-9 to label the nodes of the flow graph that follows.)
Constructing the Logic Flow Diagram
[Figure: flow graph of FindMean with numbered nodes 1-9 between Start and Exit; each decision node has a T and an F edge]
Finding the Test Cases
[Figure: the same flow graph with its edges labeled a through l and annotated with the data needed to traverse them: a is covered by any data; the data set must be empty vs. must contain at least one value; total score > 0.0 vs. total score < 0.0; positive score vs. negative score; one edge is reached if either f or e is reached]
Test Cases
♦ Test case 1: ? (to execute the loop exactly once)
♦ Test case 2: ? (to skip the loop body)
♦ Test case 3: ?, ? (to execute the loop more than once)
These 3 test cases cover all control flow paths
Comparison of White-box & Black-box Testing
♦ White-box Testing:
  A potentially infinite number of paths has to be tested
  Often tests what is done, instead of what should be done
  Cannot detect missing use cases
♦ Black-box Testing:
  Potential combinatorial explosion of test cases (valid & invalid data)
  Often not clear whether the selected test cases uncover a particular error
  Does not discover extraneous use cases ("features")
♦ Both types of testing are needed
♦ White-box testing and black-box testing are the extreme ends of a testing continuum.
♦ Any choice of test cases lies in between and depends on the following:
  Number of possible logical paths
  Nature of the input data
  Amount of computation
  Complexity of algorithms and data structures
Integration Testing Strategies
Options:
• the “big bang” approach
• an incremental construction strategy
Top Down Integration
top module is tested with stubs
stubs are replaced one at a time, "depth first"
as new modules are integrated, some subset of tests is re-run
[Figure: module hierarchy with top module A over subordinate modules B through G]
Bottom-Up Integration
drivers are replaced one at a time, "depth first"
worker modules are grouped into builds and integrated
[Figure: the same module hierarchy A-G, with the bottom-level worker modules grouped into a cluster]
Sandwich Testing
Top modules are
tested with stubs
Worker modules are grouped into builds and integrated
[Figure: module hierarchy A-G, tested with stubs from the top and with worker-module clusters from the bottom]
Example: Three Layer Call Hierarchy
[Figure: Layer I contains A; Layer II contains B, C, D; Layer III contains E, F, G]
Integration Testing: Big-Bang Approach
[Figure: the unit tests of the individual modules feed directly into one single system test]
Don’t try this!
Bottom-up Integration
[Figure: in the three-layer hierarchy (A; B, C, D; E, F, G), E, F, and G are tested first; then Test B, E, F; Test C; and Test D, G; finally Test A, B, C, D, E, F, G]
Pros and Cons of bottom up integration testing
♦ Bad for functionally decomposed systems: tests the most important subsystem (the UI) last
♦ Useful for integrating the following systems:
  Object-oriented systems
  Real-time systems
  Systems with strict performance requirements
Top-down Testing Strategy
♦ Test the top layer or the controlling subsystem first
♦ Then combine all the subsystems that are called by the tested subsystems and test the resulting collection of subsystems
♦ Do this until all subsystems are incorporated into the test
♦ A special program is needed to do the testing, the test stub: a program or a method that simulates the activity of a missing subsystem by answering the calling sequence of the calling subsystem and returning fake data.
Top-down Integration Testing
[Figure: in the three-layer hierarchy (A; B, C, D; E, F, G), Test A covers Layer I; Test A, B, C, D covers Layers I + II; Test A, B, C, D, E, F, G covers all layers]
Pros and Cons of top-down integration testing
♦ Test cases can be defined in terms of the functionality of the system (functional requirements)
♦ Writing stubs can be difficult: Stubs must allow all possible conditions to be tested.
♦ Possibly a very large number of stubs may be required, especially if the lowest level of the system contains many methods.
♦ One solution to avoid too many stubs: Modified top-down testing strategy
Test each layer of the system decomposition individually before merging the layers. Disadvantage of modified top-down testing: both stubs and drivers are needed.
Sandwich Testing Strategy
♦ Combines the top-down strategy with the bottom-up strategy
♦ The system is viewed as having three layers:
  A target layer in the middle
  A layer above the target
  A layer below the target
  Testing converges at the target layer
♦ How do you select the target layer if there are more than 3 layers?
  Heuristic: try to minimize the number of stubs and drivers
Sandwich Testing Strategy
[Figure: bottom-layer tests (Test E, Test F, Test G, combining into Test B, E, F and Test D, G) run in parallel with top-layer tests (Test A, then Test A, B, C, D); both converge on Test A, B, C, D, E, F, G]
Pros and Cons of Sandwich Testing
♦ Top- and bottom-layer tests can be done in parallel
♦ Does not test the individual subsystems thoroughly before integration
♦ Solution: modified sandwich testing strategy
Modified Sandwich Testing Strategy
♦ Test in parallel: the middle layer with drivers and stubs; the top layer with stubs; the bottom layer with drivers
♦ Then test in parallel: the top layer accessing the middle layer (the top layer replaces the drivers); the bottom layer accessed by the middle layer (the bottom layer replaces the stubs)
Modified Sandwich Testing Strategy
[Figure: each module A-G is first unit tested individually; double tests then combine adjacent layers (e.g. Test A, C; Test B, E, F; Test D, G); triple tests finally converge everything into Test A, B, C, D, E, F, G]
Regression Testing
Regression testing is the re-execution of some subset of tests that have already been conducted to ensure that changes have not propagated unintended side effects.
Whenever software is corrected, some aspect of the software configuration (the program, its documentation, or the data that support it) is changed.
Regression testing helps to ensure that changes (due to testing or for other reasons) do not introduce unintended behavior or additional errors.
Regression testing may be conducted manually, by re-executing a subset of all test cases, or using automated capture/playback tools.
Smoke Testing
A common approach for creating “daily builds” for product software
Smoke testing steps:
  Software components that have been translated into code are integrated into a "build."
  • A build includes all data files, libraries, reusable modules, and engineered components that are required to implement one or more product functions.
  A series of tests is designed to expose errors that will keep the build from properly performing its function.
  • The intent should be to uncover "show stopper" errors that have the highest likelihood of throwing the software project behind schedule.
  The build is integrated with other builds, and the entire product (in its current form) is smoke tested daily.
  • The integration approach may be top down or bottom up.
Object-Oriented Testing
begins by evaluating the correctness and consistency of the analysis and design models
the testing strategy changes: the concept of the "unit" broadens due to encapsulation
integration focuses on classes and their execution across a "thread" or in the context of a usage scenario
validation uses conventional black-box methods
test case design draws on conventional methods, but also encompasses special features
Broadening the View of “Testing”
It can be argued that the review of OO analysis and
design models is especially useful because the
same semantic constructs (e.g., classes, attributes,
operations, messages) appear at the analysis,
design, and code level. Therefore, a problem in the
definition of class attributes that is uncovered
during analysis will circumvent side effects that
might occur if the problem were not discovered
until design or code (or even the next iteration of
analysis).
Testing the CRC Model
1. Revisit the CRC model and the object-relationship model.
2. Inspect the description of each CRC index card to determine if a delegated responsibility is part of the collaborator's definition.
3. Invert the connection to ensure that each collaborator that is asked for service is receiving requests from a reasonable source.
4. Using the inverted connections examined in step 3, determine whether other classes might be required or whether responsibilities are properly grouped among the classes.
5. Determine whether widely requested responsibilities might be combined into a single responsibility.
6. Steps 1 to 5 are applied iteratively to each class and through each evolution of the analysis model.
OO Testing Strategy
class testing is the equivalent of unit testing
operations within the class are tested
the state behavior of the class is examined
integration applied three different strategies
thread-based testing—integrates the set of
classes required to respond to one input or event
use-based testing—integrates the set of classes
required to respond to one use case
cluster testing—integrates the set of classes
required to demonstrate one collaboration
WebApp Testing - I
The content model for the WebApp is reviewed to uncover errors.
The interface model is reviewed to ensure that all use cases can be accommodated.
The design model for the WebApp is reviewed to uncover navigation errors.
The user interface is tested to uncover errors in presentation and/or navigation mechanics.
Each functional component is unit tested.
WebApp Testing - II
Navigation throughout the architecture is tested.
The WebApp is implemented in a variety of different environmental configurations and is tested for compatibility with each configuration.
Security tests are conducted in an attempt to exploit vulnerabilities in the WebApp or within its environment.
Performance tests are conducted.
The WebApp is tested by a controlled and monitored population of end users. The results of their interaction with the system are evaluated for content and navigation errors, usability concerns, compatibility concerns, and WebApp reliability and performance.
High Order Testing
Validation testing: focus is on software requirements
System testing: focus is on system integration
Alpha/Beta testing: focus is on customer usage
Recovery testing: forces the software to fail in a variety of ways and verifies that recovery is properly performed
Security testing: verifies that protection mechanisms built into a system will, in fact, protect it from improper penetration
Stress testing: executes a system in a manner that demands resources in abnormal quantity, frequency, or volume
Performance testing: tests the run-time performance of software within the context of an integrated system
Debugging: A Diagnostic Process
The Debugging Process
Debugging Effort
[Figure: the total debugging effort splits into the time required to diagnose the symptom and determine the cause, and the time required to correct the error and conduct regression tests]
Symptoms & Causes
symptom and cause may be geographically separated
symptom may disappear when another problem is fixed
cause may be due to a combination of non-errors
cause may be due to a system or compiler error
cause may be due to assumptions that everyone believes
symptom may be intermittent
Consequences of Bugs
[Figure: damage severity plotted against bug type, ranging over mild, annoying, disturbing, serious, extreme, catastrophic, and infectious]
Bug categories: function-related bugs, system-related bugs, data bugs, coding bugs, design bugs, documentation bugs, standards violations, etc.
Debugging Techniques
brute force / testing
backtracking
induction
deduction
Correcting the Error
Is the cause of the bug reproduced in another part of the program? In
many situations, a program defect is caused by an erroneous
pattern of logic that may be reproduced elsewhere.
What "next bug" might be introduced by the fix I'm about to make?
Before the correction is made, the source code (or, better, the
design) should be evaluated to assess coupling of logic and
data structures.
What could we have done to prevent this bug in the first place? This
question is the first step toward establishing a statistical
software quality assurance approach. If you correct the process
as well as the product, the bug will be removed from the
current program and may be eliminated from all future
programs.
Final Thoughts
Think -- before you act to correct
Use tools to gain additional insight
If you're at an impasse, get help from someone else
Once you correct the bug, use regression
testing to uncover any side effects
System Testing
♦ Functional Testing
♦ Structure Testing
♦ Performance Testing
♦ Acceptance Testing
♦ Installation Testing
Impact of requirements on system testing:
  The more explicit the requirements, the easier they are to test.
  Quality of use cases determines the ease of functional testing.
  Quality of subsystem decomposition determines the ease of structure testing.
  Quality of nonfunctional requirements and constraints determines the ease of performance testing.
Structure Testing
♦ Essentially the same as white-box testing
♦ Goal: Cover all paths in the system design
  Exercise all input and output parameters of each component.
  Exercise all components and all calls (each component is called at least once, and every component is called by all possible callers).
  Use conditional and iteration testing as in unit testing.
Functional Testing
♦ Essentially the same as black-box testing
♦ Goal: Test the functionality of the system
♦ Test cases are designed from the requirements analysis document (better: the user manual) and centered around requirements and key functions (use cases)
♦ The system is treated as a black box.
♦ Unit test cases can be reused, but new end-user-oriented test cases have to be developed as well.
Performance Testing
♦ Stress Testing: stress the limits of the system (maximum number of users, peak demands, extended operation)
♦ Volume Testing: test what happens if large amounts of data are handled
♦ Configuration Testing: test the various software and hardware configurations
♦ Compatibility Testing: test backward compatibility with existing systems
♦ Security Testing: try to violate security requirements
♦ Timing Testing: evaluate response times and the time to perform a function
♦ Environmental Testing: test tolerances for heat, humidity, motion, portability
♦ Quality Testing: test reliability, maintainability & availability of the system
♦ Recovery Testing: test the system's response to the presence of errors or loss of data
♦ Human Factors Testing: test the user interface with the user
Test Cases for Performance Testing
♦ Push the (integrated) system to its limits.
♦ Goal: Try to break the subsystems
♦ Test how the system behaves when overloaded. Can bottlenecks be identified? (First candidates for redesign in the next iteration)
♦ Try unusual orders of execution: call a receive() before a send()
♦ Check the system's response to large volumes of data: if the system is supposed to handle 1000 items, try it with 1001 items.
♦ What is the amount of time spent in different use cases? Are typical cases executed in a timely fashion?
Acceptance Testing
♦ Goal: Demonstrate that the system is ready for operational use
  Choice of tests is made by the client/sponsor
  Many tests can be taken from integration testing
  Acceptance testing is performed by the client, not by the developer
♦ The majority of all bugs in software is typically found by the client after the system is in use, not by the developers or testers. Therefore, two kinds of additional tests:
♦ Alpha test: the sponsor uses the software at the developer's site. The software is used in a controlled setting, with the developer always ready to fix bugs.
♦ Beta test: conducted at the sponsor's site (the developer is not present). The software gets a realistic workout in the target environment. A potential customer might get discouraged.
Testing has its own Life Cycle
Establish the test objectives
Design the test cases
Write the test cases
Test the test cases
Execute the tests
Evaluate the test results
Change the system
Do regression testing