SENG 521: Software Reliability & Software Quality
Chapter 15: Advanced Topics
Department of Electrical & Computer Engineering, University of Calgary
B.H. Far ([email protected])
http://www.enel.ucalgary.ca/People/far/Lectures/SENG521
Section 1
Software Reliability Engineering for Agile Software Development
SENG521 (Winter 2008) [email protected]
SRE: Process
[Diagram] Phases: Requirement & Architecture → Design & Implementation → Test
The SRE process steps:
1. Define Necessary Reliability
2. Develop Operational Profile
3. Prepare for Test
4. Execute Test
5. Apply Failure Data
People involved: Senior management; Test coordinator; Data coordinator; Customer or user
SRE: Critics …
Practical implementation of an effective SRE program is a non-trivial task:
 Occurrence probabilities in the operational profile are usually the developers' best guess.
 Mechanisms for collection and analysis of data on the software product and process must be in place.
 Fault identification and elimination techniques must be in place.
 Other organizational abilities, such as the use of reviews and inspections, reliability-based testing, and software process improvement, are also necessary for effective SRE.
Research Questions
 Does SRE practice have meaningful interpretation in Agile software development?
 How to apply common SRE practices in Agile software development?
1. Necessary Reliability
How much reliability is good enough?
1) Define failure with "failure severity classes (FSC)" for the product.
2) Set a "failure intensity objective (FIO)" for each system to be tested.
3) Find the developed software failure intensity objective.
4) Engineer strategies to meet the software failure intensity objective. (Balance among fault-prevention, fault-removal, and fault-tolerance strategies.)
Steps 1-3 are valid for the Agile process, too.
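Steps 2 and 3 can be sketched numerically. The sketch below assumes the standard exponential reliability relation R = e^(−λt); the reliability target, time horizon, and acquired-component failure intensities are illustrative numbers, not values from the slides.

```python
import math

# Step 2 (sketch): the failure intensity objective (FIO) implied by a
# reliability target R over t hours, assuming R = exp(-lambda * t).
def fio_from_reliability(R: float, t_hours: float) -> float:
    """Failure intensity (failures/hour) implied by reliability R over t_hours."""
    return -math.log(R) / t_hours

# Step 3 (sketch): the developed software's FIO is what remains of the
# system FIO after subtracting the failure intensities of acquired
# (reused) components.
def developed_fio(system_fio: float, acquired_fis: list[float]) -> float:
    return system_fio - sum(acquired_fis)

system_fio = fio_from_reliability(R=0.95, t_hours=10)   # ~0.00513 failures/hour
dev_fio = developed_fio(system_fio, acquired_fis=[0.001, 0.0005])
print(dev_fio)
```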
Fault Prevention & Removal
Using the Agile process requires a rather different way of thinking about fault prevention and fault removal.
 Test Driven Development (TDD): unit tests and acceptance tests are written prior to functioning code.
 Continuous Integration: regression tests are performed frequently.
 Pair programming: shifts the activity of code reviews into prevention.
Most fault-removal activities (in the traditional SRE definition) are moved into fault prevention.
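The TDD practice above can be sketched as follows. The slides' tooling is NUnit/.NET; this Python `unittest` analogue, with a hypothetical `withdraw` operation, is only illustrative. Under TDD the test cases are written first, fail, and then drive the implementation.

```python
import unittest

# Hypothetical operation under test (not from the slides).
def withdraw(balance: float, amount: float) -> float:
    """Return the new balance after a withdrawal."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid withdrawal amount")
    return balance - amount

class WithdrawTest(unittest.TestCase):
    def test_normal_withdrawal(self):
        self.assertEqual(withdraw(100.0, 30.0), 70.0)

    def test_overdraw_rejected(self):
        with self.assertRaises(ValueError):
            withdraw(100.0, 150.0)

# Run the suite programmatically; a continuous-integration server would do
# this on every check-in, which is how regression tests end up running
# very frequently.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(WithdrawTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```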
2. Operational Profile
 Operational Profile (OP) is a complete set of operations with their probabilities of occurrence.
 The purpose of the OP is to provide the developers with information about how users will use the product being built.
 Needed for better management of development and test resources.
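An operational profile can be sketched as a mapping from operations to occurrence probabilities; the operation names and numbers below are illustrative, not from the slides.

```python
# Sketch of an operational profile: each operation paired with its
# occurrence probability (illustrative values).
operational_profile = {
    "check_balance": 0.40,
    "withdraw_cash": 0.35,
    "deposit_funds": 0.20,
    "change_pin":    0.05,
}

# An OP is a *complete* set of operations, so probabilities must sum to 1.
assert abs(sum(operational_profile.values()) - 1.0) < 1e-9

# The most frequent operation gets the most development and test attention.
most_used = max(operational_profile, key=operational_profile.get)
print(most_used)
```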
Operational Profile vs. User Story
 Similarities:
 Both contain a complete piece of functionality
 Both are directly related to requirements
 Both share an emphasis on developing more critical functionality first in order to achieve faster time to market
 Differences:
 SRE process has an additional requirement on frequency of use of each operation
 Collecting usage data by user story (operation in SRE terms) is often missed by teams following agile practices
How to Find Operations?
[Diagram] Candidate sources for operations: Use-Case Model, Use-Case Realization (identified and developed), Supplementary Specifications, Glossary, Software Architecture Document; design elements: analysis classes (boundary, entity, control), design classes, subsystems, interfaces, signals & events.
Case of Agile Development /1
 Main source: User stories
 Example:
Case of Agile Development /2
 Sort stories for each user type
 Example: BSC clerk operation list
Using blind TDD may lead to tests that do not reflect usage!
3. Prepare & Execute Test
 Prepare test cases and test procedures.
 User stories already contain a list of acceptance tests equivalent to feature (or unit) tests or tests for verifying non-functional requirements.
 Execute tests and collect failure data.
Prepare & Execute Test
 SRE: regression tests run after every build involving significant change.
 Agile process: continuous integration. Code and tests are checked into the common repository several times per day. Each time a check-in is made, all regression tests are run; therefore regression tests are run very frequently.
Preparing Test
 Differs significantly from common SRE practices: in Agile, tests are not allocated based on usage and/or criticality.
 All testing should be done in conjunction with development, with tests written prior to functionality implementation.
 Test definitions are established based on 'need' rather than usage. Whose 'need'?
Number of Test Cases
 The number of test cases is another point of deviation from SRE, where it is calculated based on available budget and time.
 In Agile development a common practice is that the development team will enforce a unit test coverage metric (such as 95% for all developer-written code).
 Every functional user story will receive at least one test case.
Test Case Selection & Run
 For performance testing, acceptance tests are repeatedly run, each selected based on its proportionate usage depicted in the operational profile.
 Occurrence probabilities indicated in the operational profile can be used as a bias for test selection.
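Biased selection by occurrence probability can be sketched as weighted random sampling; the test names and probabilities below are illustrative, not from the slides.

```python
import random

# Illustrative operational profile mapping test names to usage probabilities.
test_profile = {
    "withdraw_test": 0.5,
    "deposit_test":  0.3,
    "balance_test":  0.2,
}

def select_tests(profile: dict[str, float], n: int, seed: int = 42) -> list[str]:
    """Draw n test names with probability proportional to their usage."""
    rng = random.Random(seed)
    names = list(profile)
    weights = [profile[name] for name in names]
    return rng.choices(names, weights=weights, k=n)

runs = select_tests(test_profile, n=1000)
# Over many runs, the frequency of each test roughly tracks the profile.
print(runs.count("withdraw_test") / len(runs))
```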
Test Preparation Summary
 Deviates significantly from the standard SRE approach.
 Rather than planning for a certain number of tests and allocating tests and time for tests among defined systems, the test-driven approach will result in a comprehensive set of tests.
 Since functionality is not written without tests, every part of the system will be well tested.
Reporting Tools
 NUnit for running unit tests
 CruiseControl.NET for continuous integration
 CCTray for instant notification
 LoadRunner for performance testing
 CASRE for failure analysis
Tools: NUnit
 NUnit gives meaningful, real-time feedback to developers for both new tests and regression tests.
 Users can choose to run only a single test, or the entire suite of unit tests for a given project or a group of projects.
Tools: CruiseControl.NET
 CruiseControl.NET examines the common code repository for changes. Finding a change, it compiles the entire solution and runs all tests.
Tools: CCTray
 CCTray runs on clients and can instantly notify each developer about the current build status (pass or fail); upon failure, the web interface can be used to determine the exact cause of the failure.
Tools: LoadRunner
 LoadRunner can be used for performance testing.
 It allows for simulation of multiple users, and can execute code or HTML web queries.
 It also allows for the spawning of agents to multiple client machines, if necessary, to simulate a real web application.
Tools: CASRE
[Chart: Time Between Failures vs. i-th Failure; vertical axis: hours (0-1000), horizontal axis: i-th failure (1-91)]
5. Guided Decision
 Assessing rather than measuring reliability
 Reliability verification procedure is via the Certification Test process
 In certification testing we cannot calculate the exact amount of reliability of the developed product, but we can be sure that it has exceeded a certain minimum level of reliability defined by the FIO set during the defining-necessary-reliability phase.
Reliability Demo Chart /1
 An efficient way of checking whether the FIO (F) is met or not.
 It is based on collecting failure data at time points.
 Vertical axis: failure number (n)
 Horizontal axis: normalized failure data (Tn), i.e., failure time × F, or failure time / MTTF
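The chart's accept/continue/reject boundaries come from a sequential probability ratio test. The sketch below derives them from first principles; the discrimination ratio and risk levels are illustrative defaults, not values from the slides.

```python
import math

# Sketch of the accept/continue/reject decision of a reliability
# demonstration chart (sequential probability ratio test form).
def demo_chart_decision(n: int, Tn: float,
                        gamma: float = 2.0,   # discrimination ratio (assumed)
                        alpha: float = 0.1,   # supplier risk (assumed)
                        beta: float = 0.1):   # consumer risk (assumed)
    """Decide after n failures at normalized time Tn
    (Tn = failure time x failure-intensity objective F)."""
    A = math.log((1 - beta) / alpha)   # reject (fail) threshold
    B = math.log(beta / (1 - alpha))   # accept threshold
    reject_n = (A + (gamma - 1) * Tn) / math.log(gamma)
    accept_n = (B + (gamma - 1) * Tn) / math.log(gamma)
    if n >= reject_n:
        return "reject"
    if n <= accept_n:
        return "accept"
    return "continue"

print(demo_chart_decision(n=0, Tn=3.0))   # long failure-free run: accept
print(demo_chart_decision(n=8, Tn=1.0))   # many early failures: reject
```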
Lessons Learnt /1
 Reliability growth models of SRE may not suit a test-driven process
 Test-driven development (TDD) approach and Certification Testing together can be very useful
 Allocation of tests (number of tests and time for system testing) can be better managed by an operational profile
 Certification Testing will assure a minimum level of reliability has been achieved
Lessons Learnt /2
 Regarding test allocation, there seems to be a more natural fit in defining tests on a user-scenario basis rather than allocating a number of tests per operation.
 However, we lose the benefit of knowing that all critical operations are indeed adequately covered, and we must rely on developer skill rather than the established metrics and statistics of SRE. The operational profile will resolve this.
Lessons Learnt /3
 More research:
 Establishing a relationship between test coverage and mean time to failure or failure intensity.
 Incorporating performance testing into the continuous integration and automated build practices that Agile promotes.
 Classifying failures and ensuring that potential failures are in fact tested.
Enhancing Agile
 Define failure severity classes and cover the potential failures in the acceptance tests
 Define the operational profile and use it to set the proper number of tests and the system test duration
 Add mechanisms for tracking performance
 Add certification testing to ensure the minimum required quality is delivered
Conclusions
 Agile software development treats quality as 'cats for the blind'!
 While SRE-related concepts and techniques may initially seem incompatible with the test-driven emphasis of Agile methods, we showed that SRE does in fact have value within the Agile process context.
Section 2
Cleanroom Software Development
Department of Electrical & Computer Engineering, University of Calgary
B.H. Far ([email protected])
http://www.enel.ucalgary.ca/People/far/Lectures/SENG521/12/
Background
 Chaos Report [Standish 1995]
 Based on data representing 8,380 SE projects, only 16.2% of projects met the delivery date and the budget, with all of the specified features and functions. 31% of projects were cancelled before they were completed; 52.7% were delivered over-budget, over-schedule, or with fewer features and functions than specified.
 Software Productivity Research [Chapman 2000]
 60% of the United States' software work force is dedicated to fixing software errors that could have been avoided. In addition, there are only 47 days in a calendar year dedicated to doing development or enhancement of software applications. The rest is spent mainly on fixing bugs.
The First Computer Bug!
 On September 9th, 1945, Grace Murray Hopper was working on the Harvard University Mark II Aiken Relay Calculator when the machine was experiencing problems.
 An investigation showed that there was a moth trapped between the points of Relay #70, in Panel F.
Courtesy of the Naval Surface Warfare Center, Dahlgren, VA., 1988.
http://www.history.navy.mil/photos/pers-us/uspers-h/g-hoppr.htm
Research Question
 Is it possible to build software without any bug in it?
 Answer: Maybe. By using Cleanroom software development.
Causes for Bugs in Programs
 The main causes for bugs in programs:
 Design flaws
 Coding errors
 Other (including human-related errors)
 The first two can be eliminated by formal (e.g. box structure) design verification and automated code generators. Certification testing will take care of the last.
Cleanroom SE
 Cleanroom software engineering (CSE) is an engineering process for the development of high-quality software with certified reliability, with the emphasis on design with no defects and test based on software reliability engineering concepts.
 CSE focuses on defect prevention instead of defect correction, and certification of reliability for the intended environment of use.
 CSE yields software that is correct by mathematically sound design, and software that is certified by statistically valid testing.
 CSE represents a paradigm shift from traditional, craft-based SE practices to rigorous, engineering-based practices.
CSE: Characteristics
 Objective: Achieve zero defects with certified reliability
 Focus: Defect prevention rather than defect correction
 Process: Incremental (short) development cycles; long product life
CSE: History
 1983: Original idea of Cleanroom came from one of Dr. Harlan Mills' published papers
 1987: Proposed by Dr. Mills as a SE methodology. The name "Cleanroom" was borrowed from the electronics industry
 1988: Defense Advanced Research Projects Agency (DARPA) Software Technology for Adaptable Reliable Systems (STARS) focus on Cleanroom
 1991-1992: Prototyping of Cleanroom Process Guide
 1992: A book on CSE published, foundation of CSE
 1992-1993: Army and Air Force demonstration of Cleanroom technology
 1993-1994: Prototyping of Cleanroom tools
 1995: Commercialization of a Cleanroom Certification Tool
 1995: Cleanroom and CMM Consistency Review
 …
Comparison
Craft-Based SE                    | Cleanroom SE
Sequential or chaos development   | Incremental development
Informal design                   | Disciplined engineering specification and design
Unknown reliability               | Measured reliability
Individual development            | Peer-reviewed engineering
Individual unit testing           | Team correctness verification
Informal load or coverage testing | Statistical usage testing
Cleanroom SE: Technologies
 Development practices are based on mathematical function theory
 Test practices are based on applied statistics
 Analysis and design models are based on the incremental software model and created using box structure representation
 A box encapsulates the system (or some aspect of the system) at a specific level of abstraction
 Correctness verification is applied once the box structure design is complete
Cleanroom SE: Technologies
 Software is tested by defining a set of usage scenarios (i.e., operations or operational modes), determining the probability of use for each scenario (i.e., operational profile), and then defining random tests that conform to the probabilities.
 Error records are checked. No corrective actions are taken. Only a certification test is conducted to check whether errors (i.e., current failure intensity) meet the projected reliability (i.e., failure intensity objective) for the software component.
CSE: Processes /1
Cleanroom processes:
1. Management process
2. Specification process
3. Development process
4. Certification process
CSE: Processes /2
1. Cleanroom Management Process
 Project planning
 Project management
 Performance improvement
 Engineering change
2. Cleanroom Specification Process
 Requirements analysis
 Function specification
 Usage specification
 Architecture specification
 Increment planning
CSE: Processes /3
3. Cleanroom Development Process
 Increment design
 Correctness verification
 Software reengineering (reuse)
4. Cleanroom Certification Process
 Usage modeling and test planning
 Statistical testing and certification
CSE: Management Process
 Project Planning
 Cleanroom engineering guide
 Software development plan (incremental)
 Project Management
 Project record
 Performance Improvement
 Performance improvement plan
 Engineering Change
 Engineering change log
CSE: Specification Process /1
 Requirements Analysis
 Elicitation and analysis of requirements
 Define requirements for the software product
 Obtain agreement with the customer on the requirements
 Requirements are reconfirmed or clarified throughout the incremental development and certification process.
 Functional Specification
 Based on the result of Requirements Analysis
 Specify the complete functional behavior of the software in all possible modes of use
 Obtain agreement with the customer on the specified function as the basis for software development and certification
CSE: Specification Process /2
 Usage Specification
 Identify and classify software users, usage scenarios, and environments of use (operational modes)
 Establish and analyze the probability distribution for software usage models
 Obtain agreement with the customer on the specified usage as the basis for software certification
 Architecture Specification
 Define the conceptual model, the structural organization, and the execution characteristics of the software
 Architecture definition is a multi-level activity that spans the life cycle
CSE: Specification Process /3
 Increment Planning
 Allocate customer requirements defined in the Function Specification to a series of software increments that satisfy the Software Architecture
 Define schedule and resource allocations for increment development and certification
 Obtain agreement with the customer on the increment plan
CSE: Development Process
[Diagram] For each increment: SE → RG → BSS → FD → CV → CG → CI, in parallel with TP → SUT → C.
Legend:
SE: System Engineering
RG: Requirement Gathering
BSS: Box Structure Specification
FD: Formal Design
CV: Correctness Verification
CG: Code Generation
CI: Code Inspection
TP: Test Planning
SUT: Statistical Use Testing
C: Certification Test
Cleanroom Strategy /1
 Requirement gathering (RG)
 A detailed description of customer-level requirements for each increment.
 Box structure specification (BSS)
 Functional specification using box structure to separate behavior, data and procedures.
 Formal design (FD)
 Specifications (black boxes) are refined to become analogous to architectural (state boxes) and procedural (clear boxes) design.
Cleanroom Strategy /2
 Correctness verification (CV)
 A set of correctness verification activities applied first to the design and moved later to code. First-level verification is via application of a set of "correctness questions".
 Code generation, inspection & verification (CG & CI)
 The box structure is transformed to a programming language. Walkthrough and code inspection techniques are used to ensure semantic conformance with the box structure.
Cleanroom Strategy /3
 Statistical test planning (TP)
 Planning the test based on operational modes, operational profiles and reliability.
 Statistical use testing (SUT)
 Creating test cases, executing them and collecting error data.
 Certification (C)
 Conducting certification test rather than reliability growth to accept/reject developed software components (using the reliability demonstration chart, etc).
Box Structure /1
 Box structures are used to move from an abstract specification to a detailed design providing implementation details
Box Structure /2
 Black box
 Specifies the behavior of a system or a part of a system. The system responds to specific stimuli (events) by applying a set of transition rules that map the stimuli to responses.
 State box
 Encapsulates state data and services (operations). Inputs to the state box and outputs are represented.
 Clear box
 Contains the transition functions that are implied by the state box, i.e., the procedural design of the state box.
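The three views can be illustrated on a toy "running total" component; the component and all names below are invented for illustration, not from the slides.

```python
# Black box: behavior as a function of the full stimulus history, f: S* -> R.
def black_box(stimulus_history: list[int]) -> int:
    return sum(stimulus_history)

# State box: the same behavior with the stimulus history abstracted into
# encapsulated state data.
class StateBox:
    def __init__(self):
        self.total = 0                 # state abstracting the history
    def stimulus(self, s: int) -> int:
        self.total += s
        return self.total

# Clear box: the procedural design implied by the state box, with the
# transition function spelled out step by step.
class ClearBox(StateBox):
    def stimulus(self, s: int) -> int:
        new_total = self.total + s     # transition function T
        self.total = new_total         # state update
        return new_total               # response

history = [3, 4, 5]
sb = StateBox()
responses = [sb.stimulus(s) for s in history]
# All three views agree on the final response.
assert responses[-1] == black_box(history)
```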
Box Structure /3
[Diagram] Black boxes (specifications): f: S* → R. State boxes (architectural designs): add encapsulated State and transition T. Clear boxes (component designs): refine the transition into procedures (g11, g12, g13) with control conditions (c).
Box Structure /4
[Diagram] Usage hierarchy: black box BB1 expands into BB1.1 … BB1.n; each black box (e.g. BB1.1.1) is refined into a state box (SB1.1.1) and then into clear boxes (CB1.1.1, CB1.1.2, CB1.1.3).
Correctness Verification
 Mathematically based techniques are used to verify the correctness of a software increment
 Examples:
 If a function f is expanded into a sequence g and h, the correctness rule for all input to f is:
Does g followed by h do f?
 If a function f is expanded into a condition if-then-else, the correctness rule for all input to f is:
Whenever condition <c> is true does g do f, and whenever <c> is false does h do f?
 When function f is refined as a loop, the correctness rule for all input to f is:
Is termination guaranteed?
Whenever <c> is true, does g followed by f do f? And whenever <c> is false, does skipping the loop still do f?
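The sequence rule can at least be checked exhaustively on a small finite domain. This is an illustrative sketch, not a proof, and f, g, h below are invented examples:

```python
# Intended function and its two-step refinement (invented for illustration).
def f(x: int) -> int:
    return 2 * x + 2          # intended function

def g(x: int) -> int:
    return x + 1              # first refinement step

def h(x: int) -> int:
    return 2 * x              # second refinement step

# Correctness question for the sequence expansion:
# for all inputs in the (finite) domain, does g followed by h do f?
domain = range(-100, 101)
assert all(h(g(x)) == f(x) for x in domain)
```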
Black Box Structure /4
 If having more than one black box or nested black boxes, verify the mapping
[Diagram] Sequential split: f refined into g followed by h. Parallel split: f refined into a condition c selecting g or h.
Black Box Structure /5
 If having more than one black box or nested black boxes, verify the mapping
[Diagram] Loop split: f refined into a loop with condition c and body g.
Advantages of Verification
 Design verification has the following advantages:
 Verification is reduced to a finite process
 Every step of design and every line of code can be verified
 Near-zero defect level is achieved
 Scalability is possible
 Better code (than unit testing) can be generated
CSE: Certification Process /1
 Usage modeling and test planning
 A usage model represents a possible usage scenario of the software
 The usage model is based on the usage specification and is used for testing
 Similar to the way we defined operations
CSE: Certification Process /2
 Statistical Testing and Certification
 Testing is conducted in a formal statistical design under experimental control.
 The software is demonstrated to perform correctly with respect to its specification.
 Statistically valid estimates of the properties addressed by the certification goals are derived for the software.
 Management decisions on continuation of testing and certification of the software are based on statistical estimates of software quality.
Cleanroom Testing
 Using the statistical usage concept for testing.
 Determine a usage probability distribution via the following steps:
1) Analyze the specification to identify a set of stimuli (direct and indirect input variables).
2) Create usage scenarios (operational modes).
3) Assign a probability to the use of each stimulus (operational profile).
4) Generate test cases for each stimulus according to the usage probability distribution.
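Steps 1-4 above can be sketched with a small Markov-chain usage model, a common form for Cleanroom usage models; the states and probabilities below are invented for illustration, not from the slides.

```python
import random

# Usage model: state -> [(next_state, probability)] (illustrative values).
usage_model = {
    "start":    [("login", 1.0)],
    "login":    [("withdraw", 0.5), ("deposit", 0.3), ("logout", 0.2)],
    "withdraw": [("logout", 1.0)],
    "deposit":  [("logout", 1.0)],
}

def generate_test_case(model, seed):
    """One statistical test case: a random walk from 'start' to 'logout',
    biased by the usage probabilities."""
    rng = random.Random(seed)
    state, path = "start", ["start"]
    while state != "logout":
        nxt, weights = zip(*model[state])
        state = rng.choices(nxt, weights=weights, k=1)[0]
        path.append(state)
    return path

cases = [generate_test_case(usage_model, seed=i) for i in range(5)]
for c in cases:
    print(" -> ".join(c))
```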
Certification Test
 The Cleanroom approach DOES NOT emphasize:
 Unit or integration testing.
 Bug fixing as a result of test and regression.
 The certification procedure involves the following:
 Create usage scenarios.
 Specify a usage profile.
 Generate test cases from the profile.
 Execute test cases and record failure data.
 Compute reliability and certify the component or system using the reliability demo chart, etc.
Reliability Demo Chart
 An efficient way of checking whether the failure intensity objective (F) is met or not, based on collecting failure data at time points.
 Vertical axis: failure number
 Horizontal axis: normalized failure data, i.e., failure time × F
Automated Teller Machine (ATM) Requirements:Requirements:Requirements:Requirements:
The customer has a PIN number and access-card to use the ATMto use the ATM
The customer can deposit, withdraw money from th tthe account
Transaction involves no bank employee
Example: Usage Model
[Diagram] Customer <<extends>> Withdraw (usage percentage: 50%) and Deposit (usage percentage: 50%); system components: Card Processor, Cash Dispenser, Transaction Manager.
Example: Black Boxes
 Black boxes:
 Card Processor
 In: ValidCard(cardNum)
 Out: showMessage(message), Boolean
 Cash Dispenser
 In: enoughCashInMachine(amount), dispenseCash(amount)
 Out: showMessage(message), dispense(amount), Boolean
 Transaction Manager
 In: ValidCustomer(cardNum, pin), AmountLimit(amount), EnoughCashInAccount(amount)
 Out: showMessage(message), Boolean
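Two of these black boxes can be sketched as runnable Python classes. The bank data, the numeric values, and the extra cardNum/pin parameters on EnoughCashInAccount are additions for the sake of a self-contained example, not part of the slides' specification.

```python
class TransactionManager:
    def __init__(self, accounts: dict, daily_limit: float):
        self.accounts = accounts          # (cardNum, pin) -> balance (assumed)
        self.daily_limit = daily_limit

    def ValidCustomer(self, cardNum: str, pin: str) -> bool:
        return (cardNum, pin) in self.accounts

    def AmountLimit(self, amount: float) -> bool:
        return 0 < amount <= self.daily_limit

    # Extended with cardNum/pin here so the sketch is self-contained.
    def EnoughCashInAccount(self, cardNum: str, pin: str, amount: float) -> bool:
        return self.accounts.get((cardNum, pin), 0.0) >= amount

class CashDispenser:
    def __init__(self, cash_in_machine: float):
        self.cash = cash_in_machine

    def enoughCashInMachine(self, amount: float) -> bool:
        return self.cash >= amount

    def dispenseCash(self, amount: float) -> bool:
        if not self.enoughCashInMachine(amount):
            return False
        self.cash -= amount
        return True

tm = TransactionManager({("1234", "0000"): 500.0}, daily_limit=300.0)
cd = CashDispenser(cash_in_machine=1000.0)
# A successful withdrawal passes every check in turn.
ok = (tm.ValidCustomer("1234", "0000")
      and tm.AmountLimit(200.0)
      and tm.EnoughCashInAccount("1234", "0000", 200.0)
      and cd.dispenseCash(200.0))
print(ok)
```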
State Box: Card Processor
[State diagram] Idle → (insert card) → Menu → (get pin) → Check: [false] send message and notify user; [true] call notify and send "card verified" to the Cash Dispenser.
State Box: Cash Dispenser
[State diagram] Menu (on "get card verified") → (get amount) → Check Machine Cash: [false] back to Menu; [true] call check account → Check Account.
Example: Clear Box Spec /1Example: Clear Box Spec /1
M
/ insert card// Get customer PIN noValidCustomer(cardNum, pin)
Menu
/ get pin[false][true]
CheckCheckMachine
Cash
[false][false]
[true][true]
[false]
CheckAccount
[true] /get amount
Example: Clear Box Spec /2Example: Clear Box Spec /2
M
/ insert card// Bank returns false// Show messageshowMessage(mesg);
Menu
/ get pin[false][true]
CheckCheckMachine
Cash
[false][false]
[true][true]
[false]
CheckAccount
[true] /get amount
Example: Clear Box Spec /3Example: Clear Box Spec /3
M
/ insert card// Bank returns true// get amountgetAmount(amount);
Menu
/ get pin[false][true]
CheckCheckMachine
Cash
[false][false]
[true][true]
[false]
CheckAccount
[true] /get amount
Example: Clear Box Spec /4Example: Clear Box Spec /4
M
/ insert card// Bank returns false for daily limit// and/or balance// Show messageshowMessage(mesg);Menu
/ get pin
showMessage(mesg);
[false][true]
CheckCheckMachine
Cash
[false][false]
[true][true]
[false]
CheckAccount
[true] /get amount
Example: Clear Box Spec /5Example: Clear Box Spec /5
M
/ insert card// Bank returns true for daily limit// and balanceDispenser.enoughCashInAccount(amount)
Menu
/ get pin[false][true]
CheckCheckMachine
Cash
[false][false]
[true][true]
[false]
CheckAccount
[true] /get amount
Example: Clear Box Spec /6Example: Clear Box Spec /6
M
/ insert card// Dispenser returns false for // cash level// Show messageshowMessage(mesg);Menu
/ get pin
showMessage(mesg);
[false][true]
CheckCheckMachine
Cash
[false][false]
[true][true]
[false]
CheckAccount
[true] /get amount
Example: Clear Box Spec /7
[Same clear box diagram. This step: dispenser returns true for cash amount -- Dispenser.dispense(amount);]
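Taken together, slides /1 through /7 trace one withdrawal through the clear box. A hypothetical Python sketch of that control flow follows; the Bank and Dispenser interfaces, the method names, and the fake test doubles are assumptions adapted from the slide annotations (validCustomer, showMessage, getAmount, Dispenser.enoughCashInAccount, Dispenser.dispense), not part of the original design.

```python
# Hypothetical sketch of the ATM withdrawal clear box (slides /1-/7).
# The Bank and Dispenser interfaces are assumptions inferred from the
# slide annotations (validCustomer, enoughCashInAccount, dispense).

def withdraw(bank, dispenser, card_num, pin, amount):
    # /1, /2: bank verifies the customer's card and PIN
    if not bank.valid_customer(card_num, pin):
        return "Show message: invalid customer"        # /2: bank returns false
    # /3, /4: bank checks daily limit and account balance for the amount
    if not bank.check_account(card_num, amount):
        return "Show message: limit/balance exceeded"  # /4: bank returns false
    # /5, /6: dispenser checks its own cash level
    if not dispenser.enough_cash(amount):
        return "Show message: dispenser out of cash"   # /6: dispenser returns false
    # /7: all checks passed -- dispense the cash
    dispenser.dispense(amount)
    return "Dispensed"

# Fake collaborators (assumptions) for exercising the sketch
class FakeBank:
    def valid_customer(self, card_num, pin):
        return pin == "1234"
    def check_account(self, card_num, amount):
        return amount <= 300

class FakeDispenser:
    def __init__(self, cash):
        self.cash = cash
    def enough_cash(self, amount):
        return amount <= self.cash
    def dispense(self, amount):
        self.cash -= amount
```

Each early return corresponds to a [false] branch back to Menu in the diagrams; the final return corresponds to reaching the Cash state.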
Example: Testing
- 50% of tests go to test Withdraw; 50% go to test Deposit.
- Test cases are created to test each execution path (or state transition path).
- A subset of tests is selected for validation.
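The 50/50 split between Withdraw and Deposit can be sketched as a weighted random allocation of test cases (a minimal sketch; the function name and the fixed seed are assumptions, not from the slides):

```python
import random

# Allocate n usage test cases across the two operations per the 50/50 split.
OPERATIONS = ["Withdraw", "Deposit"]

def allocate_tests(n, seed=42):
    rng = random.Random(seed)   # seeded so the generated test suite is reproducible
    return [rng.choice(OPERATIONS) for _ in range(n)]
```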
CSE: Team
- Specification team: responsible for developing and maintaining the system specification.
- Development team: responsible for developing and verifying the software. The software is not executed during this process.
- Certification team: responsible for developing a set of statistical tests to exercise the software after development; uses reliability growth models to assess reliability.
CSE: Evaluation
Basic features of Cleanroom development that distinguish it from other SE methodologies are:
1. Formal specification (Box structure) & correctness verification
2. Statistical certification test
Evaluation: Formal Spec
Advantages:
- Mathematical and logical foundation for defining requirements accurately with precise notation.
- Proactive, rather than reactive, approach to requirements validation.
- Ambiguous, inconsistent, and conflicting requirements are caught before the system test.
- Box structure uses black, state, and clear boxes in a stepwise approach to refine requirements.
- Usage models define how the software is to be used by the customer.
Evaluation: Formal Spec
Disadvantages:
- Requires extra skills and knowledge (e.g., mathematics).
- Requires substantial effort to fully express the system in a formal specification; on average, Cleanroom projects spend 60-80% of their time in analysis and design.
- Ideal for safety- or mission-critical systems, not for ordinary commercial development.
- Lacks good enough supporting CASE tools; tooling is project specific.
- If time-to-market conditions are an issue, it might not be used.
Evaluation: Incremental Devel
Advantages:
- Quick and clean development in Cleanroom Engineering.
- Continuous validation.
- Provides measurable progress.
- Manages higher-risk requirements (i.e., prototype).
- Tracking of requirements.
- Stepwise building of functionality that satisfies stakeholders' requirements.
- Allows fast delivery of the important parts.
- Focus on planning and discipline at both the management and technical levels.
- Statistical testing keeps project quality control at the proper level.
- Verifiable specifications.
Evaluation: Incremental Devel
Disadvantages:
- Incomplete or conflicting requirements cannot be resolved at the beginning to determine increments.
- Risk analysis has not been incorporated explicitly.
- Needs more care about configuration management.
- Requires extra planning at both the management and technical levels.
- Stable requirements are needed for each increment, i.e., it cannot adapt quickly to rapidly changing requirements.
Evaluation: Certification Test
Advantages:
- Determines a level of confidence that a software system conforms to a specification.
- Able to statistically evaluate and infer the quality of the software system against all requirements.
- Quantitative approach that is verifiable.
- Quantitative data can be recorded and used later for benchmarking, etc.
Evaluation: Certification Test
Disadvantages:
- Testing is derived from a usage model that must be exhaustive in order to select a subset for testing.
- Statistical testing and verification are more reliable if based on some historical data.
- It would be more effective if integrated with other testing methods.
- Testing is not suitable for bug-hunting.
- Human residual coding errors may not be addressed.
Cleanroom SE: Case Study
- Cleanroom software development relies on a mathematically sound model of design to ensure that no defects are introduced into the software.
- Cleanroom software specification and design begins with an external view (black box), is transformed into a state machine view (state box), and is fully developed into a procedure (clear box).
Box Structure
- Box structures map system inputs and the stimulus histories (previous inputs) into outputs.
- Is the black-box construct sufficient to represent this (e.g., the Jackson model)? No.
[Figure: box structure mapping inputs plus history into outputs]
Cleanroom SE: Process /1
1) Define the system requirements.
2) Specify and validate the black box:
   - Define the system boundary and specify all stimuli and responses.
   - Specify the black box mapping rules.
   - Validate the black box with owners and users.
3) Specify and verify the state box:
   - Specify the state data and initial state values.
   - Specify the state box transition function.
   - Derive the black box behavior of the state box and compare the derived black box for equivalence.
Cleanroom SE: Process /2
4) Design and verify the clear box:
   - Design the clear box control structures and operations.
   - Embed uses of new and reused black boxes as necessary.
   - Derive the state box behavior of the clear box and compare the derived state box to the original state box for equivalence.
5) Repeat the process for new black boxes.
6) Convert to code.
7) Certification test the code.
Requirements Analysis /1
The process begins when one of the following entry criteria is satisfied:
- Entry 1: The Statement of Work or other initial artifact, such as a statement of allocated system requirements, is available.
- Entry 2: Changes, including additions and corrections, to the Software Requirements are proposed.
- Entry 3: A completed increment is ready for customer execution and evaluation.
Requirements Analysis /2
- Task 1: Define the software requirements.
- Task 2: Upon completion of each increment, reconfirm or clarify requirements through customer evaluation of the executable system.
Requirements Analysis /3
- Verification 1: Review the evolving Software Requirements work product.
- Verification 2: Validate the Software Requirements work product with the customer and peer organizations.
Requirements Analysis
Example: Build a simple calculator.
- A detailed definition of the calculator function and what it does must be given and verified with the customer.
- Various formal methods can be used: graph theory, automaton models, etc.
Formal Specification
Example: Use logic constructs such as "is-a" (inheritance hierarchy), "has-a" (association), and "such-as" (examples) to formulate the specification.
Verify that the total set of sentences forms a tree (a directed graph with no loop).
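The tree check can be mechanized: model each "is-a"/"has-a"/"such-as" sentence as a (child, parent) edge and verify that no node has two parents and no parent chain loops. A minimal sketch (the edge encoding is an assumption):

```python
# Minimal sketch: verify that a set of specification sentences forms a tree.
# Each sentence is modeled as a (child, parent) edge, e.g. "square is-a shape".

def is_tree(edges):
    parents = {}
    for child, parent in edges:
        if child in parents:          # a node with two parents is not a tree
            return False
        parents[child] = parent
    # follow parent links from every node; revisiting a node means a loop
    for node in parents:
        seen = set()
        while node in parents:
            if node in seen:
                return False          # directed cycle found
            seen.add(node)
            node = parents[node]
    return True
```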
Black Box Structure /1
- Entry #1: first operand (xxx digits)
- Entry #2: calculation symbol (add, subtract)
- Entry #3: second operand
- Entry #4: equal symbol
- Exit #1: calculation result
[Figure: box structure mapping inputs plus history into outputs]
Black Box Structure /2
[Figure: push-down automaton model -- the calculator queues the 1st operand (initially null) and then the calc symbol.]
Black Box Structure /3
[Figure: the calculator queues the 2nd operand after the 1st operand and calc symbol; when the equal symbol arrives, the queued history is mapped to the calc result.]
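The queueing behavior shown on slides /2 and /3 can be sketched as a black box whose response to the equal symbol is a function of the queued stimulus history (a hedged sketch; the add/subtract semantics come from the Black Box Structure /1 entries, everything else is an assumption):

```python
# Sketch of the calculator black box: the response to "=" is a function
# of the stimulus history (1st operand, calc symbol, 2nd operand).

def calculator(stimuli):
    history = []                     # the queued stimulus history
    for s in stimuli:
        if s == "=":
            # black-box mapping: (history, "=") -> calculation result
            if len(history) == 3 and history[1] in ("+", "-"):
                a, op, b = history
                return a + b if op == "+" else a - b
            return "Error"           # malformed history
        history.append(s)
    return None                      # no equal symbol yet: no response
```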
Black Box Structure /4
If there is more than one black box, or nested black boxes, verify the mapping.
[Figure: sequential split (f decomposed into g followed by h) and parallel split (f decomposed into g and h selected by condition c)]
Black Box Structure /5
If there is more than one black box, or nested black boxes, verify the mapping.
[Figure: loop split (f decomposed into a loop over g controlled by condition c)]
State Box Structure /1
State transition diagram: happy path.
[Figure: any non-numeric key pressed transitions to the Error state]
State Box Structure /2
Several state boxes can be generated, depending on the combination of acceptable (or unacceptable) inputs and histories.
Examples:
- If the 1st operand is non-numeric and the calc symbol is typed, the next state is the error state.
- If the 1st operand is numeric and any key other than the calc symbol is typed, the next state is the error state.
- etc.
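These error transitions can be written down as an explicit state-box transition function (a minimal sketch; the state names are assumptions, and single-key operands are assumed for brevity):

```python
# Sketch of the calculator state box: the next state depends on the current
# state and the input class, with error transitions as described above.

def is_numeric(key):
    return key.isdigit()

def next_state(state, key):
    if state == "start":
        return "first_operand" if is_numeric(key) else "error"
    if state == "first_operand":
        # only a calc symbol is acceptable after the 1st operand
        return "calc_symbol" if key in ("+", "-") else "error"
    if state == "calc_symbol":
        return "second_operand" if is_numeric(key) else "error"
    if state == "second_operand":
        return "result" if key == "=" else "error"
    return "error"                   # error state is absorbing
```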
State Box Structure /3
- State boxes should be generated for all possible combinations of input(s) and history states.
- The set of state boxes can easily grow beyond control!
Clear Box Structure /1
Code Generation
- Coding will be based on the clear boxes.
- Use of automatic code generation tools is encouraged, to reduce the probability of human error.
Cleanroom Testing /1
- Cleanroom testing teams must determine a usage probability distribution for the software.
- The operational profile can be used.
Cleanroom Testing /2
Suppose that the inputs to the calculator program are:

  Input                                  Percentage   Number range
  A1  1st operand (correct)                 22%          0 - 21
  A2  1st operand (incorrect)                3%         22 - 24
  B1  2nd operand (correct)                 22%         25 - 46
  B2  2nd operand (incorrect)                3%         47 - 49
  C1  Calculation symbol (correct)          22%         50 - 71
  C2  Calculation symbol (incorrect)         3%         72 - 74
  D1  Equal symbol (correct)                22%         75 - 96
  D2  Equal symbol (incorrect)               3%         97 - 99
Cleanroom Testing /3
We must generate a sequence of usage test cases that conform to the usage probability distribution.
A series of random numbers is generated between 0 and 99, corresponding to the probability of stimulus occurrence.
For example, the following random number sequences are generated:
- Test case 14 - 95 - 26 - 44 : A1; D1; B1; B1
- Test case 81 - 19 - 31 - 69 : D1; A1; B1; C1
- Test case 38 - 21 - 52 - 84 : B1; A1; C1; D1
The testing team executes the test cases noted above (and others) and verifies software behavior against the specification for the system.
Cleanroom Testing /4
For example, for the test case T1: A1; D1; B1; B1
The input sequence is:
1. 1st operand
2. Equal symbol
3. 2nd operand
4. 2nd operand
And the output should be: Error
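Checking the expected output for such a test case can be sketched as a small oracle that encodes the one acceptable stimulus order (the oracle itself is an assumption, not part of the slides):

```python
# Oracle sketch: the only valid stimulus order for the calculator is
# 1st operand, calc symbol, 2nd operand, equal symbol (A1, C1, B1, D1).

VALID_ORDER = ["A1", "C1", "B1", "D1"]

def expected_output(test_case):
    for i, stim in enumerate(test_case):
        if i >= len(VALID_ORDER) or stim != VALID_ORDER[i]:
            return "Error"           # out-of-order or incorrect stimulus
    return "Calc result" if len(test_case) == 4 else "Incomplete"
```

For T1 (A1; D1; B1; B1) the second stimulus is already out of order, so the oracle expects Error, matching the slide.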
Cleanroom Certification
- The verification and testing techniques lead to certification of software components.
- Certification implies that the reliability can be specified for each component.
- Each component would have a certified reliability under the usage scenario and testing regime. This information is needed for future use of the components.
The certification approach involves five steps:
1. A usage scenario is created.
2. A usage profile is specified.
3. Test cases are generated from the profile.
4. Tests are executed and failure data are recorded and analyzed.
5. Reliability is computed and certified.
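Step 5 can be illustrated with the simplest possible estimate: run N usage test cases, count failures, and project reliability over future runs (a minimal sketch; actual Cleanroom certification uses reliability growth models, as noted on the CSE: Team slide):

```python
# Minimal sketch of step 5: estimate the per-run failure probability from
# certification test results, then reliability over n independent runs.

def estimated_reliability(runs, failures, future_runs=1):
    p_fail = failures / runs              # observed per-run failure probability
    return (1 - p_fail) ** future_runs    # probability that all future runs succeed
```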
CSE: Overall Advantages
- Suitable for iterative and incremental software development.
- Uses a formal specification that yields more accurate, less conflicting, and more complete requirements.
- Continuous verification of software quality is possible.
- Software quality can be certified using software reliability engineering methods.
CSE: Overall Disadvantages
- Cleanroom advocates the use of sequence-based specifications. These are better suited to problems with a high degree of logical interaction; they are not suitable for black boxes used in numerical or highly computational applications.
- Non-functional requirements (real-time, security constraints) and a significant portion of algorithmic requirements are hard to represent with the Box structure.
- After requirements changes, rework of the box structure is a time-consuming process.
- Statistical test data may be hard to collect.
Conclusions
The Cleanroom approach is a rigorous approach to software engineering that emphasizes:
- Formal specification
- Mathematical verification of the correctness of design
- Certification of software reliability
The Cleanroom approach is yet to become common practice in the software development industry, because of its emphasis on the above three points.
References
- Linger, R. and Trammel, C. (1996). Cleanroom Software Engineering Reference Model, Version 1.0. http://www.sei.cmu.edu/pub/documents/96.reports/pdf/tr022.96.pdf
- Wolack, C. (2001). Taking The Art Out of Software Development: An In-Depth Review of Cleanroom Software Engineering. http://www.scisstudyguides.addr.com/papers/cwdiss725paper1.htm
- Pressman and Associates (2000). Cleanroom Engineering Resources. http://www.rspa.com/spi/cleanroom.html
One Last Advice
- Want to impress your customers? Use failure intensity + reliability growth methodology!
- Want to impress your boss (development)? Use failure density + zero time failure methodology!
- Want to impress yourself? Use target failure intensity + reliability demonstration chart!
That is all folks!