Verification and Validation
Mark C. Paulk
July 16, 2003 SEEK 2003
V&V in IEEE 729-1983
Verification
• The process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
Validation
• The process of evaluating software at the end of the software development process to ensure compliance with software requirements.
The V-Model
Sami Zahran, Software Process Improvement: Practical Guidelines for Business Success, 1998, pages 377-386.
“Software Development Standard for the German Federal Armed Forces, General Directive 250 – Software Life Cycle Process Model,” 1992.
• System Requirements Analysis and Design
• DP Requirements Analysis and Design
• Software Requirements Analysis
• Preliminary Design
• Detailed Design
• Implementation
• Software Components Integration
• DP Integration
• System Integration
V&V in IEEE 610.12-1990
Verification
• The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.
Validation
• The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
IV&V in IEEE 610.12-1990
Verification and Validation (V&V)
• The process of determining whether (1) the requirements for a system or component are complete and correct, (2) the products of each development phase fulfill the requirements or conditions imposed by the previous phase, and (3) the final system or component complies with the specified requirements.
Independent Verification and Validation (IV&V)
• V&V performed by an organization that is technically, managerially, and financially independent of the development organization.
Techniques
Name some verification techniques.
Name some validation techniques.
Concept V&V in IEEE 1012-1998
Concept documentation evaluation
Evaluation of operational procedures
Criticality analysis
Hardware / software / user requirements allocation analysis
Traceability analysis
Hazards analysis
Risk analysis
Software V&V Plan generation or updating
Requirements V&V in IEEE 1012-1998
Traceability analysis
Software requirements evaluation
Interface analysis
V&V test plan generation and verification
• system
• acceptance
Configuration management assessment
Criticality / hazard analysis update
Risk analysis
Design V&V in IEEE 1012-1998
Traceability analysis
Software design evaluation
Interface analysis
Component V&V test plan generation and verification
V&V test design generation and verification
• component
• integration
• system
• acceptance
Criticality / hazard / risk analysis updates
Implementation V&V in IEEE 1012-1998
Traceability / interface / criticality / hazard / risk analysis updates
Source code and source code documentation evaluation
Component V&V test plan generation and verification
V&V test case and test procedure generation and verification
• component
• integration
• system
Acceptance test case generation and verification
Component V&V test execution and verification
Test V&V in IEEE 1012-1998
Traceability / hazard / risk analysis updates
Acceptance V&V test procedure generation and verification
V&V test execution and verification
• acceptance
• integration
• system
Installation and Checkout V&V in IEEE 1012-1998
Hazard / risk analysis updates
Installation configuration audit
Installation checkout
V&V final report generation
Operation V&V in IEEE 1012-1998
Software V&V Plan revision
Proposed change assessment
Anomaly evaluation
Criticality / hazard / risk analysis updates
Migration assessment
Retirement assessment
CMMI Overview
Level 1, Initial: Process is unpredictable, poorly controlled, and reactive.

Level 2, Managed: Process is characterized for projects and is often reactive.
• Process areas: Requirements Management, Project Planning, Project Monitoring and Control, Supplier Agreement Management, Measurement and Analysis, Product and Process Quality Assurance, Configuration Management

Level 3, Defined: Process is characterized for the organization and is proactive.
• Process areas: Requirements Development, Technical Solution, Product Integration, Verification, Validation, Organizational Process Focus, Organizational Process Definition, Organizational Training, Integrated Project Management, Risk Management, Decision Analysis and Resolution

Level 4, Quantitatively Managed: Process is measured and controlled.
• Process areas: Organizational Process Performance, Quantitative Project Management

Level 5, Optimizing: Focus is on quantitative continuous process improvement.
• Process areas: Causal Analysis and Resolution, Organizational Innovation and Deployment
V&V in CMMI
Verification
• Confirmation that work products properly reflect the requirements specified for them. In other words, verification ensures that “you built it right.”
Validation
• Confirmation that the product, as provided (or as it will be provided), will fulfill its intended use. In other words, validation ensures that “you built the right thing.”
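The distinction can be made concrete with a small sketch (entirely hypothetical example): a routine can pass verification against its written specification yet fail validation against the user's actual intent.

```python
def sort_names(names):
    """Spec: return the names in ascending order."""
    return sorted(names)

# Verification: "you built it right" -- the output satisfies the written spec.
result = sort_names(["alice", "Bob", "carol"])
assert result == sorted(["alice", "Bob", "carol"])  # spec is met

# Validation: "you built the right thing" -- the user actually wanted a
# case-insensitive ordering, so the product fails its intended use.
intended = sorted(["alice", "Bob", "carol"], key=str.lower)
print(result == intended)  # False: verified, but not validated
```

The spec itself was incomplete, which is exactly the kind of defect verification alone cannot catch.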
CMMI - Verification
The purpose of Verification is to ensure that selected work products meet their specified requirements.
SG 1 Prepare for Verification
Preparation for verification is conducted.
SP 1.1-1 Select Work Products for Verification
Select the work products to be verified and the verification methods that will be used for each.
SP 1.2-2 Establish the Verification Environment
Establish and maintain the environment needed to support verification.
SP 1.3-3 Establish Verification Procedures and Criteria
Establish and maintain verification procedures and criteria for the selected work products.
SG 2 Perform Peer Reviews
Peer reviews are performed on selected work products.
SP 2.1-1 Prepare for Peer Reviews
Prepare for peer reviews of selected work products.
SP 2.2-1 Conduct Peer Reviews
Conduct peer reviews on selected work products and identify issues resulting from the peer review.
SP 2.3-2 Analyze Peer Review Data
Analyze data about preparation, conduct, and results of the peer reviews.
SG 3 Verify Selected Work Products
Selected work products are verified against their specified requirements.
SP 3.1-1 Perform Verification
Perform verification on the selected work products.
SP 3.2-2 Analyze Verification Results and Identify Corrective Action
Analyze the results of all verification activities and identify corrective action.
CMMI - Validation
The purpose of Validation is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.
SG 1 Prepare for Validation
Preparation for validation is conducted.
SP 1.1-1 Select Products for Validation
Select products and product components to be validated and the validation methods that will be used for each.
SP 1.2-2 Establish the Validation Environment
Establish and maintain the environment needed to support validation.
SP 1.3-3 Establish Validation Procedures and Criteria
Establish and maintain procedures and criteria for validation.
SG 2 Validate Product or Product Components
The product or product components are validated to ensure that they are suitable for use in their intended operating environment.
SP 2.1-1 Perform Validation
Perform validation on the selected products and product components.
SP 2.2-1 Analyze Validation Results
Analyze the results of the validation activities and identify issues.
CMMI Engineering Hierarchy
[Figure: the CMMI engineering process areas applied across the product hierarchy. Customers and stakeholders feed Requirements Management (REQM); Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), and Validation (VAL) are applied at the systems, hardware, and software levels.]
Testing Principles
All tests should be traceable to customer requirements.
Tests should be planned long before testing begins.
The Pareto principle applies to software testing.
Exhaustive testing is not possible.
To be most effective, testing should be conducted by an independent third party.
Roger Pressman, Software Engineering: A Practitioner’s Approach, Fifth Edition, 2001.
Software Program Managers Network
The SPMN was established in 1992 to identify proven industry and government software best practices and convey them to managers of large-scale DoD system acquisition programs.
Its 16 Critical Software Practices™ specifically address the underlying cost and schedule drivers that have caused many software-intensive systems to be delivered over budget, behind schedule, and with significant performance shortfalls.
<URL: http://www.spmn.com/>
SPMN #8. Manage and Trace Requirements
Before any design is initiated, requirements for that segment of the software need to be agreed to.
Requirements tracing should be a continuous process providing the means to trace from the user requirement to the lowest level software component.
Tracing shall exist not only to user requirements but also between products and the test cases used to verify their successful implementation.
All products that are used as part of the trace need to be under configuration control.
Requirements tracing should use a tool and be kept current as products are approved and placed under CM.
Requirements tracing should address system, hardware, and software and the process should be defined in the system engineering management plan and the software development plan.
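The practices above can be sketched as a simple traceability matrix. This is a minimal illustration with hypothetical requirement, component, and test identifiers, not a substitute for a tracing tool:

```python
# Hypothetical trace: user requirement -> implementing components and
# the test cases that verify their successful implementation.
trace = {
    "UR-1": {"components": ["login.c"],  "tests": ["TC-101", "TC-102"]},
    "UR-2": {"components": ["report.c"], "tests": ["TC-201"]},
    "UR-3": {"components": [],           "tests": []},  # no trace yet
}

def untraced(trace):
    """Requirements with no trace to a component or to a verifying test."""
    return [req for req, t in trace.items()
            if not t["components"] or not t["tests"]]

print(untraced(trace))  # ['UR-3']
```

Kept under configuration management and updated as products are approved, such a matrix makes gaps like UR-3 visible continuously rather than at integration time.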
SPMN #14. Inspect Requirements and Design
All products that are placed under CM and are used as a basis for subsequent development need to successfully complete a formal inspection prior to their release to CM.
The inspection needs to follow a rigorous process defined in the software development plan and should be based on agreed-to entry and exit criteria for that specific product.
At the inspection, specific metrics should be collected and tracked which will describe defects, defect removal efficiency, and efficiency of the inspection process.
All products to be placed under CM should be inspected as close to their production as feasible.
Inspections should be conducted beginning with concept definition and ending with completion of the engineering process.
The program needs to fund inspections and track rework savings.
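One of the inspection metrics named above, defect removal efficiency (DRE), is commonly defined as the fraction of the defects present at a given point that the activity actually removed. A minimal sketch with hypothetical counts:

```python
def defect_removal_efficiency(found_by_activity, found_later):
    """DRE = defects removed by this activity / total defects present,
    where the total is approximated by adding defects that escaped and
    were found by later activities or in the field."""
    total = found_by_activity + found_later
    return found_by_activity / total if total else 0.0

# Hypothetical counts: an inspection finds 45 defects; 5 more escape
# and surface in downstream testing.
print(defect_removal_efficiency(45, 5))  # 0.9
```

Tracking DRE per inspection, alongside rework savings, is what lets a program show that funding inspections pays for itself.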
SPMN #15. Manage Testing as a Continuous Process
All testing should follow a preplanned process, which is agreed to and funded.
Every product that is placed under CM should be tested by a corresponding testing activity.
All tests should consider not only a nominal system condition but also address anomalous and recovery aspects of the system.
Prior to delivery, the system needs to be tested in a stressed environment, nominally in excess of 150 percent of its rated capacities.
All test products (test cases, data, tools, configuration, and criteria) should be released through CM and be documented in a software version description document.
Every test should be described in traceable procedures and have pass-fail criteria included.
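A traceable test procedure with explicit pass-fail criteria might look like the sketch below (hypothetical system and identifiers); note that it exercises an anomalous condition as well as the nominal one:

```python
def transfer(balance, amount):
    """Hypothetical system under test: withdraw `amount` from `balance`."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

# TC-301 (traces to UR-3): nominal case.
# Pass criterion: the new balance is exactly balance - amount.
assert transfer(100, 30) == 70

# TC-302 (traces to UR-3): anomalous case.
# Pass criterion: the overdraft is detected and rejected cleanly.
try:
    transfer(100, 500)
    raise AssertionError("TC-302 failed: overdraft not detected")
except ValueError:
    pass  # system recovered by rejecting the request

print("TC-301, TC-302: pass")
```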
SPMN #16. Compile and Smoke Test Frequently
All tests should use systems that are built on a frequent and regular basis (nominally no less than twice a week).
All new releases should be regression tested by CM prior to release to the test organization.
Smoke testing should qualify new capability or components only after successful regression test completion.
All smoke tests should be based on a pre-approved and traceable procedure and run by an independent organization (not the engineers who produced it).
All defects identified should be documented and be subject to the program change control process.
Smoke test results should be visible and provided to all project personnel.
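A smoke test, as described above, is a short, pre-approved sequence of basic checks run against each frequent build before deeper testing begins. A minimal sketch, assuming a hypothetical build-status interface:

```python
def smoke_test(build):
    """Run a fixed, traceable sequence of basic checks against a new build.
    Each check has a pass-fail criterion; any failure rejects the build."""
    checks = [
        ("SMK-1 starts up",              lambda b: b["starts"]),
        ("SMK-2 core transaction works", lambda b: b["core_ok"]),
        ("SMK-3 shuts down cleanly",     lambda b: b["clean_exit"]),
    ]
    failures = [name for name, check in checks if not check(build)]
    return failures  # empty list means the build passed the smoke test

nightly = {"starts": True, "core_ok": True, "clean_exit": False}
print(smoke_test(nightly))  # ['SMK-3 shuts down cleanly']
```

Publishing the failure list to all project personnel, as the practice requires, keeps build health visible rather than buried in the test organization.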
Generic V&V Techniques
• Technical reviews, management reviews, joint reviews
• Symbolic execution, program proving, proof of correctness, formal methods
• Anomaly analysis, syntactical checks
• Functional testing, black-box testing, equivalence partitioning, boundary value analysis
• Structural testing, white-box testing, basis path testing, condition testing, data flow testing, loop testing
• Unit testing
• Regression testing, daily build and smoke test
• Integration testing
• Random testing, adaptive perturbation testing, mutation testing, be-bugging
• Operational profile testing, stress testing, performance testing
• System testing, acceptance testing
• Peer reviews, structured walkthroughs, inspections, active design reviews, pair programming, …
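Two of the black-box techniques above can be shown concretely. For a routine that accepts ages 0 through 120 (a hypothetical spec), equivalence partitioning picks one representative per input class, and boundary value analysis adds the values at and just beyond each partition edge:

```python
def valid_age(age):
    """Hypothetical spec: accept integer ages from 0 through 120 inclusive."""
    return 0 <= age <= 120

# Equivalence partitioning: one representative per class.
assert valid_age(35)        # valid partition
assert not valid_age(-10)   # invalid partition: below range
assert not valid_age(200)   # invalid partition: above range

# Boundary value analysis: test at and just beyond each boundary,
# where off-by-one defects cluster.
for age, expected in [(-1, False), (0, True), (1, True),
                      (119, True), (120, True), (121, False)]:
    assert valid_age(age) == expected

print("all black-box cases pass")
```

Neither technique looks inside the implementation; both derive their cases purely from the specification, which is what makes them functional (black-box) techniques.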
Effective V&V
What is the most effective verification technique (or techniques)?
What is the most effective validation technique (or techniques)?
The W-Model
[Figure: the W-model pairs each development activity with the inspections and the test level that verify it.]
• Customer needs: acceptance test (test procedure and test case inspections)
• Requirements: system test (requirements inspections; test procedure and test case inspections)
• Architectural design: integration test (HLD inspections; test procedure and test case inspections)
• Detailed design: unit test (DD inspections; test procedure and test case inspections)
• Code: code inspections
Questions and Answers…
Some Useful Internet Links
Software Engineering Institute• www.sei.cmu.edu/• www.sei.cmu.edu/cmm/• www.sei.cmu.edu/cmm/cmm.articles.html• www.sei.cmu.edu/cmmi/• www.sei.cmu.edu/cmmi/models/• www.sei.cmu.edu/str/descriptions/inspections_body.html
Software Program Managers Network• www.spmn.com/
Formal Technical Review Archive• www2.ics.hawaii.edu/~johnson/FTR/
Copyrights and Trademarks
Capability Maturity Model, Capability Maturity Modeling, CMM Integration, and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
eSCM, IDEAL, SCAMPI, SCAMPI Lead Assessor, SCAMPI Lead Appraiser, Personal Software Process, PSP, Team Software Process, and TSP are service marks of Carnegie Mellon University.