EDS AGREEMENT NUMBER [XXXX] CALIFORNIA DEPARTMENT OF CORRECTIONS AND REHABILITATION (CDCR) EXHIBIT I MANAGEMENT REQUIREMENTS RESPONSE
6.5 Testing
VI.D.8. Testing
This subsection must include a narrative describing the QBP’s approach to testing and a sample Test Plan used by the QBP on another project as described in Section VI.D.8 – Testing.
6.5.A Testing
VI.D.8. Testing
Req. Num.: M-43
Requirement: Contractor shall describe the overall testing approach and methodology used for the SOMS Project. The Bidder must submit a narrative describing the testing approach and methodology with their proposal response as identified in Section VIII – Proposal and Bid Form.
Response / Reference: Section 6.4
Bidder Response Detail:
As a leader in information technology services, EDS has invested heavily in establishing application and solution testing as a core competency. We use disciplined testing practices to verify the functional quality, performance, and security of applications. Our services provide a control point for our clients to mitigate risk as they develop and modernize applications that use modern infrastructure and SOA architectures. Our recent acquisition of RelQ Software, a pioneer in the independent testing and verification and validation (V&V) services market, positions the EDS Global Testing Practice as a leader in the global testing market. Our global knowledge can be applied locally, helping CDCR to improve the quality of the software delivered – measurably reducing defects and rework while improving application coverage and speed to delivery.
Approach to SOMS Testing
EDS uses a unique, comprehensive approach to risk-based testing – one driven by business requirements. Our Global Testing Practice identifies high-risk, high-value business requirements and concentrates testing activity on those requirements to minimize risk and deliver the most value to CDCR.
Team EDS’ approach to testing is based on our commitment to delivering a product of very high quality for acceptance testing. The objective of our rigorous testing approach is to validate that all requirements have been completely tested and all issues resolved before the start of user acceptance testing.
EDS’ philosophy is that testing is an integral part of virtually every phase of the system development life cycle. We begin our test planning at the earliest stages of the project and tailor it to the development methodology selected for the SOMS project.

EDS SOMS STATEMENT OF WORK EXHIBIT I-215
INITIAL FINAL PROPOSAL (CDCR-5225-113)

EDS’ approach to testing is governed by specific workflows, roles, and artifacts prescribed by the quality assurance (QA) process defined in our project management approach.
The primary goal of the testing effort is to increase the probability that when the system is launched, it behaves as expected under all circumstances and meets the defined requirements as set forth in the RFP. In support of this goal the following testing objectives are defined:
Verify that all end-user paths, inclusive of all interfaces to and through the system, perform correctly (end-to-end testing)
Verify that the user interface screens perform correctly
Verify that the system complies with defined response times
Verify that all devices, such as workstations, printers, and scanners, perform correctly
Verify that the environments are operationally ready and production worthy
Verify that all sites are operationally ready and production worthy
Verify that all system utility functions (backup, recovery, failsafe) perform correctly
Incorporate a test design that not only provides knowledge transfer to the CDCR, but also minimizes test rework associated with modifications to the system post-production deployment
Perform test activities that support both defect prevention and defect detection
Incorporate automated test design to create reusable and maintainable scripts, not only to expedite the testing process during initial development, but to provide these testing assets to the client at project end
Verify that requirements are testable
Verify traceability to the requirements to validate test coverage
The most important step in the successful testing of a component or a complete software system is capturing a clear and deep understanding of the functional requirements. Our approach to requirements gathering makes sure that requirements are testable. Testability of a requirement is determined by asking questions about the requirement to determine whether it is complete, unambiguous, consistent with other system requirements, correct, and described at an elemental level for testing purposes (not as a compound set of conditions).
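The traceability check described above – confirming that every requirement maps to at least one test case – can be sketched in a few lines (the requirement and test case IDs below are hypothetical illustrations, not SOMS artifacts):

```python
# Minimal sketch of a requirements-to-test-case traceability check.
# Requirement and test case IDs are hypothetical examples.

requirements = {"REQ-001", "REQ-002", "REQ-003"}

# Each test case records the requirements it exercises.
test_cases = {
    "TC-01": {"REQ-001"},
    "TC-02": {"REQ-001", "REQ-002"},
}

# A requirement is covered if at least one test case traces to it.
covered = set().union(*test_cases.values())
uncovered = requirements - covered

print("Coverage: %d of %d requirements" % (len(covered), len(requirements)))
print("Untested:", sorted(uncovered))
```

Running the uncovered-set computation before each test phase makes gaps in coverage visible while there is still time to write the missing cases.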
Our approach is adaptable to any software development methodology – whether waterfall, iterative, spiral, or agile. We align with the selected methodology through our Testing V Model, depicted in Figure 6.5-1, EDS’ Testing V Model, verifying compliance with the development process and validating the quality of application requirements, design, and code. Our Enterprise Testing Method (ETM), of which the Testing V Model is a part, is a comprehensive testing framework that provides the structure, processes, tools, and templates to make sure that CDCR has consistent, high-quality testing services.
The purpose of the ETM is to increase productivity and the quality of EDS' testing practices, resulting in reduced risk of failed or faulty implementations by improving the comprehensiveness and focus of our testing activities.
[Figure 6.5-1 depicts the EDS Enterprise Testing Method (ETM) – Strategy, Design Techniques, Management, Measurement, and Production Support – spanning the Analysis, Test Design, Test Preparation/Execution/Monitoring, and Close-Down phases, with supporting activities including ambiguity analysis, risk analysis, systematic test design, requirements traceability, testing metrics collection and reporting, and testing close-down activities. The Testing V Model pairs the Analysis, Design, and Construction stages with Testing and Implementation, reflecting risk-based, requirements-driven, end-to-end testing.]

Figure 6.5-1, EDS’ Testing V Model
Testing Methodology – Overview of Enterprise Testing Method (ETM)
As a comprehensive testing framework, the ETM provides test coverage for all hardware and software that make up a delivered system, providing direction and guidance in developing a testing strategy; planning, designing, and executing the required testing; managing and measuring all testing activities; and providing post-deployment testing support to an application or system in production.
EDS bases its philosophy, process, and methodology on the following seven principles.
Meeting Business Needs – The overriding objective of testing at all stages is to confirm that a system is fit for its business purpose. The correctness of test results is judged at all stages of development in terms of whether the system performs in a way that meets the business needs.
Defect-Centric Testing – The objective of designing and running a test is to find a defect. Testing can never prove that a computer system works: It can only build up a level of confidence. Confidence is derived from finding errors and fixing them.
Testing Throughout the Life Cycle – Testing must be performed on all products at all stages of the implementation process. Software products and associated design and user documents emerge throughout the development process. Testing must therefore be planned as an integral part of the iterative life cycle.
Comprehensive Requirements Traceability – EDS uses an effective combination of Borland StarTeam and HP Quality Center for comprehensive requirements traceability. We have used these tools successfully on many large projects.
Independent Testing – A deliverable should be tested by someone other than its creator. Independent testing is more effective than testing performed by the author of a deliverable. The active and constant involvement of users in the project makes certain that an independent perspective can be applied.
Repeatable Testing – Tests must be repeatable. To make a test repeatable, it must be documented.
Well-Managed Test Data – Given the complexity and interdependence among internal and external systems with regard to test data, a formal test data management strategy for each SOMS phase and area (such as eOMIS, interfaces, data conversion) will be developed once requirements and design documentation are sufficiently complete and available for review. At a high level, the types of data required for each level of testing include:
– Application data, basic configuration
– Application data, transactional
– Input data from external source systems
– Output data for external destination systems
– List of databases and files
– Summary of record and transaction types and volumes
– Data converted from legacy systems
– Transient data requirements
– Live data from other SOMS projects
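A per-level test data inventory like the one above can be formalized so that gaps are caught mechanically; a minimal sketch (the `data_plan` helper and the level name are illustrative assumptions, not the project's actual strategy):

```python
# Illustrative inventory of test data categories, taken from the list above.
TEST_DATA_CATEGORIES = [
    "application data, basic configuration",
    "application data, transactional",
    "input data from external source systems",
    "output data for external destination systems",
    "list of databases and files",
    "summary of record and transaction types and volumes",
    "data converted from legacy systems",
    "transient data requirements",
    "live data from other SOMS projects",
]

def data_plan(level, categories):
    """Record which data categories a test level requires,
    rejecting anything outside the agreed category list."""
    unknown = [c for c in categories if c not in TEST_DATA_CATEGORIES]
    if unknown:
        raise ValueError("unrecognized categories: %s" % unknown)
    return {"level": level, "categories": categories}

plan = data_plan("system test", [
    "application data, transactional",
    "data converted from legacy systems",
])
```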
The ETM delivers value by enabling projects, programs, and organizations to define the most appropriate testing approach for their needs, to formalize testing requirements with clients, and to execute and manage testing across an effort of any size. The ETM defines testing activities across deliverable components throughout the entire deployment life cycle. It helps testing resources clearly define, execute, and manage testing activities – lowering risk and increasing solution quality by providing a testing framework that is comprehensive enough to guarantee consistency, yet flexible enough to allow tailoring that adapts the ETM to the specific testing needs of any client.
The ETM has associations with other EDS methods, processes, and strategies. The ETM links, by direct reference, to other methods for cross-discipline functions such as project management and requirements determination. Specifically, the ETM fully aligns with System Lifecycle 3 (SLC3), the software development life cycle that has been selected for the SOMS project. It also fully supports the unique needs of SOA testing. Please refer to proposal Section 6.04 - System Design and Development for a detailed description of SLC3.
The EDS ETM includes several levels of testing that are essential to the successful implementation of enterprise solution engagements. The following section provides a high-level overview of the focus of each of the levels with specific regard to enterprise implementations.
ETM testing focuses on the following:
Application components
– Unit testing
– Component integration testing
Services
– Service testing
Integrated systems
– System testing
– System integration testing
System Performance
– Performance testing
End-to-end intersystem testing
– Consolidated integration testing
Formal acceptance
– User acceptance testing
– Operational acceptance testing
EDS will develop a Master Test Plan that provides a framework for all testing to be performed for the SOMS project. The SOMS solution is thoroughly tested as specified in the Master Test Plan. We provide “out of the box” testing to verify that the COTS infrastructure products are functioning properly. Negative testing scenarios are included. The Master Test Plan includes testing for the configured and programmed items, the programs and reports, and a complete end-to-end test including testing of interfaces to external systems.
For each SOMS release, EDS does the following:
Develop a Release Test Plan and testing schedule
Conduct all test levels specified in the Release Test Plan
Develop a focused Test Plan for each test level
Facilitate and coordinate the staff testing effort
Provide support and software corrections during user acceptance testing
Submit a written document summarizing test results
There are many security concerns related to the use of real or live production data throughout the project life cycle. The SOMS project uses the services of many people who are not employed by CDCR and may not have the proper security clearances to view and use personal information. Consequently, there are many security, privacy, and legal issues that must be considered when using production data.
Team EDS in collaboration with CDCR reviews these issues through CDCR’s security and legal departments, and determines the necessary steps to make sure that production data is used properly. Testing of the system follows recommendations and requirements determined by CDCR to make sure the proper handling of test data in testing environments.
Managing the Testing Effort
Team EDS has extensive experience in managing large testing efforts in public sector and commercial projects. We believe in leveraging our past experience and established methodologies to define a clear and effective test strategy. Testing is integrated into every facet of our development cycle.
The primary goal of the testing effort is to make sure that when the system is placed in the production environment, it behaves as expected under all normal circumstances and meets the requirements as defined in the RFP.
Test Planning
The SOMS Test Plans begin during the requirements validation stage and are presented as deliverables. Each section of the Test Plans also specifies the components of the SOMS Test Reports where appropriate.
The Test Plans include a description of the tests being conducted, any special requirements for data, and the goal for each test. The Test Report records the results of the tests, presents the capabilities and deficiencies for review, and provides a means of assessing progress toward the next stage of development or testing. When defects are found, the Test Report provides the error description and the action taken to correct the affected component, subsystem, or system. It also provides appropriate statistics for each component.
The SOMS Test Plans are living documents and must be reviewed and updated as needed. The initial test descriptions are defined based on the baseline requirements and are elaborated on during the Design Phase. During the system test phases, software engineers, testers, and the Quality Assurance team may develop additional or more detailed descriptions and test cases. All test artifacts are maintained in the central Borland StarTeam repository.
Within each test phase, various test conditions are tested and validated, as shown in Table 6.5-1, Test Phases and Conditions Tested.
Table 6.5-1, Test Phases and Conditions Tested

Test Condition                  System   Performance
Fault Insertion                   ●
String Testing (end to end)       ●
Error Handling                    ●         ●
Stress                                      ●
Performance                       ●         ●
Portal Functionality              ●         ●
Functional                        ●
Regression                        ●
Interface Testing                 ●         ●
Security                          ●         ●
Operational Readiness             ●
User Interface                    ●
Audit Trails                      ●
Backup/Recovery/Failsafe          ●
Usability                         ●
Conversion                        ●         ●
As a precursor to testing, Team EDS develops a comprehensive plan to guide the activities of the levels and stages of testing. The Test Plans detail the complete end-to-end testing of the system and all of its components.
To guarantee the completeness of the SOMS Test Plans, formal system requirements identified during the Requirements Validation Phase of the release are used to drive the design and execution of the Test Plans.
At the inception of the project, a Test team is formed that is responsible for formulating the Test Plans. The team defines the criteria for evaluating the testing effort, designs test scenarios for both normal and exceptional situations, and evaluates the results of system-level tests.
As part of the creation of the Test Plans, Team EDS develops a comprehensive test database to serve as the baseline information required to test the system fully. As the various levels of testing are executed, the data used to verify the viability of each aspect of the system is used to augment the test database.
The system Test Plans are developed for SOMS during the application software system test phase. Part of the plan is to break the system test into cycles and test variants within cycles. Expected results are prepared for each test variant. Test variants are grouped into test scripts for execution.
Test Data Creation
Team EDS plans to use “real world” data for testing to be able to age the data in the same way a production case might change characteristics. To maintain confidentiality, we scramble critical identifiers such as social security numbers and dates of birth. We also enforce multiple layers of physical data safeguards, including confidentiality agreements, staff training, secure work sites, and use of shredders. This means that we must start data conversion early enough to have a usable test database when the developers are ready to begin unit testing.
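A minimal sketch of the scrambling idea, assuming a deterministic hash-based mask for social security numbers and a fixed-offset date shift so aging logic still behaves consistently (the specific rules shown here are illustrative, not the project's actual sanitization procedures):

```python
import datetime
import hashlib

def scramble_ssn(ssn, salt="test-env"):
    """Replace an SSN with a deterministic but irreversible pseudo-ID.

    Deterministic so the same person maps to the same masked value
    across test runs; irreversible so real identifiers never appear
    in the test environment.
    """
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    digits = "".join(ch for ch in digest if ch.isdigit())[:9].ljust(9, "0")
    return "%s-%s-%s" % (digits[:3], digits[3:5], digits[5:])

def shift_dob(dob, days=137):
    """Shift a date of birth by a fixed offset, preserving relative ages."""
    return dob - datetime.timedelta(days=days)

masked = scramble_ssn("123-45-6789")
shifted = shift_dob(datetime.date(1970, 6, 15))
```

Because both masks are deterministic, records that referred to the same offender before scrambling still correlate afterward, which is what allows the aged test data to behave like a production case.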
The Data Conversion team feeds data to the test environment. As the team executes iterations of cleansing and sanitizing the converted legacy data and loading it into the electronic Offender Management Information System (eOMIS) data structure, the data become more reliable. The SMEs, who have been working on offender management for years, guide the creation of test data for the test scenarios established for SOMS. The test data created from legacy data conversion processes is used to develop and test all aspects of SOMS including interfaces and business processes.
For performance testing, we recognize that it is difficult to generate sufficient data artificially before the real data is created, and the real data only builds up gradually during back-scanning and data conversion. The test data for performance testing is generated from the storage area network (SAN) backup database in the production environment (Release 1 onwards) and the data conversion database in Release 1, repeating load tests as the back-scanning and converted legacy data builds up, for example, at 25 percent, 50 percent, and 100 percent load.
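The incremental load-testing idea above – repeating the same test as converted data builds toward full volume – can be sketched as follows (`run_load_test` and the full-volume figure are stand-in assumptions for the real performance tooling and data sizes):

```python
def run_load_test(record_count):
    """Placeholder for the actual performance tooling: execute a
    load test against a database of record_count records."""
    return {"records": record_count, "passed": True}

FULL_VOLUME = 1_000_000  # assumed full converted-data volume (illustrative)

# Repeat the load test at the 25/50/100 percent volumes mentioned above,
# as back-scanning and legacy data conversion build up the database.
results = []
for pct in (25, 50, 100):
    volume = FULL_VOLUME * pct // 100
    results.append((pct, run_load_test(volume)))
```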
Defect Tracking and Resolution
It is not sufficient to detect issues and defects. It is critical that they be recorded, assigned for resolution, and retested after they are fixed. Borland StarTeam supports very efficient generation, assignment, and tracking of issues and defects. Issues and defects can also be linked using Realization connectors to model elements that are responsible for them. Issue tracking is important on complex projects, especially those with multiple state and federal requirements that must be met. This software test and defect tracking activity begins the moment a software discrepancy is detected and continues indefinitely across all iterations.
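The record-assign-fix-retest lifecycle described above can be sketched as a small state machine (the state names and transitions below are illustrative; they are not StarTeam's actual workflow):

```python
# Legal defect-state transitions: a defect is recorded, assigned for
# resolution, fixed, then retested; retest either closes it or reopens it.
TRANSITIONS = {
    "recorded": {"assigned"},
    "assigned": {"fixed"},
    "fixed": {"retest"},
    "retest": {"closed", "assigned"},
}

class Defect:
    def __init__(self, defect_id, description):
        self.defect_id = defect_id
        self.description = description
        self.state = "recorded"
        self.history = ["recorded"]

    def move_to(self, new_state):
        """Advance the defect, rejecting transitions the workflow forbids."""
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError("illegal transition %s -> %s"
                             % (self.state, new_state))
        self.state = new_state
        self.history.append(new_state)

d = Defect("DEF-100", "intake screen rejects valid dates")
for step in ("assigned", "fixed", "retest", "closed"):
    d.move_to(step)
```

Keeping the full history on each defect is what makes the rework reporting described later (who fixed what, and how often a fix bounced back) possible.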
Test Environment
Unit and string testing is performed by the application developers in the development test environments. All system testing, acceptance testing, regression testing, and performance testing is performed in dedicated and controlled test environments. EDS’ Infrastructure team works in coordination with the EDS Test team to confirm that all components necessary for testing are part of the environment. The EDS Test team does the following:
Work closely with the configuration manager to migrate software ready for testing to the test environment
Load all data needed to perform testing. This includes new case scenarios and other dependencies.
Schedule batch processes
Identify performance issues and defects in the test environment that may not be related to software
The migration of software to the test environment is managed through a very controlled configuration management process, using the configuration management tool CA Cohesion.
Communication Approach
Written and oral communication is critical to successful management and the Testing Phase is no exception. EDS uses a standard set of processes and tools to make sure that our Communication Plan is defined and followed throughout the Testing Phase. Tools include regular status meetings and reports. The status of system tests is part of the regular Weekly Project Status Report delivered to CDCR SOMS Project Management.
Reporting is not handled strictly through producing and delivering electronic and hard copy documents to the State. Testing Phase Plans and results are also shared by conducting walkthroughs and system demonstrations. These interactive sessions are conducted by the EDS Test team with key stakeholders and are supported by the documentation of changes before and after system outputs.
Test Team Reports, Deliverables, and Performance Metrics
The Test team performs an essential quality assurance (QA) role by subjecting planned system changes to a series of inspections and process executions to determine the behavior of the system. The testing activities are like scientific experiments that include detailed specifications of the planned change, criteria for the conditions of the tests, multiple executions of software code under different scenarios, and rigorous reporting and analysis of outcomes.
Figure 6.5-2, Sample SOMS System Test Status Report (Period to Period)
An excerpt from a sample system test status report for the SOMS application is presented graphically above in Figure 6.5-2, Sample SOMS System Test Status Report (Period to Period), and includes the status of test variant execution. This example shows that of the 2,443 total test variants for SOMS, 50 percent have been executed and 40 percent have passed. In this example, passed means executed successfully without generating any Level 1 or Level 2 incidents.
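The arithmetic behind those figures is straightforward; a sketch with illustrative counts consistent with the percentages in the report (the executed and passed counts are assumed examples, not SOMS data):

```python
# Status-report arithmetic from the example above: 2,443 total test
# variants, of which roughly 50 percent have been executed and
# 40 percent have passed. Percentages are computed against the total.
total_variants = 2443
executed = 1222   # illustrative count, ~50 percent of total
passed = 977      # illustrative count, ~40 percent of total

executed_pct = round(100 * executed / total_variants)
passed_pct = round(100 * passed / total_variants)
```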
Reports

The topics shown in Table 6.5-2, Communication Topics, are covered on a scheduled basis to communicate the progress and status of testing. They are included in the weekly project status report that provides details on the oversight of the project to the EDS and CDCR Project Management teams.
Table 6.5-2, Communication Topics

Management Summary: Report on the activities of the Test team for the previous period and activities planned for the period ahead.
Migration Schedule: Report of planned release content, such as work products to be included. It is used by Team EDS to schedule fixes, and by project management to manage releases.
Testing Completed: Report of testing completed in the reporting period. Indicates issues and unresolved cases of testing not meeting expectations.
Incidents in Rework: This report, a logical adjunct to the Testing Completed report, identifies all rework items, who is doing the rework, and the nature of the rework, such as recoding or revisiting the interpretation of requirements.
Application Defects: Summary and detailed report of application defects found during testing.
Approach to Testability
The term “testability” is used to refer to two distinct design practices: Design for Test and Design for Diagnosis. It is important that a distinction be made between these two practices:
Design for Test refers to the use of good design practices that facilitate testing. In this sense of the word, design engineers typically verify that good testability practices are followed during the traditional design process.
Design for Diagnosis refers to the optimization of a design and accompanying test procedures to facilitate good diagnostics.
In Design for Diagnosis, the word “testability” involves the assessment of the fault detection and fault isolation capability of a system or device, as well as the optimization of test point placement, functional partitioning, and diagnostic strategies that are needed to meet a system’s testability requirements.
At EDS we use both of these methods to enhance the testability of our systems, and we apply these concepts to make sure that SOMS is built to exceed the State’s expectations. EDS approaches testability as a design characteristic that allows the status (operable, inoperable, or degraded) of an item to be determined and the isolation of faults within it to be performed in a timely manner. We have incorporated several best practices into our standard development methodology to enhance testability through the Design, Integration, and Testing phases.
The implementation of eOMIS accelerates the testability of SOMS and reduces total time for testing. The eOMIS version proposed for SOMS is a 91 percent fit to the SOMS requirements. The modules of the proposed eOMIS are proven, tested, and established at other client locations, thus reducing risk and the time needed to establish SOMS.
System Testability Best Practices
Build a cleaner system in the first place, so that fewer test-debug-fix cycles of rework are needed. As part of this best practice, the following quality practices are incorporated into the SOMS project:
Use requirements validation
Design to facilitate testing
Build from precise specifications (designs)
Perform consistent code inspections
Evaluate and validate unit testing coverage
Automate execution of unit tests to enable frequent testing
Test early; test small pieces first; developers perform unit testing
Use generational capabilities of tools to reduce human error
Use source code control
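The practice of automating unit-test execution for frequent testing can be illustrated with Python's built-in unittest framework (the function under test is a hypothetical example, not SOMS code):

```python
import unittest

def classify_sentence_length(months):
    """Hypothetical example: bucket a sentence length, in months."""
    if months < 0:
        raise ValueError("months must be non-negative")
    if months < 12:
        return "short"
    if months < 60:
        return "medium"
    return "long"

class TestClassify(unittest.TestCase):
    def test_buckets(self):
        self.assertEqual(classify_sentence_length(6), "short")
        self.assertEqual(classify_sentence_length(24), "medium")
        self.assertEqual(classify_sentence_length(120), "long")

    def test_rejects_negative(self):
        with self.assertRaises(ValueError):
            classify_sentence_length(-1)

# Run the suite programmatically so it can be invoked by a scheduler
# or build script, which is what enables frequent, automated execution.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(TestClassify))
```

Wiring a runner like this into source control hooks or a nightly build is what turns unit tests from a one-time check into the frequent testing the practice calls for.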
Establish test entry criteria. We obtain consensus with the project manager and developers on the criteria for determining whether the system is ready to test, and whether the test environment is ready.
Increase the early involvement of testers in the project. The SOMS project Test team is engaged early in the project and has enough time to master the system, determine how to test it, prepare the test cases, and fully validate the test environment. The testers actively participate in developing the overall project Work Plan. In fact, we plan to move the Functional Manager and part of the Functional team that validates and defines the requirements immediately to the system test planning task.
Verify that the testers have a thorough understanding. Team EDS system testers understand the following:
The project goals and success factors
The project context
The system’s functionality
The system’s risks and vulnerabilities
The test equipment, procedures, and tools
The testers use Design for Test reviews to instrument and place probes into the system being tested. Design for Test is intended to give black-box testers access to the hidden internal behavior of the system.
Encourage a common system architecture and component reuse. Although these areas are not generally the primary concern of testers and QA analysts, a common architecture across a family of systems and planned component reuse can drastically shorten the test time and the overall development cycle time.
Stabilize the system being tested as early as possible. The SOMS applications are placed under change control and version control, and a cutoff date (the code freeze date) is established beyond which no changes are allowed except emergency show-stopper fixes before moving into the System Test Phase. In some cases, EDS sets the cutoff early, even if this means reducing functionality, to allow sufficient time after the freeze for final testing and fixing of the stabilized version (at least 2 weeks for small systems and at least 1 month for large, complex ones).
Stabilize and control the test environment. An argument can be made that more stop-and-go interruptions of test execution are caused by gremlins in the test environment than by any other cause. Team EDS establishes and tests the testing environment well in advance of the scheduled start of testing. The test environment includes tools to manage the test configuration, diagnose problems in the environment, and easily reconfigure the environment from test to test.
Benefits of EDS’ Testing Approach
EDS believes our proposed testing approach is consistent with the CDCR requirements and provides enhanced benefits. Table 6.5-3, Features and Benefits of Our Testing Approach, highlights the features and benefits of our approach.
Table 6.5-3, Features and Benefits of Our Testing Approach

Feature: Early, up-front test planning, where test conditions and cycles are developed as part of the specification development.
Benefit: Verifies that SOMS directly supports the CDCR business model and meets the needs of each stakeholder.

Feature: Provide CDCR SMEs and stakeholders the opportunity to review and approve the Test Model and Plan; build milestones into the Test Plan that CDCR can use when assessing the quality and thoroughness of the test process.
Benefit: Verifies that the SOMS application directly supports the CDCR business model and meets the needs of each stakeholder.

Feature: Execute integrated test packages based on realistic business cases, with substantial input from the CDCR SMEs.
Benefit: Verifies that the application supports the business functions being tested.

Feature: Develop well-documented, repeatable test models to facilitate analysis and regression testing; use automated testing tools, where possible, to help automate the regression testing process.
Benefit: Reduces risk, cost, and the testing time frame.

Feature: Follow the EDS Enterprise Testing Method, to make sure that the Test Plan is complete and aligned with the design process.
Benefit: Verifies stage containment and minimizes the duration and cost of testing.

Feature: Use a tightly controlled test environment that is separate from the development environment, so that fixes can be made to the application without affecting user testing.
Benefit: Verifies that application components are tested in a production-like environment and that code versions are managed accurately.

Feature: Provide comprehensive written and oral communication of testing efforts at all levels, at scheduled periods.
Benefit: Enhances communication with CDCR, verifies that Team EDS meets business needs and stakeholder expectations, and avoids rework.

Feature: Use third-party tools to manage test execution, including defect tracking, source code management, automated test scripts, database management, and status reporting.
Benefit: Provides CDCR with verification of completed requirements, version control, reduced risk, cost, and time frame for testing, multiple simultaneous environments, and close interaction with CDCR to make sure that Team EDS delivers – at a minimum – what is requested and expected.
6.5.B Test Plan Requirements
VI.D.8.a. Test Plan Requirements
Req. Num.: M-44
Requirement: The Contractor shall incorporate the testing approach into a comprehensive Test Plan complying with CDCR testing practices and IEEE Std. 829-1998, Standard for Software Test Documentation. The Test Plan shall include the procedures for documenting the completion of each test phase, test scripts, test conditions, test cases, and test reports. Detailed Test Plans shall be created for the following:
Unit Testing
Functional Testing
Integration Testing
System Testing
Security Testing
Regression Testing
Stress/Load Testing
Performance Testing
Acceptance/Usability Testing
The Bidder must submit a sample Test Plan with their proposal response as identified in Section VIII – Proposal and Bid Format.
Response / Reference: Vol 1 – Appendices, Appendix J
Bidder Response Detail:
One of the objectives of the Team EDS approach to application development is reuse. Defining and applying a consistent approach to testing is a form of reuse that can reduce duplication of effort, improve productivity, and increase the speed with which SOMS implementation is completed and delivered.
Team EDS complies with this requirement and confirms that we will incorporate the testing approach into a comprehensive Master Test Plan that complies with CDCR testing practices and exceeds IEEE Std. 829-1998, Standard for Software Test Documentation. We also confirm that the Test Plan will include the procedures for documenting the completion of each test phase and of test scripts, test conditions, test cases, and test reports. We also confirm that Team EDS will provide detailed Test Plans for:
Unit testing
Functional testing
Integration testing
System testing
Security testing
Regression testing
Stress/load testing
Performance testing
Acceptance/usability testing
As part of this proposal, we have also included a Sample Test Plan in Appendix J of our response.
The purpose of the Master Test Plan is to define the comprehensive testing strategy that is applied to the SOMS program as a whole and to each release undertaken by EDS. The primary objective of a Master Test Plan is to provide guidance and establish a program-level framework within which all testing activities for a specific project or release can be defined, planned, and executed. This “umbrella” strategy focuses on defining and facilitating an efficient and cost-effective approach to all testing activities that supports:
Achieving goals for the SOMS implementation
Defining how an iterative-incremental approach to application development applies to testing activities
Meeting agreed requirements
Managing risks
Informed decision-making
Before the Testing teams begin the SOMS testing effort, Test team members are walked through the Test Plan, test strategy, and testing goals so that they work in unison to build a better SOMS application. We establish guidelines and processes for the Test team covering the creation of test data, the design of positive and negative tests, and the logging and closing of defects.
These guidelines document and support the SOMS implementation project life cycle with information applicable specifically to the testing life cycle, and include the following:
Testing roles and responsibilities
The testing methodology
The test levels that are performed
The test coverage that each test level provides
The deliverables associated with each of the test levels
The testing activities developed and implemented for each test level are described in detail, including relevant assumptions and dependencies.
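As an illustration of the positive- and negative-test guidelines mentioned above, a minimal sketch follows; the validator, its format rule, and the test data are hypothetical, not SOMS requirements:

```python
def validate_booking_number(value: str) -> bool:
    """Hypothetical field check: a booking number is exactly
    7 characters, one letter followed by six digits."""
    return (
        len(value) == 7
        and value[0].isalpha()
        and value[1:].isdigit()
    )

# Positive tests: well-formed input must be accepted.
positive_cases = ["A123456", "Z000001"]
# Negative tests: malformed input must be rejected, not crash.
negative_cases = ["1234567", "A12345", "A12345X", ""]

positive_results = [validate_booking_number(c) for c in positive_cases]
negative_results = [validate_booking_number(c) for c in negative_cases]
```

Pairing every positive case with negative cases that probe each clause of the rule is what gives such guidelines their defect-finding value.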
The test planning and deliverables (plans, cases, and data) for each SOMS release derive their direction from the Master Test Plan. Implementing a common testing framework verifies that the appropriate levels of testing are performed, that testing takes place in the appropriate environments, that proper test data is generated, and that appropriate test cases are developed and executed. This approach results in a better product for SOMS implementation, as defects are identified early in the development life cycle. The cost of development and delivery are also reduced, as rework during the later stages of the product development cycle is minimized.
In addition to the Enterprise Testing Method (ETM) described under M-43, EDS has developed a related SOA testing strategy template that can be tailored to meet the unique needs of an SOA development for SOMS. This template is used as the basis for developing the SOMS Master Test Plan.
It supports the application of efficient and effective testing practices across all SOA projects and provides comprehensive guidance on all aspects of the testing process by defining the following:
SOA testing organization along with generic and specific roles and responsibilities for the appropriate project stakeholders
Generic, standard testing methodology to be adapted for use on each SOA project
Activities associated with test planning, preparation, and execution
Activities associated with managing all aspects of testing
Requirements for testing environments and tools required to support the testing effort
This SOA testing strategy template also includes a comprehensive glossary of testing terms.
For each SOMS release, EDS develops a Release Test Plan that addresses the scope, testing requirements, test coverage, test levels to be performed, resourcing, and schedule. Detailed Test Plans are developed for each test level to be performed for the release. Table 6.5-4, Test Deliverables, summarizes the primary testing deliverables that EDS will produce for each SOMS release.
Table 6.5-4, Test Deliverables

Work Product: Test Plan
Purpose: The primary means by which each Testing team communicates what it plans to test, its environmental and resource needs, and the schedule for each testing level to be performed.
Testing Life-Cycle Phase: Planning
Inputs can include: documented and agreed business requirements; agreed change requests; the E2E testing strategy.
Activities include defining: scope (in and out); objectives; project- and release-specific entry and exit criteria; tasks and deliverables; assumptions, constraints, and risks; resource requirements (source information, equipment, software, data, personnel, tools); team roles and responsibilities; a Training Plan for testers (if required); and high-level test scenarios.

Work Product: Test Scenarios and Test Cases
Purpose: Test scenarios group selected test cases that follow a logical sequence or common grouped business processes. Test cases define the conditions to be tested and their assigned priorities; prerequisites, detailed test procedures, and expected test results must also be included.
Testing Life-Cycle Phase: Design
Inputs can include: approved functional specifications and design documents; approved technical specifications; the Test Plan for the testing level.
Activities include: reviewing and refining test scenarios; creating test cases; defining test data; defining expected results.

Work Product: Test Scripts
Purpose: Automated instances of selected test cases.
Testing Life-Cycle Phase: Test Execution
Inputs can include: test cases; test data; test results.
Activities include: executing tests (cases and scripts); verifying test results; documenting actual results; identifying defects; re-executing tests.

Work Product: Testing Progress Reports (ongoing)
Purpose: Regular reports on testing progress for each test level based on the following metrics: number of test cases planned; number and percentage of test cases executed; number and percentage of test cases closed; number of defects by status (severity and priority).
Testing Life-Cycle Phase: Test Execution

Work Product: Test Summary Report (final)
Purpose: Summary of the final outcome of test execution for each test level: number of test cases planned; number and priority of open test cases; number, severity, and priority of unresolved defects; and a recommendation on the readiness of the system for the next testing level or project phase.
Testing Life-Cycle Phase: Closedown
Inputs consist of recorded test results.
Activities include: assessing the outstanding status of testing (number of test cases planned, number and percentage of test cases executed, number and percentage of test cases closed, number of defects by status); preparing and delivering a recommendation.
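The progress-report metrics listed in Table 6.5-4 reduce to simple arithmetic over test-case and defect records; a sketch using invented records and illustrative status values:

```python
# Hypothetical test-case records; real data would come from a
# test-management tool such as the ones named elsewhere in this section.
test_cases = [
    {"id": "TC-01", "status": "closed"},
    {"id": "TC-02", "status": "closed"},
    {"id": "TC-03", "status": "executed"},
    {"id": "TC-04", "status": "planned"},
]
defects = [
    {"id": "D-01", "severity": "high", "priority": 1},
    {"id": "D-02", "severity": "low", "priority": 3},
    {"id": "D-03", "severity": "high", "priority": 2},
]

planned = len(test_cases)
executed = sum(1 for t in test_cases if t["status"] in ("executed", "closed"))
closed = sum(1 for t in test_cases if t["status"] == "closed")

report = {
    "planned": planned,
    "executed_pct": round(100 * executed / planned),
    "closed_pct": round(100 * closed / planned),
    "defects_by_severity": {
        sev: sum(1 for d in defects if d["severity"] == sev)
        for sev in {d["severity"] for d in defects}
    },
}
```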
6.5.C General Testing Requirements
VI.D.8.b.General Testing Requirements
Req. Num.: M-45
Requirement: Testing and Development shall have their own environments, separate from Production. Testing or development shall not be performed in the production environment.
Response / Reference: Section 5.0
Bidder Response Detail:
Team EDS complies with this requirement and confirms that testing and development have their own environments, separate from production. Testing and development are not performed in the production environment. In proposal Section 5.0, Technical Requirements, we have described the proposed configurations for the various SOMS environments. That section describes in detail the hardware and software components proposed for each environment.
EDS recognizes that the environment has a large influence on the quality, lead-time, and cost of the testing process. Elements of an effective test environment include the following:
Environment Operational Processes – Documented policies and procedures for the setup and ongoing operations of each environment. The environment is managed with respect to factors such as setup, availability, maintenance, version control, error handling, and authorizations. The saving and restoring of specific configurations and conditions can be arranged quickly and easily; for example, different copies of the database are available for the execution of different test cases and scenarios.
Environment Scalability – The ability to modify the environment based on the need to reflect the current or future state of production.
Metrics Collection – To provide continuous improvement throughout the environments, collection of specific metrics is required. These include the number of testing events in each environment, and the projected and actual time spent for each event.
Environment Personnel – The environment consists at the very least of an environment manager and the appropriate resources to support the environment.
Cost/Budget – A method is in place to determine the costs involved in the setup and ongoing operations of the required environment.
Release Management – EDS recommends the Release Management Methodology (RMM) approach for successfully introducing new or altered components into existing IT environments, in order to develop these environments and enable clients to achieve their strategic business goals.
Each testing environment is configured like the final production environment, although the testing environments for system and system integration testing and for user acceptance testing can have smaller databases and use emulators instead of actual external interfaces. The SOMS Master Test Plan specifies the testing environments that are established for the SOMS project. The Test Plan developed for each SOMS release identifies the detailed requirements for each environment for each specified test level. Furthermore, the Test Plan for each test level identifies:
The differences between its testing environment and the live environment
Any risks stemming from these differences
The approach to mitigating these risks
When each environment is needed and for how long
Once a testing environment is established, the Release Testing Manager must approve any change to that environment before it is made. All project stakeholders must agree to the documented testing environment management process.
The performance testing environment mirrors production and has all the components necessary to fully evaluate system performance accurately so that the necessary tuning can be undertaken. The following factors are considered when creating the performance testing environment:
Specific project components required for performance testing
Server capacity and configuration
Network environment
Tier load-balancing scheme
Back-end database
M-46 The Contractor shall use an automated testing tool for, at a minimum, stress/load and regression testing. The tool shall create the test scripts, perform the test execution, and generate the test reports.
Response / Reference: N/A
Bidder Response Detail:
Team EDS complies with this requirement and confirms that Team EDS will use an automated testing tool for, at a minimum, stress/load and regression testing. We confirm that we will use the tool to create the test scripts, perform the test execution, and generate the test reports. EDS test automation specialists will use HP QuickTest Professional and HP LoadRunner during performance testing.
Functional Test Automation
Test automation enables the creation, modification, and execution of functional, distributed functional, regression and sanity tests for graphical user interface (GUI) applications. In addition, results of the tests, including defects generated, are recorded and available for analysis and reporting purposes.
Test automation is both a desirable and necessary component of an efficient and effective testing process. It is also software development; consequently, it requires careful planning and design, and the resulting automated test must itself be verified. Like all other forms of software development, it is very time consuming; in fact, it can take from 3 to 10 times as long to automate a test as it takes to execute the same test manually. Finally, the time saved through test automation can easily be offset and even exceeded through test script maintenance.
In fact, the introduction of test automation requires a significant investment, including the cost of:
Acquiring and maintaining licenses
Creating a set of evaluation criteria for functions to be considered when using the automated test tool
Examining the existing set of test cases and test scripts to see which ones are most applicable for test automation
Examining the current testing process and determining where it needs to be adjusted for using automated test tools
This does not mean that test automation does not add value to the testing process. However, it does confirm the importance of selecting tests for automation that provide the best return on investment for the SOMS project.
There are four attributes of a good test case, regardless of whether the testing is manual or automated:
Effectiveness – Its potential for finding defects
Efficiency – How much test coverage it provides, that is, how much it reduces the total number of test cases required
Practicality – How easily and quickly it can be performed, analyzed, and debugged
Maintainability – How much maintenance effort is required to modify the test case each time the system that it tests is changed
Test developers must strike a balance among these attributes to make sure that each test, whether manual or automated, uncovers a high proportion of defects and still avoids excessive cost.
When to automate a functional test
EDS considers tests that fall into the following categories to be ideal candidates for automation:
Any test that is repeated often enough to offset the cost of developing and maintaining the resulting automated test script (for example, tests that have been identified for regression testing or smoke tests, and tests that are executed frequently as preconditions for other tests)
Any test that measures the ability of a system to handle stress, load, or volume and continue to perform reliably over time
Structural – especially application programming interface (API)-based – unit and integration tests
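The repetition criterion above, combined with the 3-to-10-times automation cost cited earlier, implies a simple break-even check. The cost model below is an illustrative sketch, not EDS methodology, and the numbers are invented:

```python
def automation_pays_off(manual_minutes: float,
                        expected_runs: int,
                        automation_factor: float = 5.0,
                        maintenance_per_run: float = 1.0) -> bool:
    """Return True when cumulative manual effort exceeds the cost of
    building the script (automation_factor x one manual run, taken
    from the middle of the 3-10x range) plus per-run maintenance."""
    build_cost = automation_factor * manual_minutes
    automated_cost = build_cost + expected_runs * maintenance_per_run
    manual_cost = expected_runs * manual_minutes
    return manual_cost > automated_cost

# A regression test repeated 30 times clears the threshold;
# a test run only twice does not.
frequent = automation_pays_off(manual_minutes=10, expected_runs=30)
one_off = automation_pays_off(manual_minutes=10, expected_runs=2)
```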
When to test manually
If the automation of a test results in excessive up-front or maintenance costs that cannot be offset, EDS recognizes that it should be done manually. Likewise, any test that requires human judgment to assess the validity of the result or extensive, ongoing human intervention to keep the test running should be done manually. Typically, no return on investment results from automating tests that have these attributes. Manual testing is the appropriate choice for the following tests:
Installation and setup
Configuration and compatibility
Error handling and recovery
Usability
It is estimated that 60 to 80 percent of all test cases meet the criteria for automation. EDS automates all tests identified for smoke tests. For each SOMS release, EDS test automation specialists conduct a test automation feasibility analysis to determine which manual test cases meet the criteria for test automation to support regression testing.
EDS introduces functional test automation with Release 1B, selecting and automating functional tests from Release 1A that support regression testing of Release 1A functionality. The same approach is applied to subsequent releases, concluding with the development of automated test scripts for the final Release. This managed approach to functional test automation results in a comprehensive regression test suite for delivery to CDCR.
When test automation is deemed appropriate and necessary for system and system integration testing, user acceptance testing, or consolidated integration testing, EDS test automation specialists will use HP QuickTest Professional (QuickTest Pro).
Performance Testing
For every SOMS release, EDS will use HP LoadRunner during performance testing to:
Facilitate the automation of performance testing (for example, load, volume, stress, endurance)
Predict and measure system behavior and performance
LoadRunner can exercise the entire enterprise infrastructure by emulating thousands of users and employs performance monitors to identify and isolate problems. By using LoadRunner for performance testing, testers are able to minimize testing cycles, optimize performance, and accelerate deployment.
LoadRunner uses a suite of integrated performance monitors to quickly isolate system bottlenecks with minimal impact to the system. The suite consists of monitors for the network, application servers, Web servers and database servers. These monitors are designed to accurately measure the performance of every single tier, server, and component of the system during the load test. By correlating this performance data with end-user loads and response times, it is possible to determine the source of bottlenecks. In addition, all system performance data can be collected and managed from the LoadRunner Controller.
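The response-time reporting behind such correlation rests on percentile arithmetic; a minimal sketch using the nearest-rank method (the sample data is invented, not LoadRunner output):

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of recorded response times (seconds)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(len(ordered) * pct / 100))
    return ordered[rank - 1]

# Invented response times from a simulated load step; note the one
# outlier that a mean would hide but the tail percentile exposes.
response_times = [0.21, 0.25, 0.22, 0.30, 0.95, 0.24, 0.27, 0.23, 0.26, 0.28]

p50 = percentile(response_times, 50)
p90 = percentile(response_times, 90)
```

Comparing a tail percentile (p90, p99) against the response-time target, rather than the average, is what makes a load-test verdict meaningful.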
M-47 The Contractor shall repeat the test lifecycle when a failure occurs at any stage of testing (e.g., a failure in Acceptance Testing that necessitates a code change will require the component to go back through Unit Testing, Integration Testing, and so forth).
Response / Reference: N/A
Bidder Response Detail:
Team EDS complies with this requirement and confirms that Team EDS will repeat the test life cycle when a failure occurs at any stage of testing.
The standard EDS testing life cycle includes the requirement that each defect be analyzed for point of origin in the development life cycle. For example, if a defect that is discovered in acceptance testing originated in design, the defect is resolved, the design is re-verified, the code is updated, and all subsequent, related tests are repeated before the system is returned for verification by acceptance testing.
EDS will use the requirements traceability matrix developed for each SOMS release to verify that all required retests are completed for each defect.
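The retest lookup that such a traceability matrix supports can be sketched as follows; the requirement and test identifiers are invented for illustration:

```python
# Hypothetical requirements traceability matrix:
# requirement -> test cases at each level that exercise it.
rtm = {
    "REQ-101": ["UT-01", "IT-04", "ST-09"],
    "REQ-102": ["UT-02", "ST-10"],
    "REQ-103": ["UT-03", "IT-05"],
}

def retests_for_defect(affected_requirements):
    """All tests that must be re-executed when a fix touches the
    given requirements, per the repeat-the-lifecycle rule in M-47."""
    return sorted({t for r in affected_requirements for t in rtm[r]})

# A fix against REQ-101 and REQ-102 forces these retests:
needed = retests_for_defect(["REQ-101", "REQ-102"])
```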
M-48 The Contractor shall perform Full Lifecycle Testing throughout the duration of the project. This includes Unit, Integration/String, System, Operational (Stress/Load, Performance), and Regression Testing.
Response / Reference: N/A
Bidder Response Detail:
Team EDS complies with this requirement and confirms that Team EDS will perform full life-cycle testing throughout the duration of the project, which includes unit, integration/string, system, operational (stress/load, performance), and regression testing.
Unlike more traditional testing practices, which tend to engage the software development life cycle only when detailed design is complete and disengage after an application has been deployed into production, our testing process for each project release begins immediately after release initiation and also supports post-production maintenance of the application. As a result, the Testing teams can plan and prepare for their testing efforts well before the software is delivered to them. Specifically, the teams participate in the joint application development (JAD) workshops and verify that JAD outputs map to documented and agreed requirements.
The testing methodology is consistent with, and aligns with, the testing practices specified by the following:
EDS Global Applications Delivery Quality Management System (GAD QMS)
Enterprise Testing Method (ETM), which is the testing component of GAD QMS
EDS applies its ETM to verify the delivery of efficient and effective testing for the SOMS project. ETM defines testing activities across deliverable components throughout the entire development life cycle and includes all test levels to be performed for each release.
The Testing V Model, typically associated with waterfall development, has been tailored to support a manageable balance between consistency and flexibility –
both are required by an incremental approach to software development. The Testing V Model offers another important advantage by supporting early testing involvement in the project and release life cycle.
Figure 6.5-3, EDS’ Testing V Model
The left-hand side of Figure 6.5-3, EDS’ Testing V Model, represents the capturing of the client’s needs and expectations and the definition of the requirements of the system. The high-level technical and architectural design process identifies the hardware and software components and their integration required to deliver the solution that meets the requirements. The detailed design phase identifies the lowest level solution components and subcomponents, defines their internal structure, and specifies them to a level that supports their construction.
The right-hand side of the figure represents the test execution of the individual subcomponents and the progressive integration of the components into a delivered solution, which is subject to acceptance by the client.
An important feature of the Testing V Model is the review activity shown on the left-hand side, which focuses on building quality into a deliverable from the development process by examining it for compliance with standards and requirements, and ensuring that it provides a sound basis for the development of dependent, downstream deliverables. This “total quality” approach is the key to EDS’ philosophy and methodology.
The testing life cycle for each test level of a SOMS release will consist of the following sequence of major activities:
Developing a Test Plan
Determining the required test scenarios and test cases for each test level
Developing test cases for each test level
Developing test data
Executing the tests and reporting on progress
Completing closedown activities
Types of Testing
EDS uses a full-scale quality control strategy to verify a high-quality implementation of the system. EDS’ testing strategy includes the following comprehensive testing classifications:
Technical testing
SOA service testing
System testing
Business testing
Installation testing
Technical Testing
Technical testing includes:
Unit testing
Code reviews
Peer review
Component integration testing (CIT)
Smoke tests
Unit Testing
Unit testing is the testing of individual components (units) of the software. The objective of unit testing is to test the functionality of the code, which implements the functional requirements identified in the business scenario of the high-level design (HLD) documents.
EDS follows a test-driven development methodology – tests are written before development starts.
Using the JUnit tool, unit tests are written by each developer for each documented RFP requirement before starting development. These unit test cases are reviewed and approved by the Development Team Lead.
Unit tests are conducted along with development and are performed by each developer on his or her code.
Unit testing follows a self-correcting mechanism: if a test uncovers an error, the developer must correct it before the code can be committed. Until all bugs are fixed, the code cannot be completed and marked as developed.
All components are subject to unit testing. Unit testing is an internal EDS activity. Unit test results are stored in Borland StarTeam.
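The test-first sequence above uses JUnit for the Java components; purely as an illustration, the same pattern in Python's unittest, with a hypothetical requirement and function name:

```python
import unittest

def sentence_days_remaining(total_days: int, served_days: int) -> int:
    """Hypothetical calculation written to satisfy the tests below;
    per test-driven development, the tests were authored first."""
    if served_days > total_days:
        raise ValueError("served exceeds sentence")
    return total_days - served_days

class SentenceTests(unittest.TestCase):
    # Written before the implementation and reviewed by the team lead.
    def test_remaining(self):
        self.assertEqual(sentence_days_remaining(365, 100), 265)

    def test_overserved_rejected(self):
        with self.assertRaises(ValueError):
            sentence_days_remaining(100, 200)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(SentenceTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because the tests exist before the code, a failing run blocks the commit, which is the self-correcting mechanism described above.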
Code Review
Code review is the peer review activity of software development. Code is reviewed for adherence to coding standards, consistency, and accuracy by either fellow team members or the team lead. It is also checked for adherence to the development specifications. During development, code is analyzed and re-factored through several processes. This allows the code to be reformatted and streamlined and creates the following benefits in the development life cycle:
Reduces defects within code
Increases reusability
Increases application performance
Shortens debugging time
Eases task distribution
Enforces best practice
Peer Review
Peer reviews are conducted by the developers on the screens developed by their fellow team members, and provide a general check of the workability of the screen:
Peer reviews are not accompanied by any formal test plans or scripts.
Peer review comments are formally added in Borland StarTeam for the function or screen being tested.
This is a formal step in the development quality control process and development for a function is not complete until this stage is successfully passed and recorded in Borland StarTeam.
Component Integration Testing
Component integration testing (CIT) is performed on each individual function and interface. This test is used to verify that the technical objects that are servicing a screen are working together properly to make the screen functional:
CIT is performed by an independent team.
The objective is to test the function for error-free operation; it does not test functionality or business process flow.
It is conducted using test scripts and is executed using Test Director.
All bugs that are found during CIT are recorded manually in Borland StarTeam by the CIT team along with a status change and assignment to the system analysts for review and correction.
Smoke Test
The smoke test is an application readiness check that is performed after the completion of a binary build and release for service testing.
Smoke testing is conducted by the CIT team.
Key application activities are quickly checked to see if the release is healthy.
If the application fails the smoke test, it is rejected and the environment is rolled back to the previous release.
Only when the release passes the smoke test can it be used for service testing and any subsequent test levels.
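The pass-or-roll-back gate described above can be sketched as follows; the check names and build representation are invented, and the real smoke suite covers key SOMS screens:

```python
def smoke_gate(build, checks):
    """Run quick health checks against a build; on any failure the
    build is rejected and the environment rolls back to the prior
    release, so only healthy builds reach service testing."""
    for check in checks:
        if not check(build):
            return "rolled_back"
    return "promoted_to_service_testing"

# Illustrative checks standing in for key application activities.
checks = [
    lambda b: b.get("login_ok", False),
    lambda b: b.get("search_ok", False),
]

healthy = smoke_gate({"login_ok": True, "search_ok": True}, checks)
broken = smoke_gate({"login_ok": True, "search_ok": False}, checks)
```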
SOA Service Testing
SOA testing requires a fundamental change in testing strategy. With SOA, testing is required in isolation at the service level; reliance on testing through the application's user interface may lead to erroneous conclusions. SOA testing begins earlier in the development life cycle because the cost of repairing a defect rises rapidly as time elapses.
Functional testing is not usually sufficient; testing must occur along several dimensions such as security, interoperability, and performance. Testing tasks are assigned differently in an SOA environment:
Testing of services in isolation is best handled by Service Delivery teams who have the expertise to effectively perform this type of testing.
The introduction of a round of isolation testing by an independent group (internal or outsourced) provides an impartial second look that identifies problems that may have gone undetected.
Existing Testing teams should continue to perform application-level testing.
SOA testing requires enhanced technical skills and deeper business acumen.
Exclusive use of GUI testing tools is not possible, as testers must be involved in the execution of test harnesses in order to test services in isolation.
A deeper understanding of the business allows the tester to determine how well the services embody the business process.
Service testing is a distinct test level that follows component integration testing and precedes system testing (where the service is tested in conjunction with other services and applications, to verify that they jointly deliver the required functionality).
Service testing must involve collaboration among business analysts, developers, and testers.
The objective of service testing is to validate that the built service delivers the required functionality and exhibits the expected nonfunctional performance and secure code characteristics.
When the components have been developed to encompass a self-contained business function or service, and the components have completed component integration testing, the application can be released for service testing. This verifies that all business functions within a specific service deliver the required service.
The service testing approach includes the following testing activities:
Verify That Integration of Components Is Complete – It is expected that developers will complete component integration testing prior to hand-over to (independent) system testing. However, there may be occasions where this is not possible; for example, the component might include deliverables from a third party, or code that requires an operating environment that cannot be replicated in the development arena. In these circumstances, and by prior agreement with the Service Testing team, this final stage of component integration testing may be performed, as an initial activity, by the testing team that performs service testing, on receipt of the hand-over of the component.
Verify Service Integration – Testing of combinations of components at each layer (Presentation, Process and Service, Integration, and Data/Applications). This testing uses a white box approach, but is driven by a view of the business functionality to be delivered. This activity requires a collaborative effort involving independent testers, application and middleware developers, and database administrators.
Verify Functional Correctness – Verification against the service requirement and specifications. Verification against business specifications. This activity is as much about testing that additional functionality has not been delivered, as it is about testing of delivered functionality. To be certified for reuse, services must not include business processing or logic specific to only one system. To exercise all possible tests, it is necessary to construct stubs and drivers to inject tests into the service and to check outputs for correctness.
Verify Policy, Standards, and Registry – SOA registries and repositories help manage metadata related to SOA artifacts (for example, services, policies, processes, and profiles) and include the creation and documentation of the relationships (that is, configurations and dependencies) between various metadata and artifacts. SOA policy management provides the
technology to create, discover, reference, and sometimes enforce policies related to SOA artifacts, such as access control, and performance and service levels. As these form part of the service deliverable, they require testing. Any additional SOA rules or conventions that EDS adopts for its offerings (A3 architecture direction/constraints) over and above the industry standard services (often XML) and protocols that allow the service to be provided, also require testing.
Verify External Interfaces – Testing of the interfaces between the service and the service consumers to make sure that the external interfaces satisfy their defined requirements, incoming interface data is correctly and thoroughly vetted by the service, only data that satisfies the rules of the interface is input to the service, and data output from the service is in accord with requirements.
Verify Security – Testing to verify that functionality and data are delivered only as appropriate to the type of access and usage.
Verify Quality of Service – Performance, load, stress, scalability, reliability (or long-running – how long can the service run without failing), and service failure resilience testing.
Verify Service Level Agreement – Testing against service level specifications in the business model.
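The stub-and-driver technique mentioned under functional-correctness verification can be sketched as follows. This is a minimal illustration, not SOMS code: the service, stub, and offender IDs are hypothetical, chosen only to show a driver injecting inputs into a service whose downstream dependency is replaced by a stub.

```python
# Sketch of driver/stub-based service testing (hypothetical service and data).
# A driver injects test inputs into the service under test; a stub stands in
# for a downstream dependency so outputs can be checked deterministically.

class OffenderRecordStub:
    """Stub for a downstream data service: returns canned records."""
    def lookup(self, offender_id):
        return {"id": offender_id, "status": "ACTIVE"}

class ClassificationService:
    """Service under test: classifies an offender using the data service."""
    def __init__(self, data_service):
        self.data_service = data_service

    def classify(self, offender_id):
        record = self.data_service.lookup(offender_id)
        # Illustrative business rule: active offenders classify as "GENERAL".
        return "GENERAL" if record["status"] == "ACTIVE" else "UNASSIGNED"

def driver():
    """Driver: injects test inputs and collects outputs for checking."""
    service = ClassificationService(OffenderRecordStub())
    return {oid: service.classify(oid) for oid in ("A1234", "B5678")}

print(driver())  # every stubbed record is ACTIVE, so each ID maps to "GENERAL"
```

Because the stub's responses are canned, the expected output of every test case is known in advance, which is what makes exhaustive correctness checks practical.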
Frequency
At least one cycle of service testing is performed for each new release of the software. Separate builds of the same release result in the remaining non-closed test cases being executed.
System and System Integration Testing
System and system integration testing is the most important testing phase; it verifies the SOMS functionality that CDCR signed off in the HLD documents. In system and system integration testing, all requirements are tested in the context of overall system requirements. The HLD documents specify the flow of all business scenarios. The HLD is used to create requirements in Borland StarTeam, and test cases are then written to test each business scenario. All requirements in Borland StarTeam are mapped to test cases in HP Quality Center, and this mapping can be shared with CDCR to verify that all requirements are covered by test cases.
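The requirements-to-test-case coverage check described above can be sketched as follows. This is an illustration only: the requirement and test-case identifiers are hypothetical, and the in-memory structures merely stand in for whatever export format the requirements and test tools actually provide.

```python
# Sketch of a requirements-to-test-case coverage check, assuming hypothetical
# exports: a set of requirement IDs, and a mapping of each test case to the
# requirement IDs it covers. All identifiers are illustrative.

requirements = {"REQ-001", "REQ-002", "REQ-003"}   # e.g. a requirements-tool export
test_case_coverage = {
    "TC-10": {"REQ-001"},
    "TC-11": {"REQ-001", "REQ-003"},
}                                                   # e.g. a test-tool export

def uncovered(requirements, coverage):
    """Return the requirements with no mapped test case, sorted for reporting."""
    covered = set().union(*coverage.values()) if coverage else set()
    return sorted(requirements - covered)

print(uncovered(requirements, test_case_coverage))  # ['REQ-002']
```

A gap report like this is what the shared mapping enables: any requirement left in the result has no test case and must be addressed before testing is considered complete.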
Specific types of testing conducted during system and system integration testing include the following:
Smoke test
Functional testing
Interface testing
Regression testing
The requirements for SOMS system and system integration testing include the following:
The in-scope functions list is based on HLD documents (prepared on the basis of RFP requirements) and the change requests approved by the CDCR team.
HLD documents prepared by the project teams are considered the basis for testing activities. Any changes to the detailed functional specifications, once baselined, go through a change management process to assess the impact on the testing effort and schedule.
All interface connections are tested using live connections in the user acceptance testing environment. Stubs are used when the interface connections are unavailable. EDS will work with the CDCR SOMS team to get appropriate access to the interfaces. All third-party interfaces will be available for testing through a testing environment that can be connected from the DTS data center.
The infrastructure, hosted in the DTS environment for system testing (FTP-based interfaces), is ready and configured correctly.
All HLD documents and business rules documents are signed off before testing begins. EDS will also get concurrence on all the HLD changes before completing system and system integration testing (as some of the changes may come during system and system integration testing).
System Testing
System testing is performed when the software is functioning as a whole, or when well-defined subsets of its behavior are implemented and the software under test has successfully progressed through the formal unit and component integration test levels. System testing validates the functional and structural stability of the application or system, as well as nonfunctional requirements such as reliability, security, and interoperability. In this test level, testing is concerned with the behavior of the whole system, not with the workings of individual components. Tests may be based on risks, requirements specifications, business processes, use cases or other high-level descriptions of system behavior, and interactions with the operating system or system resources. Testing investigates both functional and nonfunctional requirements.
Functional test types addressed by system testing include:
System transactions
System processes
System functionality
Business function
Integrated functionality
Application security
Accessibility
Nonfunctional test types addressed by system testing include:
Conversion
Data integrity
Installability
Legal and regulatory
Compatibility
Portability
Privacy
Application security, ensuring that users are restricted to specific functions or are limited to the data made available to them
System security, ensuring that only those users granted access to the system are capable of accessing the applications through the appropriate gateways
Usability
Infrastructure security
Each SOMS release undergoes rigorous system testing. System tests are prepared and performed based on the approved requirements documents, the application architecture, and the application design as specified by the Development team. Tests are created to exercise the specific business functionality selected and approved by CDCR. This verifies both that the functions perform as specified in the requirements documents and that the system as designed supports the client’s business.
System test cases are based on two types of scenarios: functions and roles. Functional testing validates all the functionality associated with functional requirements and data flows through use of the application. Role-based testing verifies that access to the application, and to menus, submenus, buttons, and icons, is complete and correct for each defined role, and that the process flow is complete and correct for each role. Authorization checks can be performed by entering an invalid logon ID or unauthorized access codes to determine whether screens or windows are displayed when they should not be. Test cases verify that specific access codes take the user to where (and only to where) access rules permit.
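The role-based checks described above can be sketched as follows. The roles, menu names, and permission table are hypothetical, chosen only to show the shape of positive and negative authorization tests: each role must see exactly its permitted menus, and an unknown logon must see nothing.

```python
# Sketch of role-based access checks (roles and menus are hypothetical).
# Positive tests confirm permitted access; negative tests confirm that
# unauthorized menus stay hidden and unknown logons get no access at all.

ROLE_MENUS = {
    "records_clerk": {"intake", "case_notes"},
    "supervisor":    {"intake", "case_notes", "approvals"},
}

def accessible_menus(role):
    """Return the menus a role may open; unknown roles get nothing."""
    return ROLE_MENUS.get(role, set())

def can_access(role, menu):
    return menu in accessible_menus(role)

assert can_access("supervisor", "approvals")          # positive check
assert not can_access("records_clerk", "approvals")   # menu hidden for this role
assert accessible_menus("invalid_logon") == set()     # invalid logon gets no access
print("role-based access checks passed")
```

The negative cases are the ones the narrative emphasizes: they catch screens that are displayed when they should not be.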
Once sufficient components have been developed to constitute a self-contained subsystem, and the subsystem has completed component integration testing, the application can be released to system testing for black box testing that verifies that all of its business functions interact correctly with all other business functions. During this type of testing, the database is loaded with sample production data. This provides a foundation for static information as well as initial data for batch and month-end processes.
System Integration Testing
The purpose of system integration testing is to confirm that the necessary setup and communication exists with respect to interfaces and reports so that functional testing can be performed. System integration testing addresses the need for the product under test to interface with other applications or systems without interfering with their operations.
System integration testing is performed once the interface code is available from the Development team. Early execution of these tests is recommended because this may be some of the most complex testing – it normally involves the communication and security layers and requires coordination with outside organizations. It can be performed at the same time as system testing or late in component integration testing.
Initially, this testing involves the use of drivers, since not all testing components, particularly data, may be available in the current environment. This approach also gives the tester direct control over the transactions being sent to the external entity, eliminating the potential for functionality errors in client or GUI code modules. Ideally, the tests are performed without the use of stubs and drivers. Once the front-end code has been tested, these tests are re-executed.
When testing with an external organization, we provide a copy of all test cases for review and to permit coordination of test execution and synchronization of data. We identify each interface and provide details of the type of tests that are performed and any tools that may be required. The following is an example of the types of information that could be provided for an interface to be tested:
Named applications or systems with which to interface
Messaging format, medium, and content
Compatibility with products from different vendors
Conversion requirements for hardware, applications, or data
Messaging and transmission protocols
Process timing and sequencing
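The "messaging format, medium, and content" item above implies that incoming interface data must be vetted against agreed rules before it enters the system. The sketch below illustrates that kind of check; the field names, event types, and rules are hypothetical, not a SOMS interface specification.

```python
# Sketch of vetting an incoming interface message against agreed format rules
# before it is passed to the service (field names and rules are hypothetical).

REQUIRED_FIELDS = {"offender_id", "event_type", "timestamp"}
VALID_EVENT_TYPES = {"ADMIT", "TRANSFER", "RELEASE"}

def vet_message(message):
    """Return a list of rule violations; an empty list means the message passes."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - message.keys())]
    if "event_type" in message and message["event_type"] not in VALID_EVENT_TYPES:
        errors.append(f"unknown event_type: {message['event_type']}")
    return errors

good = {"offender_id": "A1234", "event_type": "ADMIT", "timestamp": "2009-01-01T08:00:00"}
bad  = {"offender_id": "A1234", "event_type": "PAROLE"}

print(vet_message(good))  # []
print(vet_message(bad))   # ['missing field: timestamp', 'unknown event_type: PAROLE']
```

Interface test cases of this kind can be shared with the external organization, as described above, so that both sides agree on which messages must be rejected.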
Regression Testing
During the development of system and system integration test cases, candidate test cases for use in regression testing are also identified. Typically, those test cases that test critical components or functionality of the system are selected as regression test cases, and are the primary candidates for automation. The resulting automated test scripts can then be used in the testing of future releases of the system or re-executed as part of the testing of the current project or release as changes to the system are made.
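The selection of regression candidates described above can be sketched as follows. The test-case records and the "critical" flag are illustrative stand-ins for whatever attributes the test-management tool actually records.

```python
# Sketch of selecting regression candidates from system test cases: cases that
# cover critical components or functionality are flagged as regression (and
# automation) candidates. Records and the "critical" flag are illustrative.

test_cases = [
    {"id": "TC-01", "area": "booking",   "critical": True},
    {"id": "TC-02", "area": "reporting", "critical": False},
    {"id": "TC-03", "area": "release",   "critical": True},
]

def regression_candidates(cases):
    """Critical test cases form the regression suite for future releases."""
    return [c["id"] for c in cases if c["critical"]]

print(regression_candidates(test_cases))  # ['TC-01', 'TC-03']
```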
Test Objective: To validate that identified business scenarios are working as desired.
Technique: Execute pre-selected HLD, HLD flow, or function, using valid data, to verify that the expected results are obtained when valid data is used.
Completion Criteria: All planned tests have been executed, and all identified defects have been addressed.
Special Considerations: None.
Performance Testing
Performance testing is designed to verify that the developed programs and system work as required at expected usage levels. The objective is to verify that the product is structurally sound and will function correctly at peak operation. Performance testing determines that the technology has been used properly and that, when all the component parts are assembled, they function as a cohesive unit that meets response time requirements. The techniques are not designed to validate that the application system is functionally correct, but rather, that it is structurally sound and reliable.
Performance testing is conducted to evaluate the compliance of a program component or system with specified performance requirements or service levels. These may include subsets related to stress, volume, reliability, and load. Performance testing requires use of sophisticated tools that generate high levels of use, monitor system throughput, and provide reports on the results. The resulting consistent and repeatable tests identify any bottlenecks in resource utilization, transaction response times, and overall system performance.
The following are high-level goals for performance testing:
Validate that documented SLAs and requirements regarding the performance of the application and infrastructure have been satisfied
Verify that the system performs as required under expected usage levels
Identify points of system degradation or bottlenecks
Identify system capacity and limitations on specific components
Identify causes of poor performance of the business functions
Reduce implementation risk
EDS application developers and architects incorporate performance testing into their existing development activities, regardless of language type, to develop code that meets performance requirements. Developers conduct performance testing at the unit and component integration test levels to make sure that the application performance requirements have been met. At or near the completion of system testing, performance testing of the application and infrastructure is required. It tests the performance of the application and infrastructure in a production-like environment to verify that the system meets or exceeds SLA requirements for performance response time.
Performance testing uses the following high-level approach:
Identify business and user processes to be used in the development of testing scenarios
– The workload to be generated for the performance tests consists of three components: the number of users, the critical business functions that must be executed, and the creation of a workload profile. The workload profile is calculated as the estimated number of users times the estimated number of transactions (that is, the sum of the information defined in the user activity profiles, which are created on the basis of the workload anticipated in the current production system).
Develop and automate testing scenarios that align with and simulate user workload
Execute tests and analyze the results to identify and resolve bottlenecks by completing:
o Consistent load test runs simulating the average workload
o Consistent load test runs simulating the peak workload
o Stress test runs at double the expected peak workload
o An endurance test run simulating the average workload
o A formal report of test results and appropriate recommendations
The performance test runs to be executed are:
Two consistent load test runs simulating the average workload
Two consistent load test runs simulating the peak workload
One stress test run at double the expected peak workload
One twelve-hour endurance test run simulating the average workload
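The workload-profile arithmetic described above (estimated users times estimated transactions, summed over the user activity profiles) can be sketched as follows. The profile figures and the peak factor are illustrative assumptions, not SOMS estimates.

```python
# Sketch of the workload-profile calculation: estimated users times estimated
# transactions per user, summed over user activity profiles. All figures are
# illustrative; the 1.5 peak factor is an assumption for the example.

activity_profiles = [
    {"profile": "intake officer", "users": 120, "tx_per_user_per_hour": 15},
    {"profile": "records clerk",  "users": 300, "tx_per_user_per_hour": 8},
]

def workload_per_hour(profiles):
    """Total hourly transaction workload across all user activity profiles."""
    return sum(p["users"] * p["tx_per_user_per_hour"] for p in profiles)

average_workload = workload_per_hour(activity_profiles)   # 120*15 + 300*8 = 4200
peak_workload = round(average_workload * 1.5)             # assumed peak factor
stress_target = peak_workload * 2                         # stress runs at double peak

print(average_workload, peak_workload, stress_target)     # 4200 6300 12600
```

The three figures correspond to the run types listed above: consistent load runs at the average, consistent load runs at the peak, and a stress run at double the expected peak.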
The following performance metrics are captured and analyzed:
Throughput and concurrent user goals
Response time acceptability
Server utilization (CPU, memory, disk I/O, number of packets)
Future performance growth
Server errors acceptability
Memory leaks or other stability issues
Processing delays
Application failover
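Evaluating the captured metrics against agreed acceptance thresholds can be sketched as follows. The metric names and SLA values here are hypothetical, not SOMS service levels; the point is only the shape of a pass/fail comparison.

```python
# Sketch of checking captured performance metrics against illustrative
# acceptance thresholds (the SLA values are hypothetical, not SOMS SLAs).

thresholds = {"response_time_p90_s": 3.0, "cpu_utilization_pct": 75.0, "error_rate_pct": 1.0}
measured   = {"response_time_p90_s": 2.4, "cpu_utilization_pct": 81.0, "error_rate_pct": 0.2}

def sla_violations(measured, thresholds):
    """Return the metrics whose measured value exceeds its threshold."""
    return sorted(m for m, limit in thresholds.items() if measured[m] > limit)

print(sla_violations(measured, thresholds))  # ['cpu_utilization_pct']
```

Any metric returned by the check becomes a bottleneck candidate for the analysis and recommendations in the formal test report.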
Application performance is monitored and analyzed to determine if any immediate actions must be taken to verify that performance does not become an issue during later stages of testing. Defined performance testing processes, practices, and procedures are followed as documented in the Enterprise Testing
Method (see ETM Plan and Prepare for Performance Testing, http://semethod.iweb.eds.com/online/etm/6418-Plan-Instance.htm).
The database is loaded with production-like volumes. When the workload profiles are developed, the number of data rows that the virtual users require during the testing time determines the amount of specific data needed for each script.
Performance testing may use two sources of test data: application data and data generated by the performance test itself.
Consolidated Integration Testing
The purpose of consolidated integration testing is to establish confidence that, when introduced into the production environment, the new or revised systems and services:
Will operate correctly and deliver the required functionality
Will not cause undesired functional effects in systems with which they interact or coexist
Will not cause undesired nonfunctional effects in systems with which they interact or coexist
Will not cause undesired effects to data shared with other systems with which they interact or coexist
Consolidated integration testing includes the end-to-end testing of software including interfaces with external entities and batch and online software. All parts of the system should be available. Entering data through normal interfaces will test interaction within the application as well as between the various interrelated systems.
There are three main themes in consolidated integration testing:
Testing of the new software in a production-like environment in company with other software with which it is required to interact
Testing of the new software in a production-like environment in company with other software with which it is required to coexist but not interact
Testing to establish that the new software has only the intended effects on databases and the data in them, whether or not the data is directly manipulated by the new software
Consolidated integration testing concentrates on business process threads that originate in SOMS and exercises those threads that represent the most heavily used and most critical functions that cross one or more interfaces with external systems.
The consolidated integration tests are executed in a separate environment set up to mirror the planned production environment. Production-like test scenarios are prepared by the Testing team (with assistance from business analysts) for each
process and each requirements statement. These scenarios correspond to a logical business process thread, usually comprising several business tasks. The Testing team may supplement these scenarios with test cases provided by the System Testing team. When consolidated integration testing has achieved its exit criteria, the new or revised systems and services are considered ready for promotion to the production deployment phase.
Consolidated integration testing is performed for each SOMS release.
Acceptance Testing
Acceptance testing combines both functional and structural system testing to verify that the developed system works as a whole and meets business and operational requirements. The objective is to verify that the product is structurally sound and functions correctly. It determines that the technology has been used properly and that, when the component parts are assembled, they function as a cohesive unit. Acceptance testing consists of the following test levels:
User acceptance testing
Operational acceptance testing
User Acceptance Testing
The objective of user acceptance testing (UAT) is to validate that an application or system meets its business and user requirements, and to provide confidence that the application or system is usable before it is formally delivered for use by end users for business purposes. UAT is conducted by testing analysts and testers provided by the business area that is responsible for the application or system.
UAT employs a black box approach, with a focus on validating the high-level business requirements of the application or system. In practice, the UAT analysts and testers develop and execute test cases based on the tasks that they need to perform during normal, day-to-day use of the application or system.
UAT also tests user support materials (such as user guides, online help, training materials) to validate their accuracy and completeness.
Upon completion of system and system integration testing, the full application is given to CDCR for user acceptance testing. UAT uses the following high-level approach:
Participate in static testing of the business and user requirements for the application or system, including:
– Ambiguity analysis
– Verification that requirements are correct, complete, consistent, and traceable
Analyze the business and user requirements for the application or system and identify:
– The high-level business and user processes and workflows
– The functional and nonfunctional requirements associated with business and user processes and workflows
Examine the user documentation, training materials, and online help as sources of testing requirements
Create production-like test scenarios for each process and each requirements statement. These scenarios correspond to one or more business tasks that form a logical business function.
Use the above information to generate a set of test cases to validate that the application or system supports the completion of user tasks specified in the user requirements to achieve the business objectives specified in the business requirements, describing in detail the steps to be followed during testing
Formally report test results and provide appropriate recommendations
The UAT team may supplement these scenarios with test cases provided by the System Testing team. EDS provides System and System Integration Testing team members to support development of the UAT Test Plan and test cases, and execution of the tests by the UAT team.
Operational Acceptance Testing
The objective of operational acceptance testing is to validate the operational and administrative aspects of the application or system, such as:
Confirming installation by ensuring that the software can be installed on the necessary configurations (such as a new installation, an upgrade, and a complete or custom installation), under normal and abnormal conditions
Confirming registration of new users and assignment of their privileges
Confirming the backup, archiving, and restoration of the application/system and its data
Confirming the portability of an application or system that must operate consistently on more than one platform or with more than one operating system
Confirming the recoverability of the application or system in the event of service interruption
Confirming resilience by subjecting the system to an average load while creating a failure condition on one hardware component, to validate that the remaining hardware picks up the load and that the system continues to operate as required when hardware capacity is reduced
Confirming failover and recovery by ensuring that, for those systems that must be kept running when a failover condition occurs, the alternate or backup systems take over from the failed system without loss of data or transactions
Operational acceptance testing is conducted by test analysts and testers provided by the operations area that is responsible for supporting and administering the application or system after it is deployed into production.
Operational acceptance testing uses black box testing techniques to validate that the application or system meets its operational requirements. It should also address the testing of system documentation, that is, operating and administration guides, to validate their accuracy and completeness.
Operational acceptance testing uses the following high-level approach:
Participate in static testing of the requirements for the application or system, including:
– Ambiguity analysis
– Verification that requirements are correct, complete, consistent, and traceable
Analyze the user and system requirements for the application or system and identify the high-level operational requirements
Identify any requirements for the application or system to communicate with other systems, along with the means of communication
Review the environment in which the live system runs in order to identify any interoperability or compatibility issues with other systems
Examine the requirement for testing operating procedures (such as procedures for installation, failover, backup, and recovery) and system documentation (such as administration guides)
To test the application or system, use the above information to generate a set of test cases that describe in detail the steps to be followed during testing
Formally report test results and provide appropriate recommendations
M-49 The Contractor shall be responsible for building test plans, executing test plans, and creating reports. CDCR will evaluate the Contractor test plans, and Contractor test results, as well as validate the testing done by augmenting it with their own testing.
N/A
Bidder Response Detail:
Team EDS understands and complies with this requirement and confirms that Team EDS will be responsible for building Test Plans, executing Test Plans, and creating reports. We understand that CDCR will evaluate the Team EDS Test
Plans and test results, and will validate this testing by augmenting it with CDCR’s own testing.
Table 6.5-5, Deliverables EDS Will Produce for Each Test Level, summarizes the deliverables that will be produced for each test level to be performed for a SOMS release.
Table 6.5-5, Deliverables EDS Will Produce for Each Test Level
Work Product: Test Plan
Testing Life-Cycle Phase: Planning
Purpose: The primary means by which each testing team communicates what it plans to test, its environmental and resource needs, and the schedule for each testing level that it will perform.
Inputs can include:
– Documented and agreed business requirements
– Agreed change requests
– E2E testing strategy
Activities include defining:
– Scope (in and out)
– Objectives
– Project/release-specific entry and exit criteria
– Tasks and deliverables
– Assumptions, constraints, and risks
– Resource requirements (source information, equipment, software, data, personnel, tools)
– Team roles and responsibilities
– Training Plan for testers (if required)
– High-level test scenarios

Work Product: Test Scenarios and Test Cases
Testing Life-Cycle Phase: Design
Purpose: Test scenarios are groupings of selected test cases that follow a logical sequence or common grouped business process. Test cases define the conditions to be tested and their assigned priorities; prerequisites, detailed test procedures, and expected test results must also be included.
Inputs can include:
– Approved functional specifications and design documents
– Approved technical specifications
– Test Plan for the testing level
Activities include:
– Reviewing and refining test scenarios
– Creating test cases
– Defining test data
– Defining expected results

Work Product: Test Scripts
Testing Life-Cycle Phase: Test Execution
Purpose: Automated instances of selected test cases.
Inputs can include:
– Test cases
– Test data
– Test results
Activities include:
– Executing tests (cases and scripts)
– Verifying test results
– Documenting actual results
– Identifying defects
– Re-executing tests

Work Product: Testing Progress Reports (Ongoing)
Purpose: Regular reports on testing progress for each test level based on the following metrics:
– Number of test cases planned
– Number and percentage of test cases executed
– Number and percentage of test cases closed
– Number of defects by status (severity and priority)

Work Product: Test Summary Report (Final)
Testing Life-Cycle Phase: Closedown
Purpose: Summary of the final outcome of test execution for each test level:
– Number of test cases planned
– Number and priority of open test cases
– Number, severity, and priority of unresolved defects
– A recommendation on the readiness of the system for the next testing level or project phase
Inputs consist of recorded test results.
Activities include:
– Assessing the status of testing (number of test cases planned, number and percentage of test cases executed, number and percentage of test cases closed, number of defects by status)
– Preparing and delivering a recommendation
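The progress metrics listed for the Testing Progress Reports can be computed mechanically from test-case status records, as sketched below. The status values and counts are illustrative, not a SOMS data model.

```python
# Sketch of computing testing-progress metrics from test-case status records
# (status values and counts are illustrative).

cases = [{"id": f"TC-{i:02d}", "status": s}
         for i, s in enumerate(["closed"] * 6 + ["executed"] * 2 + ["planned"] * 2)]

def progress_report(cases):
    """Planned, executed, and closed counts with percentages of the plan."""
    total = len(cases)
    executed = sum(c["status"] in ("executed", "closed") for c in cases)
    closed = sum(c["status"] == "closed" for c in cases)
    return {
        "planned": total,
        "executed": executed, "executed_pct": 100.0 * executed / total,
        "closed": closed,     "closed_pct": 100.0 * closed / total,
    }

print(progress_report(cases))
# {'planned': 10, 'executed': 8, 'executed_pct': 80.0, 'closed': 6, 'closed_pct': 60.0}
```

Defect counts by status, severity, and priority would be aggregated the same way from the defect-tracking records.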
The EDS testing consultants will work with CDCR to verify that the SOMS Master Test Plan provides a mutually agreed definition of user acceptance testing, including roles and responsibilities, deliverables, and entry and exit criteria. The Release Testing Manager will also work with CDCR to verify that the Release Test Plan for each SOMS release includes the appropriate level of effort and time for completion of user acceptance testing.
M-50 The Contractor shall provide staff to CDCR to answer questions and address any problems that may arise during testing conducted by CDCR.
N/A
Bidder Response Detail:
Team EDS understands and complies with this requirement and confirms that Team EDS will provide staff to CDCR to answer questions and address any problems that may arise during testing conducted by CDCR.
The Release Testing Manager and the System and System Integration Testing Lead will support the development of the User Acceptance Test Plan for each release. EDS test analysts and testers from the System and System Integration Testing team are assigned to support the development and execution of user acceptance testing. In addition, EDS will offer CDCR user acceptance testing resources the opportunity to act as test witnesses during system and system integration test execution and will provide access to applicable system and system integration test cases.
M-51 The Contractor shall refine the test documents, procedures, and scripts throughout development and through full system acceptance to reflect the as-built design and current requirements.
N/A
Bidder Response Detail:
Team EDS understands and complies with this requirement and confirms that Team EDS will refine the test documents, procedures, and scripts throughout development and through full system acceptance to reflect the as-built design and current requirements. Table 6.5-6, Sources of Information for Test Planning and Preparation Activities, provides a view, for each test level, of the dependencies between test planning and preparation activities and their sources of information. It also indicates the project phase and release phase in which test planning and preparation can take place for each test level.
For each SOMS release, we will complete test planning activities and produce the outputs for each test level as specified in the Release Test Plan. Initial requirements and agreed change requests drive initial development and changes to downstream deliverables (including testing deliverables), starting with the Release Test Plan.
Table 6.5-6, Sources of Information for Test Planning and Preparation Activities

Per Release: All Test Levels
Primary Information Sources: Project Charter; Project Scope Document; Business Requirements; User Requirements; Agreed Change Requests
Output: Release Test Plan

Per Test Level: Unit Testing
Primary Information Sources: Detailed Design
Output: Unit testing input to the Release Test Plan

Per Test Level: Component Integration Testing
Primary Information Sources: Application Architecture; Detailed Design
Output: Component integration testing input to the Release Test Plan

Per Test Level: System Testing
Primary Information Sources: System Requirements/Use Cases; Application Architecture
Output: System testing input to the Release Test Plan (including system integration testing, if required)

Per Test Level: User Acceptance Testing
Primary Information Sources: Business Requirements; User Requirements
Output: User acceptance testing input to the Release Test Plan

Per Test Level: Performance Testing
Primary Information Sources: Business Requirements; User Requirements (End User); System Requirements/Use Cases
Output: Performance testing input to the Release Test Plan

Per Test Level: Operational Acceptance Testing
Primary Information Sources: User Requirements (Operational User); System Requirements/Use Cases
Output: Operational acceptance testing input to the Release Test Plan

Per Test Level: Field Acceptance Testing
Primary Information Sources: User Requirements (End User); System Requirements/Use Cases
Output: Field acceptance testing input to the Release Test Plan
M-52 The Contractor shall allow DTS/CDCR to run validation and testing software against externally facing Internet applications to help identify potential security issues, and must agree to repair any deficiencies found during this testing. DTS currently uses Web Inspect to test Internet applications for security and vulnerability issues.
Section 6.7
Bidder Response Detail:
Team EDS understands and complies with this requirement and confirms that Team EDS will allow DTS/CDCR to run validation and testing software against externally facing Internet applications to help identify potential security issues, and we agree to repair any deficiencies found during this testing. Team EDS understands and acknowledges that DTS currently uses Web Inspect to test Internet applications for security and vulnerability issues.
Team EDS is extremely proactive in building a secure application. As described in proposal Section 6.7, Security, we conduct a threat modeling exercise in the Requirements Phase and identify the aspects of the solution on which potential threats would have an impact. Nonfunctional use cases and test cases are constructed on the basis of these threat and vulnerability resolution measures. In the Design Phase, these measures and controls are embedded in the application and deployment architectures and in the procedures established for system configuration and deployment.
The nonfunctional security test cases will be executed and issues rectified as per the test cycles established for the SOMS project. Team EDS will be extremely open and collaborative in working with DTS/CDCR to run validation and testing software against externally facing Internet applications to help identify potential security issues.
M-53 As Offender Case Management programs contain date and time-sensitive elements, the testing infrastructure must provide a method of altering and synchronizing the system date throughout each test phase. This requires the ability to change the system date and time in some scenarios. The system date may actually reside in multiple systems in a distributed environment and may be difficult to change if other applications also reside on any of those servers.
N/A
Bidder Response Detail:
Team EDS understands and complies with this requirement and confirms that the testing infrastructure provides a method of altering and synchronizing the system date throughout each test phase. Manipulation of dates provides a vehicle to test temporally sensitive aspects of the system such as alerts and sentence computation. For example, by creating test cases in this environment and accelerating the date forward, aspects of sentence computation such as “good time” and “dead time” can be incorporated properly into the release date.
This requirement is fulfilled by two components of the testing environment. First, the version of Oracle offered as part of the SOMS solution allows the database date to be manipulated independently of the server system date. Second, the accelerated-time testing environment is housed on its own server partition, allowing the system date to be changed without affecting other environments. A network time server is deployed for the accelerated-time environment so that the system date does not have to be adjusted manually on each server.
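The date-acceleration technique can be illustrated with a minimal sketch. Everything here is hypothetical (the class and function names, and the simplified credit rules, are ours for illustration and are not the SOMS implementation): code under test reads the current date from an injectable clock rather than the operating system, so a test can jump forward a simulated year and check that good-time and dead-time credits are reflected in the projected release date.

```python
from datetime import date, timedelta

class TestClock:
    """Hypothetical injectable clock: code under test asks this object for
    'today' instead of reading the OS clock, so a test can accelerate time."""
    def __init__(self, start: date):
        self._today = start

    def today(self) -> date:
        return self._today

    def advance(self, days: int) -> None:
        self._today += timedelta(days=days)

def projected_release(start: date, sentence_days: int,
                      good_time_days: int, dead_time_days: int) -> date:
    # Illustrative rule only: good time shortens the sentence,
    # dead time (uncredited time) extends it.
    return start + timedelta(days=sentence_days - good_time_days + dead_time_days)

# Accelerate one simulated year without touching any server's system date.
clock = TestClock(date(2010, 1, 1))
clock.advance(365)

# A two-year sentence with 60 days of good time and 10 days of dead time.
release = projected_release(date(2010, 1, 1), 730, 60, 10)
```

In a real accelerated-time environment the equivalent of `TestClock` would be the partition-level system date and the Oracle database date, but the testing idea is the same: the date source is a controllable input, not wall-clock time.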
6.5.D Problem Resolution Requirements
VI.D.8.c. Problem Resolution Requirements
Req. Num. | Requirement | Response / Reference
M-54 The Contractor must develop a comprehensive Problem Resolution Management Plan that describes the approach to be taken in managing all problems discovered during any testing phase and in production.
N/A
Bidder Response Detail:
Team EDS understands and complies with this requirement and confirms that Team EDS will develop a comprehensive Problem Resolution Management Plan that describes the approach to be taken in managing all problems discovered during any testing phase and in production. The SOMS Master Test Plan documents the problem resolution management process, based on agreed process flow, roles and responsibilities, criteria for assigning severity and priority, and defect resolution turnaround time based on defect level. This process uses the defect levels specified by CDCR in requirement M-55 and summarized as follows:
Critical – Results in a complete system outage or is detrimental to the majority of the development or testing efforts. There is no workaround.
Serious – System functionality is degraded with severe adverse impact to the user and there is not an effective workaround.
Moderate – System functionality is degraded with a moderate adverse impact to the user but there is an effective workaround.
Minor – There is no immediate adverse impact to the user.
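The four definitions above reduce to two factors: how badly functionality is degraded, and whether an effective workaround exists. A minimal sketch of that classification rule (the enum and function names are ours for illustration, not part of the plan):

```python
from enum import Enum

class DefectLevel(Enum):
    CRITICAL = 1
    SERIOUS = 2
    MODERATE = 3
    MINOR = 4

def classify(system_outage: bool, severe_impact: bool,
             moderate_impact: bool, has_workaround: bool) -> DefectLevel:
    """Mirror of the defect-level definitions quoted above."""
    if system_outage and not has_workaround:
        return DefectLevel.CRITICAL
    if severe_impact and not has_workaround:
        return DefectLevel.SERIOUS
    if moderate_impact and has_workaround:
        return DefectLevel.MODERATE
    return DefectLevel.MINOR
```

Encoding the criteria this way makes the boundary cases explicit, for example that a severe impact with an effective workaround is not Serious under these definitions.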
Figure 6.5-4, Problem Management Action/Status Workflow, shows a typical problem management action/status workflow. The following description identifies the status that the defect report would have and the person that would be notified of the defect.
[Figure: defect statuses Open, Assigned, Repaired, Released, Tested, Closed, Returned, and Suspended, connected by the actions Enter, Assign, Repair, Build, Validate, Close, Return, Re-Open, Suspend, and Reject. Actors are the originator, Development Manager, developer, Configuration Manager, System Test Lead, and Acceptance Test Lead, across the development (unit and component integration testing), CM, system test, and acceptance test stages.]
Figure 6.5-4, Problem Management Action/Status Workflow
The typical process for rectifying a defect found during test execution is the following:
1. Tester or user finds fault; tester validates the fault, documents it, and reviews it with the team leaders (optional); opens new defect. (Status = Open; notifies Development Manager)
2. SOMS Project Management designates the level of severity of the problem.
3. If in Production Phase: CDCR Project Management team determines whether the error should be corrected and retested prior to the system entering production.
4. If the error is to be fixed, Development Manager assigns defect to developer. (Status = Assigned; notifies developer)
5. Developer repairs the defect, checks in the code, performs component integration test, and updates the defect report. (Status = Repaired; notifies Development Manager)
6. Development Manager decides to promote the fixed code; completes promotion form. (Notifies Configuration Manager)
7. Configuration Manager stores the code safely, placing it in accessible directories, and updates the defect report. (Status = Released; notifies System Test Lead)
8. System tester retests the fixed fault and updates the defect report. (Status = Tested; notifies Acceptance Test Team Lead)
9. Acceptance Tester retests the fixed fault and updates the defect report. (Status = Closed)
These alternative flows are possible:
1. Return Defect
Developer returns the defect. (Status = Returned; notifies originator) The originating tester accepts the reason for returning the defect and updates the defect report. (Status = Closed)
Developer returns the defect. (Status = Returned; notifies originator) The originating tester does not agree with the developer and re-opens the defect. (Status = Open; notifies Development Team Lead)
The defect is not a priority, or the developer cannot fix it in the current release, so the developer returns the defect. (Status = Returned; notifies originator) The Defect Review Group (DRG) agrees to suspend the defect to a future release. (Status = Suspended)
2. Re-open Defect
During retesting of defect, the tester finds that the defect still exists and re-opens defect. (Status = Open; notifies Development Manager)
Configuration Manager finds that the fixed software will not build and re-opens defect. (Status = Open; notifies Development Manager)
3. Early Close
Depending on the test phase in which the defect was found, once the defect has been fixed and tested in that phase, the defect is closed by a tester in that phase.
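The normal and alternative flows above amount to a small state machine. A hypothetical sketch, using the status and action names from the workflow description (this is our illustration, not project code; only the transitions spelled out in the text are encoded, so actions such as Enter and Reject are omitted):

```python
# (status, action) -> next status, per the workflow described above.
TRANSITIONS = {
    ("Open", "Assign"): "Assigned",
    ("Assigned", "Repair"): "Repaired",
    ("Assigned", "Return"): "Returned",
    ("Repaired", "Build"): "Released",
    ("Released", "Validate"): "Tested",
    ("Released", "Re-Open"): "Open",      # fixed software will not build
    ("Tested", "Close"): "Closed",
    ("Tested", "Re-Open"): "Open",        # defect still exists on retest
    ("Returned", "Close"): "Closed",      # originator accepts the return
    ("Returned", "Re-Open"): "Open",      # originator disagrees
    ("Returned", "Suspend"): "Suspended", # DRG defers to a future release
    ("Suspended", "Re-Open"): "Open",
}

def apply_action(status: str, action: str) -> str:
    """Return the next status, rejecting actions the workflow does not allow."""
    try:
        return TRANSITIONS[(status, action)]
    except KeyError:
        raise ValueError(f"action {action!r} is not valid from status {status!r}")

# Happy path from the numbered steps above.
status = "Open"
for action in ("Assign", "Repair", "Build", "Validate", "Close"):
    status = apply_action(status, action)
```

Making the transition table explicit is one way a tracking tool enforces the workflow: any action not in the table is rejected rather than silently changing the defect's status.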
To verify the timely resolution of defects and to monitor the assignment of defect levels, the Release Testing Manager is responsible for managing all open defects for a release and establishing a Defect Review Group (DRG) with representation from both EDS and CDCR. The DRG is established as soon as system and system integration testing begins. The members of the group determine the schedule for their meetings; it is anticipated that the meetings occur daily. The DRG is responsible for assigning priority to defects.
DRG members meet regularly to make decisions on defect resolution that are designed to move the project forward toward the published implementation dates. The DRG’s primary responsibilities are as follows:
Track defects
Review defects and assess the appropriateness of the severity code
Reassign defects for retesting upon resolution
Receive results of testing and take the appropriate action, for example:
– Accept defects
– Reclassify or reject defects
– Close defects (because of redundancy, for example)
– Escalate the defect and invoke the Rapid Decision-Making Group, if warranted
In making the above decisions, the DRG considers the following items:
Availability of environments and remaining testing time frames per testing type
Functionality and criticality of test case
Subsequent phases and types of testing
Availability of resources
Validity of defects raised (for example, an item raised as a defect that is actually deemed a change request)
Impact on test schedule
For each test level, the tester allocates a defect level to each defect based on the criteria specified in the Problem Resolution Management Plan and provides sufficient data to support reproduction of the defect.
With the exception of defects raised during unit testing and component integration testing, defects encountered during execution of each test level are formally entered into a defect report in HP Quality Center™ and tracked to closure. For component-level testing, only those defects that remain unresolved when a build is submitted for acceptance for system and system integration testing are entered into Quality Center™, along with a documented workaround and a scheduled Corrective Action Plan. The Release Testing Manager must agree that these unresolved defects do not pose an unacceptable risk and that system testing can proceed.
M-55 The Contractor shall install and test a single Problem Resolution Tracking System that the Contractor and CDCR shall use collaboratively for the tracking of system defects, security, and system issues. The Problem Resolution Tracking System must, at a minimum, include:
All defects in the solution identified during any testing phase or in production must be recorded, prioritized, tracked, and resolved in a timely manner. Each must be assigned a “Defect Level” based on the following definitions:
– Critical - Results in a complete system outage and/or is detrimental to the majority of the development and/or testing efforts. There is no workaround.
– Serious - System functionality is degraded with severe adverse impact to the user and there is not an effective workaround.
– Moderate - System functionality is degraded with a moderate adverse impact to the user but there is an effective workaround.
– Minor - No immediate adverse impact to the user.
The Contractor shall allow CDCR full access to the Problem Resolution Tracking System.
The Problem Resolution Tracking System shall be designed in a manner to allow for the transfer of ownership to the State following contract completion.
The processes and management of the Problem Resolution Tracking system shall be addressed as part of the Contractor Quality Management Plan.
The Contractor shall comply with the “Defect Level” approach as described above, including the requirement that SOMS Project Management shall designate the level of severity to all defects. Critical and serious defects (incidents) shall require remediation and retesting before the system enters production. Moderate and Minor defects shall be fixed and tested to CDCR’s satisfaction prior to system acceptance.
The Problem Resolution Tracking System shall provide a classification and tracking method for system or application errors that describes the severity of the defect. CDCR will determine whether that error shall be corrected and re-tested prior to the system entering production.
N/A
Bidder Response Detail:
Team EDS complies with this requirement and confirms that Team EDS will install and test a single Problem Resolution Tracking System that the EDS SOMS team and CDCR will use collaboratively for the tracking of system defects, security issues, and system issues. We also comply with and confirm the following:
All defects in the solution identified during any testing phase or in production are recorded, prioritized, tracked, and resolved in a timely manner. Each is assigned a defect level based on the definitions provided by CDCR in requirement M-55.
Team EDS will allow CDCR full access to the Problem Resolution Tracking System.
The proposed Problem Resolution Tracking System will be designed in a manner to allow for the transfer of ownership to the State following contract completion.
The processes and management of the Problem Resolution Tracking System are addressed as part of the Quality Management Plan.
Team EDS will comply with the defect level approach described by CDCR in requirement M-55, including the requirement that SOMS Project Management designate the level of severity for all defects. Critical and serious defects (incidents) will be remediated and retested before the system enters production. Moderate and minor defects will be fixed and tested to CDCR’s satisfaction prior to system acceptance.
The proposed Problem Resolution Tracking System will provide a classification and tracking method for system or application errors that describes the severity of the defect. We understand that CDCR will determine whether that error is corrected and retested prior to the system entering production.
Team EDS will use HP Quality Center™ to automate testing management and control for high-level testing, including service testing and system and system integration testing performed by EDS, and user acceptance testing performed by CDCR. Quality Center combines test planning, test execution, and defect tracking with an open architecture and object-based repository. This tool supports documentation of testing work products, management control, and mapping to requirements. Various facilities for reporting on test coverage and progress are available to assist in the management and control of testing activity.
Management of testing activities covers activities associated with the management of test planning, design, development, execution, and analysis. Quality Center provides a single point from which it is possible to determine and track the exact state of testing activities for all test levels.
Quality Center facilitates an efficient, cost-effective testing process by:
Incorporating all aspects of the testing process – requirements management, planning, scheduling, running tests, defect management, and testing status analysis – into a single application
Linking requirements to test cases, and test cases to defects, to verify traceability throughout the testing cycle
Providing customizable, integrated graphs and reports that support analysis of requirements coverage, planning progress, testing progress, and defect status as a basis for informed decision making
Notifying developers of open problem reports that require their attention
Notifying testers of changes in requirements or fixes to defects
Supporting scheduling and running unattended automated tests, allowing for overnight test runs
Quality Center will also be set up to issue e-mail messages to the Development and Testing teams to notify them of new defects or changes in the status of existing defects.