Sample Master Test Plan


MASTER TEST PLAN

Version: 1.0

Prepared for

[Enter Company Name Here]

[Name] Improvement Project

Revision History


Date Version Description Author


TABLE OF CONTENTS

1 INTRODUCTION
  1.1 PURPOSE
  1.2 SCOPE
    1.2.1 Product Description
    1.2.2 The Current Product Version
    1.2.3 Updating the Product
    1.2.4 High-Level Product Development Objective
  1.3 TEST PLAN OBJECTIVES

2 TEST ITEMS (FUNCTIONS)
  2.1 CLIENT APPLICATION
  2.2 QUICK HELP TESTING
  2.3 LICENSE KEY
  2.4 SECURITY

3 SOFTWARE RISK ISSUES
  3.1 SCHEDULE
  3.2 TECHNICAL
  3.3 MANAGEMENT
  3.4 PERSONNEL
  3.5 REQUIREMENTS

4 FUNCTIONS TO BE TESTED

5 FUNCTIONS NOT TO BE TESTED

6 TEST APPROACH
  6.1 SPECIAL TESTING TOOLS
  6.2 SPECIAL TRAINING ON TESTING TOOLS
  6.3 TEST METRICS
  6.4 CONFIGURATION MANAGEMENT
  6.5 REGRESSION TESTING
  6.6 REQUIREMENTS MANAGEMENT

7 TEST STRATEGY
  7.1 SYSTEM TEST
  7.2 PERFORMANCE TEST
  7.3 SECURITY TEST
  7.4 AUTOMATED TEST
  7.5 STRESS AND VOLUME TEST
  7.6 RECOVERY TEST
  7.7 DOCUMENTATION TEST
  7.8 BETA TEST
  7.9 USER ACCEPTANCE TEST

8 ENTRY AND EXIT CRITERIA
  8.1 TEST PLAN
    8.1.1 Test Plan Entry Criteria
    8.1.2 Test Plan Exit Criteria
    8.1.3 Suspension and Resumption Criteria
  8.2 TEST CYCLES
    8.2.1 Test Cycle Entry Criteria
    8.2.2 Test Cycle Exit Criteria

9 DELIVERABLES
  9.1 TEST EVALUATION SUMMARIES
  9.2 INCIDENT LOGS AND CHANGE REQUESTS

10 ENVIRONMENT REQUIREMENTS
  10.1 BASE SYSTEM HARDWARE
  10.2 BASE SOFTWARE ELEMENTS IN THE TEST ENVIRONMENT
  10.3 PRODUCTIVITY AND SUPPORT TOOLS

11 RESPONSIBILITIES, STAFFING AND TRAINING NEEDS
  11.1 PEOPLE AND ROLES
  11.2 STAFFING AND TRAINING NEEDS

12 TEST SCHEDULE

13 POTENTIAL RISKS AND CONTINGENCIES

14 CONTROL PROCEDURES
  14.1 REVIEWS
  14.2 BUG REVIEW MEETINGS
  14.3 CHANGE REQUEST
  14.4 DEFECT REPORTING

15 DOCUMENTATION

16 ITERATION MILESTONES

17 MANAGEMENT PROCESS AND PROCEDURES
  17.1 PROBLEM REPORTING, ESCALATION, AND ISSUE RESOLUTION
  17.2 APPROVAL AND SIGNOFF

1 Introduction

1.1 Purpose


The purpose of this Software Quality Assurance (SQA) Plan is to establish the goals, processes, and responsibilities required to implement effective quality assurance functions for the [Project Name] Improvement project.

This [Project Name] Improvement Software Quality Assurance Plan provides the framework necessary to ensure a consistent approach to software quality assurance throughout the project life cycle. It defines the approach that will be used by the Software Quality (SQ) personnel to monitor and assess software development processes and products to provide objective insight into the maturity and quality of the software. The systematic monitoring of the [Project Name] products, processes, and services will be evaluated to ensure they meet requirements and comply with [COMPANY NAME] and [Project Name] policies, standards, and procedures, as well as applicable Institute of Electrical and Electronic Engineers (IEEE) standards.

The overall purpose of this Master Test Plan is to gather all of the information necessary to plan and control the test effort for testing the [Project Name] application. It describes the approach to testing the software, and will be the top-level plan used by testers to direct the test effort.

This plan is designed to create clear and precise documentation of the test methods and processes that [COMPANY NAME] will use throughout the course of the [Project Name] system verification testing.

This plan covers SQA activities throughout the formulation and implementation phases of the [Project Name] mission. SQA activities will continue through operations and maintenance of the system.

This documentation of the test methods and processes will serve as the basis for ensuring that all major milestones and activities required for effective verification testing can be accomplished efficiently and successfully. This Master Test Plan will be modified and enhanced as required throughout the verification testing engagement.

1.2 Scope

The scope of this quality assurance effort is to validate the full range of activities related to the functionality of the flagship product of the [COMPANY NAME], the [Project Name] Course, as it undergoes redesign and rebuilding.

This test plan describes the unit, subsystem integration, and system-level tests that will be performed on components of the [Name] application. It is assumed that, prior to testing, each subsystem will have undergone an informal peer review; only code that has successfully passed a peer review will be tested.

Unit tests will be performed using test driver programs that carry out boundary checking and basic black-box testing.

The scope of this test effort outlines the quality assurance methodology, processes, and procedures used to validate system functionality and the user's ability to navigate through the four major components of the [Project Name] Course solution:

The Prep Course


Practice Exams

The [PRODUCT NAME] Exam

Progress Tracking

The functional testing will include a Prep Course creation and maintenance tool (Admin Tool) for use by internal [COMPANY NAME] staff to facilitate creating new Prep Course, Practice Exam, and [PRODUCT NAME] Exam question-and-answer databases, as well as testing the ability to maintain those databases by adding new questions, correcting errors in existing questions, modifying responses, etc. This internal tool will support the maintenance of multiple versions of the software and should be capable of generating a database suitable for distribution with the software itself. Finally, the test effort will include an internal tool that allows [COMPANY NAME] staff to review results uploaded by users for the Prep Course, Practice Exams, and [PRODUCT NAME] Exam.

1.2.1 Product Description

The [Project Name] Course is a software product sold by the [COMPANY NAME] to its members to help them study and prepare for the [PRODUCT NAME] Exam, which is required for an individual to earn the Certified Fraud Examiner certification. The [Project Name] Course software actually encompasses two different applications: the [Project Name] Course itself, which offers over a thousand sample questions for the member to study, and the actual [PRODUCT NAME] Exam, which administers randomly selected questions from a question database in a timed session.

The [Project Name] Course and [PRODUCT NAME] Exam are offered in US, Canadian, UK, and International editions, and a new revision of each is released annually.

1.2.2 The Current Product Version

Currently, this product exists in multiple purchasable formats, including the stand-alone Prep Course and Exam on CD, a downloadable version of the Prep Course plus Exam, the [PRODUCT NAME] Exam-only CD, a downloadable version of the [PRODUCT NAME] Exam, and the [Project Name] Toolkit, which includes hard-copy study material. In each case, both the [Project Name] Course and the [PRODUCT NAME] Exam software programs are Microsoft Windows-based desktop applications that use an encrypted Microsoft Access database for data storage. Users of the software are initially provided with a license key to unlock the Prep Course.

Upon successful completion of the Prep Course, they are provided a second key to unlock the [PRODUCT NAME] Exam. The user may submit the results of the Exam by exporting an encrypted text file from the software and emailing that file to the [COMPANY NAME] for grading.

1.2.3 Updating the Product

The software has proven to be a very stable and reliable product despite having been developed in 1999; however, the product is showing its age and needs to be re-written to bring it up to current standards. Since the product has proven to be very successful and has received a great deal of positive feedback from users, it is


the intent of this project to keep most of the features of the existing version intact. Additionally, several new features and enhancements will be added to improve the customer experience, to help users work through the Prep Course more efficiently, and to enhance the installation, maintenance, and administration of the software. For this initial release, the product will continue to be offered as an installable product on Windows-based PCs and will not require Internet access, with the exception of a few specific tasks (such as license key verification, transmitting exam results to the [COMPANY NAME], and receiving updates and patches).

1.2.4 High-Level Product Development Objective

The high-level objectives of this project are as follows:

Update the software to a current technology platform while maintaining the core functionality of the original product

Re-design the user interface to take advantage of current technology standards, and to deliver a better user experience

Improve the process for creating new versions of the [Name] by providing tools that allow the addition, modification, and removal of questions from the question database, maintenance of sections and sub-sections, and error-checking routines that ensure the question-and-answer database is assembled correctly

Provide a method for delivering updates to the product electronically and seamlessly to the user

Provide a feature set that helps the user proceed through the certification process faster and more efficiently, from the time the product is purchased to the time the exam results are submitted to the [COMPANY NAME] for evaluation

Allow internal non-technical personnel to manage the creation and maintenance of new [Name] versions

1.3 Test Plan Objectives

This Test Plan for the new [Project Name] Course solution supports the following objectives:

Outline and define the overall test approach that will be used

Identify hardware, software, and tools to be used to support the testing efforts

Define the types of tests to be performed

Define the types of exam data required for effective testing

Define the types of security threats and vulnerabilities against which each exam system will be tested

Identify and establish traceability from the Requirements Matrix to test cases and from test cases to the Requirements Matrix

Serve as a foundation for the development of Test Plans and Test Cases

Define the process for recording and reporting test results

Define the process for regression testing and closure of discrepancies

Identify the items that should be targeted by the tests

Identify the motivation for and ideas behind the test areas to be covered

Identify the required resources and provide an estimate of the test efforts

List the deliverable elements of the test activities

Define the activities required to prepare for and conduct System, Beta, and User Acceptance testing

Communicate the System Test strategy to all responsible parties

Define deliverables and responsible parties

Communicate the various dependencies and risks to all responsible parties

2 Test Items (Functions)

The testing effort will be concentrated on the following functions of the application:

2.1 Client Application

Installation

Uninstall

Activation

Prep Course Home Page

Pre-Assessment

Review Sessions

Practice Exams

Exam Application

Admin Tool

Editions

Revisions

Sections

Subsections

Topics

Demo Questions (create, view, edit, and approve)

Test Search Capability

License Keys

Exam Keys

Review Exam Key Requests

Review Exam Submissions

Issue Reports

2.2 Quick Help Testing

The system will provide [Name] users the ability to access a help menu consisting of links to various options. The following utility options will be tested:

[COMPANY NAME] Reference Manual


Launch [Project Name] Course

Your Contact Information

[COMPANY NAME].com

FAQs

Online Discussion Forums

[PRODUCT NAME] Certification Qualification Checklist

Request Exam Key

Enter Exam Key

Submit [PRODUCT NAME] Exam to [COMPANY NAME]

Utilities

Check for Updates

Activate Product

Report Issue

2.3 License Key

Currently, the License Key is requested by submitting an email to the [COMPANY NAME] Certification Team. The Certification Team then electronically generates and emails the license key back to the requestor.

New functionality will be implemented to enhance this process. Users will be required to complete and electronically submit a License Key request form to the [COMPANY NAME] Certification Team, which will electronically generate and email the license key back to the requestor.

This new functionality will be included in the scope of the testing effort.
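The plan does not specify how the Certification Team's key generator works. Purely as an illustrative sketch, assuming a keyed-hash (HMAC) scheme, electronic key generation and verification might look like the following; the secret, the member_id and edition fields, and the key format are all hypothetical.

    import hmac
    import hashlib

    # Hypothetical secret held by the Certification Team; not from the plan.
    SECRET = b"certification-team-secret"

    def generate_license_key(member_id: str, edition: str) -> str:
        # Derive a stable, verifiable key from the request-form fields.
        message = f"{member_id}:{edition}".encode("utf-8")
        digest = hmac.new(SECRET, message, hashlib.sha256).hexdigest().upper()
        # Present the first 25 hex characters as five groups of five.
        return "-".join(digest[i:i + 5] for i in range(0, 25, 5))

    def verify_license_key(member_id: str, edition: str, key: str) -> bool:
        # Recompute and compare in constant time.
        return hmac.compare_digest(generate_license_key(member_id, edition), key)

    key = generate_license_key("member-001", "US")
    assert verify_license_key("member-001", "US", key)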

2.4 Security

Each user will need a UserId and password to log in to the system. The UserId and password will be created by the user during the registration process. The system should validate that the UserId and password both meet the correct format standard. Once the registration form has been electronically submitted, the system will notify the user that the requested identification information has been accepted and the request has been granted. The system will require users to change their password every 30 days.
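As a minimal sketch of the checks described above, assuming illustrative format rules (the plan does not define the exact UserId and password standard), the registration validation and the 30-day expiry could be exercised like this:

    import re
    from datetime import date, timedelta
    from typing import Optional

    USERID_RE = re.compile(r"^[A-Za-z0-9]{6,20}$")              # assumed format rule
    PASSWORD_RE = re.compile(r"^(?=.*[A-Za-z])(?=.*\d).{8,}$")  # assumed format rule
    PASSWORD_LIFETIME = timedelta(days=30)  # from the plan: change every 30 days

    def registration_is_valid(user_id: str, password: str) -> bool:
        # Both values must meet the format standard before the request is granted.
        return bool(USERID_RE.match(user_id)) and bool(PASSWORD_RE.match(password))

    def password_expired(last_changed: date, today: Optional[date] = None) -> bool:
        # The system requires a password change every 30 days.
        today = today or date.today()
        return today - last_changed >= PASSWORD_LIFETIME

    assert registration_is_valid("fraudexam1", "Secur3Pass")
    assert not registration_is_valid("x", "short")
    assert password_expired(date(2010, 9, 1), today=date(2010, 10, 2))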

3 Software Risk Issues

3.1 Schedule

The schedule for each phase is very aggressive and could affect testing. A slip in the schedule in one of the other phases could result in a subsequent slip in the test phase. Close project management is crucial to meeting the forecasted completion date.


3.2 Technical

Since this is a new [Name] system, the old system can be used in the event of a failure. We will run our tests in parallel with the production system so that there is no downtime of the current system.

3.3 Management

Management support is required so that, when the project falls behind, the test schedule does not get squeezed to make up for the delay. Management can reduce the risk of delays by supporting the test team throughout the testing phase and by assigning people with the required skill sets to this project.

3.4 Personnel

Due to the aggressive schedule, it is very important to have experienced testers on this project. Unexpected turnover can impact the schedule. If attrition does happen, every effort must be made to replace the experienced individual.

3.5 Requirements

The test plan and test schedule are based on the current Requirements Document. Any changes to the requirements could affect the test schedule and will need to be approved by the Change Control Board (CCB).

4 Functions to Be Tested

The following is a list of functions that will be tested:

Add/update user information

Search / Lookup employee information

Escape to return to Main Menu

Security features

Error messages

Prep Course Functionality

New License Key Request Functionality

Practice Exam Functionality

[PRODUCT NAME] Exam Functionality

Progress Tracking Functionality

Screen mappings (GUI flow), including default settings

A Requirements Validation Matrix will “map” the test cases back to the requirements. See Deliverables.
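As an illustration of such a matrix (the requirement and test-case IDs below are placeholders, not from this plan), the two-way mapping can be kept as a simple inverted index:

    from collections import defaultdict

    # Each test case lists the requirements it verifies (assumed sample data).
    test_cases = {
        "TC-001": ["REQ-SEC-01", "REQ-SEC-02"],
        "TC-002": ["REQ-PREP-01"],
        "TC-003": [],  # not yet mapped
    }

    def requirements_to_tests(cases: dict) -> dict:
        # Invert the mapping so each requirement lists its covering test cases.
        matrix = defaultdict(list)
        for case_id, req_ids in cases.items():
            for req_id in req_ids:
                matrix[req_id].append(case_id)
        return dict(matrix)

    def unmapped_cases(cases: dict) -> list:
        # Test cases that trace to no requirement; flag these for review.
        return [case_id for case_id, req_ids in cases.items() if not req_ids]

    print(requirements_to_tests(test_cases))  # requirement -> covering test cases
    print(unmapped_cases(test_cases))         # ['TC-003']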


5 Functions Not to Be Tested

This is to be determined.

6 Test Approach

Functional testing will be conducted throughout the entire Application Development Life Cycle by the Quality Assurance Engineer and the System Business Analyst. At the conclusion of each iteration, formal testing will be conducted by the business unit subject matter experts in two cycles, while overall testing of the entire system will be conducted by the Quality Assurance Engineer and the System Business Analyst in one final cycle.

The overall testing approach of the project will address and encompass the following rules and processes:

6.1 Special Testing Tools

Microsoft Visual Studio 2010 Ultimate - The comprehensive suite of application lifecycle management tools used by the [COMPANY NAME] Software Development Team to ensure quality results, from design to deployment. It includes: Integrated Development Environment, Development Platform Support, Team Foundation Server, MSDN Subscription, Testing Tools, Database Development, Debugging and Diagnostics, Application Lifecycle Management, Architecture and Modeling, and Lab Management.

Visual Studio Test Professional 2010 - An integrated testing toolset that delivers a complete plan-test-track workflow for in-context collaboration between testers and developers.


Microsoft Visual Studio Team Foundation Server 2010 (TFS) - The collaboration platform at the core of our application lifecycle management (ALM) process. Team Foundation Server 2010 automates the software delivery process and enables everyone on our team to collaborate more effectively, be more agile, and deliver better quality software while building and sharing institutional knowledge. Project artifacts such as requirements, tasks, bugs, source code, and build and test results are stored in a data warehouse. The tool also provides reporting, historical trending, full traceability, and real-time visibility into quality and progress.

6.2 Special Training on Testing Tools

Since Microsoft Visual Studio 2010 Ultimate is a relatively new tool on the market, some special training will be required. Due to the current time constraints of the project, this training will be scheduled and completed at a later date.

However, the Development Team, including the SQA Engineer, has proactively begun “self-learning” and utilizing the tool and has integrated the use of the Microsoft tools into its overall testing strategy.

6.3 Test Metrics

Test information will be collected and oriented toward the level of testing: for higher levels, application and functional data will be collected and documented; for lower levels, program, unit, module, and build data will be collected and documented.

All test results will be documented in the Master Test Case spreadsheet, reviewed by the business unit team leader, and forwarded to the Quality Assurance Engineer and the System Business Analyst.

The test results will be evaluated and entered into Team Foundation Server through Visual Studio Test Manager, where each “automatable” test case will be recorded and run using the “Coded UI Test” feature in Visual Studio 2010 Ultimate.

Reports will be generated highlighting:

The name of the assigned tester

The number of test cases completed

The number of passed and failed test cases

The number of open issues (by priority)
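A minimal sketch of how these report fields could be derived from raw test-result records follows; the record layout is assumed for illustration and is not taken from the Master Test Case spreadsheet.

    from collections import Counter

    results = [  # illustrative sample records
        {"tester": "A. Analyst", "status": "passed", "open_issue_priority": None},
        {"tester": "A. Analyst", "status": "failed", "open_issue_priority": "high"},
        {"tester": "B. Tester",  "status": "failed", "open_issue_priority": "low"},
    ]

    completed = len(results)
    by_status = Counter(r["status"] for r in results)
    open_issues = Counter(
        r["open_issue_priority"] for r in results if r["open_issue_priority"]
    )

    # One line per assigned tester, then the aggregate counts.
    for tester in sorted({r["tester"] for r in results}):
        done = sum(1 for r in results if r["tester"] == tester)
        print(f"{tester}: {done} test case(s) completed")
    print(f"total completed: {completed}")
    print(f"passed={by_status['passed']} failed={by_status['failed']}")
    print(f"open issues by priority: {dict(open_issues)}")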

Eventually, performance testing will be conducted to measure the system response time, baseline load, and system stress capacity.


6.4 Configuration Management

Test configuration management will be handled by the Lab Manager tool, included with the Microsoft Visual Studio 2010 Ultimate package.

Lab Manager can fully provision and ready multiple environments for testing so that build scripts can explicitly target a particular lab configuration at build time. Lab Management stores the environments as virtual machine images in a library of pre-built images using System Center Virtual Machine Manager (SCVMM) to ensure teams always begin their testing from a known configuration.

The following operating systems and browsers will be used in multiple combinations for testing the [Project Name] application (see the sketch after the lists below):

Operating Systems

Windows XP

Windows Vista

Windows 7

Browsers

Firefox 3.0

Internet Explorer 7.0

Internet Explorer 8.0
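The combinations above multiply out to nine environments. A small sketch of enumerating that matrix (the ENV-nn environment names are illustrative):

    from itertools import product

    operating_systems = ["Windows XP", "Windows Vista", "Windows 7"]
    browsers = ["Firefox 3.0", "Internet Explorer 7.0", "Internet Explorer 8.0"]

    # Nine environments in total; each would map to a stored SCVMM image.
    for index, (os_name, browser) in enumerate(product(operating_systems, browsers), 1):
        print(f"ENV-{index:02d}: {os_name} + {browser}")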

6.5 Regression Testing

Regression testing will be conducted at the conclusion of each testing iteration by the SQA Department and any assigned business unit subject matter experts. In most cases, the scope of regression testing will be based on the severity of the defects detected.

6.6 Requirements Management

The business requirements will be elicited and managed by the Systems Business Analyst. Any elements in the requirements and design that do not make sense or are not testable will be immediately documented and reported to the SBA, who will in turn address these issues with the stakeholders for further clarification.

7 Test Strategy

The test strategy consists of a series of different tests that will fully exercise the [Name] system. The primary purpose of these tests is to uncover the system's limitations and measure its full capabilities. A list of the various planned tests and a brief explanation of each follows below.


7.1 System Test

The System tests will focus on the behavior of the [Name] system. User scenarios will be executed against the system as well as screen mapping and error message testing. Overall, the system tests will test the integrated system and verify that it meets the requirements defined in the requirements document.

7.2 Performance Test

Performance tests will be conducted to ensure that the [Name] system's response times meet user expectations and do not exceed the specified performance criteria. During these tests, response times will be measured under heavy stress and/or volume.

7.3 Security Test

Security tests will determine how secure the new [Name] system is. The tests will verify that unauthorized user access to confidential data is prevented.

7.4 Automated Test

A suite of automated tests will be developed to test the basic functionality of the [Name] system and to perform regression testing on areas of the system that previously had critical/major defects. The tool will also assist us by executing user scenarios, thereby emulating several simultaneous users.
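The actual suite will be built with the Coded UI feature noted in Section 6.3. Purely to illustrate the shape of a basic-functionality and regression suite, here is a generic stand-in using Python's unittest; the application object is hypothetical.

    import unittest

    class FakePrepCourseApp:
        # Hypothetical stand-in for the application under test.
        def activate(self, license_key: str) -> bool:
            return bool(license_key)

        def start_practice_exam(self) -> str:
            return "practice-exam-session"

    class BasicFunctionalitySuite(unittest.TestCase):
        def setUp(self):
            self.app = FakePrepCourseApp()

        def test_activation_with_valid_key(self):
            self.assertTrue(self.app.activate("ABCDE-12345"))

        def test_practice_exam_starts(self):
            # Regression check for a previously critical area (illustrative).
            self.assertEqual(self.app.start_practice_exam(), "practice-exam-session")

    if __name__ == "__main__":
        unittest.main()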

7.5 Stress and Volume Test

We will subject the [Name] system to high input conditions and a high volume of data during peak times. The system will be stress tested using twice the expected number of users (20 users versus the 10 expected).
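A sketch of that scenario, with 20 simulated users issuing requests concurrently while response times are recorded; the user-scenario body is a placeholder for real work against the system under test.

    import statistics
    import time
    from concurrent.futures import ThreadPoolExecutor

    def simulated_user_scenario(user_id: int) -> float:
        # Placeholder for one scripted user scenario; returns elapsed seconds.
        start = time.perf_counter()
        time.sleep(0.01)  # stand-in for a real request to the system under test
        return time.perf_counter() - start

    # 20 concurrent users: twice the expected load described above.
    with ThreadPoolExecutor(max_workers=20) as pool:
        timings = list(pool.map(simulated_user_scenario, range(20)))

    print(f"max response:    {max(timings):.3f}s")
    print(f"median response: {statistics.median(timings):.3f}s")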

7.6 Recovery Test

Recovery tests will force the system to fail in various ways and verify that recovery is properly performed. It is vitally important that all [Name] data is recovered after a system failure and that no corruption of the data occurs.

7.7 Documentation Test

Tests will be conducted to check the accuracy of the user documentation. These tests will ensure that no features are missing, and the contents can be easily understood.

7.8 Beta Test

The [Name] department will beta test the new [Name] system and will report any defects they find. This will subject the system to tests that could not be performed in our test environment.

7.9 User Acceptance Test


Once the [Name] system is ready for implementation, the [Name] department will perform User Acceptance Testing. The purpose of these tests is to confirm that the system is developed according to the specified user requirements and is ready for operational use.

8 Entry and Exit Criteria

8.1 Test Plan

8.1.1 Test Plan Entry Criteria

Package coding is complete and the code has been informally reviewed by the team.

8.1.2 Test Plan Exit Criteria

Either the second semester ends or all of the use case requirements have been verified.

8.1.3 Suspension and Resumption Criteria

Testing will be suspended when a critical design flaw is found that requires a package or package interface to be redesigned. Testing will resume when the coding is complete and the code has been successfully reviewed.

If testing is suspended, resumption will occur only when the problem(s) that caused the suspension have been resolved. When a critical defect is the cause of the suspension, the fix must be verified by the test department before testing is resumed.

8.2 Test Cycles

8.2.1 Test Cycle Entry Criteria

TBD

8.2.2 Test Cycle Exit Criteria

All tests specified at the start of the iteration have completed successfully, or the end of the second semester occurs.

If any defects are found which seriously impact the test progress, the QA manager may choose to suspend testing. Criteria that will justify test suspension are:

Hardware/software is not available at the times indicated in the project schedule.

Source code contains one or more critical defects, which seriously prevents or limits testing progress.

Assigned test resources are not available when needed by the test team.


9 Deliverables

Deliverable                                                Responsibility                 Delivery Date
Develop Test Cases                                         Testers                        10/13/2010
Test Case Review                                           Test Lead, Dev. Lead, Testers  10/15/2010
Develop Automated Test Suites                              Testers                        TBD
Requirements Validation Matrix                             Test Lead                      TBD
Execute Manual and Automated Tests                         Testers & Test Lead            9/20 – 10/15/2010
Complete Defect Reports                                    Everyone testing the product   Ongoing
Document and Communicate Test Status/Coverage              Test Lead                      Weekly
Execute Beta Tests                                         [Name] Department Users        10/18 – 10/29/2010
Document and Communicate Beta Test Status/Coverage         [Name] Department Manager      TBD
Execute User Acceptance Tests                              [Name] Department Users        10/18 – 10/29/2010
Document and Communicate Acceptance Test Status/Coverage   [Name] Department Manager      TBD
Final Test Summary Report                                  Test Lead                      11/1/2010

The following artifacts will be testing deliverables, available to the stakeholders:

9.1 Test Evaluation Summaries

These summaries will outline the tests conducted and their results.


9.2 Incident Logs and Change Requests

Incident log entries will be made for all bugs found during testing. The log will be used to track the status of the bugs.

Any modifications to the requirements must be done through change requests, which will ensure the proposed change is fully reviewed before being incorporated into the [Name] application.

10 Environment Requirements

This section presents the non-human resources required for the Test Plan.

10.1 Base System Hardware

The following table sets forth the system resources for the test effort presented in this Test Plan.

System Resources

Resource                Quantity    Name and Type
Application Server      1           CPUs:
                                    Memory:
                                    Hard Disk 1:
                                    Hard Disk 2:
                                    Server Name:
                                    IP Address:
Test Development PCs    TBD

10.2 Base Software Elements in the Test Environment

The following base software elements are required in the test environment for this Test Plan.

Software Element Name    Version    Type and Other Notes
Windows Server
SQL Server

10.3 Productivity and Support Tools

The following tools will be employed to support the test process for this Test Plan.

Tool Category or Type    Tool Brand Name    Vendor or In-house    Version
Test Management                             Microsoft             2010
Defect Tracking


11 Responsibilities, Staffing and Training Needs

This section outlines the personnel necessary to successfully test the [Name] application. Since staff size is fixed, these numbers may change.

11.1 People and Roles

This table shows the staffing assumptions for the test effort.

Human Resources

Role    Minimum Resources Recommended (number of full-time roles allocated)    Specific Responsibilities or Comments

Test Analyst 1 Identifies and defines the specific tests to be conducted.

Responsibilities include:

identify test ideas

define test details

determine test results

document change requests

evaluate product quality

Test Designer 1 Defines the technical approach to the implementation of the test effort.

Responsibilities include:

define test approach

define test automation architecture

verify test techniques

define testability elements

structure test implementation


Tester 2 Implements and executes the tests.

Responsibilities include:

implement tests and test suites

execute test suites

log results

analyze and recover from test failures

document incidents

Database Administrator / Database Manager 1 Ensures test data (database) environment and assets are managed and maintained.

Responsibilities include:

Support the administration of test data and test beds (database).

Designer 1 Identifies and defines the operations, attributes, and associations of the test classes.

Responsibilities include:

defines the test classes required to support testability requirements as defined by the test team


Implementer 2 Implements and unit tests the test classes and test packages.

Responsibilities include:

creates the test components required to support testability requirements as defined by the designer

11.2 Staffing and Training Needs

This section outlines how to approach staffing and training the test roles for the project.

Staffing is fixed for the duration of this project. It is likely most of the staff will assume some testing role.

12 Test Schedule

Ramp up / System familiarization    10/01/10 - 10/15/10
System Test                         10/16/10 - 12/26/10
Beta Test                           12/28/10 - 01/18/11
User Acceptance Test                01/19/11 - 02/01/11

13 Potential Risks and Contingencies


The overall risks to the project with an emphasis on the testing process are:

Lack of personnel resources when testing is to begin.

Lack of availability of required hardware, software, data or tools.

Late delivery of the software, hardware or tools.

Delays in training on the application and/or tools.

Changes to the original requirements or designs.

Requirements definition will be complete by September 1, 2010, and, if the requirements change after that date, the following actions will be taken:

The test schedule and development schedule could move out an appropriate number of days.

The number of tests performed could be reduced.

The number of acceptable defects could be increased.

Resources could be added to the test team.

The test team could be asked to work overtime.

The scope of the plan may be changed.

There may be some optimization of resources.

14 Control Procedures

14.1 Reviews

The project team will perform reviews for each phase (i.e., Requirements Review, Design Review, Code Review, Test Plan Review, Test Case Review, and Final Test Summary Review). A meeting notice, with related documents, will be emailed to each participant.

14.2 Bug Review meetings

Regular weekly meetings will be held to discuss reported defects. The development department will provide status updates on all reported defects, and the test department will provide additional defect information as needed. All members of the project team will participate.

14.3 Change Request

Once testing begins, changes to the [Name] system are discouraged. If functional changes are required, these proposed changes will be discussed with the Change Control Board (CCB). The CCB will determine the impact of the change and if/when it should be implemented.


14.4 Defect Reporting

When defects are found, the testers will complete a defect report in the defect tracking system and submit it to the Eureka software engineers for analysis. The defect tracking system is accessible by testers, developers, and all members of the project team. When a defect has been fixed or more information is needed, the Eureka developer will change the status of the defect to indicate its current state. Once a defect is verified as “fixed” by the testers, the testers will close the defect report.
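The status changes described above imply a small defect life cycle. A sketch of the allowed transitions follows; the state names are inferred from the narrative, not taken from the defect tracking system's configuration.

    # Allowed defect-status transitions (illustrative state names).
    ALLOWED_TRANSITIONS = {
        "open":       {"fixed", "needs-info"},
        "needs-info": {"open"},
        "fixed":      {"closed", "open"},  # reopened if the fix fails verification
        "closed":     set(),
    }

    def change_status(current: str, new: str) -> str:
        # Reject status changes the life cycle does not permit.
        if new not in ALLOWED_TRANSITIONS[current]:
            raise ValueError(f"illegal transition: {current} -> {new}")
        return new

    status = "open"
    status = change_status(status, "fixed")   # developer marks the fix
    status = change_status(status, "closed")  # tester verifies and closes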

15 Documentation

The following documentation will be available at the end of the test phase:

Test Plan

Test Cases

Test Case review

Requirements Validation Matrix

Defect reports

Final Test Summary Report

16 Iteration Milestones

Milestone    Planned Start Date    Actual Start Date    Planned End Date    Actual End Date

Iteration Plan agreed


Architecture base lined

User Interface base lined

First Build delivered to test

First Build accepted into test

First Build test cycle finishes

[Build Two will not be tested]

Third Build delivered to test

Third Build accepted into test

Third Build test cycle finishes

Fourth Build delivered to test

Fourth Build accepted into test

Iteration Assessment review

Iteration ends

17 Management Process and Procedures

17.1 Problem Reporting, Escalation, and Issue Resolution

Problems will be reported to the team by e-mail. Bugs will be listed on the project page with the developer responsible for fixing each bug. Each bug will be given a priority, which will determine when it is addressed in the current iteration.


17.2 Approval and Signoff

The team manager must approve the initial test plan. As tests are successfully completed, the testers will sign off on the use case requirements.