
TEST PLAN OF IIT OFFICIAL WEBSITE

Version 1.00

Submitted To:

Mohammad Ashik Elahi

Course Teacher

SE 605

Submitted By:

Md. Arif Ibne Ali – BIT0308

Nadia Nahar – BIT0327

Submission Date:

5th December, 2013


December 05, 2013

Mohammad Ashik Elahi

Course Teacher, SE 605

Institute of Information Technology

University of Dhaka

Subject: Letter of Submission

Dear Sir,

We are pleased to submit the Test Plan Document on the IIT Web Site that you asked for. We tried to determine the scope of this plan and its prospects from a pragmatic point of view. We faced many obstacles in preparing it, but we have finally completed the document.

Therefore, we request you to accept this document. We believe that you’ll find it in order. We

are eagerly expecting your feedback on the overall document.

Yours sincerely,

Md. Arif Ibne Ali - BIT0308

Nadia Nahar - BIT0327

3rd Batch, IITDU


Revision History

Name                        Date       Changes   Version   Signature

Test Plan of IIT Website    5.12.13    -         1.0


Table of Contents

1. Test Plan Identifier
2. References
3. Introduction
4. Test Items
5. Software Risk Issues
6. Features to be Tested
7. Features not to be Tested
8. Approach
   8.1 Testing Levels
   8.2 Configuration Management/Change Control
   8.3 Test Tools
   8.4 Meetings
   8.5 Measures and Metrics
9. Item Pass/Fail Criteria
10. Suspension Criteria and Resumption Requirements
    10.1 Suspension Criteria
    10.2 Resumption Requirements
11. Test Deliverables
12. Remaining Test Tasks
13. Environment
    13.1 Environmental Needs
    13.2 Description of Actual Testing Environment
14. Staffing and Testing Needs
15. Responsibilities
16. Schedule
17. Planning Risks and Contingencies
18. Approvals
19. Glossary
References

LIST OF TABLES

Table 1: Features to be tested
Table 2: Features not to be tested
Table 3: Remaining test tasks
Table 4: Responsibilities
Table 5: Approvals
Table 6: Glossary


1. Test Plan Identifier

IW-MTP01.00

2. References

List of documents that support this test plan –

1. Software Requirement Specification Report on IIT Web Site – Version 1.0

2. IEEE 829 standards and guidelines

3. Introduction

The “IIT Website” will be the online representative of IIT. We have been designing the website with well-established technology. The IIT website contains all the information about IIT students, teachers, academics, admission, etc. The project is now in the development phase, so readers may consult the Software Requirement Specification document for details.

This document presents the Master Test Plan of the IIT Website. A master test plan is a living document that summarizes the overall effort required to test a software product. It contains the details of the individual tests to be run during the testing cycle, such as unit tests, system tests, beta tests, etc. Our document will categorize and describe each test case, outline the pass/fail criteria, and indicate the planned run day or week. It is a quick-reference tracking document for what has to be tested, the priority of the test items, what is left to test, and so on.

We followed the IEEE 829 format to develop our test plan and strictly followed the instructions provided by our course teacher. This is our first test plan document, so we also read some sample test plans (for example, the Reassigned sales re-write project) to gather knowledge about test plan documentation.

The estimated timeline for this project is one semester (a maximum of six months). The testing activities are to be carried out in parallel with the development process.


4. Test Items

The test items can be identified at both a higher level and a lower level.

Higher level Test Items

The following is a list of the items to be tested:

o IIT Official Website released version 1.0 and supporting infrastructure

o Website running on different client platforms

The following is a list of the items not to be tested:

o SRS of IIT Official Website version 1.0

o Previous IIT Official Website

o Other applications that the IIT Official Website application uses

o The manual processes related to the application

Lower level Test Items

At the lower level, test items may be a program, unit, module, or build. We have considered modules as the lower-level items here. The modules of the IIT Official Website are as follows –

The following is a list of the items to be tested:

o Achievements

o Gallery

o News & Events

o Program Management

o Batch Management

o Course Management

o Faculty Management

o Exam Management

o Result Management

o Group


o Profile

o Documents

o Attendance

o Project & thesis

o Calendar

o Alumni

o Information

The following is a list of the items not to be tested:

o User – a default module of CodeIgniter, so no test is needed

o Admission Management – not released in version 1.0

o Archive – merged with the project-thesis module

o Faculty Member – merged with faculty management

o Programs – merged with program management

Some additional modules were added or changed during the development phase and are not yet fully identified or documented. Test items for those modules will be included in the next version of the test plan.

5. Software Risk Issues

Several risk issues have been identified that can have a direct impact on the website application and need to be handled carefully.

1. Delivery of the website and hosting

2. Reliability of the web hosting service (measures availability of website)

3. Poorly documented modules and changes

4. Backup and recovery of files, database

5. Database security and safety

6. Detection and handling of service failures


6. Features to be Tested

The features and attributes to be tested are –

IIT Official Website

Features/Attributes         Likelihood  Impact  Priority

Add Album (Upload Picture)  Medium      Medium  4
Modify Album                Medium      Medium  4
View Album                  Medium      Low     3
Add Achievement             Medium      Medium  4
Modify Achievement          Medium      Medium  4
Add News/Events             Medium      Medium  4
Modify News/Events          Medium      Medium  4
View News/Events            Low         Medium  3
Create Program              Medium      High    5
Modify Program              Medium      High    5
Create Semester             Medium      High    5
Modify Semester             Medium      High    5
Create Course               Medium      High    5
Modify Course               Medium      High    5
Add Faculty Member          Medium      High    5
Modify Faculty Member       Medium      Medium  4
Assign Course               High        High    6
Create Batch                High        High    6
Modify Batch                High        High    6
View Batch                  Medium      Low     3
Create Student              Medium      High    5
Modify Student              Medium      High    5
Send Program Notification   Medium      Medium  4
Send Batch Notification     Medium      Medium  4
Create Club                 Low         Medium  3
Modify Club                 Medium      Low     3
Edit Profile                High        High    6
Document Upload             High        Medium  5
Document Download           Medium      Medium  4
Add Attendance              Medium      High    5
Modify Attendance           Medium      Low     3
Add Project/Thesis          Medium      High    5
Modify Project/Thesis       Medium      Low     3
Group Post                  High        Medium  5
Add Examination             Medium      High    5
Modify Examination          Medium      High    5
View Examination            Low         High    4
View Activity Log           Low         Medium  3
Add Admin                   Medium      High    5
Modify Admin                Medium      Medium  4
View Admin                  Low         Medium  3
Promote Batch               High        High    6
View Calendar               Medium      Low     3
Capability                  Medium      Medium  4
Reliability                 Medium      High    5
Usability                   Medium      High    5
Security                    Medium      High    5
Compatibility               Medium      Medium  4

Table 1: Features to be tested


7. Features not to be Tested

Features and attributes with low priority need not be tested (a sketch of how the priority values appear to be derived from likelihood and impact follows Table 2). The list of those features and attributes is –

IIT Official Website

Features/Attributes         Likelihood  Impact  Priority

View Achievements           Low         Low     2
View Programs               Low         Low     2
View Semesters              Low         Low     2
View Courses                Low         Low     2
View Student                Low         Low     2
Document Delete             Low         Low     2
View Attendance             Low         Low     2
View Project/Thesis         Low         Low     2
View Dashboard              Low         Low     2
View About IIT              Low         Low     2
View Academic Info          Low         Medium  3
View Admission Info         Low         Medium  3
View Alumni                 Low         Low     2
Scalability                 Low         Medium  3
Performance                 Low         Medium  3

Table 2: Features not to be tested
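The priority values in Tables 1 and 2 are consistent with a simple additive scoring of likelihood and impact (Low = 1, Medium = 2, High = 3; priority is the sum of the two scores). The plan does not state this rule explicitly, so the short sketch below is only an inference, offered as a quick way to reproduce and cross-check the table values.

    # Hedged sketch: reproduce the Priority column of Tables 1 and 2, assuming
    # priority = likelihood score + impact score with Low=1, Medium=2, High=3.
    SCORE = {"Low": 1, "Medium": 2, "High": 3}

    def priority(likelihood: str, impact: str) -> int:
        """Return the inferred priority for a feature or attribute."""
        return SCORE[likelihood] + SCORE[impact]

    # Spot checks against rows of Table 1 and Table 2.
    assert priority("Medium", "Medium") == 4   # e.g. Add Album
    assert priority("High", "High") == 6       # e.g. Assign Course
    assert priority("Low", "Medium") == 3      # e.g. View Academic Info
    assert priority("Low", "Low") == 2         # e.g. View Achievements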


8. Approach

8.1 Testing Levels

The testing approach for the IIT official website project is a Master Test Plan (MTP). The MTP consists of unit, system/integration (combined), and acceptance test levels. In our project, most of the testing is performed by our development and testing teams.

All developers will take part in the unit testing of this project. Proof of unit testing must be provided by the developers to the team leader, and all unit test information must also be provided to the testing team.

System/integration testing will be done by the testing team and the development team. Programs will enter system/integration testing after all critical defects have been corrected.

Acceptance testing will be done by the end users of this website with the assistance of the test team and the development team. Programs will enter acceptance testing after all critical and major defects have been corrected. Prior to final completion of acceptance testing, all open critical and major defects MUST be corrected and verified by the customer.
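To illustrate the unit-test level described above, the sketch below shows the shape of a single developer-written unit test. It is purely hypothetical: the validate_course_code helper is invented for this example, and the real application, being a CodeIgniter (PHP) project, would exercise its own modules with CIUnit rather than Python's unittest.

    # Hypothetical sketch of a unit test at the level described in section 8.1.
    # validate_course_code is an invented helper used only for illustration.
    import re
    import unittest

    def validate_course_code(code: str) -> bool:
        """Accept codes such as 'SE605': two to four capital letters followed by three digits."""
        return re.fullmatch(r"[A-Z]{2,4}\d{3}", code) is not None

    class ValidateCourseCodeTest(unittest.TestCase):
        def test_accepts_well_formed_code(self):
            self.assertTrue(validate_course_code("SE605"))

        def test_rejects_missing_digits(self):
            self.assertFalse(validate_course_code("SE"))

        def test_rejects_lowercase_prefix(self):
            self.assertFalse(validate_course_code("se605"))

    if __name__ == "__main__":
        unittest.main()

Proof of such tests (the test code plus its pass/fail output) is what the developers would hand over to the team leader and the testing team.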

8.2 Configuration Management/Change Control

The programs under development and those in full test will be kept under the same version control and change tracking. The migration of the website from the development and test phase to the production phase will be done once all testing has been completed according to the published plans and guidelines.

All changes, enhancements, and other modification requests to the system will be handled through the published change control procedures. Any modifications to the standard procedures will be made according to the SRS change control section (page 194).


8.3 Test Tools

o Selenium – Web browser automation (a usage sketch follows this list)
o Microsoft Visual Studio 2012 – Load testing
o CIUnit – Unit testing for CodeIgniter
o Firebug – Web development tool that facilitates debugging
o FreeMind – Free mind-mapping software
o JSUnit – Unit testing for JavaScript
o Multi-Mechanize – Performance and load testing
o Capybara – Acceptance test framework
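As a concrete, hedged illustration of the browser-automation entry above, the following sketch opens one public page of the site with Selenium's Python bindings and performs a simple smoke check. The base URL and the CSS selector are hypothetical placeholders; the deployed address and page markup are not specified in this plan, and a locally available Chrome/chromedriver is assumed.

    # Hedged Selenium sketch: smoke-check the News & Events page.
    # BASE_URL and the ".news-item" selector are hypothetical placeholders.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    BASE_URL = "http://localhost/iit-website"  # assumed local deployment for illustration

    driver = webdriver.Chrome()                # assumes Chrome and chromedriver are available
    try:
        driver.get(BASE_URL + "/news")         # open the News & Events page
        assert "News" in driver.title          # minimal smoke check on the page title
        items = driver.find_elements(By.CSS_SELECTOR, ".news-item")
        print(f"Found {len(items)} news items")
    finally:
        driver.quit()

Equivalent scripted checks could be written for the other tools in the list (for example, load scenarios in Multi-Mechanize), but those belong in the individual test plans.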

8.4 Meetings

The test team will meet once every week to evaluate progress, identify any problems, and work out solutions. The test team will also meet with the development team to merge their ideas about the testing and quality of our website. Additional meetings can be called as required for emergency situations.

8.5 Measures and Metrics

The following information will be collected by the development team during the unit testing process. This information will be provided to the test team at program turnover, as well as to the project team on a biweekly basis.

o Defects by module and severity

o Defect origin

o Time spent on defect resolution


The following information will be collected by the test team during all testing phases. This information will be provided on a biweekly basis to the test manager and to the project team; a sketch of how these measures might be aggregated follows the list below.

o Defects by module and severity

o Defect origin

o Time spent on defect resolution

o Number of times a program is submitted to the test team as ready for test
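A hedged sketch of how these measures might be aggregated for the biweekly report is given below. The record layout (module, severity, origin, hours spent) is an assumption made for illustration; the plan does not prescribe a storage format or a tool.

    # Hedged sketch: aggregate the defect measures named in section 8.5.
    # The DefectRecord fields are assumed for illustration only.
    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class DefectRecord:
        module: str      # e.g. "Gallery", "Result Management"
        severity: str    # e.g. "Critical", "Major", "Minor"
        origin: str      # e.g. "Design", "Coding"
        hours_spent: float

    def summarize(defects: list[DefectRecord]) -> dict:
        """Return defects by module and severity, defects by origin,
        and total time spent on defect resolution."""
        return {
            "by_module_and_severity": Counter((d.module, d.severity) for d in defects),
            "by_origin": Counter(d.origin for d in defects),
            "total_resolution_hours": sum(d.hours_spent for d in defects),
        }

    # Example usage with made-up data.
    report = summarize([
        DefectRecord("Gallery", "Major", "Coding", 3.5),
        DefectRecord("Gallery", "Minor", "Coding", 1.0),
        DefectRecord("Result Management", "Critical", "Design", 6.0),
    ])
    print(report["total_resolution_hours"])  # 10.5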

9. Item Pass/Fail Criteria

The test process will be complete when the project leader is satisfied with the results of the tests. For this, at least 90% of the test cases must pass; all functionality must be covered by those test cases; and, above all, high and medium severity defects must be detected and fixed. Minor defects can be ignored, provided they do not lead to severe defects.

The project leader will decide whether the detected defects and their criticality will delay the release of IIT Website version 1.0.
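The criteria above amount to a small, mechanical check. The sketch below illustrates that arithmetic with an assumed representation of test results and open defects; it is not part of the plan itself.

    # Hedged sketch of the exit criteria in section 9: at least 90% of test
    # cases pass and no high or medium severity defects remain open.
    def release_criteria_met(passed: int, total: int, open_defect_severities: list[str]) -> bool:
        pass_rate = passed / total if total else 0.0
        blocking = any(sev in ("High", "Medium") for sev in open_defect_severities)
        return pass_rate >= 0.90 and not blocking

    # 47 of 50 cases pass (94%) and only minor defects remain open.
    print(release_criteria_met(47, 50, ["Minor", "Minor"]))   # True
    # The pass rate is fine, but a medium severity defect is still open.
    print(release_criteria_met(47, 50, ["Medium"]))           # False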

10. Suspension Criteria and Resumption Requirements

10.1 Suspension Criteria

In general, testing will only stop if the website becomes unavailable. However, certain portions of the tests may be suspended or skipped if prerequisite tests have previously failed.

10.2 Resumption Requirements

In the case of website unavailability, testing will resume after access to the website is reestablished. Skipped test cases can be run after the related failed cases have been fixed.
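The suspension and resumption rules above can be applied mechanically: skip a case whose prerequisite has failed, and re-queue it once the prerequisite passes. The sketch below shows one hypothetical way to do this; the dependency map and test names are invented for illustration.

    # Hedged sketch: suspend tests whose prerequisites failed (10.1) and keep
    # them for a later run once the prerequisites pass (10.2).
    PREREQUISITES = {                  # hypothetical dependency map
        "modify_album": "add_album",
        "view_album": "add_album",
        "modify_batch": "create_batch",
    }

    def plan_run(results_so_far: dict[str, bool], candidates: list[str]) -> tuple[list[str], list[str]]:
        """Split candidate tests into (run_now, suspended) based on prerequisite results."""
        run_now, suspended = [], []
        for test in candidates:
            prereq = PREREQUISITES.get(test)
            if prereq is not None and results_so_far.get(prereq) is False:
                suspended.append(test)     # prerequisite failed: suspend for now
            else:
                run_now.append(test)
        return run_now, suspended

    run_now, suspended = plan_run({"add_album": False, "create_batch": True},
                                  ["modify_album", "view_album", "modify_batch"])
    print(run_now)     # ['modify_batch']
    print(suspended)   # ['modify_album', 'view_album']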


11. Test Deliverables

o Master Test Plan (IW-MTP01.00 – this document)

o Unit Test Plan

o System/Integration Test Plan

o Acceptance Test Plan

o Screen Prototypes

o Defect reports and summaries

o Test logs

o Automated test scripts and supporting test data

12. Remaining Test Tasks

Task                                       Assigned To                                  Status

Define Unit Test rules and procedures      Test Manager, Project Manager, Developer
Create System/Integration Test Plan        Test Manager, Project Manager, Developer
Create Acceptance Test Plan                Test Manager, Project Manager, Client
Verify Prototype of Screens                Test Manager, Developer, Client
Verify Prototype of Reports                Test Manager, Developer, Client
Automate Test Scripts                      Test Manager, Developer
Verify Test Data                           Test Manager, Project Manager

Table 3: Remaining test tasks


13. Environment

Our project is the official website of the Institute of Information Technology, University of Dhaka. There are essentially two parts to our system, each with many functionalities. One part of the application is going to be accessed over the Internet by students, teachers, office staff, and outsiders (general visitors). The other is the IIT maintenance side (for the admin/maintenance officer). The following elements support the overall test strategy and the effort to improve the quality of our website.

13.1 Environmental Needs

The following represent the essential hardware and software needs:

I. Any operating system that supports this website
II. A minimum (by current standards) hardware configuration for PCs and servers
III. A reliable communication link with the website's supporting software

13.2 Description of Actual Testing Environment

I. Available client-side environment
- No need to buy new hardware, because the site runs with a minimum hardware configuration.
- Normal browsers are able to load this side.

II. Available admin (server) side environment
- The testing and development teams carry out their testing and QA work throughout the development period.
- Unit, system, acceptance, load, and performance testing tasks.

III. Available testing tools
- Third-party tools are used for further testing.
- Selenium, FreeMind, VS 2012, Firebug, etc.


14. Staffing and Testing Needs

In our project there is a testing team consisting of 4-6 members. It is an academic project (from the students' perspective) as well as a business project (from IIT's perspective), serving as a bridge between academia and industry. The testers and development engineers will ensure that the teachers, other students, and staff assigned to this project are experienced with:

Development period:

1. General development and testing techniques and the QA process
2. Basic knowledge of website development lifecycles and database management
3. The development and testing tools that they may be required to use

Production period:

1. Relevant people (internal users) should be trained by the developers and testers.
2. At least two persons should be trained to maintain the IIT website and solve general problems.

15. Responsibilities

Overall Operations                                      Test Manager   Project Manager   Development Team   Testing Team   Client

Unit test documentation and execution                   X X X
System/integration test documentation and execution     X X X
Acceptance test documentation and execution             X X X
System design review                                    X X X
Detail design review                                    X X
Test procedures and rules                               X X X
Regression testing                                      X X X

Table 4: Responsibilities


16. Schedule

Scheduling is an important part of project management. A software project involves many different steps, such as requirements gathering, design, development, and QA & testing, and every step has a fixed time frame. To develop the test plan, we need to consider the following activities:

I. Review of requirements documents

II. Create the test design, observe test execution, and produce the test incident/summary reports

III. Development of master test plan (MTP)

IV. Develop the system/integration and acceptance test plans of this project

V. Review of the system design document

VI. Unit test time within the development process

VII. Allocate time for system/integration and acceptance testing

All steps must be accomplished within the fixed budget and time.

17. Planning Risks and Contingencies

The following are the likely project risks and their possible contingencies –

o Unavailability of Website: Testing will be delayed until access to the website is reestablished. A possible contingency is to add testers or reduce the number of test cases.

o Unavailability of Testing Software: This can be caused by the inability of the tools to handle cookies, and it can delay automated testing and increase manual testing. A possible contingency is to add testers or reduce the number of test cases.

o Time problem: There may not be enough time to complete all test cases. In that case we can skip the cases with lower priorities.

o Lack of Testers: If testers are unavailable, the test cases can be reduced by eliminating cases with low priority.

o Large Number of Defects: A large number of defects may make it practically impossible to run all of the test cases. In that case the release of the version needs to be delayed.


18. Approvals

Approval needs to be obtained from the following persons –

Post                                                                          Signature

Project Sponsor – Institute of Information Technology, University of Dhaka
Project Manager – Dr. Md. Mahbubul Alam Joarder
Project Supervisor – Asif Imtiaz
Development Team Leader – Upal Roy, Nadia Nahar
Testing Team Leader – Ahmad Tahmid
Management Team Leader – Sujit Ghosh

Table 5: Approvals

19. Glossary

Term – Definition

Acceptance Testing – Testing conducted to enable a user/customer to determine whether to accept a software product. Normally performed to validate that the software meets a set of agreed acceptance criteria.

Automated Testing – Testing employing software tools which execute tests without manual intervention.

Beta Testing – Testing of a pre-release version of a software product, conducted by customers.

Bug – A fault in a program which causes the program to perform in an unintended or unanticipated manner.

Coding – The generation of source code.

Configuration Management – A discipline applying technical and administrative direction and surveillance to: identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements. [IEEE 610]


Debugging – The process of finding and removing the causes of software failures.

Defect – Nonconformance to requirements or functional/program specification.

Deliverable – Any (work) product that must be delivered to someone other than the (work) product's author.

Functional Testing – Testing the features and operational behavior of a product to ensure they correspond to its specifications.

Impact Analysis – The assessment of change to the layers of development documentation, test documentation and components, in order to implement a given change to specified requirements.

Load Test – A test type concerned with measuring the behavior of a component or system with increasing load, e.g. number of parallel users and/or numbers of transactions, to determine what load can be handled by the component or system.

Pass/Fail Criteria – Decision rules used to determine whether a test item (function) or feature has passed or failed a test. [IEEE 829]

Performance Testing – The process of testing to determine the performance of a software product.

Risk Analysis – The process of assessing identified risks to estimate their impact and probability of occurrence (likelihood).

Security Testing – Testing which confirms that the program can restrict access to authorized personnel and that the authorized personnel can access the functions available to their security level.

Severity – The degree of impact that a defect has on the development or operation of a component or system. [After IEEE 610]

Software Requirements Specification – A deliverable that describes all data, functional and behavioral requirements, all constraints, and all validation requirements for software.

Software Testing – A set of activities conducted with the intent of finding errors in software.

Testing – The process of exercising software to verify that it satisfies specified requirements and to detect errors.


Test Approach – The implementation of the test strategy for a specific project. It typically includes the decisions made based on the (test) project's goal and the risk assessment carried out, starting points regarding the test process, the test design techniques to be applied, exit criteria, and test types to be performed.

Test Case – A set of inputs, execution preconditions, and expected outcomes developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement.

Test Environment – The hardware and software environment in which tests will be run, and any other software with which the software under test interacts when under test, including stubs and test drivers.

Test Item – The individual element to be tested. There usually is one test object and many test items. See also test object.

Test Plan – A document describing the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning. Ref. IEEE Std 829.

Test Tools – Computer programs used in the testing of a system, a component of the system, or its documentation.

Tester – A technically skilled professional who is involved in the testing of a component or system.

Use Case – The specification of tests that are conducted from the end-user perspective. Use cases tend to focus on operating software as an end-user would conduct their day-to-day activities.

Unit Testing – Testing of individual software components.

Table 6: Glossary


References

1. http://www.aptest.com/glossary.html, accessed on 3rd December, 2013
2. http://www.softwaretestinghelp.com/software-testing-terms-complete-glossary/, accessed on 3rd December, 2013
3. http://medical.nema.org/dicom/Geninfo/GUIDELIN/TPMV1L3.HTM, accessed on 2nd December, 2013
4. BDonline Release 1.0 MASTER TEST PLAN, accessed on 1st December, 2013
5. TEST PLAN OUTLINE (IEEE 829 FORMAT), accessed on 21st October, 2013
6. MASTER TEST PLAN on Reassigned sales re-write project, accessed on 1st December, 2013
7. http://en.wikipedia.org