
Last Updated: April 09, 2015

Department of Computer Science and Engineering

The University of Texas at Arlington

Aegle Outreach Storage System

Team Members:

Joseph Finnegan

Karla Hernandez

Joe Martinez

Carlos Torres


System Test Plan OSS


Document Revision History

Revision Number | Revision Date | Description | Rationale
0.1 | 03/27/2015 | STP First Draft | Initial draft of System Test Plan
0.2 | 04/01/2015 | STP Modifications | Modifications to System’s Architecture
1.0 | 04/09/2015 | STP Baseline | Implemented peer review feedback.


Table of Contents

1. Introduction .............................................................................................................................. 8

1.1. Document Overview ........................................................................................................ 8

1.2. System Test Plan Scope ................................................................................................... 8

1.3. Product Concept ............................................................................................................... 8

1.4. Testing Scope ................................................................................................................... 9

2. Test References ...................................................................................................................... 11

2.1. System Requirements Specification (SRS) .................................................................... 11

2.2. Architectural Design Specification (ADS) ..................................................................... 17

2.3. Detailed Design Specification ........................................................................................ 22

3. Test Items ............................................................................................................................... 28

3.1. Hardware Tests ............................................................................................................... 29

3.2. Unit Tests ....................................................................................................................... 29

3.3. Component Tests ............................................................................................................ 31

3.4. Integration Tests ............................................................................................................. 33

3.5. System Validation .......................................................................................................... 33

4. Risk ........................................................................................................................................ 35

4.1. Risk Overview ................................................................................................................ 35

4.2. Risk Table ...................................................................................................................... 35

5. Features Not To Be Tested .................................................................................................... 36

5.1. Customer Requirements ................................................................................................. 36

5.2. Packaging Requirements ................................................................................................ 36

5.3. Performance Requirements ............................................................................................ 36

5.4. Safety Requirements ...................................................................................................... 36

5.4.1. Signal Interference ...................................................................................................... 36

5.5. Maintenance and Support Requirements........................................................................ 36

5.5.1. Troubleshooting Guide ............................................................................................... 36

5.5.2. Database Interchangeability ....................................................................................... 37

5.6. Other Requirements........................................................................................................ 37


6. Features to Be Tested ............................................................................................................. 38

6.1. Customer Requirements ................................................................................................. 38

6.1.1. Keep Track of Items and Crates by the System .......................................................... 38

6.1.2. System Description of Items ....................................................................................... 38

6.1.3. System Description of Crates ..................................................................................... 38

6.1.4. Search Functionality for Items, Crates, and Projects.................................................. 38

6.1.5. Locating Item Inside a Crate ...................................................................................... 39

6.1.6. Item Management by the Administrators ................................................................... 39

6.1.7. Crate Management by the Administrators .................................................................. 39

6.1.8. Project Management by Administrators ..................................................................... 39

6.1.9. System Interaction by Administrators ........................................................................ 39

6.1.10. System Interaction by Registered Users ................................................................. 40

6.1.11. Registration/Login System...................................................................................... 40

6.1.12. Web-Based Accessible Application ........................................................................ 40

6.1.13. Locating Crate Inside the Storage Room ................................................................ 40

6.2. Packaging Requirements ................................................................................................ 40

6.2.1. Included Hardware Components ................................................................................ 40

6.2.2. Installation Manual ..................................................................................................... 41

6.2.3. Range of the RFID Reader Integrated Antenna .......................................................... 41

6.2.4. Software Components................................................................................................. 41

6.3. Performance Requirements ............................................................................................ 41

6.3.1. Check-in/Check-out Latency ...................................................................................... 41

6.3.2. Web Interface Response Latency ............................................................................... 42

6.4. Safety Requirements ...................................................................................................... 42

6.4.1. Electrical Hazard ........................................................................................................ 42

6.5. Maintenance and Support Requirements........................................................................ 42

6.5.1. User Manual ............................................................................................................... 42

6.5.2. Source Code Availability & Documentation .............................................................. 42

6.6. Other Requirements........................................................................................................ 42

6.6.1. Information Security ................................................................................................... 42

7. Test Strategy .......................................................................................................................... 44


7.1. Test Phases ..................................................................................................................... 44

7.2. Test Metrics .................................................................................................................... 45

7.3. Test Tools ....................................................................................................................... 45

8. Item Pass/Fail Criteria............................................................................................................ 46

8.1. Hardware Tests ............................................................................................................... 46

8.2. Unit Tests ....................................................................................................................... 46

8.3. Component Tests ............................................................................................................ 48

8.4. Integration Tests ............................................................................................................. 49

8.5. System Validation Tests ................................................................................................. 49

9. Test Deliverables ................................................................................................................... 50

9.1. System Test Plan ............................................................................................................ 50

9.2. Test Cases ....................................................................................................................... 50

9.3. Test Results .................................................................................................................... 50

9.4. Bug Reports .................................................................................................................... 51

10. Test Schedule ......................................................................................................................... 52


List of Figures

Figure 1-1 Conceptual Design Diagram ....................................................................................... 10

Figure 2-1 High Level Layer Diagram ......................................................................................... 17

Figure 2-2 Architectural Design Diagram .................................................................................... 19

Figure 2-3 Detailed Design Diagram ................................................................................ 22

Figure 3-1 Testing Relational Diagram ........................................................................................ 28


List of Tables

Table 2-1 Requirements Mapping Table ...................................................................................... 20

Table 2-2 Presentation Layer Requirements Mapping ................................................................. 23

Table 2-3 Hardware Layer Requirements Mapping ..................................................................... 24

Table 2-4 Processing Layer Requirements Mapping .................................................................... 25

Table 2-5 Processing Layer Requirements Mapping (side by side continuation) ........................ 26

Table 2-6 Data Storage Layer Requirements Mapping ................................................................ 27

Table 3-1 Hardware Test Items ..................................................................................................... 29

Table 3-3 Presentation Layer Component Test Items ................................................................... 31

Table 3-4 Processing Layer Component Test Items ..................................................................... 32

Table 3-5 Data Storage Layer Component Test Items .................................................................. 32

Table 3-6 Integration Tests Items ................................................................................................. 33

Table 3-7 System Validation Test Items ....................................................................................... 34

Table 4-1 Risk Table ..................................................................................................................... 35

Table 7-1 Test Metrics Table ........................................................................................................ 45

Table 8-1 Hardware Pass/Fail Criteria .......................................................................................... 46

Table 8-2 Unit Test Pass/Fail Criteria .......................................................................................... 47

Table 10-1 Test Schedule Table ................................................................................................... 52


1. Introduction

1.1. Document Overview

This document specifies the testing procedures that Team Aegle will follow to ensure delivery of a quality product that fulfills all acceptance criteria specified in previous documentation. It goes into detail concerning the hardware, unit, component, integration, and system validation testing for the Outreach Storage System (OSS).

1.2. System Test Plan Scope

OSS will be developed as a prototype composed of an RFID reader configured to comply with the requirements stated in the SRS document. Both the RFID reader’s range and its mounting will be adapted to the testing environment. The system also includes a software application, which will be verified on testing computers that meet the minimum hardware specifications required to run the software application.

1.3. Product Concept

Managing product inventory is a problem that has been with human societies since the merchants of ancient Sumer. In the past this required meticulous record keeping and regular auditing to ensure accurate inventory records. OSS seeks to automate these processes and add new tasks specific to our customer’s needs.

This system will use RFID technology to automate previously manual tasks. Inventory records will be updated as items are added to or removed from storage crates, and their location within the storeroom will be recorded. OSS will also organize inventory into user-defined projects, such as constructing a self-supporting tower made of dry spaghetti noodles, or categories, such as tools or crafts.

The goal of OSS will be to save the user the time and effort needed to find an item within a storeroom and to more effectively manage resources. By keeping accurate inventory counts, existing product can be used more efficiently and waste can be reduced.


1.4. Testing Scope

The testing scope for OSS will be limited to ensuring that the functionality of the system performs as stated in the System Requirements Specification (SRS) described later in this document. Testing will be performed in the Senior Design lab space allocated to Team Aegle for the development of OSS.

Hardware tests will be done to verify the performance of system hardware such as the RFID Reader and the system server. Afterwards, the software portions of OSS will be tested in successive phases to confirm proper functionality.

Unit tests will be performed on the identified modules, which represent the smallest logical portions of OSS. Next, component tests will be performed on the identified subsystems, which comprise each major architectural layer and represent the next logical layer of system functionality. Finally, integration tests will be performed to verify that the system functions as a whole by ensuring that inter-layer interfaces perform as expected.

At the successful completion of all testing phases, OSS shall be able to perform all customer requirements as specified in the SRS.
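As an illustration of the unit-test phase described above, a single module could be exercised in isolation along these lines. The module, its function, and the record format below are hypothetical stand-ins for this sketch, not taken from the detailed design:

```python
import unittest

# Hypothetical smallest logical unit: a formatter that turns a raw RFID
# read into the record shape the Processing Layer expects.
def format_tag_read(raw):
    tag_id, direction = raw.split(",")
    return {"tag_id": tag_id.strip(), "direction": direction.strip().lower()}

class FormatTagReadTest(unittest.TestCase):
    def test_parses_tag_and_direction(self):
        rec = format_tag_read("E200-001 , IN")
        self.assertEqual(rec, {"tag_id": "E200-001", "direction": "in"})

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=0)
```

Component and integration tests would follow the same pattern, exercising progressively larger groupings of modules.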


Figure 1-1 Conceptual Design Diagram


2. Test References

The System Test Plan (STP) describes how Team Aegle will test the system, what the testing is based on, and the criteria that will determine the outcome of each test. The STP is based on the System Requirements Specification (SRS), the Architectural Design Specification (ADS), and the Detailed Design Specification (DDS).

2.1. System Requirements Specification (SRS)

The SRS is the first document created by Team Aegle with the help of the sponsor, Dr. Tiernan. It specifies what the system is, what the system shall be able to do, the purpose of the system, and the requirements of the system. The test plan shall check whether the system meets the requirements specified in the SRS.

2.1.1. Customer Requirements

2.1.1.1. Keep Track of Items and Crates by the System

Description: The system shall be able to keep track of the items’ status (in stock or out of stock) and keep track of the crates’ status.

Priority: 1 – Critical.

2.1.1.2. System Description of Items

Description: The system shall be able to provide a description of the item to the administrators and registered users. The description shall provide the item data fields stated previously in the definitions section.

Priority: 1 – Critical.

2.1.1.3. System Description of Crates

Description: The system shall be able to provide a description of the crate to the administrators and registered users. The description will provide the crate data fields stated previously in the definitions section.

Priority: 1 – Critical.

2.1.1.4. Search Function for Items, Crates, and Projects

Description: The system shall be able to search the database for items, crates, and projects by the multiple data fields of the items and crates described in the definitions section, and by general word search.

Priority: 1 – Critical.

2.1.1.5. Locating Item inside a Crate

Description: The system shall be able to locate in which crate an item is located.

Priority: 1 – Critical.

2.1.1.6. Item Management by the Administrators

Description: The administrators shall be able to add, remove, and delete an item in the system.

Priority: 1 – Critical.

2.1.1.7. Crate Management by the Administrators

Description: The administrators shall be able to register a new crate into the system, as well as delete and edit crates already existing in the system.

Priority: 2 – High.

2.1.1.8. Project Management by Administrators

Description: The administrators shall be able to create, edit, and delete projects in the system.

Priority: 3 – Moderate.

2.1.1.9. System Interaction by Administrators

Description: The administrator shall be able to look at the items and crates in the inventory, look at the different projects, and shall be able to perform any functionality specified in other requirements. The administrator shall be able to approve or deny a registered user’s request to check out an item(s) or crate(s) from the inventory. Also, the administrator shall be able to check out an item(s) or crate(s) from inventory and approve or deny a user registration request.

Priority: 1 - Critical.

2.1.1.10. System Interaction by Registered Users

Description: Registered users shall be able to look at the items in inventory, and shall be able to perform any functionality specified in other requirements. Registered users shall be able to look at the different projects, and submit a request to the administrator for permission to take an item(s) or crate(s) from inventory.

Priority: 2 – High.

2.1.1.11. Registration/Login System

Description: The system shall have a registration and login system, which will be the only way to access the database and system functionalities. The registration requirements will be an email account, first name, last name, date of birth, phone number, organization, and password. Users shall be able to log in to the system with their email account and password after the administrator has approved their account.

Priority: 1 – Critical.
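For illustration, the field list in this requirement could be checked by a unit test along these lines. The validator function and the code-level field names are assumptions made for this sketch, not part of the system design:

```python
# Required registration fields, taken from the requirement above.
# The snake_case names are an illustrative assumption.
REQUIRED_FIELDS = {
    "email", "first_name", "last_name", "date_of_birth",
    "phone_number", "organization", "password",
}

def missing_fields(request):
    # Return the required fields that are absent or empty in the request.
    return REQUIRED_FIELDS - {k for k, v in request.items() if v}

# Usage: a complete request passes; one with a blank field is rejected.
complete = {f: "x" for f in REQUIRED_FIELDS}
partial = dict(complete, phone_number="")
print(missing_fields(complete))  # set() -> registration may proceed
print(missing_fields(partial))   # {'phone_number'} -> reject with a message
```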

2.1.1.12. Web-Based Accessible Application

Description: The system shall be implemented as a web-based application.

Priority: 1 – Critical.

2.1.1.13. Locating Crate inside the Storage Room

Description: The system shall be able to provide a relative location for a crate inside the storage room.

Priority: 4 – Low.

2.1.2. Packaging Requirements

2.1.2.1. Included Hardware Components

Description: The final product shall include the following components: passive RFID tags for the items and an RFID reader.

Priority: 1 – Critical.

2.1.2.2. Installation Manual

Description: The final product shall provide an installation manual that includes detailed instructions on how to install, set up, and use the system.

Priority: 3 – Moderate.


2.1.2.3. Range of the RFID Reader Integrated Antenna

Description: The RFID reader antenna range shall be able to cover the door height.

Priority: 2 – High.

2.1.2.4. Software Components

Description: The final product shall include the following software component: the source code, which shall be delivered on a USB flash drive.

Priority: 1 – Critical.

2.1.2.5. RFID Reader Installation

Description: The RFID reader shall be mounted above the door enclosed in a frame. A technician shall not be required to install the reader.

Priority: 2 – High.

2.1.2.6. Software Installation

Description: The software shall be delivered with an installation script that guides the user through the necessary steps to complete the installation.

Priority: 4 – Low.

2.1.3. Performance Requirements

2.1.3.1. Check-in/Check-out Latency

Description: The amount of time that it takes for the system to recognize that an item has entered or left the storage room should not exceed 1 minute.

Priority: 3 – Moderate.
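During the performance test phase, this latency bound could be measured with a simple timing harness such as the sketch below. `wait_for_inventory_update` is a hypothetical placeholder for whatever hook the real test setup uses to detect the inventory change:

```python
import time

# Sketch of a pass/fail latency check for this requirement: the time
# between a tag passing the door and the inventory update must not
# exceed 60 seconds.
LATENCY_LIMIT_S = 60.0

def measure_checkin_latency(wait_for_inventory_update):
    start = time.monotonic()
    wait_for_inventory_update()  # blocks until the database reflects the tag
    return time.monotonic() - start

def latency_test_passes(wait_for_inventory_update):
    return measure_checkin_latency(wait_for_inventory_update) <= LATENCY_LIMIT_S

# Usage with a stub that "updates" instantly:
print(latency_test_passes(lambda: None))  # True
```

The same harness, with a 10-second limit, would also cover the web interface response requirement below.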

2.1.3.2. Web Interface Response Latency

Description: The amount of time that it takes for the web application to return meaningful information to the user shall not exceed 10 seconds.

Priority: 3 – Moderate.


2.1.4. Safety Requirements

2.1.4.1. Electrical Hazard

Description: The system and its components present within the storage room shall not pose an electrical hazard to its users or the building it resides in.

Priority: 1 – Critical.

2.1.4.2. Signal Interference

Description: The RFID system shall not interfere with any critical radio frequency transmissions.

Priority: 3 – Moderate.

2.1.5. Maintenance and Support Requirements

2.1.5.1. User Manual

Description: The final product shall come with a user manual describing in detail how to set up the system and use its various features.

Priority: 3 – Moderate.

2.1.5.2. Troubleshooting Guide

Description: The final product shall include a troubleshooting guide to help solve general problems the user may have, and to assist them in determining whether the problem needs to be solved by a third party.

Priority: 4 – Low.

2.1.5.3. Source Code Availability & Documentation

Description: The final product shall include all the source code and documentation used to design and implement the system. The source code will be well structured and commented so as to allow for future modularity and support.

Priority: 3 – Moderate.


2.1.6. Other Requirements

2.1.6.1. Information Security

Description: The system shall ensure the privacy and security of personal information being stored and transmitted by users of the system through the use of secure connectivity and secure programming techniques.

Priority: 3 – Moderate.

2.1.6.2. Connectivity Tolerance

Description: The system shall be tolerant of a loss of communication with the web server. During loss of connectivity to the server, transactions shall be stored in the buffer of a user’s device. Transactions shall be processed when connectivity to the server is reestablished.

Priority: 2 – High.
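The buffer-and-replay behavior this requirement describes can be sketched as follows; the transaction format and class names are illustrative assumptions, not taken from the design documents:

```python
from collections import deque

# Sketch: transactions are buffered on the user's device while the web
# server is unreachable, then replayed once connectivity returns.
class TransactionBuffer:
    def __init__(self, send):
        self.send = send        # callable that posts one transaction
        self.pending = deque()  # local buffer used while offline

    def submit(self, txn, online):
        if online:
            self.send(txn)
        else:
            self.pending.append(txn)

    def reconnect(self):
        # Replay buffered transactions in arrival order.
        while self.pending:
            self.send(self.pending.popleft())

# Usage: simulate an outage, then reconnection.
delivered = []
buf = TransactionBuffer(delivered.append)
buf.submit({"item": "glue", "action": "check-out"}, online=False)
buf.submit({"item": "tape", "action": "check-in"}, online=False)
buf.reconnect()
print(delivered)  # both transactions processed after reconnection
```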


2.2. Architectural Design Specification (ADS)

The ADS provides a high-level description of the system, dividing it into layers that interact with each other; each layer is composed of subsystems that interact at both the inter-layer and intra-layer levels. The test plan aims to test the overall structure of the system, including its layers and subsystems, to show that the architecture is valid and proper.

2.2.1. Layer Overview

Figure 2-1 High Level Layer Diagram


2.2.1.1. Presentation Layer

The purpose of the Presentation Layer is to accept user and administrator input and display the appropriate output. This layer is the main point of interaction between the system and its users and will present the majority of the system’s functionality. User requests are accepted into this layer, and the data is then converted into the proper format before being sent to the Processing Layer. System output is likewise formatted in this layer into a form that is presentable to a user before being displayed.

2.2.1.2. Hardware Layer

The purpose of the Hardware Layer is to continuously read RFID tags and forward each tag’s information to the Processing Layer. This layer consists of an RFID integrated reader and antenna that will be set up to constantly search for RFID tags in the vicinity of the door frame.

2.2.1.3. Processing Layer

The purpose of the Processing Layer is to analyze and process data that it receives from the Presentation Layer, Hardware Layer, and Data Storage Layer. The Processing Layer processes incoming data or user queries and sends system output data to the Presentation Layer. This layer also interacts with the Data Storage Layer for the purpose of both storing and retrieving data as necessary.

2.2.1.4. Data Storage Layer

The purpose of the Data Storage Layer is to read and write the data received from the Processing Layer. This layer contains the Database Controller Subsystem that is responsible for generating the database queries necessary for storing and retrieving data from the database. This subsystem also formats both incoming and outgoing data to the appropriate format needed by the Processing Layer and database. The SQL Database Management Subsystem (DBMS) interfaces with the database to save or retrieve data through SQL queries.
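As a rough sketch of the Database Controller Subsystem's role (generating parameterized queries and formatting results for the Processing Layer), assuming a hypothetical `items` table that is not taken from the DDS:

```python
import sqlite3

# Minimal sketch of a Database Controller that builds parameterized SQL
# queries for the DBMS. Table and column names are illustrative assumptions.
class DatabaseController:
    def __init__(self, connection):
        self.conn = connection

    def save_item(self, tag_id, name, status):
        # Parameterized query: value escaping is delegated to the driver.
        self.conn.execute(
            "INSERT INTO items (tag_id, name, status) VALUES (?, ?, ?)",
            (tag_id, name, status),
        )
        self.conn.commit()

    def find_item(self, tag_id):
        row = self.conn.execute(
            "SELECT name, status FROM items WHERE tag_id = ?", (tag_id,)
        ).fetchone()
        # Format outgoing data for the Processing Layer.
        return {"name": row[0], "status": row[1]} if row else None

# Usage with an in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (tag_id TEXT, name TEXT, status TEXT)")
ctrl = DatabaseController(conn)
ctrl.save_item("E200-001", "Spaghetti pack", "in stock")
print(ctrl.find_item("E200-001"))  # {'name': 'Spaghetti pack', 'status': 'in stock'}
```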


2.2.2. Subsystems Overview

Figure 2-2 Architectural Design Diagram


2.2.3. Requirements Mapping

This section highlights the core system requirements as laid out by the customer and identifies which subsystems fulfill the functionality of each respective requirement. This should give the reader an idea of how the various layers of the system fulfill the desired functionality. An “X” in a column indicates that the corresponding layer helps fulfill the requirement in that row.

Req   | Requirement Name                                 | Presentation / Hardware / Processing / Data Storage
3.1   | Keep Track of Items and Crates by the System     | X X X
3.2   | System Description of Items                      | X X X
3.3   | System Description of Crates                     | X X X
3.4   | Search Function for Items, Crates, and Projects  | X X X
3.5   | Locating Item Inside a Crate                     | X X X
3.6   | Item Management by the Administrators            | X X X
3.7   | Crate Management by the Administrators           | X X X
3.8   | Project Management by Administrators             | X X X
3.9   | System Interaction by Administrators             | X X X
3.10  | System Interaction by Registered Users           | X X
3.11  | Registration/Login System                        | X X X
3.12  | Web-Based Accessible Application                 | X X
3.13  | Locating Crate Inside the Storage Room           | X X X

Table 2-1 Requirements Mapping Table

2.3. Detailed Design Specification

2.3.1. Modules Overview

Figure 2-3 Detail Design Diagram

2.3.2. Requirements Mapping

The requirements mapping section identifies which modules of the detailed design diagram fulfill each requirement.

2.3.2.1. Presentation Layer

Req   | Requirement Name                                 | UI Display | Output Data Formatter | I/O Module | Event Handler
3.1   | Keep Track of Items and Crates by the System     | x | x | x | x
3.2   | System Description of Items                      | x | x | x | x
3.3   | System Description of Crates                     | x | x | x | x
3.4   | Search Function for Items, Crates, and Projects  | x | x | x | x
3.5   | Locating Item Inside a Crate                     | x | x | x | x
3.6   | Item Management by the Administrators            | x | x | x | x
3.7   | Crate Management by the Administrators           | x | x | x | x
3.8   | Project Management by Administrators             | x | x | x | x
3.9   | System Interaction by Administrators             | x | x | x | x
3.10  | System Interaction by Registered Users           | x | x | x | x
3.11  | Registration/Login System                        | x | x | x | x
3.12  | Web-Based Accessible Application                 | x | x | x | x
3.13  | Locating Crate Inside the Storage Room           | x | x | x | x

Table 2-2 Presentation Layer Requirements Mapping

2.3.2.2. Hardware Layer

Req   | Requirement Name                                 | Reader Module | RFID Reader
3.1   | Keep Track of Items and Crates by the System     | x | x
3.2   | System Description of Items                      | - | -
3.3   | System Description of Crates                     | - | -
3.4   | Search Function for Items, Crates, and Projects  | - | -
3.5   | Locating Item Inside a Crate                     | - | -
3.6   | Item Management by the Administrators            | - | -
3.7   | Crate Management by the Administrators           | - | -
3.8   | Project Management by Administrators             | - | -
3.9   | System Interaction by Administrators             | - | -
3.10  | System Interaction by Registered Users           | - | -
3.11  | Registration/Login System                        | - | -
3.12  | Web-Based Accessible Application                 | - | -
3.13  | Locating Crate Inside the Storage Room           | - | -

Table 2-3 Hardware Layer Requirements Mapping

2.3.2.3. Processing Layer

Req   | Requirement Name                                 | Request Module / I/O Controller / User Mgmt
3.1   | Keep Track of Items and Crates by the System     | x
3.2   | System Description of Items                      | x
3.3   | System Description of Crates                     | x
3.4   | Search Function for Items, Crates, and Projects  | x
3.5   | Locating Item Inside a Crate                     | x
3.6   | Item Management by the Administrators            | x x
3.7   | Crate Management by the Administrators           | x x
3.8   | Project Management by Administrators             | x x
3.9   | System Interaction by Administrators             | x x
3.10  | System Interaction by Registered Users           | x x
3.11  | Registration/Login System                        | x x
3.12  | Web-Based Accessible Application                 | -
3.13  | Locating Crate Inside the Storage Room           | x

Table 2-4 Processing Layer Requirements Mapping

Req   | Requirement Name                                 | Input Handler / EPC Handler / Inventory Mgmt / Output Handler / DB Request Handler
3.1   | Keep Track of Items and Crates by the System     | x x x
3.2   | System Description of Items                      | x x x x x
3.3   | System Description of Crates                     | x x x x x
3.4   | Search Function for Items, Crates, and Projects  | x x x x x
3.5   | Locating Item Inside a Crate                     | x x x x x
3.6   | Item Management by the Administrators            | x x x
3.7   | Crate Management by the Administrators           | x x x
3.8   | Project Management by Administrators             | x x x
3.9   | System Interaction by Administrators             | x x x
3.10  | System Interaction by Registered Users           | x x x
3.11  | Registration/Login System                        | x x x
3.12  | Web-Based Accessible Application                 | x x
3.13  | Locating Crate Inside the Storage Room           | x x x x x

Table 2-5 Processing Layer Requirements Mapping (continued)

2.3.2.4. Data Storage Layer

Req   | Requirement Name                                 | I/O Formatter | DB Adapter | SQL Query Generator | SQL DBMS Module
3.1   | Keep Track of Items and Crates by the System     | x | x | x | x
3.2   | System Description of Items                      | x | x | x | x
3.3   | System Description of Crates                     | x | x | x | x
3.4   | Search Function for Items, Crates, and Projects  | x | x | x | x
3.5   | Locating Item Inside a Crate                     | x | x | x | x
3.6   | Item Management by the Administrators            | - | - | - | -
3.7   | Crate Management by the Administrators           | - | - | - | -
3.8   | Project Management by Administrators             | - | - | - | -
3.9   | System Interaction by Administrators             | - | - | - | -
3.10  | System Interaction by Registered Users           | - | - | - | -
3.11  | Registration/Login System                        | x | x | x | x
3.12  | Web-Based Accessible Application                 | - | - | - | -
3.13  | Locating Crate Inside the Storage Room           | x | x | x | x

Table 2-6 Data Storage Layer Requirements Mapping

3. Test Items

The following section describes the test items that will be tested in OSS. In order to fully test the system, the testing process will include several phases: hardware tests, unit tests, integration tests, component tests, and system validation tests. A testing relational diagram depicts the items being tested in each phase, and the tables describe the inputs, expected outputs, test description, and priority of each item tested as part of the system.

Figure 3-1 Testing Relational Diagram

3.1. Hardware Tests

HW1  RFID Reader with Integrated Antenna
     Input: Radio Frequency data transmitted by an RFID passive tag
     Expected Output: EPC Raw Data
     Test: This item will be tested by verifying that the reader is able to send data through a TCP port connection.
     Priority: Critical

HW2  RFID passive tag
     Input: None
     Expected Output: Radio Frequency Data
     Test: This item will be tested by verifying that the RFID passive tag is transmitting data to the reader through radio frequency waves.
     Priority: Critical

Table 3-1 Hardware Test Items
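The TCP check in HW1 can be sketched as follows (Python; the EPC payload format and the loopback setup are illustrative assumptions, not the reader's actual protocol):

```python
import socket
import threading

def fake_reader(server_sock):
    """Stand-in for the RFID reader: accept one connection and push raw EPC data."""
    conn, _ = server_sock.accept()
    conn.sendall(b"EPC:E2000001,TS:1428566400\n")  # hypothetical payload format
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port, just for the sketch
server.listen(1)
threading.Thread(target=fake_reader, args=(server,), daemon=True).start()

# The HW1 check: data arrives over the TCP connection and carries an EPC field.
client = socket.socket()
client.connect(server.getsockname())
data = client.recv(1024)
client.close()
assert data.startswith(b"EPC:")
```

Against the real hardware, the fake reader would be replaced by the reader's own address and port, with the same receive-and-inspect check on the client side.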

3.2. Unit Tests

UT1   UI Display
      Input: User actions when interacting with the web application
      Expected Output: JavaScript object
      Test: This module will be validated by interacting with the web application and verifying that a JavaScript object is created with the information entered.
      Priority: Critical

UT2   Output Data Formatter
      Input: JSON object
      Expected Output: Formatted strings
      Test: The Output Data Formatter should format the JSON object as expected.
      Priority: High

UT3   I/O Module
      Input: HTTP Request
      Expected Output: HTTP Response
      Test: An HTTP request will be created and its format will be verified.
      Priority: Critical

UT4   Event Handler
      Input: JavaScript Objects
      Expected Output: Event Object
      Test: A JavaScript Object will be created and the format of the Event Object produced will be verified.
      Priority: Moderate

UT5   Reader Module
      Input: Raw Tag Data
      Expected Output: Formatted Tag Data
      Test: When reading an RFID passive tag, the reader should output the formatted tag data, which will be validated.
      Priority: Critical

UT7   Request Module
      Input: HTTP Request
      Expected Output: C# object
      Test: The Request Module will receive an HTTP request, and the type of C# object created will be verified.
      Priority: Critical

UT9   User Management
      Input: User Object
      Expected Output: Database Request Object
      Test: Two User Objects will be provided as input to User Management, one for registration and one for login, and the Database Request Object produced will be verified.
      Priority: Critical

UT10  Inventory Management
      Input: Inventory Object List
      Expected Output: Database Request Object
      Test: An Inventory Object List will be provided as input to the Inventory Management module, and the format of the Database Request Object produced will be verified.
      Priority: High

UT11  Input Handler
      Input: EPC data and a Request Object
      Expected Output: User Management, Inventory Management, or EPC Object
      Test: This module will be provided with two different inputs, EPC data and a Request Object; the type of object produced for each input will be verified.
      Priority: High

UT12  Output Handler
      Input: Notifications, Search Results, and lists of EPC objects
      Expected Output: Response Object
      Test: This module will be provided with three inputs: notifications, search results, and lists of EPC objects; the Response Object produced will be verified.
      Priority: Critical

UT13  DB Request Handler
      Input: User Management, Inventory Management, EPC Object
      Expected Output: Database Request Object
      Test: This module will be provided with three different inputs: User Management, Inventory Management, and EPC Object. The Database Request Object produced will be verified.
      Priority: Critical

UT14  EPC Handler
      Input: EPC Object
      Expected Output: Database Request Object
      Test: An EPC Object will be received, and the format of the Database Request Object produced will be validated.
      Priority: High

UT15  I/O Formatter
      Input: Database Request Object
      Expected Output: Database Query Object, User Object, Inventory Object
      Test: A Database Request Object will be provided to the I/O Formatter module, and the Database Query Object produced will be verified. A second test will be performed to ensure the correct response object is returned to the Processing Layer.
      Priority: Critical

UT16  SQL Manager
      Input: Database Query Object
      Expected Output: SQL Query Commands
      Test: A Database Query Object will be provided to the SQL Manager module; the SQL query produced will be verified, along with the query results.
      Priority: Critical

UT17  SQL DBMS Module
      Input: SQL Query
      Expected Output: String Array
      Test: The SQL DBMS Module can only be tested through integration testing.
      Priority: Critical

Table 3-2 Unit Test Items
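A unit test in the style of UT2 might be sketched as follows (Python stands in for the project's actual implementation language and test framework; the item field names are hypothetical):

```python
import json

def format_output(json_text):
    """Output Data Formatter sketch: turn a JSON object into display strings."""
    obj = json.loads(json_text)
    return [f"{key}: {obj[key]}" for key in sorted(obj)]

# UT2: the formatter should format the JSON object as expected.
result = format_output('{"name": "Tent", "status": "In Stock"}')
assert result == ["name: Tent", "status: In Stock"]
```

Each unit test pairs a known input with the exact output the module must produce, which is the pattern the table above applies to every module.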

3.3. Component Tests

3.3.1. Hardware Layer

The RFID Reader can only be tested through integration testing.

3.3.2. Presentation Layer

CPG1   GUI
       Input: User actions and input data (strings)
       Expected Output: JavaScript Object
       Test: The GUI will be tested by verifying that the correct JavaScript object was created on user interaction.
       Priority: High

CPIO1  I/O Controller
       Input: JavaScript Object
       Expected Output: HTTP Request, Strings
       Test: The I/O Controller will be tested by verifying that the correct HTTP request has been created. A second test will be performed to ensure that the correct strings are sent back to the GUI subsystem.
       Priority: Critical

Table 3-3 Presentation Layer Component Test Items

3.3.3. Processing Layer

CPRH1  Request Handler
       Input: HTTP Request
       Expected Output: C# object
       Test: The Request Handler will be tested by verifying that the correct C# object was created on user interaction.
       Priority: High

CPHC1  Hardware Controller
       Input: JavaScript Object
       Expected Output: HTTP Request, Strings
       Test: The Hardware Controller will be tested by verifying that the correct HTTP request has been created. A second test will be performed to ensure that the correct strings are sent back to the GUI subsystem.
       Priority: Critical

CPOS1  OSS Application
       Input: C# object
       Expected Output: Response Object, Database Request Object
       Test: The OSS Application will be tested by providing a C# object and verifying the contents of the Response Object produced. A second test will be created to ensure that an action requiring database access successfully creates a Database Request Object.
       Priority: Critical

Table 3-4 Processing Layer Component Test Items

3.3.4. Data Storage Layer

CPRH1  DB Controller
       Input: HTTP Request
       Expected Output: C# object
       Test: The DB Controller will be tested by verifying that the correct C# object was created on user interaction.
       Priority: High

CPHC1  SQL DBMS
       Input: SQL Query
       Expected Output: String Array
       Test: The SQL DBMS subsystem can only be tested through integration testing.
       Priority: Critical

Table 3-5 Data Storage Layer Component Test Items

3.4. Integration Tests

IP1   Presentation Layer
      Input: User actions and input data (strings)
      Expected Output: HTTP Request, Strings
      Test: The Presentation Layer will be tested by verifying that the correct HTTP Requests are created and sent to the Processing Layer on user interaction. A second test will be performed to ensure that the Presentation Layer displays HTTP Response information coming from the Processing Layer.
      Priority: Critical

IH1   Hardware Layer
      Input: Raw Tag Data
      Expected Output: EPC and Timestamp
      Test: The Hardware Layer will be tested by scanning an RFID passive tag and verifying that the data is sent to the Processing Layer.
      Priority: Critical

IPS1  Processing Layer
      Input: HTTP Request, EPC and Timestamp, User Object, Inventory Object
      Expected Output: HTTP Response, Database Request Object
      Test: The Processing Layer will be tested by sending HTTP Requests and verifying that the correct objects (User Object, Inventory Object) were created. A second test will be created to ensure that incoming EPC data creates the valid Request Object.
      Priority: Critical

IDS1  Data Storage Layer
      Input: Database Request Object
      Expected Output: User Object / Inventory Object
      Test: The Data Storage Layer will be tested by providing a Database Request Object and verifying that it returns the correct output data.
      Priority: Critical

Table 3-6 Integration Test Items
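An IP1-style round trip can be sketched with a stub server (Python's http.server stands in for the Processing Layer; the /search endpoint and the JSON fields are hypothetical):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class StubProcessingLayer(BaseHTTPRequestHandler):
    """Stand-in for the Processing Layer: answer any GET with a JSON body."""
    def do_GET(self):
        body = json.dumps({"item": "Tent", "status": "In Stock"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):  # keep the sketch quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StubProcessingLayer)
threading.Thread(target=server.serve_forever, daemon=True).start()

# IP1-style check: the request goes out and the response parses as expected.
url = f"http://127.0.0.1:{server.server_port}/search?item=Tent"
response = json.load(urlopen(url))
server.shutdown()
assert response["status"] == "In Stock"
```

The real integration test replaces the stub with the deployed Processing Layer but keeps the same send-request, parse-response, assert-contents structure.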

3.5. System Validation

V1  User login
    Input: User logs in to the system
    Expected Output: User access to the system
    Test: The test will verify that an already approved user is able to log in to the system.
    Priority: Moderate

V2  User Registration
    Input: User registers in the system
    Expected Output: Admin receives a list of approved users
    Test: The test will verify that on user registration, the Admin user is able to see a list of approved users.
    Priority: Moderate

V3  Search Item
    Input: User performs a search on an item
    Expected Output: User sees the item's information
    Test: The test will verify that when a user searches for an item, the details of the item are displayed.
    Priority: High

V4  Item/crate check-out
    Input: User removes an item/crate from the storage room
    Expected Output: Item/crate state is "checked out"
    Test: The test will verify that the item's status is marked as Checked Out in the database.
    Priority: Critical

V5  Item/crate check-in
    Input: User checks in an item/crate in the storage room
    Expected Output: Item/crate state is "in stock"
    Test: The test will verify that the item's status is marked as Checked In in the database.
    Priority: Critical

V6  Admin Interaction
    Input: Administrator interacts with the system
    Expected Output: The system allows the admin to perform the authorized tasks
    Test: The test will verify that the Admin is allowed to interact with the system through a series of different tasks.
    Priority: High

V7  User Interaction
    Input: User interacts with the system
    Expected Output: The system allows the user to perform the authorized tasks
    Test: The test will verify that the user is allowed to interact with the system through a series of different tasks.
    Priority: Moderate

V8  Item/Crate Registration
    Input: EPC Code
    Expected Output: Inventory item is added to the database
    Test: An RFID tag previously unknown to the system will be passed through the RFID reader's reading field.
    Priority: Critical

Table 3-7 System Validation Test Items
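The status transitions exercised by V4 and V5 can be sketched as follows (illustrative Python; in the real system a tag read at the door updates the item's status in the database):

```python
def apply_tag_read(inventory, epc):
    """Toggle an item's status when its tag passes the door reader (V4/V5 sketch)."""
    if inventory.get(epc) == "In Stock":
        inventory[epc] = "Checked Out"  # item left the storage room (V4)
    else:
        inventory[epc] = "In Stock"     # item returned to the storage room (V5)
    return inventory[epc]

inventory = {"E2000001": "In Stock"}
assert apply_tag_read(inventory, "E2000001") == "Checked Out"  # V4
assert apply_tag_read(inventory, "E2000001") == "In Stock"     # V5
```

The validation tests then confirm that the database, not just an in-memory state, reflects these transitions after each physical pass through the door.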

4. Risk

4.1. Risk Overview

This section details the risks that the team may encounter during the testing of OSS. The following table highlights the identified risks, their potential impact, their severity, and the mitigation plans.

4.2. Risk Table

R1  RFID Reader failure
    Impact: Hardware Layer tests will fail. Several module and subsystem tests in the Processing Layer will fail.
    Severity: High
    Mitigation Strategy: Ensure that the RFID Reader hardware works upon arrival. Store all hardware carefully when not in use.

R2  Tag reading failure
    Impact: Data Layer tests will fail or give erroneous results.
    Severity: High
    Mitigation Strategy: Position the reader properly and remove any interfering material.

R3  Software bugs
    Impact: Tests will fail or give erroneous results.
    Severity: Low
    Mitigation Strategy: Follow good coding practices. Run regression tests for every new code change.

R4  Server connection failure
    Impact: Presentation Layer tests will fail.
    Severity: Medium
    Mitigation Strategy: If an internal domain cannot be secured, a custom router-based system will be used.

Table 4-1 Risk Table

5. Features Not To Be Tested

The following section describes the features that will not be tested, along with the reasoning behind each decision.

5.1. Customer Requirements

All of these requirements will be tested.

5.2. Packaging Requirements

All of these requirements will be tested.

5.3. Performance Requirements

All of these requirements will be tested.

5.4. Safety Requirements

5.4.1. Signal Interference

Description: The RFID system shall not interfere with any critical radio frequency transmission.

Reasoning: Signal interference will not be tested because we do not have access to the tools or expertise necessary to check whether interference is occurring. We are operating under the assumption that the manufacturer of the RFID reader has accounted for signal interference.

5.5. Maintenance and Support Requirements

5.5.1. Troubleshooting Guide

Description: The final product shall include a troubleshooting guide to help solve general problems the user may have, and to assist them in determining whether the problem needs to be solved by a third party.

Reasoning: The short development schedule prevents us from properly testing the troubleshooting guide. However, we will do everything we can to ensure that appropriate error messages are shown to the user when a problem arises. These error messages should suffice to identify a problem, and what to do about it, without the help of the troubleshooting guide.

5.5.2. Database Interchangeability

Description: The final product shall be adaptable to different database frameworks such as SQL, Oracle, etc.

Reasoning: Since the development schedule does not allow for the setup of a secondary database, we will not be testing this feature.

5.6. Other Requirements

All of these requirements will be tested.

6. Features to Be Tested

The following section describes all the features listed in the System Requirements Specification document that the development team will test.

6.1. Customer Requirements

6.1.1. Keep Track of Items and Crates by the System

Description: The system shall be able to keep track of the items’ status (in stock or

out of stock) and keep track of crates’ status.

Approach: A small set of items will be passed by the RFID reader at random and the

web interface will be logged into the web interface to check and see if appropriate

inventory levels are displayed afterwards.

6.1.2. System Description of Items

Description: The system shall be able to provide a description of the item to the

administrators and registered users. The description shall provide the item data fields

stated previously on definitions section.

Approach: Users of all account types will be logged into the web interface to verify

whether the appropriate item descriptions are available as expected.

6.1.3. System Description of Crates

Description: The system shall be able to provide a description of the crate to the

administrators and registered users. The description will provide crate data fields

stated previously on definitions section.

Approach: Users of all account types will be logged into the web interface to verify

whether the appropriate item descriptions are available as expected.

6.1.4. Search Functionality for Items, Crates, and Projects

Description: The system shall be able to search the database for items, crates, and projects by the multiple data fields of the items and crates described in the definitions section, and by general word search.

Approach: A user account will be used to log into the system's web interface to check whether items can be searched for, and whether the search functionality displays items as expected, with the appropriate attributes.

6.1.5. Locating Item Inside a Crate

Description: The system shall be able to locate in which crate an item is located.

Approach: A user account will be logged into the web interface to check whether an item is listed as contained within a crate.

6.1.6. Item Management by the Administrators

Description: The administrators shall be able to add, remove, and delete all items in the system.

Approach: An administrator will be logged into the web interface to verify whether items of all types can be added, removed, and deleted using the administrator privileges.

6.1.7. Crate Management by the Administrators

Description: The administrators shall be able to register a new crate into the system, as well as delete and edit crates already existing in the system.

Approach: An administrator will be logged into the web interface to check whether crate data can be added, edited, and deleted using administrator privileges.

6.1.8. Project Management by Administrators

Description: The administrators shall be able to create, edit, and delete projects in the system.

Approach: An administrator will be logged into the web interface to check whether project data can be added, edited, and deleted using administrator privileges.

6.1.9. System Interaction by Administrators

Description: The administrator shall be able to look at the items and crates in the inventory, look at the different projects, and perform any functionality specified in other requirements. The administrator shall be able to approve or deny a registered user's request to check out items or crates from the inventory. The administrator shall also be able to check out items or crates from inventory and approve or deny a user registration request.

Approach: An administrator will be logged into the web interface to check whether the account is able to view projects, perform all the actions specified in the various requirements, approve or deny a checkout request, check out an item or crate, and moderate registration requests.

6.1.10. System Interaction by Registered Users

Description: Registered users shall be able to look at the items in inventory and perform any functionality specified in other requirements. Registered users shall be able to look at the different projects and submit a request to the administrator for permission to take items or crates from inventory.

Approach: A registered user will be logged into the web interface to check whether they can perform all functionality as specified in other requirements, view the different projects, and submit checkout requests.

6.1.11. Registration/Login System

Description: The system shall have a registration and login system, which will be the only way to access the database and system functionalities. The registration requirements will be an email account, first name, last name, date of birth, phone number, organization, and password. Users shall be able to log in to the system with their email account and password after the administrator has approved their account.

Approach: User accounts of all types will be created and logged into on the web interface to verify that they have the appropriate functionality and permissions.

6.1.12. Web-Based Accessible Application

Description: The system shall be implemented as a web-based application.

Approach: A remote computer will be used to attempt access to the web interface using a specified IP address and port, reaching the web server from a supported web browser.

6.1.13. Locating Crate Inside the Storage Room

Description: The system shall be able to provide a relative location for a crate inside the storage room.

Approach: A generic user account will be logged into the web interface to verify whether crate location attributes are available for viewing on in-stock items.

6.2. Packaging Requirements

6.2.1. Included Hardware Components

Description: The final product shall include the following components: passive RFID tags for the items and an RFID reader.

Approach: The available components will be tallied and inspected to ensure everything is packaged and ready.

6.2.2. Installation Manual

Description: The final product shall provide an installation manual that includes detailed instructions on how to install, set up, and use the system.

Approach: People outside the development team will be asked to follow the installation manual and give feedback on whether its instructions are explicit and useful for installing the system.

6.2.3. Range of the RFID Reader Integrated Antenna

Description: The RFID reader antenna range shall be able to cover the door length.

Approach: The RFID reader will be mounted on a door frame, and items will be passed through the door frame by different people as anticipated in typical system use. The Universal Reader Assistant software will be used to verify whether these items are being read when they pass through the reading field.

6.2.4. Software Components

Description: The final product shall include the following software components: source code. It shall be delivered on a USB flash drive.

Approach: A clean installation will be performed using the USB flash drive deliverable. A complete and successful installation will indicate fulfillment of the requirement and a positive test.

6.3. Performance Requirements

6.3.1. Check-in/Check-out Latency

Description: The amount of time that it takes for the system to recognize that an item has entered or left the storage room should not exceed 1 minute.

Approach: A timer will be used to check that this time constraint is met throughout the testing process of all inventory manipulation features.
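The timing check can be sketched as follows (Python; the scan and status-lookup hooks are hypothetical stand-ins for the real reader and database):

```python
import time

MAX_CHECKIN_LATENCY = 60.0  # seconds, per the 1-minute bound in this requirement

def measure_checkin_latency(scan, status_of, epc):
    """Time the gap between a tag scan and the database showing 'Checked In'."""
    start = time.monotonic()
    scan(epc)
    while status_of(epc) != "Checked In":
        time.sleep(0.01)  # poll until the status change becomes visible
    return time.monotonic() - start

# Hypothetical stand-ins: a scan hook that updates the status immediately.
db = {}
latency = measure_checkin_latency(
    lambda epc: db.__setitem__(epc, "Checked In"), db.get, "E2000001")
assert latency < MAX_CHECKIN_LATENCY
```

In the live test, scan corresponds to physically passing a tagged item through the door and status_of to querying the item's record, with the same 60-second assertion.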

6.3.2. Web Interface Response Latency

Description: The amount of time that it takes for the web application to return meaningful information to the user shall not exceed 10 seconds.

Approach: A timer will be used to check whether this time constraint on web page load time is met while testing all other features involving the web interface.

6.4. Safety Requirements

6.4.1. Electrical Hazard

Description: The system and its components present within the storage room shall not pose an electrical hazard to its users or the building it resides in.

Approach: The hardware components of the system will be inspected for potential electrical hazards.

6.5. Maintenance and Support Requirements

6.5.1. User Manual

Description: The final product shall come with a user manual describing in detail how to set up the system and use its various features.

Approach: People outside of the development team will be asked to use the system and to reference the user manual for any questions they may have. We will then ask for feedback on how well their questions were answered by the user manual.

6.5.2. Source Code Availability & Documentation

Description: The final product shall include all the source code and documentation used to design and implement the system. The source code will be well structured and commented so as to allow for future modularity and support.

Approach: The source code's documentation and public availability will be verified by testing whether everything is downloadable from the project's github.com page.

6.6. Other Requirements

6.6.1. Information Security

Description: The system shall ensure the privacy and security of personal information being stored and transmitted by users of the system through the use of secure connectivity and secure programming techniques.

Approach: We will verify that users' private information is transmitted only over secure connections and is not exposed to other users during use of the system.

Page 44: Department of Computer Science and Engineering The ...ranger.uta.edu/~odell/Senior_Design_Document_Library/Fall2014/STP... · Department of Computer Science and Engineering The University

System Test Plan OSS

April 09, 2015 44 Aegle

7. Test Strategy

The following section describes the strategy that the team will use in order to ensure that OSS and its functionality are properly tested according to the requirements gathered in the System Requirements Specification document.

7.1. Test Phases

7.1.1. Hardware Test

The hardware will be tested through integration testing, since output data needs to be sent to the Processing Layer to ensure the correctness and consistency of the RFID reader.

7.1.2. Unit Test

Each of the modules in the OSS architecture will be tested by writing unit tests and verifying that the correct behavior has been implemented. The team member in charge of developing each module will be in charge of writing the corresponding unit tests.

7.1.3. Component Tests

Each subsystem in the OSS architecture will be tested by writing component tests and verifying that the correct behavior has been implemented. The team member in charge of developing each component will be in charge of writing the corresponding component tests. The following components will be tested:

GUI
I/O Controller
RFID Reader
Hardware Controller
Request Handler
OSS Application
DB Controller
SQL DBMS

7.1.4. Integration Tests

Each subsystem in the OSS architecture will be tested by performing integration tests to verify that the correct behavior occurs on execution. The team member in charge of developing each component will be in charge of setting up the corresponding integration tests. Integration tests will ensure that the data flows through each subsystem and layer as designed in the OSS architectural diagram, reflecting the integrity of the system.


7.1.5. System Validation Test

After each module, subsystem, and layer has been properly tested, System Validation Tests will be performed to ensure that the system performs accurately as a whole, meeting the requirements described in the System Requirements Specification document.

7.2. Test Metrics

Priority | Description | Success Criteria | Failure Criteria
---------|-------------|------------------|-----------------
Critical | Features that are required for OSS to function according to the core requirements. | 100% | < 100%
High | Features that are important for OSS to function but are not critical. | >= 90% | < 90%
Moderate | Features that enhance OSS, but are not required for OSS to properly function. | >= 75% | < 75%
Low | Features that do not impact OSS and are considered for future releases. | 0% | N/A

Table 7-1 Test Metrics Table

7.3. Test Tools

In order to properly test OSS, team Aegle will make use of the following testing tools:

Visual Studio 2013 built-in testing framework

Jasmine

Fiddler


8. Item Pass/Fail Criteria

The following section describes each of the tests, giving the criteria by which a test is considered to pass or fail for every phase in the testing cycle.

8.1. Hardware Tests

Test ID | Pass Criteria | Fail Criteria
--------|---------------|--------------
HW1 | The application that is supposed to receive the data receives the data, and the data received is correct | The application that is supposed to receive the data does not, or the data it receives is incorrect
HW2 | The RFID reader indicates that it has read a tag when a tag is brought into its reading field | The RFID reader does not indicate that it has read a tag when a tag is brought into its reading field

Table 8-1 Hardware Pass/Fail Criteria

8.2. Unit Tests

Test ID | Pass Criteria | Fail Criteria
--------|---------------|--------------
UT1 | The web interface responds to all user actions as expected and displays data correctly | The web interface does not respond to all user actions as expected, or displays data incorrectly
UT2 | The output formatter receives the expected JSON object | The output formatter does not receive the expected JSON object
UT3 | An http request is created and sent | An http request is not created
UT4 | The format of the event object is correct | The format of the event object is incorrect
UT5 | Correct data is sent by the RFID reader | Incorrect data is sent by the RFID reader, or is not sent at all
UT6 | Correct data is sent by the RFID reader | Incorrect data is sent by the RFID reader, or is not sent at all
UT7 | The C# object created is correct | The C# object created is incorrect
UT8 | N/A | N/A
UT9 | The objects produced by the process are correct | The objects produced by the process are incorrect, or are not created
UT10 | The database request object created is correct | The database request object created is incorrect
UT11 | The object produced is correct | The object produced is incorrect, or is not created
UT12 | The response object produced is correct | The response object produced is incorrect, or is not created
UT13 | The database request object produced is correct | The database request object created is incorrect
UT14 | The database request object produced is correct | The database request object produced is incorrect, or is not created
UT15 | The response object received by the processing layer is correct | The response object received is incorrect, or the object is not received
UT16 | The SQL query produced is correctly formatted with the correct data | The SQL query produced is not correctly formatted, or is not created
UT17 | The SQL query produced is correctly formatted with the correct data | The SQL query produced is not correctly formatted, or is not created

Table 8-2 Unit Test Pass/Fail Criteria


8.3. Component Tests

8.3.1. Hardware Layer Tests

The hardware layer subsystems can only be tested through integration testing.

8.3.2. Presentation Layer Tests

Test ID | Pass Criteria | Fail Criteria
--------|---------------|--------------
CPG1 | The correct JavaScript object is created when the user interacts with its respective interface element | An incorrect JavaScript object is created when the user interacts with the user interface, or the interface is unresponsive
CPIO1 | An http request is sent and the correct strings are sent back to the GUI subsystem upon reception of the http response | An http response is not created, the message received by the GUI is incorrect, or no messages are sent or received

8.3.3. Processing Layer Tests

Test ID | Pass Criteria | Fail Criteria
--------|---------------|--------------
CPRH1 | The correct C# object is created | An incorrect C# object is created, or no object is created
CPHC1 | An http request is sent and the correct strings are sent back to the GUI subsystem upon reception of the http response | An http response is not created, the message received by the GUI is incorrect, or no messages are sent or received
CPOS1 | The response object that is created is correct, and the database request object created is correct | The response object that is created is incorrect, the database request object created is incorrect, or an object fails to be created

8.3.4. Data Storage Layer Tests

Test ID | Pass Criteria | Fail Criteria
--------|---------------|--------------
CPRH1 | The correct C# object is created | An incorrect C# object is created, or no object is created
CPHC1 | An http request is sent and the correct strings are sent back to the GUI subsystem upon reception of the http response | An http response is not created, the message received by the GUI is incorrect, or no messages are sent or received


8.4. Integration Tests

Test ID | Pass Criteria | Fail Criteria
--------|---------------|--------------
IP1 | HTTP information sent and received is correct | HTTP information sent and received is incorrect or nonexistent
IH1 | The data received by the processing layer is correct | The data received by the processing layer is incorrect or nonexistent
IPS1 | The correct objects are created and valid request objects are created when an EPC is received | The correct objects are not created or received
IDS1 | Output data received from the data storage layer is correct | Output data received from the data storage layer is incorrect or not received

8.5. System Validation Tests

Test ID | Pass Criteria | Fail Criteria
--------|---------------|--------------
V1 | The preapproved user is able to log into the system | The preapproved user is unable to log into the system
V2 | The administrator is able to see a list of approved users | The administrator is unable to see a list of approved users
V3 | Item details are present and properly displayed | Item details are left out or not properly displayed
V4 | The item/crate status is changed to "checked in" | The item/crate status is not changed
V5 | The item/crate status is changed to "checked out" | The item/crate status is not changed
V6 | The administrator is able to perform the expected tasks | The administrator is unable to perform one or more of the expected tasks
V7 | The user is able to perform the expected tasks | The user is unable to perform the expected tasks
V8 | The unknown EPC is added to the list of registerable items | The unknown EPC is not added to the list of registerable items


9. Test Deliverables

This section describes what will be delivered by the test plan: the system test plan, test cases, test results, and bug reports. The section also describes what each deliverable will contain.

9.1. System Test Plan

The System Test Plan will describe in general what will be tested, how it is going to be tested, the expected input and output of the test, and the criteria to determine if a test was a success or a failure.

9.2. Test Cases

Each test case will be recorded in a spreadsheet and delivered with the final documentation. A test case will consist of the following information:

Test ID: Unique ID to identify the test case

Test Item: The subsystem or module that will be tested

Purpose: Objective of the test

Test Overview: A brief description of what’s happening during the test

Input: The inputs that will be given to the test item

Expected Output: The result that should occur during testing

Priority: Level of importance

9.3. Test Results

Test results will be documented in a spreadsheet and delivered with the final documentation. A test report will consist of the following information:

Test ID: Unique ID to identify the test case

Test Date: The date that the test was performed

Tester: The person responsible for carrying out the test

Actual Output: The result given by the test

Success/Fail Result: The result of the testing, either success or fail

Comments: Additional remarks about the test


9.4. Bug Reports

Bug reports will be documented in a spreadsheet and delivered with the final documentation. Only bugs with critical or high priority will be fixed according to the team's schedule. A bug report will consist of the following information:

Bug ID: Unique ID to identify the bug report

Test ID: Unique ID to identify the test case

Test Date: The date that the test was performed

Tester: The person responsible for carrying out the test

Debugger: The person responsible for fixing the bug

Debug Date: The date that the bug was fixed

Bug Description: A brief description of the issue

Priority: Level of importance

Comments: Additional remarks about the bug


10. Test Schedule

WBS | Task Name | Planned Start | Planned Finish
----|-----------|---------------|---------------
3.5.1.1 | Unit Testing | 3/22/2015 | 3/28/2015
3.5.1.1.1 | UI Display | 3/22/2015 | 3/28/2015
3.5.1.1.2 | Event Handler | 3/22/2015 | 3/28/2015
3.5.1.1.3 | I/O Module | 3/22/2015 | 3/28/2015
3.5.1.1.4 | Output Data Formatter | 3/22/2015 | 3/28/2015
3.5.1.1.5 | Request Module | 3/22/2015 | 3/28/2015
3.5.1.1.6 | Input Handler | 3/22/2015 | 3/28/2015
3.5.1.1.7 | User Management | 3/22/2015 | 3/28/2015
3.5.1.1.8 | Inventory Management | 3/22/2015 | 3/28/2015
3.5.1.1.9 | DB Request Handler | 3/22/2015 | 3/28/2015
3.5.1.1.10 | EPC Handler | 3/22/2015 | 3/28/2015
3.5.1.1.11 | Output Handler | 3/22/2015 | 3/28/2015
3.5.1.1.12 | Input Controller | 3/22/2015 | 3/28/2015
3.5.1.1.13 | I/O Formatter | 3/22/2015 | 3/28/2015
3.5.1.1.14 | Database Adapter | 3/22/2015 | 3/28/2015
3.5.1.1.15 | SQL Query Generator | 3/22/2015 | 3/28/2015
3.5.1.1.16 | Query Executor | 3/22/2015 | 3/28/2015
3.5.1.1.17 | SQL DBMS Module | 3/22/2015 | 3/28/2015
3.5.1.2 | Component Testing | 3/29/2015 | 4/11/2015
3.5.1.2.1 | GUI | 3/29/2015 | 4/11/2015
3.5.1.2.2 | I/O Controller | 3/29/2015 | 4/11/2015
3.5.1.2.3 | RFID Reader | 3/29/2015 | 4/11/2015
3.5.1.2.4 | Request Handler | 3/29/2015 | 4/11/2015
3.5.1.2.5 | Hardware Controller | 3/29/2015 | 4/11/2015
3.5.1.2.6 | OSS Application | 3/29/2015 | 4/11/2015
3.5.1.2.7 | Database Controller | 3/29/2015 | 4/11/2015
3.5.1.2.8 | SQL DBMS | 3/29/2015 | 4/11/2015
3.5.1.3 | Integration Testing | 4/12/2015 | 4/25/2015
3.5.1.3.1 | Hardware Layer | 4/12/2015 | 4/25/2015
3.5.1.3.2 | Presentation Layer | 4/12/2015 | 4/25/2015
3.5.1.3.3 | Processing Layer | 4/12/2015 | 4/25/2015
3.5.1.3.4 | Data Storage Layer | 4/12/2015 | 4/25/2015

Table 10-1 Test Schedule Table