Test Planning Test Design Test Analysis Test Design Techniques Static Techniques Dynamic Techniques...
Test Design Overview
Test Planning
Test Design
Test Analysis
Test Design Techniques
Static Techniques
Dynamic Techniques
Choosing A Test Design Technique
Test Design Specification Structure
Test Design Specification Examples
Homework
Test Planning
Test Planning – the process of defining and documenting the strategy that will be used to verify and ensure that a product or system meets its design specifications and other requirements.
The Test Plan document should be created by QC management (QC Analyst/QC Lead/QC Manager) and answer the following questions:
◦ How will the testing be done?
◦ Who will do it?
◦ What will be tested?
◦ How long will it take?
◦ What will the test coverage be, i.e. what quality level is required?
Test Plan document formats can be as varied as the products and organizations to which they apply, but there are three major elements that should be described in each Test Plan:
◦ Test Coverage
◦ Test Methods
◦ Test Responsibilities
IEEE 829 – Standard for Software Test Documentation
According to IEEE 829, a Test Plan consists of:
◦ Test plan identifier
◦ Introduction
◦ Test items
◦ Features to be tested
◦ Features not to be tested
◦ Approach
◦ Item pass/fail criteria
◦ Suspension criteria and resumption requirements
◦ Test deliverables
◦ Testing tasks
◦ Environmental needs
◦ Responsibilities
◦ Staffing and training needs
◦ Schedule
◦ Risks and contingencies
◦ Approvals
Test Plan according to IEEE 829 standard
Example of Test Plan
Test Design
Test Design Phase – in software engineering, the test design phase is the process of reviewing and analyzing the test basis, selecting test design techniques, and creating test cases, checklists and scenarios for testing software.
Test Design Specification
◦ It is a document that describes the features to be tested and specifies the list of all test scenarios or test cases that should be designed to provide testing of the software.
◦ The test design does not record the values to be entered for a test, but describes the requirements for defining those values.
Test design may require any or all of the following:
◦ Knowledge of the software and the business area it operates in
◦ Knowledge of the functionality being tested
◦ Knowledge of testing techniques and heuristics
◦ Planning skills to schedule the order in which test cases should be designed, given the effort, time and cost needed, or the consequences for the most important and/or risky features
The test design process: Review and Analyze Test Basis → Select Test Design Techniques → Create Test Design Specification → Create Test Case Specification. Inputs: Test Plan, SRS, mock-ups. Outputs: Test Design Specification, …, Test Case Specification.
Test Analysis
Test Analysis is the process of examining something that can be used to derive test information. This basis for the tests is called the 'test basis'.
Test analysis has the following major tasks, in approximately the following order:
◦ Review Test Basis
◦ Define Test Conditions
◦ Evaluate testability of the requirements and system
◦ Define test environment
Test Basis – all documents from which the requirements of a component or system can be inferred (the documentation on which the test cases are based).
Test Condition – an item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute or structural element.
Traceability – the ability to identify related items in documentation and software, such as requirements with associated tests. There are:
◦ Horizontal traceability
◦ Vertical traceability
Test Design Techniques
Test Design Techniques are used to derive and/or select test cases
Why are they important?
Two main categories of Test Design Techniques:
Static: The fundamental objective of static testing is to improve the quality of software work products by helping engineers recognize and fix their own defects early in the software development process.
Dynamic: Testing that involves the execution of the software of a component or system.
Static Techniques
◦ Reviews: Informal Reviews, Walkthroughs, Technical Reviews, Inspections
◦ Static Analysis: Control Flow, Data Flow
Informal Review – a review not based on a formal (documented) procedure.
Walkthrough – a step-by-step presentation by the author of a document in order to gather information and to establish a common understanding of its content.
Technical Review – a peer group discussion activity that focuses on achieving consensus on the technical approach to be taken.
Inspection – a type of peer review that relies on visual examination of documents to detect defects. The most formal review technique and therefore always based on a documented procedure.
Control flow analysis – a form of static analysis based on a representation of unique paths (sequences of events) in the execution through a component or system. Control flow analysis evaluates the integrity of control flow structures, looking for possible control flow anomalies such as closed loops or logically unreachable process steps.
Data Flow Analysis – a form of static analysis based on the definition and usage of variables.
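As an illustration of what data flow analysis looks for, here is a minimal sketch (the functions are invented for this example, not from the slides) showing two classic anomalies: a variable that is defined but never used, and a variable that is used before it is ever defined.

```python
def compute_total(prices):
    total = 0          # definition of 'total'
    discount = 0.1     # anomaly: 'discount' is defined but never used
    for p in prices:
        total += p     # use of 'total' after definition: OK
    return total

def buggy_total(prices):
    for p in prices:
        result += p    # anomaly: 'result' is used before it is defined
    return result      # static data flow analysis flags this; at runtime
                       # Python raises UnboundLocalError
```

A static analyzer reports both anomalies without running the code; dynamic testing would only reveal the second one when `buggy_total` is actually executed.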
Dynamic Techniques
◦ Specification-Based (black box): Equivalence Partitioning, Boundary Value Analysis, Decision Tables, State Transition, Use Case Testing
◦ Structure-Based (white box): Statement, Decision, Condition, Multiple Condition
◦ Experience-Based: Error Guessing, Exploratory Testing
Specification-Based: testing, either functional or non-functional, without reference to the internal structure of the component or system.
Equivalence Partitioning
Equivalence partitioning (EP) – A black box test design technique in which test cases are designed to execute representatives from equivalence partitions. In principle test cases are designed to cover each partition at least once.
Idea: Divide (i.e. to partition) a set of test conditions into groups or sets that can be considered the same (i.e. the system should handle them equivalently), hence equivalence partitioning.
Example: A bank introduces a new deposit program for corporate clients. Under the program, the client receives a different interest rate depending on the amount deposited. The minimum that can be deposited is $1, the maximum is $999. If the client deposits less than $500, they receive 5% interest. If the amount deposited is $500 or higher, the client receives 10% more, i.e. 15% interest.
Invalid: $0 and below | Valid for 5% interest: $1–$499 | Valid for 15% interest: $500–$999 | Invalid: $1000 and above
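The partitions above can be sketched in code: one representative value is tested per partition (the `interest_rate` function below is an illustrative implementation of the stated rules, not from the slides).

```python
def interest_rate(amount):
    """Return the interest rate for a deposit, per the example rules."""
    if amount < 1 or amount > 999:
        raise ValueError("deposit must be between $1 and $999")
    return 5 if amount < 500 else 15

# Equivalence partitioning: one test value from the middle of each partition
assert interest_rate(250) == 5      # valid partition: $1-$499
assert interest_rate(700) == 15     # valid partition: $500-$999
for invalid in (0, 1000):           # one representative per invalid partition
    try:
        interest_rate(invalid)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Four test cases cover all four partitions; any other value from the same partition would, in principle, find the same defects.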
Boundary Values Analysis
Boundary value: An input value or output value which is on the edge of an equivalence partition or at the smallest incremental distance on either side of an edge, for example the minimum or maximum value of a range.
Boundary value analysis (BVA): A black box test design technique in which test cases are designed based on boundary values.
Idea: Divide test conditions into sets and test the boundaries between these sets.
Example: the same deposit program. The minimum that can be deposited is $1, the maximum is $999; deposits under $500 earn 5% interest, deposits of $500–$999 earn 15%.
Invalid: $0 | Valid for 5% interest: $1–$499 | Valid for 15% interest: $500–$999 | Invalid: $1000
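In code, BVA tests sit exactly at the partition edges and just outside them (the `interest_rate` function below is an illustrative implementation of the stated rules, not from the slides).

```python
def interest_rate(amount):
    if amount < 1 or amount > 999:
        raise ValueError("deposit must be between $1 and $999")
    return 5 if amount < 500 else 15

# Boundary value analysis: test exactly at each edge...
assert interest_rate(1) == 5      # lower boundary of the 5% partition
assert interest_rate(499) == 5    # upper boundary of the 5% partition
assert interest_rate(500) == 15   # lower boundary of the 15% partition
assert interest_rate(999) == 15   # upper boundary of the 15% partition
# ...and just outside the valid range
for outside in (0, 1000):
    try:
        interest_rate(outside)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Off-by-one defects (e.g. writing `amount <= 500` instead of `amount < 500`) are exactly what these boundary tests catch and mid-partition values miss.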
Decision tables
Decision table – A table showing combinations of inputs and/or stimuli (causes) with their associated outputs and/or actions (effects), which can be used to design test cases.
Example: If you hold an 'over 60s' rail card, you get a 34% discount on whatever ticket you buy. If you hold a family rail card and you are traveling with a child (under 16), you can get a 50% discount on any ticket. If you are traveling with a child (under 16) but do not have a family rail card, you can get a 10% discount. You can use only one type of rail card.
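The decision table can be sketched as a function with one test case per rule. Note one assumption not stated in the example: when several rules apply, the customer gets the largest discount.

```python
def discount(over60_card, family_card, with_child):
    """Return the discount (%) per the rail card rules above.
    Assumption: if several rules apply, the largest discount wins."""
    candidates = [0]
    if over60_card:
        candidates.append(34)
    if family_card and with_child:
        candidates.append(50)
    if with_child and not family_card:
        candidates.append(10)
    return max(candidates)

# One test case per decision-table rule
# (conditions: over-60s card, family card, child under 16)
assert discount(True,  False, False) == 34   # over-60s card only
assert discount(False, True,  True)  == 50   # family card + child
assert discount(False, False, True)  == 10   # child, no family card
assert discount(False, False, False) == 0    # no card, no child
```

The "only one type of rail card" rule makes the combination of both cards an invalid input, so no test case is derived for it.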
State transition
State transition – A transition between two states of a component or system
State transition testing – A black box test design technique in which test cases are designed to execute valid and invalid state transitions
Example: The diagram below shows an example of entering a Personal Identity Number (PIN) to a bank account. The states are shown as circles, the transitions as lines with arrows and the events as the text near the transitions.
[Diagram] From Start, the event 'Card inserted' leads to the 'Wait for PIN' state. On entering a PIN, a 'PIN OK' transition (on the 1st, 2nd or 3rd try) leads to 'Access to account'; a 'PIN NOT OK' transition returns to 'Wait for PIN' for another try, and a 'PIN NOT OK' on the 3rd try leads to the 'Eat card' state.
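The PIN diagram above can be sketched as a small state machine; the tests then exercise valid transitions (correct PIN, three failed tries) and an invalid one (entering a PIN before a card is inserted). The class is illustrative, with state names taken from the diagram.

```python
class PinStateMachine:
    def __init__(self):
        self.state = "start"
        self.tries = 0

    def card_inserted(self):
        if self.state == "start":
            self.state = "wait_for_pin"

    def enter_pin(self, pin_ok):
        if self.state != "wait_for_pin":
            return                        # invalid transition: ignored
        if pin_ok:
            self.state = "access_to_account"
        else:
            self.tries += 1
            if self.tries == 3:
                self.state = "eat_card"   # 3rd failed try swallows the card

# Valid transition: correct PIN on the first try
m = PinStateMachine()
m.card_inserted()
m.enter_pin(True)
assert m.state == "access_to_account"

# Valid transitions: three wrong PINs end in 'eat_card'
m = PinStateMachine()
m.card_inserted()
for _ in range(3):
    m.enter_pin(False)
assert m.state == "eat_card"

# Invalid transition: entering a PIN before inserting a card has no effect
m = PinStateMachine()
m.enter_pin(True)
assert m.state == "start"
```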
Use Case testing
Use Case Testing – a technique that helps us identify test cases that exercise the whole system on a transaction-by-transaction basis from start to finish.
◦ Use cases describe the process flows through a system based on its most likely use
◦ This makes the test cases derived from use cases particularly good for finding defects in the real-world use of the system
◦ Each use case usually has a mainstream (or most likely) scenario and sometimes additional alternative branches (covering, for example, special cases or exceptional conditions).
◦ Each use case must specify any preconditions that need to be met for the use case to work.
◦ Use cases must also specify post conditions that are observable results and a description of the final state of the system after the use case has been executed successfully.
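The bullets above can be illustrated with a toy 'log in' use case (everything here is invented for illustration): a precondition (the account exists), the mainstream scenario (correct password), and alternative branches.

```python
accounts = {"alice": "s3cret"}   # precondition: the account exists

def log_in(user, password):
    """Main flow returns True; alternative branches return False."""
    return accounts.get(user) == password

# Mainstream (most likely) scenario, start to finish;
# postcondition: the user is logged in
assert log_in("alice", "s3cret") is True

# Alternative branch: wrong password
assert log_in("alice", "wrong") is False

# Alternative branch: unknown user
assert log_in("bob", "s3cret") is False
```

One test case is derived per scenario: the main flow plus one per alternative branch.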
Structure-Based Techniques
Procedure to derive and/or select test cases based on an analysis of the internal structure of a component or system.
Structure-Based Techniques
Types of structure-based techniques:
◦ Statement – testing aimed at exercising programming statements. If we aim to test every executable statement, we call this full or 100% statement coverage.
◦ Decision – a white box test design technique in which test cases are designed to execute decision outcomes.
◦ Condition – a white box test design technique in which test cases are designed to execute condition outcomes, i.e. the evaluation of a condition to True or False.
◦ Multiple Condition – a white box test design technique in which test cases are designed to execute combinations of single condition outcomes (within one statement).
Statement Testing
Statement – an entity in a programming language, which is typically the smallest indivisible unit of execution.
Example:
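As an illustrative sketch (the function below is invented, not from the slides): one well-chosen input can execute every statement, giving 100% statement coverage with a single test.

```python
def describe(x):
    result = "number "          # statement 1
    if x > 0:
        result += "positive"    # statement 2: runs only when x > 0
    return result               # statement 3

# x = 5 executes statements 1, 2 and 3: 100% statement coverage with one test
assert describe(5) == "number positive"
```

Note the weakness: this single test never exercises the `x <= 0` path, so statement coverage alone can miss defects in the untaken branch.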
Decision Testing
A decision is an IF statement, a loop control statement (e.g. DO-WHILE or REPEAT-UNTIL), or a CASE statement, where there are two or more possible exits or outcomes from the statement.
Example:
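As an illustrative sketch (the function below is invented, not from the slides): decision coverage requires tests that make each decision evaluate both True and False.

```python
def classify(x):
    if x >= 0:                  # the decision under test
        return "non-negative"   # True outcome
    return "negative"           # False outcome

# Two tests, one per decision outcome, give 100% decision coverage
assert classify(3) == "non-negative"   # decision evaluates True
assert classify(-3) == "negative"      # decision evaluates False
```

Compare with statement testing: a single test (`x = 3`) would cover the True branch's statements but leave the False outcome of the decision untested.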
Experience-Based Techniques
Procedure to derive and/or select test cases based on the tester’s experience, knowledge and intuition.
Experience-Based Techniques
Experience-based techniques:
◦ Error Guessing – a technique that should always be used as a complement to other, more formal techniques. Its success depends heavily on the skill of the tester, as good testers know where defects are most likely to lurk.
◦ Exploratory Testing – a hands-on approach in which testers do minimal planning and maximum test execution.
Choosing A Test Design Technique
Which technique is best? This is the wrong question!
Each technique is good for certain things, and not as good for other things. Some techniques are more applicable to certain situations and test levels, others are applicable to all test levels.
The internal factors that influence the decision about which technique to use are:
◦ Tester knowledge and experience
◦ Expected defects
◦ Test objectives
◦ Documentation
◦ Life cycle model
The external factors that influence the decision about which technique to use are:
◦ Risks
◦ Customer and contractual requirements
◦ System type
◦ Regulatory requirements
◦ Time and budget
Coffee Break
Test Design Specification Structure
According to the IEEE 829 standard, the template structure looks as follows:
1. Test Design Specification Identifier
 1.1 Purpose
 1.2 References
 1.3 Definitions, acronyms and abbreviations
2. Features to be Tested
3. Approach Refinements
4. Test Identification
 4.1 <Test Item 1>
 4.2 <Test Item …>
 4.3 <Test Item N>
5. Feature Pass/Fail Criteria
Test Design Specification Structure
The Test Design Specification Identifier section covers:
◦ Purpose of the document
◦ Scope of the document
◦ List of references, which should include references to the test plan, functional specification, test case specification, etc.
◦ Definitions, acronyms and abbreviations used in the Test Design Specification
Features to be Tested identifies test items and describes the features and combinations of features that are the object of this design specification. A reference to the Functional Specification for each feature or combination of features should be included.
The Approach Refinements section describes the following:
◦ Specific test techniques to be used for testing features or combinations of features
◦ Types of testing which will be provided
◦ Methods of analyzing test results
◦ Test results reporting
◦ Whether automation of test cases will be provided or not
◦ Any other information which describes the approach to testing
Test Design Specification Structure
Feature Pass/Fail Criteria specifies the criteria to be used to determine whether the feature or feature combination has passed or failed
The following items can be considered as “pass / fail criteria”:
◦ Feature works according to stated requirements
◦ Feature works correctly on the test platforms
◦ Feature works correctly with other modules of application
◦ All issues with High and Medium Priority will be verified and closed
Test Design Specification Structure
The Test Identification section is separated into sub-sections according to the number of test items, identifying the future documentation which will be created for testing the features or combinations of features that are the object of this design specification.
Features can be covered by test objectives in different ways depending on project needs, approaches to testing, etc.
Let’s consider three examples of such coverage:
◦ Feature covered by test cases
◦ Feature covered by test scenarios
◦ Feature covered by a checklist
Example of functional checklist
Example of coverage by scenarios
Example of coverage by test cases
Real Example: User Registration page
Business Value: I, as an Administrator user, should be able to create a simple user account to log in application.
Functional Requirements: ‘User Registration’ page should contain three fields ‘User Name’, ‘Password’, ‘Confirm Password’ and two buttons – ‘Save’ and ‘Cancel’.
Mock up:
◦ The ‘User Name’ field is limited to 10 symbols and should contain letters of the Latin alphabet only. The ‘User Name’ field is empty by default. The user name should be unique in the system.
◦ The ‘Password’ field should be no less than 4 symbols long and should include only numbers and letters of the Latin alphabet. The ‘Password’ field is empty by default.
◦ The ‘Confirm Password’ field should be equal to ‘Password’. The ‘Confirm Password’ field is empty by default.
◦ The ‘Cancel’ button cancels account creation and closes the ‘User Registration’ page.
◦ The ‘Save’ button validates the data entered into the fields on the ‘User Registration’ page and creates the user account if the entered data are correct, or shows error dialogs if validation fails. Validation should be provided in the following order: User Name, Password, Confirm Password.
Real Example: Error messages
Real Example: Test Item “User Registration”
Requirement: ‘Save’ button functionality
◦ Creating new user account and save – verifies that a user account can be created if all fields on the ‘User Registration’ page are filled with correct data, and that the ‘User Registration’ page is closed on the save action.
Requirement: ‘Cancel’ button functionality
◦ Creating new user account and cancel – verifies that a user account is not created after filling in the fields on the ‘User Registration’ page and canceling, and that the ‘User Registration’ page is closed on the cancel action.
Requirement: Default values
◦ Default values on the ‘User Registration’ page – verifies that all fields on the ‘User Registration’ page are blank by default.
Requirement: ‘User Name’ field validation
◦ Error dialog on saving user account with too long user name – verifies that an error dialog appears on save if the user name is too long: 1) boundary length – 11 characters; 2) restricted length – more than 11 characters.
◦ Error dialog on saving user account with blank ‘User Name’ field – verifies that an error dialog appears on save if the ‘User Name’ field is blank.
◦ Verify boundary length for user name – verifies that a user account with a user name of boundary length 1 or 10 can be created.
◦ Error dialog on saving user account with wrong user name – verifies that an error dialog appears on save if the ‘User Name’ field includes: 1) special symbols; 2) numbers; 3) both.
◦ Error dialog on saving already existing user account – verifies that an error dialog appears on save if the user already exists in the system.
Requirement: ‘Password’ field validation
◦ Error dialog on saving user account with too short password – verifies that an error dialog appears on save if the password is too short: 1) boundary length – 3 characters; 2) restricted length – less than 3 characters.
◦ Error dialog on saving user account with blank ‘Password’ field – verifies that an error dialog appears on save if the password is blank.
◦ Verify boundary length for password – verifies that a user account with a password of boundary length 4 can be created.
◦ Error dialog on saving user account with incorrect password – verifies that an error dialog appears on save if the ‘Password’ field includes special symbols.
Requirement: ‘Confirm Password’ field validation
◦ Error dialog on saving user account with unequal password and confirm password – verifies that an error dialog appears on save if: 1) the ‘Confirm Password’ field is blank; 2) the password and confirm password do not match.
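The validation rules these tests exercise can be sketched as code. This is an illustrative implementation of the stated requirements (the function name and error strings are invented), validating in the required order: User Name, Password, Confirm Password.

```python
import re

existing_users = {"admin"}   # assumed pre-existing account, for illustration

def validate_registration(user_name, password, confirm_password):
    """Return None if the data is valid, else the first error found."""
    # 'User Name': 1-10 Latin letters only, unique in the system
    if not re.fullmatch(r"[A-Za-z]{1,10}", user_name):
        return "invalid user name"
    if user_name in existing_users:
        return "user already exists"
    # 'Password': at least 4 symbols, Latin letters and digits only
    if not re.fullmatch(r"[A-Za-z0-9]{4,}", password):
        return "invalid password"
    # 'Confirm Password': must equal 'Password'
    if confirm_password != password:
        return "passwords do not match"
    return None

assert validate_registration("alice", "pass1234", "pass1234") is None
assert validate_registration("a" * 11, "pass1234", "pass1234") == "invalid user name"
assert validate_registration("admin", "pass1234", "pass1234") == "user already exists"
assert validate_registration("alice", "abc", "abc") == "invalid password"
assert validate_registration("alice", "pass1234", "other123") == "passwords do not match"
```

Returning only the first error mirrors the requirement that validation proceeds in a fixed order and stops at the first failing field.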
Test Design Specification Examples
Example of Test Design with Test Cases
Example of Test Design with Scenarios
Test Design and Techniques Homework
Create a Test Design Specification based on the Software Requirements Specification
Practice using Test Design Techniques and design test objectives using Dynamic Test Design Techniques
SRS for Homework
Test Design Homework variant 1
Test Design Homework variant 2
Test Design Techniques Homework variant 1
Test Design Techniques Homework variant 2
Test Design Techniques Homework variant 3
Test Design Techniques Homework variant 4