COTS Testing

Transcript of COTS Testing

Page 1: COTS Testing

COTS Testing

Page 2: COTS Testing

Differences from in-house components

Interfaces (pre- and post-conditions) are not clearly specified.

No architecture or code available: components are black boxes to the component user.

Why use COTS?

Page 3: COTS Testing

Why COTS testing?

Failure of Ariane 5:

• The explosion resulted from insufficiently tested software reused from the Ariane 4 launcher.

Page 4: COTS Testing

COTS Evaluation and selection

Page 5: COTS Testing

Why rigorous evaluation of COTS?

Large number of alternative products. Multiple stakeholders. Large number of quality criteria. Compatibility with other products.

Page 6: COTS Testing

Why is evaluation difficult?

Large number of evaluation criteria. Different opinions are usually encountered among different stakeholders. Evaluation criteria are not easily measurable at evaluation time. Gathering relevant information is prohibitively expensive. The COTS market is changing fast, so evaluation must be performed several times during the lifecycle. Evaluation deals with uncertain information.

Page 7: COTS Testing

AHP Technique

Originally designed for the economic and political science domains.

Requires a pairwise comparison of alternatives and pairwise weighting of selection criteria.

Enables consistency analysis of comparisons and weights, making it possible to assess the quality of the gathered information.

Page 8: COTS Testing

AHP Technique (contd.)

Allows alternatives to be measured on a ratio scale, so we can determine how much better one alternative is compared to another.

Practically usable only if the numbers of alternatives and criteria are sufficiently low, because the comparisons are made by experts.
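The pairwise comparison, weighting, and consistency ideas above can be sketched briefly. This is a minimal illustration, not the full AHP procedure: it uses the common geometric-mean approximation for the priority vector, and the judgment matrix for three hypothetical COTS candidates is invented for the example.

```python
from math import prod

def ahp_priorities(matrix):
    """Approximate AHP priority weights via the geometric mean of each
    row of the pairwise comparison matrix, normalized to sum to 1."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_index(matrix, weights):
    """CI = (lambda_max - n) / (n - 1); lambda_max estimated from A.w."""
    n = len(matrix)
    aw = [sum(a * w for a, w in zip(row, weights)) for row in matrix]
    lam = sum(x / w for x, w in zip(aw, weights)) / n
    return (lam - n) / (n - 1)

# Hypothetical judgments for three COTS candidates on one criterion:
# A is 3x better than B and 5x better than C; B is 2x better than C.
A = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w = ahp_priorities(A)         # ratio-scale weights, summing to 1
ci = consistency_index(A, w)  # near 0 for consistent judgments
```

A small CI (conventionally, a consistency ratio below 0.1) indicates that the expert judgments are trustworthy; a large CI means the gathered information is of poor quality and the comparisons should be revisited.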

Page 9: COTS Testing

Selection in practice

Selection follows three stages:

1. Informal screening for a set of requirements using selection thresholds.

2. More systematic evaluation using the AHP process.

3. Detailed information gathering, involving testing, prototyping, and reading technical documents.

Page 10: COTS Testing

State of the art in COTS testing

Page 11: COTS Testing

How to provide information to user

Component meta-data approach. Retro-components approach. Component test bench approach. Built-in test approach. Component+ approach. STECC strategy.

Page 12: COTS Testing

Component meta-data approach

[Diagram: the component packages its binary code together with metadata (call graphs, testing information); producing the metadata is done by the provider.]

Page 13: COTS Testing

Component metadata (contd.)

[Diagram: the component user sends metadata requests; the component answers them from a server-side MetaDB and exposes its functionality alongside the metadata.]

Page 14: COTS Testing

Retro-components approach

[Diagram: as in the metadata approach, but the user sends metadata requests together with test data; the component returns the resulting metadata from its server-side MetaDB alongside its functionality.]

Page 15: COTS Testing

Component test bench approach

A set of test cases, called a test operation, is associated with each interface of a component.

A test operation defines the necessary steps for testing a specific method.

The concrete test inputs and expected test outputs are packaged in the test operation.
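A minimal sketch of this idea, under stated assumptions: the `TestOperation` representation, the `run_test_bench` driver, and the `Calculator` component are all hypothetical names invented for the example.

```python
from dataclasses import dataclass

@dataclass
class TestOperation:
    """A test operation: concrete inputs and expected outputs for one method."""
    method: str
    cases: list  # (args, expected) pairs

def run_test_bench(component, operations):
    """Execute every test operation against the component's interface and
    collect the failing (method, args) pairs."""
    failures = []
    for op in operations:
        fn = getattr(component, op.method)
        for args, expected in op.cases:
            if fn(*args) != expected:
                failures.append((op.method, args))
    return failures

# Hypothetical component under test
class Calculator:
    def add(self, a, b):
        return a + b

bench = [TestOperation("add", [((1, 2), 3), ((0, 0), 0)])]
failures = run_test_bench(Calculator(), bench)  # → []
```

The point of the approach is that the bench (operations plus expected outputs) ships with the component, so the component user can re-run it without knowing the internals.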

Page 16: COTS Testing

Built-in test approach

[Diagram: the component bundles its functionality with a built-in test case generator and tester.]

Page 17: COTS Testing

Built-in test approach(contd.)

[Diagram: in normal mode the component exposes only its functionality; in maintenance mode it additionally exposes the built-in test case generator and tester.]

Page 18: COTS Testing

Built-in test approach(contd.)

[Diagram: built-in test capabilities are passed from a base component to a derived component through inheritance.]
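The normal/maintenance split can be sketched as follows. The `divide` functionality, the mode flag, and the test values are hypothetical; the shape of the class (functionality plus built-in generator and tester, gated by mode) is the point.

```python
class BuiltInTestComponent:
    """Sketch of a built-in test (BIT) component: normal mode exposes only
    the functionality; maintenance mode also allows running the built-in
    test case generator and tester."""

    def __init__(self, mode="normal"):
        self.mode = mode

    # --- functionality (available in both modes) ---
    def divide(self, a, b):
        return a / b

    # --- built-in test case generator ---
    def generate_test_cases(self):
        return [((10, 2), 5.0), ((9, 3), 3.0)]

    # --- built-in tester ---
    def self_test(self):
        if self.mode != "maintenance":
            raise RuntimeError("built-in tests run only in maintenance mode")
        return all(self.divide(*args) == expected
                   for args, expected in self.generate_test_cases())

ok = BuiltInTestComponent(mode="maintenance").self_test()  # → True
```

Because the tests live inside the class, a derived component inherits them along with the functionality, which is what the inheritance diagram above depicts.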

Page 19: COTS Testing

Component+ approach

[Diagram: a built-in-testing-enabled component exposes its functionality plus a handler interface; an external tester with a test case generator and test executor drives it, and failure recovery mechanisms react to detected failures.]

Page 20: COTS Testing

Disadvantages of BIT and component+

Static nature: they generally do not ensure that tests are conducted as required by the component user.

The component provider makes assumptions concerning the requirements of the component user, which might be wrong or inaccurate.

Page 21: COTS Testing

STECC strategy

[Diagram: a STECC component answers metadata requests by querying a server-side MetaDB and generates tests on demand with a built-in test generator, alongside its normal functionality.]

Page 22: COTS Testing

Levels of Testing

Unit Testing. Integration Testing. System Testing

Page 23: COTS Testing

Types of testing

Functionality Testing. Reliability Testing. Robustness Testing. Performance Testing. Load Testing. Stress Testing. Stability Testing. Security Testing.

Page 24: COTS Testing

Certifying COTS

When considering a candidate component, developers need to ask three key questions: Does component C fill the developer's needs? Is the quality of component C high enough? What impact will component C have on system S?

Page 25: COTS Testing

Certifying COTS(contd.)

Page 26: COTS Testing

CERTIFICATION TECHNIQUES

Black-box component testing. System-level fault injection. Operational system testing. Software Wrapping. Interface propagation Analysis.

Page 27: COTS Testing

Black-box Testing

To understand the behavior of a component, various inputs are executed and the outputs are analyzed.

To catch all types of errors, all possible combinations of input values would have to be executed.

To make testing feasible, test cases are selected randomly from the test case space.

Page 28: COTS Testing

Black box test reduction using Input-output Analysis

Random testing is not complete. To perform complete functional testing, the number of test cases can be reduced by input-output analysis.

Page 29: COTS Testing
Page 30: COTS Testing
Page 31: COTS Testing

How to find I/O relationships: by static analysis or execution analysis of the program.
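The reduction works because each output typically depends on only a few inputs, so complete coverage per output needs far fewer combinations than the full cross product. A sketch under assumed I/O relationships (the outputs, inputs, and domains here are invented for illustration):

```python
from itertools import product

# Hypothetical I/O relationships recovered by static or execution analysis:
# each output depends only on a subset of the inputs.
io_relations = {"out1": ["a", "b"], "out2": ["c"]}
domains = {"a": [0, 1], "b": [0, 1], "c": [0, 1, 2]}

def exhaustive_count(domains):
    """Size of the full cross product over all inputs."""
    n = 1
    for values in domains.values():
        n *= len(values)
    return n

def reduced_cases(io_relations, domains):
    """One suite per output, combining only the inputs it depends on."""
    suites = {}
    for out, inputs in io_relations.items():
        suites[out] = [dict(zip(inputs, combo))
                       for combo in product(*(domains[i] for i in inputs))]
    return suites

suites = reduced_cases(io_relations, domains)
total = sum(len(s) for s in suites.values())  # 4 + 3 = 7 cases
full = exhaustive_count(domains)              # 2 * 2 * 3 = 12 cases
```

Even in this tiny example the suite shrinks from 12 combinations to 7; the gap widens rapidly as the number of independent inputs grows.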

Page 32: COTS Testing

Fault Injection

[Diagram: a fault simulation tool sits between the requester and the component, injecting erroneous or malicious input into requests and observing exceptions or missing responses.]

Page 33: COTS Testing

Operational System Testing

Complements system-level fault injection. The system is operated with random inputs (valid and invalid). Provides a more accurate assessment of COTS quality. Ensures that a component is a good match for the system.

Page 34: COTS Testing

Software Wrapping

[Diagram: an input wrapper screens the component's inputs and an output wrapper screens its outputs.]
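A minimal sketch of software wrapping: a generic wrapper screens arguments before the COTS call and results after it. The `wrap` helper and the predicates are hypothetical; `math.isqrt` merely stands in for an untrusted COTS routine.

```python
import math

def wrap(component_fn, check_input, check_output):
    """Software wrapping: an input wrapper screens arguments before the
    COTS call, an output wrapper screens results after it."""
    def wrapped(*args):
        if not check_input(*args):
            raise ValueError("input rejected by wrapper")
        result = component_fn(*args)
        if not check_output(result):
            raise ValueError("output rejected by wrapper")
        return result
    return wrapped

# Hypothetical COTS routine: integer square root, valid only for n >= 0
safe_isqrt = wrap(math.isqrt,
                  check_input=lambda n: isinstance(n, int) and n >= 0,
                  check_output=lambda r: r >= 0)
```

Here `safe_isqrt(16)` returns 4, while `safe_isqrt(-1)` is rejected before the component ever sees the bad input, without modifying the black-box component itself.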

Page 35: COTS Testing
Page 36: COTS Testing

Instrumentation configuration file

Page 37: COTS Testing

Interface propagation Analysis

[Diagram: a fault injector sits between COTS component 1 and COTS component 2, and can (a) modify the input and call the correct method, (b) call the correct method and modify the output, or (c) call a perturbed function.]
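One of the three strategies, modifying a component's output before it propagates, can be sketched as follows. Everything here is an assumed stand-in: `propagate`, the producer/consumer lambdas, and the perturbation are invented to show the shape of the analysis.

```python
import random

def propagate(producer, perturb, consumer, trials=100, seed=0):
    """Interface propagation analysis sketch: corrupt values flowing from
    one component into the next and record how the consumer reacts."""
    rng = random.Random(seed)
    outcomes = {"ok": 0, "exception": 0}
    for _ in range(trials):
        value = producer(rng)            # output of COTS component 1
        corrupted = perturb(value, rng)  # modify output before it propagates
        try:
            consumer(corrupted)          # input of COTS component 2
            outcomes["ok"] += 1
        except Exception:
            outcomes["exception"] += 1
    return outcomes

# Hypothetical pair: a producer of small counts, a consumer that divides by them
results = propagate(producer=lambda r: r.randint(1, 10),
                    perturb=lambda v, r: r.choice([v, -v, None]),
                    consumer=lambda v: 1 / v)
```

A high exception count shows which corruptions the downstream component fails to tolerate, which is exactly the propagation information the technique is after.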

Page 38: COTS Testing

Fault Injection used for

Robustness Testing. Error propagation Analysis. Reliability Testing. Security Testing.

Page 39: COTS Testing

Robustness Testing

Page 40: COTS Testing

COTS testing for OS failures

[Diagram: an operating system wrapper is interposed between the COTS component and the operating system.]

Page 41: COTS Testing

Ballista approach

Based on the fault injection technique. Test cases are generated using the parameter types of an interface. Independent of internal functionality. Testing is not complete.

Page 42: COTS Testing

Test value Data Base

Page 43: COTS Testing

Test value Data Base(contd.)

Integer data type: 0, 1, -1, MAXINT, -MAXINT, selected powers of two, powers of two minus one, and powers of two plus one.

Float data type: 0, 1, -1, +/-DBL_MIN, +/-DBL_MAX, pi, and e.

Pointer data type: NULL, -1 (cast to a pointer), pointer to free’d memory, and pointers to malloc’ed buffers of various powers of two in size.

Page 44: COTS Testing

Test value Data Base(contd.)

String data type (based on the pointer base type): includes NULL, -1 (cast to a pointer), pointer to an empty string, a string as large as a virtual memory page, a string 64K bytes in length.

File descriptor (based on the integer base type): includes -1, MAXINT, and various descriptors: to a file open for reading, to a file open for writing, to a file whose offset is set to end of file, to an empty file, and to a file deleted after the file descriptor was assigned.

Page 45: COTS Testing

Test case generation

All combinations of values for the parameter types are generated.

The number of test cases generated is the product of the sizes of the test value sets for each parameter type.
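This generation scheme can be sketched directly from the test value database above. The concrete powers of two (2**16) and the `MAXINT` stand-in are illustrative choices, not values mandated by Ballista.

```python
from itertools import product

MAXINT = 2**31 - 1  # stand-in for the platform's largest integer

# Test value database keyed by parameter type (values follow the slides;
# the power-of-two entries use 2**16 purely as an example)
TEST_VALUES = {
    "int": [0, 1, -1, MAXINT, -MAXINT, 2**16, 2**16 - 1, 2**16 + 1],
    "float": [0.0, 1.0, -1.0, 3.141592653589793, 2.718281828459045],
}

def generate_test_cases(param_types):
    """Ballista-style generation: the cross product of the test value sets
    for each parameter type of the interface under test."""
    return list(product(*(TEST_VALUES[t] for t in param_types)))

# A hypothetical interface taking (int, float): 8 * 5 = 40 test cases
cases = generate_test_cases(["int", "float"])
```

Note how the case count depends only on the parameter types, never on the component's internals: that is what makes the approach applicable to black-box COTS interfaces.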

Page 46: COTS Testing

Error propagation analysis

Interface propagation analysis is used by injecting faults at one component.

This is done at the component integration level. A known faulty input is injected into the system using a fault injector. The components affected by this input are observed (how they handle the faulty input).

Page 47: COTS Testing

Performance Testing

Page 48: COTS Testing

Middleware

An application's execution and its middleware cannot be divorced in any meaningful way.

To predict the performance of an application component, the performance of its middleware should be analyzed.

Page 49: COTS Testing

Performance prediction Methodology

An application's performance prediction is a three-step process:

1. Obtaining the technology performance profile.

2. Analyzing architecture-specific behavioral characteristics.

3. Analyzing application-specific behavioral characteristics.

Page 50: COTS Testing

Technology performance profile

Page 51: COTS Testing

Technology performance profile (contd.)

Page 52: COTS Testing

Technology performance profile (contd.)

Page 53: COTS Testing

Architecture behavior

Identity Application

Page 54: COTS Testing

Effect of database access through middleware

The performance of the entity bean architecture is less than 50% of the performance of the session bean only Architecture.

[Diagram: in the entity bean architecture, a session bean calls an entity bean inside the container, which accesses the DB.]

Page 55: COTS Testing

Effect of Server Thread

The performance increases from 2 threads to 32 threads, stabilizes around 32 to 64 threads, and gradually decreases as more threads are added due to contention.

Page 56: COTS Testing

The Effect of Client Request Load.

Client response time increases with concurrent client request rate due to contention for server threads.

Page 57: COTS Testing

Effect of Database Contention

Database contention reduces performance to between 20% and 49%.

Page 58: COTS Testing

Optimal Number of threads

Page 59: COTS Testing

Load Testing

Page 60: COTS Testing

Load Testing

It is simply performance testing under various loads.

Performance is measured as connections per second (CPS), throughput in bytes per second, and round-trip time (RTT).

Page 61: COTS Testing

Load Test Application

[Diagram: a load test application drives, over Ethernet, the system under test: a web server, an application server, and a DB server.]

Page 62: COTS Testing

Testing strategy

Load tests will be conducted in three phases:

1. Consumption of server resources as a function of the volume of incoming requests will be measured.

2. Response time for sequential requests will be measured.

3. Response time under concurrent client request load will be measured.
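Phases 2 and 3 can be sketched with a simple timing harness. `fake_request` is a hypothetical stand-in for a real client request; a real load test would issue HTTP requests to the system under test and also record CPS and throughput.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request():
    """Stand-in for one client request to the system under test."""
    time.sleep(0.01)
    return 200

def sequential_rtt(n):
    """Phase 2: mean response time for n back-to-back requests."""
    start = time.perf_counter()
    for _ in range(n):
        fake_request()
    return (time.perf_counter() - start) / n

def concurrent_rtt(n, workers):
    """Phase 3: mean per-request time under concurrent client load."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(lambda _: fake_request(), range(n)))
    return (time.perf_counter() - start) / n

seq = sequential_rtt(10)
con = concurrent_rtt(10, workers=10)
```

Comparing `seq` and `con` across increasing request volumes exposes the contention effects described on the earlier slides (thread pool saturation, rising client response time).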

Page 63: COTS Testing

Security Testing

Page 64: COTS Testing

Security Risks with COTS

Component design. Component procurement. Component integration. System maintenance.

Page 65: COTS Testing

Component Design

Inadvertently flawed component design. Intentionally flawed component design. Excessive component functionality. Open or widely spread component design. Insufficient or incorrect documentation.

Page 66: COTS Testing

System maintenance

Insecure updating. Unexpected side effects. Maintenance backdoors.

Component integration

Mismatch between product security levels, e.g. UNIX and CORBA security integration.

Page 67: COTS Testing

Privacy Data Base Risks

Page 68: COTS Testing

Risks revealed

Trojan horse in client. Information leaking to swap file. DBMS log files. DBMS ordering of records.

Page 69: COTS Testing

Piracy avoidance techniques

Hardware and software tokens. Dynamic Decryption of Code. Watermarking. Code Partitioning.

Page 70: COTS Testing

Regression testing for COTS

Page 71: COTS Testing

I-BACCI process

1. Decompose the binary file of the component and filter out trivial information.

2. Compare the code sections between the two versions.

3. Identify glue code functions.

4. Identify change propagation into other components/the system.

5. Select test cases to cover only the affected glue code functions (functions in the firewall).
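The core of the process, comparing code sections between versions and selecting only the glue code that touches what changed, can be sketched as follows. The section names, the digest-based comparison, and the glue-to-section map are simplifying assumptions for illustration; real I-BACCI works on the structure of the binary itself.

```python
import hashlib

def section_digests(sections):
    """Step 1: decompose the binary into named code sections and digest them."""
    return {name: hashlib.sha256(data).hexdigest()
            for name, data in sections.items()}

def changed_sections(old, new):
    """Step 2: compare code sections between the two versions."""
    return {name for name in new if old.get(name) != new[name]}

def firewall(changed, glue_map):
    """Steps 3-5: pick the glue code functions touching a changed section."""
    return {fn for fn, deps in glue_map.items() if deps & changed}

# Hypothetical component with two code sections, one changed between versions
v1 = section_digests({"parse": b"\x01\x02", "render": b"\x03"})
v2 = section_digests({"parse": b"\x01\xff", "render": b"\x03"})
changed = changed_sections(v1, v2)                      # {'parse'}
glue_map = {"load_doc": {"parse"}, "draw": {"render"}}  # glue -> sections used
retest = firewall(changed, glue_map)                    # {'load_doc'}
```

Only `load_doc` needs retesting here; `draw` depends solely on an unchanged section, which is exactly the regression-test reduction the process aims for.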

Page 72: COTS Testing

Black box understanding of COTS

Page 73: COTS Testing

Methods for understanding

Binary reverse engineering. Interface probing. Partial automation of interface probing.

Page 74: COTS Testing

Binary reverse engineering

Derives the design structure (call graph, control flow graph) from binary code.

Source code can also be partially extracted using decompilation.

Decompiled source code has no comments, and its variable names are not meaningful.

Licenses often forbid decompilation back to source code.

Page 75: COTS Testing

Interface probing

The system developer designs a set of test cases, executes them, and analyzes the outputs.

This is done in an iterative manner.

Page 76: COTS Testing

Disadvantages

A large number of test cases have to be generated and analyzed.

Determining some properties may require significant probing, which can be tedious, labor-intensive, and expensive.

Developers may miss certain limitations and make incorrect assumptions.

Page 77: COTS Testing

Partial Automation of interface probing

Based on interface probing. Test cases are generated based on scenarios. Testing is done in three phases:

1. Scenario description phase.

2. Search space specification phase.

3. Test case generation phase.
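The three phases can be sketched with a toy representation. The scenario dictionary, its fields, and the parameter values are all invented here; the slides do not prescribe a concrete format, so this only shows how a search space specification turns into concrete probe inputs.

```python
from itertools import product

# Hypothetical scenario: the search space bounds each parameter the probe
# may vary; test cases are all combinations inside that space.
scenario = {
    "name": "read after open",
    "search_space": {"mode": ["r", "rb"], "size": [0, 1, 4096]},
}

def generate_probes(scenario):
    """Phase 3: turn a search space specification into concrete probe inputs."""
    space = scenario["search_space"]
    names = list(space)
    return [dict(zip(names, combo)) for combo in product(*space.values())]

probes = generate_probes(scenario)  # 2 * 3 = 6 probe configurations
```

Each generated probe is then executed against the black-box component and its outputs analyzed, iterating as in plain interface probing but with the tedious enumeration automated.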