Testing: A Roadmap

Mary Jean Harrold, College of Computing, Georgia Institute of Technology

Presented by Prashanth L and Anmol N M

Transcript

Page 1

Testing: A Roadmap

Mary Jean Harrold, College of Computing

Georgia Institute of Technology

Presented by Prashanth L and Anmol N M

Page 2

Introduction

• Definition

• Purposes for which testing is performed

• Key Concepts

• Advantages

• Disadvantages

Page 3

Introduction

• Definition

– Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results.

– Constitutes more than 50% of the cost of software development.

Page 4

Introduction

Purposes

• To improve quality

• For verification and validation (V&V)

– Functionality (exterior quality): correctness, reliability, usability, integrity

– Engineering (interior quality): efficiency, testability, documentation, structure

– Adaptability (future quality): flexibility, reusability, maintainability

• For reliability estimation

Page 5

Introduction

Key Concepts

• Taxonomy: correctness testing, performance testing, reliability testing, security testing

• Testing Automation

• When to Stop Testing?

Page 6

Introduction

• Correctness testing

– The minimum requirement of software; can be either black box or white box.

– Black box: treats the software as a black box; only the inputs and outputs are visible, and the tests exercise basic functionality. White box: the structure and flow of the software under test are visible to the tester. (A small illustration of the distinction follows after this list.)

• Performance testing: targets design problems in software that cause system performance to degrade.

• Reliability testing: the robustness of a software component is the degree to which it can function correctly in the presence of exceptional inputs or stressful environmental conditions.

• Security testing: includes identifying and removing software flaws that may potentially lead to security violations, and validating the effectiveness of security measures. Simulated security attacks can be performed to find vulnerabilities.
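To make the black-box/white-box distinction above concrete, here is a small, hypothetical Python sketch (not from the paper): the function under test and all test names are invented. The black-box tests are written only from the stated behaviour, while the white-box tests are chosen by inspecting the code so that every branch is executed.

    # Hypothetical function under test (invented for illustration).
    def classify_triangle(a, b, c):
        """Classify a triangle by its three side lengths."""
        if a <= 0 or b <= 0 or c <= 0 or a + b <= c or a + c <= b or b + c <= a:
            return "invalid"
        if a == b == c:
            return "equilateral"
        if a == b or b == c or a == c:
            return "isosceles"
        return "scalene"

    # Black-box tests: derived only from the specification (inputs vs. expected
    # outputs), with no knowledge of the implementation.
    def test_black_box():
        assert classify_triangle(3, 3, 3) == "equilateral"
        assert classify_triangle(3, 4, 5) == "scalene"
        assert classify_triangle(1, 2, 10) == "invalid"

    # White-box tests: chosen by looking at the code so that every branch
    # (each return statement) is exercised at least once.
    def test_white_box_branch_coverage():
        assert classify_triangle(0, 1, 1) == "invalid"      # degenerate-side branch
        assert classify_triangle(2, 2, 2) == "equilateral"  # all-sides-equal branch
        assert classify_triangle(2, 2, 3) == "isosceles"    # two-sides-equal branch
        assert classify_triangle(2, 3, 4) == "scalene"      # fall-through branch

    if __name__ == "__main__":
        test_black_box()
        test_white_box_branch_coverage()
        print("all tests passed")

In practice the white-box tests would be guided by a coverage tool rather than chosen by hand, but the split shown here captures the idea.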

Page 7

Introduction

• Testing Automation

– Automation is a good way to cut down time and cost.

– Software testing tools and techniques usually suffer from a lack of generic applicability and scalability.

• When to Stop Testing?

– Testing is a trade-off between budget, time, and quality, and it is driven by profit models.

– Pessimistic view: stop when time, budget, or test cases are exhausted.

– Optimistic view: stop when reliability meets the requirement, or when the benefit from continuing testing cannot justify the testing cost. (A small decision sketch follows after this list.)

• Advantages

– Easy generation of test cases and instrumentation of software.

– Process automation and execution in the expected environment.
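As a rough illustration of the pessimistic and optimistic stopping criteria above, the following Python sketch encodes them as a simple decision function. All field names and threshold values are invented for illustration; they are not part of the paper.

    # Rough sketch of the pessimistic vs. optimistic stopping criteria described above.
    # Field names and numbers are invented for illustration only.
    def should_stop_testing(status):
        # Pessimistic view: stop when time, budget, or test cases are exhausted.
        resources_exhausted = (
            status["hours_left"] <= 0
            or status["budget_left"] <= 0
            or status["untried_test_cases"] == 0
        )
        # Optimistic view: stop when reliability meets the requirement, or when the
        # expected benefit of further testing no longer justifies its cost.
        goal_reached = (
            status["estimated_reliability"] >= status["required_reliability"]
            or status["expected_benefit_of_next_round"] < status["cost_of_next_round"]
        )
        return resources_exhausted or goal_reached

    print(should_stop_testing({
        "hours_left": 40, "budget_left": 5000, "untried_test_cases": 120,
        "estimated_reliability": 0.995, "required_reliability": 0.99,
        "expected_benefit_of_next_round": 200, "cost_of_next_round": 800,
    }))  # True: the optimistic criterion is already met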

Page 8

Roadmap

Fundamental Research

– Testing Component-Based Systems
– Testing Based on Precode Artifacts
– Testing Evolving Software
– Demonstrating Effectiveness of Testing Techniques
– Establishing Effective Process for Testing
– Using Testing Artifacts
– Other Testing Techniques
– Methods and Tools
– Empirical Studies
– Testing Resources

Page 9

Testing Component Based Systems

• Issues

– Component provider (developer of the components): views components independently of the context in which they are used.

– Component user (application developer): views components in the context of the application.

• Limiting factor

– Availability of the component's source code.

• Roadmap suggested

– Identify the types of testing information the component user needs.

– Develop techniques for representing and computing those types of testing information.

– Develop techniques that use the information provided with the component to test the application.

Page 10

Testing Based On Precode Artifacts

• Precode artifacts: design, requirements, architectural specifications

• Issue under the spotlight: software architecture

• Roadmap suggested

– Make use of the formal notations employed for software architecture.

– Develop techniques to be used with architectural specifications for test-case development.

– Develop techniques to evaluate software architectures for testability.

Page 11

Testing Evolving Software

• Regression testing: validate modified software to ensure that no new errors have been introduced. It is one of the most expensive parts of software maintenance.

• Some useful techniques
* Select a subset of the test suite from previous testing (regression test selection).
* Techniques to help manage growth in the size of the test suite.
* Assess regression testability.

• Testing techniques are needed to
* handle not only software but also architecture and requirements,
* manage the test suites themselves,
* identify parts of the modified software for which new test cases are required,
* identify test cases that are no longer needed,
* prioritize test cases (see the sketch after this list), and
* assess the test suites themselves.
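The following minimal Python sketch illustrates the regression-test selection and prioritization ideas above, assuming each test's statement coverage from a previous run is available. The coverage data, test names, and changed-line numbers are invented for illustration; real techniques are considerably more sophisticated.

    # Minimal sketch of coverage-based regression test selection and prioritization.
    # Assumes per-test statement coverage (sets of line numbers) from a previous run.
    # All names and numbers are invented for illustration.

    coverage = {
        "test_login":    {10, 11, 12, 20},
        "test_logout":   {10, 30, 31},
        "test_checkout": {40, 41, 42, 43},
    }

    changed_lines = {11, 30}  # lines modified since the previous version

    # Selection: rerun only the tests that executed at least one changed line.
    selected = [t for t, cov in coverage.items() if cov & changed_lines]

    # Prioritization: order the selected tests by how many changed lines they cover.
    prioritized = sorted(selected,
                         key=lambda t: len(coverage[t] & changed_lines),
                         reverse=True)

    print("selected:", selected)        # test_checkout is not selected
    print("prioritized:", prioritized)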

Page 12

Demonstrating Effectiveness of Testing Techniques

• How?
* Increase developers' confidence in the software's behavior.
* Identify the classes of faults that a given test criterion can detect.
* Provide visual interfaces for inspecting testing information.
* Determine the interactions among test-selection criteria and ways to combine them.

• Research that has been done
* Evaluation criteria to determine the adequacy of test suites, and test cases that inspire confidence.
* Test cases based either on the software's intended behavior or purely on the code.
* Test cases based on data flow in a program.
* Use of existing testing techniques to test visual programming languages.
* Testing of complex Boolean expressions.
* Mutation analysis and approximations (for example, avoiding pointer variables in data-flow analysis); a small mutation-analysis sketch follows below.
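As a toy illustration of the mutation-analysis idea mentioned above (not the specific techniques surveyed in the paper), the sketch below applies small hand-written mutants to an invented function and checks whether the test suite "kills" each one; real mutation tools generate mutants automatically.

    # Minimal illustration of mutation analysis: apply small syntactic changes
    # (mutants) to a function and check whether the test suite "kills" each one.
    # The function, mutants, and tests are invented for illustration.

    def max_of_two(a, b):
        return a if a > b else b

    # Hand-written mutants of max_of_two (normally generated automatically).
    mutants = {
        "relational op >=": lambda a, b: a if a >= b else b,  # equivalent mutant, survives
        "swapped operands": lambda a, b: b if a > b else a,   # should be killed
        "constant return":  lambda a, b: 0,                   # should be killed
    }

    tests = [((1, 2), 2), ((5, 3), 5), ((4, 4), 4)]

    def suite_passes(fn):
        return all(fn(*args) == expected for args, expected in tests)

    assert suite_passes(max_of_two)

    for name, mutant in mutants.items():
        status = "killed" if not suite_passes(mutant) else "survived"
        print(f"{name}: {status}")
    # A high proportion of killed mutants suggests the test suite is sensitive to faults.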

Page 13

Establishing Effective Process for Testing

• Need to develop a process for the planning and implementation of testing.

• Some currently used or proposed techniques:
* Develop a test plan during the requirements-gathering phase and implement it after the software implementation phase. Is this useful?
* What does Microsoft do?
– Frequently synchronize what people are doing.
– Periodically stabilize the product in increments as the project proceeds.
– Build and test a version every night (few organizations other than Microsoft can afford this).
* Perpetual testing: build a foundation for treating analysis and testing as ongoing activities aimed at improved quality.
* Selective regression testing, in which one version is tested and testing artifacts such as input/output pairs and coverage information are gathered.
* An explicit process for regression testing that integrates many key testing techniques into the development and maintenance of evolving software.

Page 14

Establishing Effective Process for Testing (cont'd)

• Some open questions

– Does Microsoft's nightly rebuild-and-test minimize the testing needed later?

– Does testing reveal all of the software's qualities?

– Can results obtained from testing be generalized?

• Some useful suggestions

– Integrate various quality techniques and tools.

– Combine static analysis with testing.

Page 15

Using Testing Artifacts

• Artifacts include:
– Execution traces of the software's runs with test cases.
– Execution results, such as the pass/fail outcome of the software for given test cases.

• Useful: store the results and reuse them when retesting modified software.

• A great deal of research has been done in this area; some proposals:
* Use dynamic program slices derived from execution traces, together with the pass/fail results of those traces, to identify potentially faulty code; heuristics are applied over the subsets of the test suite that passed and that failed (a small fault-localization sketch follows below).
* Identify program invariants and use them to drive test-case generation.
* Use coverage information to predict the magnitude of regression testing.
* Use coverage information also to select test cases from the test suite for use in regression testing.
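The sketch below gives a rough feel for the idea of combining coverage data (a simple stand-in for execution traces) with pass/fail results to rank suspicious statements, loosely in the spirit of spectrum-based fault localization (Tarantula-style scoring). The coverage data, test outcomes, and scoring formula are illustrative only and are not the specific techniques surveyed in the paper.

    # Rough sketch of using coverage plus pass/fail results to rank suspicious lines,
    # loosely in the spirit of spectrum-based fault localization.
    # Coverage data, test outcomes, and the scoring formula are illustrative only.

    coverage = {               # line number -> tests that executed it
        7: {"t1", "t2", "t3"},
        8: {"t1", "t3"},
        9: {"t2"},
    }
    failed = {"t3"}
    passed = {"t1", "t2"}

    def suspiciousness(line):
        f = len(coverage[line] & failed) / max(len(failed), 1)
        p = len(coverage[line] & passed) / max(len(passed), 1)
        return f / (f + p) if (f + p) > 0 else 0.0

    ranking = sorted(coverage, key=suspiciousness, reverse=True)
    for line in ranking:
        print(f"line {line}: suspiciousness {suspiciousness(line):.2f}")
    # Lines executed mostly by failing tests float to the top of the ranking.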

Page 16

Using Testing Artifacts (cont'd)

• Proposed research (cont'd)
* Use artifacts for test-suite reduction and prioritization (a small reduction sketch follows after this list).
* Perform concept analysis on coverage information and compute relationships among executed entities; this helps uncover properties of test suites.
* Path spectra identify the paths along which control diverges during program execution, which is helpful in debugging, testing, and maintenance, but they are expensive to collect.
* Use branch spectra, a less expensive form of profiling.
* Use visual tools to analyze testing information.

• Additional research needed
* Broader use of testing artifacts for software-engineering tasks.
* Identify the types of information software engineers and managers need (data mining).
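To illustrate the test-suite reduction idea above, here is a minimal greedy set-cover sketch over per-test coverage data; the data and requirement names are invented, and real reduction techniques balance many more concerns (fault-detection ability, cost, and so on).

    # Minimal sketch of coverage-based test-suite reduction via a greedy set cover:
    # repeatedly keep the test that covers the most not-yet-covered requirements.
    # Coverage data and requirement names are invented for illustration.

    coverage = {
        "t1": {"r1", "r2"},
        "t2": {"r2", "r3"},
        "t3": {"r1", "r2", "r3"},
        "t4": {"r4"},
    }

    uncovered = set().union(*coverage.values())
    reduced_suite = []

    while uncovered:
        # Pick the test that covers the largest number of still-uncovered requirements.
        best = max(coverage, key=lambda t: len(coverage[t] & uncovered))
        if not coverage[best] & uncovered:
            break
        reduced_suite.append(best)
        uncovered -= coverage[best]

    print(reduced_suite)  # ['t3', 't4'] covers everything the full suite covered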

Page 17

Other Testing Techniques

• Some other techniques helpful in reaching the end goal (quality software):
* The need for scalable, automatic test-data generation (a naive random-generation sketch follows below).
* Static analysis is required but expensive; scalable analysis techniques are needed to compute the required information.
* Data-flow analysis is expensive, so efficient instrumentation and recording techniques are needed.
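As the crudest possible illustration of automatic test-data generation, the sketch below throws random inputs at an invented function and checks a simple property (the oracle). Research techniques in this area are far more sophisticated (search-based, constraint-based, or symbolic), so this is only a toy.

    # Toy sketch of automatic test-data generation in its simplest form:
    # random inputs checked against a property (oracle).
    # The function under test is invented for illustration.
    import random

    def absolute_value(x):
        return -x if x < 0 else x   # hypothetical function under test

    def property_holds(x):
        # Oracle: the result is never negative and squares to x*x.
        r = absolute_value(x)
        return r >= 0 and r * r == x * x

    random.seed(0)
    for _ in range(1000):
        x = random.randint(-10**6, 10**6)
        assert property_holds(x), f"property violated for input {x}"
    print("no counterexample found in 1000 random inputs")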

Page 18

Methods and Tools

• Goal

– Develop efficient methods and tools that practitioners can use to test their software.

• Complaint

– Software-engineering technology requires, on average, 18 years to be transferred into practice. We need to reduce this time for technology transfer.

• Reasons

– Techniques are typically demonstrated on contrived or toy systems and are not shown to scale.

• What we need

– Development of tools and methods in industrial settings to demonstrate the usefulness of the proposed techniques.

– The techniques proposed should be scalable.

– Develop robust prototypes, identify the contexts in which they can function, and use them to perform experiments that demonstrate the techniques.

Page 19

Methods and Tools (cont'd)

• What we need (cont'd)

– Tools that consider computational trade-offs such as precision versus efficiency.

– Automatic generation of methods and tools, along the lines of compilers.

– Tools that are attractive to practitioners.

– Finally, tools that require minimal involvement from software engineers.

Page 20

Empirical Studies

• What does it mean?

– Studies that help demonstrate the scalability and usefulness of techniques in practice; in other words, they provide feedback for future research.

• Difficulties in doing empirical studies

– Difficulty in acquiring sufficiently robust implementations of these techniques.

– Difficulty in obtaining sufficient experimental subjects (software and test suites).

• Solutions

– Collect sets of experimental subjects and make them available to researchers.

– Create sanitized data that reveal no proprietary information yet remain useful for experimentation.

Page 21

Testing Resources

• Workshops, Conferences, Journals and Useful Links

– Workshop on Strategic Directions in Software Quality, 1996 (ACM)

– National Science Foundation and the Computing Research Association

– Workshop on the Role of Software Architectures in Testing and Analysis (INRC)

– International Conference on Software Engineering Workshop on Testing Distributed Component-Based Systems

– Middle Tennessee State's STORM software-testing resource: http://www.mtsu.edu/~storm/

– Reliable Software Technology's Software Assurance Hotlist: http://www.rstcorp.com/hotlist

– Research Institute's Software Quality Hotlist: http://www.soft.com/Institute/HotList/index.html

– Newsgroup: comp.software.testing

Page 22

Conclusions

• Relevance To Embedded Systems

– Emphasizes the basic stages, setting up the next paper.

– Discusses evolving systems; embedded systems evolve by the second.

– Discusses component-based testing (COTS components).

– Emphasizes the need for testing based on precode artifacts, such as the software architecture.

– Calls for examining current techniques empirically, to demonstrate their scalability and usefulness in practice.

• Weakness: none identified.