16103293 Software Testing Practical

Transcript of 16103293 Software Testing Practical

  • 8/14/2019 16103293 Software Testing Practical

    SOFTWARE TESTING

    By VIKRAM


    Contents

    Software Quality
    Fish Model
    V-Model

    Reviews in Analysis

    Reviews in Design
    Unit Testing

    Integration Testing

    System Testing

    1. Usability Testing
    2. Functional Testing
    3. Non Functional Testing

    User Acceptance Testing

    Release Testing

    Testing During Maintenance

    Risks and Ad-Hoc Testing

    System Testing Process

    Test Initiation

    Test Planning

    Test Design

    Test Execution

    1. Formal Meeting
    2. Build Version Control
    3. Levels of Test Execution
    4. Levels of Test Execution VS Test Cases
    5. Level-0 (Sanity Testing)
    6. Level-1 (Comprehensive Testing)
    7. Level-2 (Regression Testing)

    Test Reporting

    Test Closure

    User Acceptance Testing

    Sign Off

    Case Study
    Manual Testing VS Automation Testing
    WinRunner
    Automation Test creation in WinRunner

    Recording Modes
    1. Context Sensitive mode
    2. Analog mode

    Check Points

    1. GUI check point
       for single property
       for object or window
       for multiple objects

    2. Bitmap check point
       for object or window
       for screen area

    3. Database check point


       default check
       custom check
       runtime record check

    4. Text check point
       from object or window
       from screen area

    Data Driven Testing

    1. From Key Board
    2. From Flat Files
    3. From Front end Objects
    4. From XL Sheets

    Silent mode

    Synchronization point

    1. wait
    2. for object/window property
    3. for object/window bitmap
    4. for screen area bitmap
    5. Change Runtime settings

    Function Generator

    Administration of WinRunner

    1. WinRunner Framework
       Global GUI Map file
       Per Test mode

    2. Changes in references
    3. GUI Map configuration
    4. Virtual object wizard
    5. Descriptive Programming
    6. Start script
    7. Selected applications

    User defined functions

    Compiled Module

    Exception Handling or Recovery Manager

    1. TSL exceptions
    2. Object exceptions
    3. Popup exceptions

    Web Test Option

    1. Links coverage
    2. Content coverage
    3. Web functions

    Batch Testing
    Parameter Passing

    Data Driven Batch Testing

    Transaction Point

    Debugging

    Short/Soft Key Configuration

    Rapid Test Script Wizard (RTSW)

    GUI Spy

    QuickTest Professional

    Recording modes

    1. General recording


    2. Analog recording
    3. Low level recording

    Check Points

    1. Standard check point
    2. Bitmap check point
    3. Text check point
    4. Textarea check point
    5. Database check point
    6. Accessibility check point
    7. XML check point
    8. XML check point (File)

    VBScript

    Step Generator

    Data Driven Testing

    DDT through Key Board

    DDT through Front end objects

    DDT through XL sheet

    DDT through flat file

    Multiple Actions

    Reusable actions

    Parameters

    Synchronization Point

    QTP Framework

    Per Test mode

    Export references

    Regular Expressions

    Object Identification

    Smart Identification
    Virtual Object Wizard

    Web Test Option

    Links Coverage

    Content coverage

    Recovery Scenario Manager
    Batch Testing
    Parameter Passing
    With statement
    Active Screen
    Output value
    Call to WinRunner
    Advanced Testing Process


    MANUAL TESTING

    Software Quality

    *Meet customer requirements (ATM)
    *Meet customer expectations (Speed)
    *Cost to purchase
    *Time to release

    Quality Assurance and Quality Control

    "The monitoring and measuring of the strength of the development process is called Software Quality Assurance." It is a process-based concept.

    "The testing of a deliverable after completion of a process is called Software Quality Control." QA is specified as Verification and QC is specified as Validation.

    Fish Model (Development VS Testing)

    *In the above process model, the development stages indicate the SDLC (Software Development Life Cycle) and the lower angle indicates the STLC (Software Testing Life Cycle).

    *During SDLC, the organizations follow standards, internal auditing and strategies, called SQA.
    *After completion of every development stage, the organizations conduct testing, called QC.
    *QA indicates defect prevention; QC indicates defect detection and correction.
    *Finally, the STLC indicates Quality Control.

    BRS: The Business Requirements Specification defines the requirements of the customer to be developed as software.

    SRS: The Software Requirements Specification defines the functional requirements to be developed and the system requirements to be used.

    Walk Through: It is a static testing technique. During this, the responsible people study the document to estimate its completeness and correctness.

    Inspection: It is also a static testing technique, used to check a specific factor in the corresponding document.

    Peer Review: The comparison of similar documents is called a point-to-point review.

    HLD: The High Level Design document represents the overall view of a s/w from root functionality to leaf functionality. HLDs are also known as External Design or Architectural Design.

    LLD: The Low Level Design document represents the internal logic of every functionality. This design is also known as Internal or Detailed Design.

    Program: It is a set of executable statements. A s/w is a set of modules/functionalities/features. One module is a set of dependent programs.

    White Box Testing: It is a program-based testing technique used to estimate the completeness and correctness of the internal program structure.

    Black Box Testing: It is a s/w-level testing technique used to estimate the completeness and correctness of the external functionality.

    NOTE: White Box Testing is also known as Clear Box Testing or Open Box Testing. "The combination of WBT and BBT is called Grey Box Testing."


    V-Model (Verification and Validation Model)
    This model defines a conceptual mapping between development stages and testing stages (verifying the process and validating the product).

    The above V-Model defines multiple stages of development with multiple stages of testing. The maintenance of separate teams for all stages is expensive for small and medium scale companies.

    *Due to this reason, the small and medium scale organizations maintain a separate testing team only for System Testing, because this is the bottleneck stage in the s/w process.

    A) Reviews in Analysis

    After completion of requirements gathering, the Business Analyst category people develop the SRS with the required functional requirements and system requirements. The same category people conduct reviews on those documents to estimate the completeness and correctness. In this review meeting, they follow Walk Throughs, Inspections and Peer Reviews to estimate the below factors.

    *are they complete requirements?
    *are they correct requirements?
    *are they achievable requirements? (practically)
    *are they reasonable requirements? (budget and time)
    *are they testable requirements?

    B) Reviews in Design
    After completion of analysis and its reviews, the designer category people prepare HLDs and LLDs. After completion of designing, the same category people conduct a review meeting to estimate the completeness and correctness through the below factors.

    *are they understandable?
    *are they complete?
    *are they correct?
    *are they followable?
    *are they handling errors?

    C) Unit Testing
    After completion of analysis and design, the programming category people start coding.


    "The analysis and design level reviews are also known as Verification Testing." After completion of verification testing, the programmers start coding and verify every program's internal structure using WBT techniques as follows.

    1. Basis Path Testing (whether it is executing or not): In this, the programmers check all executable areas in that program to estimate whether that program is running or not. To conduct this testing, the programmers follow the below approach.

    *write the program w.r.t the design logic (HLDs and LLDs).
    *prepare a flow graph.
    *calculate the number of individual paths in that flow graph, called the Cyclomatic Complexity.
    *run that program more than once to cover all individual paths.
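The path-counting step above can be sketched as follows; the flow graph, node numbering and the `cyclomatic_complexity` helper are illustrative assumptions, not part of the original material.

```python
# Cyclomatic complexity V(G) = E - N + 2P, where E = edges, N = nodes,
# P = connected components; it equals the number of independent paths
# that Basis Path Testing must run to cover the program.

def cyclomatic_complexity(edges, num_nodes, components=1):
    return len(edges) - num_nodes + 2 * components

# Hypothetical flow graph: node 1 entry, node 2 an if/else decision,
# nodes 3/4 its branches, node 5 a loop test (back to 2 or exit to 6).
edges = [(1, 2), (2, 3), (2, 4), (3, 5), (4, 5), (5, 2), (5, 6)]
print(cyclomatic_complexity(edges, num_nodes=6))  # -> 3 independent paths
```

One if/else plus one loop gives decisions + 1 = 3, agreeing with E - N + 2P = 7 - 6 + 2.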

    After completion of Basis Path Testing, the programmers concentrate on the correctness of inputs and outputs using Control Structure Testing.

    2. Control Structure Testing: In this, the programmers verify every statement, condition and loop in terms of completeness and correctness of I/O (example: debugging).

    3. Program Technique Testing: During this, the programmer calculates the execution time of that program. If the execution time is not reasonable, then the programmer performs changes in the structure of that program without disturbing the functionality.

    *4. Mutation Testing: Mutation means a change in a program. Programmers perform changes in a tested program to estimate the completeness and correctness of that program's testing.
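A minimal sketch of the idea, with an assumed `is_adult` function and a toy test suite: if the existing tests fail against a mutant, the mutant is "killed", which is evidence that the original program's testing was adequate.

```python
# Mutation-testing sketch (assumed example, not from the document).

def is_adult(age):            # original, already unit-tested program
    return age > 18

def is_adult_mutant(age):     # mutant: '>' changed to '>='
    return age >= 18

tests = [(18, False), (19, True), (10, False)]  # existing test suite

def survives(fn):
    """A mutant survives only if every existing test still passes."""
    return all(fn(age) == expected for age, expected in tests)

print(survives(is_adult))         # True: original passes its tests
print(survives(is_adult_mutant))  # False: mutant killed -> tests adequate
```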

    D) Integration Testing
    After completion of dependent programs' development and unit testing, the programmers interconnect them to form a complete system. After completion of interconnection, the programmers check the completeness and correctness of that interconnection. This Integration Testing is also known as Interface Testing. There are 4 approaches to interconnect programs and test those interconnections.

    1. Top-Down Approach: In this approach, programmers interconnect the main module and the completed sub modules without using the under-construction sub modules. In the place of the under-construction sub modules, the programmers use temporary or alternative programs called Stubs. These stubs are also known as Called Programs, called by the main module.


    2. Bottom-Up Approach: In this approach, programmers interconnect completed sub modules without the main module, which is under construction. The programmers use a temporary program instead of the main module, called a Driver or Calling Program.
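The stub and driver ideas above can be sketched as follows; the module names (`simple_interest`, `loan_total`) and values are illustrative assumptions.

```python
# Integration-testing sketch (assumed modules and values):
# a Stub replaces an unfinished sub module (top-down integration);
# a Driver replaces an unfinished main module (bottom-up integration).

# Completed sub module, exercised bottom-up via a driver:
def simple_interest(principal, rate, years):
    return principal * rate * years / 100.0

def driver():
    """Driver (calling program): temporary stand-in for the unfinished
    main module; it calls the completed sub module to test it."""
    return simple_interest(1000, 5, 2)

# Completed main module, exercised top-down via a stub:
def interest_stub(principal, rate, years):
    """Stub (called program): temporary stand-in for the unfinished
    interest sub module; returns a fixed dummy value."""
    return 100.0

def loan_total(principal, rate, years, interest_fn=interest_stub):
    return principal + interest_fn(principal, rate, years)

print(driver())                # 100.0 -> sub module works in isolation
print(loan_total(1000, 5, 2))  # 1100.0 -> main-module logic works with stub
```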

    3. Hybrid Approach: It is a combination of the Top-Down and Bottom-Up approaches. This approach is also called the Sandwich Approach.

    4. System Approach: After completion of all modules' development and unit testing, the programmers integrate all the modules at once. This approach is known as the Big Bang Approach.

    E) System Testing
    After completion of all required modules' integration, the development team releases a s/w build to a separate testing team in our organization. The s/w build is also known as the Application Under Test (AUT). This System Testing is classified into 3 levels: Usability Testing, Functional Testing (Black Box Testing techniques) and Non-Functional Testing (this is an expensive testing).

    1. Usability Testing (appearance of the bike)
    In general, the separate testing team starts test execution with usability testing to estimate the user-friendliness of the s/w build. During this, test engineers apply the below sub-tests.
    a) User Interface Testing: whether every screen in the application build has

    *Ease of use (understandable).
    *Look and feel (attractiveness).
    *Speed in interface (short navigations).


    b) Manual Support Testing: During the s/w release, our organization releases user manuals also. Before the s/w release, the separate team validates those user manuals in terms of completeness and correctness.

    Case study

    2. Functional testing

    After completion of user interface testing on the responsible screens in our application build, the separate testing team concentrates on requirements correctness and completeness in that build. In this testing, the separate testing team uses a set of Black Box Testing Techniques, like Boundary Value Analysis, Equivalence Class Partitions, Error Guessing, etc.

    This testing is classified into 2 sub tests as follows

    a) Functionality Testing: During this test, test engineers validate the completeness and correctness of every functionality. This testing is also known as Requirements Testing. In this test, the separate testing team validates the correctness of every functionality through the below coverages.

    *GUI coverage or behavioral coverage (valid changes in the properties of objects and windows in our application build).
    *Error handling coverage (the prevention of wrong operations with meaningful error messages, like displaying a message before closing a file without saving it).
    *Input domain coverage (the validity of i/p values in terms of size and type, like giving alphabets to the age field).
    *Manipulations coverage (the correctness of o/p or outcomes).
    *Order of functionalities (the existence of functionality w.r.t customer requirements).
    *Back end coverage (the impact of front-end screen operations on back-end table content in the corresponding functionality).

    NOTE: The above coverages are applicable to every functionality in our application build, with the help of Black Box Testing Techniques.

    b) Sanitation Testing: It is also known as Garbage Testing. During this test, the separate testing team detects extra functionalities in the s/w build w.r.t customer requirements (like a sign-in link on a sign-in page).

    NOTE: Defects in s/w are of 3 types: Mistake, Missing and Extra.

    3) Non-Functional Testing
    It is also a mandatory testing level in the System Testing phase, but it is expensive and complex to conduct. During this test, the testing team concentrates on characteristics of the s/w.

    *a) Recovery/Reliability Testing: During this, the test engineers validate whether our s/w build changes from an abnormal state to a normal state or not.


    b) Compatibility Testing (a friend's game CD not working on our system): It is also known as Portability Testing (adjusts anywhere). During this test, the test engineers validate whether our s/w build is able to run on the customer's expected platform or not. Platform means the OS, compilers, browsers and other system s/w.

    c) Configuration Testing: It is also known as H/W Compatibility Testing. During this, the testing team validates whether our s/w build supports different technology devices or not (EX: different types of printers, different types of n/w, etc).

    d) Inter-System Testing: It is also known as End-To-End Testing. During this, the testing team validates whether the s/w build co-exists with other s/w or not (to share common resources).

    EX: E-Server

    e) Installation Testing: The order is important: initiation, during and after installation.

    f) Data Volume Testing: It is also known as Storage Testing or Memory Testing. During this, the testing team calculates the peak limit of data handled by the s/w build (EX: hospital software; it is also known as Mass Testing). EX: MS Access technology oriented s/w builds support 2GB of data as a maximum.

    g) Load Testing: It is also known as Performance or Scalability Testing. Load or scale means the number of concurrent users (at the same time) who are operating a s/w. The execution of our s/w build under the customer-expected configuration and customer-expected load to estimate the performance is Load Testing (the inputs are the customer-expected configuration and load; the output is performance). Performance means the speed of the processing.
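A minimal sketch of the load idea, assuming a simulated transaction in place of a real s/w build: run N concurrent "users" and report the worst response time under that load.

```python
# Load-testing sketch (assumed example): execute an operation under a
# given number of concurrent users and measure response time.
import time
from concurrent.futures import ThreadPoolExecutor

def operation():
    """Stand-in for one user transaction, e.g. a link operation."""
    t0 = time.perf_counter()
    sum(range(100_000))                  # simulated work
    return time.perf_counter() - t0

def load_test(concurrent_users):
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        times = list(pool.map(lambda _: operation(), range(concurrent_users)))
    return max(times)                    # worst response time under this load

print(f"worst response under 10 users: {load_test(10):.4f}s")
```

A real load test would target the deployed build over the network; this only shows the concurrent-users-in, response-time-out shape of the measurement.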


    h) Stress Testing: The execution of our s/w build under the customer-expected configuration and various load levels to estimate stability or continuity is called Stress Testing or Endurance Testing.

    i) Security Testing: It is also known as Penetration Testing. During this, the testing team validates authorization, access control and encryption/decryption. Authorization indicates the validity of a user to the s/w, like a student entering a class. Access control indicates the authorities of valid users to use the features or functionalities or modules in that s/w, like a student having limited resources to use after entering.

    Encryption and decryption procedures prevent 3rd-party access.

    NOTE: In general, the separate testing team covers authorization and access control checking; the development people cover encryption/decryption checking.
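The authorization vs access control distinction can be sketched as below; the user table, passwords and role names are assumed for illustration.

```python
# Security-testing sketch (assumed example): authorization validates the
# user; access control validates what a valid user is allowed to do.
users = {"admin": {"pwd": "a1", "roles": {"read", "write", "delete"}},
         "guest": {"pwd": "g1", "roles": {"read"}}}

def authorize(name, pwd):
    """Authorization: is this a valid user of the s/w?"""
    u = users.get(name)
    return u is not None and u["pwd"] == pwd

def access_allowed(name, action):
    """Access control: may this valid user perform this action?"""
    return action in users.get(name, {}).get("roles", set())

print(authorize("guest", "g1"))           # True: valid user
print(access_allowed("guest", "delete"))  # False: no authority to delete
```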

    F) User Acceptance Testing
    After completion of all reasonable tests, the project management concentrates on UAT to garner feedback from real customers or model customers. In this testing, the developers and testers are also involved. There are 2 ways to conduct UAT (the purpose of both is to garner feedback from the customer):

    *Alpha Testing
    *Beta Testing

    Alpha Testing                          | Beta Testing
    In development site (like a tailor)    | In model customer site (like MS OS s/w)
    By real customers                      | By model customers
    Suitable for applications              | Suitable for products
    Developers and testers are involved    | Developers and testers are involved

    G) Release Testing
    After completion of UAT and its modifications, the project manager defines a Release or Delivery team with a few developers, a few testers and a few h/w engineers. This release team goes to the responsible customer site and conducts Release Testing or Port Testing or Green Box Testing. In this testing, the release team observes the below factors at that customer site.

    *Compact installation (fully installed or not).
    *Overall functionality.
    *Input devices handling (keyboard, mouse, etc).


    *Output devices handling (monitor, printer, etc).
    *Secondary storage devices handling (CD drive, hard disk, floppy, etc).
    *OS error handling (reliability).
    *Co-existence with other s/w applications.
    After completion of Port Testing, the responsible release team conducts training sessions for the end users or customer site people.

    H) Testing During Maintenance
    After completion of Release Testing, the customer site people utilize that s/w for the required purposes. During this utilization, the customer site people send change requests to our company. The responsible team to handle those changes is called the CCB (Change Control Board). This team consists of the project manager, a few developers, a few testers and a few h/w engineers. This team receives 2 types of change requests: enhancements and missed/latent defects.

    Case study:

    Testing phase/level/stage                | Responsible           | Testing technique
    In analysis                              | Business analyst      | Walk throughs, inspections and peer reviews
    In design                                | Designer              | Walk throughs, inspections and peer reviews
    Unit testing                             | Programmer            | White box testing
    Integration/interface testing            | Programmer            | Top-down, bottom-up, hybrid and system approaches
    System testing (usability, functional
    and non-functional)                      | Testing team          | Black box testing
    UAT                                      | Real/model customers  | Alpha and beta testing
    Release testing                          | Release team          | Port testing factors
    Testing during maintenance               | CCB                   | Test s/w changes (regression testing)

    Risks and Ad-Hoc Testing
    Sometimes organizations are not able to conduct planned testing. Due to some risks, the testing teams conduct Ad-Hoc testing instead of planned testing. There are different styles of Ad-Hoc testing.

    *Monkey or Chimpanzee testing: Due to lack of time, the testing team covers only the main activities of the s/w functionalities.
    *Buddy testing: Due to lack of time, the developers and testers are grouped as buddies. Every buddy group consists of a developer and a tester, to continue the process in parallel.
    *Exploratory testing: In general, the testing team conducts testing w.r.t the available documents. Due to lack of documentation, the testing team depends on past experience, discussions with others, similar-project browsing and internet surfing. This style of testing is exploratory testing.


    *Pair testing: Due to lack of skills, the junior test engineers are grouped with senior test engineers to share their knowledge.

    *Debugging testing: To estimate the efficiency of the testing people, the development team releases a build to the testing team with known defects.

    The above Ad-Hoc testing styles are also known as INFORMAL TESTING TECHNIQUES.

    System Testing Process
    In general, the small and medium scale organizations maintain a separate testing team only for the System Testing stage. This stage is the bottleneck stage in s/w development.

    Development process VS System Testing process

    Test Initiation
    In general, the system testing process starts with Test Initiation or Test Commencement. In this stage, the Project Manager or Test Manager selects a reasonable approach or methodology to be followed by the separate testing team. This approach or methodology is called the Test Strategy.

    The Test Strategy document consists of the below components.
    1. Scope and Objective: The importance of testing in this project.
    2. Business Issues: The cost and time allocation for testing (100% cost = development & maintenance + 36% testing).
    3. Test Approach: The selected list of reasonable testing factors or issues w.r.t the requirements in the project, the scope of the requirements and the risks involved in testing.
    4. Roles and Responsibilities: The names of the jobs in the testing team and their responsibilities.
    5. Communication & Status Reporting: The required negotiations between every 2 consecutive jobs in the testing team.


    6. Test Automation and Tools: The importance of test automation in this project's testing and the names of the available testing tools in our organization.
    7. Defect Reporting and Tracking: The required negotiation between developers and testers to report and to resolve defects.
    8. Testing Measurements and Metrics: The selected lists of measures and metrics to estimate the testing process.
    9. Risks and Assumptions: The list of expected risks that may come in the future and the solutions to overcome them.
    10. Change and Configuration Management: The management of deliverables related to s/w development and testing.
    11. Training Plan: The required number of training sessions for the testing team before starting the current project's testing process.

    Testing Process or Issues:

    To define a quality s/w, there are 15 factors or issues.
    1. Authorization: The validity of users to connect to that s/w (Security testing). EX: login with password, digital signatures, etc.
    2. Access Control: The permissions of users to use functionalities in a s/w (Security testing). EX: the admin user performs all functionalities and general users perform some of the functionalities.
    3. Audit Trail: The correctness of data about data (metadata). This is Functionality testing.
    4. Data Integrity: The correctness of taking inputs (Functionality testing). EX: testing AGE object inputs.
    5. Correctness: The correctness of the output or outcome (Functionality testing). EX: the mailbox opens successfully after login.
    6. Ease of Use: User-friendliness (Usability testing and Manual support testing). EX: color, font, alignment, etc.
    7. Ease of Operate: Easy to maintain in our environment (Installation testing). EX: installation, uninstallation, downloading, etc.
    8. Portable: Runs on different platforms (Compatibility testing and Configuration testing). EX: Java products run on Windows and UNIX.
    9. Performance: Speed of processing (Load, Volume and Stress testing). EX: 3 seconds is the performance of a website for a link operation.
    10. Reliability: Recovery from abnormal situations (Recovery and Stress testing). EX: backup of the database in a s/w.
    11. Coupling: Co-existence with other s/w applications to share common resources (Inter-system testing). EX: the bank account s/w database is shareable to the loan system s/w in a bank.
    12. Maintainable: Whether our s/w is long-time serviceable to customer site people or not (Compliance testing). EX: Complan food.
    13. Methodology: Whether the project team is following the specified standards or not (Compliance testing).
    14. Service Levels: The order of functionalities or features or services (Functionality and Stress testing). EX: after login, the s/w provides the mailing facility to users.
    15. Continuity of Processing: Means inter-process communication (Integration testing by developers).

    Case Study:
      15  test factors for a quality s/w
      -4  (requirements)
      ---
      11
      +2  (scope of requirements)
      ---
      13
      -4  (risks)
      ---
       9  (finalized factors to be applied)


    In the above example, 9 test factors or issues were finalized by the PM to be applied by the testing team in the current project's system testing.

    Test Planning
    After preparation of the Test Strategy document with the required details, the test lead category people define test plans in terms of What to test, How to test, When to test and Who is to test.

    In this stage, the test lead prepares the system test plan and then divides that plan into module test plans (master test plan into detailed test plans). In this test planning, the test lead follows the below approach to prepare test plans.

    a) Testing team formation: In general, the test planning process starts with testing team formation by the test lead. In this team formation, the test lead depends on the below factors.

    *Project size (EX: number of functional points).
    *Availability of test engineers.
    *Available test duration.
    *Availability of test environment resources (EX: testing tools).

    b) Identify tactical risks: After completion of testing team formation, the test lead analyzes the possible risks w.r.t the team. Example risks are:

    *Lack of knowledge of the project requirements domain
    *Lack of time
    *Lack of resources
    *Delays in delivery
    *Lack of documentation
    *Lack of development process seriousness
    *Lack of communication

    c) Prepare test plans: After completion of testing team formation and risk analysis, the test lead concentrates on master test plan and detailed test plan development. Every test plan document follows a fixed format, IEEE 829 (Institute of Electrical and Electronics Engineers). These IEEE 829 standards are specially designed for test documentation. The format is:

    1. Test Plan Id: A unique number or name for future reference.
    2. Introduction: About the project.
    3. Test Items: The names of all modules or features.
    4. Features to be tested: The names of the modules or features to test.
    5. Features not to be tested: The names of the modules or features which are already tested.
       (3, 4 and 5 indicate What to test.)
    6. Tests to be applied: The selected list of testing techniques to be applied (from the Project Manager's Test Strategy).

    7. Test Environment: Required h/w and s/w including testing tools.


    8. Entry Criteria: When the test engineers are able to start test execution to detect defects in the s/w build.
       *prepared all valid test cases
       *established the test environment
       *received a stable build from developers
    9. Suspension Criteria: When the test engineers interrupt test execution.
       *the test environment is not working
       *a high-severity bug or show-stopper problem is detected
       *pending defects are not serious but are many (called a quality gap)
    10. Exit Criteria: When the test engineers stop test execution.
       *all major bugs are resolved
       *all modules or features are tested
       *crossed the scheduled time
    11. Test Deliverables: The names of the testing documents to be prepared by test engineers.
       *test scenarios
       *test case documents
       *test logs
       *defect logs
       *summary reports
       (6 to 11 indicate How to test.)
    12. Staff & Training Needs: The selected names of test engineers and the required number of training sessions.
    13. Responsibilities: The work allocation in terms of test engineers vs requirements, or test engineers vs testing techniques.
       (12 and 13 indicate Who is to test.)
    14. Schedule: Dates and times. (Indicates When to test.)
    15. Risks and Assumptions: The previously analyzed list of risks and their assumptions.
    16. Approvals: The signatures of the test lead and the project manager or test manager.

    d) Review test plan: After completion of master and detailed test plan preparation, the test lead reviews the documentation for completeness and correctness. In that review meeting, the test lead depends on the following factors.

    *Requirements-oriented plan review.
    *Testing-techniques-oriented plan review.
    *Risks-oriented plan review.

    After completion of this review, the project management conducts training sessions for the selected test engineers. In this training period, the project management invites subject experts or domain experts to share their knowledge with the engineers.

    Test Design

    After completion of the required training, the responsible test engineers concentrate on test case preparation. Every test case defines a unique test condition to be applied on our s/w build. There are 3 methods to prepare test cases:

    *Functional and system specification based test case design
    *Use cases based test case design
    *User interface or application based test case design

    1. Functional and system specification based: In general, most test engineers prepare test cases depending on the functional and system specifications in the SRS.


    Test case title3: check login operation

    Specification2:
    In an insurance application, users can apply for different types of insurance policies. When a user applies for Type A insurance, the system asks the age of that user. The age value should be greater than 16 yrs and less than 70 yrs. Prepare test case titles or scenarios.

    Test case title1: check Type A selection as insurance type.
    Test case title2: check focus to age after selection of Type A.
    Test case title3: check age value.
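Title3 (check age value) lends itself to Boundary Value Analysis. A sketch, assuming "greater than 16 and less than 70" means the valid range is 17..69; the `age_valid` helper is illustrative:

```python
# BVA for the age field of Specification2: test at and around both
# boundaries, where defects cluster.

def age_valid(age):
    return 16 < age < 70

boundary_cases = {16: False, 17: True, 18: True,
                  68: True, 69: True, 70: False}

for age, expected in boundary_cases.items():
    assert age_valid(age) == expected, f"age {age} failed"
print("all boundary cases pass")
```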

    Specification3:
    In a shopping application, users can place different types of item purchase orders. Every purchase order allows the user to select an item number and enter a quantity of up to 10. Every purchase order returns one item price and the total amount. Prepare test case titles or scenarios.

    Test case title1: check item number selection.
    Test case title2: check quantity value.
    Test case title3: check return values using Total amount = Price * Quantity.
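A sketch of title3's manipulation check, with assumed price and quantity values and a hypothetical `place_order` helper standing in for the application:

```python
# Specification3 checks: Total amount = Price * Quantity, and the
# quantity limit of 10 (input-domain coverage).

def place_order(price, quantity):
    if not 1 <= quantity <= 10:
        raise ValueError("quantity must be 1..10")
    return {"price": price, "total": price * quantity}

order = place_order(price=25.0, quantity=4)
assert order["total"] == order["price"] * 4   # manipulations coverage
try:
    place_order(25.0, 11)                     # input-domain coverage
except ValueError:
    print("quantity 11 correctly rejected")
```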

    Specification4:
    A door opens when a person comes in front of that door, and the door closes when that person goes inside. Prepare test case titles or scenarios.

    Test case title1: check door open.


    Test case title2: check door closed.

    Test case title3: check door operation when a person is standing at the middle of the door.

    *Specification5:
    In an e-banking application, users connect to the bank server using an internet connection. In this application, users fill the below fields to login to the bank server.
    Password - 6 digit number.
    Area code - 3 digit number, optional.
    Prefix - 3 digit number that does not start with 0 or 1.
    Suffix - 6 digit alphanumeric.
    Commands - cheque deposit, money transfer, mini statement and bills pay.
    Prepare test case titles or scenarios.

    Test case title1: check password.

    Test case title2: check area code.

    Test case title3: check prefix.

    Test case title4: check suffix.


    Test case title5: check command selection.
    Test case title6: check login operation to connect to bank server.
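    The field rules of Specification5 can be checked mechanically. A sketch using regular expressions (the patterns and helper names are my assumptions; the rules come from the spec):

```python
import re

# Hypothetical regex checks for Specification5's login fields.
RULES = {
    "password": r"\d{6}",          # 6 digit number
    "area_code": r"(\d{3})?",      # 3 digit number, optional (may be empty)
    "prefix": r"[2-9]\d{2}",       # 3 digits, not starting with 0 or 1
    "suffix": r"[0-9a-zA-Z]{6}",   # 6 alphanumeric characters
}

def check(field, value):
    return re.fullmatch(RULES[field], value) is not None

assert check("password", "123456") and not check("password", "12345")
assert check("area_code", "") and check("area_code", "040")
assert check("prefix", "212") and not check("prefix", "099")
assert check("suffix", "ab12cd") and not check("suffix", "ab1cd")
```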

    Specification6: For the computer shutdown operation, prepare test case titles or scenarios.

    Test case title1: check Shutdown option selection using Start menu.
    Test case title2: check Shutdown option selection using Alt+F4.
    Test case title3: check Shutdown operation using Command prompt.
    Test case title4: check Shutdown operation using Shutdown option in Start menu.
    Test case title5: check Shutdown operation while a process is running.

    Test case title6: check Shutdown operation using Power off button.

    Specification7: For washing machine operations, prepare test case titles.

    Test case title1: check power supply to washing machine.
    Test case title2: check door open.
    Test case title3: check water filling.
    Test case title4: check clothes filling.
    Test case title5: check door closed.
    Test case title6: check door closed when clothes overflow.
    Test case title7: check selection of washing settings.

    Test case title8: check washing operation.
    Test case title9: check washing operation with improper power supply.
    Test case title10: check its operation when the door is opened in the middle of the process (Security testing).
    Test case title11: check its operation when water leaks from the door (Security testing).
    Test case title12: check its operation with clothes overload (Stress testing).
    Test case title13: check with improper settings.
    Test case title14: check with any machinery problem.

    Specification8: Money withdrawal from an ATM with all rules and regulations.

    Test case title1: check ATM card insertion.

    Test case title2: check operation with card inserted the wrong way.
    Test case title3: check operation with invalid card insertion (like other bank card, expired card, scratches etc).
    Test case title4: check pin number entry.
    Test case title5: check operation when the wrong pin number is entered 3 times consecutively.
    Test case title6: check language selection.
    Test case title7: check account type selection.
    Test case title8: check operation when the wrong account type is selected w.r.t the inserted card.
    Test case title9: check withdrawal option selection.
    Test case title10: check amount entry.
    Test case title11: check operation when the amount is entered with wrong denominations (EX: withdrawal of Rs 999).
    Test case title12: check withdrawal operation success (received correct amount, getting the right receipt and able to take the card back).


    Test case title13: check withdrawal operation when the given amount is greater than the available balance.
    Test case title14: check withdrawal operation when the ATM machine has a lack of amount.
    Test case title15: check withdrawal operation when the ATM has a machinery or network problem.
    Test case title16: check withdrawal operation when the given amount is greater than the day limit.
    Test case title17: check withdrawal operation when the current transaction number is greater than the number of transactions allowed per day.
    Test case title18: check withdrawal operation when you click Cancel after insertion of the card.

    Test case title19: check withdrawal operation when you click Cancel after entering the pin number.
    Test case title20: check withdrawal operation when you click Cancel after language selection.
    Test case title21: check withdrawal operation when you click Cancel after account type selection.
    Test case title22: check withdrawal operation when you click Cancel after withdrawal option selection.
    Test case title23: check withdrawal operation when you click Cancel after entry of the amount.

    NOTE: After completion of the required test case titles or scenarios, the test engineers prepare test case documents with all required details.

    Test case document format (IEEE 829):

    1) Test case id: unique number or name for future reference.

    2) Test case title or name: the previously selected test case title or scenario.
    3) Feature to be tested: the name of the module or function or service.
    4) Test suit id: the name of the test batch, which consists of a set of dependent test cases including the current case.
    5) Priority: the importance of the test case.

    *p0 for Functional test cases.
    *p1 for Non-Functional test cases.
    *p2 for Usability test cases.

    6) Test environment: the required h/w and s/w to run this test case on the build.
    7) Test effort: the expected time to run this test case on the build (person-hours, approximately 20 min).
    8) Test duration: the expected date and time schedule for this test case.
    9) Test setup or pre condition: the necessary tasks to do before starting this case execution on our application build (EX: first register to login).
    10) Test procedure or data matrix: the test procedure is a step-by-step table (Step no, Task/Event, Required input, Expected output).

    The data matrix format lists each input object with its valid and invalid values.

    11) Test case pass or fail criteria: the final result of the test case after execution on the build or AUT.

    NOTE: In the above test case format, the test engineers prepare a test procedure when that test case covers an operation, and they prepare a data matrix when that test case covers an object (taking inputs).

    NOTE: In general, the test engineers do not fill in the above lengthy test case format. To save their job time, test engineers fill in some of the fields and keep the remaining field values in mind.

    NOTE: In general, the test engineers prepare test case documents in MS Excel or an available test management tool (like TestDirector).


    Specification9: A login process authorizes users using userid and password. The userid object allows alphanumerics in lower case from 4 to 16 characters long. The password object allows alphabets in lower case from 4 to 8 characters long. Prepare test case documents.

    Document1:

    *test case id: TC_Login_Sri_14_11_06_1.
    *test case name: check userid.
    *test suit id: TS_Login.
    *priority: p0.
    *test setup: userid object is taking inputs.
    *data matrix:

    Document2:
    *test case id: TC_Login_Sri_14_11_06_2.
    *test case name: check password.
    *test suit id: TS_Login.
    *priority: p0.
    *test setup: password object is taking inputs.
    *data matrix:

    Document3:
    *test case id: TC_Login_Sri_15_11_06_3.
    *test case name: check login operation.
    *test suit id: TS_Login.
    *priority: p0.
    *test setup: valid and invalid userid and password object values given.
    *test procedure:
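    The data matrices for Documents 1 and 2 can be run as a table of (value, expected) rows. A sketch; the validator functions, regex patterns and sample values are my assumptions, while the size and character rules come from Specification9:

```python
import re

# Hypothetical checks for Specification9: userid allows lowercase
# alphanumerics, 4-16 chars; password allows lowercase alphabets, 4-8 chars.
def valid_userid(v):
    return re.fullmatch(r"[a-z0-9]{4,16}", v) is not None

def valid_password(v):
    return re.fullmatch(r"[a-z]{4,8}", v) is not None

# Data-matrix rows: (input value, expected verdict) per object.
userid_matrix = [("user1", True), ("abc", False), ("a" * 17, False), ("User1", False)]
password_matrix = [("pass", True), ("abc", False), ("password9", False), ("PASS", False)]

assert all(valid_userid(v) == e for v, e in userid_matrix)
assert all(valid_password(v) == e for v, e in password_matrix)
```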

    Specification10: In a bank application, the bank employees create fixed deposit forms with the help of customer-given data. This fixed deposit form takes the below values from bank employees.

    Depositor name: alphabets in lower case with the initial as a capital.
    Amount: 1500-100000.
    Tenure: upto 12 months.


    Interest: numeric with one decimal.

    In this fixed deposit operation, if the tenure > 10 months then the interest must also be greater than 10%. Prepare test case documents.

    Document1:
    *test case id: TC_FD_Sri_15_11_06_1.
    *test case name: check depositor name.
    *test suit id: TS_FD.
    *priority: p0.
    *test setup: depositor name is taking inputs.
    *data matrix:

    Document2:

    *test case id: TC_FD_Sri_15_11_06_2.
    *test case name: check amount.
    *test suit id: TS_FD.
    *priority: p0.
    *test setup: amount is taking inputs.
    *data matrix:

    Document3:
    *test case id: TC_FD_Sri_15_11_06_3.
    *test case name: check tenure.
    *test suit id: TS_FD.
    *priority: p0.
    *test setup: tenure is taking inputs.
    *data matrix:

    Document4:
    *test case id: TC_FD_Sri_15_11_06_4.
    *test case name: check interest.
    *test suit id: TS_FD.
    *priority: p0.
    *test setup: interest is taking inputs.
    *data matrix:


    Document5:
    *test case id: TC_FD_Sri_15_11_06_5.
    *test case name: check fixed deposit operation.
    *test suit id: TS_FD.
    *priority: p0.
    *test setup: valid and invalid values are available in hand.
    *test procedure:

    Document6:
    *test case id: TC_FD_Sri_15_11_06_6.
    *test case name: check tenure and interest rule.
    *test suit id: TS_FD.
    *priority: p0.
    *test setup: valid and invalid values are available.
    *test procedure:
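    Document6's tenure-and-interest rule is a small business rule that lends itself to an executable check. A sketch (function and parameter names are my assumptions; the rule is from Specification10):

```python
# Specification10 rule: if tenure > 10 months then interest must also be
# greater than 10%. Returns True when the rule is violated (a defect).
def violates_tenure_interest_rule(tenure_months, interest_pct):
    return tenure_months > 10 and interest_pct <= 10.0

assert not violates_tenure_interest_rule(9, 8.5)    # rule not triggered
assert not violates_tenure_interest_rule(11, 10.5)  # rule satisfied
assert violates_tenure_interest_rule(12, 9.0)       # rule broken -> defect
```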

    Specification11: Readers Paradise is a library management system. This s/w allows new users through registration. In the new registration, the s/w takes details from users and then returns (o/p) a person identity number like RP_Date_XXXX (EX: RP_15_11_06_1111). Fields in the registration form:

    User name: alphabets in capitals.
    Address: street name (alphabets), city name (alphabets), and pin code (numerics).
    DOB: day, month, year as valid (the / in the date is taken automatically in this project).
    e-mail id: valid ids and optional ([email protected]).
    userid: 1-256 characters and 0-9.
    sitename: 1-256 characters and numbers 0-9.
    sitetype: 1-3 characters.


    Prepare test case documents.

    Document1:

    *test case id: TC_RP_Sri_15_11_06_1.
    *test case name: check user name.
    *test suit id: TS_RP.
    *priority: p0.
    *test setup: user name object is taking inputs.
    *data matrix:

    Document2:
    *test case id: TC_RP_Sri_15_11_06_2.
    *test case name: check street name.
    *test suit id: TS_RP.
    *priority: p0.
    *test setup: street name object is taking inputs.
    *data matrix:

    Document3:
    *test case id: TC_RP_Sri_15_11_06_3.
    *test case name: check city name.
    *test suit id: TS_RP.
    *priority: p0.
    *test setup: city name object is taking inputs.
    *data matrix:

    Document4:
    *test case id: TC_RP_Sri_15_11_06_4.
    *test case name: check pincode.
    *test suit id: TS_RP.
    *priority: p0.
    *test setup: pincode object is taking inputs.
    *data matrix:


    Document5:
    *test case id: TC_RP_Sri_15_11_06_5.
    *test case name: check date.
    *test suit id: TS_RP.
    *priority: p0.
    *test setup: date object is taking inputs.
    *data matrix:

    Decision table:

    Day    | Month                | Year
    -------|----------------------|-----------------------
    01-31  | 01,03,05,07,08,10,12 | 00-99
    01-30  | 04,06,09,11          | 00-99
    01-28  | 02                   | 00-99
    01-29  | 02                   | Leap year in b/w 00-99
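    The decision table above can be expressed as an executable check. A sketch; two-digit years 00-99 are mapped to 2000-2099 here to decide leap years, which is my assumption, not stated in the specification:

```python
# Date validity per the decision table: day range depends on month,
# and 29 Feb is valid only in a leap year.
def is_valid_date(day, month, year2):
    if not (0 <= year2 <= 99 and 1 <= month <= 12):
        return False
    year = 2000 + year2  # assumed mapping of the 00-99 year field
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30,
                     31, 31, 30, 31, 30, 31][month - 1]
    return 1 <= day <= days_in_month

assert is_valid_date(31, 1, 6)       # 31 allowed for month 01
assert not is_valid_date(31, 4, 6)   # months 04,06,09,11 stop at 30
assert is_valid_date(29, 2, 4)       # 2004 is a leap year
assert not is_valid_date(29, 2, 6)   # 2006 is not
```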

    Document6:
    *test case id: TC_RP_Sri_16_11_06_6.
    *test case name: check e-mail id.
    *test suit id: TS_RP.
    *priority: p0.
    *test setup: e-mail object is taking inputs.
    *data matrix:

    Document7:
    *test case id: TC_RP_Sri_16_11_06_7.
    *test case name: check registration.
    *test suit id: TS_RP.
    *priority: p0.
    *test setup: all valid and invalid values available in hand.
    *test procedure:


    2. Usecases based test case design: Usecases are more elaborate than the functional and system specifications in the SRS. In this usecase oriented test case design, test engineers do not take their own assumptions.

    The usecase defines How to use a functionality. Every test case defines How to test a functionality. Every test case is derived from a usecase. Depending on the agreement, the responsible development team management people or the responsible testing team management people develop usecases based on the functional and system specifications in the SRS.

    Usecase format:

    1) usecase id: unique number or name.
    2) usecase description: the summary of the requirement.
    3) actors: the types of users which access this requirement in our application build.

    4) preconditions: necessary tasks to do before starting this requirement functionality.
    5) event list: a step by step procedure with required input and expected output.
    6) post conditions: necessary tasks to do after completion of this requirement functionality.
    7) flow diagram: pictorial presentation of the requirement functionality.
    8) prototype: a sample screen to indicate the requirement functionality.
    9) business rules: a list of rules and regulations, if applicable.
    10) alternative flows: a list of alternative events to do the same requirement functionality, if possible.
    11) dependent usecases: a list of usecases related to this usecase.

    Depending on the above formatted usecases, test engineers prepare test cases sans their own assumptions because the usecases provide all details about the corresponding requirement functionality.

    Usecase1:
    *usecase id: UC_Login.
    *usecase desc: a login process allows user id and password to authorize users.
    *actors: registered users (they have a valid id and password).
    *pre conditions: every user is registered before going to login.
    *event list:
        activate the login window.
        enter user id as alphanumerics in lower case from 4 to 16 characters long.
        enter password as alphabets in lower case from 4 to 8 characters long.
        click the SUBMIT button.

    *post conditions: mail box opened after successful login, error message for unsuccessful login.
    *flow diagram:


    *prototype:

    *business rules: none.
    *alternative flows: none.
    *dependent usecases: new user registration and mailbox open.

    Prepare test case documents.

    Document1:
    *test case id: TC_Login_Sri_16_11_06_1.
    *test case name: check userid.
    *test suit id: TS_Login.
    *priority: p0.
    *test setup: userid object is taking inputs.
    *data matrix:

    Document2:
    *test case id: TC_Login_Sri_16_11_06_2.
    *test case name: check password.
    *test suit id: TS_Login.
    *priority: p0.
    *test setup: password object is taking inputs.
    *data matrix:

    Document3:
    *test case id: TC_Login_Sri_16_11_06_3.
    *test case name: check login.
    *test suit id: TS_Login.
    *priority: p0.
    *test setup: login form is verified.


    *test procedure:

    Step no | Task/Event            | Required input  | Expected output
    --------|-----------------------|-----------------|-------------------------------------
    1       | Activate login window | None            | Userid and pwd are empty by default
    2       | Enter userid and pwd  | Userid and pwd  | SUBMIT button enabled
    3       | Click SUBMIT          | valid & valid   | Mail box opened
            |                       | valid & invalid | Error message
            |                       | invalid & valid | Error message
            |                       | valid & blank   | Error message
            |                       | blank & value   | Error message
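    The step-3 combinations above are a natural fit for table-driven execution. A sketch against a stub login function (the stub, credentials and message strings are hypothetical):

```python
# Table-driven check of the valid/invalid userid-password combinations.
VALID_ID, VALID_PWD = "user01", "secret"

def login(userid, pwd):
    # Stub standing in for the real build under test.
    if userid == VALID_ID and pwd == VALID_PWD:
        return "Mail box opened"
    return "Error message"

combos = [
    (VALID_ID, VALID_PWD, "Mail box opened"),  # valid & valid
    (VALID_ID, "wrongpw", "Error message"),    # valid & invalid
    ("nouser", VALID_PWD, "Error message"),    # invalid & valid
    (VALID_ID, "",        "Error message"),    # valid & blank
    ("",       VALID_PWD, "Error message"),    # blank & value
]

for userid, pwd, expected in combos:
    assert login(userid, pwd) == expected
```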

    Usecase2:
    *usecase id: UC_Book_Issue.
    *usecase desc: the administrator opens the book issue form and enters a book id to know the availability; if available, issues the book to a valid user.

    *actors: administrators and valid users.
    *pre conditions: the administrator and user should be registered.
    *event list:
        check for the book by entering the bookid in the bookid field and click GO (EX: RP_XXXX).
        check the availability from the message window that is displayed on GO click.
        for a given user id, verify whether the user is valid or not (EX: RP_Date_XXXX).
        if the user is valid, then the message window is displayed on GO click.
        issue that book through a click on the Issue button.
        if the book is not available or the user is not valid, then click Cancel.

    *post condition: issue the book.
    *flow diagram:

    *prototype:

    *business rules: none.
    *alternative flows: none.
    *dependent usecases: user registration.


    Prepare test case documents.

    Document1:
    *test case id: TC_BookIssue_Sri_16_11_06_1.
    *test case name: check book id format.
    *test suit id: TS_BookIssue.
    *priority: p0.
    *test setup: book id object is taking inputs.
    *data matrix:

    Document2:
    *test case id: TC_BookIssue_Sri_17_11_06_2.
    *test case name: check GO for availability verification.
    *test suit id: TS_BookIssue.
    *priority: p0.
    *test setup: valid and invalid book ids are available in hand.
    *test procedure:

    Step no | Task/Event                | Required input     | Expected output
    --------|---------------------------|--------------------|-----------------------
    1       | activate BookIssue window | none               | bookid object focused
    2       | enter bookid and click GO | available bookid   | message as Available
            |                           | unavailable bookid | message as Unavailable

    Document3:
    *test case id: TC_BookIssue_Sri_17_11_06_3.
    *test case name: check user id value.
    *test suit id: TS_BookIssue.
    *priority: p0.
    *test setup: user id object takes some value.
    *data matrix:

    Document4:
    *test case id: TC_BookIssue_Sri_17_11_06_4.
    *test case name: check user id validation by GO.
    *test suit id: TS_BookIssue.
    *priority: p0.
    *test setup: valid and invalid user ids are available in hand.
    *test procedure:

    Step no | Task/Event                 | Required input   | Expected output
    --------|----------------------------|------------------|----------------------------------------------
    1       | activate BookIssue window  | none             | bookid object focused
    2       | enter bookid and click GO  | available bookid | message as Available and focus to user id
    3       | enter user id and click GO | valid id         | message as Issue book permitted
            |                            | invalid id       | message as Issue not permitted; Cancel message


    Document5:
    *test case id: TC_BookIssue_Sri_17_11_06_5.
    *test case name: check BookIssue operation.
    *test suit id: TS_BookIssue.
    *priority: p0.
    *test setup: valid bookid and valid user id available in hand.
    *test procedure:

    Step no | Task/Event                 | Required input   | Expected output
    --------|----------------------------|------------------|------------------------------------------------
    1       | activate BookIssue window  | none             | bookid object focused
    2       | enter bookid and click GO  | available bookid | message as Available and focus to user id
    3       | enter user id and click GO | valid id         | message as Valid user and Issue button enabled
    4       | click Issue                | none             | Acknowledgement

    Document6:
    *test case id: TC_BookIssue_Sri_17_11_06_6.
    *test case name: check Cancel operation.
    *test suit id: TS_BookIssue.
    *priority: p0.
    *test setup: invalid bookid and invalid user id available in hand.
    *test procedure:

    Step no | Task/Event                 | Required input     | Expected output
    --------|----------------------------|--------------------|------------------------
    1       | activate BookIssue window  | none               | bookid object focused
    2       | enter bookid and click GO  | unavailable bookid | focus to Cancel button
    3       | enter user id and click GO | invalid id         | focus to Cancel button

    NOTE: In general, most testing teams follow functional and system specification based test case design depending on the SRS. In this method, the test engineers explore their knowledge depending on the SRS, previous experience, discussions with others, similar s/w browsing, internet surfing, etc.

    3. User Interface test case design: In general, the test engineers prepare test cases for functional and non functional tests depending on any one of the previous 2 methods. To prepare test cases for Usability testing, test engineers depend on user interface based test case design.

    In this method, test engineers identify the interest of customer site people and the user interface rules in the market. Example test cases:

    Test case title1: check spelling.
    Test case title2: check font uniqueness in every screen.
    Test case title3: check style uniqueness in every screen.
    Test case title4: check labels' initial letters as capitals.
    Test case title5: check alignment of objects in every screen.

    Test case title6: check color contrast in every screen.
    Test case title7: check name spacing uniqueness in every screen.
    Test case title8: check spacing uniqueness in b/w label and object.
    Test case title9: check spacing in b/w objects.
    Test case title10: check dependent objects grouping.
    Test case title11: check borders of object groups.
    Test case title12: check tool tips of icons in all screens.
    Test case title13: check abbreviations or full forms.
    Test case title14: check multiple data objects' positions in every screen (EX: dropdown list boxes, menus (always at top), tables and data windows).
    Test case title15: check scroll bars in every screen.

    Test case title16: check short cut keys on the keyboard to operate our build.


    Test case title17: check visibility of all icons in every screen.
    Test case title18: check help documents (Manual support testing).
    Test case title19: check identity controls (EX: title of s/w, version of s/w, logo of company, copyright window).

    NOTE: The above usability test cases are applicable on any GUI application for Usability testing. For these test cases, p2 is given as the priority by testers.

    NOTE: Most of the above usability test cases are STATIC because the test cases are applicable on the build sans operating it.

    Review Test cases

    After completion of all reasonable test cases, the testing team conducts a review meeting for completeness and correctness. In this review meeting the test lead depends on the below factors.

    *requirements oriented coverage.
    *testing techniques oriented coverage.

    Case Study

    Project: Flight Reservation.
    Feature to be tested: login.
    Tests to be conducted: usability, functional and non functional (compatibility and performance) testing.

    Test case titles:
    1. Functional testing:
    *check agent name.
    *check password.
    *check login operation.
    *check Cancel operation.
    *check Help button.

    2. Non functional testing:
    *compatibility testing: check login in Windows 2000, XP and NT (these are the customer expected platforms).
    *load testing: check login performance under the customer expected load.
    *stress testing: check login reliability under various load levels.

    3. Usability testing:
    *refer to the user interface test case examples given earlier.

    Test Execution

    After completion of test case design and review, the testing people concentrate on test execution. In this stage the testing people communicate with the development team for feature negotiations.

    a) Formal Meeting: The test execution process starts with a small formal meeting. In this meeting, the PM, project leads, developers, test leads and test engineers are involved. In this meeting the members confirm the architecture of the required environment.


    SCM: Software Configuration or Change Management.
    TCDB: Test Cases Database.
    DR: Defect Repository.

    SCM repository consists of:
    *Development documents (project plan, BRS, SRS, HLD and LLD).
    *Environment files (required s/w used in this project).
    *Unit and integration test cases.
    *S/w coding (build or AUT).

    TCDB consists of:
    *Test case titles or scenarios.
    *Test case and defect references.
    *Test log (test results).
    *Test case documents with reference.

    DR consists of:
    *Defect details (reported to developers).
    *Defect fix details (whether they are accepted or rejected by developers).
    *Defect communication details.
    *Defect retest details (when accepted, some side effects may come).

    b) Build version control: After confirming the required environment, the formal review meeting members concentrate on build version control. Under this concept the development people assign a unique version number to every modified build after solving defects. This version numbering system is understandable to the testing team.

    c) Levels of test execution: After completion of the formal review meeting, the testing people concentrate on the finalization of test execution levels.

    *Level-0 testing on the Initial build.
    *Level-1 testing on the Stable build or Working build.
    *Level-2 testing on the Modified build.
    *Level-3 testing on the Master build.
    *UAT on the Release build.


    *Finally the Golden build is released to the customer site.

    d) Levels of test execution VS test cases:
    Level-0: Selected P0 test cases (mainly functionality areas).
    Level-1: All P0, P1, P2 test cases (entire build).
    Level-2: Selected P0, P1 and P2 test cases (modified functionalities in build).
    Level-3: Selected P0, P1 and P2 test cases (high defect density areas in build).

    e) Level-0 (Sanity testing): Practically the test execution process starts with Sanity test execution to estimate the stability of the build. In this Sanity testing, the test engineers concentrate on the below factors through the coverage of basic functionality in that build.

    *Understandable (on seeing the project).
    *Operatable (no hanging during operation).
    *Observable (know its flow).
    *Controllable (do the operation and undo the operation).
    *Consistency (in functionality).
    *Simplicity (means less navigation required).
    *Maintainable (in the tester's system).
    *Automatable (whether some tools are applicable or not).

    The above Level-0 Sanity testing estimates the testability of the build. This testing is also known as Testability testing, Build Acceptance testing, Build Verification testing, Tester Acceptance testing or Octangle testing (after the above 8 testing factors).

    f) Level-1 (Comprehensive testing): After completion of Sanity testing, test engineers conduct Level-1 real testing to detect defects in the build. In this level, the test engineers execute all test cases, either manually or in automation, as test batches. Every test batch consists of a set of defined test cases. A test batch is also known as a Test Suit, Test Set, Test Chain or Test Build.

    During this execution of test cases as batches on the build, the test engineers prepare test log documents with 3 types of entries.

    *Passed: all of our test case expected values are equal to the build actual values.
    *Failed: any one expected value is not equal to the build actual value.
    *Blocked: our test case execution is postponed due to incorrect parent functionality.

    In this Level-1 test execution as test batches, the test engineers follow the below approach.
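    The three test-log verdicts can be sketched as a small classification helper (names and the parent-functionality flag are my assumptions):

```python
# Minimal test-log sketch for Level-1 execution: each executed test case
# is logged as Passed, Failed, or Blocked.
def log_result(expected, actual, parent_ok=True):
    if not parent_ok:
        return "Blocked"  # postponed: parent functionality is incorrect
    return "Passed" if expected == actual else "Failed"

assert log_result("Mail box opened", "Mail box opened") == "Passed"
assert log_result("Mail box opened", "Error message") == "Failed"
assert log_result("Mail box opened", None, parent_ok=False) == "Blocked"
```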

  • 8/14/2019 16103293 Software Testing Practial

    36/118

    From the above approach, the test engineers skip some test cases due to lack of time for test execution. The final status of every test case is CLOSED or SKIP.

    g) Level-2 (Regression testing): During Level-1 Comprehensive testing, the test engineers report mismatches in b/w our test case expected values and the build actual values as defect reports. After receiving defect reports from testers, the developers conduct a review meeting to fix defects. If a defect is accepted by the developers, then they perform changes in coding and release a Modified Build with a Release note. The release note of a modified build describes the changes made in that modified build to resolve the reported defects.

    Test engineers plan Regression testing to conduct on that modified build w.r.t the release note.

    Approach to Regression testing:
    *receive the modified build along with the release note from developers.
    *apply a Sanity or Smoke test on that modified build.
    *select the test cases to be executed on that modified build w.r.t the modifications specified in the release note.
    *run those selected test cases on that modified build to ensure the correctness of modifications sans having side effects in that build.

    In the above regular Regression testing approach, the selection of test cases w.r.t modifications is a critical task. Due to this reason, the test engineers follow some standardized process models for regression testing.

    Case1: If the development team resolved defect severity is high, then the test engineers re-execute all functional, all non-functional and maximum usability test cases on that modified build to ensure the correctness of modifications sans side effects.

    Case2: If the development team resolved defect severity is medium, then the test engineers re-execute all functional, maximum non-functional and some usability test cases on that modified build to ensure the correctness of modifications sans side effects.

    Case3: If the development team resolved defect severity is low, then the test engineers re-execute some functional, some non-functional and some usability test cases on that modified build to ensure the correctness of modifications sans side effects.

    Case4: If the development team released the modified build due to sudden changes in customer requirements, then the test engineers perform changes in the corresponding test cases and then re-execute those test cases on that modified build to ensure the correctness of modifications w.r.t the changes in requirements.
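    The severity-based selection in Cases 1-3 can be summarized as a lookup table. A sketch; the "all"/"maximum"/"some" labels mirror the document's qualitative wording and are not fixed percentages:

```python
# Regression scope per resolved-defect severity (Cases 1-3 above).
SELECTION = {
    "high":   {"functional": "all",  "non_functional": "all",     "usability": "maximum"},
    "medium": {"functional": "all",  "non_functional": "maximum", "usability": "some"},
    "low":    {"functional": "some", "non_functional": "some",    "usability": "some"},
}

def regression_scope(resolved_defect_severity):
    return SELECTION[resolved_defect_severity]

assert regression_scope("high")["functional"] == "all"
assert regression_scope("medium")["usability"] == "some"
assert regression_scope("low")["non_functional"] == "some"
```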

    After completion of the required level of regression testing, the test engineers continue the remaining Level-1 test execution.


    Test Reporting

    During Level-1 and Level-2 test execution, test engineers report mismatches to the development team as defects. A defect is also known as an Error, Issue or Bug.

    A problem detected by a programmer in a program is called an ERROR.
    A problem detected by a tester in a build is called a DEFECT or ISSUE.
    A reported defect or issue accepted to be resolved is called a BUG.

    In this defect reporting to developers, the test engineers follow a standard defect report format (IEEE 829).

    a) Defect Report:
    *defect id: the unique name or number.
    *description: the summary of the defect.
    *build version id: the version number of the build in which the test engineer detected the defect.
    *feature: the name of the module or function in which the test engineer found this defect.
    *test case title: the title of the failed test case.
    *detected by: the name of the test engineer.
    *detected on: the date of defect detection and submission.
    *status: New (reported for the first time) or Re-Open (re-reported).

    *severity: the seriousness of the defect in terms of functionality. If it is High or Show stopper, then testing cannot continue sans resolving that defect. If it is Medium or Major, then testing can continue but it is mandatory to resolve. If it is Low or Minor, then testing can continue and it may or may not be resolved.

    *priority: the importance of the defect to resolve in terms of the customer (high, medium, low).
    *reproduceable: Yes or No. Yes means the defect appears every time in test execution (then attach the test procedure). No means the defect rarely appears in test execution (then attach a snapshot and the test procedure; the snapshot is taken with the Print Screen button when the defect occurs).

    *assigned to: the name of the responsible person to receive this defect at the development site.
    *suggested fix: the suggestion to accept or reject the defect. It is optional.
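    The field list above can be captured as a record type, which is how a test management tool might store it. A sketch; the field names paraphrase the document's labels and the sample values are invented:

```python
from dataclasses import dataclass
from typing import Optional

# IEEE 829-style defect report fields, as a simple record type.
@dataclass
class DefectReport:
    defect_id: str
    description: str
    build_version_id: str
    feature: str
    test_case_title: str
    detected_by: str
    detected_on: str
    status: str = "New"        # New or Re-Open
    severity: str = "Medium"   # High, Medium, Low
    priority: str = "Medium"   # high, medium, low
    reproduceable: bool = True
    assigned_to: str = ""
    suggested_fix: Optional[str] = None  # optional field

d = DefectReport("DR_Login_1", "Error message missing", "B1.2",
                 "Login", "check login operation", "tester", "2006-11-16")
assert d.status == "New" and d.reproduceable
```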

    NOTE: The defect priority is modifiable by the PM and the project lead also.

    NOTE: In general, the test engineers report a defect to the development team after getting permission from the test lead.

    NOTE: In application oriented s/w development, test engineers report defects to the customer site also.

    b) Defect submission process:


    Invalid label of an object w.r.t functionality (medium priority)
    Improper right alignment (low priority)

    *error handling defects (medium severity)
    Error message not coming for a wrong operation (high priority)
    Wrong error message coming for a wrong operation (medium)
    Correct error message but incomplete (low)

    *input domain defects (medium severity)
    Does not take valid input (high)
    Takes valid and invalid input also (medium)
    Takes valid type and valid size values but the range is exceeded (low)

    *manipulation defects (high severity)
    Wrong output (high)
    Valid output without decimal points (medium)
    Valid output with rounded decimal points (low)
    EX: the actual answer is 10.96; high (13), medium (10) and low (10.9)

    *race condition defects (high)
    Hang or deadlock (show stopper severity and high priority)
    Invalid order of functionalities (medium)
    Application build runs on only some of the platforms (low)

    *h/w related defects (high)
    Device is not connecting (high)
    Device is connecting but returning wrong output (medium)
    Device is connecting and returning correct output but incomplete (low)

    *load condition defects (high)
    Does not allow the customer expected load (high)
    Allows the customer expected load on some of the functionalities (medium)
    Allows the customer expected load on all functionalities w.r.t benchmarks (low)

    *source defects (medium)
    Wrong help document (high)
    Incomplete help document (medium)
    Correct and complete help but complex to understand (low)

    *version control defects (medium)
    Unwanted differences in b/w the old build and the modified build

    *id control defects (medium)
    Logo missing, wrong logo, version number missing, copyright window missing, team members' names missing

    Test Closure

    After completion of all reasonable test cycles, the test lead conducts a review meeting to estimate the completeness and correctness of the test execution. If the test execution status is equal to the EXIT CRITERIA then the testing team stops testing. Otherwise the team continues the remaining test execution w.r.t the available time. In this test closure review meeting the test lead depends on the below factors.

    a)Coverage analysis:

    *requirements oriented coverages*testing techniques oriented coverages

    b)Defect density:

*modules or functionalities


*need for regression testing

c) Analysis of deferred defects:

*whether all the deferred defects are postponable or not

NOTE: In general, the project management defers only low-severity and low-priority defects.

After completion of the above test closure review meeting, the testing team concentrates on level-3 test execution. This level of testing is also known as post-mortem testing, final regression testing or pre-acceptance testing.

In final regression testing, the test engineers concentrate on high defect density modules or functionalities only. If they find any defect at this level, it is called a golden defect or lucky defect. After resolving all the golden defects, the testing team concentrates on UAT along with the developers.

    UAT(User Acceptance Testing)

In this level, the project manager concentrates on the feedback of real customer-site people or model-customer people. There are two ways of conducting UAT:

*Alpha testing.
*Beta testing.

Sign Off

After completion of UAT and its modifications, the test lead conducts a sign-off review. In this review the test lead gathers all testing documents from the test engineers: Test strategy, System test plan and detailed test plans, Test scenarios or titles, Test case documents, Test log, Defect reports,
Final defects summary report (defect id, description, severity, detected by and status (closed or deferred)), and
Requirements Traceability Matrix (RTM) (req id, test case, defect id, status).

*The RTM maps requirements to defects via test cases.
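To make the mapping concrete, an RTM can be modeled as a small lookup structure; this is only an illustrative sketch, and every requirement, test-case and defect id below is hypothetical:

```python
# Minimal sketch of a Requirements Traceability Matrix (RTM).
# Each requirement maps to its test cases, and each test case to the
# defects it detected; all ids are invented for illustration.
rtm = {
    "REQ_01": {"TC_01": ["DEF_07"], "TC_02": []},
    "REQ_02": {"TC_03": ["DEF_09", "DEF_12"]},
}

def defects_for(req_id):
    """Trace a requirement to all defects found via its test cases."""
    return sorted(d for defects in rtm.get(req_id, {}).values() for d in defects)

print(defects_for("REQ_02"))  # ['DEF_09', 'DEF_12']
```

A sign-off review would then ask, for each requirement, whether every traced defect is closed or deferred.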

Case Study

1. Test Initiation
Done by the project manager. Delivers the test strategy document. The Test Responsibility Matrix (TRM) defines the reasonable tests to be applied (part of the test strategy).

2. Test Planning
Done by the test lead or a senior test engineer. Delivers the system test plan and detailed test plans. Follows the IEEE 829 document standards.


3. Test Design
Done by the test engineer. Delivers test scenarios or titles and test case documents.

4. Test Execution
Done by the test engineer. Prepares automation programs (if possible). Delivers test logs or test results.

5. Test Reporting
Done by the test engineer and test lead. Sends defect reports. Receives the modified build with a release note.

6. Test Closure
Done by the test lead and test engineer. Plans post-mortem testing. Initiates UAT (User Acceptance Testing).

7. Sign Off
Done by the test lead. Gathers all test documents. Finalizes the RTM (Requirements Traceability Matrix).

Case Study (3 to 5 months test plan)

Test Deliverable | Responsibility | Completion Time
Test strategy | Project or test manager | 5-10 days
Test plan | Test lead | 5-10 days
Training sessions to test engineers | Business analyst or subject/domain experts | 10-15 days
Test scenarios or title selection | Test engineer | 5-10 days
Test case documents | Test engineer | 5-10 days
Review test cases | Test lead and test engineer | 1-2 days
Receive initial build and sanity testing | Test engineer | 1-2 days
Test automation (if possible) | Test engineer | 10-15 days
Test execution (level 1 and level 2) | Test engineer | 30-40 days
Test reporting | Test engineer and test lead | Ongoing
Status reporting | Test lead | Twice weekly
Test closure and post-mortem | Test lead and test engineer | 5-7 days
UAT (User Acceptance Testing) | Customer / model customers with involvement of developers & testers | 5-7 days
Sign off | Test lead | 1-2 days


Manual Testing VS Automation Testing

In general, test engineers execute test cases manually. To save test execution time and to decrease the complexity of manual testing, the engineers use test automation. Test automation is possible for two types of manual tests:

* Functional testing.
* Performance testing (part of non-functional testing).

WinRunner, QTP (QuickTest Professional), Rational Robot and SilkTest are functional testing tools. LoadRunner, Rational Load Test, Silk Performer and JMeter are performance testing tools used to automate load and stress testing. Organizations use tools for test management also.
Ex: TestDirector, Quality Center and Rational Test Manager.


WinRunner 8.0

* Released in January 2005.
* Developed by Mercury Interactive (later taken over by HP).
* A functional testing tool.
* Supports VB, Java, VB.NET, HTML, PowerBuilder (PB), Delphi, D2K, VC++ and Siebel technology builds for functional testing.
* To support the above technologies plus XML, SAP, PeopleSoft, Oracle Apps and multimedia, we can use QTP.
* WinRunner runs on the Windows platform; XRunner runs on Linux and UNIX.
* WinRunner converts our manual test cases into TSL (Test Script Language) programs; TSL is a C-like scripting language.

    Objective

* Study the functional and system specifications or use cases.
* Prepare functional test cases in English.
* Convert them to TSL programs (automation).

    WinRunner Test Approach

* Select the manual functional test cases to be automated.
* Receive a stable build from the developers (after sanity testing).
* Create TSL programs for the selected test cases on that stable build.
* Group those programs into test batches.
* Run the test batches on that build to detect mismatches.
* Analyze the results and report mismatches (if required).

Add-In Manager

This window lists all WinRunner-supported technologies w.r.t. the license. The test engineers select the current application build's technology.

    Icons in WinRunner Screen


Automation Test creation in WinRunner

WinRunner is a functional testing tool. Test engineers automate the corresponding manual functional test cases into automation programs in two steps:
1) Recording the build actions (i.e., operating the build)
2) Inserting check points

1) Recording Modes: To generate an automation program, test engineers record build actions or operations in 2 types of modes:

* Context Sensitive mode
* Analog mode

Context Sensitive mode: In this mode, WinRunner records all mouse and keyboard operations w.r.t. the objects and windows in our application build. To select this mode we can use the options below:

* Click the Start Record icon
* Test menu -> Context Sensitive option

Analog mode: In this mode, WinRunner records all mouse pointer movements w.r.t. desktop coordinates. To select analog mode we can use the options below:

* Click the Start Record icon twice
* Test menu -> Analog option (Examples: recording digital signatures, graph drawing and image movements)

NOTE: To change from one mode to another, the test engineers use F2 as a shortcut key.

NOTE: In Analog mode, WinRunner records mouse pointer movements w.r.t. desktop coordinates instead of windows and objects. For this reason, the test engineers keep the corresponding window position on the desktop and the monitor resolution constant.

2) Check Points: After recording the required actions or operations, the test engineers insert the required check points into that recorded script. WinRunner 8.0 supports 4 types of check points:

*GUI check point.
*Bitmap check point.
*Database check point.
*Text check point.

Every check point compares the test engineer's expected value with the build's actual value. The above 4 check points automate all functional test coverages on our application build:

*GUI or behavioral coverage.
*Error handling coverage.
*Input domain coverage.
*Manipulation coverage.
*Backend coverage.
*Order-of-functionality coverage.

GUI check point: To check the behavior of objects in our application windows, we can use this check point. This check point consists of three sub-options:

*For single property.
*For object or window.
*For multiple objects.


a) For single property: To verify one property of one object, we can use this option (like starting a mile with one step).

EX1: Manual test case

Test case id: TC_EX_SRI_24NOV_1
Test case name: check Delete Order button
Test suit id: TS_EX
Priority: P0
Test set up: already one record is inserted to delete
Test procedure:

    Step no Event Input required Expected output

    1 Focus to Flight Reservation window None Delete Order button disabled

    2 Open an order Valid order number Delete order button enabled

Build -> Flight Reservation window

Automation program:

set_window ("Flight Reservation", 1);
button_check_info ("Delete Order", "enabled", 0);   # check point on Delete Order button
menu_select_item ("File;Open Order...");
set_window ("Open Order", 1);
button_set ("Order no", ON);
edit_set ("Edit", "1");
button_press ("OK");
set_window ("Flight Reservation", 1);
button_check_info ("Delete Order", "enabled", 1);   # check point on Delete Order button

EX2: Manual test case
Test case id: TC_EX_SRI_24NOV_2
Test case name: check Update Order button
Test suit id: TS_EX
Priority: P0
Test set up: already one valid record is inserted to update
Test procedure:

    Step no Event Input required Expected output

    1 Focus to Flight Reservation window None Update Order button disabled

    2 Open an order Valid order number Update Order button disabled

    3 Perform a change Valid change is required Update Order enabled

    Build -> Flight Reservation window


Automation program:

set_window ("Flight Reservation", 2);
button_check_info ("Update Order", "enabled", 0);   # check point on Update Order button
menu_select_item ("File;Open Order...");
set_window ("Open Order", 4);
button_set ("Order no", ON);
edit_set ("Edit", "1");
button_press ("OK");
set_window ("Flight Reservation", 1);
button_check_info ("Update Order", "enabled", 0);   # check point on Update Order button
set_window ("Flight Reservation", 2);
button_set ("First", ON);
button_check_info ("Update Order", "enabled", 1);   # check point on Update Order button

Context Sensitive statements in TSL

*Focus to a window: set_window ("Window name", time_to_focus);
*Click a button: button_press ("Button name");
*Select a radio button: button_set ("Radio button name", ON);
*Select a check box: button_set ("Check box name", ON);
*Fill a text box: edit_set ("Text box name", "input");
*Fill a password text box: edit_set ("Password text box name", "encrypted value");
*Select an item in a list box: list_select_item ("List box name", "item name");
*Select a menu option: menu_select_item ("Menu name;Option name");
*texit(): We can use this statement to quit or terminate test execution, like pause.

    EX3: Manual test case

Test case id: TC_EX_SRI_28NOV_3
Test case name: check OK button
Test suit id: TS_EX
Priority: P0
Test set up: all input objects are taking values
Test procedure:

    Step no Event Input required Expected output

    1 Focus to Sample window None OK button disabled

    2 Enter Name Valid OK button enabled

    Build -> Sample

Automation program:

set_window ("Sample", 5);
button_check_info ("OK", "enabled", 0);
edit_set ("Name", "Sri");
button_check_info ("OK", "enabled", 1);

    EX4: Manual test case

    Test case id: TC_EX_SRI_28NOV_4


Test case name: check SUBMIT button
Test suit id: TS_EX
Priority: P0
Test set up: all input objects are taking values
Test procedure:

Step no Event Input required Expected output
1 Focus to Registration window None SUBMIT button disabled
2 Enter Name Valid SUBMIT button disabled
3 Select Gender as M or F None SUBMIT button disabled
4 Say Y/N for Passport availability None SUBMIT button disabled
5 Select Country None SUBMIT button enabled

Build -> Registration form

Automation program:

set_window ("Registration", 5);
button_check_info ("SUBMIT", "enabled", 0);
edit_set ("Name", "Sri");
button_check_info ("SUBMIT", "enabled", 0);
button_set ("Male", ON);
button_check_info ("SUBMIT", "enabled", 0);
button_set ("YES", ON);
button_check_info ("SUBMIT", "enabled", 0);
list_select_item ("COUNTRY", "INDIA");
button_check_info ("SUBMIT", "enabled", 1);

Case Study

Object Type | Testable Properties
Push button | Enabled (0 or 1), Focused
Radio button | Enabled (0 or 1), Status (ON or OFF)
Check box | Enabled (0 or 1), Status (ON or OFF)
List or Combo box | Enabled (0 or 1), Count, Value (of selected item)
Menu | Enabled (0 or 1), Count
Edit or Text box | Enabled (0 or 1), Focused, Value, Range, Regular expression (text or pwd), Date format, Time format, ...
Table grid | Rows count, Columns count, Cells count

EX5: Manual test case
Test case id: TC_EX_SRI_28NOV_5
Test case name: check Fly To count
Test suit id: TS_EX
Priority: P0
Test set up: Fly From and Fly To consist of valid city names
Test procedure:


    Step no Event Input required Expected output

1 Focus to Journey window and select one city name in Fly From None Fly To count decreased by one

    Build -> Journey

Automation program:

set_window ("Journey", 5);
list_get_info ("Fly To", "count", x);
list_select_item ("Fly From", "VIZ");
list_check_info ("Fly To", "count", x-1);

    EX6: Manual test caseTest case id: TC_EX_SRI_28NOV_6Test case name: check Message value

Test suit id: TS_EX
Priority: P0
Test set up: all valid names are available for Messages
Test procedure:

    Step no Event Input required Expected output

    1 Focus to Display window None OK button disabled

    2 Select a Name None OK button enabled

    3 Click OK None Coming message is equal to selected message

    Build->Display form

Automation program:

set_window ("Display", 5);
button_check_info ("OK", "enabled", 0);
list_select_item ("Name", "Sri");
list_get_info ("Name", "value", x);
button_check_info ("OK", "enabled", 1);
button_press ("OK");
edit_check_info ("Message", "value", x);

EX7: Manual test case
Test case id: TC_EX_SRI_28NOV_7
Test case name: check SUM button
Test suit id: TS_EX
Priority: P0
Test set up: input objects consist of numeric values
Test procedure:

    Step no Event Input required Expected output

    1 Focus to Addition window None OK button disabled


    2 Select input one None OK button disabled

    3 Select input two None OK button enabled

    4 OK click none Coming output is equal to addition of 2 inputs

    Build->Addition form

Automation program:

set_window ("Addition", 5);
button_check_info ("OK", "enabled", 0);
list_select_item ("INPUT1", "20");
list_get_info ("INPUT1", "value", x);
button_check_info ("OK", "enabled", 0);
list_select_item ("INPUT2", "4");
list_get_info ("INPUT2", "value", y);
button_check_info ("OK", "enabled", 1);
button_press ("OK");
edit_check_info ("SUM", "value", x+y);

EX8: Manual test case
Test case id: TC_EX_SRI_29NOV_8
Test case name: check Age, Gender and Qualification objects
Test suit id: TS_EX
Priority: P0
Test set up: all insurance policy types are available
Test procedure:

    Step no Event Input required Expected output

1 Focus to Insurance window and select type of insurance policy None If type is A then Age is focused; if type is B then Gender is focused; otherwise Qualification is focused

    Build->Insurance form

Automation program:

set_window ("Insurance", 5);
list_select_item ("Type", "xx");   # "xx" stands for the chosen policy type
list_get_info ("Type", "value", x);


if (x == "A")
    edit_check_info ("Age", "focused", 1);
else if (x == "B")
    list_check_info ("Gender", "focused", 1);
else
    list_check_info ("Qualification", "focused", 1);

    EX9: Manual test case

Test case id: TC_EX_SRI_29NOV_9
Test case name: check Student grade
Test suit id: TS_EX
Priority: P0
Test set up: all valid students' marks are already fed
Test procedure:

    Step no Event Input required Expected output

    1 Focus to Student window None OK button disabled

    2 Select a Student roll number None OK button enabled

3 OK click None Returns total marks and grade: if total >= 800 then grade is A; if total >= 700 and < 800 then B; if total >= 600 and < 700 then C; otherwise D

Build -> Student form

Automation program (the window and object names below are reconstructed from the test procedure):

set_window ("Student", 5);
button_check_info ("OK", "enabled", 0);
list_select_item ("Rollno", "xxx");   # "xxx" stands for the selected roll number
button_check_info ("OK", "enabled", 1);
button_press ("OK");
edit_get_info ("Total", "value", x);
if (x >= 800)
    edit_check_info ("Grade", "value", "A");
else if (x >= 700)
    edit_check_info ("Grade", "value", "B");
else if (x >= 600)
    edit_check_info ("Grade", "value", "C");
else
    edit_check_info ("Grade", "value", "D");
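The grade boundaries in EX9 can be sanity-checked outside WinRunner; a minimal Python sketch of the same expected behaviour:

```python
def grade(total):
    """Grade boundaries from the Student-form test case (EX9)."""
    if total >= 800:
        return "A"
    elif total >= 700:
        return "B"
    elif total >= 600:
        return "C"
    return "D"

print(grade(820), grade(700), grade(650), grade(599))  # A B C D
```

Having the oracle as a plain function makes it easy to cross-check the expected values fed into the TSL check points.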

EX10: Manual test case
Test case id: TC_EX_SRI_29NOV_10
Test case name: check Gross salary of an Employee
Test suit id: TS_EX
Priority: P0
Test set up: all valid employees' basic salaries are fed.

    Test procedure:


    Step no Event Input required Expected output

    1 Focus to Employee window None OK button disabled

2 Select an Employee number None OK button enabled

3 OK click None Returns Basic and Gross salaries: if Basic >= 15000 then Gross = Basic + 10% of Basic; if Basic >= 8000 and < 15000 then Gross = Basic + 5% of Basic; if Basic < 8000 then Gross = Basic + 200

Build -> Employee form

Automation program:

set_window ("Employee", 5);
button_check_info ("OK", "enabled", 0);
list_select_item ("Empno", "xxx");   # "xxx" stands for the selected employee number
button_check_info ("OK", "enabled", 1);
button_press ("OK");
edit_get_info ("Basic", "value", x);
if (x >= 15000)
    edit_check_info ("Gross", "value", x + (10/100)*x);
else if (x >= 8000)
    edit_check_info ("Gross", "value", x + (5/100)*x);
else
    edit_check_info ("Gross", "value", x + 200);
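The gross-salary rules of EX10 can likewise be expressed as a small oracle function for cross-checking expected values; a Python sketch:

```python
def gross(basic):
    """Gross-salary rules from the Employee-form test case (EX10)."""
    if basic >= 15000:
        return basic + basic * 10 / 100   # +10% of Basic
    elif basic >= 8000:
        return basic + basic * 5 / 100    # +5% of Basic
    return basic + 200                    # flat 200 below 8000

print(gross(20000), gross(10000), gross(5000))  # 22000.0 10500.0 5200
```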

b) For Object or Window: To verify more than one property of one object, we can use this option.

EXAMPLE:
*Update Order button is disabled after focus to the window
*Update Order button is disabled after opening a record
*Update Order button is enabled and focused after performing a change (here, one object with TWO properties)

Build -> Flight Reservation window

Automation program:

set_window ("Flight Reservation", 5);
button_check_info ("Update Order", "enabled", 0);   # check point on Update Order button
menu_select_item ("File;Open Order...");
set_window ("Open Order", 4);
button_set ("Order no", ON);
edit_set ("Edit", "1");
button_press ("OK");
set_window ("Flight Reservation", 1);
button_check_info ("Update Order", "enabled", 0);   # check point on Update Order button
set_window ("Flight Reservation", 2);
button_set ("First", ON);
obj_check_gui ("Update Order", "list1.ckl", "gui1", 1);   # check point for MULTIPLE properties

    SYNTAX for Multiple properties:


obj_check_gui ("Object name", "CheckListFile.ckl", "Expected values file (GUI)", time);

In the above syntax:
CHECKLIST FILE specifies the selected list of properties.
EXPECTED VALUES FILE specifies the selected expected values for those properties.

c) For Multiple Objects: To check more than one property of more than one object, we can use this option (the objects must be in the same WINDOW).

EXAMPLE:
*Insert, Delete and Update Order buttons are disabled after focus to the window.
*Insert and Update Order buttons are disabled and the Delete Order button is enabled after opening a record.
*Insert Order button is disabled, Update Order button is enabled and focused, and Delete Order button is enabled after performing a change.

Automation program:

set_window ("Flight Reservation", 5);
win_check_gui ("Flight Reservation", "list1.ckl", "gui1", 1);   # check point
menu_select_item ("File;Open Order...");
set_window ("Open Order", 4);
button_set ("Order no", ON);
edit_set ("Edit", "1");
button_press ("OK");
win_check_gui ("Flight Reservation", "list2.ckl", "gui2", 1);   # check point
set_window ("Flight Reservation", 1);
button_set ("First", ON);
win_check_gui ("Flight Reservation", "list3.ckl", "gui3", 1);   # check point

Syntax for Multiple Objects:

win_check_gui ("Window name", "CheckListFile.ckl", "Expected values file (GUI)", time);

NOTE: This check point is applicable to more than one object in the same window.

Navigation to insert the check point:
*Select a position in the script
*Choose the Insert menu
*In it, choose GUI Check Point
*Then select the sub-option For Multiple Objects
*Click the Add button and select the testable objects
*Now right-click to finish the selection
*Select the required properties with expected values
*Click OK

EX11: Manual test case
Test case id: TC_EX_SRI_30NOV_11
Test case name: check value of tickets
Test suit id: TS_EX
Priority: P0
Test set up: all valid records are fed.
Test procedure:

    Step no Event Input required Expected output

1 Focus to Flight Reservation window and Open an order Valid Order no No of Tickets value is numeric, up to 10


NOTE: Testing the TYPE OF VALUE of an object is done with a REGULAR EXPRESSION.

Build -> Flight Reservation window

Automation program:

set_window ("Flight Reservation", 2);
menu_select_item ("File;Open Order...");
set_window ("Open Order", 4);
button_set ("Order no", ON);
edit_set ("Edit", "1");
button_press ("OK");
set_window ("Flight Reservation", 1);
obj_check_gui ("Tickets", "list1.ckl", "gui1", 1);
# check point for Range and Regular Expression with 0 to 10 and [0-9]* (* allows multiple positions)

EX12: Prepare a regular expression for alphanumeric.
[a-zA-Z0-9]*

EX13: Prepare a regular expression for alphanumeric in lower case with the initial letter capital.
[A-Z][a-z0-9]*

EX14: Prepare a regular expression for alphanumeric in lower case that starts with a capital and ends with a lower-case letter.
[A-Z][a-z0-9]*[a-z]

EX15: Prepare a regular expression for alphanumeric in lower case with underscores, which does not start or end with _.
[a-z0-9][a-z0-9_]*[a-z0-9]
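These patterns follow standard regex syntax, so they can be checked with any regex engine; a quick Python verification of EX12-EX15 (the sample strings are illustrative):

```python
import re

# Patterns from EX12-EX15, keyed by a short description.
patterns = {
    "EX12 alphanumeric":                r"[a-zA-Z0-9]*",
    "EX13 initial capital":             r"[A-Z][a-z0-9]*",
    "EX14 capital start, lower end":    r"[A-Z][a-z0-9]*[a-z]",
    "EX15 underscore, not at the ends": r"[a-z0-9][a-z0-9_]*[a-z0-9]",
}

def matches(pattern, text):
    """True if the whole string matches the pattern (like a range check point)."""
    return re.fullmatch(pattern, text) is not None

print(matches(patterns["EX13 initial capital"], "Sri99"))            # True
print(matches(patterns["EX14 capital start, lower end"], "Sri9"))    # False: ends with a digit
print(matches(patterns["EX15 underscore, not at the ends"], "_abc")) # False: starts with _
```

Full-string matching (`fullmatch`) mirrors how a regular-expression check point validates the entire field value, not just a substring.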

EX16: Prepare a regular expression for a Yahoo mail user id.

Changes in Check Points (the most irritating part of s/w testing)

Due to sudden changes in customer requirements or mistakes in test creation, the test engineers perform changes in existing check points.

a) Changes in expected values:
*Run our test
*Open the result
*Perform changes in the expected values
*Click OK
*Close the results
*Re-execute that test

b) Add new properties:
*Insert menu
*Edit GUI Checklist
*Select the checklist file name and click OK
*Select the new properties for testing and click OK
*Click OK to overwrite the checklist file
*Click OK after reading the suggestion to update
*Change the run mode to Update mode and run from the top (the modified checklist takes the default values as expected)
*Run our test script in Verify mode to get results
*Analyze those results manually

If the default expected values are not correct, then the test engineers change those expected values and re-run the test.

Bitmap check point (binary representation of an image): This is an optional check point in functional testing. Test engineers use this option to compare images. This check point supports static images only. It consists of 2 sub-options:

1) For object or window bitmap: To compare our expected image with our application build's actual image, we can use this option.

    EX: logo testing
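The underlying idea of a bitmap check point — compare an expected capture with the actual one bit for bit — can be sketched outside WinRunner. This is only a conceptual illustration; the file paths passed in would come from the tester:

```python
import hashlib

def same_bitmap(expected_path, actual_path):
    """Compare two image files byte-for-byte via their SHA-256 digests.

    This mirrors what a bitmap check point does conceptually: any pixel
    difference changes the file bytes, so the digests differ. It only
    works for static images saved in the same format, which matches the
    check point's "static images only" limitation.
    """
    def digest(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()
    return digest(expected_path) == digest(actual_path)
```

A real tool also offers pixel-tolerance comparison; the exact-digest approach above fails on even a one-pixel change, which is the strictest form of the check.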


NOTE: The GUI check point is mandatory, but the Bitmap check point is optional because not all windows contain images.

Database check point: The GUI and Bitmap check points are applicable on our application build's front-end screens only. The Database check point is applicable on our application build's back-end tables, to estimate the impact of a front-end screen operation on the back-end tables' content. This checking is called DATABASE OR BACK-END TESTING.

To automate database or back-end testing, the database check point in WinRunner follows the approach below:

* The database check point wizard connects to our application build's database.
* It executes a SELECT statement on that connected database.
* It retrieves the selected data from the database into an XL sheet.
* The test engineer analyzes that selected data to estimate the completeness and correctness of the front-end operation's impact on that database.

To follow the above approach using WinRunner for database testing, test engineers collect some information from the development team:

* The name of the connectivity between the application build's front-end windows and the back-end database
* The names of the tables, including columns
* Front-end screens VS back-end tables

The above information is available in the Database Design Documents (DD
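The same back-end-testing flow — run a SELECT after a front-end operation and compare the rows against the expected impact — can be sketched outside WinRunner. The table and column names below are invented for the sketch, using an in-memory SQLite database in place of the application's real back end:

```python
import sqlite3

# Hypothetical orders table standing in for the application's back end.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_no INTEGER, customer TEXT)")

# Simulate the front-end operation under test: inserting one order.
conn.execute("INSERT INTO orders VALUES (1, 'Sri')")
conn.commit()

# Back-end check: run a SELECT and compare the actual rows with the
# expected rows, which is what a database check point does after the
# front-end action.
actual = conn.execute("SELECT order_no, customer FROM orders").fetchall()
expected = [(1, "Sri")]
print("PASS" if actual == expected else "FAIL")  # PASS
```

In WinRunner the wizard performs the connection and SELECT for you; the comparison against expected content is the part the test engineer designs.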