The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More

82
MB Full-Day Tutorial 9/30/2013 8:30:00 AM "The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More" Presented by: Hans Buwalda LogiGear Corporation Brought to you by: 340 Corporate Way, Suite 300, Orange Park, FL 32073 888-268-8770 ∙ 904-278-0524 ∙ [email protected] www.sqe.com

description

Large-scale testing projects can severely stress many of the testing practices we have gotten used to over the years, which can result in less than optimal outcomes. A number of innovative ideas and concepts have emerged to support industrial-strength testing of large and complex projects. Hans Buwalda shares his experiences and the strategies he has developed and used for testing on large projects. Learn how to design tests specifically for automation and how to successfully incorporate keyword testing. The automation discussion includes virtualization and cloud options, how to deal with the numerous versions and configurations common to large projects, and how to handle the complexity added by mobile devices. Hans also outlines the possibilities and pitfalls of outsourcing test automation. The information presented is based on his nineteen years of worldwide experience with testing and test automation involving large projects, with test cases executing continuously for many weeks on multiple machines.

Transcript of The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More

Page 1: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More

MB Full-Day Tutorial

9/30/2013 8:30:00 AM

"The Challenges of BIG Testing:

Automation, Virtualization,

Outsourcing, and More"

Presented by:

Hans Buwalda

LogiGear Corporation

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073

888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com

Page 2: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More

Hans Buwalda

LogiGear

An internationally recognized expert in testing, Hans Buwalda is a pioneer of keyword-driven

test automation, an approach now widely adopted throughout the testing industry. Originally

from the Netherlands, Hans is the CTO of LogiGear, directing the development of the successful

Action Based Testing™ methodology for keyword-driven test automation and its supporting

TestArchitect™ toolset. Prior to joining LogiGear, Hans served as project director at CMG (now CGI).

Page 3: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Hans Buwalda

LogiGear

Automation,

Virtualization,

Outsourcing, and More

STAREAST 2013, Tutorial MB

Orlando, Monday April 29

The Challenges of

BIG Testing


Introduction

industries

roles in testing

Page 4: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


About LogiGear

Software testing company, around since 1994

Testing and test automation expertise, services and

tooling consultancy, training

test development and automation services

"test integrated" development services

Aims to be thought leader, in particular for large and

complex test projects

Products: TestArchitect™, TestArchitect for Visual Studio™

integrating test development with test management and automation

based on modularized keyword-driven testing

www.logigear.com

www.testarchitect.com


About Hans

Dutch guy, living and working in California since 2001, as

CTO of LogiGear

Background in math, computer science, management

Original career in management consultancy, since 1994

focusing on testing and test automation keywords, agile testing, big testing, . . .

www.happytester.com

hans @ logigear.com

Page 5: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Topics for today

Automation

Designing and organizing tests

Executing tests

Team, organization and process

Off-shoring, globalization


What is "BIG"

Big efforts in development, automation, execution and/or follow up

It takes a long time and/or large capacity to run tests (lot of tests, lot

of versions, lot of configurations, ...)

Scalability, short term and long term

Complexity, functional, technical

Number and diversity of players and stakeholders: pigs, chickens, elephants, ankle biters, ...

Various definitions of "big" are possible... and relevant: "10 machines" or "10 acres", "1000 tests" or "1000 weeks of testing"

Big today means: big for you, "non-trivial", something you need to think about

"Windows 8 has undergone more than

1,240,000,000 hours of testing" Steven Sinofsky, Microsoft, 2012

Page 6: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Existential Questions

Why test?

Why not test?

Why automate tests?

Why not automate tests?


Why test?

People expect us to do it

Somebody wants us to

Increases certainty and control: showing the absence of problems

Finds faults, saving time, money, damage: showing the presence of problems

Page 7: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Why not test?

It costs time and money

You might find problems . . .

We forgot to plan for it

We need the resources for development

It is difficult

It's hard to manage


Why Automate Tests?

It is more fun

Can save time and money, potentially improving time-to-market

Can capture key application knowledge in a reusable way

Consolidates a structured way of working when established as an integral part of the system development process

Can speed up development life cycles

Execution typically is more reliable: a robot is not subjective

Page 8: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


The Power of Robot Perception

FINISHED FILES ARE THE RE

SULT OF YEARS OF SCIENTI

FIC STUDY COMBINED WITH

THE EXPERIENCE OF YEARS...


Why not Automate?

Can rule out the human element: promotes "mechanical" testing, which might not find "unexpected" problems

More sensitive to good practices: pitfalls are plentiful

Creates more software to manage

Needs/uses technical expertise in the test team

Tends to dominate the testing process at the cost of good test development

maintenance can crush automation...

Page 9: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Olny srmat poelpe can raed tihs.

I cdnuolt blveiee taht I cluod aulaclty uesdnatnrd

waht I was rdanieg. The phaonmneal pweor of the

hmuan mnid, aoccdrnig to a rscheearch at

Cmabrigde Uinervtisy, it deosn't mttaer in waht

oredr the ltteers in a wrod are, the olny iprmoatnt

tihng is taht the frist and lsat ltteer be in the rghit

pclae. The rset can be a taotl mses and you can

sitll raed it wouthit a porbelm. Tihs is bcuseae the

huamn mnid deos not raed ervey lteter by istlef,

but the wrod as a wlohe.

The Power of Human Perception


About tests in big projects

Regular tests may be activities, complex tests are products. In fact any test that you want to run more than once is a product

Every test that is written down with sufficient detail should be automated

Automation: no longer optional in most situations; also a key prerequisite of most agile approaches

How tests are written and automated can make or break large scale testing

Page 10: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Keywords, essential for scalability

Distinguish tasks for test development and for automation

The test developer creates tests using "actions" (my term)

Each action consists of a keyword and arguments

The automation task focuses on automating the actions

Each action is automated only once

number name quantity

new product P-9009 Sledge Hammer 5

number quantity

add quantity P-9009 20

add quantity P-9009 3

add quantity P-9009 6

number quantity

check quantity P-9009 34

actions, each with a keyword and arguments

"34" is the expected value here

read from top to bottom

fragment from a test with actions
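To make the mechanics concrete, here is a minimal keyword-interpreter sketch in Python (illustrative only, not TestArchitect's implementation; the inventory dictionary stands in for a system under test): each keyword maps to exactly one automation function, so every action is automated only once and the test lines above just supply arguments.

inventory = {}  # stand-in for the system under test

def new_product(number, name, quantity):
    inventory[number] = {"name": name, "quantity": int(quantity)}

def add_quantity(number, quantity):
    inventory[number]["quantity"] += int(quantity)

def check_quantity(number, expected):
    actual = inventory[number]["quantity"]
    result = "pass" if actual == int(expected) else "FAIL"
    print(f"check quantity {number}: expected {expected}, actual {actual} -> {result}")

ACTIONS = {
    "new product": new_product,
    "add quantity": add_quantity,
    "check quantity": check_quantity,
}

def run(test_lines):
    # each line is a keyword plus arguments, read from top to bottom
    for keyword, *arguments in test_lines:
        ACTIONS[keyword](*arguments)

run([
    ("new product", "P-9009", "Sledge Hammer", "5"),
    ("add quantity", "P-9009", "20"),
    ("add quantity", "P-9009", "3"),
    ("add quantity", "P-9009", "6"),
    ("check quantity", "P-9009", "34"),  # "34" is the expected value here
])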


Potential benefits of keywords

More tests, better tests: more breadth, more depth

Fast, results can be quickly available: the design directly drives the automation

Separates the tests from the technical scripting language: easier to involve business subject matter experts; the action format allows for easy readability

Less effort for automation: "script free" in most cases

Automation more stable and maintainable: limited and manageable impact of changes in the system under test

Develop tests earlier in the life cycle: deal with execution details later

. . .

Page 11: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Risks of keywords

Often seen as a silver bullet, and complications are underestimated: often treated as a technical "trick"

testers can get squeezed and marginalized • developers and users dictating tests

• automation engineers dictating actions

or testers get the automation responsibility, thus becoming pseudo programmers

The method needs understanding and experience to be successful: pitfalls are many, and can have a negative effect on the outcome

Lack of method and structure can risk manageability: maintainability not as good as hoped; results can be disappointing, and the approach will be blamed


Case: International Financial Project

One of the largest projects to date with action words

Over 10 000 windows, meant for use in 85 countries

Long development cycle (400 pp, 4 years and counting)

Maintenance very hard

Testing major bottleneck

Much investment in automation techniques was needed to become successful

A lot of attention to the team and work environment also helped the success

Team of 35 test developers, 2 automation engineers

Page 12: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Keywords need a method

By themselves keywords don't provide much scalability: they can even backfire and make automation more cumbersome

a method can help tell you which keywords to use when, and how to organize the process

Today we'll look at Action Based Testing (ABT): it addresses test management, test development and automation

large focus on test design as the main driver for automation success

Central deliverables in ABT are the "test modules", developed in spreadsheets

each test module contains "test objectives" and "test cases"

each test module is a separate (mini) project, and each test module can involve different stakeholders


Example of an ABT test module

Consists of an (1) initial part, (2) test cases and (3) a final part

Focus is on readability, and a clear scope

Navigation details are avoided, unless they're meant to be tested

TEST MODULE Car Rental Payments

user

start system john

TEST CASE TC 01 Rent some cars

first name last name car

rent car John Doe Ford Escape

rent car John Doe Chevvy Volt

last name amount

check payment Doe 140.4

FINAL

close application

Page 13: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Example of a "low level" test module

In "low level" tests interaction details are not hidden, since they are

the target of the test

The right level of abstraction depends on the scope of the test, and is

an outcome of your test design process

TEST MODULE Screen Flow

user

start system john

TEST CASE TC 01 "New Order" button

window control

click main new order

window

check window exists new order

FINAL

close application


Re-use actions to make new actions

In the example below we use another sheet, but if you code actions you could do something similar

Often low-level tests are reused in these action definitions

ACTION DEFINITION check payment

user default value

argument last name Jones

argument amount

window control value

enter main last name # last name

window control

click main view balance

window control expected

check main balance # amount

Page 14: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Action Based Testing

[Diagram: a test development plan is broken down into test modules 1 through N, each consisting of test objectives and test cases; the test modules are built from actions, which are then automated in the action automation layer.]


Case: Stock Exchange

Transition from floor-based to screen-based trade

Created on the basis of an existing standard package; result: very few specifications

Consisting of four major, different systems that need to work in real-time

Failures and bugs are not an option: core of the financial system of the country, 100K revenue per second

traders not necessarily following rules

In-depth knowledge limited to four people nicknamed "The Four Daltons", after characters in a French comic book series about the

wild west

none of the four Daltons was involved in testing, testing was in a vacuum

Three months to go... test development (and scripted automation) had failed

test department not cooperating well with developers and domain experts

internal and external auditors had raised the alarm

and... the Dutch Crown Prince was scheduled to put the system into use!!

The Four Daltons (French comic book characters)

Page 15: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Case: Stock Exchange

Test set: make it comprehensive

make it in-depth and aggressive

make it easy to assess and approve

Organization: get the right people involved (testing, automation, etc)

use scarce resources efficiently (in particular the four Daltons)

work with stake holders to let the process be transparent

Technical: use of the keyword method ("action words")

use "test objectives" so auditors can see quickly what you're testing

use great test design, don't mix apples and oranges

"Sign off lubrication": auditors signed off on the tests, not the test results

"the test is complete", not "the system works well"

Results: deadline was met one day before final date

the automated tests were the only ones used for acceptance

no functional errors found afterwards


Question

What is wrong with the

following pictures?

Page 16: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


No Y2K Problems in Auckland Airport??


Anything wrong with this instruction ?

You should change your battery or switch to outlet

power immediately to keep from losing your work.

Page 17: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Why Better Test Development?

Many tests are mechanical now: blindly following specs or requirements

which is often fine, but lacks aggression

no combinations, no unexpected situations

"methodical" does not have to mean "mechanical"

For a higher “ambition level” you need understanding of the system under test, and the business under test

analytical understanding of what could go wrong

creativity, and the commitment to use it

Poor test development results in cumbersome automation due to lack of focus

tedious retest cycles, losing the agile advantage

Are you suffering from lame tests too?


Test Design

Effective test breakdown (into test modules): make sure every test module has a clear focus

keep different kinds and levels of tests separate

Right level of actions: as “high level” as possible, hiding as many details as possible

but not if the details are relevant for the test

It is my belief that successful automation is not a technical challenge. It is most of all a test design challenge.

Page 18: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Case: A Financial Project

Large project, with many consultants embraced the approach

however, they were from our competitor

One of the first with action words

However, the project was confident about their

test development, no help needed

Result: many very hard to maintain tests, and

way too many action words crushing complexity

almost the end of the action words method

one memo saved the day


The Three “Holy Grails” of Test Design

Metaphor to depict three main steps in test design

Using "grail" to illustrate that there is no one perfect

solution, but that it matters to pay attention (to search)

About quality of tests, but most of all about scalability and

maintainability in BIG projects

Right approach for each test module

Proper level of detail in the test specification

Organization of tests into test modules

Page 19: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Case for organizing tests in BIG projects

Can help keep the volume down

Isolate the complexities

Efficient and re-usable automation

Deal with changing requirements

For example: much of tested subject matter is not

system specific, but business specific a mortgage is a mortgage


What's the trick...

Page 20: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


What's the trick...

Have or acquire facilities to store and organize your content

Edit your stuff

Decide where to put what: assign and label the shelves

Put it there

If the organization is not sufficient anymore, add

to it or change it


Properties of a good Breakdown

Test modules are well differentiated and clear in

scope

Reflects the level of tests

Balanced in size and amount

Modules are mutually independent

Fit the priorities and planning of the project

Page 21: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Breakdown Criteria

Straightforward Criteria Business tests versus interaction tests

Architecture of the system under test (client, server, protocol, sub

systems, components, modules, ...)

Functionality (customers, finances, management information, ...)

Kind of test (navigation flow, negative tests, response time, ...)

Ambition level (smoke test, regression, aggressive, …)

Additional Criteria Stakeholders (like "Accounting", "Compliance", "HR", ...)

Complexity of the test (put complex tests in separate modules)

Technical aspects of execution (special hardware, multi-station, ...)

Overall project planning (availability of information, timelines, sprints, ...)

Risks involved (extra test modules for high risk areas)


Example breakdown

Tests of the user interface: does function key F4 work

does listbox xyz show the right values

is the tab order correct

Form tests, do all the forms (dialogs, screens, pages) work: can data be entered and is it stored well

is displayed data correct

split these from everything else

Function tests, do individual functions work: can I count the orders

Alternate paths in use cases: can I cancel a transaction

End-to-end tests: do all components of a system work well together in implementing the business processes

like enter a sale order, then check inventory and accounting

Tests with specific automation needs: like multi-station tests

Tests of non-UI functions

High ambition level tests (aggressive tests): can I break the system under test

If in doubt: try high level first

Page 22: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


What is probably not a good design

Navigational and functional tests are mixed for example "over checking": a test of a premium calculation also

checks the existence of a window

You have to change all of them for every new release of

the system under test

All test modules have a similar design

Test modules are dependent on each other

You can’t start developing any test modules early in the

life cycle


Symptoms

Tediousness in the test and test automation

process

No sense of control

Complaining people

Unnecessarily high test maintenance: changes in the system under test impact many tests

hard to understand which tests need to be modified

Difficulties in running any test: teams start "debugging" tests

Page 23: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Questions for Test Design

Does your organization make

something like a high level test

design?

If yes, how do you document it?


Case Study

Large IT provider

New version of one of their major web-sites

Test scope was user acceptance test (functional

acceptance) the users were the “business owners”

Development was off-shore

Page 24: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Case Study

Test development was done separately from automation

timeline for test development: May – Oct

timeline for automation (roughly): Jan – Feb

All tests were reviewed and approved by the business owners

acceptance was finished by the end of the test development cycle


Example of a Test Development Plan

Nr Module Business Owner Date to BO

1 Portal Navigation, Audience Robyn Peterson 05 / 23

2 Portal Navigation, Search Ted Jones 05 / 27

3 Membership, registration Steve Shao 06 / 03

4 Portal Navigation, Category Ted Jones 06 / 08

5 Portal Navigation, Topic and Expert Ted Jones 06 / 13

6 Access Control Mike Soderfeldt 06 / 17

7 Portal Navigation, Task Ted Jones 06 / 22

8 Contact DSPP Ted Jones 06 / 27

9 Portal search Mike Soderfeldt 07 / 01

10 Membership, review and update Steve Shao 07 / 05

11 Program contact assignment Alan Lai 07 / 11

12 Company, registration Steve Shao 07 / 14

13 Catalog, view and query Robyn Peterson 07 / 19

14 Site map Ted Jones 07 / 25

15 Membership, affiliation Steve Shao 07 / 28

16 Learn about DSPP Ted Jones 08 / 01

17 Products and services Steve Shao, Robyn Peterson 08 / 08

18 What's new Ted Jones 08 / 11

19 Company, life cycle Steve Shao, Alan Lai 08 / 17

20 Specialized programs Ted Jones, Steve Shao 08 / 22

21 Customer surveys Ted Jones 08 / 29

22 Software downloads Mike Soderfeldt 09 / 01

23 Newsletters Ted Jones 09 / 06

24 Internationalization and localization Ted Jones 09 / 13

25 Membership, life cycles Steve Shao 09 / 19

26 Collaboration, forums Ted Jones 09 / 23

27 Collaboration, blogs Mike Soderfeldt 09 / 28

28 Collaboration, mailing lists Ted Jones 10 / 03

Page 25: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Review Process with Stake Holders

[Flowchart: the test team sends a draft module to the stakeholder, who reviews it for coverage and correctness and returns notes with additions and corrections; the test team receives and processes the notes; if changes are needed the cycle repeats, otherwise the stakeholder returns a notice of approval and the test team marks the module as "Final".]


Case Study, Results

All tests were developed and reviewed on schedule: many notes and questions during the test development phase

The automation covered 100% of the tests: all actions were automated, thus automating all test modules

The test development took an estimated 18 person-months: one on-shore resource, two off-shore resources

The automation took between one and two months, focused on the actions

most time was spent handling changes in the interface (layout of pages, etc.)

Page 26: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Case: The French Director

Mid size company

Struggling under high pressure

Testing of their main product, standard financial

software

Control and priority main issue

Unfamiliar business culture

Main instrument: module break down


Test Modules versus Test Cases

The test module is a bigger unit in the test design: easier to identify

a chapter rather than a paragraph

easier to plan and manage, as a product (can be treated as part of the product backlog in scrum projects)

Better flow of execution: each test case can set up for the next one

keep test modules independent; test cases can be dependent

Test cases become creative output, rather than stifling input: avoids having to define all test cases at once early in the process

Clear scope helps to identify cases, actions and checks: using "test objectives" to further detail the scope has had a significant effect on maintainability

Page 27: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


"Thou Shall Not Debug Tests..."

Large and complex test projects can be hard to "get to

run"

If they are, however, start by taking another good look at your test design...

Rule of thumb: don't debug tests. If tests don't run smoothly, make sure:

lower level tests have been successfully executed first -> UI flow in the AUT is stable

actions and interface definitions have been tested sufficiently with their own test modules -> automation can be trusted

your test modules are not too long and complex


What about existing tests?

Compare to moving house: some effort can't be avoided

decide where to put what, then put it there

consider a moving company to help

Adopt the module model: define the modules, and their scope

worry about the existing test cases later

Moving considerations: be selective; moving is a chance you are unlikely to get again

for the important modules: design as normal but harvest

from existing set

avoid porting over test cases "step by step", in particular

avoid over-checking

Page 28: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Grail 2: Approach per Test Module

Plan the test module: when to develop: is enough specification available

when to execute: make sure the functionality at action level is well-

tested and working already

Process: do an intake: understand what is needed and devise an approach

analyze the requirements

formulate "test objectives"

create "test cases"

Identify stakeholders and their involvement: users, subject matter experts

developers

auditors

Choose testing techniques if applicable: boundary analysis, decision tables, transition diagrams, soap opera

testing, ...


Eye on the ball, Scope

Always know the scope of the test module

The scope should be unambiguous

The scope determines many things: what the test objectives are

which test cases to expect

what level of actions to use

what the checks are about and which events should

generate a warning or error (if a “lower” functionality is

wrong)

Page 29: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


What I have seen not work

"Over-Checking": having checks that don't fit the scope of

the test

Forcing data driven: making all tests data driven

(variables, data files) without clear reason

Combinatorial explosions: test all ... for all ... in all ...

All actions high level (or all actions low level)

Many tests for forms and dialogs, few tests for business processes

Abundance of irrelevant comments, and lack of relevant

comments


Detail out the scope with test objectives

...

TO-3.51 The exit date must be after the entry date

...

test objective TO-3.51

name entry date exit date

enter employment Bill Goodfellow 2002-10-02 2002-10-01

check error message The exit date must be after the entry date.

Page 30: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Examples of Testing Techniques

Equivalence class partitioning

any age between 18 and 65

Boundary condition analysis

try 17, 18, 19 and 64, 65, 66

Error guessing

try Cécile Schäfer to test sorting of

a name list

Exploratory

"Exploratory testing is

simultaneous learning, test design,

and test execution", James Bach,

www.satisfice.com

Error seeding

deliberately injecting faults in a test

version of the system, to see if the

tests catch them

handle with care, don't let the bugs

get into the production version

Decision tables

define possible situations and the

expected responses of the system

under test

State transition diagrams

identify "states" of the system, and

have your tests go through each

transition between states at least

once

Jungle Testing

focus on unexpected situations,

like hacking attacks

Soap Opera Testing

describe typical situations and

scenarios in the style of episodes

of a soap opera, with fixed

characters

high density of events,

exaggerated

make sure the system under test

can still handle these
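As a small illustration of the equivalence class and boundary condition items above, a minimal sketch in Python (the 18-65 age range is the slide's own example; the helper name is ours):

def boundary_values(lower, upper):
    # for an accepted range, test just below, on, and just above each boundary
    return [lower - 1, lower, lower + 1, upper - 1, upper, upper + 1]

print(boundary_values(18, 65))  # [17, 18, 19, 64, 65, 66]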


"Jungle Testing"

Expect the unexpected unexpected requests

unexpected situations (often data oriented)

deliberate attacks

how does a generic design respond to a specific unexpected event?

Difference in thinking: a coding bug means the implementation is different from what was intended/specified

a jungle bug means the system does not respond well to an unexpected situation

To address: study the matter (common hack attacks, ...)

make a risk analysis

make time to discuss it (analysis, brainstorming)

involve people who can know

use "exploratory testing" (see James Bach's work on this)

use an agile approach for test development

consider randomized testing, like "monkey" testing

New York. The city of a million stories. Half of them are true, the other half just haven't happened yet -- Dr Who

Page 31: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Soap Opera Testing

Informal scenario technique to invite subject-matter

experiences into the tests, and efficiently address multiple

objectives

Using a recurring theme, with “episodes”

About “real life”

But condensed

And more extreme

Typically created with a high involvement of end-users

and/or subject-matter experts


Lisa Crispin: Disorder Depot . . .

There are 20 preorders for George W. Bush action figures in "Enterprise", the ERP system, awaiting the receipt of the items in the warehouse. Finally, the great day arrives, and Jane at the warehouse receives 100 of the action figures as available inventory against the purchase order. She updates the item record in Enterprise to show it is no longer a preorder. Some time passes, during which the Enterprise background workflow to release preorders runs. The 20 orders are pick-released and sent down to the warehouse.

Source: Hans Buwalda, Soap Opera Testing (article), Better Software Magazine, February 2005

Page 32: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Lisa Crispin: Disorder Depot . . .

Then Joe loses control of his forklift and accidentally drives it into the shelf containing the Bush action figures. All appear to be shredded to bits. Jane, horrified, removes all 100 items from available inventory with a miscellaneous issue. Meanwhile, more orders for this very popular item have come in to Enterprise. Sorting through the rubble, Jane and Joe find that 14 of the action figures have survived intact in their boxes. Jane adds them back into available inventory with a miscellaneous receipt.


Lisa Crispin: Disorder Depot . . .

This scenario tests

• Preorder process

• PO receipt process

• Miscellaneous receipt and issue

• Backorder process

• Pick-release process

• Preorder release process

• Warehouse cancels

Page 33: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Vary your tests?

Automated tests have a tendency to be rigid, and

predictable

Real-world situations are not necessarily

predictable

Whenever possible try to vary: select other data cases that still fit the goal of the tests

randomize the behavior of the test


Generation and randomization techniques

Model-based use models of the system under test to create tests

see: Harry Robinson, www.model-based-testing.org, and Hans Buwalda, Better

Software, March 2003

Data driven testing apply one test scenario to multiple data elements

either coming from a file or produced by the automation

"Monkey testing" use automation to generate random data or behavior

"smart monkeys" will follow typical user behavior, most helpful in efficiency

"dumb monkeys" are more purely random, may find more unexpected issues

long simulations can expose bugs traditional tests won't find

Extended Random Regression have a large database of tests

randomly select and run them, for a very long time

this will expose bugs otherwise hidden

see Cem Kaner e.a.: "High Volume Test Automation", StarEast 2004
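A minimal "dumb monkey" sketch in Python, assuming a toy stand-in for the system under test (the stock dictionary and the two actions are hypothetical): long random sequences of actions plus an invariant check can expose states that scripted tests never reach.

import random

stock = {"Chevvy Volt": 5, "Ford Escape": 5}  # toy stand-in for the AUT

def rent(car):
    if stock[car] > 0:
        stock[car] -= 1

def return_car(car):
    stock[car] += 1

random.seed(42)  # reproducible monkey run
for step in range(10_000):
    action = random.choice([rent, return_car])
    car = random.choice(list(stock))
    action(car)
    # invariant: stock can never go negative, whatever the random sequence was
    assert all(quantity >= 0 for quantity in stock.values()), f"negative stock at step {step}"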

Page 34: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Data Driven Testing

Separate test logic from the data

Possible origins for the data: earlier steps in the test

data table

randomizer, or other formula

external sources, like a database query

Use "variables" as placeholders in the test case,

instead of hard values

Data driven is powerful, but use modestly: value cannot be known at test time, or changes over time

having many data variations is meaningful for the test


Variables and expressions with keywords

This test does not need an absolute number for the

available cars, just wants to see if a stock is updated

As a convention we denote an assignment with ">>"

The "#" indicates an expression

TEST CASE TC 02 Rent some more cars

car available

get quantity Chevvy Volt >> volts

first name last name car

rent car John Doe Chevvy Volt

rent car John Doe Chevvy Volt

car expected

check quantity Chevvy Volt # volts - 2

Page 35: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Data driven testing with keywords

The test lines will be repeated for each row in the data set

The values represented by "car", "first" and "last" come

from the selected row of the data set

DATA SET cars

car first last value

Chevvy Volt John Doe 40000

Ford Escape Mary Kane 22500

Chrysler 300 Jane Collins 29000

Buick Verano Tom Anderson 23000

BMW 750 Henry Smyth 87000

Toyota Corolla Vivian Major 16000

TEST CASE TC 03 Check stocks

data set

use data set /cars

car available

get quantity # car >> quantity

first name last name car

rent car # first # last # car

car expected

check quantity # car # quantity - 1

repeat for data set
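The same "use data set / repeat for data set" idea can be sketched in Python (illustrative only; the stock dictionary and the action functions are hypothetical stand-ins for automated actions): the test logic is written once with placeholders and repeated for every row.

stock = {"Chevvy Volt": 7, "Ford Escape": 4, "Chrysler 300": 2}  # stand-in AUT

def get_quantity(car):
    return stock[car]

def rent_car(first, last, car):
    stock[car] -= 1

def check_quantity(car, expected):
    assert stock[car] == expected, f"{car}: expected {expected}, got {stock[car]}"

data_set = [
    ("Chevvy Volt", "John", "Doe"),
    ("Ford Escape", "Mary", "Kane"),
    ("Chrysler 300", "Jane", "Collins"),
]

for car, first, last in data_set:      # repeat for data set
    quantity = get_quantity(car)       # get quantity ... >> quantity
    rent_car(first, last, car)         # rent car
    check_quantity(car, quantity - 1)  # check quantity ... # quantity - 1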


Combinations

Input values determine equivalence classes of values for a variable or field

for each class pick a value (or randomize)

Options, settings

Configurations operating systems, operating system versions and flavors

• Windows service packs, Linux distributions

browsers, browser versions

protocol stacks (IPv4, IPv6, USB, ...)

processors

DBMS's

Combinations of all of the above

Trying all combinations will spin out of control quickly

Page 36: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Pairwise versus exhaustive testing

Group values of variables in pairs (or tuples with more than 2)

Each pair (tuple) should occur in the test at least once: maybe not in every run, but at least once before you assume "done"

consider going through combinations round-robin, for example picking a different combination every time you run a build acceptance test

in a NASA study: • 67 percent of failures triggered by a single value

• 93 percent by two-way combinations, and

• 98 percent by three-way combinations

Example, configurations operating system: Windows XP,

Apple OS X, Red Hat Enterprise Linux

browser: Internet Explorer, Firefox, Chrome

processor: Intel, AMD

database: MySQL, Sybase, Oracle

72 combinations possible, to test each pair: 10 tests

Example of tools: ACTS from NIST, PICT from Microsoft, AllPairs from James Bach (Perl)

for a longer list see: www.pairwise.org

These techniques and tools are supportive only. Often priorities between platforms and values can drive a more informed selection

Source: PRACTICAL COMBINATORIAL TESTING, D. Richard Kuhn, Raghu N.

Kacker, Yu Lei, NIST Special Publication 800-142, October, 2010
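A greedy all-pairs sketch in Python for the configuration example above (supportive only; real projects may prefer tools such as ACTS, PICT, or AllPairs, and the greedy choice below is just one simple strategy):

from itertools import combinations, product

parameters = {
    "os": ["Windows XP", "Apple OS X", "Red Hat Enterprise Linux"],
    "browser": ["Internet Explorer", "Firefox", "Chrome"],
    "processor": ["Intel", "AMD"],
    "database": ["MySQL", "Sybase", "Oracle"],
}

def pairwise_suite(parameters):
    names = list(parameters)
    # every value pair of every two parameters must occur at least once
    uncovered = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va in parameters[a] for vb in parameters[b]}
    suite = []
    while uncovered:
        # pick the full configuration that covers the most remaining pairs
        best, best_gain = None, -1
        for values in product(*(parameters[n] for n in names)):
            config = dict(zip(names, values))
            gain = sum(1 for a, b in combinations(names, 2)
                       if ((a, config[a]), (b, config[b])) in uncovered)
            if gain > best_gain:
                best, best_gain = config, gain
        suite.append(best)
        for a, b in combinations(names, 2):
            uncovered.discard(((a, best[a]), (b, best[b])))
    return suite

suite = pairwise_suite(parameters)
print(len(suite), "configurations cover every pair")  # roughly ten for this example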


Grail 3: Specification Level, choosing actions

Scope of the test determines the specification level

As high level as appropriate, with as few arguments as possible: use default values for non-relevant arguments

Clear names (verb + noun usually works well)

to standardize action names: standardize both the verbs and the nouns, so "check customer" versus "verify client" (or vice versa)

tests are not C++ code: avoid "technical habits", like mixed case and (worse) underscores

Manage the Actions

Document the Actions

By-product of the test design

Page 37: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Case: American Bank

Project for a new teller system

Large, state of the art

Many system releases, many adjustments

Need for very high level of automation

Over 1 million test lines, in over 650 test modules

Initially little attention was paid to the "holy grails": UI and functional tests in the same modules

virtually un-maintainable, came close to killing the project

test design forced upon the team by a powerful stakeholder who did not care much for methods...

Emergency re-organization of the test modules: after system changes the tests would run again within a day


Example of using actions: in this real-world example the first "sequence number" for teller transactions for a given day is retrieved, using a search function

• the "#" means an expression, in this case a variable

• the ">>" assigns to a variable for use later on in the test

key

key navigate F7

key navigate 3

page tab

locate page tab Scan Criteria

window

wait for controls loaded search

text

check breadcrumb general functions > search

window control value

select search scan direction Backward

window control value

enter value search business date match # bus date

source control

click search go

window

wait for controls loaded search results

window control variable

get search results sequence number >> seq num

Page 38: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


variable

get sequence number >> seq num



Mid level actions

Most tests will have low level and high level actions

low level: generic operations; know the interface, don't know the functionality

• examples: "select menu item", "expand tree node", ...

high level: business oriented operations; know the functionality, don't know the interface

• examples: "enter purchase order", "check inventory of article"

For complex forms (dialogs) with many input fields consider using "mid level" actions

an argument for each field

for use in high level actions

Examples of mid-level actions: "enter address fields"

"check address fields"

enter customer

enter address fields

enter select set . . . . . .

Page 39: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Low-level, high-level, mid-level actions

Low-level: detailed interaction with the UI (or API) generic, do not show any functional or business logic

examples: "click", "expand tree node", "select menu"

High-level: represent a business function specific to the

scope of the test hide the interaction

examples: "enter customer", "rent car", "check balance"

Mid-level: auxiliary actions that represent common

sequences of low level actions usually to wrap a form or dialog

greatly enhance maintainability

example: "enter address fields"

enter customer

enter address fields

enter select set . . . . . .


Mapping an interface entity (like a window)

An interface mapping will map windows and controls to names

When the interface of an application changes, you only have to

update this in one place

The interface mapping is a key step in your automation success,

allocate time to design it well

INTERFACE ENTITY balance inquiry

interface entity setting title Balance inquiry

ta name ta class label

interface element last name text Last name:

interface element first name text First name (optional):

interface element client id text Client id (optional):

ta name ta class caption

interface element view balance button View Balance

interface element close button Close

ta name ta class global pos

interface element balance label label 5
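A minimal sketch in Python of the same idea (hypothetical names; tools such as TestArchitect manage this for you): logical window and control names resolve to identifying properties in one place, so an interface change means a single update.

INTERFACE = {
    "balance inquiry": {
        "title": "Balance inquiry",
        "controls": {
            "last name":    {"class": "text",   "label": "Last name:"},
            "first name":   {"class": "text",   "label": "First name (optional):"},
            "client id":    {"class": "text",   "label": "Client id (optional):"},
            "view balance": {"class": "button", "caption": "View Balance"},
            "close":        {"class": "button", "caption": "Close"},
            "balance":      {"class": "label",  "global pos": 5},
        },
    },
}

def locate(window, control):
    # lookup only: the automation resolves logical names through this map
    return INTERFACE[window]["controls"][control]

print(locate("balance inquiry", "view balance"))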

Page 40: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Some Tips to Get Stable Automation

Make the system under test automation-friendly

Use "active" timing

Test your automation

Use automation to identify differences between

versions of the system under test

Keep an eye on the test design


Look for properties a human user can't see, but a test tool can

This approach is a must for speedier and more stable automation: interface mapping is often a bottleneck, and a source of maintenance problems

with predefined identifying property values an interface map can be created without "spy" tools

not sensitive to changes in the system under test

not sensitive to languages and localizations

Examples: "id" attribute for HTML elements

"name" field for Java controls

"AccessibleName" property in windows forms controls (see below)

Automation-friendly design: hidden properties
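A hedged sketch with Selenium WebDriver (the URL and element ids are hypothetical): locating elements by a predefined "id" attribute keeps the automation insensitive to layout changes and to localized captions.

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.test/balance-inquiry")  # hypothetical page

# identify controls by stable, invisible "id" values, not by visible text
driver.find_element(By.ID, "last-name").send_keys("Doe")
driver.find_element(By.ID, "view-balance").click()
print("balance:", driver.find_element(By.ID, "balance").text)
driver.quit()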

Page 41: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Active Timing

Passive timing: wait a set amount of time

in large scale testing, try to avoid passive timing altogether: • if the wait is too short, the test will be interrupted

• if the wait is too long, time is wasted

Active timing: wait for a measurable event

usually the wait is up to a (generous) maximum time

common example: wait for a window or control to appear (usually the test tool will do this for you)

Even if not obvious, find something to wait for...

Involve developers if needed: relatively easy in an agile team, but give this priority in traditional projects too

If using a waiting loop (see the sketch below), make sure to use a "sleep" function in each cycle that frees up the processor (giving the AUT time to respond)

wait for an end time, rather than a set amount of cycles
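A minimal active-timing sketch in Python (the helper name and defaults are ours): wait for a measurable event up to a generous maximum, against an end time rather than a cycle count, sleeping each cycle to free the processor.

import time

def wait_for(condition, max_wait=30.0, interval=0.1):
    # condition: any zero-argument callable returning True once the awaited
    # event (window present, control loaded, file exists, ...) has happened
    end_time = time.monotonic() + max_wait
    while time.monotonic() < end_time:
        if condition():
            return True
        time.sleep(interval)  # give the AUT time to respond
    raise TimeoutError(f"condition not met within {max_wait} seconds")

# usage sketch: wait until a (hypothetical) log file exists
# import os
# wait_for(lambda: os.path.exists("run.log"), max_wait=60)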


Active Timing, your situation

How much passive timing do you have in your

scripts?

If you're not sure, find out...

... and let me know

"First action I took upon my return was to evaluate the percentage of

passive time in our code and found passive time 68% versus active

time 32%. Needless to say our automation test cases were very

expensive time operations and now I know why..." Raed Atawneh, 2012 (extract)

Page 42: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Things to wait for...

Wait for the last control or element to load: developers can help you know which one that is

Non-UI criteria: an API function

the existence of a file

Criteria added in development specifically for this purpose, like: "disabling" big slow controls (like lists or trees) until they're done loading

API functions or UI window or control properties

Use a "delta" approach (see the sketch below): every wait cycle, test if there was a change; if there is no change, assume that the loading time is over

examples of changes: • the controls on a window

• the count of items in a list

• the size of a file (like a log file)
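A sketch of the "delta" approach in Python (illustrative only; the measurement callable is whatever is cheap to poll, such as a file size or an item count): loading is assumed done once the measured value has stopped changing for a short settle period.

import time

def wait_until_stable(measure, settle=1.0, max_wait=60.0, interval=0.2):
    end_time = time.monotonic() + max_wait
    last_value = measure()
    last_change = time.monotonic()
    while time.monotonic() < end_time:
        time.sleep(interval)
        value = measure()
        if value != last_value:                       # still changing: reset the clock
            last_value, last_change = value, time.monotonic()
        elif time.monotonic() - last_change >= settle:
            return value                              # no change for `settle` seconds
    raise TimeoutError("value did not stabilize in time")

# usage sketch: wait until a growing log file stops changing size
# import os
# wait_until_stable(lambda: os.path.getsize("run.log"))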


Alternatives to UI automation ("non-GUI")

A GUI (Graphical User Interface) is only one example of an interface

for interaction with a system under test

Examples HTTP and XML based interfaces, like REST

application programming interfaces (API’s)

embedded software

protocols

files, batches

databases

command line interfaces (CLI’s)

multi-media

mobile devices

In many cases non-GUI automation is used because there simply is no GUI, but it can also often speed things up: it tends to be more straightforward technically, with little effort needed to build up or maintain

once it works, it tends to work much faster and more stably than GUI automation

In BIG testing projects routinely: identify which non-GUI alternatives are available

as part of test planning: identify which tests qualify for non-GUI automation

Page 43: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Technical Complexity

Technology is another dimension that can make a project

"complex"

Examples: graphics, charts, 3D, ...

hard to access systems, like embedded software, iOS, Flash,

dedicated hardware

difficult protocols, like SS7, transactions servers

Approach: isolate the technical problems embed in functions and actions

let experts look at them

tackle early in a project, since impact is large

once resolved, no longer center stage


The importance of innovation

Large and complex testing projects pose many challenges

Initial discussion of approach is a key requisite thinking before doing

tackling technologies

agreeing on methods and practices

who does what, who needs to be involved

high level test design

debate the problems, not just the solutions

However, also plan for continuous improvements this is at the heart of agile thinking, and it applies very much to big testing

never stop thinking, "there is always one more trick"

share the tricks, other teams may like them too

improvements can apply to test design, to automation techniques, or even to

how to organize the work

Page 44: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Tools that can help manage BIG projects

Application Lifecycle Management (ALM) abundant now, mainly on the wings of agile

very good for control, team cooperation, and traceability

often relate to IDE's (like Microsoft TFS and Visual Studio)

examples: Rally, Jira, TFS

Test Managers as separate tools on their way out

morphing into or replaced by ALM options

examples: HP Quality Center, Microsoft Test Manager

Test development and automation tools develop and/or automate tests

• these are not the same, automation tools are not always so good for test development

examples are HP Quick Test Pro, Borland Silk, Selenium, FitNesse, Microsoft Coded UI, and LogiGear's

TestArchitect and TestArchitect for Visual Studio (our own products)

Build tools succeed the traditional "make" tools

in particular "continuous build" tools combine "make" functionality with source control systems to rebuild

components that have changed, either continuously or on set times, like nightly

can very well also run related tests (unit and functional), and act on the results (stop build, report, etc)

examples: Hudson, Jenkins, TFS

Bug trackers not only register issues, but also facilitate their follow up, with workflow features

often also part of other tools, and tend to get absorbed now by the ALMs

Examples: BugZilla, Mantis, Trac


Tooling and Traceability

[Diagram: traceability chain linking reference items (ALM items, requirements, code modules, ...) to test modules with their test objectives and test cases, on to execution results and bugs/issues; the surrounding tooling includes the ALM/IDE/project and requirements managers, the test development tool, the automation tool, the execution manager, the continuous build tool, the lab manager, and the issue tracker, with trace-back links from testing to the ALM.]

Page 45: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Test Execution

Have an explicit approach for when and how to execute

which tests

Having a good high level test design will help to organize

this

Execution can be selective or integral unit tests are typically executed selectively, possibly automatically

based on code changes in a system like SVN or TFS

for functional tests, decisions are needed: • selective execution will be quicker and more efficient

• integral execution may catch more issues ("bonus bugs")

• generally extensive functional test execution will be related to releases, rather

than code check ins

the ability to run "big testing" efficiently may determine how much can

be done


Environments, configurations

Many factors can influence details of the automation: language, localization

hardware

version of the system under test

system components, like OS or browser

Test design can reflect these: certain test modules are more general

others are specific, for example to a language

But for tests that do not care about the differences, the automation just needs to "deal" with them: shield them from the tests

Page 46: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Capture variations of the system under test in the actions and interface

definitions, rather than in the tests (unless relevant there).

Can be a feature in a test playback tool, or something you do with a global

variable or setting.

[Diagram: a "master switch" selects one of several "variations", which feed into the shared actions and interface definitions.]
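A minimal Python sketch of such a "master switch" (hypothetical names; in a tool this may be a built-in feature): a global variation setting selects which interface definition the shared actions use, so the tests themselves stay unchanged.

# one interface map per variation of the system under test
INTERFACE_VARIATIONS = {
    "web":     {"view balance": {"id": "view-balance-button"}},
    "desktop": {"view balance": {"AccessibleName": "ViewBalance"}},
}

VARIATION = "web"  # the "master switch", set once per execution

def locate(control):
    # actions resolve controls through the currently selected variation
    return INTERFACE_VARIATIONS[VARIATION][control]

print(locate("view balance"))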


Possible set up of variations

linked variation

keyworded variation

Specify for example in a dialog when you start an execution:

Page 47: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Test Environments

Physical: hardware, infrastructure, location, ...

Software: programs, data models, protocols, ...

Data: initial data, parameters / tables, ...

Concerns: costs money, can be scarce, configurations, availability, manageability


Dealing with data

Constructed data is easier to manage: you can use automation to generate it, and to enter it in the environment

the result of test analysis and design, reflecting "interesting" situations

however, fewer "surprises": real-life situations which were not foreseen

Real-world data is challenging to organize: make it a project, or task, in itself

make absolutely sure to deal with privacy, security and legal aspects appropriately • study this, ask advice

• apply appropriate "scrubbing"

Consider using automation to select data for a test (see the sketch below): set criteria ("need a male older than 50, married, living in Denver"), query for matching cases, and select one randomly (if possible a different one each run)

this approach will introduce variation and unexpectedness, making

automated tests stronger and more interesting
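A sketch of criteria-driven data selection with Python's sqlite3 (the schema and rows are hypothetical): query for records that match the test criteria and let the database pick one at random, so each run can use a different but equally valid case.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clients (name TEXT, sex TEXT, age INT, marital_status TEXT, city TEXT)")
conn.executemany("INSERT INTO clients VALUES (?, ?, ?, ?, ?)", [
    ("Henry Smyth", "male", 62, "married", "Denver"),
    ("Tom Anderson", "male", 55, "married", "Denver"),
    ("Vivian Major", "female", 48, "single", "Boulder"),
])

row = conn.execute(
    "SELECT name FROM clients "
    "WHERE sex = 'male' AND age > 50 AND marital_status = 'married' AND city = 'Denver' "
    "ORDER BY RANDOM() LIMIT 1").fetchone()
print("selected test case:", row[0])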

Page 48: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Unattended testing...

When a test cannot pass, it can be: a difference between expected and recorded values or behavior, as a result

of a check designed by the tester: this is a fail

the automation encounters a problem, like a window or control doesn't show,

that is not part of a check: this is an error

An error can disrupt the test flow, and you may want to catch and

handle it properly: skip smaller or larger parts of the ongoing test

bring the system back in a known state (typically: close any open windows,

go to the main screen)

make sure the report clearly indicates these kinds of problems, to avoid false positives

example: an "on error" action that executes a predefined action that will do the recovery

However, it is better to avoid these situations: a lot of effort needed for unattended testing should raise questions about the test design or the quality of the automation ("thou shall not debug tests")
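A sketch of the fail/error distinction in Python (hypothetical helpers, not a specific tool's recovery feature): a designed check that does not match is reported as a fail, an automation problem is reported as an error, and after an error the run tries to return to a known state before continuing.

class AutomationError(Exception):
    pass           # e.g. a window or control did not show up

report = []

def recover_to_main_screen():
    pass           # placeholder: close any open windows, go back to the main screen

def run_test_module(module, test_cases):
    for test_case in test_cases:
        try:
            test_case()
            report.append((module, test_case.__name__, "passed"))
        except AssertionError as failure:     # a designed check did not match: fail
            report.append((module, test_case.__name__, f"FAIL: {failure}"))
        except AutomationError as problem:    # the automation could not proceed: error
            report.append((module, test_case.__name__, f"ERROR: {problem}"))
            recover_to_main_screen()

def tc_check_payment():
    assert 140.4 == 140.4

def tc_new_order_button():
    raise AutomationError("window 'new order' did not appear")

run_test_module("Car Rental Payments", [tc_check_payment, tc_new_order_button])
for line in report:
    print(*line)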


"Known bug" problem

Not uncommon in large scale systems: typically related to a version of the system under test

A known bug may: generate fails you want to ignore, also in statistics

throw off the automation

If many known-bug situations occur, take another look at your high level test design

One possible workaround is a "known bug" action (sketched below); other alternatives: conditionally ignore steps or single checkpoints

version

known bug 1.1

...

end known bug
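An illustrative Python analogue of such a "known bug" guard (our own sketch, not a specific tool feature): while the system-under-test version still carries the bug, failing checks inside the block are kept out of the pass/fail statistics.

from contextlib import contextmanager

AUT_VERSION = "1.1"   # hypothetical version of the system under test
ignored_checks = 0

@contextmanager
def known_bug(version):
    global ignored_checks
    if AUT_VERSION == version:
        try:
            yield
        except AssertionError:
            ignored_checks += 1   # the fail is expected: count it separately
    else:
        yield                     # bug should be fixed: check normally

with known_bug("1.1"):
    assert 2 + 2 == 5             # stands in for a check that hits the known bug

print("checks ignored due to known bugs:", ignored_checks)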

Page 49: The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More


Virtualization

Virtual machines rather than physical machines allow "guest" systems to operate on a "host" system

host can be Windows, Linux, etc, but also a specialized "hypervisor"

the hypervisor can be "hosted" or "bare metal"

Main providers: VMWare: ESX and ESXi

Microsoft: Hyper-V

Oracle/Sun: Virtual Box

Citrix: Xen (open source)

Hardware support is getting common now: processor, chipset, I/O

like Intel's i7/Xeon

For most testing purposes you need virtual clients, not virtual servers: most offerings in the market currently target virtual servers, particularly data centers

Virtual clients will become more mainstream with the coming of VMs as part of regular operating systems

Windows 8: Hyper-V

Linux: KVM

© 2013 LogiGear Corporation. All Rights Reserved

Virtualization, a tester's dream...

In particular for functional testing

Much easier to define and create needed configurations:
• you basically just need storage
• managing this is your next challenge

One stored configuration can be re-used over and over again

The VM can always start "fresh", in particular with fresh base data (either server or client), or in a specified state, for example to repeat a particular problematic automation situation

Can take "snapshots" of situations, for analysis of problems

Can use automation itself to select and start/stop suitable VMs (a sketch follows below):
• for example using actions for this
• or letting an overnight or continuous build take care of this
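As an example of that last point, a minimal sketch that restores a clean snapshot and starts/stops a VirtualBox guest from the automation; it assumes the VBoxManage command-line tool is installed and on the path, and the VM and snapshot names used here are made up.

import subprocess

def start_fresh_vm(vm_name, snapshot="clean-baseline"):
    # Restore the stored "fresh" state, then start the guest without a UI
    subprocess.run(["VBoxManage", "snapshot", vm_name, "restore", snapshot], check=True)
    subprocess.run(["VBoxManage", "startvm", vm_name, "--type", "headless"], check=True)

def stop_vm(vm_name):
    subprocess.run(["VBoxManage", "controlvm", vm_name, "poweroff"], check=True)

if __name__ == "__main__":
    start_fresh_vm("win7-aut-client")  # hypothetical VM name
    # ... run the test suite against the fresh client ...
    stop_vm("win7-aut-client")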


© 2013 LogiGear Corporation. All Rights Reserved

Virtualization, bad dream?

Performance, response times, capacities

Virtual machine latency can add timing problems (see next slide):
• can be derailing in big test runs

Management of images:
• images can be large, and difficult to store and move around
• there can be many, with numbers growing combinatorial style
• configuration in the VM can have an impact, like fixed/growing virtual disks
• distinguish between managed configurations and sandboxes
• define ownership, organize it
• IT may be the one giving out (running) VMs, restricting your flexibility

Managing running tests in virtual machines can take additional effort, on top of managing the VMs themselves:
• with the luxury of having VMs, the number of executing machines can increase rapidly
• one approach: let longer-running tests report their progress to a central monitoring service (various tools have features for this; a sketch follows below)
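A minimal sketch of such a progress heartbeat, assuming a hypothetical central monitoring service that accepts JSON posts; the URL and payload format are made up.

import json
import socket
import time
import urllib.request

MONITOR_URL = "http://monitor.example.local:8080/heartbeat"  # assumed service

def report_progress(run_id, current, total):
    """Post a small progress message so a central monitor can spot stalled runs."""
    payload = json.dumps({
        "run": run_id,
        "host": socket.gethostname(),
        "progress": f"{current}/{total}",
        "timestamp": time.time(),
    }).encode("utf-8")
    req = urllib.request.Request(MONITOR_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(req, timeout=5)
    except OSError:
        pass  # never let monitoring problems disrupt the test run itself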

© 2013 LogiGear Corporation. All Rights Reserved

Virtualization: "time is relative"

Consider this waiting-time loop, typical for a test script:
• endTime = currentTime + maxWait
• while endTime has not been reached, wait in 100 millisecond intervals

When the physical machine overloads, VMs can get slow or have drop-outs, and endTime may pass not due to AUT latency:
• GetLocalTime will suffer from the latency
• GetTickCount is probably better, but known for being unreliable on VMs

Therefore tests that run smoothly on physical machines may not consistently do so on VMs. The timing problems are not easy to predict.

Possible approaches:
• in general: be generous with maximum wait times if you can
• don't put too many virtual machines on a physical box
• consider a compensation algorithm, for example using both tick count and clock time (a sketch follows below)
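A minimal sketch of such a compensation, combining wall-clock time with a monotonic tick count and only timing out when both agree the budget is spent; this is a heuristic to reduce premature timeouts under VM latency, not a guarantee.

import time

def wait_until(condition, max_wait_s=60.0, poll_s=0.1):
    """Poll condition() every poll_s seconds for up to max_wait_s seconds."""
    start_wall = time.time()        # wall clock (can jump on a loaded VM)
    start_tick = time.monotonic()   # monotonic tick count
    while True:
        if condition():
            return True
        wall_elapsed = time.time() - start_wall
        tick_elapsed = time.monotonic() - start_tick
        if wall_elapsed > max_wait_s and tick_elapsed > max_wait_s:
            return False  # both clocks agree: treat it as a genuine timeout
        time.sleep(poll_s)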


© 2013 LogiGear Corporation. All Rights Reserved

Virtual machines, capacity

Key to pricing is the number of VMs that can run in parallel on a physical machine

An automated test execution will typically keep a VM busier than human use does

Factors in determining the VM/PM ratio (a rough estimate sketch follows below):
• memory, for guest OS, AUT, test tooling
• storage devices (physical devices, not disk images)
• processors, processor cores
• specific hardware support (becoming more common): processor, chipset, I/O

Example report: "We started regression with 140 VMs. Very slow performance of Citrix VM clients."
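A very rough, memory-only way to estimate the VM/PM ratio; all numbers below are illustrative, and cores, storage and I/O can lower the real figure considerably.

def vms_per_box(host_ram_gb, host_reserved_gb, vm_ram_gb):
    """Memory-only upper bound on how many test VMs fit on one physical box."""
    return int((host_ram_gb - host_reserved_gb) // vm_ram_gb)

# e.g. a 64 GB server, 8 GB reserved for the hypervisor, 4 GB per guest (OS + AUT + tooling)
print(vms_per_box(64, 8, 4))  # -> 14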

© 2013 LogiGear Corporation. All Rights Reserved

Building up virtualization

Pay attention to pricing:
• beefed-up hardware can increase the VMs-per-box ratio, but at a price
• software can be expensive depending on features, which you may not need

In a large organization, virtual machines are probably already available:
• make sure to allocate them in time (which can be long before you get there with your sprints)
• keep in mind the capacity requirements

Logical and physical management:
• which images: the wealth of possible images can quickly make it hard to see the forest through the trees
• physical management of infrastructure is beyond this tutorial

Minimum requirement: snapshots/images
• freeware versions don't always carry this feature
• they allow you to set up: OS, environment, AUT, tooling, but also: data, states


© 2013 LogiGear Corporation. All Rights Reserved

Infrastructure

For large-scale test execution this needs attention: the physical infrastructure, but also how to use it

Also consider managing infrastructure and test execution as a separate task, in or out of the team:
• avoid slowing down development (of system, tests and/or automation)

© 2013 LogiGear Corporation. All Rights Reserved

Remote execution, servers

Allowing execution separately from the machines the testers and automation engineers are working on increases scalability

Large-scale test execution, in particular with VMs, likes to have:
• lots of processing power, lots of cores
• lots of memory

Test execution tends to care less about:
• storage
• networking

Test execution facilities tend to become a bottleneck very quickly in big testing projects; the teams can use whatever they can get:
• first step up: give team members a second machine
• second step up: use servers; users coordinate their use of them
• third step up: major infrastructures with organized allocation


© 2013 LogiGear Corporation. All Rights Reserved

Tower Servers

Smaller shops (smaller companies, departments)

Affordable, simple, first step up from client execution

Not very scalable when the projects get larger

© 2013 LogiGear Corporation. All Rights Reserved

Rack Servers

Scale well

Pricing not unlike tower servers

Tend to need more mature IT expertise


© 2013 LogiGear Corporation. All Rights Reserved

Server Blades

Big league infrastructure, high density, very scalable

Tend to be pricey; use when space and energy matter

Usually out of sight for you and your team

© 2013 LogiGear Corporation. All Rights Reserved

Cloud

The cloud can be the target of testing:
• normal tests, plus cloud-specific tests (functional, load, response times)
• from multiple locations
• moving production through data centers

The cloud can be the host of test execution:
• considerations can be economical or organizational
• providers offer imaging facilities, similar to virtual machines
• make sure machines are rented and returned efficiently

Public cloud providers like EC2 offer APIs, so your automation can automatically allocate and release machines:
• be careful, software bugs can have cost consequences
• for example, consider having a second automation process double-check that cloud machines have been released after a set time (a sketch follows below)

Note: the public cloud is not taking off as fast as expected; cloud services and private clouds are taking off much faster
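A minimal sketch of such a watchdog using boto3, assuming AWS credentials are configured and that test machines were launched with a Purpose=big-testing tag; the tag name and the six-hour cutoff are made-up examples.

from datetime import datetime, timedelta, timezone
import boto3

MAX_AGE = timedelta(hours=6)  # assumed upper bound for any test run

def release_stale_test_instances():
    """Terminate tagged test instances that have been running longer than MAX_AGE."""
    ec2 = boto3.client("ec2")
    resp = ec2.describe_instances(Filters=[
        {"Name": "tag:Purpose", "Values": ["big-testing"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ])
    now = datetime.now(timezone.utc)
    stale = [inst["InstanceId"]
             for res in resp["Reservations"]
             for inst in res["Instances"]
             if now - inst["LaunchTime"] > MAX_AGE]
    if stale:
        ec2.terminate_instances(InstanceIds=stale)  # stop the hourly charges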



© 2013 LogiGear Corporation. All Rights Reserved

Cloud Providers

Source: Jack of All Clouds, January 2011

http://www.jackofallclouds.com/2011/01/state-of-the-cloud-january-201/

© 2013 LogiGear Corporation. All Rights Reserved

Cloud growth

Growth of public clouds not as big as expected

Cost benefits not necessarily convincing: low startup cost, but ongoing long-term cost

See also: news.cnet.com/8301-13556_3-20063361-61.html

source: IDC forecast, 2010


© 2013 LogiGear Corporation. All Rights Reserved

Cloud, example pricing, hourly rates

Source: Amazon EC2 (my interpretation, actual prices may vary)

Instance type            Linux ($/hour)  Windows ($/hour)  Specs
Small                    0.085           0.12              1.7 GB, 1 core (32 bits)
Large                    0.34            0.48              7.5 GB, 4 cores
Extra Large              0.68            0.96              15 GB, 8 cores
High memory:
  Extra Large            0.50            0.62              17.1 GB, 6.5 cores
  Double Extra Large     1.00            1.24              34.2 GB, 13 cores
  Quadruple Extra Large  2.00            2.48              68.4 GB, 26 cores
High CPU:
  Medium                 0.17            0.29              1.7 GB, 5 cores (32 bits)
  Extra Large            0.68            1.16              7 GB, 20 cores

© 2013 LogiGear Corporation. All Rights Reserved

Cloud, example economy

Not counting possible use of VMs within the buy option

Also not counting additional cost-of-ownership elements for owning or for cloud (like IT management, contract and usage management)

Impressions:
• cloud could fit well for bursty testing needs, which is often the case
• for full continuous, or very frequent, testing: consider buying
• hybrid models may fit many big-testing situations: own a base capacity, rent more during peak-use periods

                       small    large    extra
Windows ($/hour)       $0.12    $0.48    $0.96
buy (estimate)         $300     $650     $900
hours to break even    2,500    1,354    938
months (24/7)          3.4      1.8      1.3
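The break-even arithmetic behind the table is simply the buy price divided by the hourly rate; the small sketch below reproduces the figures, assuming a 744-hour (31-day) month, which matches the table's month numbers.

def break_even(buy_price, hourly_rate, hours_per_month=744):
    """Return (hours to break even, months of 24/7 use to break even)."""
    hours = buy_price / hourly_rate
    return hours, hours / hours_per_month

for name, rate, buy in [("small", 0.12, 300), ("large", 0.48, 650), ("extra", 0.96, 900)]:
    hours, months = break_even(buy, rate)
    print(f"{name}: ~{hours:,.0f} hours to break even, ~{months:.1f} months of 24/7 use")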


© 2013 LogiGear Corporation. All Rights Reserved

Data centers can go down

However, disruption could have been minimized by using multiple data centers

© 2013 LogiGear Corporation. All Rights Reserved

Data centers can go down

This time, it did involve multiple data centers . . .


© 2013 LogiGear Corporation. All Rights Reserved

Data centers can go down

Service providers can occasionally go down too

© 2013 LogiGear Corporation. All Rights Reserved

Cloud, usage for special testing needs

Multi-region testing:
• Amazon, for example, has several regions
  • US East: Northern Virginia
  • US West: Oregon, Northern California
  • EU: Ireland
  • Asia Pacific: Singapore, Tokyo
  • South America: Sao Paulo
• be careful: data transfers between regions cost money ($0.01/GB)

Load generation example: "JMeter In The Cloud"
• based on the JMeter load test tool
• uses Amazon AMIs for the slave machines
• allows you to distribute the AMIs over the different Amazon regions
• see more here: aws.amazon.com/amis/jmeter-in-the-cloud-a-cloud-based-load-testing-environment


© 2013 LogiGear Corporation. All Rights Reserved

Questions for Infrastructure

What kind of infrastructure does your organization use for testing?

What is the role of virtualization, now or in the future?

Are you using a private or a public cloud for testing?

© 2013 LogiGear Corporation. All Rights Reserved

Example of a cloud system under test

source: Windows Azure reference platform


© 2013 LogiGear Corporation. All Rights Reserved

Approaches

Automation does not have to be black box:
• for very big systems, a separate black-box automation effort may not be efficient
• and building and keeping lab situations might be cumbersome
• some simple hooks can already help greatly
• remember... this is about automation, not test design

Make testability part of requirements and architecture:
• a key question should not just be "how do I design this", but also "how do I test this" (test design, automation)
• some cloud/web systems are changed frequently, and tested "live": "Testing in Production (TiP)"
• allow redirection of some or all traffic through another version of a component or layer
• example: reverse proxies enabling A/B testing
• see also: Ken Johnston's chapter in the book by Dorothy Graham and Mark Fewster, and his keynote at StarWest 2012

© 2013 LogiGear Corporation. All Rights Reserved

A/B testing with a reverse proxy

Watch your test design; it is easy to drown in technical solutions alone

The "B" traffic could come from real-life users or from a keyword-driven test machine

A/B testing means part of the traffic is routed through a different server or component (to see if it works, and/or how users react); a sketch of the routing decision follows below

A similar strategy can be applied at any component level

[Diagram: users connect through a reverse proxy to the servers; most traffic goes to the current version (A), part of it to the new version (B).]
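A minimal sketch of the routing decision a reverse proxy (or a test driver) could make: hash a stable user key so each user consistently lands on A or B, with a configurable share going to the new version. The backend URLs are hypothetical.

import hashlib

BACKEND_A = "http://app-current.internal"   # hypothetical current servers (A)
BACKEND_B = "http://app-new.internal"       # hypothetical new version (B)

def choose_backend(user_id: str, percent_to_b: int = 10) -> str:
    """Deterministically route ~percent_to_b % of users to the new version."""
    bucket = int(hashlib.sha1(user_id.encode("utf-8")).hexdigest(), 16) % 100
    return BACKEND_B if bucket < percent_to_b else BACKEND_A

A real reverse proxy would then forward the request to the chosen backend; hashing a stable key (rather than picking randomly per request) keeps each user's experience consistent during the experiment.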


© 2013 LogiGear Corporation. All Rights Reserved

Organization

Much of the success is gained or lost in how you organize the process:
• the make-up of the teams
• who does test design
• who does automation
• what to outsource, what to keep in-house

Write a plan of approach for the test development and automation:
• scope, assumptions, risks, planning
• methods, best practices
• tools, technologies, architecture
• stakeholders, including roles and processes for input and approvals
• team
• . . .

Assemble the right resources:
• testers, lead testers
• automation engineer(s)
• managers, ambassadors, ...

Test design is a skill . . . Automation is a skill . . . Management is a skill . . .
. . . and those skills are different . . .

© 2013 LogiGear Corporation. All Rights Reserved

Industrial Organization

Large-scale testing can move from a "design" to a "production" focus:
• this mostly applies to test execution, but is also seen for test development
• it is not black and white; both paradigms can occur in the same project
• this is often easier to outsource than development

A production organization is different from a development organization:
• this is not unique to software
• different professional culture
• emphasis more on delivery and scale, "thinking big"
• discipline rather than creativity, "get stuff done"
• activities are like planning, control, logistics, information


© 2013 LogiGear Corporation. All Rights Reserved

Tasks in "production" (test execution)

Keeping the tests running

Allocating resources

Responding to hiccups

Analyzing and addressing automation issues

Addressing fails or other testing outcomes, including dealing with "known bugs", as part of a bigger team

© 2013 LogiGear Corporation. All Rights Reserved

Stakeholders

Internal and external stakeholders, examples: Test Development, Test Automation, Technology/Infrastructure, Production, Marketing/Sales, System Development, End User Departments, Quality Assurance, Management, After Sales/Help Desk, Customers, Vendors, Government Agencies, Publicity


© 2013 LogiGear Corporation. All Rights Reserved

Team roles, examples

Test development

Automation

Planning and managing the test runs

Managing environments

Managing infrastructure

Dealing with stakeholders

Analysis of results, and follow up

Reporting

© 2013 LogiGear Corporation. All Rights Reserved

Test Development and Automation in sprints

[Diagram: test development and automation in an agile life cycle. From the product backlog (user stories, documentation, domain understanding, acceptance criteria, PO questions, situations, relations) through test module definition (optional), test module development, interface definition, action automation, and test execution, involving the product owner and the team. Sprint products include main-level, interaction, and cross-over test modules, with test re-use and automation re-use.]


© 2013 LogiGear Corporation. All Rights Reserved

Test automation in sprints

Try to keep the main test modules at a similar level as the user stories and acceptance criteria

Aim for "sprint + zero", meaning: try to get test development and automation "done" in the same sprint, not the next one:
• the next sprint means work piles up, part of the team is not working on the same sprint, and work is done double (manually and automated), ...

Make sure you can do the interface mapping by hand (using developer-provided identifications):
• you can do it earlier, before the UI is finalized, and recording of actions (not tests) will go better

Also plan for additional test modules:
• low-level testing of the interaction with the system under test (like UIs)
• crossing over to other parts of the system under test

There should be agreement on the method(s) for testing and automation

The team should include the skills and experience needed for automated testing and the approach(es) taken for it

© 2013 LogiGear Corporation. All Rights Reserved

Fitting in sprints

Agree on the approach, with questions like:
• does "done" include tests developed and automated?
• do we see testing and automation as distinguishable tasks and skillsets?
• is testability a requirement for the software?

Create good starting conditions for a sprint:
• automation technology available (like hooks, calling functions, etc.)
• how to deal with data and environments
• understanding of subject matter, testing, automation, etc.

Make testing and automation part of the evaluations

Address tests and automation also in hardening sprints

Just like for development, use discussions with the team and product owners to deepen understanding:
• also to help identify negative, alternate and unexpected situations


© 2013 LogiGear Corporation. All Rights Reserved

Testing as a profession

"Do thorough acceptance testing, but not only by the

customer" source: "Agile Software Testing in a Large-Scale Project", Israeli Air

Force

Focus on tests, not development: what can be consequences of situations and events

relieve developers

Knowledge and experience with testing techniques and

principles

The challenge for the tester in the new era is to become a

more credible professional tester, not a pseudo programmer

part of the team

Forcing a nontechnical tester to become a programmer

may lose a good tester and gain a poor programmer

© 2013 LogiGear Corporation. All Rights Reserved

Automation is a profession too

Overlaps with regular system development, but is not the same

Less concerned with complex code structures or algorithms

More concerned with navigating through other software efficiently, dealing with control classes, obtaining information, timing, etc.:
• if you would compare developers to "creators", automation engineers might be likened to "adventurers"...

The automation engineering role can also be a consultant:
• for test developers: help express tests efficiently
• for system developers: how to make a system more automation-friendly
• an important player in innovation in automated testing


© 2013 LogiGear Corporation. All Rights Reserved

Questions for Organization

How is your testing currently organized (who is doing what)?
• test design
• test development
• automation
• execution
• assessment of release readiness

Do you use agile? If yes, is there a role for a test professional? And for an automation professional?

© 2013 LogiGear Corporation. All Rights Reserved

Reporting

Aim at needs:
• avoid lengthy automated reports, have bottom-line numbers
• reports for stakeholders
• reporting for the team

Reporting for a big testing project is about:
• test and automation progress
• production (running the tests)
• results (aimed at the system under test)

Teams need (relevant) details:
• what happened, reproducibility, ...
• about either the tests, the automation, or the system under test
• overall situations, with an ability to "drill down" to problem areas

Management needs:
• status, expectations, issues (be realistic! bad news matters, you get punished for not telling)
• bottom lines, plan-versus-reality confrontation
• dates, efforts, used resources, costs, run times, ...
• never allow planned numbers or dates to be "updated"

Also for reporting, test organization is a key driver


© 2013 LogiGear Corporation. All Rights Reserved

War rooms

Helpful if response times are critical and there is a need for cooperation towards the same goal:
• similar grounds as for agile scrum rooms

Set up at critical times, like before important deadlines, or during critical releases

Can temporarily bring together multiple parties that normally are not co-workers, like competing vendors

Pay attention to physical conditions:
• machines, monitors, white boards, meeting places, headsets, ...
• food, drinks, ...

The test execution cycle should match the needs of the war room approach:
• fast turnarounds
• effortless
• completeness
• selective or integral

See also: "Your Game is Live, Now What?", Jane Fraser, Electronic Arts

© 2013 LogiGear Corporation. All Rights Reserved

Globalization....


© 2013 LogiGear Corporation. All Rights Reserved

Main Challenges

Other countries

Distances

Time differences

© 2013 LogiGear Corporation. All Rights Reserved

Globalization

Three challenges:
• other countries, other cultures
• geographic distances
• time differences

Seven "Patterns": "Solution"

"Push Back"

"Time Pressure"

"Surprises"

"Ownership"

"Mythical Man Month"

"Cooperation"


© 2013 LogiGear Corporation. All Rights Reserved

Challenge: Other Country

© 2013 LogiGear Corporation. All Rights Reserved

Other Country

Differences in culture (more on the next slide...)

Different languages, and accents

Differences in education style, orientation and contents:
• the position of critical thinking, factual knowledge, practice, theory, ...
• US, British, French, Asian, ...

Differences in circumstances:
• demographics
• economy, infrastructure
• politics

Apprehension on-shore and off-shore about job security doesn't help in projects; it is a management responsibility:
• understand your strategic intentions, and their consequences, and clarify them
• be realistic in cost and benefit expectations


© 2013 LogiGear Corporation. All Rights Reserved

More on Culture...

Regional culture. There are numerous factors:
• very difficult to make general statements
  • many anecdotes, stories and perceptions; some are very helpful, some have limited general value
  • not sure of the impact of regional culture (see also [Al-Ani])
• numerous factors, like history, religion, political system
  • e.g. the valuing of: critical thinking, theory, bottom line, relations, status, work ethic, bad news, saying 'no'
  • entertaining guests, eating habits, alcohol, meat, humor, etc.
  • position of leaders, position of women managers
  • mistakes can be benign and funny, but also damaging, visibly or hidden; in particular, perceived disrespect hurts

Organizational culture:
• can be different from country to country, sector to sector, company to company, group to group
• I feel this to be at least as strong as regional culture (see for example [Al-Ani])
• you can have at least some control over this

Professional cultures:
• for example engineers, QA, managers, ...

Some ideas to help:
• get to know each other (it helps, see for example [Gotel])
• study the matter, and make adaptations

© 2013 LogiGear Corporation. All Rights Reserved


© 2013 LogiGear Corporation. All Rights Reserved

© 2013 LogiGear Corporation. All Rights Reserved


© 2013 LogiGear Corporation. All Rights Reserved

© 2013 LogiGear Corporation. All Rights Reserved

Challenge: Distance


© 2013 LogiGear Corporation. All Rights Reserved

Distance

Continuous logistical challenges

Lots of cost, and disruption, from traveling

Distance creates distrust and conflict:
• this could be "normal" behavior, inherent to humans

Complex coordination can create misunderstandings:
• on technical topics
• on actions, priorities, and intentions

© 2013 LogiGear Corporation. All Rights Reserved

Challenge: Time difference


© 2013 LogiGear Corporation. All Rights Reserved

Challenge: Time difference

An additional complication for communication and coordination

Places a major burden on both on-shore and off-shore staff having to work evenings and/or early mornings:
• potential for exhaustion, lack of relaxation, mistakes, irritation

Can easily lead to loss of time at critical moments

Some solutions:
• manage this actively
• constantly seek to optimize task and responsibility allocation
• build the on-shore and off-shore organizations to match
• seek ways to save meeting time, like optimal information handling

© 2013 LogiGear Corporation. All Rights Reserved

Effect of time difference

Report from the team to the US management . . .

Performance comparison TestArchitect 5 and 6, Test Module: "Segment Y, Default Settings"

                   Windows      Linux
TestArchitect 5    ~ 4:16 m     ~ 4:28 m
TestArchitect 6    ~ 11:00 m    ~ 8:00 m


© 2013 LogiGear Corporation. All Rights Reserved

Patterns

Experiences seem to follow patterns:
• at least our own experiences do
• variations are numerous, but seem to follow similar lines
• the following are examples, not an exhaustive list

It can help to recognize patterns quickly, and act upon them

Resolutions have side-effects and can introduce new issues:
• for example, strengthening local management means less direct contact with the project members doing the work

Just about every pattern occurs in every direction:
• from your perspective regarding "them"
• their perspective on you, or on each other
• sometimes equaling, sometimes mirroring

© 2013 LogiGear Corporation. All Rights Reserved

Pattern: "The Solution"

Typical sequence of events:
• the team finds a problem in running a test
• the team discusses it and comes up with a "solution"
• the solution: (1) creates issues, and (2) hides the real problem

Better way:
• define it as an issue
• discuss it with the project manager and customer


© 2013 LogiGear Corporation. All Rights Reserved

Pattern: "Push Back"

US side, or customer, gives bad direction

Team doesn't like it, but feels obliged to follow orders

The result is disappointing

Team is blamed and will speak up even less next time

Better way:
• discuss with the principal/customer at multiple levels: strategic about direction, operational about day-to-day
• empower and encourage the team to speak up
• write plans of approach, and reports

© 2013 LogiGear Corporation. All Rights Reserved

Pattern: "Time Pressure"

The deadline must be met no matter what:
• use overtime
• "failure is not an option"

Deadlines are sometimes real, sometimes not:
• they can become a routine on the US side
• it is easy to apply pressure over email
• it is very difficult for a non-empowered team to push back
• risk: inflation of urgency

Better way: good planning

proper weighing of deadlines and priorities

frequent reporting

local management


© 2013 LogiGear Corporation. All Rights Reserved

Pattern: "Surprises"

Good news travels better than bad news...
• it should be the other way around
• the "cover up": "let's fix it, no need to tell..."
• over time: needing bigger cover-ups to conceal smaller ones
• not unique to off-shoring, but more difficult to detect and deal with

Once a surprise happens:
• you will feel frustrated, and betrayed
• fix the problems, point out the consequences of hiding, avoid screaming and flaming

Better ways:
• agree: NO SURPRISES!!
• emphasize again and again
• train against this
• continuously manage, point it out
• the magic word: transparency


© 2013 LogiGear Corporation. All Rights Reserved

Pattern: "Ownership"

Shared responsibility is no responsibility

Effort-based versus result-based

On-shore players feel the off-shore team has a result responsibility

Off-shore team members feel an effort-based responsibility ("work hard")

Better way:
• clear responsibilities and expectations
• on-shore ownership of quality control of the system under test, and therefore of the tests
• off-shore ownership of producing good tests and good automation
• empower according to ownership


© 2013 LogiGear Corporation. All Rights Reserved

Pattern: "Mythical Man Month"

Fred Brooks' classic book, "The Mythical Man-Month":
• "Assigning more programmers to a project running behind schedule will make it even later"
• "The bearing of a child takes nine months, no matter how many women are assigned"

In test automation, there must be clear ownership of:
• test design (not just cranking out test cases)
• automation; this is a different skill and interest

Assign at least the following roles:
• project lead: owns quality and schedule
• test lead: owns test design, coaches and coordinates the other testers
• automation: makes the actions work (assuming ABT; not the test cases)

Define distinct career paths in: testing, automation, management

© 2013 LogiGear Corporation. All Rights Reserved

Pattern: "Cooperation"

Communication is tedious and takes a long time

Questions, questions, questions, ...
• and the reverse: questions don't get answered

For at least one side this happens in private time, which is extra annoying

Misunderstandings, confusion, actions not followed up:
• double-check apparent "crazy things" with the team before jumping to conclusions, and actions (assume the other side is not "nuts" or "dumb"...)

Please understand: distance fosters conflicts; we're born that way, be ready for it

Better ways:
• prioritize training, coaching, preparation and planning; this saves a lot of questions...
• write stuff down, use briefs, minutes
• define workflows and information flows: buckets, reporting; select and use good tools
• specialize meetings: table things for in-depth meetings, ask to meet internally first
• be quick, no more than 30 minutes


© 2013 LogiGear Corporation. All Rights Reserved

Cooperation

© 2013 LogiGear Corporation. All Rights Reserved

Training, some ideas

Many areas, big pay-offs: system under test

subject matter under test, domain knowledge

methods, best practices

technologies, tools, ...

processes

soft skills, like creativity, critical thinking, management, ...

language

cross-cultural

Have exams:
• think about the consequences of passing and failing
• teams pay more attention when they know they will get tested
• you will know whether you were understood

Also have coaching and train-the-trainers:
• more experienced people help the newbies
• this also runs a risk: bad habits can creep in and procreate

"Tribal knowledge", learning by osmosis, water-cooler conversations: encourage it
• consider "special interest groups" (SIGs)

Rule of thumb for off-shore teams: hire for technical knowledge, train for business knowledge

The on-shore staff needs training and coaching too


© 2013 LogiGear Corporation. All Rights Reserved

Additional ideas

Go there, be with the team, experience yourself how "your side" is doing:
• I go about twice per year

Manage ownership, is it you or them:
• the distinction between efforts and results

Provide clear direction, constant attention and coaching

Supervise, supervise, supervise:
• but don't micromanage if the other side has ownership

Ask them to create example products (like ABT test modules and actions), review them carefully, and use them as direction for subsequent work

Leadership style: participative styles seem most common (as opposed to consensus or authoritative, see also [Al-Ani])

Organize informal/fun events, provide a good environment:
• solidify the group, improve retention
• include visiting US staff, this tends to do a lot of good ("priceless")

Manage expectations:
• stuff takes time and energy
• differences can be addressed, but it's not 100% a piece of cake...

© 2013 LogiGear Corporation. All Rights Reserved

Outsourcing and Agile

If done well, outsourcing can provide relief for a lot of the patterns

Several models are possible:
• Model 1: full team outsourcing: development, testing and automation
• Model 2: "2nd unit": the off-shore team works under control of sprint team members
• Model 3: part of an integrated team:
  • needs an online tool like Jira or Rally
  • you must have shared meetings
  • advantage: more project time

Large-scale test development and automation might be easier to outsource than development


© 2013 LogiGear Corporation. All Rights Reserved

Summary

Not all "big project" challenges are the same

Think before you do. Best results come from planning well, and combining effective concepts, tricks and tools

Consider tests and automation as products

Team work is a key for short term and long term success

There are many options for infrastructure, but keep an eye on economy and planning

Off-shoring can help scale up, but needs attention to do it right, in particular communication

Repeat of initial invitation

Focus today was on overview and concepts, not always on details. Please see me in person for any discussion you would like on your situation that I didn't cover. We're also exhibiting here, probably easiest to reach me there.

© 2013 LogiGear Corporation. All Rights Reserved

Homework . . .

1. Testing Computer Software, Cem Kaner, Hung Nguyen, Jack Falk, Wiley

2. Lessons Learned in Software Testing, Cem Kaner, James Bach, Bret Pettichord, Wiley

3. Experiences of Test Automation, Dorothy Graham, Mark Fewster, Addison Wesley, 2012

4. Automating Software Testing, Dorothy Graham, Mark Fewster, Addison Wesley

5. "Build a Successful Global Training Program", Michael Hackett, www.logigear.com

6. Action Based Testing (overview article), Hans Buwalda, Better Software, March 2011

7. Action Figures (on model-based testing), Hans Buwalda, Better Software, March 2003

8. Integrated Test Design & Automation, Hans Buwalda, Dennis Janssen and Iris Pinkster, Addison Wesley

9. Soap Opera Testing (article), Hans Buwalda, Better Software Magazine, February 2005

10. Testing with Action Words, Abandoning Record and Playback, Hans Buwalda, Eurostar 1996

11. QA All Stars, Building Your Dream Team, Hans Buwalda, Better Software, September 2006

12. The 5% Solutions, Hans Buwalda, Software Test & Performance Magazine, September 2006

13. Happy About Global Software Test Automation, Hung Nguyen, Michael Hackett, e.a., Happy About

14. Testing Applications on the Web, Hung Nguyen, Robert Johnson, Michael Hackett, Wiley

15. Practical Combinatorial Testing, Richard Kuhn, Raghu Kacker, Yu Lei, NIST, October, 2010

16. Agile Software Testing in a Large-Scale Project, David Talby, Arie Keren, Orit Hazzan, Yael Dubinsky, IEEE Software, July/August 2006

17. JMeter in the Cloud, Jörg Kalsbach, http://aws.amazon.com/amis/2924

18. Using Monkey Test Tools, Noel Nyman, STQE issue January/February 2000

19. High Volume Test Automation, Cem Kaner, Walter P. Bond, Pat McGee, StarEast 2004

20. Descriptive Analysis of Fear and Distrust in Early Phases of GSD Projects, Arttu Piri, Tuomas Niinimäki, Casper Lassenius, 2009 Fourth IEEE International Conference on Global Software Engineering [Piri]

21. Quality Indicators on Global Software Development Projects: Does 'Getting to Know You' Really Matter? Olly Gotel, Vidya Kulkarni, Moniphal Say, Christelle Scharff, Thanwadee Sunetnanta, 2009 Fourth IEEE International Conference on Global Software Engineering [Gotel]


© 2013 LogiGear Corporation. All Rights Reserved

Thanks...

Please fill out the evaluation form, from the back of the book

Let me know any questions or concerns:

We're at the expo, questions welcome; I will be there myself quite a bit too

TESTING FOR SALE

Supersize your tests for less...

email: hans @ logigear.com

articles: www.happytester.com

company: www.logigear.com

TestArchitect: www.testarchitect.com (we're at the expo)