
Source: read.pudn.com/downloads163/doc/743445/ET_v2-1_course_book.pdf

Exploratory Testing

v 2.1. Copyright (c) 2006-2007 Quality Tree Software, Inc.

A course brought to you by Quality Tree Software, Inc.

Instructor: Elisabeth Hendrickson ([email protected])

Course Contents

1. Getting Started
2. Charters and Sessions
3. Analyzing Variables
4. Modeling Behavior
5. Modeling the System
6. Nouns and Verbs
7. User Perspectives
8. Chartering and Sessions Revisited
Wrap Up

Additional Reference Materials

“Rigorous Exploratory Testing”
“It’s All about the Variables”
“Nouns and Verbs”
Acknowledgements
Bibliography


Before We Get Started... Introductions & Logistics

Course Objectives:

• Learn how Exploratory Testing fits in with other kinds of testing
• Understand how Exploratory Testing contributes to the discovery of risk and verification of value
• Become more adept at observing what the software or system is doing
• Discover how identifying Variables can lead to more interesting explorations
• Learn techniques and heuristics for testing based on states, environment, sequences, and personae
• Learn how to use Charters and Sessions to manage and control your Exploratory Testing

Copyright Notice

These course notes are copyrighted. Please do not make unauthorized copies of these materials for your manager, coworkers, or friends. Send them to class to get their own copy.


Introductions

Elisabeth Hendrickson ([email protected])

Elisabeth began working in the software industry in 1984. She has held positions as a Tester, Programmer, Test Automation Manager, Quality Engineering Director, and Technical Writer working for companies ranging from a 20-person startup to a large multi-national software vendor.

Elisabeth is an experienced facilitator and trainer. A student of Jerry Weinberg's, Elisabeth is a graduate of the Weinberg & Weinberg PSL, ChangeShop, and SEM programs. She also studied Experiential Training Design with Jerry and his wife Dani.

Elisabeth is frequently invited to speak at conferences around the world. She has given keynote addresses at conferences in the US, Sweden, Portugal, Australia, and New Zealand.

In 2003, Elisabeth became involved with the Agile community. In 2005 she became a Certified Scrum Master and in 2006 she joined the board of directors for the Agile Alliance. These days Elisabeth splits her time between teaching, speaking, writing, and working on Extreme Programming teams with test-infected programmers who value her obsession with testing.


Agenda & Schedule

Day 1

! Introductions
1. Getting Started
2. Charters and Sessions
3. Analyzing Variables
4. Modeling Behavior
! End of Day Survey

Day 2

! Warm up and Day 1 Survey Results
5. Modeling the System
6. Nouns and Verbs
7. User Perspectives
8. Chartering and Sessions Revisited
! Wrap Up

Setting the Schedule

We’ll fill in the blanks:

_________ Start

_________ Morning Break(s)

_________ Lunch

_________ Afternoon Break(s)

_________ End


1. Getting Started

Why Exploratory Testing?

Pitfalls and Strategies in ET

Definitions


Famous Explorers

John Tukey, Statistician
Richard Feynman, Nobel Prize-winning Physicist
Lewis & Clark, Mapped the Northwest US

Explorers Use New Approaches

Feynman advanced physics because he was able to visualize and articulate complex concepts in new ways. He demonstrated this for the general public when he performed the now-famous ice water test at a press conference after the Challenger space shuttle disaster.

They Have Charters

Lewis & Clark's charter as given to them by President Thomas Jefferson: “The Object of your mission is to explore the Missouri river & such principal stream of it as by it's course and communication with the waters of the Pacific ocean, whether the Columbia, Oregon, Colorado or any other river may offer the most direct & practicable water communication across this continent for the purpose of commerce.” See http://www.lewis-clark.org/content/content-article.asp?ArticleID=1047

And Attitude

“If we need a short suggestion of what exploratory data analysis is, I would suggest that it is an attitude AND a flexibility AND some graph paper (or transparencies, or both).” –John Tukey in American Statistician, quoted on http://www-groups.dcs.st-and.ac.uk/~history/Quotations/Tukey.html


What Is Testing and Why Do It?

Testing is a process of gathering information by making observations and comparing them to expectations.

Decision makers use this information to make decisions about projects.

We have a responsibility to provide those decision makers with the best information we can.

Definitions of Testing

There are almost as many definitions of software testing as there are books on the topic. In his 1979 book The Art of Software Testing, Glenford Myers said, “Testing is the process of executing a program with the intent of finding errors.” Kaner, Falk and Nguyen agreed in their 1999 book, Testing Computer Software (2nd Ed): “The purpose of testing a program is to find problems in it.” In his 1983 book The Complete Guide to Software Testing, Bill Hetzel said, “Testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results.” The 1990 IEEE Standard Glossary of Software Engineering Terminology (Std 610.12-1990) says testing is, “The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component.”

Whatever definition of testing you subscribe to, the result of testing is information: information about bugs, discrepancies, misunderstandings, current status, and software behavior under a variety of conditions.

What’s a Test?

There are numerous definitions including the one in the IEEE glossary (Standard 610.12) that defines “test” as “An activity in which a system or component is executed under specified conditions, the results are observed or recorded, and an evaluation is made of some aspect of the system or component.” However, we find a slight reframe helpful:

A test is an experiment designed to reveal information, or answer a specific question, about the software or system.


Definition: Exploratory Testing

Exploratory Testing is a style of testing in which you learn about the software while simultaneously designing and executing tests, using feedback from the last test to inform the next.

Because Exploratory Testing allows for feedback, it usually gives us additional information that strictly scripted testing cannot.

A Short History of Exploratory Testing

Cem Kaner coined the term “Exploratory Testing” in his book Testing Computer Software, although the practice of Exploratory Testing certainly predates the book. Since the book’s publication two decades ago, Cem Kaner, James Bach, and a group of others (including Elisabeth Hendrickson and James Lyndsay) have worked to articulate just what Exploratory Testing is and how to do it.

Exploratory Testing Can Be Rigorous

Two key things distinguish good Exploratory Testing as a disciplined form of testing:

• Using a wide variety of analysis/testing techniques to target vulnerabilities from multiple perspectives.
• Using charters to focus effort on those vulnerabilities that are of most interest to stakeholders.


Scripted v. Exploratory

[Diagram: Scripted testing runs in a straight line from Start through Test 1, Test 2, Test 3, Test 4, and Test 5 to End. Exploratory testing moves from Start to End through Session 1 through Session 5.]

Why Not Just Do All Scripted or All Exploratory Testing?

Scripted testing involves using a step-by-step script for executing tests, where the script has been prepared in advance based on a careful analysis of the specifications and/or requirements. It has these advantages:

• Methodical, repeatable process.
• Comprehensive scripts can support process scalability by giving new people an easy-to-follow recipe for testing.

However, it has disadvantages as well:

• It involves significant up-front investment in artifacts that usually do not directly contribute to the end deliverable to the customer.
• It risks perpetuating any holes in the test analysis throughout the project lifecycle.

Test Efforts Usually Involve Some Amount of Both

Scripted tests help ensure we’ve tested that the implementation does what it’s supposed to do under various conditions (including the error conditions the software is intended to handle). Exploratory Testing helps us find surprises, implications of interactions that no one ever considered, and misunderstandings about what the software is supposed to do. The two practices work hand-in-hand.

In practice, all test efforts I have ever seen have involved some degree of Scripted Testing and some degree of Exploratory Testing. The question will not be “Should we do any of that kind of testing?” but rather “How much?”


Exploratory Testing on Agile Projects

Within an Iteration

• Automated Unit Tests: drive design; represent executable specifications
• Automated Acceptance or Story Tests: define “done”; represent executable requirements
• Manual Exploratory Testing: provides additional feedback

Ward Cunningham on Exploratory Testing

“Because an agile development can accept new and unanticipated functionality so fast, it is impossible to reason out the consequences of every decision ahead of time.

“In other words, agile programs are more subject to unintended consequences of choices simply because choices happen so much faster. This is where exploratory testing saves the day. Because the program always runs, it is always ready to be explored.

“[T]here is a tremendous opportunity in front of us if we are just willing to support exploratory testing with the same vigor that we now support automatic testing.”

Posted March 25, 2004 by Ward Cunningham on the agile-testing mail list (see http://groups.yahoo.com/group/agile-testing/message/3881)


Exploratory Testing and Emergent Behavior

[Slide: a story board (To Do / Done) and a burn-down chart. Team members say: “This security story passes my acceptance tests.” “And the unit tests all pass!” “Did anyone notice that a user can make themselves an administrator?”]

A True Story of Emergent Behavior

Elisabeth was working on a project where the requirements indicated that the system would use a Role-Based Access Control (RBAC) security model. The XP team had created numerous stories describing the relationship between users, groups, roles, and view/edit/delete permissions. The unit tests were all green, and the Customer had executed her acceptance tests and was satisfied with the results.

While doing exploratory testing, Elisabeth discovered that a regular user who had permission to view all groups could then proceed to add themselves to the administrator group, giving them access to everything to which they previously hadn’t had access.

This story illustrates how even rigorous unit testing and acceptance testing can miss side-effect behaviors.

Automated unit testing and acceptance testing are powerful. They are necessary. But they are not sufficient. Manual exploration fills in the gaps.
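The escalation in the story can be illustrated in miniature. The sketch below is a hypothetical toy RBAC model, not the actual system from the story; it shows how a membership change guarded only by a “view” permission lets a regular user promote herself:

```python
# Hypothetical toy RBAC model illustrating the escalation from the story.
class Rbac:
    def __init__(self):
        self.groups = {"admins": set(), "users": set()}
        self.permissions = {
            "admins": {"view_groups", "edit_groups"},
            "users": {"view_groups"},
        }

    def can(self, user, permission):
        # A user has a permission if any group they belong to grants it.
        return any(permission in self.permissions[g]
                   for g, members in self.groups.items() if user in members)

    def add_to_group(self, actor, user, group):
        # The bug class: membership changes should require edit_groups,
        # but checking only view_groups lets any viewer modify a group.
        if not self.can(actor, "view_groups"):
            raise PermissionError("not allowed")
        self.groups[group].add(user)

rbac = Rbac()
rbac.groups["users"].add("eve")
# A regular user escalates herself to administrator:
rbac.add_to_group(actor="eve", user="eve", group="admins")
escalated = rbac.can("eve", "edit_groups")  # True -- the side effect
```

An exploratory session is what surfaces a gap like this; once found, the check belongs in the automated suite.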


Exploratory Risks & Remedies

[Slide: testers exclaim “I don’t know what I did, but it crashed!”, “Look! An extra space in the log!”, and “I’ll just tab around the form a lot.” To avoid these problems, use these tools: Rigorous Analysis, Note-taking, Chartering and Stakeholder Discussions, Heuristics, Judgment.]

Risks of Exploratory Testing

• Unrepeatable Bugs: after several hours of testing, you discover a great bug. But then you can’t figure out what you were doing that triggered it.
• Obscure Low Priority Anomalies: you spend all your time chasing down obscure behavior quirks that no one is willing to fix.
• Unfocused Pounding: unsure what might be interesting to test, you end up doing a whole lot of keyboard pounding without a clear focus.

Remedies

• Rigorous Analysis: use various analysis techniques including variables, inputs-outputs-linkage, states, etc. to identify ways in which it might be interesting to exercise the software.
• Heuristics: use test design techniques as guidelines for defining tests.
• Note-taking: take notes on conditions, actions, and observations throughout.
• Judgment: exercise judgment in deciding how far to take a particular line of inquiry and to decide whether or not something is a problem.
• Stakeholder Discussions: use discussion techniques like “Tell Me a Story” and “Would You Want to Know If…?” (see Section 3, Charters) and “Bring Me a Rock” (see Section 9, Testing Feedback) to ensure your testing is aligned with your stakeholders’ information needs.


Key Insights?


2. Charters and Sessions


Charters and Sessions

[Diagram: each charter on the left maps to one or more sessions on the right. Example charters: “Explore the feature in combination with other completed features.” “Explore to see how configuration variables affect capability.” “Explore error handling using data type attacks to find problems related to data validation.”]

Sessions Make ET Estimable and Measurable

Exploratory testing is open-ended, and this is one of the concerns managers often have about it. “But how can we estimate how much time we need?” they ask. “How can we track progress? How will we know when we’re done?”

The answer is to use Sessions to manage your Exploratory Testing effort.

An excellent resource for how to do Session-Based Testing can be found in Jonathan Bach’s article “Session-Based Test Management” available online at http://www.satisfice.com/articles/sbtm.pdf

In Session-Based Testing, each session is guided by a charter. The charter constrains the scope of testing for the session.

Choosing a Session Length

Some people like to vary the length of their sessions. Others prefer to time box their sessions at 1 – 2 hours. (Jonathan Bach uses 90 minutes for his sessions.)

An advantage of time boxing is that it creates structure around Exploratory Testing so you can estimate, predict, and manage the effort. Thus, the exact length of sessions is less important than the idea of choosing a length for your sessions that works consistently within your organization.
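A time-boxed session is simple to represent as a record. A minimal sketch, with field names that are illustrative rather than taken from Bach’s article:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Session:
    """One time-boxed exploratory session, guided by a charter."""
    charter: str
    started: datetime
    timebox: timedelta = timedelta(minutes=90)  # Bach's 90-minute length
    notes: list = field(default_factory=list)   # conditions, actions, observations

    @property
    def ends(self):
        # The session stops when the timebox expires.
        return self.started + self.timebox

s = Session(charter="Explore error handling using data type attacks",
            started=datetime(2007, 1, 15, 9, 0))
s.notes.append("Leading spaces accepted in numeric field")
```

Keeping charter and notes together in one record is what makes sessions countable, and therefore estimable.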


Where Do Charters Come From?

• Stakeholder questions: “What happens if we have more than 5000 records?”
• Functional requirements: “Verify the Coupon feature works for all the different types of coupons.”
• Quality attributes: “Explore the features associated with users, groups, and security to find security holes.”
• “Nightmare Headline Game”: imagine the worst thing that could possibly happen, and consider what might contribute to it happening.

Charters Provide Test Objectives

Because good test analysis will inevitably reveal more tests than we could possibly execute in a lifetime, we have to be choosy about how we spend our time. It’s too easy to fall into a rat hole of attempting to explore every possible sequence and data permutation and variation.

There are a variety of test selection strategies we can employ, such as equivalence analysis and all-pairs. But even before we begin combining or eliminating test cases, we need a charter: we need to know who we’re testing for and what information they need. Exploratory Testing charters define the area we’re testing and the kind of vulnerabilities we’re looking for.

Focusing with charters, then using a variety of analysis techniques to approach the targeted area from multiple perspectives, helps ensure that your Exploratory Testing efforts consistently yield information that your stakeholders will value.


Stakeholders in the Testing Process

Who are your stakeholders?

What questions do they want testing to answer?

What do you do to discover the information that will answer the questions?

How do you provide the information?

Who Are Stakeholders?

A stakeholder is someone with a direct stake in the outcome. In the case of testing, stakeholders are the people who receive information from testing directly.

Many testers report that the end user is their stakeholder. This is only true if the end user receives some form of information directly from testing, whether bug reports, test documentation, or test execution results.

Stakeholders in the test process are far more likely to be project managers, product managers, development managers, programmers, technical support technicians, technical writers, and other testers. In regulated organizations, testing stakeholders may include auditors. However, no two organizations are exactly alike. For that matter, no two projects, even within the same organization, are exactly alike. Your list of stakeholders will vary.


Charters and Stakeholder Questions

What questions do your stakeholders have about the software you’re testing?

Stakeholder Questions Help Define Charters

A conference attendee asked Elisabeth, “How do you ensure Exploratory Testing doesn’t devolve into banging on the keyboard?” His testers were doing Exploratory Testing, but in such a disorganized fashion no one knew what had been tested and what hadn’t. (And even if you’re doing Scripted Testing, where progress is measured against a plan, the plan could be outdated.)

Charters identify sets of information that our stakeholders want us to find. For example:

• Use the CRUD (Create, Read, Update, Delete) heuristic, Zero-One-Many heuristic, Some-None-All heuristic, and data dependencies to find potential problems with creating, viewing, updating, and deleting the different types of entities the system tracks.
• Exercise the Publish feature in various ways to find any instances where a valid publish request does not complete successfully or where the user does not receive any feedback about the actions the Publish feature took on their behalf.
• Use a combination of valid and invalid transactions to explore the responses from the SOAP/XML interface.

A test effort will have numerous charters. Each charter leads to one or more test sessions. Each exploratory session involves executing uncountable tests, or experiments, as we learn about the software and use that new knowledge to design new tests on-the-fly.
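Heuristics like CRUD and Zero-One-Many lend themselves to mechanical enumeration of test ideas for a session. A sketch, with invented entity names:

```python
from itertools import product

operations = ["create", "read", "update", "delete"]  # CRUD heuristic
counts = ["zero", "one", "many"]                     # Zero-One-Many heuristic
entities = ["coupon", "order"]                       # hypothetical entity types

# Cross the heuristics to get a checklist of test ideas for a session.
ideas = [f"{op} {count} {entity}(s)"
         for entity, op, count in product(entities, operations, counts)]
# 2 entities x 4 operations x 3 counts = 24 ideas
```

The list is a starting point, not a script: the tester still chooses which ideas to pursue and invents new ones as each test reveals information.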


Questions Suggest Types of Tests

If you’re most concerned about user experience, we’ll focus on end-to-end scenarios.

Test Types and Stakeholder Questions

Tests provide the answer to stakeholder questions. Examples:

• Happy path (positive) tests answer the question, “Does it work at all?”
• Error condition (negative) tests answer the question, “How does it behave when things go wrong?”
• End-to-end scenario tests answer the question, “What will a real user experience?”
• Performance tests answer the question, “How responsive will it be in the real world?”
• Stress/load tests answer the question, “How well will it work when pushed to its limits?”

Gauge how much of each type of testing to perform according to the questions your stakeholders want answered.

Remember that your stakeholders may discover that they need different information at different points in the project. At first, for example, they might want to know if it works at all, while later they might want to know if there’s anything a user could do that would have bad results (like corrupting data or bringing down a server).


Would You Want to Know If…

Would you want to know if I found any issues with usability or accessibility?

Engaging the Stakeholders

Only the stakeholders can tell you if they would find a given kind of information valuable in driving the project forward.

Internal and external quality attributes are a good way to drive the “Would You Want to Know If…” discussions.


Understand Internal and External Quality Criteria

Internal: anything that makes the software development process more predictable and efficient, such as: Maintainability, Reusability, Portability, Integrity, Consistency, Testability, Modifiability

External: anything that improves the value of the software to the external customer and/or user, such as: Capability, Reliability, Stability, Usability, Scalability, Performance, Interoperability

Exploring Quality Criteria: Fill in the Blank

Trying to figure out what the key quality criteria are? Work with your stakeholders to fill in the blanks in statements like these:

• If nothing else works correctly in this system, it must be able to ________

• Our customers use our software so they can _________

• Our customers will not use our software if _________

• Our software must always ____ and never ____

• We want users to think of our system as especially ______

Functional and Non-Functional Success Criteria

“Capability” refers to the degree to which the software has the promised functionality. It’s the only quality criterion in the lists above that refers to the functionality of the system. The rest of the quality criteria are non-functional: they refer to some difficult-to-measure attribute of the system.

One of the difficult things about testing non-functional attributes, such as “reliability”, is that it is much easier to demonstrate absence than presence. All we have to do to show that a system is unreliable is find one instance in which it loses or corrupts data. To demonstrate that the system is reliable, we have to demonstrate that it does not lose or corrupt data in every possible situation we can think of where data might possibly be at risk.

It’s easy to say, “Our testing will demonstrate the reliability, stability, scalability, interoperability, and performance of the system.” It’s much harder to design tests that actually show that.


Risk Exercise: Nightmare Headlines

“Product Recalled Amid Public Hysteria”

“Consumers Irate Over Double-Billing”

“Credit Card Numbers Published to the Web”

“Killer Product Claims More Victims”

“Investors Bilked of IT Investments”

The Morning Paper

Imagine you open the morning paper to find a front page headline about your product or service. It’s bad news. The worst possible.

What is it?

Short on inspiration? Peruse newspapers and news sites to find headlines about other software. For example, here’s a list of past headlines (company names omitted, but all are real articles):

• “Flawed Routers Flood Internet Time Server”
• “Cracks appear in Bluetooth security”
• “Flawed Slot Machine Software Angers Gamblers”
• “Company lowers earnings report after software glitch: Incorrect configuration of accounting tool cited”
• “Virus Update Freezes PCs”


Risks Lead to Tests

Failure Mode: Transaction Fails

Possible causes:

• Client stops functioning
• Server stops functioning
• Network fails
• Client times out
• Client fails to detect network
• High load prevents transaction from being processed
• Database rejects transaction

Risks, Failure Modes, and Tests

Each disaster you envision is a potential failure mode: something that could go wrong.

For each failure mode, brainstorm a list of ways in which the failure could occur. Sometimes one failure mode will suggest other failure modes (as above, where the original failure mode of a transaction failing could be because of another failure mode in which the client software stops functioning).

Use your list of possible causes for each failure mode as a springboard for additional tests by asking, “How could I possibly cause this to happen?” and “How should the system respond if this does happen?”
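Those two questions can be applied mechanically to the brainstormed causes to draft charters. A sketch using the causes listed on the slide (the charter wording is illustrative):

```python
# Map each failure mode to its brainstormed possible causes.
failure_modes = {
    "transaction fails": [
        "client stops functioning",
        "server stops functioning",
        "network fails",
        "client times out",
        "high load prevents the transaction from being processed",
        "database rejects the transaction",
    ],
}

# One draft charter per cause: provoke the cause, observe the response.
charters = [
    f"Explore what happens when the {cause} "
    f"to discover how the system responds when the {mode}"
    for mode, causes in failure_modes.items()
    for cause in causes
]
```

Each draft still needs review with stakeholders; some causes will merge into one session and others will not be worth chartering at all.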

Root Cause Analysis and FMEA

These techniques around imagining potential risks and failures are inspired by root cause analysis techniques and Failure Mode Effects Analysis (FMEA). For more information, see the ASQ’s website (www.asq.org).


Framing Charters

A template:

Explore area/feature [with resources, conditions, or constraints] to discover information

Examples:

Explore the Calculator feature in the Fly to discover conditions or sequences that generate inaccurate results.

Explore drawing calculators with irregular, small, and too-close buttons and lines to discover how the Fly behaves under these conditions.

Defining Charters

Your charter specifies:

• The information you seek
• Conditions or constraints
• The resources, approaches, and/or heuristics you’ll use to find it

A good charter guides the tester. It clarifies what kind of testing is in scope and what is out of scope. At the same time, the charter is not so specific that it constrains the tester.

“Check that it works” is too broad.

“Check that three of a kind beats a pair” is too narrow.

“Check that winning hands are awarded the pot under various circumstances” constrains the test to verifying the logic related to evaluating hands and awarding wins while leaving the actual test conditions up to the individual tester.
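The template (explore area/feature, optionally with constraints, to discover information) reduces to fill-in-the-blanks. A sketch with an invented helper name:

```python
def frame_charter(area, information, constraints=None):
    """Frame a charter per the template:
    Explore <area> [with <constraints>] to discover <information>."""
    middle = f" with {constraints}" if constraints else ""
    return f"Explore {area}{middle} to discover {information}"

c = frame_charter("the Calculator feature in the Fly",
                  "conditions or sequences that generate inaccurate results")
```

The point of the template is the discipline, not the string: every charter must name an area and the information sought, and may add constraints.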


Estimating and Allocating Sessions

Sessions   Charter
2          Explore the item detail…
10         Explore checkout…
5          Explore coupons…
5          Explore browsing the catalog…
10         Explore the catalog search…

Tips for estimating:

• Plan for a minimum of 1 session per charter. If a charter is not deemed sufficiently important to warrant at least one test session, it should be dropped from the list.

• Split charters requiring more than a week’s worth of sessions.

• Discuss the estimates with the stakeholders to ensure testing time is being

spent on charters with the highest probability of providing useful information

to move the project forward.
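The estimates above are easy to tally and sanity-check in a few lines. A minimal sketch (the charter names are abbreviated from the table above; the script itself is ours, not part of the course):

```python
# Hypothetical charter-to-session estimates, matching the table above.
estimates = {
    "Explore the item detail...": 2,
    "Explore checkout...": 10,
    "Explore coupons...": 5,
    "Explore browsing the catalog...": 5,
    "Explore the catalog search...": 10,
}

# Tip 1: every charter worth keeping gets at least one session.
assert all(sessions >= 1 for sessions in estimates.values())

total = sum(estimates.values())
print(f"Total planned sessions: {total}")  # Total planned sessions: 32
```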


Measuring ET: Counting Sessions

[Burn down chart: Planned Sessions vs. Time.]

The number of remaining sessions goes down when we complete sessions or when we cut scope. Remaining sessions increase when scope increases or when we discover we under-estimated.

The Value of Big Visible Charts

The chart in the image above is an example of a Big Visible Chart. In this case, it’s

an example of a Burn Down Chart, a type of chart used by Scrum teams. Even if

your organization does not use Scrum, or any Agile method, using a Burn Down

Chart to track test sessions can provide the entire team with information about

testing progress.
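Tracking a session burn down takes little more than a running count. This sketch (function names are ours, not from the course) shows the two movements the chart captures: remaining sessions fall with completed work or cut scope, and rise when scope grows or estimates prove low:

```python
remaining = 32  # planned sessions at the start of the effort

def complete_sessions(n):
    """Completing sessions (or cutting scope) reduces the remaining count."""
    global remaining
    remaining -= n

def add_scope(n):
    """New charters or under-estimates increase the remaining count."""
    global remaining
    remaining += n

complete_sessions(5)   # a week of testing
add_scope(3)           # a new charter is added mid-effort
complete_sessions(4)
print(remaining)       # 26
```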


Key Insights?


3. Analyzing Variables

Testing involves varying anything that can be changed. And quite a lot can be changed.


Variables

Definition — Variable: anything whose value can be changed.

Sometimes variables are:

• obvious, like a field on a form

• hidden, like an internal value

• subtle, like the state of a control

Testing involves choosing values for every

variable (not just the obvious ones).

Conscious choices make for better tests.

Examples of Variables

• Network related variables: characteristics of a network can change, such as

load, latency, and collisions.

• Environment related variables: attributes of the environment in which the

application runs can change, such as available memory, CPU load, available

disk space, and other programs running.

• Web-related variables: Web applications can run in different browsers.

Different browsers, and different versions of browsers, have different

capabilities and may interpret web pages differently. Further, each browser

allows users to turn on/off JavaScript, Cookies, and other settings.

• Sequences: the order in which a user might perform operations may change.

Variables and Values

Sometimes when we ask for examples of variables, participants say things like

“boundary conditions” or “-1.”

These are possible values for a variable. Your goal in analyzing variables is to

identify those things that you can change (or cause to be changed by another part of

the system), and that might affect the behavior of the software. Once you’ve

identified variables, then you can think about good values for those variables.


A Real Life Example of a Subtle Variable

In January 2004, NASA lost contact with the Mars Rover Spirit.

They soon discovered that the Spirit had encountered a serious anomaly and was rebooting itself over and over again.

According to an article on the site Spaceflight Now, the Mars Rover problem was that there were too many files on the flash memory.

[Image: the Mars Rover. Image courtesy of JPL/NASA]

More Details About the Mars Rover

As with any serious problem, the simple explanation (too many files) doesn’t tell the

full story.

Every time the Rover created a new file, the DOS table of files grew. Some

operations created numerous small files, and over time the table of files became

huge. Part of the system mirrored the flash memory contents in RAM, and there was

half as much RAM as flash memory. Eventually the DOS table of files swamped

the RAM, causing continuous reboots.

Note the number of variables involved, all interdependent: number of files, size of

the DOS table of files, space on flash memory and available RAM.

When explaining the issue in a talk at the Hot Chips conference in 2004, Robert Denise of JPL quipped, “The Spirit was the willing, but the flash was weak.”

For further reading, see:

•http://www.spaceflightnow.com/mars/mera/040201spirit.html

•http://www.extremetech.com/article2/0,1697,1638764,00.asp

•http://klabs.org/richcontent/MemoryContent/flash/mer_spirit_mishap.htm


Classic Triangle Example*, Updated

Triangle 2.0 Specification:

The user enters the size of the sides, then clicks Draw. Triangle draws the triangle and reports its type:

– Right: it has a 90 degree angle

– Equilateral: all sides are equal

– Isosceles: two sides are equal

– Scalene: no sides are equal

– Invalid: the length of the largest side is greater than or equal to the sum of the two smaller sides

* Glenford Myers posed the triangle puzzle in his 1979 book The Art of

Software Testing. His triangle program ran on punch cards.
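One possible reading of the Triangle 2.0 specification, sketched in Python. Note the assumptions baked in: we check Invalid first, and we let Right take precedence over Scalene for a 3-4-5 triangle — the spec doesn't say which label wins, which is itself a good test question:

```python
def classify(a, b, c):
    """Classify a triangle per one reading of the Triangle 2.0 spec."""
    sides = sorted([a, b, c])
    # Invalid: largest side >= sum of the two smaller sides
    if sides[2] >= sides[0] + sides[1]:
        return "Invalid"
    if a == b == c:
        return "Equilateral"
    # Right-triangle check via the Pythagorean theorem (exact for integers)
    if sides[0] ** 2 + sides[1] ** 2 == sides[2] ** 2:
        return "Right"
    if a == b or b == c or a == c:
        return "Isosceles"
    return "Scalene"

print(classify(3, 4, 5))   # Right
print(classify(2, 2, 3))   # Isosceles
print(classify(1, 2, 3))   # Invalid (3 >= 1 + 2)
```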

Varying Variables

Once you’ve identified variables, you can think about how to vary them. For example:

Interesting Data Values

• Various acceptable values that produce different results

• “Goldilocks tests” (too big, too small, just right) and boundary tests (almost too big, almost too small)

• Text that includes delimiters and special characters: ~!@#$%^&*()_+=-`{}|[]\;’:”

• Data that violates a domain-specific rule, such as an IP address like 999.999.999.999 or an age of -1.

Interesting Actions and Sequences

• Vary the order in which you perform operations

• Do “anytime” actions, like logging off, shutting down, or hitting the refresh or back button in a browser, at random

• For position-dependent actions: perform the action in the beginning, middle, and end

See the Heuristics Cheat Sheet for more ideas on interesting values for variables.
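The Goldilocks and boundary heuristics can be mechanized: given a valid range, emit the classic values just inside and just outside it. A minimal sketch (the helper name is ours, not from the course):

```python
def boundary_values(lo, hi):
    """Classic boundary tests for an inclusive numeric range [lo, hi]:
    almost too small, smallest, just inside the bottom,
    just inside the top, largest, almost too big."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# e.g. a hypothetical age field that accepts 0..120
print(boundary_values(0, 120))  # [-1, 0, 1, 119, 120, 121]
```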


Heuristic: Catalog of Interesting Values

With a catalog of generic tests, you’ll have a laundry list of tests for various types of variables:

– Text/Strings

– Paths

– Numbers

– Dates

– Windows

– Option settings

– Etc.

“Aha! I know how to test that!”

Test Heuristics

“Interesting Data Values” (from the previous slide notes) like “Goldilocks tests” or “common delimiters” are test heuristics: aids that speed testing by encapsulating a series of test cases in an easy-to-remember form.

Pre-packaged “generic” tests serve the same purpose. They’re lists of things to remember to test that are particular to your software or environment.

See the Heuristics Cheat Sheet for some ideas.


Exercise: Variables in FLY™

1. Form teams. Your team will be given a FLY pen for this exercise.

2. Create a calculator with FLY to familiarize yourself with the process.

3. Within your team, brainstorm a list of variables (things you can change, or cause to be changed, that could affect the behavior of the software) associated with drawing calculators with the FLY.


Key Insights?


4. Modeling Behavior

Modeling system behavior

Analyzing states and events


World’s Simplest State Model

[Diagram: two states, OFF and ON. The Start event transitions OFF → ON; the Stop event transitions ON → OFF.]

States describe behavioral modes. For example, when projecting slides for a class,

we put PowerPoint in Slide Show mode. This is a state. To identify states, ask:

• What can I do now that I couldn’t do before?

• What can’t I do now that I could do before?

• Do my actions have different results now?

Events are occurrences of interest to the system. They trigger transitions between

states. For example, to put PowerPoint in Slide Show mode, we can click on the

Slide Show mode icon. To identify events, consider what triggers state transitions,

such as:

• User actions: clicking links or pushing buttons

• Changing conditions: a task completed

• Application interaction: another application issues start and stop requests

• Operating system requests: shutdown

• Time: a session timeout.

The State Model describes the states and events. State models can be expressed in

terms of bubbles and arrows, or in terms of tables.
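The World's Simplest State Model, written down as a transition table, doubles as a simple test oracle. A sketch assuming the OFF/ON model above:

```python
# State model as a table: {current_state: {event: next_state}}
transitions = {
    "OFF": {"Start": "ON"},
    "ON":  {"Stop": "OFF"},
}

def fire(state, event):
    """Return the next state, or None if the event is unexpected here --
    exactly the cases an exploratory tester wants to probe."""
    return transitions.get(state, {}).get(event)

print(fire("OFF", "Start"))  # ON
print(fire("ON", "Start"))   # None: what *should* happen here?
```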


A (Slightly) More Complex State Chart

A Thermostat

[State chart: three states — Heat, Off, Cool. Off → Heat when Temp < Setting; Heat → Off when Temp >= Setting; Off → Cool when Temp > Setting; Cool → Off when Temp <= Setting.]

What Else Has States?

States apply to more than executable code. Objects within the system may have

states. For example:

• User accounts can be enabled, disabled, logged in, logged out, expired,

suspended, etc.

• Transactions could be in process, suspended, pending, completed, etc.

• Records or files might be available, locked, etc.

What’s the Difference between State Charts and Flow Charts?

The two types of models are very similar: both describe the behavior of the system,

and both involve transitions. Both are useful for testers in identifying paths through

the system.

However, flow charts tend to emphasize decisions, like “Age > 18?”, and so can

help testers identify boundaries and logic rules.

By contrast, state models emphasize events or conditions that the system can

respond to. As a result, state models help testers identify external influences and

opportunities to interrupt the system.


An Example Flow Chart: the Thermostat

[Flow chart: starting from Check Temp —

• AC on? If yes and Temp <= Setting, turn the AC off.

• Heat on? If yes and Temp >= Setting, turn the heat off.

• Temp > Setting? If yes, turn the AC on.

• Temp < Setting? If yes, turn the heat on.]

When in Doubt, Change the Representation

In the case of the thermostat, the flow chart appears far more complex than the state

model. Sometimes the state model looks more complex than the flow chart. If

you’re having trouble with one type of diagram, try another. Perhaps the

information you need to represent will fit better into another style of diagram or

perhaps even a table.
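The thermostat's decision logic fits in a few lines of code, which makes its boundaries (and the boundary tests they suggest) explicit. A sketch; the exact branch ordering is our reading of the flow chart, not something the course specifies:

```python
def control(temp, setting, ac_on, heat_on):
    """One pass of the thermostat flow chart: returns new (ac_on, heat_on)."""
    if ac_on and temp <= setting:
        ac_on = False
    if heat_on and temp >= setting:
        heat_on = False
    if temp > setting:
        ac_on = True
    if temp < setting:
        heat_on = True
    return ac_on, heat_on

print(control(75, 70, False, False))  # (True, False): too warm, AC comes on
print(control(70, 70, True, False))   # (False, False): at the setting, AC goes off
```

Note how many tests the `temp == setting` boundary alone suggests once the logic is laid out this way.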


Yet Another Example: PowerPoint Display Modes

[Diagram: Slideshow → Black Screen when the “B” key is pressed; Black Screen → Slideshow when any key is pressed; Slideshow → White Screen when the “W” key is pressed; White Screen → Slideshow when any key is pressed.]

[Table — rows are starting states, columns are ending states, cells are the triggering events:]

            Slideshow   White   Black
Slideshow   N/A         “W”     “B”
White       Any key     N/A     N/A
Black       Any key     N/A     N/A

State Tables Suggest New Tests

State-to-State: label the rows and columns with the states, then fill in the cells of the

table with the events that trigger a transition from the starting state to the ending

state. This style of table emphasizes the possible transitions and prompts us to ask

the question, “how can I get from this state to that state?”

State-to-Event: label the rows with the states and the columns with the events, then

fill in the cells with the resulting state. This style of table emphasizes the effect of

events on the system under various conditions and prompts us to ask the question,

“what happens if I trigger this event now?”
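The PowerPoint table above can also be expressed as a state-to-event lookup in code. Cells answer “what happens if I trigger this event now?”; the missing entries are the gaps worth exploring. A sketch (the dictionary keys are ours):

```python
# rows: current state; columns: event; cell: resulting state
state_event = {
    "Slideshow": {"W": "White", "B": "Black"},
    "White":     {"any key": "Slideshow"},
    "Black":     {"any key": "Slideshow"},
}

# Enumerate every (state, event) pair the model covers -- and notice
# the pairs it doesn't, which are candidates for interruption tests.
for state, events in state_event.items():
    for event, result in events.items():
        print(f"{state} --[{event}]--> {result}")
```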


Interacting State Models

[Diagram: two interacting state machines.

Gathering Local Data: Idle → (2 min passed) → Check Conditions → (Check Passed) → Gather Data → (Gathering Done) → Update DB → (DB Updated) → Idle. Check Conditions → (Check Failed) → Idle.

Downloading Updates: Idle → (Time to connect) → Check Network → (Network Avail) → Get Updates → (Content Downloaded) → Update System → (System Updated) → Idle. Check Network → (No Network) → Idle; Get Updates → (No Updates) → Idle.]

Testing Interactions

In this system, both the Update DB state on the left and the Update System state on

the right required access to the database. However, the Update System state

required an exclusive lock. If it could not acquire the lock it needed, the software

crashed.

This is an example of how the states for two different parts of a system might

interact. Where two state machines interact, try combinations of states.

• Are there any combinations that are illegal?

• Is it possible to get the system into that combination state?


Testing from State Models

When testing from a state model, think about:

1. Transitions

– Do all the events trigger the transitions we expect? Test every path.

– What happens if we repeat loops (submit-cancel-submit-repeat)? Check for memory leaks or data corruption.

2. Interruptions

– What happens if we force an exit from intermediate states?

– What happens if we trigger events when they aren’t expected?

More Ideas for Interruption Tests

Apply every controllable event to every state. Examples:

• Push buttons, launch software, or take other user-controllable actions when

you aren’t supposed to.

• Cause triggering events from other software to occur.

Force exits from every state. Examples:

• Shut down the software as it’s starting up.

• Close windows when they least expect it.

• Force the computer into hibernation (close the lid on a laptop).

• Kill the software from the task manager or process list.


Exercise: Modeling States

1. Form teams.

2. Create a state model of the FLY states involved in creating a calculator.

3. Design and execute tests from your state model. Take notes on anything that surprised you (whether or not you think it’s a bug).

4. Prepare to share your model and tests with the other teams.

Tips for Modeling

• Choose a target or perspective. In this case, you are modeling the Fly’s

calculator software.

• Choose a starting point. If you’re not sure where to start, try blank screen,

not running, idle, or whatever state the target is in most of the time.

• Identify transitions and subsequent states by performing common actions.

• Expand your diagram, exploring further, performing less common actions or

forcing non-user-generated events to occur.

• Use the documentation and other resources to uncover clues about states and

events you have not yet seen.


Key Insights?


5. Modeling the System


The World’s Simplest Deployment Diagram

[Diagram: a Client process and a Server process, joined by a network connection.]

What Good Are These Simple Diagrams?

Both the World’s Simplest State Model and the World’s Simplest Deployment

Diagram may seem so simple, so trivial, that they aren’t useful. Ah, but they are.

The simplistic diagram above can prompt questions, and thus tests, like:

• What if the client process unexpectedly stops?

• What if the server process unexpectedly stops?

• …in mid transaction?

• What if the network is slow? Flaky? Disconnected?


Example: Network Connection Tests

When you see a connection in a diagram, ask: what variables affect connectivity? Try this...

• Disconnect the network

• Slow the connection down (28.8 modem over Internet or VPN) or speed it up.

• Force timeouts

• Force connection refusals

• Close the port on which the server is listening

Network Variables

Notice that the ideas in the list on the slide above are all related to things we can

change about the network:

• Connected or disconnected

• Speed

• Bandwidth

• Responsiveness

• Port availability

• Network configuration

There are a wide variety of ways in which you could change the network

configuration that might affect the behavior of the software. For example:

• Insert a firewall between the client and the server

• Put the two connected elements on different subnets or domains

• Simulate a connection with low bandwidth or high latency
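Some of these network conditions can be forced from code rather than by unplugging cables. For example, “force connection refusals” can be approximated by pointing the client at a local port with no listener. A sketch using Python's standard socket library (the port-probing trick is ours; there is a small race between releasing the port and reconnecting):

```python
import socket

# Find a local port that is almost certainly closed: bind to an
# ephemeral port, note its number, then release it.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

try:
    socket.create_connection(("127.0.0.1", port), timeout=2)
    outcome = "connected"
except OSError:  # ConnectionRefusedError is a subclass of OSError
    outcome = "refused"

print(outcome)  # refused -- now observe how the client under test reacts
```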

Variables Revisited

Throughout this section, we’ve explored numerous things we can change that might

affect the behavior of the software. All these things are variables: hardware

configurations, file states, other processes running, etc.

In choosing values for your variables, remember that if you don’t choose

intentionally, you’ve chosen by default. Intentional choices make for better tests.


Look at the Database

Looking at the database schema can give you a great deal of information about the data dependencies in a system.

Using the Database to Suggest Tests

• Use database tools to examine the relational schema.

• Watch what changes where when you insert, update, and delete data.

• Watch for data that can be updated by multiple functions: there may be

locking problems.

• Watch for multiple storage locations of the same data (such as customer

address): there may be synchronization problems.


Following the Data, Data Integrity, and Dependence

Imagine a system where each invoice is associated with a customer record that has attributes such as contact information and credit limit.

[Diagram: Invoice — has a → Customer]

Invoices have a dependency on Customer data.

Now imagine what happens if the Customer record is deleted. What happens to the Invoice record?

Broken Dependencies

As with other types of follow-the-data tests, it’s not enough to verify that you can

view the Invoice even after the Customer record has been deleted or disabled. Try

to:

• Run reports that include that invoice

• Perform calculations that include the figures from that invoice

• Export the invoice

• Update the invoice

• Void the invoice

Even if it isn’t clear what the system should do in the case of a situation like this, it

is clear what it should not do. The system should not corrupt the data so you cannot

use it (or so you cannot balance your books ever again). The system should not

crash. The system should not produce invalid or misleading results. Sometimes we

may not have clear acceptance criteria for what the system should do, but we know a

lot about what it shouldn’t do.
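The Invoice → Customer dependency can be demonstrated with a toy schema. In this SQLite sketch (table and column names are ours, not from the course), the database refuses to delete a customer who still has invoices — one defensible answer to “what happens to the Invoice record?”:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only if asked
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE invoice (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(id)
)""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO invoice VALUES (100, 1)")

try:
    conn.execute("DELETE FROM customer WHERE id = 1")
    result = "deleted"
except sqlite3.IntegrityError:
    result = "blocked"

print(result)  # blocked: the dependency protected the invoice
```

A system that instead allows the delete (cascading, orphaning, or corrupting the invoice) is exactly what the follow-the-data tests above are designed to expose.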


Processes and Environment Interactions

[Diagram: the Software Under Test runs on a computer with CPU, memory, and disk. It interacts with input/output peripherals (mouse, keyboard, video, printers, other devices) and, through a modem or network card, with other systems over LANs, WANs, and the Internet.]

No Process is an Island

Programs execute in an environment that includes an operating system that governs

access to resources such as peripherals, CPU, memory, and the file system as well as

other programs and libraries loaded. Any of these things can interact.

In fact, some programs are designed to interact. For a system that relies on tightly

integrated processes and inter-process communication, consider these scenarios:

• If Process A attempts to launch Process B, what happens if B fails to launch?

• If A launches B, what happens if A exits before B?

• If A sends data to B, what happens if B isn’t able to receive it?

• If B fails to send data back to A, can A resume operation?

• If A and B share a resource, such as a file, what if A has the file locked when

B attempts to access it?


A Context Diagram

[Diagram: the System Under Test, surrounded by several other interacting systems.]

Common Context Elements

• Most software interacts with common elements like the operating system’s

File System, the Network, one or more Databases.

• Web software interacts with a Browser.

• Other software may rely on external services such as LDAP

Authentication.

Understanding the Deployment Context

Consider:

• What interacts with the system I’m testing?

• What are all the data entry points for the system I’m testing?

• What are all the data exit points for the system I’m testing?


Varying Configurations

“There are millions of configurations. How will we ever test them all?”

“Ah! We won’t. Instead, we’ll vary as much as we can to increase the probability of finding important problems.”

Configuration Variables

There are an almost infinite number of configuration variables. You could vary the

operating system version, input devices, option settings, installed printers, modems

or network cards, available memory, etc. Sometimes configuration choices have a

surprising effect on software. For example, one software package crashed on launch

if there was no default printer installed on the machine. Since we can’t test every

possible configuration, our goal is to increase the probability that we’ll find the

important configuration-related problems.

Determining the right set of configurations to test is difficult.

One test group on an enterprise software system drafted a test plan that involved

testing the software on both Windows XP Pro and XP Home. The product manager

objected that XP Home was not supported so it shouldn’t be tested. Information

about how the software behaved on XP Home was not relevant, so any time the

testers spent testing on XP Home would simply be wasted.

And yet, the officially supported configurations aren’t always the right set to test.

Another test group only tested their internal software on the officially supported

hardware configuration. Their frustrated users (internal to the company) pointed out

that the machines they had were not up to the “official” standard, nor was there

budget available to upgrade the machines. If the IT department wanted to set a

higher standard, they would have to provide machines. The testers tested according

to the requirements, but the requirements did not match reality.

The key is to work with the stakeholders to understand the real configuration

requirements, and test accordingly.


6. Nouns and Verbs


Nouns & Verbs Force Feature Interactions

[Diagram: Feature A interacting with Feature B.]

Features Interact in Unexpected Ways

The features in a software system may have interesting interactions. For example:

• Saving a file in a different format may corrupt certain kinds of objects in the file.

• A series of actions when done at different times may complete in a timely manner, but when performed one right after another may cause a serious performance problem.

• Actions may work in some views but not others.

• Actions may be allowed when they make no logical sense, such as a file save option that’s available when there is no open file to save.

A Feature Interaction Bug Report

Description: Program crashes on viewing file saved in zoom view.

Steps to Reproduce:

1. Create a file

2. Save it in zoom view

3. Re-open the file

4. Scroll to the second page

Result: program crashes.


Nouns & Verbs Describe Feature Interactions

Describe feature interactions in terms of nouns and verbs:

• Nouns: what kinds of objects or data can you create and manage in the software?

• Verbs: what can different types of users do to them?

The “grammar” of software applications is usually complex enough to write complete stories.

[Diagram: example nouns and verbs — Send, Log, File, Lock, User.]

Text Editor Example

Consider a text editor.

• Nouns would include Text, Margin, Option, Alignment, Document, File,

Characters, Window, Menu, Status Bar, Line, Cursor, etc.

• Verbs would include Edit, Open, Close, Save, Search, Replace, Format, Set,

Resize, Select, Drag, Drop, etc.

We can combine these nouns and verbs in numerous permutations: format text, open

files, select options, etc.

Common Nouns and Verbs

Remember the CRUD heuristic: there are four verbs that are common to any system

that stores data: Create, Read, Update, Delete.


Modifiers: Adjectives and Adverbs

Adjectives (Attributes)

• Color

• Visible

• Identical

• Verbose

Adverbs (Action Descriptors)

• Quickly

• Slowly

• Repeatedly

• Precisely

• Randomly

Running short of ideas? Grab a dictionary or thesaurus and thumb through it for random adjectives and adverbs to apply.

Adjectives Modify Nouns

It’s not just text in our text editor, it’s big text, verbose text, Unicode text, invisible text.

Adjectives can help us think of more interesting test conditions by prompting us to

create particular types of objects in our system.

Adverbs Modify Actions

Where adjectives prompt interesting conditions, adverbs prompt interesting

sequences.

Consider a graphics editor Elisabeth used in 1988. Elisabeth created one figure in

the editor, and wanted multiple copies. So she copied and pasted the figure.

Repeatedly. After a while, Elisabeth noticed that the figure was growing each time

she pasted it. The software had a rounding error, and the error was compounding

with every subsequent paste. Eventually the newly pasted figures were several

times the size of the original.

Or consider a What-You-See-Is-What-You-Get (WYSIWYG) web page editor.

Users wanted their HTML to look exactly the same in the browser as it did in the

editor. It was important to evaluate output precisely.


The Nouns & Verbs Process

1. Brainstorm nouns & verbs

2. Combine into random noun/verb pairs

3. Add adjectives and adverbs to make the combinations more interesting

4. Use your creative powers to translate each pair into an action you could perform with the system
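The first three steps of the process can be sketched as a quick idea generator. The word pools below are borrowed from the text editor example; in a real session you would brainstorm them for your own system:

```python
import random

# Small pools drawn from the text-editor example in this chapter.
NOUNS = ["file", "window", "menu", "status bar", "line", "cursor"]
VERBS = ["edit", "open", "close", "save", "search", "replace", "drag", "resize"]
ADJECTIVES = ["big", "invisible", "empty", "verbose"]
ADVERBS = ["quickly", "slowly", "repeatedly", "precisely"]

def random_charter_ideas(n, seed=None):
    """Steps 1-3: pair random verbs with random nouns, then decorate
    each pair with an adjective and an adverb."""
    rng = random.Random(seed)
    ideas = []
    for _ in range(n):
        verb, noun = rng.choice(VERBS), rng.choice(NOUNS)
        adj, adv = rng.choice(ADJECTIVES), rng.choice(ADVERBS)
        ideas.append(f"{adv} {verb} a {adj} {noun}")
    return ideas

for idea in random_charter_ideas(5, seed=1):
    print(idea)
```

Step 4, translating each generated phrase into an action you can actually perform, still takes human imagination; the generator only supplies the raw prompts.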

Text Editor Example

Random pairs from step 2: [Create File] [Open Alignment] [Search File] [Drop

Status] [Replace Overstrike] [Drag Window] [Resize Window]

Oh, dear. “Create File” is clear enough. But what about “Drop Status” or “Replace

Overstrike”?

In step 4, we stretch our imaginations to come up with a scenario. How about:

1. Create an empty file.

2. Choose Format Settings from the menu to open the alignment dialog. [“Open

Alignment”]

3. Quickly close the alignment dialog.

4. Search for a pattern of characters within the empty file. [“Search File”]

5. Turn off the status bar from the Options menu. [“Drop Status”]

6. Turn the status bar back on, and also turn on overstrike. [“Replace

Overstrike”]

7. Move the window, then resize it.


Exercise: Creating Stories

1. Form a team.

2. Brainstorm a list of nouns and verbs for FLY.

3. Randomly choose from the list to construct 5 or more sentences that will become one or more complex scenarios.

4. Edit the random scenario to create a scenario you can imagine actually executing. Add adjectives and adverbs to further describe the scenario.

5. Execute the scenario on FLY, taking notes on anything that surprised you (whether or not you think it’s a bug).

Surprises and Information

“Fish live in water.” Did that surprise you? Probably not. So the idea that fish live

in water isn’t new to you. It isn’t interesting information.

But if I said “I have a fish that lives in a hamster cage,” you probably would be

surprised. Now that would be interesting information.

When you’re testing, look for surprises. If it surprises you, it’s probably interesting

information for your stakeholders even if it isn’t a bug. (Sometimes the best

surprises are the discovery that something actually works when you didn’t think it

would!)


Key Insights?


7. User Perspectives

Scenarios

Extreme Personalities

Soap Operas


Walk in the Users’ Shoes

Pretend to be a user:

• Attempt to accomplish a typical set of tasks

• Imagine mistakes a user new to the system might make

• Try to solve a problem using the system

Sometimes we see bugs we’d missed before when we use

the software to accomplish real work.

A Day in the Life

One way to discover new use cases is to imagine yourself as a typical user just

trying to get something done. For example, imagine that you’re a system

administrator concerned about security, so you’re trying to lock out all unauthorized

users. You might give everyone the bare minimum permissions required to do their

job. Then put yourself in the shoes of one of those users with minimal security and

try to complete basic tasks.

Or put yourself in the shoes of a mid-level manager trying to get reports from home

in the middle of the night. You’ve got AOL dial-up. How long will you have to

wait for those reports?


The Power of Use Cases

“This will be great! This added flexibility will increase our sales!”

It sounded like a good idea at the time…

“This will be easy! We’ll just add a few more configuration parameters!”

…Until we explored the use cases:

“How will administrators manage configs?”

“What about maintenance?”

“Who will use this flexibility and how?”

Uh-oh.

Happy Paths and Alternate Paths

Use cases define paths through the system that a user will take in attempting to

accomplish a goal. Use cases include both basic paths (happy paths) and alternate

paths (possible error conditions).

Testing Use Cases

There are two key reasons to test using use cases:

• Executing the tests reveals information about the user experience.

• A path through the system usually involves multiple features, and as we saw

in the Nouns and Verbs exercise, sometimes features interact in unexpected

ways. Use cases can help reveal interactions you wouldn’t see if you tested

features independently.

An Example

A web based system allowed users to select items from a list by clicking a Select

button, and also allowed users to edit items by clicking on the name of the item. But

until there were more items in the list than would fit on a single page, no one noticed

that clicking the Select button caused the list to redisplay from the beginning. Users

had to make several mouse clicks to both select and edit an item. This is the sort of

surprise that use case based testing is likely to reveal.


Extreme Personalities

Imagine different extreme personalities you might encounter.

Then define scenarios based on those personalities.

“Now how can I tweak this?”

“I guess that’s why they call me klutzy.”

“Neurotic? I’m not neurotic. I’m intense!”

“Me like computer!”

“Find that file or else!!!”

Personae

What would happen if Bart Simpson used your software? He’s a trickster. He likes

to bend the rules. He pushes boundaries to see what reaction he gets. He might

hack URLs in a browser, changing parameters to see what havoc he can wreak.

How about Homer Simpson? He’s a klutz. He means well, but he’s more interested

in donuts than reading manuals. He’s fumble-fingered and slow. He’s just as likely

to use the CD holder for his coffee cup as for a CD.

Extreme personalities go far beyond what any reasonable person would do. They’re

more clumsy, stubborn, neurotic, malicious, or overly clever than any normal user.

That’s why they inspire great out-of-the-box tests that normal people wouldn’t think

of. When testing, try channeling your inner Bart, Neanderthal, or Hacker.


Soap Operas

“Let’s create an invoice with the maximum number of items…”

“Where some will qualify for discounts and some won’t…”

“Yeah! And we’ll start processing it, then void it, then reinstate it, then…”

Soap Operas are Extreme Scenarios

In his article “Soap Opera Testing”, published in Better Software in February 2004,

Hans Buwalda wrote:

As many readers know, soap operas are dramatic daytime television shows

that were originally sponsored by soap vendors. They depict life in a way

that viewers can relate to, but the situations portrayed are typically

condensed and exaggerated. In one episode, more things happen to the

characters than most of us will experience in a lifetime. Opinions may differ

about whether soap operas are fun to watch, but it must be great fun to write

them. Soap opera testing is similar to a soap opera in that tests are based on

real life, exaggerated, and condensed.

Notice that coming up with good soap operas requires identifying variables. It

always comes back to variables: the things—inputs, outputs, data, and

conditions—that we can change or cause to be changed as we exercise the software.


Exercise: Testing with Personas

1. Form a team.

2. Identify two or three personas: descriptions of people who might use the FLY.

3. Write a short description of each persona.

4. Taking each persona in turn, imagine what that person would do with the FLY.

5. Use the FLY as you imagine your persona would, taking notes on anything that surprised you (whether or not you think it’s a bug).

FLY Personas

Having trouble imagining a persona that might have an interesting time using the

FLY?

What about…

• Bart Simpson, Lisa Simpson, or Homer Simpson (D’OH!)

• Albert Einstein

• Pablo Picasso

• Someone with dyslexia

• Someone in a hurry (or who drank too much coffee)


Key Insights?


8. Chartering and Sessions Revisited

Exploratory Testing in the Context of Agile


Agile Tests – Automated and Manual

Within an Iteration

• Automated Unit Tests: drive design; represent executable specifications

• Automated Acceptance or Story Tests: define “done”; represent executable requirements

• Manual Exploratory Testing: provides additional feedback

Unit Testing

• Done by developers, usually with an xUnit framework, often as a result of

practicing Test Driven Development (TDD)

• Supports the development process

• Unit test suites represent executable specifications

Automated Acceptance Testing

• The result of a collaboration between the developers, testers, and business

stakeholders

• Often implemented in a FIT-like framework or Domain Specific Language

• Acceptance test suites represent executable requirements

Exploratory Testing

• Provides additional feedback and covers gaps in automation

• Necessary to augment the automated tests


Exploratory Testing Provides Feedback

About the software…

…and about the existing automated tests.


Where Do Charters Come From?

• Stakeholder questions

• Functional requirements

• Quality attributes

• “Nightmare Headline Game”

• …and questions about automated test coverage.


Wrap Up


Rigorous Exploratory Testing Blog entry

Published on http://www.testobsessed.com

April 19, 2006

by Elisabeth Hendrickson

Exploratory Testing is a style of testing in which you explore the software while

simultaneously designing and executing tests, using feedback from the last test to inform

the next. Exploratory Testing helps us find surprises, implications of interactions that no

one ever considered, and misunderstandings about what the software is supposed to do.

Cem Kaner first coined the term “Exploratory Testing” a couple of decades ago, though

exploratory or “ad hoc” testing has been around longer than that.

Recently, I was talking with a group of XP developers about using Exploratory Testing

on XP projects to augment the TDD unit tests and automated acceptance tests.

“Oh, Exploratory Testing,” said one of the developers, “that’s where the tester does a

bunch of wacky, random stuff, right?”

“Not exactly,” I replied, a little dismayed that myths about Exploratory Testing still

abound after so many years. What looked wild to that developer was actually the result of

careful analysis. He held a common misconception about Exploratory Testing: he noted

the lack of formality and apparently arbitrary sequences and actions, and he concluded

that Exploratory Testing was an exercise in keyboard pounding rather than a rigorous

approach.

Two key things distinguish good Exploratory Testing as a disciplined form of testing:

1. Using a wide variety of analysis/testing techniques to target vulnerabilities from

multiple perspectives.

2. Using charters to focus effort on those vulnerabilities that are of most interest to

stakeholders.

Variety and Perspectives

The old saying goes, “If all you have is a hammer, everything looks like a nail.” If the

only testing technique a tester knows is how to stuff long strings into fields in search of

buffer overflow errors, that’s the only kind of vulnerability that tester is likely to find.

Good test analysis requires looking at the software from multiple perspectives. Field

attacks like entering long strings or badly formatted dates, or entering data that’s the

wrong type altogether (strings where a number should be) are one approach. Other

approaches include:

• Varying sequences of actions


• Varying timing

• Using a deployment diagram to find opportunities to test error handling by

making required resources unavailable or locked, or to break connections

• Deriving transition and interrupt tests from state models

• Using use cases or analyzing the user perspective to identify real-world scenarios

• Inventing personae or soap operas to generate extreme scenarios

• Using cause-effect diagrams to test business rules or logic

• Using entity-relationship diagrams to test around data dependencies

• Varying how data gets into and leaves the software under test using a data flow

diagram as a guide

Each of these types of testing reveals different kinds of vulnerabilities. Some check for

problems related to error handling while others look at potential problems under normal

use. Some find timing problems or race conditions, others identify logic problems. Using

a combination of analysis techniques increases the probability that, if there’s a problem,

the testing will find it.

Charters and Focus

Because good test analysis will inevitably reveal more tests than we could possibly

execute in a lifetime, much less by the ship date, we have to be choosy about how we

spend our time. It’s too easy to fall into a rat hole of potentially interesting sequence and

data permutations and variations.
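The explosion is easy to quantify. The variables and value counts below are invented for illustration, but even four modest variables produce more full combinations than most teams would run:

```python
from itertools import product

# Hypothetical test variables; names and values are invented for illustration.
variables = {
    "browser":   ["IE", "Firefox", "Safari"],
    "os":        ["Windows", "Mac", "Linux"],
    "locale":    ["en", "fr", "ja", "de"],
    "user_role": ["admin", "editor", "viewer"],
}

# Exhaustive testing means the full cartesian product of all values.
exhaustive = list(product(*variables.values()))
print(len(exhaustive))  # 3 * 3 * 4 * 3 = 108 combinations for just four variables
```

An all-pairs selection would cover every pairwise combination of these values in roughly a dozen tests instead of 108; tools such as Microsoft's PICT automate that reduction.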

There are a variety of test selection strategies we can employ, such as equivalence

analysis and all-pairs. But even before we begin combining or eliminating test cases, we

need a charter: we need to know who we’re testing for and what information they need.

Exploratory Testing charters define the area we’re testing and the kind of vulnerabilities

we’re looking for. I’ve used charters like these in past Exploratory Testing sessions:

• “Use the CRUD (Create, Read, Update, Delete) heuristic, Zero-One-Many

heuristic, Some-None-All heuristic, and data dependencies to find potential

problems with creating, viewing, updating, and deleting the different types of

entities the system tracks.”

• “Exercise the Publish feature in various ways to find any instances where a valid

publish request does not complete successfully or where the user does not receive

any feedback about the actions the Publish feature took on their behalf.”

• “Use a combination of valid and invalid transactions to explore the responses

from the SOAP/XML interface.”

Notice that each charter is general enough to cover numerous different types of tests, yet

specific in that it constrains my exploration to a particular interface, feature, or type of

action.


Variety and Focus Yield Consistently Useful Information

Exploratory Testing is particularly good at revealing vulnerabilities that no one thought to

look for before. Because you use the feedback from each experiment to inform the next,

you have the opportunity to pick up on subtle cues and allow your intuition to guide you

in your search for bugs.

But because Exploratory Testing involves designing tests on the fly, there’s a risk of

falling into a rut of executing just one or two types of tests (the hammer/nail problem) or

of discovering information that’s far afield from what your stakeholders need to know.

Focusing with charters, then using a variety of analysis techniques to approach the

targeted area from multiple perspectives, helps ensure that your Exploratory Testing

efforts consistently yield information that your stakeholders will value.

References

• Bach, James. “What Is Exploratory Testing?”

http://www.satisfice.com/articles/what_is_et.htm

• Bach, Jonathan. “Session-Based Test Management”

http://www.satisfice.com/articles/sbtm.pdf

• Kohl, Jonathan. “Exploratory Testing on Agile Teams”

http://www.informit.com/articles/article.asp?p=405514&rl=1

• Kohl, Jonathan, “User Profiles and Exploratory Testing”

http://www.kohl.ca/blog/archives/000104.html

• Marick, Brian. “A Survey of Exploratory Testing”

http://www.testingcraft.com/exploratory.html

• Rice, Randy. “Why I’m More Positive About Exploratory Testing”

http://www.riceconsulting.com/articles/exploratory_testing.htm

• Tinkham, Andy and Kaner, Cem. “Exploring Exploratory Testing”

http://www.testingeducation.org/a/explore.pdf


It’s All about the Variables Blog entry

Published on http://www.testobsessed.com

April 7, 2006

by Elisabeth Hendrickson

Let’s talk about bugs for a moment. Bad bugs. The kind of bugs that make headlines.

Bugs like these:

• From 1985 to 1987, Therac-25 radiation therapy machines overdosed patients with radiation, killing several of them.

• In 1996, the Ariane 5 rocket exploded spectacularly during its first flight.

• In 2004, the NASA Mars rover “Spirit” was inoperable for several days as it

rebooted itself over and over.

• In 2003, a bug in GE energy management software contributed to the

devastating blackout that cut off electricity to 50 million people.

So why do I want to talk about these bugs? Because they provide fascinating examples of

how variables—things we can change while testing—are sometimes subtle and tricky.

Variables can be difficult to identify, and even more difficult to control. And yet, if we

want to design interesting tests that will give us the information we need about

vulnerabilities in our software and systems, we need to identify those subtle variables and

the interesting ways in which we can tweak them.

About “Variables” in Testing

But first, let’s take a step back and talk about what I mean by “variable.”

If you’re a programmer, a variable is a named location in memory. You declare variables

with statements like “int foo;” However, as a tester, I mean “variable” in the more

garden-variety English sense of the word. According to www.m-w.com, a variable is

something that changes. And as a system tester, I’m always alert for things I can change

through external interfaces (like the UI or the file system) while executing the software.

Sometimes variables are obviously changeable things like the value in a field on a form.

Sometimes they’re obvious, but not intended to be changed directly, like the key/value

pairs in a URL string for a web-based application. Sometimes they’re subtle things that

can only be controlled indirectly, like the number of users logged in at any given time or

the number of results returned by a search. And as the bugs listed above demonstrate, the

subtle variables are the ones we often miss when analyzing the software to design tests.

Horror Stories Provide Clues to Subtle Variables

So let’s consider the variables involved in these disastrous bugs.


In the case of the Therac-25 incidents, there were numerous contributing causes involved

in the deaths of the patients, including both software bugs and hardware safety

deficiencies. This is not a simple case of one oversight but rather a cavalcade of factors.

But there were some factors that were entirely controlled by the software. Nancy Leveson

explains in Safeware that in at least one of the incidents the malfunction could be traced

back to the technician’s entering then editing the treatment data in under 8 seconds, the

time it took the magnetic locks to engage. So here are two key subtle variables: speed of

input and user actions. Further in Leveson’s report is an explanation of how every 256th

time the setup routine ran, it bypassed an important safety check. This provides yet

another subtle variable: the number of times the setup routine ran.
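A minimal sketch shows how a "run count" becomes a variable at all. The details are hypothetical (the real flaw involved a one-byte flag that was incremented rather than set, so it wrapped to zero, the value meaning "check passed"), but the wraparound arithmetic is the point:

```python
def run_setup(passes):
    """Hypothetical model: an 8-bit flag is incremented on every setup pass.
    Each time it wraps to zero, the safety check is silently skipped."""
    flag = 0
    skipped = []
    for n in range(1, passes + 1):
        flag = (flag + 1) % 256   # one-byte increment, wraps every 256th pass
        if flag == 0:
            skipped.append(n)     # zero means "no check needed" -- bypassed
    return skipped

print(run_setup(1000))  # [256, 512, 768]
```

No single test run exposes this; only a tester who treats "number of times the routine has run" as a variable, and drives it past 255, ever reaches the failing state.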

The Ariane 5 rocket provides an example of code re-use gone awry. In investigating the

incident, the review board concluded that the root cause of the explosion was the

conversion of a 64-bit floating-point number to a 16-bit

signed integer value (maximum value 32,767). That conversion caused an overflow error,

and compounding the problem, the system interpreted the resulting error codes as data

and attempted to act on the information, causing the rocket to veer off course. The rocket

self-destructed as designed when it detected the navigation failure. The conversion

problem stemmed from differences between the Ariane 5 rocket and its predecessor, the

Ariane 4 rocket, for which the control software was originally developed. It turns out that

the Ariane 5 rocket was significantly faster than the Ariane 4 rocket, and the Ariane 5

software simply could not handle the horizontal velocity its sensors were registering. The

variables involved here are both velocity and the presence of an error condition.
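The range problem itself is easy to demonstrate. In the actual flight the Ada conversion raised an unhandled exception whose diagnostic bit pattern was then treated as flight data; this sketch (with invented velocity values) shows only what happens when a value exceeding the 16-bit range is forced into 16 bits:

```python
import ctypes

def to_int16_unchecked(x):
    """Truncate a float to an integer and wrap it into 16 signed bits,
    as an unchecked narrowing conversion effectively does."""
    return ctypes.c_int16(int(x)).value

ariane4_velocity = 30_000.0   # fits: in range for a 16-bit signed integer
ariane5_velocity = 40_000.0   # the faster rocket's value is out of range

print(to_int16_unchecked(ariane4_velocity))  # 30000, as intended
print(to_int16_unchecked(ariane5_velocity))  # -25536: wrapped into nonsense
```

The test lesson: the software was correct for every input the Ariane 4 could produce. Only varying the velocity beyond the old envelope, a variable no one re-examined for the new rocket, reaches the failure.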

An article in Spaceflight Now explains that the Mars rover “Spirit” rebooted over and

over again because of the number of files in flash memory. Every time the rover created a

new file, the DOS table of files grew. Some operations created numerous small files, and

over time the table of files became huge. Part of the system mirrored the flash memory

contents in RAM, and there was half as much RAM as flash memory. Eventually the

DOS table of files swamped the RAM, causing the continuous reboots. Note the number

of variables involved, all interdependent: number of files, size of the DOS table of files,

space on flash memory and available RAM.

Finally, the GE energy management software provides a cautionary tale about the

problem of silent failures. As in other cases, there are numerous contributing factors in

the massive-scale blackout. Everything from lack of situational awareness to lack of

operator training to inadequate tree-trimming is named in the final report submitted by a

US-Canadian task force. However, there are tantalizing hints in that final report that

software problems contributed to the operators’ blindness to the problems with the power

grid. According to the report, FirstEnergy, the company responsible for monitoring the

power grid, had reported problems with the GE XA/21 software’s alarm system in the

past. In his report published on SecurityFocus, Kevin Poulsen quotes GE Manager Mike

Unum as pinning the blame for the software failure on a race condition that caused two

processes to have write access to a data structure simultaneously. Event timing and

concurrent processes turned out to be critical variables, and ones that took weeks to track

down.
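The GE XA/21 internals are not public, but the failure class the text describes, two writers updating shared state without synchronization, can be demonstrated in a few lines. This sketch splits a read-modify-write across statements so the threads can interleave:

```python
import threading

counter = 0  # shared structure both threads write to

def unsafe_increment(times):
    global counter
    for _ in range(times):
        tmp = counter   # read
        tmp += 1        # modify
        counter = tmp   # write -- another thread may have written in between

threads = [threading.Thread(target=unsafe_increment, args=(100_000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # often less than the expected 200000: lost updates
```

Whether any updates are actually lost depends on how the interpreter happens to schedule the threads, which is exactly why event timing and concurrency are such subtle, hard-to-control test variables.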


Looking for Variables

We may not be testing software that can heal or kill people, that blasts into space, or that

manages a nation’s energy supply, but we can apply these lessons to our projects. I don’t

test such mission critical systems, but the software I do work on still has variables around

timing, speed, user actions, number of times a given routine or method executes, files,

memory, and concurrent processes.

The final lesson in all these cases is that testing involves looking at all the variables, not

just the obvious ones. So the next time you’re thinking up test cases, consider this

question:

What variables can I change in the software under test, its

data, or the environment in which it operates, either directly or

indirectly, that might affect the system behavior? And what

might be interesting ways to change them?

It’s not a simple question to answer. But just thinking about it is likely to improve your

testing.

References

Therac-25:

• Wikipedia. “Therac-25”, http://en.wikipedia.org/wiki/Therac-25

• Leveson, Nancy. Safeware. See the appendix, “Medical Devices: Therac-25”,

http://sunnyday.mit.edu/papers/therac.pdf

Ariane 5:

• Gleick, James. “A Bug and a Crash: Sometimes a Bug Is More Than a Nuisance”,

http://www.around.com/ariane.html

• Society for Industrial and Applied Mathematics. “Inquiry Board Traces Ariane 5

Failure to Overflow Error”, http://www.siam.org/siamnews/general/ariane.htm

• Wikipedia. “Ariane 5 Flight 501”,

http://en.wikipedia.org/wiki/Ariane_5_Flight_501

Mars Rover “Spirit”:

• Spaceflight Now. “Thousands of files deleted on Spirit to fix computer trouble”,

http://www.spaceflightnow.com/mars/mera/040201spirit.html

• Hachman, Mark. “NASA: DOS Glitch Nearly Killed Mars Rover”,

http://www.extremetech.com/article2/0,1697,1638764,00.asp


• NASA Office of Logic Design. “MER Spirit Flash Memory Anomaly (2004)”,

http://klabs.org/richcontent/MemoryContent/flash/mer_spirit_mishap.htm

The East Coast Blackout of 2003:

• Poulsen, Kevin. “Software Bug Contributed to Blackout”,

http://www.securityfocus.com/news/8016

• Poulsen, Kevin. “Tracking the blackout bug”,

http://www.securityfocus.com/news/8412

• Pettichord, Bret. “How Did the Blackout Happen?”,

http://www.io.com/%7Ewazmo/blog/archives/2004_04.html

• US-Canada Power System Outage Task Force. “Final Report on the August 14,

2003 Blackout”, https://reports.energy.gov/BlackoutFinal-Web.pdf


Nouns and Verbs

Nouns

Account

Administrator

Address

Agent

Application

Authorization

Backup

Beginning

Buffer

Button

Category

CD

Change

Character

Client

Clock

Color

Column

Command

Comment

Configuration

Connection

Cursor

Data

Database

Date

Default

Desktop

Dialog

Disk

Display

Document

Documentation

Element

End

Entry

Error

Event

Executable

Failure

Fault

Field

File

Floppy

Font

Form

Format

Graph

Graphic

Group

Heading

Help

Hierarchy

History

ID

Image

Index

Information

Input

Integration

Interruption

Item

Job

Keyboard

Layout

Limit

Line

Link

List

Log

Maximum

Memory

Menu

Message

Minimum

Name

Network

Note

Number

Object

Operating System

Operation

Option

Output

Page

Password

Patch

Payment

Permission

Picture

Port

Preference

Printer

Process

Property

Property Panel

Protocol

Queue

Record

Report

Request

Row

Screen

Scroll Bar

Selection

Sequence

Server

Service

Setting

Speed

Stack

Storage Media

Stream

Symbol

Table

Tape

Task

Template

Text

Thing

Time

Timer

Transaction

Transition

Tree

User

Value

Window


Verbs

Acknowledge

Activate

Add

Abort

Apply

Arrange

Assign

Backup

Beep

Begin

Call

Cancel

Change

Clear

Click

Close

Commit

Communicate

Compare

Configure

Confirm

Connect

Copy

Corrupt

Crash

Create

Cut

Delete

De-select

Disable

Disconnect

Display

Dock

Double-click

Drag

Draw

Drop

Edit

Enable

End

Enter

Exit

Export

Find

Finish

Fix

Flush

Format

Gather

Hide

Highlight

Identify

Ignore

Import

Insert

Install

Intercept

Interrupt

Key

Kill

Lock

Log

Logon

Logout

Lookup

Manipulate

Map

Mark

Maximize

Merge

Minimize

Modify

Monitor

Open

Organize

Overflow

Paste

Print

Process

Purge

Push

Query

Queue

Quit

Read

Reboot

Receive

Redo

Redraw

Release

Remove

Rename

Repair

Repeat

Request

Reset

Resize

Restart

Restore

Resume

Retry

Revert

Rollback

Save

Save As

Schedule

Scroll

Search

Select

Send

Set

Share

Show

Sort

Start

Stop

Store

Synchronize

Tap

Time

Toggle

Transfer

Undo

Undock

Unlock

Update

View

Write


Adjectives

Abnormal, Above, Abstract, Absurd, Additional, Adequate, Adjacent, Advanced, Alien, Alternative, Ample, Appropriate,
Arbitrary, Attempted, Bad, Bare, Basic, Below, Big, Bizarre, Blank, Bottom, Brief, Busy,
Close, Closed, Coherent, Compatible, Complex, Concrete, Consistent, Constant, Controlled, Correct, Damaging, Dependent,
Distant, Double, Enhanced, Entire, Extensive, Extra, False, Fat, Final, Finished, Flawed, Formal,
Future, Hidden, Horizontal, Imaginative, Implicit, Improper, Incoherent, Incompatible, Inconsistent, Incorrect, Independent, Infinite,
Informal, Inside, Insufficient, Intact, Invisible, Legitimate, Minimal, Missing, Obscure, Opaque, Open, Outside,
Partial, Pedestrian, Perfect, Portable, Precise, Prolonged, Renewed, Restricted, Right, Rising, Shared, Silent,
Silly, Simple, Small, Sole, Solid, Specific, Sticky, Subtle, Sudden, Sufficient, Superior, Surplus,
Temporary, Thick, Thin, Tiny, Top, Transparent, Tremendous, Triple, Trivial, True, Unfinished, Unique,
Vertical, Visible, Volatile, Widespread, Wrong


Adverbs

Aggressively, Angrily, Approximately, Awkwardly, Back, Backwards, Beautifully, Blindly, Carefully, Carelessly, Cautiously, Clumsily,
Completely, Concisely, Deliberately, Desperately, Early, Enthusiastically, Exactly, Extensively, Extremely, Fairly, Fiercely, Forwards,
Fully, Happily, Hastily, Hopefully, Illegally, Immediately, Impatiently, Increasingly, Laconically, Late, Lazily, Legally,
Locally, Loosely, Loquaciously, Neatly, Now, Often, Once, Only, Partially, Partly, Perfectly, Pointedly,
Poorly, Precisely, Punctually, Quickly, Rarely, Recently, Recklessly, Regretfully, Remotely, Repeatedly, Safely, Sharply,
Skillfully, Sleepily, Sloppily, Slowly, Smoothly, Softly, Sparingly, Strangely, Strongly, Swiftly, Unexpectedly, Unfairly,
Verbosely, Visibly, Well, Wildly


Acknowledgements

The Los Altos Workshop on Software Testing

In February 1997, Cem Kaner and Brian Lawrence hosted the first meeting of the Los

Altos Workshop on Software Testing (LAWST, pronounced “lost”). At the meeting,

about a dozen seasoned test professionals shared their trials and tribulations with test

automation. Cem and Brian designed the format of the meeting to encourage in-depth discussion. The result was a resounding success.

Since that first meeting, there have been 17 LAWST meetings and numerous offshoots

including STMR (pronounced “steamer”), WOPR (pronounced “whopper”), and AWTA.

I’ve been involved with the group since LAWST 2 in July 1997. These days I’m one of

the co-hosts of LAWST.

A typical LAWST-like meeting brings together 10-20 senior-level contributors to

discuss a focused topic for 2-3 days with the help of a professional facilitator. If you look

carefully at conference proceedings and acknowledgements in articles you’re likely to

notice that numerous papers and articles have come out of these workshops in the last

several years.

This course is no exception. Many of the ideas discussed in these materials came out of

or were refined during LAWST meetings.

I am indebted to the entire LAWST community for their wisdom and insight. These

course notes were particularly influenced by the following LAWST meetings.

At LAWST 3 we discussed appropriate levels of test documentation and definitions for

"test case," "test coverage," etc. The participants of LAWST 3 were: Chris Agruss,

James Bach, Karla Fisher, David Gelperin, Kenneth Groder, Elisabeth Hendrickson,

Doug Hoffman, III (recorder), Bob Johnson, Cem Kaner (host), Brian Lawrence

(facilitator), Brian Marick, Thanga Meenakshi, Noel Nyman, Jeffery E. Payne, Bret

Pettichord, Johanna Rothman, Jane Stepak, Melora Svoboda, Jeremy White, and Rodney

Wilson.

At LAWST 7 we discussed Exploratory Testing. The participants of LAWST 7 were:

Cem Kaner (host), Brian Lawrence (facilitator), III (recorder), Jack Falk, Drew Pritsker,

Jim Bampos, Bob Johnson, Doug Hoffman, Chris Agruss, Dave Gelperin, Melora

Svoboda, Jeff Payne, James Tierney, Hung Nguyen, Harry Robinson, Elisabeth

Hendrickson, Noel Nyman, Bret Pettichord, Rodney Wilson.

At LAWST 8 we discussed Measuring Testing. The participants of LAWST 8 were:

Chris Agruss, James Bach, Jaya Carl, Rocky Grober, Payson Hall, Elisabeth

Hendrickson, Doug Hoffman, Bob Johnson, Mark Johnson, Cem Kaner, Brian Lawrence,

Brian Marick, Hung Quoc Nguyen, Bret Pettichord, Melora Svoboda, and Scott Vernon.

At LAWST 9 we discussed how you can learn just enough about the internals of the

system to allow you to do better testing. The participants of LAWST 9 were: Payson

Hall, Richard Bender, Jaya Carl, Jack Falk, Ibrahim El-Far, James Whittaker, Bret


Pettichord, Brian Marick, Doug Hoffman, Hung Nguyen, Mark Johnson, Cem Kaner, BJ

Rollison, Noel Nyman, III, Brian Lawrence, Elisabeth Hendrickson.

At LAWST 10 we discussed Testers’ Notebooks. The participants of LAWST 10 were:

James Bach, III, Brian Lawrence, Melora Svoboda, Cem Kaner, Karen Johnson, Doug

Hoffman, Michael Snyder, Elisabeth Hendrickson, Noel Nyman, Mark Johnson, Brian

Marick, Neal Reizer, Rocky Grober, Bret Pettichord, Marge Farrell, Jaya Carl, Chris

Agruss.

At LAWST 13, LAWST 14, and Sim-LAWST 1 we discussed modeling. These

discussions influenced the material on modeling in this course.

The participants of LAWST 13 were: III, Chris Agruss, Sue Bartlett, Hans Buwalda,

Anne Dawson, Marge Farrell, David Gelperin, Sam Guckenheimer, Elisabeth

Hendrickson, Doug Hoffman, Mark Johnson, Karen Johnson, Cem Kaner, Brian

Lawrence, Hung Nguyen, Bret Pettichord, Harry Robinson, Melora Svoboda.

The participants of LAWST 14 were: Sue Bartlett, Fiona Charles, Ross Collard, Anne

Dawson, Marge Farrell, Sam Guckenheimer, Elisabeth Hendrickson, Doug Hoffman,

Bob Johnson, Mark Johnson, Cem Kaner, Brian Lawrence, Dave Liebreich, James

Lyndsay, Mary McCann, Harry Robinson, Melora Svoboda, Jo Webb.

The participants of Sim-LAWST 1 were: III, Sue Bartlett, Paul Downes, Marge Farrell,

Rochelle Grober, Sam Guckenheimer, Elisabeth Hendrickson, Doug Hoffman, Brian

Lawrence, Serge Lucio, Frank McGrath, Bret Pettichord, Melora Svoboda, Andy

Tinkham, Jo Webb, Melissa Wibom.

At LAWST 16 we discussed Test Analysis. The participants of LAWST 16 were: Brian

Lawrence, Doug Hoffman, Ross Collard, Rocky Grober, Elisabeth Hendrickson, Harry

Robinson, Marge Farrell, Anne Dawson, Mary McCann, Richard Bender.

The Exploratory Testing Research Summit

In February 2006, Cem Kaner and James Bach hosted a LAWST-like meeting

specifically to discuss Exploratory Testing. Our discussions at that meeting were

interesting (to say the least), and James and Elisabeth’s collaboration is a direct result of

the meeting. We thank the participants for their comments, stories, and insights: James

Bach, Jonathan Bach, Scott Barber, Michael Bolton, Elisabeth Hendrickson, Cem Kaner,

Michael Kelly, Jonathan Kohl, James Lyndsay, and Robert Sabourin.


But Wait, There’s More

These materials are the result of a collaboration between James Lyndsay and Elisabeth

Hendrickson. They rely heavily on the materials for James and Elisabeth’s respective

courses on software testing.

Dale Emery was instrumental in developing some of these materials in partnership with

Elisabeth Hendrickson.

In addition, several people have influenced Elisabeth’s thinking in the last several years.

Many of them reviewed previous versions of Elisabeth’s materials:

Brian Marick, Cem Kaner, James Bach, Jerry Weinberg, Noel Nyman, Harry Robinson, James Tierney,
Alan Jorgensen, Johanna Rothman, Esther Derby, Ward Cunningham, Joshua Kerievsky, Chris Sepulveda,
Jack Falk, Kirk Hendrickson, Sam Guckenheimer, Doug Hoffman, Bob Johnson, Brian Lawrence,
Hung Nguyen, Bret Pettichord, Neal Reizer, Brett Schuchert, Melissa Wibom, Jo Webb,
Andy Tinkham, and Gunjan Doshi.


Bibliography

Agans, D. (2002). Debugging: The Nine Indispensable Rules for Finding Even the Most Elusive Software and Hardware Problems. AMACOM.

Beck, K. (1999). Extreme Programming Explained: Embrace Change. Addison-Wesley.

Beizer, B. (1990). Software Testing Techniques (2nd ed.). Thomson Computer Press.

DeMarco, T., & Lister, T. Peopleware: Productive Projects and Teams.

Gause, D. C., & Weinberg, G. M. Are Your Lights On?: How to Figure Out What the Problem Really Is.

Hoffman, D. (1998). "A Taxonomy of Test Oracles". Quality Week Conference. Available online: http://www.softwarequalitymethods.com/Papers/OracleTax.pdf

Iberle, K. (2002). "But Will It Work for Me?" Proceedings of the Pacific Northwest Software Quality Conference. Available online: http://www.kiberle.com/articles.htm

Kaner, C., Bach, J., & Pettichord, B. Lessons Learned in Software Testing.

Kaner, C., Falk, J., & Nguyen, H. (1999). Testing Computer Software (2nd ed.). John Wiley & Sons.

Kruchten, P. (1995). "Architectural Blueprints—the “4+1” View Model of Software Architecture". IEEE Software, 12.

Marick, B. (1997). "Classic Testing Mistakes". Available online: http://www.testing.com/writings/classic/mistakes.html

McCarthy, J., & Gilbert, D. Dynamics of Software Development.

Michalko, M. (1991). Thinkertoys: A Handbook of Business Creativity. Ten Speed Press.

Myers, G. (1979). The Art of Software Testing. John Wiley & Sons.

Oech, R. V. (1990). Creative Whack Pack (Cards ed.). United States Games Systems.

Peterson, I. (1996). Fatal Defect: Chasing Killer Computer Bugs. Vintage.

Seashore, C. N. (1997). What Did You Say?: The Art of Giving and Receiving Feedback. Bingham House Books.

Weinberg, G. M. (1986a). Becoming a Technical Leader: An Organic Problem-Solving Approach. Dorset House.

Weinberg, G. M. (1986b). Secrets of Consulting: A Guide to Giving and Getting Advice Successfully. Dorset House.

Weinberg, G. M. (1992). Quality Software Management (Vols. 1-4). New York: Dorset House.

Whittaker, J. (2002). How to Break Software: A Practical Guide to Testing. Addison-Wesley.