Coming to TERMS: When It’s Time to Start Test Automation
Michael Larsen, SideReel.com


DESCRIPTION

How to know if your project is ready for test automation.

TRANSCRIPT

Page 1: Coming to TERMS with Test Automation

Coming to TERMS
When It’s Time to Start Test Automation
Michael Larsen, SideReel.com

Page 2: Coming to TERMS with Test Automation

Origin

Software Test & Quality Assurance Magazine, Jan./Feb. 2012
Authors: Michael Larsen & Albert Gareev
Part One of a two-part series on test automation heuristics from a tester’s perspective.

Page 3: Coming to TERMS with Test Automation

What’s Happening?

Testers are developing coding chops: practicing development at home, at meet-ups, coding dojos, hack nights, etc.
Testers are preparing to dive head first into automated software testing.

Page 4: Coming to TERMS with Test Automation

What’s The Goal?

Who’s making these decisions?
What are these projects going to address?
Do you know? Does anybody know?

Page 5: Coming to TERMS with Test Automation

What’s The Issue?

Many test automation projects:
have no real direction
no clear goal or focus
automation for the sake of automation
Is there a better way?

Page 6: Coming to TERMS with Test Automation

Ground Rules

From the perspective of testers:
A test is never “automated.”
A pre-scripted sequence of instructions is run; it more or less attempts to imitate manual operations performed by a human.
Active thinking and learning is completely missing.
Testers want to be continuously engaged in an active search for new problems and new risks.

Page 7: Coming to TERMS with Test Automation

Why Automate?

Automation can be of service in the process of verification and confirmation:
Are previously implemented requirements still met?
Can we verify that known risks do not reappear?

Page 8: Coming to TERMS with Test Automation

Why Automate?

Some critical product risks can only be discovered through interactions that are:
high-volume
prolonged
stochastic (randomly determined)
beneath-the-GUI
The importance of each factor might vary depending on context (see the sketch after this slide).
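As a minimal sketch of a stochastic, high-volume interaction, here is one way it could look in Python; the normalize_name() function and its invariants are hypothetical examples for illustration, not anything named in the deck.

def normalize_name(raw: str) -> str:
    """Assumed behavior: trim whitespace and collapse internal runs of spaces."""
    return " ".join(raw.split())

import random
import string

random.seed(42)  # reproducible randomness, so any failure can be replayed
alphabet = string.ascii_letters + "  \t"

# Hammer the function with 100,000 random inputs and check invariants that
# should hold for any input; manual testing rarely reaches this volume.
for i in range(100_000):
    raw = "".join(random.choice(alphabet) for _ in range(random.randint(0, 40)))
    result = normalize_name(raw)
    assert result == result.strip(), f"case {i}: leading/trailing space left"
    assert "  " not in result, f"case {i}: double space left in {result!r}"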

Page 9: Coming to TERMS with Test Automation

TERMS

Testers work with heuristics: models or assumptions that may be flawed but are useful in various contexts.
Different heuristics suit different contexts.
TERMS is a heuristic mnemonic to help decide if a project is ready for test automation.

Page 10: Coming to TERMS with Test Automation

T

TOOLS & TECHNOLOGY
The platform gives us options or constraints as to the testing that can be performed.
Different requirements: mobile app, desktop app, web app.

Page 11: Coming to TERMS with Test Automation

T

“Can we support and work with the technologies we want to test with the tools we have?”
Are the application’s technologies well supported by the tool?
Recognition, Connection, Interaction, Reliability

Page 12: Coming to TERMS with Test Automation

T

Things to consider:
Can we handle unusual actions?
custom objects
non-existing objects
hidden and/or disabled objects
objects that can change their properties at run-time (see the sketch after this slide)
Do we take into account whether changes:
impact the application’s functionality?
compromise security?
If (when) the application’s technologies evolve, how will the answers to all of these questions change?
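As a minimal sketch of coping with objects that appear late, stay hidden or disabled, or change properties at run-time, assuming a web app and Selenium WebDriver in Python (the deck names no specific tool), and a hypothetical page and element id:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
driver.get("https://example.test/orders")  # hypothetical application URL
wait = WebDriverWait(driver, timeout=10)

# Wait until the element both exists and is clickable, rather than assuming
# it is immediately present with stable properties.
submit = wait.until(EC.element_to_be_clickable((By.ID, "submit-order")))

# Hidden or disabled objects: check state explicitly instead of failing blindly.
if submit.is_displayed() and submit.is_enabled():
    submit.click()
else:
    print("submit-order is present but not interactable; log and investigate")

driver.quit()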

Page 13: Coming to TERMS with Test Automation

E

EXECUTION
“Any repetitive task should be automated.” - a common test automation mantra
Just as often wrong as it is right.

Page 14: Coming to TERMS with Test Automation

E

Example: a manual, repetitive process.
Is it an area that is really a sapient process?
Is it repetitive only at a high level: actually highly cognitive work (can’t be done without human judgment)?
Are logic and data mutating with every transaction?

Page 15: Coming to TERMS with Test Automation

E

Automation has extremely limited observation capacities.
Creating a single transaction as a “test case” reduces the scope and quality of testing, in comparison to skilled human exploration.
Running such an automation script hundreds of times can serve volume tests, memory management checks, race-condition hunting, and load or stress tests (see the sketch after this slide).
Execution of scripts creates an opportunity for testers to observe and identify problems.
Execution by itself does not find any problems.
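As a minimal sketch of reusing one scripted transaction as a crude load/volume probe, in Python; run_transaction() is a hypothetical stand-in for the single “test case” transaction described above, not something from the deck.

import concurrent.futures
import time

def run_transaction(i: int) -> float:
    """Placeholder for one scripted transaction; returns elapsed seconds."""
    start = time.perf_counter()
    # ... drive the application through one transaction here ...
    return time.perf_counter() - start

# Run the same transaction hundreds of times, in parallel, and keep the
# timings so a human can look for drift, spikes, or outright failures.
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
    timings = list(pool.map(run_transaction, range(500)))

print(f"min={min(timings):.3f}s max={max(timings):.3f}s "
      f"avg={sum(timings)/len(timings):.3f}s")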

Page 16: Coming to TERMS with Test Automation

E

Things to consider:
Do scripts provide reliable information?
Is the test report usable?
How much can you rely on prompt execution when requested?
What is your back-up plan if you can’t use them as expected?

Page 17: Coming to TERMS with Test Automation

R

REQUIREMENTS and RISKS
It is possible to create useful automation that verifies expected results (see the sketch after this slide).
Actually creating a script that can explore, investigate, and report on possible problems is much harder (maybe impossible).
Most of the potential problems will be found during the creation of the automation scripts.
Unstable requirements = endless maintenance.
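As a minimal sketch of the narrow “verify expected results” check that automation is good at, in Python; the pricing rule and discount_total() function are hypothetical examples, not requirements from the deck.

import unittest

def discount_total(subtotal: float) -> float:
    """Apply a 10% discount on orders of 100.00 or more (assumed rule)."""
    return round(subtotal * 0.9, 2) if subtotal >= 100.00 else subtotal

class PreviouslyImplementedRequirements(unittest.TestCase):
    # Each test re-confirms a requirement that was already implemented,
    # guarding against the known risk of it silently regressing.
    def test_discount_applied_at_threshold(self):
        self.assertEqual(discount_total(100.00), 90.00)

    def test_no_discount_below_threshold(self):
        self.assertEqual(discount_total(99.99), 99.99)

if __name__ == "__main__":
    unittest.main()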

Page 18: Coming to TERMS with Test Automation

R

Things to consider:
What product risks can automated scripts address?
What areas will automation not be able to address?
If risks are not addressed with automation, will human-engaged testing be required?
What if all risks can be covered with nearly the same effort?

Page 19: Coming to TERMS with Test Automation

M

MAINTENANCE
After the initial and expensive development effort, it is desirable to use and maintain those scripts at minimal cost.
Maintenance effort is distributed across:
Source code (most expensive)
Test logic (moderately expensive)
Test data (least expensive)
“Cheap maintenance” could become very expensive if you have a large volume of data or data sets change frequently (see the sketch after this slide).
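As a minimal sketch of keeping test data out of source code so routine maintenance stays in the cheapest layer, in Python; the file name, its columns, and the login_accepted() function are hypothetical, not taken from the deck.

import csv

def login_accepted(username: str, password: str) -> bool:
    """Placeholder for driving the application's login; assumed behavior."""
    return bool(username) and len(password) >= 8

# The test logic only reads the cases; updating or adding rows in logins.csv
# never touches the script itself.
with open("logins.csv", newline="") as f:
    for row in csv.DictReader(f):  # columns: username,password,expected
        expected = row["expected"] == "accepted"
        actual = login_accepted(row["username"], row["password"])
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status}: {row['username']}")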

Page 20: Coming to TERMS with Test Automation

M

Things to consider:
Will GUI changes mean high maintenance costs?
Will under-the-hood “hooks” give the answers developers are after?
To debug and update existing code, do testers have programming skills and experience? If not, who does?
Is testing logic baked into the source code? What if test data were hard-coded?
What if updating / fixing automation scripts takes more time than performing manual, engaged, sapient testing of the same areas of the application?

Page 21: Coming to TERMS with Test Automation

S

SECURITY
Listed last because it may or may not be applicable in all evaluations.
Even if the SUT has no security policies, does automation compromise corporate security settings and policies?

Page 22: Coming to TERMS with Test Automation

S

Things to consider:
Does the tool require disabling the firewall? Granting administrator privileges?
What if the tool reads username/password data? Where does it store this information? (See the sketch after this slide.)
Does the tool support connecting over a secured Internet connection protocol? Could turning it off obfuscate the performance test results?
If your automation relies on API calls, do you plan to test afterwards that the final build does not allow using any of these backdoors?
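As a minimal sketch of keeping credentials out of scripts and logs, in Python; the environment variable names and login URL are hypothetical, not from the deck.

import os
import getpass

# Prefer environment variables (set by the CI system or the tester's shell)
# so the automation never stores username/password data on disk.
username = os.environ.get("SUT_USERNAME")
password = os.environ.get("SUT_PASSWORD")

if not username or not password:
    # Fall back to an interactive prompt that does not echo the password.
    username = username or input("SUT username: ")
    password = password or getpass.getpass("SUT password: ")

# When reporting, never print the secret itself.
print(f"Authenticating to https://example.test/login as {username} "
      f"(password length: {len(password)})")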

Page 23: Coming to TERMS with Test Automation

Conclusion

Automation is a service to testing.
Start by assessing testing needs (execution and requirements) and risks.
The application’s technologies will narrow down the list of tools.
Expenses associated with a tool may affect whether it’s feasible.
Tools may reveal technical challenges and limitations.
Testing needs and the available timeframe will have a direct impact on the automation approach.
Maintenance requirements must be considered before you get started.
Defining our automation TERMS helps us:
assess capabilities
provide a solid grounding for our expectations

Page 24: Coming to TERMS with Test Automation

Questions?