Coming to Terms with Test Automation

DESCRIPTION
How to know if your project is ready for test automation.

TRANSCRIPT
Coming to TERMS
When it’s Time to Start Test Automation
Michael Larsen, SideReel.com
Origin
Software Test & Quality Assurance Magazine, Jan./Feb. 2012
Authors: Michael Larsen & Albert Gareev
Part One of a two-part series on test automation heuristics from a tester’s perspective.
What’s Happening?
Testers are developing coding chops: practicing development at home, at meet-ups, coding dojos, hack nights, etc. Testers are preparing to dive head first into automated software testing.
What’s The Goal?
Who’s making these
decisions? What are these projects going to address? Do you know?
Does anybody know?
What’s The Issue?
Many test automation projects have no real direction, no clear goal or focus: automation for the sake of automation. Is there a better way?
Ground Rules
From the perspective of testers, a test is never “automated.” A pre-scripted sequence of instructions is run, which more or less attempts to imitate manual operations performed by a human. Active thinking and learning are completely missing.
Testers want to be: continuously engaged in an
active search for new
problems and new risks.
Why Automate?
Automation can be of service in the process of verification and confirmation: Are previously implemented requirements still met? Can we verify that known risks do not reappear?
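Confirmation-style automation of this kind can be sketched as a pair of regression checks. This is a hypothetical toy example, not from the article; the `invoice_total` function and the earlier defect it pins down are invented for illustration.

```python
# Toy system under test (hypothetical): total an invoice in cents.
def invoice_total(line_items):
    return sum(qty * price_cents for qty, price_cents in line_items)

# Confirmation: is a previously implemented requirement still met?
def test_requirement_still_met():
    assert invoice_total([(2, 500), (1, 250)]) == 1250

# Verification: has a known risk (a previously fixed defect) reappeared?
def test_known_risk_has_not_reappeared():
    # Pinned to an invented earlier defect: empty invoices once crashed.
    assert invoice_total([]) == 0
```

Checks like these confirm what is already known; they do not search for new problems.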
Why Automate?
Some critical product risks can only be discovered through interactions that are: high-volume, prolonged, stochastic (randomly determined), or beneath-the-GUI. The importance of each factor may vary depending on context.
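A stochastic interaction can be sketched as feeding many randomly determined inputs through the system and checking an invariant each time. This is a hypothetical example; `normalize` and its idempotence invariant are invented for illustration.

```python
import random

def normalize(s):
    """Toy system under test (hypothetical): trim and lowercase input."""
    return s.strip().lower()

def stochastic_check(trials=1000, seed=42):
    """Run many randomly determined inputs through the system, checking an
    invariant that a fixed, scripted input would exercise only once."""
    rng = random.Random(seed)
    failures = []
    for _ in range(trials):
        s = "".join(rng.choice(" AbCxyz") for _ in range(rng.randint(0, 12)))
        # Invariant: normalizing twice must change nothing (idempotence).
        if normalize(normalize(s)) != normalize(s):
            failures.append(s)
    return failures
```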
TERMS
Testers work with heuristics: models or assumptions that may be flawed but are useful in various contexts. Different heuristics suit different contexts.

TERMS
A heuristic mnemonic to help decide if a project is ready for test automation.
T
TOOLS & TECHNOLOGY
The platform gives us options or constraints as to the testing that can be performed. Different requirements: mobile app, desktop app, web app.
T
Can we support and work with the technologies we want to test using the tools we have? Are the application’s technologies well supported by the tool? Recognition, Connection, Interaction, Reliability.
T
Things to consider: Can we handle unusual actions? Custom objects, non-existent objects, hidden and/or disabled objects, objects that can change their properties at run-time. Do we take into account whether changes impact the application’s functionality? Compromise security? If (when) the application’s technologies evolve, how will the answers to all of these questions change?
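Objects that change their properties at run-time usually force automation to poll for a state rather than assume it. A minimal sketch of such a polling helper, hypothetical and independent of any particular GUI tool (real tools ship their own synchronization mechanisms):

```python
import time

def wait_until(condition, timeout=2.0, interval=0.05):
    """Poll until condition() returns a truthy value or the timeout expires.
    Returns the last value seen (truthy on success, falsy on timeout)."""
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result or time.monotonic() >= deadline:
            return result
        time.sleep(interval)

# Simulated object whose "enabled" property changes at run-time.
button = {"enabled": False}
button["enabled"] = True  # state flips before the script reads it
assert wait_until(lambda: button["enabled"])
```

The helper does not make recognition reliable by itself; it only gives a changing object time to settle before the script interacts with it.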
E
EXECUTION
“Any repetitive task should be automated.” - a common test automation mantra. It is just as often wrong as it is right.
E
Example: a manual, repetitive process. Is it an area that is really a sapient process? Is it repetitive only at a high level, actually highly cognitive work that can’t be done without human judgment? Are the logic and data mutating with every transaction?
E
Automation has extremely limited observation capacities. Creating a single transaction as a “test case” reduces the scope and quality of testing in comparison to skilled human exploration. Running such an automation script hundreds of times enables volume tests, memory-management checks, race-condition detection, and load or stress tests. Execution of scripts creates an opportunity for testers to observe and identify problems; execution by itself does not find any problems.
E
Things to consider: Do the scripts provide reliable information? Is the test report usable? How much can you rely on prompt execution when requested? What is your back-up plan if you can’t use the scripts as expected?
R
REQUIREMENTS and RISKS
It is possible to create useful automation that verifies expected results. Actually creating a script that can explore, investigate, and report possible problems is much harder (maybe impossible). Most of the potential problems will be found during the creation of the automation scripts. Unstable requirements = endless maintenance.
R
Things to consider: What product risks can automated scripts address? What areas will automation not be able to address? If risks are not addressed with automation, will human-engaged testing be required? What if all risks can be covered with nearly the same effort?
M
MAINTENANCE
After an initial and expensive development effort, it is desirable to use and maintain those scripts at minimal cost. Maintenance effort is distributed across: source code (most expensive), test logic (moderately expensive), test data (least expensive). “Cheap maintenance” could become very expensive if you have a large volume of data or if data sets change frequently.
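Keeping checks in the cheapest maintenance layer, the test data rather than the source code, is commonly sketched as a data-driven loop. This is a hypothetical example; in practice the rows would live in an external file or spreadsheet rather than an in-memory string.

```python
import csv
import io

# Test data kept separate from test logic (an in-memory CSV stands in for
# an external file here).
TEST_DATA = """qty,price_cents,expected
2,500,1000
3,333,999
0,100,0
"""

def total(qty, price_cents):
    """Toy system under test (hypothetical)."""
    return qty * price_cents

def run_data_driven(data=TEST_DATA):
    """Return the rows whose expected value did not match the result."""
    failures = []
    for row in csv.DictReader(io.StringIO(data)):
        if total(int(row["qty"]), int(row["price_cents"])) != int(row["expected"]):
            failures.append(row)
    return failures
```

Adding or fixing a check then means editing a data row, not code; the slide’s caveat still applies when the data set itself is large or churns frequently.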
M
Things to consider: Will GUI changes mean high maintenance costs? Will under-the-hood “hooks” give the answers developers are after? Do testers have the programming skills and experience to debug and update the existing code? If not, who does? Is testing logic baked into the source code? What if the test data were hard-coded? What if updating and fixing automation scripts takes more time than performing manual, engaged, sapient testing of the same areas of the application?
S
SECURITY
Listed last because it may or may not be applicable in all evaluations. Even if the SUT has no security policies, does the automation compromise corporate security settings and policies?
S
Things to consider: Does the tool require disabling the firewall? Granting administrator privileges? What if the tool reads username/password data? Where does it store this information? Does the tool support connecting over a secured Internet connection protocol? Could turning that off distort the performance test results? If your automation relies on API calls, do you plan to test afterwards that the final build does not allow using any of these backdoors?
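One way to keep username/password data out of scripts and tool project files is to resolve credentials from the environment at run time. A minimal sketch; the variable names `SUT_USERNAME` and `SUT_PASSWORD` are invented for this example.

```python
import os

def get_credentials():
    """Read SUT credentials from the environment rather than hard-coding
    them or letting the tool persist them in project files.
    (SUT_USERNAME / SUT_PASSWORD are hypothetical names.)"""
    user = os.environ.get("SUT_USERNAME")
    password = os.environ.get("SUT_PASSWORD")
    if not user or not password:
        raise RuntimeError("credentials not configured; refusing to fall "
                           "back to hard-coded values")
    return user, password
```

Failing loudly when credentials are missing beats silently falling back to a hard-coded account, which is exactly the kind of leftover backdoor the slide warns about.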
Conclusion
Automation is a service to testing. Start by assessing testing needs (execution and requirements) and risks. The application’s technologies will narrow down the list of tools. Expenses associated with a tool may affect whether it is feasible. Tools may reveal technical challenges and limitations. Testing needs and the available timeframe will have a direct impact on the automation approach. Maintenance requirements must be considered before you get started.
Defining our automation TERMS helps us assess capabilities and provides a solid grounding for our expectations.
Questions?