Download - Blunders in Test Automation
K4
Keynote
5/7/2015 8:30:00 AM
Blunders in Test Automation
Presented by:
Dorothy Graham
Software Test Consultant
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073 888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com
Dorothy Graham
Software Test Consultant
In software testing for over forty years, Dorothy Graham is coauthor of four books—Software
Inspection, Software Test Automation, Foundations of Software Testing and Experiences of Test
Automation—and is currently working with Seretta Gamba on a new book on a test
automation patterns wiki. A popular and entertaining speaker at conferences and seminars
worldwide, Dot has attended STAR conferences since the first one in 1992. She was a founding
member of the ISEB Software Testing Board and a member of the working party that developed
the ISTQB Foundation Syllabus. Dot was awarded the European Excellence Award in Software
Testing in 1999 and the first ISTQB Excellence Award in 2012. Learn more about Dot at
DorothyGraham.co.uk.
© Dorothy Graham 2015
www.DorothyGraham.co.uk
1
Test Automation Blunders
Prepared and presented by
Dorothy Graham email: [email protected]
Twitter: @DorothyGraham
www.DorothyGraham.co.uk
2
Blunder
• from the Old Norse word “blundra”
– meaning “to shut one’s eyes”
• now means
– mistake caused by ignorance, carelessness or not
thinking things through
– a foolish or tactless remark
– to move clumsily
– to mismanage
• people blunder when they don’t see or
understand
3
Contents
• Test automation blunders
– Testing-Tools-Test
– Who needs GPS?
– Silver Bullet
– The Wrong Thing
– How hard can it be?
– Stable Application Myth
– Project / Not Project
– Inside the Box
– Isolationism
• Conclusion
4
Testing-Tools-Test
• blunder: thinking that tools actually do testing
– most important, at the root of other blunders
• should not be called “testing tools”
• checking vs testing (Michael Bolton)
– a check is automatable
– a test requires sapience
• better names:
– “tester assistance tools”
– “check-running tools”
5
People (testers) vs tools
• what do people do?
– think, evaluate, assess, decide, observe, interpret
– recognize patterns, have new ideas, find bugs
– make mistakes
• what do tools do?
– they just run stuff - whatever they’ve been
programmed to execute (including bad tests)
– intelligence level: zero
Get tools to do what computers do best,
get testers to do what people do best
6
Who needs GPS?
• if you don’t know where you are going, any
road will do (Lewis Carroll)
• where are you going with your automation?
– testing and automation are different activities
– different activities require different objectives
• good objectives for testing?
– find bugs, gain confidence, investigate
• good objectives for automation?
– Hint: they shouldn’t be the same!
8
What finds most bugs?
[Chart: likelihood of finding bugs, ranging from regression tests (low) to exploratory testing (high). Regression tests, the least likely to find bugs, are what is most often automated.]
9
Automation success = find lots of bugs?
• tests find bugs, not automation
• automation is a mechanism for running tests
• the bug-finding ability of a single test is not
affected by the manner in which it is executed
• this can be a dangerous objective
– especially for regression automation!
[Chart: bugs found by test type – automated tests, manual scripted, exploratory testing, fix verification]
Experiences of Test Automation, Ch 27, p 503, Ed Allen & Brian Newman
10
When do automated tests find bugs?
• the first time a test is run
– Test-Driven Design in Agile (& BDD)
– Model-Based Testing (MBT), automated test
design, monkey testing
– keyword-driven (e.g. users populate spreadsheet)
• find bugs in parts we wouldn’t have tested?
– automation allows you to run more tests
– more testing finds more bugs
11
Efficiency and effectiveness
[2×2 chart: efficiency (manual testing → automated) against effectiveness (low → high)]
• manual, low effectiveness: poor slow testing (worst)
• automated, low effectiveness: poor fast testing (not good, but common)
• manual, high effectiveness: good slow testing (better)
• automated, high effectiveness: good fast testing (greatest benefit)
12
Silver bullet solution: get the right tool
• no such thing as “the right tool” or “best tool”
– what’s “the best car”?
[2×2 chart: cost (against budget) vs benefits. Quadrants: low cost/poor benefits, high cost/poor benefits, high cost/good benefits, low cost/good benefits. Investment in good automation aims at moderate cost with good benefits. Where do commercial tools sit? Open source tools?]
14
Silver bullet: success is automatic
• automation is (much) more than just a tool
• it takes time and effort to succeed
– building good automation is a learning process
• management support is critical
– high-level managers need to understand
automation’s capabilities and limitations, and have
realistic expectations and budget
– “people issues” – people use the automation,
people develop the automation
15
Automate manual tests?
[Venn diagram: manual tests and automated tests overlap. Some manual tests are automated (a % of the manual tests); some are not worth automating; some are simply not automated yet. Automated tests also include tests (and verification) not possible to do manually, plus new approaches, e.g. monkey testing.]
17
Automation roles
• engine: test tool
• car: testware architecture
• passengers: test cases
• driver: tester
• mechanic: test automator
• car designer: testware architect
• fleet manager: test automation manager
18
Testware architecture
– poor architecture gives
high maintenance cost
• most frequent cause of
abandoned automation /
shelfware
– two layers of abstraction
• technical: for long life
• human: for wide use
– using the tool’s
architecture ties you to
that tool (version)
[Diagram: testers write tests in a DSTL using high-level keywords; the keywords map to structured scripts; the test execution tool runs the scripts. The keyword and script layers form the testware architecture. (Ch 6)]
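The two abstraction layers above can be sketched in a few lines: a human layer of high-level keywords that testers write tests in, mapped onto a technical layer of scripts that drive the tool. This is a minimal sketch, not any particular tool's API; all keyword and function names are hypothetical.

```python
# Technical layer: structured scripts. In real automation these would
# call the test execution tool's API; here they just return a trace.
def open_account(name):
    return f"opened account for {name}"

def deposit(name, amount):
    return f"deposited {amount} into {name}'s account"

# Human layer: high-level keywords mapped to the scripts beneath them.
KEYWORDS = {
    "OpenAccount": open_account,
    "Deposit": deposit,
}

def run_test(steps):
    """Interpret a keyword-driven test: each step is (keyword, args...)."""
    return [KEYWORDS[keyword](*args) for keyword, *args in steps]

# A test written in the domain-specific test language (DSTL):
log = run_test([
    ("OpenAccount", "alice"),
    ("Deposit", "alice", 100),
])
```

Because tests are written only against the keyword layer, a change of tool (or tool version) touches the scripts, not the tests.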
19
How hard can it be?
• different activities require different skills
• classic blunder: let the testers automate
– automating without automation skills?
• newer blunder: let the automators write the
tests
– testing without testing skills?
21
Is it the tester’s job to automate tests?
– driver’s job to maintain the car and engine?
– test execution automation is software development
• needs programming skills
– not all testers want to become developers
• or would be good at it
– do automators need testing skills?
• helpful but not essential
– if testers are automators → a conflict of interest
• do you run tests or do you automate tests?
• automation is better long-term, BUT
• deadline pressure pushes you back into manual testing
22
Tools will replace testers?
• “we can reduce the number of testers once we
have the tool”
– what are your testers like?
• mindless morons, or
• intelligent investigators?
– need more skills, not fewer
– automation can free testers to do more test
design, exploratory testing
• and find more bugs
– tools don’t replace testers, they support them
23
Stable application myth
• can’t start automating until the application (or
the GUI) is stable
– a throwback to testing attitudes of 30 years ago?
• testing comes at the end?
• testing is (only) execution?
• testing is more than test execution
• automation is more than execution
• design your automation early, ready to run
when anything is ready to test
25
When to write automated tests
• test automation pyramid
– Mike Cohn, Lisa Crispin (Ch 1)
– more unit/component tests
– fewer GUI tests
• which aspects are stable? (GUI often late)
– build tests for stable parts when ready
– build (GUI) tests with detail abstracted out
• opportunity rather than a problem
– build flexibility & robustness in your automation
against common types of changes
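One common way to build GUI tests "with detail abstracted out" is a page-object-style layer: the test speaks in terms of intent, while the page class owns the widget locators. A minimal sketch with a stand-in driver; the class, locator, and method names are hypothetical, not from a real GUI tool.

```python
class FakeDriver:
    """Stand-in for a real GUI automation driver."""
    def __init__(self):
        self.fields = {}

    def type_into(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        # Pretend the login succeeds whenever a user name was typed in.
        return bool(self.fields.get("id=user"))

class LoginPage:
    # Locators live in one place: if the GUI changes, only this
    # class changes, not the tests that use it.
    USER_FIELD = "id=user"
    LOGIN_BUTTON = "id=login"

    def __init__(self, driver):
        self.driver = driver

    def login_as(self, user):
        self.driver.type_into(self.USER_FIELD, user)
        return self.driver.click(self.LOGIN_BUTTON)

# The test reads at the level of intent, not widgets:
ok = LoginPage(FakeDriver()).login_as("dot")
```

Abstracting the detail this way is what gives the automation its robustness against common GUI changes.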
26
Project / not project
• not thinking of automation as a project
– doesn’t need funding, resourcing
– just do it in your “spare time”
• thinking that automation is (just) a project
– when will automation be finished? (wrong question!)
– needs on-going continuous improvement
– refactoring at regular intervals
• project to start automation, then continued
support
28
Automated tests / automated testing

Automated tests:
• select / identify test cases to run
• set up the test environment: create test environment, load test data
• repeat for each test case: set up test prerequisites; execute; compare results; log results; analyse test failures; report defect(s); clear up after the test case
• clear up the test environment: delete unwanted data, save important data
• summarise results

Automated testing:
• select / identify test cases to run
• set up the test environment: create test environment, load test data
• repeat for each test case: set up test prerequisites; execute; compare results; log results; clear up after the test case
• clear up the test environment: delete unwanted data, save important data
• summarise results
• analyse test failures
• report defects

[Legend: in the original diagram each step is marked as an automated or a manual process.]
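The per-test-case loop described above can be sketched as a simple runner: execute, compare results, log, then summarise. Environment set-up and clear-up are stubbed out as comments, and the test cases and names are hypothetical.

```python
def run_suite(test_cases):
    """Run (name, action, expected) test cases and summarise results."""
    summary = {"pass": 0, "fail": 0}
    log = []
    # set up test environment (stubbed out in this sketch)
    for name, action, expected in test_cases:
        actual = action()                                 # execute
        passed = (actual == expected)                     # compare results
        log.append((name, "PASS" if passed else "FAIL"))  # log results
        summary["pass" if passed else "fail"] += 1
        # clear up after test case (stubbed out)
    # clear up test environment (stubbed out)
    return summary, log                                   # summarise results

summary, log = run_suite([
    ("add", lambda: 2 + 2, 4),
    ("upper", lambda: "dot".upper(), "DOT"),
    ("buggy", lambda: 2 + 2, 5),   # deliberately failing check
])
```

Note what the loop does not do: analysing the failures and reporting defects still needs a person, which is exactly the gap between "automated tests" and "automated testing".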
30
Isolationism: isolated from
• change: encasing your first efforts in stone
– automated tests benefit from review and refactoring
• realism: optimism as a test strategy for the automation
– automated tests have bugs too, and need to be tested
– realistic expectations
• managers: not making automation benefits visible
– to the people who matter, in a way that communicates
• developers: not collaborating
– design for automated testability, help developers
• the wider world: not seeking existing knowledge
– books, wiki*, articles
*testautomationpatterns.wikispaces.com
32
Summary of blunders
• Test automation blunders
– Testing-Tools-DON’T-Test: they just run stuff
– Who needs GPS? automation needs direction
– NO Silver Bullet: it needs time and effort
– The Wrong Thing: testware architecture; manual tests?
– How hard can it be? it needs different skills
– Stable Application Myth: start early, at different levels
– Project at times / Not just a Project: ongoing, refactor
– Inside the Box: automated testing is more than automated tests
– Isolationism: collaborate, experiment, read
• Conclusion
Are these blunders alive and well in your organization? How can you avoid or recover from them?
33
Concluding advice
• understand and communicate
– see these blunders for the mistakes they are
• define objectives for automation
– different to objectives for testing
• testware architecture is critical for long life
• automate the right things
• if you want a return, you need to invest
• keep learning and improving your automation
34
Thank you!
• More information:
– downloads: www.DorothyGraham.co.uk (articles and papers)
– email: [email protected]
– blog: http://dorothygraham.blogspot.com (including automation, DDP, certification)
– Twitter: @DorothyGraham
– TestAutomationPatterns.org: free wiki of automation advice, with Seretta Gamba