functional testing patterns
DESCRIPTION
Software teams mostly find themselves working with three broad categories of tests: unit, integration, and functional (excluding technology-verification categories like performance, load, and stress tests). Unit tests indicate whether the code is doing things right. Functional tests are complementary to, but quite different from, unit tests: they tell whether the completed application is working correctly and providing the proper functionality. Simply put, unit tests are written from the developer's perspective, while functional tests are written from the end user's perspective. When they work reliably, functional tests give users, stakeholders, and developers confidence that the software meets agreed-upon requirements. In reality, many teams find themselves grappling with perennially failing, hard-to-understand, slow-running tests that take herculean efforts to maintain while inspiring low confidence in the reliability of the end product. This article examines recipes for creating and maintaining a smoothly running suite of functional/acceptance tests that can be reliably used to verify that the software is ready for release.
TRANSCRIPT
Types of Tests (quadrant diagram: Business Facing vs. Technology Facing, Support Programming vs. Critique Product)
• Business Facing / Support Programming: Functional Tests, Acceptance Tests
• Business Facing / Critique Product: Showcases, Exploratory Tests, Usability Tests
• Technology Facing / Support Programming: Unit Tests, Component Tests, System Tests
• Technology Facing / Critique Product: Performance Tests, Stress Tests, Security Tests
http://www.exampler.com/old-blog/2003/08/21/#agile-testing-project-1
© 2012 ThoughtWorks, Inc.
Functional Tests – The Pitch
• What Are Functional Tests?
  o Verification that business requirements are met
  o Black Box
  o Automated
• Why Functional Tests?
  o Maintain (high) external quality
  o Allow team to be bolder
  o Allow team to go faster
Functional Tests – The Reality
• Flaky
• Take a long time to run
• Cumbersome to maintain
• Don't catch many bugs
• Some things are hard to automate
• Don't quite give you the feedback you were hoping for
Consequences
• Low confidence in automation
• Over-reliance on manual testing
• Slipped deadlines AND/OR
• Less testing AND/OR
• Testing in production :-)
How To Get Better?
• People
• Process
• Technology
Process
Functional Test – Example
(Slide: Example – Specification)
(Slide: Example – Implementation)
Implementation Problems?
• No correlation between specification and implementation
• Intent lost in translation
• Easy to write, hard to read and maintain
• Too monolithic, lacks abstraction
• Causes non-technical domain experts to tune out of the process
Executable Specifications
• Specification Step ↔ Implementation Step (tight correlation)
Functional Test – Specification
More useful information in failure reports
Acceptance Criteria DSL
• Given some initial context
• When an event occurs
• Then ensure outcome
• Example –
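The example on the slide isn't captured in this transcript. As a hedged sketch of how a Given/When/Then DSL keeps specification and implementation tightly correlated, each step can be bound to a function through a small decorator-based registry (the mini-framework and the cart scenario here are illustrative, not from the talk):

```python
import re

# Illustrative mini-framework: bind each specification step to an
# implementation function via a regular expression, Cucumber-style.
STEPS = []

def step(pattern):
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

def run(scenario, context=None):
    """Execute a Given/When/Then scenario line by line against the registry."""
    context = context if context is not None else {}
    for line in scenario.strip().splitlines():
        text = re.sub(r"^\s*(Given|When|Then|And)\s+", "", line)
        for pattern, fn in STEPS:
            match = pattern.fullmatch(text)
            if match:
                fn(context, *match.groups())
                break
        else:
            raise AssertionError("No step matches: " + text)
    return context

@step(r"a cart with (\d+) items")
def given_cart(ctx, n):
    ctx["cart"] = int(n)

@step(r"I add (\d+) more items")
def when_add(ctx, n):
    ctx["cart"] += int(n)

@step(r"the cart holds (\d+) items")
def then_cart(ctx, n):
    assert ctx["cart"] == int(n)

run("""
    Given a cart with 2 items
    When I add 3 more items
    Then the cart holds 5 items
""")
```

Because the specification text is the lookup key, renaming a step forces the implementation to move with it, which is the tight correlation the previous slide calls for.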
Functional Test – New Implementation
Good Acceptance Criteria

Better:
Scenario: Advanced Search for a hat
  Given I am searching on Etsy.com
  And I am not logged in
  And I specify the Knitting sub category
  And I search for hat
  Then there are search results

Scenario: Advanced Search for a hat
  Given I am searching on "http://www.etsy.com"
  When I click on the select box with css class "select.handmade"
  And I select "Knitting"
  And I click on the text box with id "#search_query"
  And type in "hat"
  Then there are search results

Focus on WHAT, not HOW
How Many Tests? (test pyramid)
• Unit Tests – 70%
• Integration Tests – 20%
• Functional Tests – 10%
• Manual Testing (above the pyramid)
http://blog.mountaingoatsoftware.com/tag/agile-testing
What About Legacy Systems?
• Start with functional tests
• Cover new functionality with unit tests
• Be on the lookout to replace functional tests with more fine-grained tests at every given opportunity
What To Automate?
• The MOST important scenarios
  o End-to-end system-level scenarios
  o Happy paths
• Avoid automating individual feature-level acceptance criteria
• Make a distinction between "ephemeral" and "long-standing" tests
When do these tests run?
• After every check-in
• As part of the CI build pipeline
  o Unit → Smoke → Regression → Deploy
• Incubator Test Suites
Workflow
• Sprint [N - 1]
  o Feature elaboration with entry-level acceptance criteria
• Sprint [N]
  o Feature kick-off meeting
  o Developers implement feature
  o Testers (optionally) add additional acceptance criteria
• Sprint [N, N + 1]
  o Testers verify all acceptance criteria are satisfied
  o Testers refactor functional test suite
  o Feature marked complete
Technology
Test Characteristics
• Be idempotent
  o At least re-runnable
  o Keep tests isolated from one another
• Be runnable in parallel
  o No inter-dependencies between tests
• Be deterministic
  o Quarantine/eradicate non-deterministic tests
  o Don't "sleep" – prefer callbacks, or at least poll for asynchronous responses
  o Consider mocking remote third-party services; have separate tests to verify interaction with remote services
  o Consider using relative dates/times, or mock the system clock
  o http://martinfowler.com/articles/nonDeterminism.html
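The "poll, don't sleep" advice above can be sketched as a small helper (illustrative, not from the talk): instead of a fixed sleep, the test repeatedly checks a condition with a deadline, so it proceeds as soon as the asynchronous result arrives and fails deterministically if it never does.

```python
import time

def wait_until(condition, timeout=5.0, interval=0.05):
    """Poll `condition` until it returns truthy, or fail after `timeout` seconds.

    Replaces fixed sleeps: the test waits only as long as it has to,
    and a hung asynchronous operation produces a clear failure instead
    of a flaky one.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    raise AssertionError("Condition not met within %.1fs" % timeout)

# Usage: wait for a (simulated) asynchronous job to finish.
job = {"done": False}

def finish_job():
    job["done"] = True

finish_job()  # in a real test this would happen on another thread
wait_until(lambda: job["done"], timeout=1.0)
```

A callback-based notification is still preferable where the system under test offers one; polling is the fallback when it doesn't.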
Test Sources
• Don't have any dependencies on production sources
• Treat as first-class citizens
  o Keep DRY
  o Run static analysis metrics
• Leverage design patterns like
  o Page Objects
  o Compound Steps
• Consider the use of terser languages
  o Groovy, Ruby, etc.
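As a hedged illustration of the Page Object pattern mentioned above (the page class, selectors, and stub driver are all invented for this sketch; a real suite would wrap something like a Selenium WebDriver): the test expresses intent, and only the page class knows about selectors.

```python
class FakeDriver:
    """Stand-in driver that records interactions, for demonstration only."""
    def __init__(self):
        self.fields = {}

    def fill(self, selector, value):
        self.fields[selector] = value

    def click(self, selector):
        # Pretend that submitting the form yields results when a query is set.
        self.fields["results"] = ["knitted hat"] if self.fields.get("#search_query") else []

class SearchPage:
    """Page Object: selectors live here, in one place, not in the tests."""
    QUERY_BOX = "#search_query"
    SUBMIT = "button.search"

    def __init__(self, driver):
        self.driver = driver

    def search_for(self, term):
        self.driver.fill(self.QUERY_BOX, term)
        self.driver.click(self.SUBMIT)
        return self.driver.fields["results"]

# The test reads as intent ("search for hat"), not as UI mechanics.
page = SearchPage(FakeDriver())
assert page.search_for("hat") == ["knitted hat"]
```

When the UI changes, only `SearchPage` is edited; every test that searches stays untouched, which is what keeps the suite DRY and maintainable.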
Test Data
• Use the application to create test data
• Use test data creation steps
  o Frameworks like DbUnit
• Use anonymized production data subsets
  o Requires a lot of time and effort
• Use anonymized production data copies
  o Huge data volumes may make it prohibitive
• Avoid the shared database anti-pattern
• Consider mocking remote third-party services
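A test data creation step can be sketched as a small builder that seeds a throwaway database before each test (a minimal sketch using SQLite's in-memory mode; the table and column names are invented for illustration):

```python
import sqlite3

def fresh_db():
    """Create a throwaway in-memory database with the schema the test needs."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, role TEXT)")
    return conn

def given_user(conn, name, role="customer"):
    """Test-data creation step: insert a user and return its id."""
    cur = conn.execute("INSERT INTO users (name, role) VALUES (?, ?)", (name, role))
    conn.commit()
    return cur.lastrowid

# Each test seeds exactly the data it needs into its own database,
# keeping tests isolated and re-runnable instead of sharing one
# mutable database (the anti-pattern the slide warns against).
conn = fresh_db()
admin_id = given_user(conn, "alice", role="admin")
row = conn.execute("SELECT name, role FROM users WHERE id = ?", (admin_id,)).fetchone()
assert row == ("alice", "admin")
```

Frameworks like DbUnit play the same role on the JVM, loading a known dataset before each test and tearing it down afterwards.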
Test Infrastructure
• Invest in fast build hardware
  o CI servers support multiple remote agents
• Use the same software configuration as production in all environments
• Consider running tests on the cloud
  o http://www.saucelabs.com
People
Who Writes/Owns The Tests?
• The specifications – acceptance criteria?
  o Domain Experts
• Step implementations?
  o Developers & Testers
• The functional suite?
  o The Team
  o Day-to-day – Testers
Team Dynamics
• Fix broken builds and tests immediately
  o OK to occasionally break the build
  o Not OK to leave the build broken for long
• Do not allow check-ins over a broken build
  o Unless the check-in is being made to fix the build
• Co-locate teams as much as possible
  o At least create fully formed remote teams