TRANSCRIPT
What’s Agile got to do with Testing? Key Agile Testing Topics
Colm O’hEocha
Build Quality In
Cost of Delay – Cost of Delayed Testing
Test First and Test Driven Development
User Stories, Acceptance Criteria and SPE
Test Automation in Agile
Exploratory Testing
Copyright © 2014 AgileInnovation
COST OF DELAY – Economies of Speed
AgileTour 2014
Evolution: Sequential, Sprint+1, Mini-Waterfall, Agile
[Diagram: evolution of test timing across Sprints 1–4, contrasting Sequential (code throughout, then a single Test & Retest phase at the end), Sprint+1 (each sprint's code tested and bug-fixed in the following sprint), Mini-Waterfall (a code phase followed by a test phase within each sprint) and Agile (coding and testing interleaved continuously within every sprint).]
• Too late to fix bugs at end of sprint
• Unevenness in workload
• No single version of the truth
• Cost of Delay
What is ‘Cost of Delay’?
• The difference in the value of doing something sooner vs. doing it later. E.g.:
  – Releasing a product this month vs. next month
  – Validating a perceived user need before we build the feature vs. after
  – Fixing a bug today vs. in two weeks’ time
  – Delivering a feature 6 months after it was specified
COD – Delayed Release Example
Source: David Anderson, LKNA14
Bug Lifecycle
[Diagram: bug lifecycle swimlanes for Tester, Project Manager ("Prj Mgr") and Developer – Tester: Create Env → Find Bug → Log Bug Details → Create Bug; Prj Mgr: Review & Prioritise Bug; Developer: Recreate Env → Recreate Bug → Find Source & Fix Bug → Merge & Pass to Test; Tester: Retest. Annotation: "Squish it at birth!"]
Test & Debug – Cost of Delay

Tasks that take up man hours                     Delayed 3 weeks   Delayed 4 hours
Run the Tests that find the bug                        1.00             1.00
Log the Bug – Describe/Characterise it                 0.25             0.05
Log the Bug – Detail to Recreate it                    1.00             0.00
‘Manage the Bug’ – Review, Assign, Prioritise          0.50             0.00
Developer Recreates & Locates the Bug                  2.00             0.25
Fix the Bug                                            0.25             0.25
Merge the Bug with All appropriate branches            0.50             0.00
Find and Fix Regression Issues                         1.00             0.00
Retest Regression Fixes                                0.25             0.00
Merge Regression Fixes                                 0.50             0.00
Recreate Env & Retest the Bug (after delay)            1.00             0.25
TOTAL MAN HOURS                                        8.25             1.80
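The totals can be checked with a quick sketch (figures copied from the table above; the roughly 4.6× multiplier is the slide's point):

```python
# Man-hour figures from the cost-of-delay table above,
# listed task by task in the same order.
delayed_3_weeks = [1, 0.25, 1, 0.5, 2, 0.25, 0.5, 1, 0.25, 0.5, 1]
delayed_4_hours = [1, 0.05, 0, 0, 0.25, 0.25, 0, 0, 0, 0, 0.25]

total_slow = sum(delayed_3_weeks)
total_fast = sum(delayed_4_hours)

print(f"Delayed 3 weeks: {total_slow:.2f} man hours")      # 8.25
print(f"Delayed 4 hours: {total_fast:.2f} man hours")      # 1.80
print(f"Cost multiplier: {total_slow / total_fast:.1f}x")  # 4.6x
```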
COD – Example

[Bar chart (scale 0–12): total man hours per bug – ‘Delayed 3 weeks’ (8.25) vs. ‘Delayed 4 hours’ (1.8).]
TEST FIRST – Whole Team Approach
Test First Development
• Test First: Build your Tests before you Implement
  – Understand the Requirement before designing a solution
• Collaboration between Testers and ‘Customers’
  – ‘Executable Requirements’ – ideally automated
• Collaboration between Testers and Developers
  – ‘Single Source of Truth’
  – Early Testing – ‘Build Quality In’
  – Prevent Defects over Finding Failures
From: Lisa Crispin, 2011
Test Driven Development
RED – Action: Write a Test reflecting the behaviour required – it Fails. Objective: Learn – about the problem to be solved, the functionality to be built.
GREEN – Action: Write the code to make the test pass – as fast as you can (kludge, copy/paste, spaghetti code, whatever). Objective: Learn – about the solution needed to pass the tests.
REFACTOR – Action: Design and re-implement the code in the simplest, most parsimonious and most elegant way you can. Objective: Implement – the cleanest, best designed code you can, based on what you’ve learned about the problem and the required solution.
• Never write a single line of code unless you have a failing test
• Eliminate Duplication
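A minimal sketch of one red/green/refactor cycle (the `word_count` example is invented for illustration, not from the slides):

```python
# RED: write a failing test first -- it describes the behaviour we need.
def test_word_count():
    assert word_count("the quick brown fox") == 4
    assert word_count("") == 0

# GREEN: the quickest thing that makes the test pass -- elegance can wait.
def word_count(text):
    if text == "":
        return 0
    return text.count(" ") + 1

# REFACTOR: re-implement as simply and cleanly as we can,
# with the passing test as a safety net against regressions.
def word_count(text):
    return len(text.split())

test_word_count()
print("all tests pass")
```

The kludged GREEN version breaks on doubled spaces, but it passes the tests we have; the REFACTOR step replaces it safely because the test still guards the required behaviour.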
Test Driven Development
• Primarily about Development
  – It’s about Learning/Understanding the requirement
  – What’s the Simplest thing that would work?
  – Allows us to ‘refactor’ safely – reduced cost of change
  – Enables ‘Emergent Design/Architecture’
  – Encourages callable & testable code
• Secondarily about Testing
  – We test Early – even before we code
  – We test Often (if we automate) – fast feedback
  – Prevent Defects over Finding Failures
TDD, ATDD, BDD: A word of Warning
• What ‘UNITS’ are we talking about?
  – Unit Tests test a ‘Unit’ of implementation
    • Verifies the code does what the developer intended
  – TDD tests a ‘Unit’ of functionality/behaviour
    • Verifies the code implements the behaviour required
• TDD (Beck) is Outside-In: Create tests to test requirements, not implementation (black box)
• Developer-level TDD often seen as ‘unit level’, what developers do to test implementation (white box)
• ATDD is the new TDD – something Testers do to test Functionality/Behaviour (ATDD = BDD)
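The distinction can be sketched with a hypothetical discount function: the white-box test pins an implementation detail, while the behaviour-level test states the requirement and survives refactoring. All names and numbers here are illustrative assumptions, not from the slides.

```python
# Hypothetical implementation: order total with a bulk discount.
def order_total(unit_price, quantity):
    subtotal = unit_price * quantity
    return subtotal * 0.9 if quantity >= 10 else subtotal

# 'Unit' of implementation (white box): couples the test to internals --
# it would break if we refactored, e.g., into a discount lookup table.
def test_discount_factor_applied():
    assert order_total(1.0, 10) == 1.0 * 10 * 0.9

# 'Unit' of behaviour (black box, TDD in Beck's sense): states the
# requirement in the customer's terms -- ten or more items cost 10% less.
def test_bulk_orders_get_ten_percent_off():
    assert order_total(2.0, 10) == 18.0
    assert order_total(2.0, 9) == 18.0

test_discount_factor_applied()
test_bulk_orders_get_ten_percent_off()
```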
USER STORIES – Bridging The Communication Gap
What are ‘User Stories’?
• User Centric – what’s important to your customer
• Story – The Power of Narrative
  – We understand, remember and pay much more attention to stories than facts
  – Collaborating on Stories helps us focus on the users and business value
• User Stories Define:
  – The Actor/Persona/Role (Who)
  – The Action/Functionality (What)
  – The Result/Benefit/Goal (Why)
A brief statement of intent that describes something the system needs to do for the user.
• What it’s not:
  – A Use Case
  – A Requirements Document
  – A Feature Specification
Vertical Slices – High-Level Stories should represent a vertical slice through the system and should be completed in one iteration – and can therefore be tested in the same iteration.
User Story Example – Call Processing

High Roaming Spend Warning (Data)
As a billpay customer with data roaming enabled
I want to be made aware if I’m approaching my threshold
so I can decide whether to incur out-of-plan charges

Acceptance Criteria:
• Initial warning at 80% threshold.
• Follow-up at 95% threshold and option to cap at threshold.
• Warnings issued by SMS and email (where email address validated).
• If 80% threshold achieved in first 25% of billing cycle, send ‘possible fraud alert’ to customer service queue.
CONVERSATION (PO, Testers, Developers):
• How many warnings?
• How should we issue the warning?
• What text do we use in the warnings?
• Do we force the user to acknowledge the warning?
• Can the user configure warning points?
• If they go above their threshold, do we issue periodic reminders/warnings?
Scripts & Specifications
1. Set Thold1=80%, Thold2=95%, Cap=N, In-Plan Limit 1GB, Usage 799MB
2. Create Billing Request for 1MB and post to the user’s account
3. Check that Request is granted and Warning 1 Issued
• On reaching Thold1, Billing Requests are granted and Warning 1 issued.

Subjective and difficult to automate.
Specification by Example
• Realistic Examples contain Precise Information & require Precise Answers
• Concrete examples stimulate discussion & shared understanding
• Edge cases flush out unseen requirements
• Start with ‘How could we verify that this feature is implemented completely and correctly?’ – what are all the examples we need?
On reaching Thold1, Billing Requests (1MB) are granted and Warning 1 issued.

Data Usage   In-Plan Limit   Thold 1   Thold 2   Usage Cap Active   Action
500MB        1GB             800MB     950MB     N
849MB        1GB             800MB     950MB     N
850MB        1GB             800MB     950MB     N                  Warning 1
851MB        1GB             800MB     950MB     N
500MB        1GB             800MB     950MB     Y
849MB        1GB             800MB     950MB     Y
850MB        1GB             800MB     950MB     Y
851MB        1GB             800MB     950MB     Y

Objective and can be automated without interpretation.
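A sketch of how such an example table can drive automation. The rule assumed here (following the earlier scripted example) is that Warning 1 is issued on the request with which usage first reaches Thold1 and no cap is active; the function name and threshold values are illustrative, not the slides' implementation.

```python
# Assumed rule (illustrative): Warning 1 fires exactly when usage
# first reaches Thold1, and only while no usage cap is active.
def action_for(usage_mb, thold1_mb, cap_active):
    if not cap_active and usage_mb == thold1_mb:
        return "Warning 1"
    return ""

# Examples as data, in the spirit of the table above
# (usage, thold1, cap_active, expected action):
examples = [
    (500, 800, False, ""),
    (799, 800, False, ""),
    (800, 800, False, "Warning 1"),
    (801, 800, False, ""),
    (800, 800, True,  ""),
]

for usage, thold1, cap, expected in examples:
    assert action_for(usage, thold1, cap) == expected, (usage, cap)
print("all examples pass")
```

Because the examples are objective, a table-driven test like this runs without interpretation; tools such as FitNesse or Cucumber industrialise the same idea.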
User Stories & Examples – ‘describe, demonstrate and develop’

[Diagram: a User Story provides breadth; clusters of Examples provide depth. A ‘spine’ example sits at the centre, surrounded by typical-use examples and exceptions, with edge cases at the extremes.]
TEST AUTOMATION – Autonomation
Test Automation – What we do (but shouldn’t)
The Automation Pyramid (top to base; based on Mike Cohn)
• Manual Tests – e.g. exploratory
• GUI layer – e.g. Selenium – automate at feature/workflow level
• API/Service layer – Acceptance Tests, e.g. FitNesse, Cucumber – automate at story level
• Unit/Component layer – Developer Tests, e.g. JUnit – automate at design level
Things to Consider:
• Stable over Time / Cheap to Maintain
• Test First/Early
• Cheap to Produce (TDD Collateral)
• Quick to Run / Reduced Overlap
• Pinpoint Defects
EXPLORATORY TESTING – The Spy that came in from the Cold
Hendrickson: a style of testing in which you explore the software while simultaneously designing and executing tests, using feedback from the last test to inform the next.

Bolton: Operating and observing the product with the freedom and mandate to investigate it in an open-ended search for information about the program.

Kaner: Simultaneous learning, design and execution, with an emphasis on learning.

“… agile programs are more subject to unintended consequences of choices simply because choices happen so much faster. This is where exploratory testing saves the day. Because the program always runs, it is always ready to be explored.” – Ward Cunningham, on why agile teams should do exploratory testing
Exploratory Testing
• Advantages:
  – FAST:
    • Minimal preparation (e.g. a charter/objective, timebox)
    • Test Cases/Procedures not recorded, only incidents
  – FOCUSED:
    • On a particular area to achieve coverage/confidence
    • Risk-based focus – re-evaluated as testing proceeds
  – PRODUCTIVE:
    • Finds more defects per unit of effort than other methods
• Disadvantages:
  – Un-repeatable and not suitable for regression
  – Requires a skilled tester to execute, with knowledge of the product/domain
  – Need to avoid reverting to ‘ad hoc’ testing
Why do exploratory testing?
• 1–2 hour sessions, each starting with a charter and ending with a report/debrief
• Charter – identifies the goal of the session:
  – What to test? How long for?
  – Areas to explore – risky areas?
  – Bug types to be investigated?
  – Extreme values/tricky situations?
• In general, exploratory testing can help you get familiar with the product & target hotspots:
  – Try extremes, anything you like, to break the product
  – When you find a bug, explore around it
  – Look for patterns, clues to other bugs
  – Document the test that caused the failure; document other tests that cover ‘interesting’ situations
  – Report/Log an incident
  – Move on to the next interesting area
Example: Session-based testing
Q&A
– Training – Consulting – Coaching – Assessments