beyond "quality assurance"
TRANSCRIPT
Jason Benton
Agile Birmingham
June 2014
Beyond “Quality Assurance”
The secrets of testing on an agile team
AGILE
You keep using that word. I do not think it means what you think it means
The Princess Bride (1987) ©
Teams are small and focused
Smaller, more frequent releases
Collaboration is essential
Self-organizing teams produce the best software
Christoffer A Rasmussen
Continuous improvement
StuckInCustoms
Challenges…
As an Association Director, I would like to create a child care site so that I can schedule programs
As an Association Director, I would like to manage fee charts so that I can charge members different rates for different programs
As a Site Director, I would like to register a child for a program so that I can schedule programs
As a Site Director, I would like to look up a child so that I can view their registered programs
Smaller, more frequent releases
[Chart: testing percentage (0, 25, 50, 75, 100) across the iteration, from beginning of sprint to end of sprint]
Testing happens late
As a <type of user>, I would like <some functionality>, so that <I can accomplish some goal>.
Acceptance Criteria
Agile Testing at Daxko
http://context-driven-testing.com/
• The value of any practice depends on its context.
• There are good practices in context, but there are no best practices.
• People, working together, are the most important part of any project’s context.
• Projects unfold over time in ways that are often not predictable.
• The product is a solution. If the problem isn’t solved, the product doesn’t work.
• Good software testing is a challenging intellectual process.
• Only through judgment and skill, exercised cooperatively throughout the entire project, are we able to do the right things at the right times to effectively test our products.
The Seven Basic Principles of the Context-Driven School
Quality is everyone’s responsibility
Testing Principles
• The essential value of any test is its ability to provide information and reduce uncertainty.
• The value of any practice depends on its context.
• There are good practices in context, but there are no best practices. Effective testing requires contextual analysis and consideration. Factors such as risk, complexity, fragility, priority, value, and performance need to be considered.
• Testing activities should begin as early as possible. Be practical! Testers should collaborate with development to break stories into small, testable pieces during planning.
• Manual test sessions need to vary over time. The same tests repeated manually over time provide diminishing value. This is one of the biggest advantages of structured exploratory testing.
• Test documentation is a valuable tool to the extent that it helps you manage your testing project and find defects. Beyond that, it is a diversion of resources.
• Know the user. Know the problem we are trying to solve. Focus on delivering value to customers. Finding and fixing defects does not help if the system we build is unusable and does not fulfill the user's needs and expectations.
• To be effective and efficient, testing activities must include: a plan or charter for testing, test design, preparing test environments, executing tests, and evaluating results. These activities may overlap or take place concurrently.
• Defects should be communicated in a constructive way. Bad feelings between testers, designers, and developers can be avoided. Communicate findings in a neutral, fact-focused way without criticism.
• Inspect and adapt. Reflect on what went well and what can be improved with each iteration. Strive to continuously learn and improve.
• Automated testing is better than manual testing.
• Manual testing is better than automated testing.
As a registered user, I would like to be able to login so that I can access my files from anywhere

Acceptance Criteria:
• 3 unsuccessful login attempts result in a 10 minute lockout
• Account is protected by username (email) and password
• Authenticated users can see their account info (name, email, and password)
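Acceptance criteria like these can be turned into executable checks early in the sprint. A minimal sketch of the lockout criterion, using a hypothetical `Account` class (all names here are illustrative, not from a real codebase):

```ruby
# Hypothetical Account class modeling the lockout acceptance criterion.
class Account
  LOCKOUT_THRESHOLD = 3        # unsuccessful attempts allowed
  LOCKOUT_SECONDS   = 10 * 60  # 10 minute lockout

  def initialize(email, password)
    @email        = email
    @password     = password
    @failed       = 0
    @locked_until = nil
  end

  def locked?(now = Time.now)
    !@locked_until.nil? && now < @locked_until
  end

  # Returns true on success; counts failures and locks after the third.
  def authenticate(email, password, now = Time.now)
    return false if locked?(now)
    if email == @email && password == @password
      @failed = 0
      true
    else
      @failed += 1
      @locked_until = now + LOCKOUT_SECONDS if @failed >= LOCKOUT_THRESHOLD
      false
    end
  end
end
```

A test against this model exercises the criterion directly: three bad attempts lock the account, and even the correct password is rejected until the lockout expires.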
Iterative
Plan
Design
Develop
Test
Plan
Design
Develop
Test
Plan
Design
Develop
Test
Plan
Design
Develop
Test
2 Weeks 2 Weeks 2 Weeks 2 Weeks
Eliminate Waste
Exploratory Testing: Simultaneous learning, test design and test execution.
1) Charter - the goal or mission for the test session
2) Session - an uninterrupted period of time spent testing
3) Session Report - detailed record of the testing session and results
4) Debrief - a short discussion about the session report
Session Based Exploratory Testing
Charter Examples• Explore all workflows for adding members (in-house and online)
• Explore and analyze the Reciprocity app. Produce a Feature Map that outlines the capabilities and features.
• Explore the budget workflow from start to finish.
• Compare data in various custom reports from before and after an upgrade to Logi.
• Verify all maintenance item fixes in the Fundraising component for Operations 11.6
• Test all fields that allow data entry for SQL injection.
• Check UI against Daxko design standards.
• Test all 3rd party integration with Operations
• Explore the ability to change member info using the online account feature
• Validate the accuracy of the financial batch jobs for credit card processing
Session Report Example
• Require testers to identify test objectives and focus their testing efforts on fulfilling them.
• Encourage testers to modify existing tests and add new tests to meet those objectives as they test.
• Provide a useful way to manage and review exploratory testing.
• Focus attention on the testing that’s actually performed, not just on test case results.
• Give testers the flexibility to respond to changes and re-plan their testing quickly.
Benefits
Tools and Techniques
Feature Maps
Heuristics
http://www.satisfice.com/tools/htsm.pdf
Risk Heuristics:
Generic risks are risks that are universal to any system. These are my favorite generic risks:
• Complex: anything disproportionately large, intricate, or convoluted.
• New: anything that has no history in the product.
• Changed: anything that has been tampered with or "improved".
• Upstream Dependency: anything whose failure will cause cascading failure in the rest of the system.
• Downstream Dependency: anything that is especially sensitive to failures in the rest of the system.
• Critical: anything whose failure could cause substantial damage.
• Precise: anything that must meet its requirements exactly.
• Popular: anything that will be used a lot.
• Strategic: anything that has special importance to your business, such as a feature that sets you apart from the competition.
• Third-party: anything used in the product, but developed outside the project.
• Distributed: anything spread out in time or space, yet whose elements must work together.
• Buggy: anything known to have a lot of problems.
• Recent failure: anything with a recent history of failure.
SFDPO - San Francisco Depot
• Structure (what the product is): What files does it have? Do I know anything about how it was built? Is it one program or many? What physical material comes with it? Can I test it module by module?
• Function (what the product does): What are its functions? What kind of error handling does it do? What kind of user interface does it have? Does it do anything that is not visible to the user? How does it interface with the operating system?
• Data (what it processes): What kinds of input does it process? What does its output look like? What kinds of modes or states can it be in? Does it come packaged with preset data? Is any of its input sensitive to timing or sequencing?
• Platform (what it depends upon): What systems does it run on? Does the environment have to be configured in any special way? Does it depend on third-party components?
• Operations (how it will be used): Who will use it? Where and how will they use it? What will they use it for? Are there certain things that users are more likely to do? Is there user data we could get to help make the tests more realistic?
Big Visible Charts
Test Automation
Unit
Service
UI
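The base of the pyramid is unit tests: fast, isolated checks of a single behavior. A minimal sketch, using a hypothetical fee-chart lookup loosely echoing the fee-chart story earlier in the deck (class and method names are illustrative):

```ruby
# Hypothetical fee chart: members pay different rates for different programs.
class FeeChart
  def initialize
    @rates = {}  # keyed by [member_type, program]
  end

  def set_rate(member_type, program, rate)
    @rates[[member_type, program]] = rate
  end

  # Falls back to the :standard rate when no member-specific rate exists.
  def rate_for(member_type, program)
    @rates.fetch([member_type, program]) { @rates[[:standard, program]] }
  end
end
```

A unit test for this class needs no database, browser, or service layer, which is why tests at this level can run in milliseconds and make up the bulk of the suite.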
Functional Test Automation
• Whole team responsibility
• Page Object Model design pattern
• Watir & RSpec
• Breadth > Depth
• Run tests often
• Make broken tests visible to the team
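The Page Object Model pattern named above can be sketched as follows. In a real suite the page object would wrap a `Watir::Browser` and the assertions would live in RSpec; here a stub browser stands in so the pattern itself is runnable, and all class, method, and credential names are illustrative:

```ruby
# Stub standing in for Watir::Browser so this sketch runs without a browser.
class StubBrowser
  def initialize
    @fields    = {}
    @logged_in = false
  end

  def goto(url)
    @current_url = url
  end

  def set_field(name, value)
    @fields[name] = value
  end

  def click_login
    @logged_in = (@fields[:email] == '[email protected]' &&
                  @fields[:password] == 'secret')
  end

  def logged_in?
    @logged_in
  end
end

# One class per page: tests call intent-level methods (login_as),
# never raw element locators, so a UI change is fixed in one place.
class LoginPage
  def initialize(browser)
    @browser = browser
  end

  def open
    @browser.goto('/login')
    self
  end

  def login_as(email, password)
    @browser.set_field(:email, email)
    @browser.set_field(:password, password)
    @browser.click_login
    self
  end

  def logged_in?
    @browser.logged_in?
  end
end
```

Tests then read as user intent — `LoginPage.new(browser).open.login_as(email, password)` — which keeps breadth-over-depth suites cheap to maintain when the UI shifts.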
Compliance
Lessons Learned
[Chart: testing percentage (0, 25, 50, 75, 100) across the iteration, from beginning of sprint to end of sprint]
Test early, test often
Build quality in
Unit Testing
Code Reviews
Pairing
Automated Tests
Continuous Integration
Story Mapping
Collaborate on test ideas
Collaborate on problems
Get testers involved in planning and estimation
Jakuza
Be cross-functional!
Get out of the building!
Office Space (1999) ©
Outcomes > Output
Get everyone testing!
Question value vs. cost for all of your testing efforts.
Use prod-like environments for testing
Learn where defects cluster
Questions?
Software Testing
Agile Coaching
Jason Benton
Twitter: @jasonfbenton
Email: [email protected]