The Science of Software Testing – Experiments, Evolution & Emergence (2011)
TRANSCRIPT
The Science of Software Testing: Experiments, Evolution & Emergence via Value Flow
Neil Thompson, Thompson information Systems Consulting Ltd
© Thompson information Systems Consulting Ltd
v1.0
In the beginning testing was Methods (or Psychology?) – then came the Arts & Crafts movement(s)!
Brian Marick (1995) The Craft...:
• specifically for subsystem testing & object-oriented
Paul Jorgensen (1995) ...A Craftsman’s Approach:
• “Mathematics is a descriptive device that helps us better understand software to be tested”
Glenford Myers (1979) The Art...:
• but 1976 “unnatural, destructive process”... “problem in economics”
Contrary to popular belief, the first book devoted to software testing was 1973, ed. Bill Hetzel:
• (technically, a conference proceedings, Chapel Hill, North Carolina 1972)
• contains the first V-model?!
But NB Jerry Weinberg had written in 1961 & 1971 of testing as intriguing, a puzzle, a psychological problem
More about Gerald M. Weinberg
Acknowledgement to James Bach’s summary in “The Gift of Time” (2008 essays in honour of Jerry Weinberg on his 75th birthday)
• Ph.D. in Psychology (dissertation 1965, “Experiments in Problem Solving”)
• In 1961’s Computer Programming Fundamentals with Herbert Leeds (revised 1966 & 1970):
– “testing... is by far the most intriguing part of programming”
– “seldom a step-by-step procedure”... “normally must circle around”
• 1967 Natural Selection as applied to Computers & Programs (!)
• 1971 The Psychology of Computer Programming:
– “testing is first and foremost a psychological problem”
– “one way to guard against... stopping testing too soon... is to prepare the tests in advance of testing and, if possible, in advance of coding”
• General Systems Thinking (1975):
– the science of modelling and simplifying complex, open systems
• Systems Thinking (1992):
– “observe what’s happening and... understand the significance”
– feedback loops, locking on to patterns, controlling & changing
But what is software testing now?
Engineering? Graham Bath & Judy McKay (2008):
• but “engineering” isn’t in the glossary, or even the index!
• “What is a Test Analyst? Defining a role at the international level is not easy...”
Context-Driven? Yes, but:
• Context-Driven school and artistic? C-D and engineering???
• “Craft” is often used in Context-Driven discussions
• Plus science & passion!
Profession?
• EuroSTAR 2010, Isabel Evans and others
• Magazine(s)
• Also, more later about the “schools” of software testing!
Defining Quality is even more difficult!
Robert M. Pirsig:
• Zen and the Art of Motorcycle Maintenance – an Inquiry into Values (Bodley Head 1974; see also http://en.wikipedia.org/wiki/Zen_and_the_Art_of_Motorcycle_Maintenance )
• Lila – an Inquiry into Morals (Bantam 1991; see also http://en.wikipedia.org/wiki/Lila:_An_Inquiry_into_Morals )
“Quality is value to some person(s)”
“Summit” image from www.topnews.in – many climbers, each saying “Quality is value to me”
Yes, but...
Jerry Weinberg, Quality Software Management 1992
“The Science of Software Testing” isn’t a book yet, but...
• Boris Beizer (1984) experimental process, and (1995) falsifiability (the well-known Popper principle)
• Rick Craig & Stefan Jaskiel (2002) black-box science & art, “white-box” science
• Marnie Hutcheson (2003) software art, science & engineering
• Kaner, Bach & Pettichord (2002) explicit science:
– theory that software works, experiments to falsify
– testers behave empirically, think sceptically, recognise limitations of “knowledge”
– testing needs cognitive psychology, inference, conjecture & refutation
– all testing is based on models
Some bloggers have been more specific and detailed
Paul Carvalho (www.staqs.com) – testing skills include:
• learning / relearning scientific method (multiple sources!)
• knowledge of probability & statistics
Randy Rice (www.riceconsulting.com) – science rigour decreasing? but:
• testing analogies with observation, experiment, hypothesis, law etc
David Coutts (en.wikipedia.org/wiki/User:David_Coutts) – yes “context” but:
• both science & software testing have “right / wrong answers”, so...
• a test passes/fails based on its requirements. However, science & testing...
• go beyond falsificationism to economy (few theories explaining many observations), consistency, maths foundation & independent verification
• retesting is a different theory
BJ Rollison (blogs.msdn.com/b/imtesty/ – note this is on his old blog):
• too many hypotheses to test in a reasonable time
• debugging is scientific also
Cem Kaner (www.kaner.com) – software testing as a social science:
• software is for people, we should measure accordingly
• bad theories & models give blind spots & impede trade-offs
Apologies to anyone I’ve so far missed!
Part A: Experiments
Why should it be useful to treat testing as a science?
Note: this is starting with the “traditional” views of testing & science.
Testing: from the System Requirements / Specification / Design, derive a Test (run against the Product) and an Expected result. Compare: Test result = Expected? Y → Test “passes”; N → Test “fails”.
Science: from a Hypothesis (about part of the cosmos), derive an Experiment and an Expected result. Compare: Experiment result = Expected? Y → Hypothesis confirmed; N → Hypothesis rejected.
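A minimal sketch in Python (names and example hypothetical, not from the slide) of the parallel the diagram draws – a test compares an observed result with an expected one, just as an experiment compares an observation with a hypothesis’s prediction:

def run_test(product, test_input, expected):
    # Traditional view: the test "passes" if observed == expected.
    observed = product(test_input)
    return "test passes" if observed == expected else "test fails"

def run_experiment(cosmos_fragment, setup, prediction):
    # Traditional view: the hypothesis is "confirmed" if the observation
    # matches the prediction, otherwise rejected.
    observation = cosmos_fragment(setup)
    return ("hypothesis confirmed" if observation == prediction
            else "hypothesis rejected")

def square(x):
    return x * x                      # the "product" under test

print(run_test(square, 3, 9))         # test passes
print(run_experiment(square, 4, 16))  # hypothesis confirmed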
What is software testing? Definitions through the ages
Overall periods developed after Gelperin & Hetzel, “The Growth of Software Testing”, 1988 CACM 31 (6) as quoted on Wikipedia
PERIOD – EXEMPLAR – OBJECTIVES – SCOPE – APPROACH:
• Pre-1957 DEBUGGING (Psychology) – Weinberg (1961 & 71): Test + Debug; Programs; Think, Iterate
• 1957 DEMONSTRATION (Method) – Hetzel (1972): Show meets requirements; Verify, + maybe Prove, Validate, “Certify”; Programs, Sys, Acceptance
• 1976 DESTRUCTION (Art) – Myers (1976 & 79): Find bugs; Programs
• 1983 EVALUATION – Measure quality; + Walkthroughs, Reviews & Inspections
• 1984 PREVENTION (Craft?) – Beizer (1984): Find bugs, show meets requirements, + prevent bugs; + Integration
• 2000 SCHOOL(S) – Kaner et al (1988 & 99): Find bugs, in service of improving quality, for customer needs; Realistic, pragmatic, normal
• 2011 Science? – Experiment & Evolve? Neo-Holistic?
So, how would these “methods” look if we adopt Myers & Popper?
Note: again starting with the “traditional” views of testing & science.
Testing: from the System Requirements / Specification / Design, derive a Test (run against the Product) with the aim “find bugs”. Compare: Test result = “as aimed”? Y → Test is “successful”; N → Test is so far “unsuccessful”.
Science: from a Hypothesis (about part of the cosmos), derive an Experiment with the aim “falsify the hypothesis”. Compare: Experiment result = “as aimed”? Y → Falsification confirmed; N → Hypothesis not yet falsified.
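A minimal sketch (hypothetical example) of the Popperian reversal: the test does not try to confirm “the software works”; it hunts for a counterexample, and finding a bug counts as a “successful” test:

import random

def hypothesis_holds(sort_fn, data):
    # Hypothesis: sort_fn sorts correctly.
    return sort_fn(data) == sorted(data)

def try_to_falsify(sort_fn, attempts=1000):
    for _ in range(attempts):
        data = [random.randint(-99, 99) for _ in range(random.randint(0, 10))]
        if not hypothesis_holds(sort_fn, data):
            return f"falsification confirmed: counterexample {data}"
    return f"hypothesis not yet falsified (after {attempts} attempts)"

print(try_to_falsify(sorted))  # a correct sort survives; a buggy one would not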
A current hot topic: testing versus “just checking”
Checking: from the System Requirements / Specification / Design, derive a “Check” (run against the Product) and an Expected result. Compare: Check result = Expected? Y → Check “passes”; N → Check “fails”.
Testing: from the System Requirements / Specification / Design, other oracles and other quality-related criteria, consider ways the Product could fail, then ask: is the test result appropriate? Y → quality-related info; N → info on quality issues.
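One way (a sketch under my own naming, not the slide’s) to code the distinction: a “check” is a binary comparison against one expected result, while a “test” consults several oracles and yields quality-related information rather than just pass/fail:

def check(observed, expected):
    return "check passes" if observed == expected else "check fails"

def test(observed, oracles):
    # Each oracle returns None if it sees no problem, or a description
    # of a possible quality issue; the test reports every concern found.
    concerns = [msg for oracle in oracles if (msg := oracle(observed)) is not None]
    return concerns or ["no quality issues noticed (so far)"]

# Example oracles for a measured response time, in seconds:
def too_slow(t):
    return f"{t}s exceeds the 2s budget" if t > 2.0 else None

def suspiciously_fast(t):
    return f"{t}s looks like a stubbed call" if t < 0.001 else None

print(check(1.9, 1.9))                            # check passes
print(test(2.5, [too_slow, suspiciously_fast]))   # quality-related info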
Exploratory testing is more sophisticated than pre-designed, and does not demand a system specification
As the testing diagram above, but without requiring a System Requirements / Specification / Design: the test draws on Context, Heuristics, Epistemology, Abductive inference, Bug advocacy and Cognitive psychology, alongside other oracles and other quality-related criteria, to judge whether the test result is appropriate (Y → quality-related info; N → info on quality issues).
• Test Framing:
– context, mission, requirements, principles, oracles, risks
– models, value ideas, skills, heuristics, cost/value/time “issues”
– mechanisms, techniques, procedures, execution methods
– deliverables
Science should help to understand overlapping models, and to derive better test models
SOFTWARE (observed); DEV MODEL (expected); TEST MODEL (verified / validated); REAL WORLD (desired) – after SOFTWARE TESTING: A CRAFTSMAN’S APPROACH, Paul Jorgensen
• Examples of development model techniques:
– entity relationships
– state transitions (see the sketch below)
• Examples of test model techniques – the above plus:
– equivalence partitioning, domain testing & boundaries
– transaction, control & data flows
– entity life history (CRUD)
– classification trees; decision tables
– timing
– opportunity for more?
• In “checking”, the test model tries to cover the development model
• Testing (sapient) should expand its model beyond that, as far into actual/potential real behaviour as stakeholders want and can pay for
• Development models & test models each cover subsets of actual & potential reality
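As a sketch of one such technique (hypothetical example): a test model expressed as a state-transition table, from which transition-coverage test objectives can be derived mechanically:

MODEL = {  # (state, event) -> next state
    ("logged_out", "login_ok"):  "logged_in",
    ("logged_out", "login_bad"): "logged_out",
    ("logged_in",  "logout"):    "logged_out",
    ("logged_in",  "timeout"):   "logged_out",
}

def transition_tests(model):
    # One test objective per transition gives 100% 0-switch coverage.
    return [f"in {s}, apply {e}, expect {nxt}" for (s, e), nxt in model.items()]

for objective in transition_tests(MODEL):
    print(objective)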
Heuristics, patterns & techniques – and scientific analogues?
Heuristics †:
• art of discovery in logic
• education method in which the student discovers for self
• principle used in making decisions when all possibilities cannot be fully explored!
Patterns *:
• catchy title
• description of the problem addressed
• solution to the problem
• context in which the pattern applies
• some examples
Techniques †:
• method of performance
• manipulation
• mechanical part of an artistic performance!
→ Tests
Scientific analogues (?):
Conjectures w – proposition that is unproven but is thought to be true and has not been disproven
Hypotheses w – testable statement based on accepted grounds
Theories w – proposed explanation of empirical phenomena, made in a way consistent with scientific method and satisfactorily tested or proven
→ Experiments + Laws!
† Definitions based on Chambers 1981
* Definitions based on Software Testing Retreat #2, 2003
w Definitions based on Wikipedia 2011
But wait! Is there one, agreed, “scientific method”?
* Images from various websites, top-ranking of a Google image search, May 2011
• No! These are the first dozen I found (unscientifically*)
• Only two are near-identical, so here are eleven variants, all with significant differences! (extent, structure & content)
• The philosophy of science has evolved (see later slides)
So... a post-Popper view of theories: how science could help coverage
• Coverage is multi-dimensional – so General Systems Thinking helps analyse dimensions, eg choose 2-D projections
• And models need to map this multi-dimensional space – not quite a hierarchy:
– Development Model, of which...
– Test Model should be a superset, of which...
– Real World is an unattainable superset – but Test should get as close as appropriate
• Hybridise & innovate new techniques based on heuristics & patterns – and taking inspiration from conjectures, hypotheses, theories...
• Remember the multiple ways software can fail (sketch below). Inputs to the system under test: intended inputs; program state; system state; configuration and system resources; from other cooperating processes, clients or servers. Outputs: monitored outputs; program state, including uninspected outputs; system state; impacts on connected devices / system resources; to other cooperating processes, clients or servers.
Sources: Neil Thompson EuroSTAR 1993; Doug Hoffman via www.testingeducation.org; + see various Bug Taxonomies
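A sketch (my illustration of Hoffman’s point, with hypothetical names): observe not only the intended output but also program state, since bugs can hide in uninspected outputs:

class Cache:
    def __init__(self):
        self.entries = {}              # program state

    def get(self, key, compute):
        if key not in self.entries:
            self.entries[key] = compute(key)
        return self.entries[key]

def test_cache_observes_more_than_the_return_value():
    cache = Cache()
    value = cache.get("k", lambda k: k.upper())
    assert value == "K"                  # monitored output
    assert cache.entries == {"k": "K"}   # program state: an uninspected output
    # System state, configuration and impacts on other processes would be
    # checked similarly where the risk justifies it.

test_cache_observes_more_than_the_return_value()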
Part B: Evolution & Value Flow ScoreCards
Traditional Darwinian evolution (ie biological)
Image from www.qwickstep.com
• Nearly everyone is familiar with this, but...
...arguably Darwinian evolutionary principles apply beyond biology
Image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF
(Ouroboros: Greek Οὐροβόρος or οὐρηβόρος, from οὐροβόρος ὄφις “tail-devouring snake”)
• There is a cascade (& approximate symmetry!):
– Biology depends on Organic Chemistry
– Organic Chemistry depends on the special properties of Carbon
– Chemical elements in the upper part of the periodic table come from supernovae
– Elements in the lower part of the periodic table come from ordinary stars
– Elements are formed from protons, neutrons, electrons (Physics)
– ...quarks... string theory?? etc
• It just so happens that humans are about equidistant in scale from the smallest things we can measure to the largest
• Humans have evolved to use tools, build societies, read, invent computers... and other inventions by humans, eg Social Sciences
• So, it is possible to think of pan-scientific evolution as a flow of value
• Now, back to software lifecycles...
Sources: Daniel Dennett “Darwin’s Dangerous Idea”; “cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc)
The software lifecycle as a flow of value
• Working systems have value; documents in themselves do not; so the quickest route is:
Stated requirements (RAW MATERIALS) → Programming → Demonstrations & acceptance tests (FINISHED PRODUCT)
• SDLCs are necessary, but introduce impediments to value flow – misunderstandings, disagreements, meetings / escalation to agree... documents are like inventory/stock, or “waste”:
Implicit requirements → Documented requirements → intermediate documentation! → Programming → Acceptance tests
To improve value flow: agile methods following principles of lean manufacturing
LEVELS OF DOCUMENTATION, pushed by specifiers: Requirements → + Func Spec → + Technical Design → + Unit / Component specifications → + Test Specifications
FLOW OF FULLY-WORKING SOFTWARE, pulled by customer demand: Unit / Component-tested → Integrated → System-tested → Accepted → WORKING SOFTWARE
But any lifecycle should be improvable by considering the value flow through it
• The context influences which deliverables are mandatory / optional / not wanted
• Use reviews to find defects & other difficulties fast
• Do Test Analysis before Test Design (again, this finds defects early, before a large pile of detailed test scripts has been written)
• Even if pre-designed testing is wanted by stakeholders, do some exploratory testing also
• “Agile Documentation”*:
– use tables & diagrams
– consider wikis etc
– care with structure
* These points based on a book of that name, by Andreas Rüping
Testing has a hierarchy, eg...
Levels of specification → risks & testing responsibilities → test levels:
• Requirements (+ Business processes) → Users may be unhappy (so generate confidence) → Acceptance Testing
• Functional & NF specifications → System may contain bugs not found by lower levels (so seek bugs of type z) → System Testing
• Technical spec, hi-level design → Units may not interact properly (so seek bugs of type y) → Integration Testing
• Detailed designs → Individual units may malfunction (so seek bugs of type x) → Unit Testing
• ...and at each level, the spec may not be adhered to
Levels of stakeholders (and of system & service integration): Business, Users, Business Analysts, Acceptance Testers; Architects, “independent” testers; Designers, integration testers; Developers, unit testers
Remember: not only for waterfall or V-model SDLCs; rather, iterative / incremental go down & up through layers of stakeholders, specifications & system integrations
...Quality and Science can also be seen as hierarchies, which testing can parallel
Levels of system & service integration; levels of stakeholders (+ Business processes):
• Business, Users, Business Analysts, Acceptance Testers
• Architects, “independent” testers
• Designers, integration testers
• Developers, unit testers
Layers of quality (static values → dynamic values?):
• Inorganic
• Biological
• Social
• Intellectual
Layers of science (ordered by scale):
• Physics
• Chemistry: Inorganic
• Chemistry: Organic
• Biology (& systems thinking)
• Social sciences; Philosophy
Value flows down through, then up through, these layers
Levels of system & service integration; levels of stakeholders (+ Business processes):
• Business, Users, Business Analysts, Acceptance Testers
• Architects, “independent” testers
• Designers, integration testers
• Developers, unit testers
Understanding of problem flows down (desired quality); understanding of solution flows back up (tested, “known” quality).
So, test appropriately to your scale
Sciences by scale – from Physics (quantum end), through Chemistry: Inorganic, Organic, Biology and systems thinking, to Social sciences and Physics (gravity end) – mapped against understanding of solution:
• Unit Testing ↔ Physics (quantum end)
• Integration Testing ↔ Chemistry: Inorganic
• System Testing ↔ Organic / Biology
• Acceptance Testing ↔ Social sciences
How different sciences can inspire different levels of testing
• First, Unit/Component Testing (Physics):
– think quanta (smallest things you can do to the software), equivalence partitions, data values (see the sketch after this list)
• For Integration Testing (Chemistry):
– think about interactions, what reactions should be, symmetry, loops, valencies, performance of interfaces
• For System Testing (Biology):
– fitness for purpose, entity life histories, ecosystems, palaeontology (historic bugs)
• For Acceptance Testing:
– think “Social Sciences”: what are the contractual obligations?
• For each test level, consider and tune the value which that level is adding...
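For the Physics analogy, a minimal sketch (hypothetical pricing rules): partition the input domain into equivalence classes and probe each partition plus its boundaries:

def ticket_price(age):
    if age < 0:
        raise ValueError("age cannot be negative")
    if age < 16:
        return 5.0    # child partition
    if age < 65:
        return 10.0   # adult partition
    return 7.0        # senior partition

# One representative per partition, plus the boundary values:
cases = {-1: ValueError, 0: 5.0, 15: 5.0, 16: 10.0, 64: 10.0, 65: 7.0, 120: 7.0}
for age, expected in cases.items():
    try:
        assert ticket_price(age) == expected, (age, expected)
    except ValueError:
        assert expected is ValueError, age
print("all partition & boundary cases behaved as expected")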
Value Flow ScoreCards
• Based on Kaplan & Norton’s Balanced Business Scorecard and other “quality” concepts
• Value chain ≈ supply chain:
– in the IS SDLC, each participant should try to ‘manage their supplier’
– for example, development supplies testing (in traditional lifecycles, at least!)
– we add the supplier viewpoint to the other 5, giving a 6th view of quality
• So, each step in the value chain can manage its inputs, outputs and other stakeholders
The six views:
• Financial: Efficiency; Productivity; On-time, in budget; – Cost of quality
• Supplier: Upward management; Information gathering
• Customer: VALIDATION; Risks; Benefits; Acceptance; Satisfaction; – Complaints
• Improvement: eg TPI/TMM...; Predictability; Learning; Innovation
• Process: Compliance eg ISO9000; Repeatability; – Mistakes
• Product: VERIFICATION; Risks; Test coverage; – Faults; – Failures
(...these have been presented previously, so these slides are for background and will be skimmed through quickly in the presentation)
Value Flow ScoreCards can be cascaded
Pieces of a jig-saw. In addition to “measuring” quality information within the SDLC:
• can use to align SDLC principles with higher-level principles from the organisation
Cascade of roles along the value chain:
• Business Analysts; Requirements Reviewers
• Architects; Func Spec Reviewers
• Designers; Tech Design Reviewers
• Developers – via pair programming?
• Acceptance Test Analysts → AT Designers & Scripters → Acceptance Testers
• Sys Test Analysts → ST Designers & Scripters → Sys Testers
• Int Test Analysts → IT Designers, Scripters & Executers
• Component Test Analysts, Designers & Executers?
(...but you don’t necessarily need all of these!)
The Value Flow ScoreCard in action
• Yes – it’s just a table (Financial, Customer, Supplier, Improvement, Process, Product)! ...Into which we can put useful things...
• We start with repositionable paper notes, then can put them in spreadsheet(s)
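Since it really is just a table, a sketch of the data structure (my naming): six views of quality crossed with Objectives / Measures / Targets / Initiatives, one such table per step in the value chain:

VIEWS = ["Financial", "Customer", "Product", "Process", "Improvement", "Supplier"]
ROWS = ["Objectives", "Measures", "Targets", "Initiatives"]

def empty_scorecard():
    return {view: {row: [] for row in ROWS} for view in VIEWS}

card = empty_scorecard()
card["Product"]["Measures"].append("test coverage of prioritised risks")
card["Financial"]["Targets"].append("cost of quality within budget")
print(card["Product"])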
Use #1, Test Policy: All views included? Why-What-How (G-Q-M) thought through?
The six views (Financial: Efficiency, Productivity, On-time in budget, – Cost of quality; Supplier: Upward management, Information gathering; Customer: VALIDATION, Risks, Benefits, Acceptance, Satisfaction, – Complaints; Improvement & Infrastructure: eg TPI/TMM, Predictability, Learning, Innovation; Process: Compliance eg ISO9000, Repeatability, – Mistakes; Product: VERIFICATION, Risks, Test coverage, – Faults, – Failures), aligned with the Organisation’s Goals & Objectives and ScoreCards, each view carrying Objectives / Measures / Targets / Initiatives – a GOAL-QUESTION-METRIC (Why-What-How) cascade.
Example entries (summarised from an example in TestGrip by Marselis, van Royen, Schotanus & Pinkster, CMG 2007):
• IS actively supports employees; constant improvement of dev & test processes
• Products to satisfy specified requirements; products to be fit for purpose
• Both static & dynamic; planning, preparation & evaluation; software & related work products; (comprehensive scope)
• Detect defects early; defect source analysis
• Testing prioritised & managed, by product risks & importance of requirements
• Independence increases with test type; Proj Mgr is responsible for quality; Bus Mgt is responsible for enforcing the Test Policy
• TMM levels: level 2 at least now, level 3 within 2 years
• Measures: Defect Detection Percentage; frequency of process adjustments heeding metrics (twice per year)
• Use TestFrame for test analysis & execution; automate regression tests as much as possible
• Staff must be certified (ISTQB): Advisors Expert, Managers Advanced, Analysts Foundation
Use #2, test coverage: Test conditions as measures & targets (not test cases!)
From the LEVEL TEST PLAN and TEST BASES: Test Items (level of integration); Features to be tested; Test Basis references; Product risks; Product benefits; Constraints.
Areas we could cover → Test Conditions we intend to cover (agreed with stakeholders) → Objectives for Test Cases (to test design & execution; to next level of sys design).
The six views (Financial; Supplier, including info from other levels of the Treble-V model; Customer; Improvement & Infrastructure; Process; Product) are each crossed with Objectives / Measures / Targets / Initiatives; Product Risks appear throughout.
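A sketch of the idea (hypothetical conditions): coverage is measured and targeted against agreed test conditions, not against a count of test cases:

conditions = {
    "login rejects a bad password": True,
    "login locks the account after 3 failures": False,  # not yet covered
    "password reset emails a one-time token": True,
}
covered = sum(conditions.values())
print(f"condition coverage: {covered}/{len(conditions)} "
      f"({100 * covered / len(conditions):.0f}%)")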
Use #3: process improvement, eg via Goldratt’s Theory of Constraints: “Swimlane” symptoms, causes & proposed remedies
Note: this is similar to Kaplan & Norton’s “Strategy Maps” (Harvard Business School Press 2004). When cause-effect branches form feedback loops, this becomes part of Systems Thinking.
Swimlanes: the six views (Financial; Supplier: Upward management, Information gathering; Customer; Improvement & Infrastructure; Process; Product), each with Objectives / Measures / Targets / Initiatives.
Flow across the swimlanes: CURRENT ILLS → CONFLICT RESOLUTION → PRE-REQUISITES → TRANSITION → FUTURE REMEDIES
Use #4a: context-driven testing, eg Goldratt conflict resolution on process areas with choices
The six views with Objectives / Measures / Targets / Initiatives again provide the frame.
From Context / Circumstances (about 30 categories), eg:
• Resources: money (skills, environments), time
• Application characteristics; Technology; Technical risks; Business risks
• Process constraints, eg quality mgmt, configuration mgmt
• Legal: regulation, standards; Moral: safety
• Sector; Culture; Job type & size
CHOICE AREAS, eg test specifications, handover & acceptance criteria: each ranges from informal to formal, and we decide where in the range (specific aspects).
CURRENT SITUATION → CONFLICT RESOLUTION → DESIRED SITUATION: Appropriate Testing in this context / circumstances
Use #4b: lifecycle methodology selection / design, Value Flow ScoreCard as unifying framework
The six views with Objectives / Measures / Targets / Initiatives; Risks run across all of them.
Conflicts & balances: “one hand” versus “the other”, weighed via Game Theory... (any other approaches?)
BALANCE → “METHODOLOGY PER PROJECT”: an appropriate lifecycle methodology in this context / circumstances
A further use, #4c??
• Could Value Flow ScoreCard ideas help discuss & bridge the (arguably) growing divide between traditional & agile software practitioners, eg:
– waterfall, V-model, W-model, iterative, incremental...?
– “schools” of software testing, eg Analytic, Standard, Quality, Context-Driven, Agile... Factory, Oblivious...?
– scripted (or at least pre-designed) & exploratory testing?
• To attempt this, let’s extend the evolution & value flow concepts to...
Part C: Emergence & Value Flow Science
Evolution as Sophistication plotted against Diversity
Source: Daniel Dennett “Darwin’s Dangerous Idea”
(Axes: Diversity vs Sophistication)
Punctuated equilibria
“Punctuated equilibria” idea originated by Niles Eldredge & Stephen Jay Gould. Images from www.wikipedia.org
• “Gradual” Darwinism: sophistication & diversity increase smoothly
• Punctuated equilibria: long periods of equilibrium, interrupted by:
– “explosion” in species, eg Cambrian
– spread into a new niche, eg Mammals
– mass extinction, eg Dinosaurs
(Axes: Sophistication vs Diversity; also number of species)
Evolution of Science overall
(Diagram: Physics → Chemistry: Inorganic → Chemistry: Organic → Biology → Social sciences, growing in sophistication & diversity)
Not only Evolution, but Emergence: progress along order-chaos edge
Physics → Chemistry → Biology → Social sciences, each progressing along the edge between ORDER and CHAOS
• For best innovation & progress, need neither too much order nor too much chaos
• “Adjacent Possible”
Extrapolation from various sources, esp. Stuart Kauffman, “The Origins of Order”, “Investigations”
OK, what’s this got to do with software testing?
Social sciences → Tools → Language → Books → Computers
• We have an important and difficult job to do here!
...and computers are evolving, in both sophistication and diversity, faster than software testing?
Computers: 1GL → 2GL → 3GL → 4GL → Object Orientation → Internet, Mobile devices → Artificial Intelligence?!
• Are we ready to test AI??
The Philosophy of Science is also evolving!
Classical → Logical Positivism, Empiricism → Popper, Kuhn → Lakatos, Laudan → Bayesianism, Grounded Theory...
• So, perhaps the Philosophy of Software Testing could learn from this; perhaps it’s also evolving?...
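As an illustration of the Bayesian turn applied to testing (my example, not from the slide): each passing test updates our degree of belief that the software works, rather than “proving” it:

prior = 0.5                # P(software works) before any testing
p_pass_if_works = 0.99     # a good test rarely fails working software
p_pass_if_broken = 0.60    # ...but may also pass broken software

for _ in range(5):         # five independent passing tests
    evidence = prior * p_pass_if_works + (1 - prior) * p_pass_if_broken
    prior = prior * p_pass_if_works / evidence

print(f"P(works | 5 passes) = {prior:.2f}")  # about 0.92, still not certainty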
Part D: Platforms & Cranes (Genes to Memes)
Biological reproduction & evolution is controlled by Genes
Images from www.qwickstep.com and schools.wikipedia.org
GENE → Replication & Selection, with Mutation → increasing Sophistication & Diversity
Memes as an extension of the Genes concept
Biological evolution: GENES → Replication & Selection, with Mutation
Mental, social & cultural evolution: MEMES (Ideas, Beliefs, Practices, Symbols, Gestures, Rituals, Speech, Writing, “other imitable phenomena”) → Replication & Selection, with Mutation – via Platforms and Cranes
Theme developed from Daniel Dennett “Darwin’s Dangerous Idea”. Image from www.salon.com; taxonomy from www.wikipedia.org
Some candidates for Memes in software testing
Always consider: effectiveness & efficiency; risk management & quality management; insurance & assurance
• V-model: what testing against; W-model: quality management
• Risks: list & evaluate; prioritise tests based on risks; tailor risks & priorities etc to factors
• Define & detect errors (UT, IT, ST); give confidence (AT)
• Refine test specifications progressively: plan based on priorities & constraints; design flexible tests to fit; allow appropriate script format(s); use synthetic + lifelike data
• Allow & assess for coverage changes; document execution & management procedures
• Distinguish problems from change requests; prioritise urgency & importance
• Distinguish retesting from regression testing
• Use handover & acceptance criteria
• Define & measure test coverage; measure progress & problem significance
• Be pragmatic over quality targets; quantify residual risks & confidence
• Decide process targets & improve over time; define & use metrics; assess where errors were originally made
• Define & agree roles & responsibilities; use appropriate skills mix; use independent system & acceptance testers
• Use appropriate techniques & patterns; use appropriate tools; optimise efficiency
• Plan early, then rehearse-run, acceptance tests
Source: Neil Thompson, STAREast 2003
Four, five, six... schools of software testing? (Updated version, March 2007. Copyright © 2003-2007 Bret Pettichord; permission to reproduce granted with attribution)
Annotations by Neil Thompson after the Bret Pettichord ppt (blue text), the list in Cem Kaner’s blog December 2006 (black text), and other sources! (red text)
• Analytic: emphasis on analytical methods for assessing the quality of the software, including improvement of testability by improved precision of specifications and many types of modelling
• Standard (Control): emphasis on standards and processes that enforce or rely heavily on standards
• Factory: emphasis on reduction of testing tasks to routines that can be automated or delegated to cheap labour
• Quality: emphasis on policing developers and acting as “gatekeeper”
• Context-Driven: emphasis on adapting to the circumstances under which the product is developed and used
• Agile (Test-Driven): emphasis on code-focused testing by programmers
• Plus: Oblivious / Groucho? Neo-Holistic? (like C-D) Axiomatic?
Learning from the “Schools” situation
• Think in terms of memes: evolution and transmission
• Separate “what people have been taught” from:
– what their bosses say they want (or should want??)
– what their personalities push them towards
• Is school behaviour volatile? (context-driven!?)
• Things we could adapt from other disciplines:
– see various conference talks, eg oil exploration
– but what about insurance, and their actuaries?!
• Preparing for the future, eg testing Artificial Intelligence:
– what happened to Genetic Algorithms?
– what’s the latest Bayesian application?
Conclusions & Summary
Conclusions
• The Ouroboros looks better with software testing at the top! Value flows upwards:
– (right) from the Big Bang to planet Earth and human habitations; and
– (left) from subatomic particles to humans
– (the “origin”, Quantum Gravity?, yet to be agreed)
• Value Flow ScoreCards are useful, but this talk is more about applying the layered principles of science to IT quality
• Humans now evolve in terms of technology-aided Memes, and we can use that to understand & develop the future of software testing
Recap of messages for Testing & Quality
• When strategising, planning and performing testing:
– test according to your scale, using analogies from different sciences to help “frame” your tests
– use Value Flow ScoreCards to understand and balance your stakeholders
– design experiments to seek different bug types at different levels (don’t just “falsify” the opposite experiment)
• When considering your position & future in the testing industry:
– it’s not just teaching but also psyche, and what bosses want
– “stand on the shoulders of giants”, ie make use of the platforms which give huge leverage (eg exploratory automation)
Some wider advice
• When reading new material, use the Adjacent Possible – consider reading two authors at once (or maybe three):
– either different representations of similar opinions, or
– apparently opposing opinions
• (I’ve been quoting pairs of books on Twitter, “Things To Read Together”)
References
• Already quoted slide-by-slide – for a summary of main sources, see the associated article in “The Tester”
• Stop Press (since preparing this talk) – see also related views:
– The Software Testing Timeline, www.testingreferences.com
– Stuart Reid, “Lines of Innovation in Software Testing”, 2011 paper
– Jurgen Appelo, “Management 3.0” (2011 book and website, about agile leadership practices – makes significant use of complexity theory, eg Kauffman)
• Thanks for listening!
• Questions?