TRANSCRIPT
HELSINKI UNIVERSITY OF TECHNOLOGY
T-76.5613 Software testing and quality assurance
22.10.2007
Reviews and Inspections
Mika Mäntylä, SoberIT
Static techniques
Individual techniques: desk-checking, proof-reading, …
Group techniques: reviews
Tool-based static analysis: metrics tools, lints
Static techniques do not execute code
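As a small illustration of tool-based static analysis, the sketch below flags function parameters that are never used (a defect class mentioned later in this lecture). The sample function and the check itself are illustrative assumptions; the point is that the code is analyzed as text and never executed.

```python
import ast

# Hypothetical sample to analyze: 'offset' is accepted but never used.
SOURCE = """
def scale(value, factor, offset):
    return value * factor
"""

def unused_params(source):
    """Return (function, parameter) pairs for parameters never read."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            params = {a.arg for a in node.args.args}
            used = {n.id for n in ast.walk(node) if isinstance(n, ast.Name)}
            findings.extend((node.name, p) for p in sorted(params - used))
    return findings

print(unused_params(SOURCE))  # [('scale', 'offset')]
```

The same idea, scaled up, is what commercial lints and metrics tools do: they read the source and report suspicious patterns without running the program.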
Contents
Introduction and benefits of reviews
Types of reviews
Dimensions of inspection
Cost, problems, and alternatives
Summary
Review
A meeting or process at which an artifact is presented to peers, the user, the customer, or other interested parties for comments and approval
Goals of a review may include
- Identifying defects and improving quality
- Educating and knowledge transfer
- Presenting and discussing alternative solutions
Reviews are an effective way to do quality assurance: they typically catch more than 50 % of a product's defects (figures above 90 % have been reported), measured as defects detected in inspections / total detected defects
Benefits of reviews
- Can be done as soon as the artifact is written
- Easy to apply to parts of the system or incomplete components
- Can consider quality attributes such as reusability, security, etc.
- Distribution of knowledge
- Each defect can be considered in isolation
- Defect location time: zero
- Increased awareness of quality issues
- A tool to improve the entire development process: analysis of review data can be used to improve the software development process
Defect removal efficiency
Activity Efficiency range
Informal design reviews 25 ... 40 %
Formal design inspections 45 ... 65 %
Informal code reviews 20 ... 35 %
Formal code inspections 45 ... 70 %
Unit test 15 ... 50 %
New function test 20 ... 35 %
Regression test 15 ... 30 %
Integration test 25 ... 40 %
System test 25 ... 55 %
Low-volume beta test (<10 clients) 25 ... 40 %
High-volume beta test (>100 clients) 60 ... 85 %
(Jones C. 1996)
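If, as a simplifying assumption, each activity is taken to remove a fixed fraction of the defects still remaining, the cumulative efficiency of a sequence of activities can be sketched as follows (the percentages below are illustrative mid-range values from the table, not data from Jones):

```python
def cumulative_efficiency(efficiencies):
    """1 - product of (1 - e_i): fraction of all defects removed by the
    whole sequence, assuming each activity acts independently on what
    remains after the previous one."""
    remaining = 1.0
    for e in efficiencies:
        remaining *= 1.0 - e
    return 1.0 - remaining

# Formal code inspection (~55 %), unit test (~30 %), system test (~40 %)
print(round(cumulative_efficiency([0.55, 0.30, 0.40]), 3))  # 0.811
```

This illustrates why combining reviews with testing pays off: no single activity reaches high removal efficiency on its own, but a chain of moderately effective activities can.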
Cost of fixing defects
Relative cost to fix a fault, by the phase in which it is found (Boehm, Software Engineering Economics, 1981):
- Requirements: 1
- Design: 5
- Coding: 10
- Development testing: 30
- Acceptance testing: 50
- Operation: 82
Reviews cover the early phases (requirements through coding); dynamic testing covers the later ones.
Reviews vs. dynamic testing
Both
- Aim at evaluating and improving the quality before the software reaches the customers
- The purpose is to find and then fix faults and other potential problems
- Can be applied early in the SW development process
Reviews
- Can be applied earlier than dynamic testing
- Can only examine static documents and models; testing can evaluate the product working in its operational environment
Reviews and testing are not mutually exclusive. There is a trade-off between fixing and preventing: the cost of reviewing and testing vs. the cost of fixing the defects when they are found.
Inspections, functional, and structural testing
Comparison of Inspections, functional, and structural testing
No significant difference between defect detection effectiveness (Hetzel 1976, Myers 1978, Basili 1987, and others)
Each technique finds different defects
- Inspection: e.g. an unused parameter in a small function that occasionally produces an incorrect result
- Test: e.g. a wrong parameter value among several parameters, with textual distance between the call and the function
NOTE: The technique which is applied first always seems more effective
(Figure: Venn diagram of defects found by reviews vs. defects found by testing)
Defect removal efficiency and four factors
Factors: formal design inspection, formal code inspection, formal quality assurance, formal testing. Each row applies a different combination of the four factors; rows with the same count differ in which factors are applied.

Factors applied | Worst | Median | Best
1: none | 30 % | 40 % | 50 %
2: one | 32 % | 45 % | 55 %
3: one | 37 % | 53 % | 60 %
4: one | 43 % | 57 % | 66 %
5: one | 45 % | 60 % | 68 %
6: two | 50 % | 65 % | 75 %
7: two | 53 % | 68 % | 78 %
8: two | 55 % | 70 % | 80 %
9: two | 60 % | 75 % | 85 %
10: two | 65 % | 80 % | 87 %
11: two | 70 % | 85 % | 90 %
12: three | 75 % | 87 % | 93 %
13: three | 77 % | 90 % | 95 %
14: three | 83 % | 95 % | 97 %
15: three | 85 % | 97 % | 99 %
16: all four | 95 % | 99 % | 99.99 %
(Jones C. 1996)
What can be reviewed or inspected?
Software requirements specifications
Software design descriptions
Source code
Test cases
Software test documentation
Software user documentation
System build procedures
Installation procedures
Release notes
Anything that is written down…
Contents
Introduction and benefits of reviews
Types of reviews
Dimensions of inspection
Cost, problems, and alternatives
Summary
Types of reviews
Technical reviews
- Team review
- Formal inspection
- Walkthrough
- Pair review & pass-around
Others
- Audits
- Management reviews
(Wiegers. Peer reviews in Software. 2002)
Team Review
Phases
- Planning: schedule meeting(s), invite participants, assemble material
- Individual preparation: reviewing the artifact, logging all potential defects or issues
- Meeting: going through the artifact together, logging all individual findings, identifying new issues
- Rework: deciding how to act on each issue, implementing the fixes (checking the fixes, re-review)
Goals
- Reveal defects
- Transfer knowledge
- Discuss solutions
Team reviews are less rigorous than formal inspections
Formal Inspections
More formal and rigorous than team reviews
- Defined roles and trained participants: author, reader, moderator, recorder, inspectors
- Focused on revealing defects and issues: no discussion of solutions, alternatives, education, etc.
- Formal entry and exit criteria: 'loser' inspections are not started
- Carefully measured and tracked: benefits and required effort continuously visible
- Process improvement proposals are an explicit deliverable: root cause analysis of defects
Reviews vs. Inspections

Reviews | Inspections
The document to be reviewed is given out in advance | Product compared against sources and standards
Typically dozens of pages to review | Chunk or sample of the document
Instructions are "please review this" | Training and formal roles
Some people have time to look through it and make comments | Entry criteria to the meeting; may not be worth holding
The meeting often lasts for hours | Fixed duration, max 2 hours, often much shorter
"I don't like this" | Rule violations; objective, not subjective
Much discussion, some about technical approaches, some about trivia | No discussion, highly focused, anti-trivia
Don't know if it was worthwhile, but we keep doing it | Data gathered and used to analyze the entire SW development process; only done if value is proven (continually)
Walkthrough
Author presents a work product to peers; no individual preparation
The presentation of the work product may include additional approaches:
- Program execution and debugging
- Illustrations of how the program works under certain scenarios
Goals
- Better for purposes other than defect detection
- Discuss ideas and brainstorm alternative approaches
- Educate a larger group of people
Pair review & Pass-around
Also known as buddy-check and peer deskcheck
A single individual checks a work product and returns with a list of defects (at Microsoft, senior developers review the code of juniors)
Differences to the previously mentioned techniques:
- Interaction is missing: less education, less discussion
- Single reviewer
Pass-around: multiple, concurrent buddy-checks; often used for books and scientific articles
Technical reviews - Summary
- Inspection: the most rigorous, effective, and expensive; focused on defect detection and measurement
- Team review: "inspection lite"; also allows education and discussion of better alternatives
- Walkthrough: the author has a central role in presenting; no preparation
- Pair review: single-person review
- Pass-around: multiple concurrent pair reviews
Management Reviews
Ensure project progress, e.g. the iteration demo in Scrum
Ensure proper resource allocation
Support decisions:
- corrective actions
- changes in the allocation of resources
- changes to scope
Produces a report for management
Audits
Independent evaluation to check the conformance of software products and processes to prescribed standards, procedures, etc. (e.g. ISO 9001, CMM)
Attended by a lead auditor, a team of auditors, the initiator, a recorder, and the audited organization
The lead auditor prepares the audit report and passes it on to the initiator, who in turn passes it on to the audited organization
Contents
Introduction and benefits of reviews
Types of reviews
Dimensions of inspection
Cost, problems, and alternatives
Summary
Dimensions of software inspection
Technical dimension of software inspection:
- Process: planning, overview, defect detection, defect correction, follow-up
- Roles: organizer, moderator, inspector, author, recorder, presenter, collector
- Products: requirements, design, code, test cases
- Reading techniques: ad hoc, checklist, defect-based, function-point, perspective-based, reading by stepwise abstraction
Laitenberger, O. & J.-M. DeBaud, An encompassing life cycle centric survey of software inspection, The Journal of Systems and Software, 50 (2000).
Inspection process (Gilb & Graham)
(Gilb & Graham. 1993)
(Figure: the inspection process. A product document passes an entry test against entry criteria, then goes through planning, kickoff, individual checking, the logging meeting, and edit and follow-up, and exits against exit criteria as an inspected document. Source documents, rules, checklists, and procedures feed the checking; process improvements and change requests to source documents are produced along the way.)
Inspection process: Initiation
Entry: making sure 'loser' inspections don't start
- The inspection process begins with a request for inspection by the author(s) to the inspection leader
- The leader checks the product and its source documents against the relevant entry criteria, so no resources are wasted inspecting poor or incomplete documents
Planning: determining the present inspection's objectives and tactics
- How many cycles will be necessary
- Who will participate
- Other practical details
Kickoff meeting: training and motivating the team
- Ensure checkers know what is expected of them
- Distribute documents, assign roles
- Training in inspection procedures
- Setting targets for inspection productivity
Inspection process: Checking
Individual checking: the search for potential defects
- Checkers work alone on the product document
- Using source documents, rules, procedures, and checklists
- Aim to find the maximum number of unique major potential defects
- Issues are identified as objectively as possible and recorded for personal reference
Logging meeting: log issues found earlier and check for more potential defects
- The purpose is to log the already identified issues, discover more major issues during the meeting, and identify and log ways of improving the development process
- Logged items can be issues, questions of intent, or process improvement suggestions
- No discussion is allowed during the logging meeting
- At the conclusion of the meeting, the questions of intent are answered, which might immediately lead to logging more issues
Inspection process: Completion
Edit: improving the product
- Someone (usually the author) is given the log of issues to resolve, and becomes the editor
- The editor classifies the issues as defects or non-defects and can make change requests to source documents
- The editor makes the necessary corrections to the document, and may make further improvements, corrections, and process improvement suggestions
Follow-up: checking the editing
- The inspection leader checks that satisfactory action has been taken on all logged issues
Exit: making sure the product is economic to release
- The inspection leader verifies that the inspected product is ready to exit the inspection process, using the applicable exit criteria
Dimensions of software inspection
Inspection Roles
Managers are not allowed to attend inspections
- Leader (or organizer): plans the inspection, chooses participants, helps and encourages, conducts the meeting, performs follow-up, manages metrics
- Moderator: acts as the review meeting chair; ensures that procedures are followed and people stay focused on the review task
- Author (or editor): author of the document being inspected
- Inspectors (or checkers): specialized fault-finding roles for the inspection
- Reader (or presenter): presents the document in the review meeting
- Recorder (or scribe): writes the issue log during the review meeting
Choosing Reviewers
Possibilities
- specialists in reviewing (e.g. QA people)
- people from the same team as the author
- people invited for specialist expertise
- people with an interest in the product
- visitors who have something to contribute
- people from other parts of the organization
Exclude
- anyone responsible for reviewing the author (i.e. line manager, appraiser, etc.)
- anyone with known personality clashes with other reviewers
- anyone who is not qualified to contribute
- all management
- anyone whose presence creates a conflict of interest
Different team sizes
Effectiveness vs. efficiency
- Effective: find as many defects as possible
- Efficient: find as many defects per unit of time as possible
- Both have a strong relation to team size
A large team may find more defects but works slower: inefficient
A small team finds many defects in a short time, but not as many as a slightly larger team
Find the optimal balance between the two; it depends on the type of material and the time to market
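The trade-off can be sketched with a toy model; the numbers and the diminishing-returns assumption below are illustrative, not data from the lecture. Each added reviewer finds some of the defects the others miss, while meeting effort grows linearly with team size.

```python
MEETING_HOURS = 2.0    # assumed meeting length
TOTAL_DEFECTS = 20.0   # assumed defects present in the artifact
MISS_RATE = 0.6        # assumed chance a reviewer misses any given defect

def effectiveness(team_size):
    """Expected number of defects found by the whole team."""
    return TOTAL_DEFECTS * (1.0 - MISS_RATE ** team_size)

def efficiency(team_size):
    """Defects found per person-hour of meeting time."""
    return effectiveness(team_size) / (team_size * MEETING_HOURS)

for n in range(2, 7):
    print(n, round(effectiveness(n), 1), round(efficiency(n), 2))
```

In this model effectiveness rises with team size while efficiency falls, which is exactly the trade-off described above.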
Dimensions of software inspection
Reading techniques
Inspection results depend mostly on
- Participants
- Individual preparation effort and strategies
Reading techniques aim to improve individual defect detection effectiveness
Reading techniques (1/4): Ad-hoc based reading
The product is inspected, but no support is given for reading
Support = guidelines: how to proceed, what to look for
Reading techniques (2/4): Checklist based
Inspection performance is dictated by the checklist; entire defect classes might be missed
To ensure effectiveness (i.e. to address the problems of checklist usage):
- Avoid too general questions, such as "Are the modules easy to change?"
- Provide concrete instructions on how to use the checklist: when, and based on what information, a question is to be answered
- Tailor the checklist to the domain: the development language used (C++, Java, Perl) and the application type (single- vs. multi-threaded, UI vs. class library)
- Use previous error data as input to improve the checklist: focus on the most common errors in the given context
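A tailored checklist along these lines might be represented as data, with concrete instructions attached to each question; the entries below are invented examples, not from the lecture.

```python
# Each entry pairs a concrete question with instructions on when to apply
# it and the error data that motivated it (all entries are hypothetical).
CHECKLIST = [
    {"id": "CL-1",
     "question": "Does every acquired resource (file, lock, socket) get "
                 "released on every exit path?",
     "when": "while reading each function body",
     "motivation": "previous error data: resource leaks were the most "
                   "common defect class"},
    {"id": "CL-2",
     "question": "Is every shared variable accessed only while its lock "
                 "is held?",
     "when": "only for multi-threaded modules",
     "motivation": "application type: multi-threaded server"},
]

for item in CHECKLIST:
    print(item["id"], "-", item["question"])
```

Keeping the checklist as data makes it easy to drop, add, or re-rank questions as new error data comes in.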
Reading techniques (3/4): Abstraction based
Two similar abstraction-based techniques: stepwise abstraction (procedural) and abstraction-driven (OO)
- Read a piece of code and abstract its specification (or function) into natural language or mathematical form
- Compare against the intended behavior
- Used in Cleanroom
Results
- Encourages a deeper understanding of the code
- Helps weaker participants and suppresses the difference to the better ones
- Requires reading all of the code to get benefits from the abstractions
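A worked sketch of stepwise abstraction on a hypothetical function: abstract the code bottom-up into a specification, then compare that with the intended behavior (the intended spec here is assumed for illustration).

```python
def mystery(xs):
    total = 0
    for x in xs:
        if x > 0:
            total += x
    return total

# Abstraction of the loop body: "add x to total when x is positive".
# Abstraction of the whole function: "the sum of the positive elements of xs".
# Assumed intended spec: "the sum of the absolute values of xs".
# Comparing the two abstractions reveals a defect: negative elements are
# ignored instead of being negated.
print(mystery([3, -2, 5]))  # 8, where the intended spec would give 10
```

The defect is found by comparing specifications, not by running tests, which is the point of the technique.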
Reading techniques (4/4): Scenario based
The inspector is given a scenario or perspective (also known as perspective-based reading); each inspector may have a different scenario
- Limits the attention of the inspector to a particular area
- Allows freedom inside the limited area
A scenario or perspective may be:
- A set of use cases
- Defect classes: memory leaks, comparison errors, etc.
- Different roles or stakeholders: e.g. a requirements specification should be read by programmers/designers, testers, customers/users, and project management
Contents
Introduction and benefits of reviews
Types of reviews
Dimensions of inspection
Cost, problems, and alternatives
Summary
Costs of reviews
Rough guide: 5-15 % of development effort (half a day a week is 10 %)
Effort required from reviewers:
- Planning (by the leader/moderator)
- Preparation / self-study checking
- Meeting
- Fixing / editing / follow-up
- Recording and analysis of statistics/metrics
- Process improvement
Experiences with reviews
[Freedman & Weinberg 1982] report that in large systems, reviews have reduced the number of errors reaching the testing stages by a factor of 10. They report that this reduction cut testing costs by 50% to 80% including review costs.
[Gilb & Graham 1993] report a 25 % reduction in schedules, removal of 80-95 % of faults at each stage, and a 28-fold reduction in maintenance cost
Not only do peers working together find more faults, they also find more serious faults than the software producer alone can find [Freedman & Weinberg 1990]
Common problems in review process implementation
- Participants don't understand the review process
- Lack of training in the technique (especially inspection, the most formal)
- Reviews are not planned
- Reviewers are not prepared
- The wrong people participate
- Review meetings drift into problem-solving
- Reviewers critique the producer, not the product
- Reviewers focus on style, not substance
- Lack of management support: reviews are wanted, but no time is allowed for them in project schedules
- Failure to improve the processes (it gets disheartening just getting better at finding the same things over again)
Metrics – What For?
Metrics collection is optional under IEEE; Gilb & Graham consider it essential
Metrics provide hard data that sustains continual management and developer support of the process
Used to plan, control, monitor, and improve the inspection process
Should not be used by management for staff evaluation
Metrics – What to Analyze?
Use, for example, GQM (Goal-Question-Metric) to select which metrics to collect and analyze
AT&T Bell Laboratories uses GQM and recommends 9 metrics:
- Total KLOC inspected
- Avg. LOC inspected / inspection
- Avg. inspection rate
- Avg. preparation rate
- Avg. effort / KLOC
- Avg. effort / fault detected
- Avg. faults detected / KLOC
- Defect-removal efficiency
- Re-inspection %
(Barnard et al. 1994)
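A few of these metrics can be computed directly from raw inspection records, roughly as below; the record fields and sample numbers are illustrative assumptions, not from Barnard et al.

```python
def inspection_metrics(inspections):
    """Aggregate a few AT&T-style metrics over inspection records."""
    total_kloc = sum(i["loc"] for i in inspections) / 1000.0
    total_effort = sum(i["prep_hours"] + i["meeting_hours"] for i in inspections)
    total_faults = sum(i["faults"] for i in inspections)
    return {
        "total_kloc_inspected": total_kloc,
        "avg_effort_per_kloc": total_effort / total_kloc,
        "avg_faults_per_kloc": total_faults / total_kloc,
        "avg_effort_per_fault": total_effort / total_faults,
    }

sample = [
    {"loc": 400, "prep_hours": 6.0, "meeting_hours": 2.0, "faults": 12},
    {"loc": 250, "prep_hours": 4.0, "meeting_hours": 1.5, "faults": 7},
]
for name, value in inspection_metrics(sample).items():
    print(name, round(value, 2))
```

Collected over many inspections, these ratios are what makes the benefits and required effort of the process continuously visible.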
Alternative ways for reviews
Reviews as such are not needed to get review benefits: the benefits come from more minds going through the same issue
What is wrong in the following scenario? A developer writes code that is then inspected by three other developers, i.e. the work products of one person are "torn to pieces" by a group of people
A better approach involves more people when the code is written
Collaborative methods and reviews:
- Pair programming vs. code inspection (research has shown benefits of pair debugging)
- Joint Application Design (JAD) vs. requirements review; JAD is an interactive systems design concept involving discussion groups in a workshop setting
Summary
Goals of reviews
- Detect defects before testing
- Distribute knowledge
Reviews should be applied early, because defects are easier and cheaper to fix when they are found early
Types of reviews: team review, inspection, walkthrough, pair review & pass-around, audits, management review
Inspection dimensions: roles, process, products, reading techniques
In inspections, remember to:
- Plan and prepare for inspections
- Focus on finding defects, not solutions
- Criticize the product, do not blame the producer
- Collect metrics and use them to improve the process
Alternatives: collaborative work (JAD, pair programming)