Technical Reviews

A Presentation by Manuel Richey
for EECS 814
November 17th, 2005


Page 1: Technical Reviews

A Presentation by Manuel Richey
for EECS 814
November 17th, 2005

Page 2: Overview Of This Presentation

- There are numerous types of technical reviews, and numerous methods for implementing reviews.
- This presentation will briefly describe several types and techniques of reviews.
- It will go into greater detail on one type and method for technical reviews.
- Finally, we will perform a technical review in class.

Page 3: Purpose of Reviews

- Oversight
- To create buy-in or a sense of ownership
- To disseminate or transfer knowledge
- To improve the quality of a document or work product
- Other purposes?

Page 4: Types Of Reviews

- Customer Reviews
- Management Reviews
- Peer Reviews
- Personal Reviews
- Other Types of Reviews?

Page 5: Customer Reviews

- Often specified as milestones in a contract (e.g., preliminary and critical design reviews).
- Formal documentation is submitted prior to the review.
- A long walkthrough/review is conducted with the customer.
- Often leads up to customer sign-off of a milestone.

Page 6: Management Reviews

- Each company has its own format for these.
- Typically a PowerPoint presentation to management or technical leadership, followed by a Q&A session.
- The usual objective is to either approve a project or monitor project status.
- Inputs are typically project plans and schedules or status reports.
- The management team makes decisions and charts or approves a course of action to ensure progress and properly allocate resources.

Page 7: Technical Peer Review

A structured encounter in which a group of technical personnel analyze a work product with the following primary objectives:
- Improve the quality of the work product
- Improve the quality of the review process

Page 8: Why Hold Technical Peer Reviews?

- Software development is a very error-prone process.
- Early detection of defects is cost effective, and peer reviews find errors early.
- Peer reviews find many of the same errors as testing, but earlier and with less effort.
- They serve to educate the participants and provide training.
- They raise a team's core competence by setting standards of excellence.

Page 9: A Generic Review Process

- Plan Review: assess readiness of the work product, assign team members, send out the announcement and review package.
- Detect Defects: each reviewer looks for defects in the work product.
- Collect Defects: in a meeting, via email, etc.
- Correct Defects: the author corrects the work product.
- Follow Up: verify rework, document the review.
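The five steps above can be sketched in code. This is only an illustrative model of the generic process; the names (`Review`, `Defect`, `plan_review`, etc.) are made up for this sketch and come from no real tool.

```python
from dataclasses import dataclass, field

@dataclass
class Defect:
    location: str
    description: str
    fixed: bool = False

@dataclass
class Review:
    work_product: str
    reviewers: list
    defects: list = field(default_factory=list)
    closed: bool = False

def plan_review(work_product, reviewers):
    # Plan: assess readiness and assign the team (announcement elided).
    return Review(work_product, reviewers)

def collect_defects(review, per_reviewer_findings):
    # Collect: merge each reviewer's separately detected defects into one log.
    for findings in per_reviewer_findings:
        review.defects.extend(findings)

def correct_defects(review):
    # Correct: the author reworks each logged defect.
    for d in review.defects:
        d.fixed = True

def follow_up(review):
    # Follow up: verify the rework before documenting and closing the review.
    review.closed = all(d.fixed for d in review.defects)
    return review.closed
```

A review passes through all five steps in order; `follow_up` refuses to close the review until every logged defect has been reworked.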

Page 10: Methods For Technical Reviews

- Ad Hoc
- Personal
- Walkthrough
- Fagan-Style Inspection
- Asynchronous Review
- N-Fold Inspection
- Many other techniques

Page 11: Ad-Hoc Reviews

- Provide no process or instructions on how to detect defects.
- Defect detection depends on the inspector's skill and experience.
- Still valuable for:
  - Enforcing standards
  - Project status evaluations
  - Improved communications
  - Training and knowledge dissemination

Page 12: Personal Reviews

- An integral part of the "Personal Software Process" (PSP) by Watts Humphrey.
- Involves only the author of a work.
- Employs checklists and metrics (if following PSP).
- For PSP, code review occurs before the first compile.
- May be performed by the author prior to a formal technical review.

Page 13: Walkthroughs

- A meeting in which the author presents a work product in a sequential manner and clarifies the product as necessary.
- No preparation by meeting attendees.
- May be held as part of another type of review. For example, a walkthrough may be held for a work product prior to distributing the review packet to the reviewers for a Fagan-style software inspection.

Page 14: Fagan Style Software Inspections

- A method of technical review that involves a meeting-based process and specific roles.
- Process: reviewers detect defects separately, but hold a meeting to collect, classify, and discuss defects.
- Defined roles: Moderator, Author, Presenter, Recorder, etc.
- We will examine this technique in detail.

Page 15: Asynchronous Inspections

- No meeting, so the review can be distributed in space and time.
- Doesn't involve the author.
- Process is as follows:
  - Moderator sends out material via email.
  - Individual reviewers create lists of defects.
  - Defect lists are circulated to all inspectors and discussed via email.
  - Individual reviewers update their defect lists and send them to the Moderator.
  - Moderator compiles the final defect list, sends it to the author, and follows up. This eliminates group approval.

Page 16: N-Fold Inspections

- Several independent teams inspect the same work product using traditional inspection methods.
- Many teams find overlapping defects, but each team typically also finds unique defects.
- The Moderator collects faults from the independent teams and composes the final defect list.
- This is an expensive process, used when high reliability is desired.
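The Moderator's merge step can be sketched as follows. This is a hypothetical illustration (the slides do not prescribe a data format): each defect is keyed as a (location, description) pair so that overlapping findings from different teams collapse into one entry while each team's unique findings survive.

```python
def merge_team_findings(team_findings):
    """team_findings: one defect list per independent team, each defect a
    (location, description) tuple. Returns the deduplicated final list."""
    final = {}
    for team_id, findings in enumerate(team_findings):
        for defect in findings:
            # The first team to report a defect gets credit; repeats merge away.
            final.setdefault(defect, team_id)
    return sorted(final)

teams = [
    [("L10", "null deref"), ("L25", "bad loop bound")],
    [("L10", "null deref"), ("L90", "missing unlock")],  # one overlap, one unique
]
final_list = merge_team_findings(teams)
```

With the two sample teams above, the overlapping "L10" defect appears once in the final list alongside each team's unique finding.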

Page 17: Fagan Inspection Process

A six-step review process:
1. Author submits work products for review.
2. Moderator assesses the product's readiness, assigns the review team, and announces the review.
3. Reviewers prepare for the review.
4. Reviewers hold the review meeting.
5. Author corrects defects.
6. Moderator verifies rework and closes the review.

Page 18: Standards and Checklists

Standards:
- Rules for requirements/design/coding that all work products must adhere to.
- Typically either project or company specific.
- Improve software maintenance and quality.

Checklists:
- A list of questions for the inspectors to answer while reading the document.
- Should be less than a page long.
- Should be derived from the most common past defects.
- Should be periodically updated.
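The "derived from the most common past defects" rule can be mechanized with a frequency count. A minimal sketch, assuming past defects are available as labeled strings (the defect types below are invented for illustration):

```python
from collections import Counter

# Hypothetical history of past defect types, e.g. pulled from old defect logs.
past_defects = [
    "uninitialized variable", "off-by-one loop bound", "missing null check",
    "missing null check", "off-by-one loop bound", "missing null check",
]

def build_checklist(defect_types, max_items=10):
    # Keep the checklist short (under a page): take only the most
    # frequent past defect types, ordered by how often they recurred.
    common = Counter(defect_types).most_common(max_items)
    return [f"Check for: {kind}" for kind, _count in common]

checklist = build_checklist(past_defects, max_items=3)
```

Rerunning this periodically against the latest defect logs implements the "periodically updated" rule as well.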

Page 19: Inspection Package

- Work product to be inspected (line-numbered if possible).
- Supporting documentation (requirements, or the work product from which the inspected work product was derived).
- Checklists and standards, as available.
- Inspection meeting notice (often sent by email).

Page 20: Fagan Inspection Roles

- Producer/Author: creates the product being reviewed and answers questions.
- Moderator: prepares the review package, moderates the meeting, verifies rework.
- Presenter/Reader: presents the product during the meeting.
- Recorder/Scribe: records defects during the meeting.
- Reviewer: everyone is a reviewer, but some reviewers may not have another role.

Page 21: Reviewer's Responsibilities

- Objectively inspects the work product.
- Tracks the amount of time spent preparing for the inspection meeting.
- Actively participates in the inspection meeting by providing defects found during examination of the work product.

Page 22: Producer's Responsibilities

- Provides required reference material for the inspection.
- Finds defects.
- Provides clarification.
- Answers questions.
- Modifies the inspected work product to correct defects.

Page 23: Moderator's Responsibilities

- Ensures entry criteria are met.
- Distributes the inspection package to the review team.
- Ensures that all reviewers are prepared prior to the inspection meeting.
- Facilitates the inspection meeting.
- Also participates in the review as a reviewer.
- Assures that all items logged at the meeting are dispositioned.
- Collects the data and completes the inspection record.

Page 24: Presenter's Responsibilities

- Presents the product in a logical fashion, paraphrased at a suitable rate.
- Typical review rates are 100-200 LOC/hour (or 10-12 pages/hour for documents).
- Rates can vary significantly due to the following factors:
  - Language
  - Comments and readability
  - Type of software
  - Structure of software
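The rates above make meeting-length planning simple arithmetic. A quick sketch (the function name is illustrative):

```python
def estimated_hours(size, rate_per_hour):
    """Hours a presenter needs to walk through `size` units of material
    (LOC or pages) at `rate_per_hour` units per hour."""
    return size / rate_per_hour

# A 500-line module at the slow end of the slide's range (100 LOC/hour)
# needs 5 hours; at the fast end (200 LOC/hour), 2.5 hours. Either way,
# it is too large for one sitting and should be split across meetings.
slow = estimated_hours(500, 100)   # 5.0 hours
fast = estimated_hours(500, 200)   # 2.5 hours
```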

Page 25: Recorder's Responsibilities

- Completes the defect log.
- Defects should be classified, based on team consensus, by:
  - Severity (Major, Minor)
  - Type
  - Class
- Should use techniques to minimize defect-logging time.
- Is not a secretary.
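A defect-log record matching this classification scheme might look like the following. The field names and example type/class values are assumptions for illustration; real inspection forms define their own taxonomies.

```python
from dataclasses import dataclass

@dataclass
class DefectEntry:
    location: str      # e.g. file:line, or a document section
    severity: str      # "Major" or "Minor", decided by team consensus
    dtype: str         # illustrative types: "logic", "interface", "documentation"
    dclass: str        # illustrative classes: "missing", "wrong", "extra"
    note: str = ""     # kept short, to minimize defect-logging time

log = [
    DefectEntry("foo.c:42", "Major", "logic", "wrong", "loop bound"),
    DefectEntry("foo.c:88", "Minor", "documentation", "missing"),
]
majors = [d for d in log if d.severity == "Major"]
```

Keeping the record this small is one of the "techniques to minimize defect logging time" the slide alludes to: the Recorder captures location and classification during the meeting and leaves discussion out of the log.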

Page 26: Review Meeting

- Reviewers fill out and sign the inspection form, indicating time spent reviewing the product.
- Reviewers collectively decide if they are ready for the review to be held.
- The Presenter progresses through the review product, eliciting defects as he goes.
- The Recorder records defects on the defect form.
- Reviewers collectively disposition the review as "Accept As Is", "Accept with Corrections", or "Re-review".

Page 27: Review Comprehension Methods

Front to Back:
- Start at the front of a document or the top of a code module and proceed to the end in sequential order.
- Use with documents, or if already familiar with the code design.

Bottom Up:
- Start at the lowest-level routines and work up.
- Used when the code is new to the inspector.

Top Down:
- Start at the main software entry points and review those, then review the routines they call.
- Used when the inspector is familiar with the code.

Integrated:
- Use both top-down and bottom-up approaches as appropriate.

Page 28: What Makes A Good Reviewer

- Is thorough.
- Is prepared (most peer review postponements are due to lack of team preparation).
- Reviews the product, not the producer.
- Raises issues, doesn't resolve them.
- Doesn't give the author the benefit of the doubt.

Page 29: What Makes A Good Moderator

- Encourages individuals to prepare and participate.
- Controls the meeting (starts on time, keeps focus on the agenda, eliminates problem solving, etc.).
- Nurtures inexperienced reviewers.
- Is sensitive to the author.
- Feels ownership in the quality of the product.

Page 30: How To Select Reviewers

- Participants are typically selected by the Moderator.
- Important criteria for selecting participants:
  - Ability to detect defects (expertise)
  - Knowledge of source documents
  - Need to understand the work product (recipients of the work product)
  - Motivation and other personal qualities

Page 31: Review Metrics

- Help measure the effectiveness of reviews.
- Aid in continuous process improvement.
- Provide feedback to management.
- Typical metrics are:
  - Average preparation effort per unit of material (typically LOC, KLOC, or pages)
  - Average examination effort per unit of material
  - Average explanation rate per unit of material
  - Average number of defects and major defects found per unit of material
  - Average hours per defect and per major defect
  - Percentage of re-inspections
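Several of these metrics fall out of simple sums over review records. A sketch over invented sample data (the tuple layout is an assumption, not a standard format): each record holds LOC reviewed, preparation hours, meeting hours, defect counts, and whether a re-inspection was required.

```python
# (loc, prep_hours, meeting_hours, defects, major_defects, reinspected)
reviews = [
    (400, 3.0, 2.0, 12, 3, False),
    (250, 2.0, 1.5,  8, 2, True),
]

total_kloc = sum(r[0] for r in reviews) / 1000.0
prep_per_kloc = sum(r[1] for r in reviews) / total_kloc      # prep effort / KLOC
exam_per_kloc = sum(r[2] for r in reviews) / total_kloc      # examination effort / KLOC
defects = sum(r[3] for r in reviews)
major_defects = sum(r[4] for r in reviews)
hours_per_defect = sum(r[1] + r[2] for r in reviews) / defects
reinspection_pct = 100.0 * sum(r[5] for r in reviews) / len(reviews)
```

Tracked over time, these averages show whether preparation effort, yield, and re-inspection rates are trending the right way, which is the continuous-improvement feedback the slide describes.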

Page 32: Industry Experience

- Aetna Insurance Company: FTR found 82% of errors; 25% cost reduction.
- Bell-Northern Research: inspection cost of 1 hour per defect; testing cost of 2-4 hours per defect; post-release cost of 33 hours per defect.
- Hewlett-Packard: estimated inspection savings (1993): $21,454,000.
- IBM: reported 83% defect detection through inspections.
- AT&T: reported 92% defect detection through inspections.
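The Bell-Northern figures make the savings easy to work out. A back-of-the-envelope sketch (the helper function and the 100-defect scenario are illustrative, not from the slide):

```python
def hours_saved(n_defects, later_cost_per_defect, inspection_cost=1.0):
    """Hours saved by catching n_defects in inspection (1 h/defect per
    Bell-Northern) rather than at a later, costlier phase."""
    return n_defects * (later_cost_per_defect - inspection_cost)

# Catching 100 defects in inspection instead of testing, using the
# midpoint of the 2-4 h/defect testing range (3 h/defect):
vs_test = hours_saved(100, 3.0)    # 200.0 hours saved
# The same 100 defects caught in inspection instead of post-release
# (33 h/defect):
vs_field = hours_saved(100, 33.0)  # 3200.0 hours saved
```

The 33:1 post-release ratio is why every escaped defect dominates the cost picture, and why the large percentage-detection figures from IBM and AT&T translate into real savings.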

Page 33: Personal Case Study: The KDR 510

- A VHF aviation radio and modem.
- A real-time, embedded, safety-critical DSP system.
- Won the Editors' Choice award from Flying Magazine.
- Formal peer reviews were the main QA activity.

Page 34: Quality Data for the KDR 510

- KDR 510 reviews detected many errors:
  - 72% of SW requirements defects
  - 90.7% of SW design defects
  - 90.6% of SW coding defects
- Total review time was approximately 5% of total project time.
- Only 23% of total project time was spent in integration & test.
- Only one error escaped into the field.

Page 35: The Real Reasons For Holding Reviews

- Reviews improve schedule performance.
- Reviews reduce rework. Rework accounts for 44% of development cost!

[Chart: project timelines with and without reviews across the requirements, design, code, and test phases.]

Page 36: Tools For Technical Reviews

- Various tools exist for different inspection methods:
  - ICICLE: for inspection of C and C++ programs.
  - Scrutiny and InspeQ: for specific inspection processes.
  - ASSIST: supports a generic inspection process.
  - For a larger list, see: http://www2.ics.hawaii.edu/~johnson/FTR/
- Home-grown tools:
  - Typically built with an Access database.
  - Reviewers enter defects offline into the database.
  - Eliminates the Recorder and Reader roles.
  - Gives the author time to consider defects before the meeting.

Page 37: Some References

- A great website for Formal Technical Reviews: http://www2.ics.hawaii.edu/~johnson/FTR/
- Watts S. Humphrey, A Discipline for Software Engineering, Addison-Wesley, January 1995.
- M. E. Fagan, "Design and code inspections to reduce errors in program development," IBM Systems Journal, Vol. 15, No. 3, 1976, pp. 182-211.
- G. M. Schneider, J. Martin, W. T. Tsai, "An Experimental Study of Fault Detection in User Requirements Documents," ACM Transactions on Software Engineering and Methodology, Vol. 2, No. 2, April 1992, pp. 188-204.
- A. Porter, H. Siy, C. Toman, and L. Votta, "An Experiment to Assess the Cost-Benefits of Code Inspections in Large Scale Software Development," IEEE Transactions on Software Engineering, Vol. 23, No. 6, June 1997, pp. 329-346.

Page 38: Questions?

Page 39: Review Workshop

- Objective: allow everyone to take a role in a Fagan-style code review.
- Combine results to create an N-fold inspection.
- Break into teams of 4.
- Handouts: source code files, supplementary material, review forms.
- Schedule: 20 minutes to prepare for the review, 20 minutes for the review, a 10-minute break for everyone but moderators, and 5 minutes to summarize results.

Page 40: Discussion on Review Workshop

- Results from the N-fold inspection.
- What did you learn from this code review?
- Was it effective?
- How long would it have taken to detect some of these defects by testing?
- Other comments or conclusions?

Page 41: Conclusion

- Reviews complement software testing.
- Reviews are cost-effective techniques for improving the quality of a developed product. They pay for themselves.
- Reviews improve the maintainability of a developed product.
- One size doesn't fit all: an organization's size, culture, and industry should be considered in deciding on the methods to use for reviews.