6/12/2007 2007_06_12_Rqmts.ppt 1
Team Software Project (TSP)
June 12, 2007
Requirements Inspection &
High Level Designs
Outline
Key Discussions from last week (Project Risks)
Configuration Management
Schedule & Cross Team Inspections
Requirements Overview
General
Specifics to look for in LOC Counter SRS
Reviews & Inspection Summary
SRS Inspection Process
Measurement Data Collection
<SRS Inspection>
Design Phase Overview
SRS & Test Plan
System Test manager participates in SRS inspection of other team**
Reviews for clarity and completeness of requirements
Requirements provide basis for test plan
Test plan (baseline next week) inspected by other team**
Milestones
Requirements (SRS)
Initial draft out for review June 10
Final draft for inspection June 12
Inspection June 12
Baselined June 15
System Test Plan
Draft out for review June 17
Inspection June 19
Baselined June 22
High Level Design (SDS)
Inspected & Baselined June 19
Configuration Management Process
Three aspects:
Change Tracking
Version Control
Library
Objectives:
Product content known & available at all times
Product configuration is documented & provides known basis for changes
Products labeled & correlated w/ associated requirements, design & product info
Product functions traceable from requirements to delivery
All product contents properly controlled & protected
Proposed changes identified & evaluated for impact prior to go/no go decisions
Configuration Management
Types of product elements
Maintained (e.g. software)
Baselined & Maintained (e.g. SRS)
Quality Records (e.g. meeting notes, action item list)
Key Functions
Latest Version of each product element (software, documents, quality records)
Copies of prior versions of each product element
Who changed from previous version
When changed
What changed
Why changed
Configuration Management Plan
Configuration identification
• Name configuration items
• Owner
• Storage location
Configuration control procedure
• Process for making changes, lock-out procedures (e.g. check-out, check-in procedures)
Configuration control board (CCB)
• Reviews all change requests
• Determines if change is appropriate, well understood & resources available
• Approvals, commitments
• Open question: hold defect fixes for CCB review vs. urgency of change?
Configuration change request form (CCR, aka CR)
Baseline process (see page 326)
Backup procedures & facilities
Configuration status reports (CSR)
Software Change Management status reports @ weekly meeting
Baseline Considerations
Criteria
– Defined in Configuration Management Plan
– Review / inspection completed & stakeholders recommend approve for baseline
– All major and blocking issues should be resolved
– CRs tracking any remaining (and unresolved) issues
Actions
– Update version # to reflect baselined document (e.g. 1.0)
– Place under change control
Project “Baseline” – snapshot of CIs, baselined & current versions
Automated Configuration Mgmt
Lucent: Sablime / SBCS & SCCS
Rational: DDTS / ClearCase
Perforce Software: Perforce
Microsoft: Visual SourceSafe
MKS
Change Workflow
New / Proposed
Assigned
Resolved
Study
Integrated
Delivered
Verified
No Change / Deferred / Declined
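The workflow above can be sketched as a small state machine. The slide lists only the states; the transition rules below are an illustrative assumption, and a team's CM plan would define the actual legal transitions.

```python
# One plausible encoding of the change-request workflow states above.
# The allowed transitions are an assumption for illustration; the slide
# lists the states but not the edges between them.
ALLOWED_TRANSITIONS = {
    "New/Proposed": {"Assigned", "No Change", "Deferred", "Declined"},
    "Assigned": {"Study"},
    "Study": {"Resolved", "No Change", "Deferred", "Declined"},
    "Resolved": {"Integrated"},
    "Integrated": {"Delivered"},
    "Delivered": {"Verified"},
    "Verified": set(),   # terminal: fix verified in the product
    "No Change": set(),  # terminal outcomes
    "Deferred": set(),
    "Declined": set(),
}

def advance(state: str, next_state: str) -> str:
    """Return next_state if the transition is allowed, else raise."""
    if next_state not in ALLOWED_TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {next_state}")
    return next_state
```

Encoding the workflow this way lets a change-tracking tool reject out-of-order transitions (e.g. delivering a change that was never integrated).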
Requirements Phase
Outputs:
Completed & inspected SRS document
Completed Inspection form (INS)
Time, defect & size data collected
Configuration Management Plan*
Updated project notebook
Note: On baselining SRS, the document should be placed under change control
Requirements Drivers
Functional Needs Statement
SW Requirements Specification
Development, Test, Customer/User Documentation
Software Requirements Specification (SRS)
Objective: Provide information necessary for understanding the proposed product and to explain/justify the need for various product attributes (user code & documentation)
Standards:
IEEE 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology
IEEE 830-1998, IEEE Recommended Practice for Software Requirements Specifications
IEEE 1220-1998 – Application and Management of the Systems Engineering Process
IEEE 1233-1998 – Guide for Developing System Requirements Specifications
Software Requirements Statements
• Unambiguous: All involved (e.g. customers, developers, testers) interpret the statement in the same way. A glossary defining each important term can help.
• Correctness: Describes a condition or attribute that is required of the final product, and all agree this is the case. Each requirements statement must also be compatible with prior information.
• Verifiable: The requirement can be verified prior to delivery to the customer by inspection or test. To satisfy this, use concrete terms and measurable quantities whenever possible.
• Consistency: Ensure individual requirements do not conflict with one another.
• Completeness: All significant requirements for the product are provided (e.g. for input: responses for both valid & invalid data).
Software Requirements Types
• Functional: Specific actions that the program needs to perform in order to meet users' needs; defined or quantified based upon customer expectations
• Quality: Various attributes including reliability, usability, efficiency, maintainability, portability, etc.
• Performance:
• Regulatory: Industry standards (TL 9000); government/regulatory (e.g. UL)
• Security:
Security Requirements
Policy: what to secure, against what threats, by what means? Who is authorized?
Confidentiality: preventing unauthorized reading or knowledge
Integrity: preventing unauthorized modification or destruction
Availability: accessible to authorized users
Privileges: controlling access and actions based on authorizations
Identification & authentication: challenging users to prove identity (e.g. passwords, codes)
Correctness: mediation of access to prevent bypassing controls or tampering with system/data
Audit: log to assist in identifying when a security attack has been attempted
Requirements Identification
Requirements should be numbered or labeled, e.g.:
Requirement XX Start
Requirement XX End
Requirement XX comment
Include release (e.g. cycle) number as part of label
Traceable to functional need statement (see next slide)
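A labeling convention like the one above lends itself to mechanical extraction. This sketch assumes requirements are delimited in plain SRS text by `Requirement <label> Start` / `Requirement <label> End` markers; the label format `C1-R1` (cycle 1, requirement 1) used in the test is an illustrative choice, not one mandated by the slides.

```python
import re

# Hedged sketch: pull labeled requirement blocks out of SRS text that
# follows the "Requirement <label> Start ... Requirement <label> End"
# convention described above. The backreference (?P=label) forces the
# End marker to carry the same label as the Start marker.
BLOCK = re.compile(
    r"Requirement\s+(?P<label>\S+)\s+Start"
    r"(?P<body>.*?)"
    r"Requirement\s+(?P=label)\s+End",
    re.DOTALL,
)

def extract_requirements(srs_text: str) -> dict:
    """Map each requirement label to its statement text."""
    return {m.group("label"): m.group("body").strip()
            for m in BLOCK.finditer(srs_text)}
```

A script like this can also flag unpaired Start/End markers or duplicate labels before the SRS is baselined.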
Requirements Traceability
Backwards Traceability includes explicit information that identifies the higher level requirements that the lower level requirement derives from
– Traceability should cover all phases (e.g. functional need – requirements, requirements – design, design – code, requirements – test)
– Ensures:
• nothing is left out of the product
• change impact assessments can be made
Trace Tables: Backwards trace table shows the link from lower level (e.g. SRS) to higher level (e.g. Strat form)
• Part of the lower level document
Forwards trace table shows lower level requirements derived from an upper level requirement
LOC Project – generate a backwards trace table*
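A lightweight way to represent the backwards trace table is a mapping from lower-level to higher-level items; inverting it yields the forwards table for free. The `SRS-n` / `NEED-n` labels and their contents below are hypothetical examples, not actual LOC project requirements.

```python
# Hedged sketch: a backwards trace table mapping each lower-level SRS
# requirement to the higher-level functional-need item(s) it derives
# from. Labels and descriptions are illustrative only.
backwards_trace = {
    "SRS-1": ["NEED-1"],  # count logical LOC in ANSI text files
    "SRS-2": ["NEED-1"],  # exclude comments and blank lines
    "SRS-3": ["NEED-2"],  # report counts per input file
}

def forward_trace(backwards: dict) -> dict:
    """Invert the backwards table into a forwards trace table, showing
    which lower-level requirements derive from each upper-level item."""
    fwd = {}
    for low, highs in backwards.items():
        for high in highs:
            fwd.setdefault(high, []).append(low)
    return fwd
```

An upper-level item missing from the inverted table is a gap: a functional need with no requirement tracing to it.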
SRS Document Baseline/Change History
Tracks all versions and modifications
Version numbering scheme documented in CM plan
Change request information tracks to CRs
e.g.
Version 0.1 – Pre-baseline version for review
Version 1.0 – Cycle 1 baseline version
Version 1.1
CR 101 – Clarify security requirements
CR 102 – Delete support for VB files
Version 2.0 – Cycle 2 baseline version
Adds the following features ….
Version 2.1 …
SRS Characteristics Summary
Detailed, clearly delineated, concise, unambiguous & testable (quantifiable)
Changes: defects, clarifications, additions / enhancements
Requirements numbered or labeled (e.g. Requirement XX Start, Requirement XX End, Requirement XX comment)
Traceable to functional need statement
Inspected & baselined
Maintained under change control
Document includes structural elements including:
Baseline/change history
Approval page
Customer documentation specifications
LOC Counter Requirements (see also TSPi pp. 112-113)
Overall description and framework of GUI (if provided)
Input
File formats (ANSI text) & extensions (.c, .cc) supported
Limits on file names (e.g. max characters)
Additional features (e.g. browsing for input file)
Error cases: one or both files empty, non-existent, unable to be opened
Results of Comparison Algorithm
Output if identical lines moved (e.g. Line A, B, C, D vs. Line A, C, B, D)
Treatment of comments (in-line & alone), blank lines, braces (LOC counting)
Multi-line statements / comments
Output
Format and location of output (e.g. screen, file, directory)
Errors
All errors including messages (invalid inputs, algorithm errors, etc.)
Other
Product installation & execution
User documentation plan
Response time
Security
Scalability (e.g. max file sizes supported)
Concurrency
HW requirements (e.g. processor, hard drive, display resolution, OS, peripherals such as mouse)
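To show the kind of counting decisions the SRS must pin down (comments, blank lines, braces), here is a deliberately naive sketch of one possible rule set. It is not the TSPi counting standard: it handles only `//` in-line comments and ignores corner cases such as `/* */` block comments and `//` inside string literals, all of which the real SRS would have to specify.

```python
# Hedged sketch of one possible LOC-counting rule set for C-style
# source: blank lines, comment-only lines, and brace-only lines are not
# counted. The team's actual counting standard defines the real rules.
def count_loc(lines):
    count = 0
    for raw in lines:
        # Drop a trailing // comment (naive: breaks on // inside strings,
        # and /* */ block comments are not handled at all).
        code = raw.split("//", 1)[0].strip()
        if not code:
            continue                    # blank or comment-only line
        if code in ("{", "}", "};"):
            continue                    # brace-only lines are not counted
        count += 1
    return count
```

Writing the rules down as executable checks like this is one way to make the "treatment of comments, blank lines, braces" requirement verifiable.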
Why Do Reviews / Inspections?
Can identify defects early in the process
more efficient (i.e. cheaper) defect removal
Leverages knowledge of multiple engineers
Leverages different viewpoints
Improves defect detection odds
Broadens understanding of product being inspected*
Inspections vs Reviews
Inspections
Formal, typically requires face-to-face meetings
Measurement data collected
Disposition of product agreed to
Quality records available
Reviews
Informal
Can be face to face, email exchange
Measurement data and quality records optional
Typically used for early product work & small code changes
Peer Reviews
Review Objectives:
Find defects
Improve software element
Consider alternatives
Possibly, educate reviewers
Types:– Desk Check: informal, typically single peer, effectiveness?
– Walk-through: informal, several peers, notes taken, data collection optional
– Variant: test walk-through
Inspections
Inspection Objectives
Find defects at earliest possible point
Verify to specification (e.g. design to requirements)
Verify to standards
Collect element and process data
Set baseline point
Exit Criteria
All detected defects resolved
Outstanding, non-blocking issues tracked
Techniques & Methods
Generic checklists & standards
Inspectors prepared in advance
Focus on problems, not on resolution
Peers only
“Mandatory” data collection
Roles: Moderator, reader, recorder, inspector, author
Inspection Logistics
Identify moderator (for TSPi, use process manager)
Inspection briefing (identify inspection roles, set date/time for inspection)
Review product
– Individual reviews
– Record time spent reviewing
– Identify defects, but do not log on LOGD form (defects recorded during inspection on INS & LOGD forms)
– Typically want 3-5 days for an adequate review period
Inspection meeting
– Obtain & record preparation data
– Step through product one line or section at a time
– Raise defects or questions
– Defects recorded by moderator on INS form
– Defects recorded by producer on LOGD form (no need to use Change Requests)
– Peripheral issues & action items should be recorded in ITL log*
Inspection Logistics (continued)
Estimate remaining defects: TBD (but, for each defect, record all members who identified it)
Conclude meeting
– Agree on verification method for defects
– Agree on disposition (e.g. approved, approved with modification, re-inspect)
Rework product & verify fixes (e.g. moderator)
Obtain signatures of all inspectors on baseline sheet(file as quality record)
Measurement Data & Metrics
Base Metrics
# & type of defects found (major, minor)
For each defect, who found it
# of pages inspected, preparation time (per inspector), inspection time
Measures
Preparation rate = # pages / average preparation time
Inspection rate = # pages / inspection time
Inspection defect rate = # major defects / inspection time
Defect density = # estimated defects / # of pages
Inspection yield = # defects found / # estimated defects (individual & team)
SRS Phase Defect Containment (%) = 100% × # defects removed @ step / (incoming defects + injected defects)
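The measures above reduce to simple arithmetic over the base metrics. This sketch mirrors the formulas on the slide; the function and parameter names are illustrative, and times are assumed to be in hours.

```python
# Hedged sketch computing the inspection measures defined above.
# Formula per measure follows the slide; names are illustrative.
def inspection_measures(pages, prep_times, inspection_time,
                        major_defects, defects_found, estimated_defects):
    avg_prep = sum(prep_times) / len(prep_times)  # per-inspector times
    return {
        "preparation_rate": pages / avg_prep,          # pages per hour
        "inspection_rate": pages / inspection_time,    # pages per hour
        "inspection_defect_rate": major_defects / inspection_time,
        "defect_density": estimated_defects / pages,   # defects per page
        "inspection_yield": defects_found / estimated_defects,
    }

def defect_containment(removed, incoming, injected):
    """SRS-phase defect containment (%), per the formula above."""
    return 100.0 * removed / (incoming + injected)
```

For example, a 20-page SRS inspected in 2 hours after 2 hours of average preparation gives preparation and inspection rates of 10 pages/hour each; 8 defects removed against 2 incoming and 8 injected gives 80% containment.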