TRANSCRIPT
Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213-3890
Sponsored by the U.S. Department of Defense
© 2003 by Carnegie Mellon University, Version 1.0
Trading Places: Measurement and Analysis in the Eyes of the Acquirer and the Supplier
Wolf Goethert, Jeannine Siviy, Robert Ferguson
Software Engineering Measurement & Analysis Initiative
Trademarks and Service Marks
® Capability Maturity Model, Capability Maturity Modeling, Carnegie Mellon, CERT, CERT Coordination Center, CMM, and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
SM Architecture Tradeoff Analysis Method; ATAM; CMM Integration; CURE; IDEAL; Interim Profile; OCTAVE; Operationally Critical Threat, Asset, and Vulnerability Evaluation; Personal Software Process; PSP; SCAMPI; SCAMPI Lead Assessor; SCAMPI Lead Appraiser; SCE; SEI; SEPG; Team Software Process; and TSP are service marks of Carnegie Mellon University.
Objectives
Establish a view of the acquirer and supplier/contractor roles and responsibilities.
Show how measurement and analysis skills for internal development can be recast for acquisition and contracting environments.
Address two prevalent questions in the acquisition community:
• How can measurement be used to improve requirements-related processes?
• How can we conduct causal analysis when we no longer control the collection processes and/or data?
Outline
Context
• state of the community
• changing perspectives
Background
• roles & responsibilities
• maturity models
• measurement & analysis methods
Scenario
• goal-setting and success, progress, analysis indicators
• inspecting the quality of deliverables: requirements
• monitoring and oversight: progress analysis
• measurement in the contract
• communicating with integrated measures
Summary
Terms and Usage
We use the terms "acquisition" and "contracting" interchangeably throughout this tutorial.
In addition, the terms "contractor" and "supplier" are used interchangeably. The term "developer," in the context of this tutorial, is used to describe a contractor.
Trends in Outsourcing 1
From Gartner Group (2002):
• one out of every 10 jobs with U.S.-based information technology vendors and service providers will be exported
• more than 80 percent of corporate boards of directors will have considered offshore outsourcing
• 40 percent of corporations will have finished an outsourcing pilot program or be actively involved in outsourcing technology services
From Forrester Research:
• offshore outsourcing will account for 28% of IT budgets in Europe and the U.S. by 2004
• offshore IT workers will go from 360,000 (in 2002) to more than 1 million in 2005
[www.rosourcing.com], [Robb 02], [Diana 03]
Trends in Outsourcing 2
From Michael F. Corbett & Associates:
• Offshore outsourcing is just one small part of a (US)$5 trillion global outsourcing market.
• This market is growing by more than 15 percent per year, and the offshore component is certainly among the fastest growing.
• For U.S. IT professionals, this probably means that their future success will come from moving up the IT value chain.
From Ovum research:
• The outlook for the future is more offshore outsourcing, but not at the levels predicted by other analysts in this area.
[www.rosourcing.com], [Robb 02], [Diana 03]
Why Do Organizations Outsource?
Top 10 Reasons from The Outsourcing Institute:
• Reduce and control operating costs
• Improve company focus
• Gain access to world-class capabilities
• Free internal resources for other purposes
• Resources are not available internally
• Accelerate reengineering benefits
• Function difficult to manage/out of control
• Make capital funds available
• Share risks
• Cash infusion
[www.rosourcing.com]
The Supplier Landscape 1
Contractor dimensions:
• geography
• style
• maturity
• processes
Examples include the following:
• domestic development groups
• offshore development groups
• dedicated offshore development centers
• off-the-shelf (COTS) products
• systems integrators
• open source
• Rational
• PSP/TSP
The Supplier Landscape 2
From Forrester Research:
• 88% of the firms looking overseas for services claim to get better value for their money offshore.
• 71% said offshore workers did better quality work.
[www.rosourcing.com]
Contracting Challenges 1
From Software Magazine in 2001:
• 23% of software projects are cancelled
• Cost growth averages 45%
• Schedule growth averages 67%
• The average final product includes only 67% of its requirements
• Only 28% of projects finish on schedule and within budget
Cited by a sampling of Army Acquisition Managers:
• The majority of problems and risks affecting acquisition programs resides "somewhat" with the following:
- factors outside the control of acquirers and developers
- acquisition program policies and processes
- contracting processes
- the contractor's development process
[ASSIP 03], [SWM 01]
Contracting Challenges 2
Cited by a sampling of Army Acquisition Managers:
• The top problem areas include
- requirements management (selected by 63%)
- project management (22%)
- contractor processes (22%)
- unstable funding (21%)
From a recent presentation on component technology:
• contractor qualifications (Mitigation: CMMI)
• requirements definition (Mitigation: close partnerships)
• engineering acceptance (Mitigation: process analysis)
[ASSIP 03], [Scherlis 03]
Measurement Challenges
From interviews of several acquisition management personnel:
• "Measurement" is not a troublesome issue in itself; however, getting consistent, meaningful data and understanding how to use data is a high priority and concern.
• There is a tremendous need for progress measures that can be used for timely warning of major program disasters.
[C-M-H 03]
Measures in Practice 1
In a recent survey, a sampling of Army Acquisition Managers affirmed the following:
• 83% based planning estimates on historical data
• 79% defined quantitative objectives for acquired products and services
• 81% used metrics as an input to decision making
• 75% measured and controlled project cost and schedule
• 50% recorded data in an organizational measurement repository
• 78% had sufficient insight into the contractor's software engineering effort to ensure the project is managed and controlled and complies with contract requirements
• 78% appraised the quality of the contractor's process, performance, products, and services throughout the contract to identify risks and take appropriate action
[ASSIP 03]
Measures in Practice 2
The surveyed Army Acquisition Managers use these measures to track project status (percent of respondents):
• schedule: 91%
• cost: 86%
• development progress: 82%
• manpower: 63%
• requirements stability: 51%
[ASSIP 03]
What Does This Mean?
Issues in contracting are complex and multidimensional.
• Requirements management is a problem area that frequently is not well measured.
• Project monitoring and oversight is fairly well measured, but the related analysis may not be mastered.
• Organizations may often measure what they know how to measure, but not necessarily all that is needed to be successful.
How does this compare to your experience?
Responsibility and Authority
Measuring project and product success is the same whether the project is internal or contracted:
• on schedule
• at cost
• with required functionality
• without defects
The acquiring program manager's "circle of influence" and "circle of control" is different than the development project manager's.
• the development project manager addresses the daily details of project execution
• the acquisition program manager defines and executes a new set of processes
• the acquisition program manager should leverage development knowledge to manage the contract methodically, rationally, and knowledgeably
Roles and Information Exchange
(diagram) The acquirer and the supplier/developer are joined by a contractual handshake. The acquirer's pre-award activities include RFP preparation and contract award; post-award activities include monitoring project progress and evaluating deliverables. The acquirer passes functional and quality requirements to the supplier/developer, who develops, customizes, and integrates systems, software, and COTS, possibly with subcontractors. In return the supplier provides status information, interim documents and tangibles, and deliverables; the acquirer responds with directions and corrections.
Acquisition Measurement Themes
Project Management
• project execution
• contract relationship
Product Life Cycle & Performance
• product planning
• product development
• deployment
• maintenance
Process & Organizational Infrastructure
• process definition and execution
• relationship management
Measuring Project, Product, Process
(diagram) Across the contractual handshake, the acquirer (pre-award and post-award activities) and the supplier (develop, customize, integrate: systems, software, COTS) exchange indicators and information for tracking, monitoring, direction, etc. Measurement covers:
• Project: schedule, cost, requirements satisfaction, and quality (amount of rework), each as status, projection, and trend
• Products: supplier produced and acquisition organization produced
• Processes
• Relationship: roles (changes), invoicing (payment)
Responsibilities Prior to Contract Award
Scope definition
Vendor selection
• technical capabilities
- proposed scope
• process capabilities
- predictable, productive performance
- ability to deal with change
• financial capabilities
Contract negotiation
• quality management metrics
• change management
• managing & monitoring the relationship
Responsibilities After Contract Award
(diagram) The contractor develops the system. Acquirer responsibilities after contract award:
• evaluate the quality of deliverables: documents (SRD, SDP, Measurement Plan, SDD), status reports (schedule, cost, testing), and the final product
• monitor and oversight: schedule & progress, resources & costs, the developer's processes
Monitor & Oversight
(diagram) Status information from the contractor (schedule progress, budget status, test results, process results such as inspections, and process compliance) feeds the acquirer's analysis and review process, which applies the acquirer's evaluation criteria and produces indicators.
Measurable results (examples):
• contractor effort, actual vs. plan
• contractor schedule, actual vs. plan
• defects reported: description, severity, class, type
• size, complexity of the work product
Evaluate Quality of Deliverables
(diagram) Documents to review (SRD, SDP, Measurement Plan, SDD) and final deliverables pass through the acquirer's inspection or review process, which applies the acquirer's evaluation criteria and produces indicators about both the products and the process.
Measurable results (examples):
• defects discovered: description, severity, class, type
• size of the work product
• effort invested in the inspection process
• time spent during the inspection activities
Success Factors
To make this work you need:
• technical capabilities
- integration, validation, deployment
• process capabilities
- project management, QA, change control
• domain knowledge
- product uses, stakeholders, quality goals
• relationship management
- contracting, change management, roles, payment, relationship reviews, ...
And measurement to see that these things are working well.
Adapting CMMI for Acquisition
In addition to establishing these Process Areas (PAs)
• Supplier Agreement Management
• Integrated Supplier Management
you may also need to use these PAs for your acquisition processes and extend them to include your supplier:
• Requirements Management, Requirements Development
• Integrated Teaming
• Decision Analysis and Resolution
• Organizational Environment for Integration
• Organizational Process Performance
• Quantitative Project Management
• Causal Analysis and Resolution
• Risk Management
• Project Monitoring and Control
• Verification & Validation
• Configuration Management
• Measurement and Analysis
SA-CMM Key Process Areas
Level 1 (Initial): competent people and heroics; higher risk and rework.
Level 2 (Repeatable), focus on basic project management: Transition to Support; Evaluation; Contract Tracking and Oversight; Project Management; Requirements Development and Management; Solicitation; Software Acquisition Planning.
Level 3 (Defined), focus on process standardization: Training Program Management; Acquisition Risk Management; Contract Performance Management; Project Performance Management; User Requirements; Process Definition and Maintenance.
Level 4 (Quantitative), focus on quantitative management: Quantitative Acquisition Management; Quantitative Process Management.
Level 5 (Optimizing), focus on continuous process improvement: Acquisition Innovation Management; Continuous Process Improvement.
Moving up the levels brings higher quality, higher productivity, and lower risk.
Relation to CMMI PAs
The SA-CMM KPAs (Software Acquisition Planning; Project Management; Solicitation; Contract Tracking and Oversight; Requirements Development and Management) relate to these CMMI Process Areas: Project Planning; Project Monitoring and Control; Integrated Supplier Management; Risk Management; Requirements Development; Requirements Management; Verification; Validation; Configuration Management; Decision Analysis and Resolution; Organizational Training.
[Ferguson 03]
Maturity Matching Considerations
(diagram) Plotting the acquirer's management capability/maturity against the supplier's (developer's) capability/maturity gives four quadrants:
• low acquirer, low supplier: Disaster. Constant crises; no requirements management, no risk management, no discipline, no process . . . no product.
• high acquirer, low supplier: Mismatch. The mature buyer must mentor the low-maturity developer; the outcome is not predictable.
• low acquirer, high supplier: Mismatch. "The customer is always right" hurts; the customer encourages "short cuts."
• high acquirer, high supplier: Matched Team, the highest probability of success. Match of skills and maturity, team risk approach, execution to plan, measurable performance, quantitative management.
[Barbour 03]
Focusing In
Key points:
• trends in contracting
• common problems and issues faced when contracting
• common view of the roles and responsibilities of an acquirer
• role of reference models
What's in sight:
• measurement and analysis techniques
In the distance:
• an illustration of these techniques at work
Benefits of Using Measures
Measurement by itself does not control or improve; it gives insight for objectively planning, managing, and communicating.
• historical data help us predict and plan
• actual versus plan data help us determine progress and support decision making
• analyzing trends helps us identify and focus on problem areas
• project data provide a basis for objective communication
Measurement in CMMI Process Areas
Project Management
• Project Planning, Project Monitoring and Control, Software Acquisition Management
• Integrated Project Management, Risk Management, Integrated Supplier Management
• Quantitative Project Management
Process Management
• Organization Process Focus, Organization Process Definition
• Organization Process Performance
• Organization Innovation and Deployment
Engineering: all process areas
Support
• Measurement and Analysis, Process and Product Quality Assurance
• Decision Analysis and Resolution
• Causal Analysis and Resolution
[DZ-P 03]
Measurement in CMMI Generic Practices
"Monitor and control the process against the plan and take appropriate corrective action." (GP 2.8)
"Collect work products, measures, measurement results, and improvement information derived from planning and performing the process to support the future use and improvement of the organization's processes and process assets." (GP 3.2)
Two uses of measurement:
• project management
• process improvement
As the organization matures, the sophistication and uses of measurement increase.
[DZ 02]
Measurement in SA-CMM
Maturity Levels 2-5: status of
- processes
- products
Maturity Levels 4-5: effectiveness of
- processes
- products
Acquisition Enterprise Measurement
Execution of a contracted project also involves
• legal processes
• financial processes
While this tutorial does not explore these aspects of contracting, each aspect is measurable and can be quantitatively managed.
Sources for Measures
(diagram) Goal-Driven (Software) Measurement (GQIM): goals lead to questions, questions to indicators, and indicators to measures. The user defines the indicators and measures, based on what is needed to manage the user's goals and on the decisions and decision criteria related to managing those goals.
Practical Software & Systems Measurement (PSM): common issue areas, measurement categories, and measures are all predefined.
Goal-Driven Measurement (GDM)
When using goal-driven measurement, the primary question is NOT:
"What metrics should I use?"
Rather, it is:
"What do I want to know or learn?" "What decision do I want to make?"
Goal-driven measurement is NOT based on a predefined set of metrics.
[GQIM 96]
Goal-Driven Software Measurement (GDM)
(diagram) Starting from "What do I want to know or learn?", business goals are refined through sub-goals into measurement goals. Goals lead to questions, questions to indicators, and indicators to measures such as SLOC, staff-hours, trouble reports, and milestone dates. Each indicator is documented in an indicator template (objective, question, inputs, algorithm, assumptions); measures are captured in definition checklists and a metrics handbook; and the results feed analysis and diagnosis, infrastructure assessment, and action plans involving senior management and the team.
[GQIM 96]
Practical Software & Systems Measurement (PSM)
This measurement process is funded by the DoD and is freely available at http://www.psmsc.com.
The PSM process identifies project-specific issues:
• issues are grouped into common software issue areas
• measurement categories correspond to issue areas
• each measurement category has a candidate set of proven measures
Measures are selected based on availability, environment, and other factors.
[PSM 00]
PSM Common Software Issues and Measurement Categories
Schedule and Progress: Milestone Performance; Work Unit Progress; Incremental Capability
Product Size and Stability: Product Size and Stability; Functional Size and Stability
Product Quality: Functional Correctness; Supportability/Maintainability; Efficiency; Portability; Usability; Dependability/Reliability
Process Performance: Process Compliance; Process Efficiency; Process Effectiveness
Resources and Cost: Personnel; Financial Performance; Environment Availability
Technical Effectiveness: Technology Suitability; Impact; Technology Volatility
Customer Satisfaction: Customer Feedback; Customer Support
[PSM 00]
Modified Indicator Template
(diagram) The indicator template documents each indicator: name/title, date, objective, questions, visual display, perspective, inputs and data elements, definitions, data collection (how, when/how often, by whom), data reporting (responsibility for reporting, by/to whom, how often, forms), algorithm, assumptions, analysis, interpretation, probing questions, evolution, feedback guidelines, and cross-references.
Additional modifications by clients:
• streamlined data collection and reporting sections using "swimlane" diagrams
• addition of "corrective action guidelines"
• subprocess selection (for CMMI)
The indicators support the measurement process steps: establish measurement objectives, specify measures, specify data collection procedures, collect data, specify analysis procedures, analyze data, and communicate results.
[GQIM], [DZ 02]
Roll-up for Higher Management
(diagram) A goal has success criteria, a strategy to accomplish it, and tasks that carry out the strategy. Three indicator classifications are attached, for the project manager:
• Success indicators: Have the goals been achieved? What is the impact of the tactics? (GQ(I)M)
• Progress indicators: How well are plans proceeding? Typically planned vs. actual over reporting periods. (GQ(I)M)
• Analysis indicators: What are the results of specific tasks? For example, percent of test cases complete by function. (PSM; PSM and GQ(I)M)
Selected indicators are rolled up for higher management.
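To make the roll-up mechanics concrete, here is a minimal sketch in Python of aggregating task-level progress indicators into a single goal-level figure for higher management; the task names, percentages, and the effort-weighted scheme are illustrative assumptions, not part of the tutorial.

```python
# Minimal sketch: roll task-level progress indicators up to a goal-level figure.
# Task names, percentages, and the effort-weighted scheme are illustrative assumptions.

tasks = [
    {"name": "Task 1", "planned_effort": 40, "percent_complete": 100},
    {"name": "Task 2", "planned_effort": 60, "percent_complete": 75},
    {"name": "Task 3", "planned_effort": 20, "percent_complete": 50},
]

def goal_progress(tasks):
    """Weight each task's percent complete by its planned effort."""
    total_effort = sum(t["planned_effort"] for t in tasks)
    earned = sum(t["planned_effort"] * t["percent_complete"] / 100 for t in tasks)
    return 100 * earned / total_effort

if __name__ == "__main__":
    print(f"Goal-level progress: {goal_progress(tasks):.1f}%")  # 79.2% for the sample data
```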
Data Analysis Dynamics
Getting started:
• identify the goals
• take a black-box process view
• Is the data right?
• Do I have the right data?
Decision point: if the data is not perfect, do I move forward or obtain better data?
Initial evaluation:
• What should the data look like?
• What does the data look like?
• Can I characterize the process, product, problem?
Decision point: Can I address my goals right now? Or is additional analysis necessary, at the same or a deeper level of detail? Can I move forward?
Moving forward:
• further evaluation
• decompose the data and the process
Decision point: Do I take action? What action do I take?
Repeat until the root cause is found and the process is at target with the desired variation.
[DAD 03]
Performance Analysis Model
(diagram) The model relates seven analysis areas: Schedule and Progress, Resources and Cost, Growth and Stability, Product Quality, Development Performance, Technical Adequacy, and Customer Satisfaction.
[PSM 00]
Performance Analysis Checklist 1
Single indicator issues:
• Do actual trends correspond to planned trends, such as progress, growth, and expenditures? How big is the variance?
• Does the variance appear to be gradually growing each month?
• Are actual values exceeding planned limits, such as open defects, changes, and resource utilization?
• Are outliers or other anomalies affecting the results?
[PSM 00]
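One possible way to operationalize these single-indicator checks is sketched below; the monthly data and the 10% limit are invented for illustration.

```python
# Illustrative sketch of the single-indicator checks: variance size, growth, and limit breaches.
# The data and the 10% limit are invented for demonstration.

planned = [10, 20, 30, 40, 50, 60]   # e.g. cumulative planned progress per month
actual  = [ 9, 18, 26, 33, 40, 46]

variance = [a - p for p, a in zip(planned, actual)]
print("Monthly variance (actual - planned):", variance)

# Is the variance gradually growing each month?
growing = all(abs(variance[i + 1]) >= abs(variance[i]) for i in range(len(variance) - 1))
print("Variance growing month over month:", growing)

# Are actual values exceeding a planned limit (here, more than 10% off plan)?
limit = 0.10
breaches = [i for i, (p, a) in enumerate(zip(planned, actual)) if abs(a - p) > limit * p]
print("Months exceeding the 10% limit:", breaches)
```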
Performance Analysis Checklist 2
Integrated indicator issues:
• Is the source of the problem evident?
- change in functionality, unplanned rework, etc.
• Are growing problems in one area a leading indicator of other problems later in the project?
- requirements creep impact on schedule
• Do multiple indicators lead to similar conclusions?
- lack of progress correlates with low staffing
• Does other project information contradict performance results?
- milestones being met but open defect counts are increasing
[PSM 00]
Focusing In
Earlier:
• trends, roles, models
Key points:
• measurement in maturity models
• three indicator types: success, progress, analysis
• comparing PSM and GQIM
• the Performance Analysis Model
What's in sight:
• an illustration of these methods at work
Composite Illustration*
This illustration is based on an organization that is
• maintaining an existing product, a blend of COTS and internally developed code
• pursuing the acquisition of a replacement product
Their acquisition includes two contracts:
• requirements development
• product design, code, and test
This illustration will focus on
• evaluating requirements document quality (contract 1)
• analyzing project execution data (contract 2)
It will briefly highlight other aspects of acquisition measurement.
*This illustration is a composite of two projects. Aspects from other projects have been interwoven for demonstration purposes.
Illustration: Goal Structure
(diagram) The top-level goal, Meet Customers' Needs, is supported by goals that each have a designated owner:
• Stabilize Current Systems and Improve Product Delivery (provide "whole product" support; improve product field performance): internal project manager and engineers (internal development)
• Engineer the Future Systems and Deliver Future Systems: program manager(s) and contractor(s) (contracted projects)
• Establish Acquisition Processes, Stabilize Software Engineering Processes, and Develop a quality team (right people, right time, right job): SEPG and organizational managers
Roles and Information Exchange
(diagram) The same acquirer and supplier/developer exchange as before, now showing the acquirer's SEPG/SAPG and the supplier's SEPG. The acquirer (pre-award: RFP preparation, contract award; post-award: monitor project progress, evaluate deliverables) passes functional and quality requirements across the contractual handshake. The supplier/developer, with subcontractors, develops, customizes, and integrates systems, software, and COTS, returning status information, interim documents and tangibles, and deliverables, and receiving directions and corrections.
Illustration: Indicators for the Goal "Establish Acquisition Processes"
(diagram) The goal has success criteria, a strategy to accomplish it, and tasks that carry out the strategy. Three indicator classifications are attached:
• Success indicators (Have the goals been achieved? What is the impact of the tactics?): process owners, training, CM, and documentation (future: procedural adherence). The "Status of Software Engineering Processes" chart tracks, for Jan-03 through Apr-03, the percentage of processes with an owner identified, documented, and under SCM, against the total number of processes/activities (54, 54, 60, 62).
• Progress indicators (How well are plans proceeding?): start and finish dates with progress noted (moving toward EV). As of 29 July, with plan start, plan finish, actual start, actual finish, and days late*:
- Identify owner of inspection process: 1-Jan, 12-Jan; 1-Jan, 12-Jan; 0 days late
- Enhance inspection process per ISO 12207, design it: 13-Jan, 28-Feb; 13-Jan, 28-Feb; 0 days late
- Enhance inspection process per ISO 12207, document it: 1-Mar, 31-Mar; 1-Mar, 31-Mar; 0 days late
- Review and update of new inspection: 1-Apr, 30-Apr; 1-Apr, 30-Apr; 0 days late
- Establish configuration mgmt, change mgmt procedures for new inspection process: 1-May, 15-May; 1-May, 15-May; 0 days late
- Train personnel on new inspection process: 16-May, 30-Jun; 16-May, 10-Jul; 10 days late
- Establish process for routine monitoring of procedural adherence: 16-May, 30-Jun; started 31-May, in progress; 15 days late
- Create data storage mechanisms to hold success measures: 16-May, 30-Jun; started 30-Jun, in progress; 45 days late
- Create data storage mechanisms to hold inspection data: 16-May, 30-Jun; started 30-Jun, in progress; 45 days late
*For this example: days late = actual finish - plan finish if the task is completed; days late = actual start - plan start if the task is in progress.
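The days-late rule above can be computed mechanically; the following minimal sketch uses Python's standard datetime module, with dates taken from the table and the year (2003) assumed.

```python
# Sketch of the days-late rule from the progress indicator table:
# days late = actual finish - plan finish for completed tasks,
# days late = actual start  - plan start  for tasks still in progress.
# Dates mirror two rows of the table; the year (2003) is an assumption.
from datetime import date

def days_late(plan_start, plan_finish, actual_start, actual_finish=None):
    if actual_finish is not None:                      # task completed
        return (actual_finish - plan_finish).days
    return (actual_start - plan_start).days            # task in progress

print(days_late(date(2003, 5, 16), date(2003, 6, 30),
                date(2003, 5, 16), date(2003, 7, 10)))   # completed: 10 days late
print(days_late(date(2003, 5, 16), date(2003, 6, 30),
                date(2003, 6, 30)))                      # in progress: 45 days late
```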
• Analysis indicators (What are the results of specific tasks?): requirements completeness, original, at inspection, and approved (for contract 1). The "Requirements Completeness by function, as process proceeds" chart shows, for each function (Data, Document, Records, Reports, User Interface) at the original, first-inspection, and approved stages, how many requirements are complete vs. to be determined.
The strategy to accomplish the goal includes:
• reference models: CMMI, SA-CMM, IEEE/ISO 12207
• leverage CMMI capabilities built in engineering: MA, REQM, RD, CAR
• aim for CMMI capability in selected PAs: SAM, DAR, RSK, PP/PMC, CM, PPQA
• reference all SA-CMM Level 2 KPAs, noting overlaps with CMMI
• implement a requirements management process
• tailor existing project monitoring processes for acquisition managers
• ...
Selected indicators feed a middle management dashboard (selected SPI plan EV data; system documentation and testing) and a senior management dashboard and scorecard (quality trends; selected project EV data).
Evaluate Quality of Deliverables
(diagram) As shown earlier: documents to review (SRD, SDP, Measurement Plan, SDD) and final deliverables pass through the acquirer's inspection or review process against the acquirer's evaluation criteria, yielding indicators about products and process. Measurable results (examples) include defects discovered (description, severity, class, type), the size of the work product, the effort invested in the inspection process, and the time spent during the inspection activities.
Requirements Development & Management (SA-CMM RDM)
Purpose: to establish a common understanding of the software requirements by the acquisition project team, the end user, and the contractor.
• includes both technical and non-technical requirements
• involves development of the requirements and management of any changes
• starts with a description of an operational need and ends with transfer of responsibility to the maintainer
RDM: Measurement Opportunities
(diagram) RDM spans development of requirements (contract 1 in this illustration) and management of requirements (contract 2 in this illustration). Each offers process measures (for example, planned vs. actual total effort by reporting period) and product measures (for example, trouble reports per module over the weeks).
Illustration: Requirements Process Flow
(diagram, swimlanes: End User, Acquirer, Contractor 1, Contractor 2) The end user develops user scenarios and product specifications. Contractor 1 develops the requirements; the acquirer inspects them. If they are not approved, the requirements are refined and re-inspected; once approved, they are baselined. The requirements are then managed by the acquirer and by contractor 2.
Requirements Process Measures
Process measures:
• effort expended
• funds expended
• progress toward completion
• completion of milestones
• number of change requests processed (post-development)
For the contractor, these are measures of the development process.
For the acquirer, these are measures of the inspection process.
Requirements Document Measures and Evaluation Criteria
"Inch" or "thickness" criterion
• document is at least three inches thick
"Drop it" or "thud" criterion
• related to the inch criterion
• must produce a specific level of sound before it is accepted
Format
• pretty pictures
• in color
Not!
Requirements Document: Effective Evaluation Criteria
Examples of measurements for evaluation criteria:
• completeness: "TBD" requirements; product performance measures included
• consistency: no conflicts across document sections
• clarity: growth in issues; presence of ambiguous language or words with many meanings
• conformity: meets stated criteria, constraints
• correctness: all data fields in valid ranges
The contract should contain the evaluation criteria.
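As a rough illustration of how a few of these criteria can be checked automatically (in the spirit of the tools discussed later in this section), the sketch below counts "TBD" markers and flags ambiguous wording in requirement statements; the word list and sample text are assumptions, not a validated checklist.

```python
# Illustrative check of two evaluation criteria: completeness ("TBD" requirements)
# and clarity (ambiguous language). The ambiguous-word list and sample text are invented.
import re

AMBIGUOUS = {"appropriate", "adequate", "as required", "etc", "fast", "flexible",
             "user-friendly", "several", "timely"}

def screen_requirements(text):
    statements = [s.strip() for s in text.splitlines() if s.strip()]
    tbd_count = sum("TBD" in s for s in statements)
    flagged = [s for s in statements
               if any(re.search(r"\b" + re.escape(w) + r"\b", s, re.IGNORECASE)
                      for w in AMBIGUOUS)]
    return {"statements": len(statements), "tbd": tbd_count, "ambiguous": flagged}

sample = """The system shall respond to queries in a timely manner.
The report format is TBD.
The database shall store at least 10,000 records."""
print(screen_requirements(sample))
```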
Illustration: Requirements Indicator
(chart) "Requirements Completeness by function, as process proceeds": for each function (Data, Document, Records, Reports, User Interface), the number of requirements that are complete vs. to be determined is shown at three points: the original submission, the first inspection, and approval.
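A minimal sketch of how such a completeness indicator could be tallied from requirement records follows; the record format, function names, and stage labels are invented for illustration.

```python
# Sketch: tally "complete" vs. "TBD" requirements by function and stage.
# The requirement records below are invented; real data would come from the requirements database.
from collections import Counter

requirements = [
    {"function": "Data",    "stage": "original", "status": "TBD"},
    {"function": "Data",    "stage": "original", "status": "complete"},
    {"function": "Data",    "stage": "approved", "status": "complete"},
    {"function": "Reports", "stage": "original", "status": "TBD"},
    {"function": "Reports", "stage": "approved", "status": "complete"},
]

tally = Counter((r["function"], r["stage"], r["status"]) for r in requirements)
for (function, stage, status), count in sorted(tally.items()):
    print(f"{function:8s} {stage:9s} {status:9s} {count}")
```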
Practical Issues
The organization or program/project office may have several barriers to effective document inspection, such as
• insufficient quantity/availability of personnel
• insufficient technical or domain knowledge
• schedule constraints
Example: if you have a 300-page requirements document and typically inspect at a rate of 2 hrs/page, are there resources available to invest 600 hours to inspect that document?
Advancing the State of Requirements Product Measures
(diagram) Manual review, in which requirements documents flow through an inspection or review process against evaluation criteria, is a lengthy, labor-intensive process. Automated tools applied to the same documents can reduce cycle time and effort while producing better results than possible with tedious manual review.
Examples of tools:
• Quality Analyzer for Requirements Specifications (QuARS): lexical, syntactic, and semantic analyses of requirements
• Automated Requirements Measurement (ARM)
Quality Analyzer for Requirements Specifications (QuARS)
How does it work?
• natural language analysis of requirements text
• lexical: vague, weak, optional, subjective, and other terms
• syntactic: multiple, implicit, under-specified statements
• semantic:
- allows screening for consistency, completeness, etc.
- arbitrary combinations of domains, components, functionality, product quality attributes, and so on
Automated Requirements Measurement (ARM)
Checks for desirable requirements characteristics such as:
• complete: precisely defines all real-world situations
• consistent: no conflict between individual requirements
• correct
• modifiable
• ranked
• traceable
• unambiguous: can be interpreted in only one way
• understandable: the meaning of each of its statements is easily grasped by all of its readers
• verifiable
• validatable: by individuals and organizations having a vested interest
• testable
Focusing In
Earlier:
• trends, roles, models
• measurement methods
Key points:
• quality of deliverables
• effective evaluation criteria
• measuring requirements development (contract 1)
• tools for analyzing requirements
What's in sight:
• monitoring and oversight: evaluating a schedule slip (contract 2)
• What would YOU include in the contract?
Monitoring & Oversight
Contract #2 has been awarded.
• the supplier is developing the product in two builds
The contractor has just notified you that the project has both cost and schedule slippage.
What do you do?
Performance Analysis Model
(diagram) Use the model (Schedule and Progress, Resources and Cost, Growth and Stability, Product Quality, Development Performance, Technical Adequacy, Customer Satisfaction) to guide the analysis.
• Step 1: confirm the problem (cost & schedule slippage)
Schedule & Progress Indicators
(charts) As of Jul 03:
• An earned value chart plots planned value (PV), actual cost (AC), and earned value (EV) by month across 2002-2003. Cost variance = EV - AC; schedule variance = EV - PV.
• A Gantt chart shows the life-cycle activities: Est. Requirements, Architecture, Build 1 (Design, Assembly & Test, Integration & Verification), Build 2 (Design, Assembly & Test, Integration & Verification), Product Integration, and Formal Qualification Test.
• A "Components Completing A&T" chart compares planned vs. actual component completions during Build 1 and Build 2 assembly and test (A&T = Assembly and Test).
Tool tips: the top two charts were made in Excel and manually manipulated; the Gantt chart can be generated using any scheduling software.
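The variances called out on the earned value chart follow directly from the standard formulas; the sketch below, with invented to-date figures, computes cost and schedule variance and the corresponding performance indices.

```python
# Earned value sketch: cost variance = EV - AC, schedule variance = EV - PV,
# plus the usual performance indices. The monthly figures are invented.

pv = 1200.0   # planned value (budgeted cost of work scheduled) to date
ac = 1350.0   # actual cost of work performed to date
ev = 1000.0   # earned value (budgeted cost of work performed) to date

cost_variance = ev - ac          # negative: over cost
schedule_variance = ev - pv      # negative: behind schedule
cpi = ev / ac                    # cost performance index
spi = ev / pv                    # schedule performance index

print(f"CV = {cost_variance:.0f}, SV = {schedule_variance:.0f}, "
      f"CPI = {cpi:.2f}, SPI = {spi:.2f}")
```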
What We Learned
From Schedule and Progress indicators:
• cost and schedule slippage (EV chart)
• activities taking longer than planned (Gantt chart)
• assembly and test behind schedule (components completion chart)
What does this mean?
• confirms we have a problem
Resources and Cost Indicators
Analysis/probing questions (next area in the Performance Analysis Model: Resources and Cost):
• Is the staff allocation contributing to the problem (too many, too few, wrong time frame)?
• What is the rate of staff turnover?
• How does actual staff compare to planned staff allocation?
Resources and Cost Indicators
(chart) Planned vs. actual number of staff by month, Jan 2002 through 2003, with turnover of key personnel and others marked. Supporting data by month:
• programmers, plan: 2 3 3 3 9 9 11 12 22 23 23 22 22 22 22 15; actual: 3 3 4 5 5 5 6 9 9 11 11 15 18 19 19 20 20 30 30
• testers, plan: 1 1 1 3 3 2 2 5 5 7 8 8 8 8 5; actual: 3 3 3 3 3 3 3 3 5 5 5 5 5 5 5 5 5 5 5
Tool tip: this chart was made in Excel and manually manipulated.
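Using the programmer staffing numbers above, a small sketch shows one way to surface the pattern the acquirer noticed (over-staffed early, a late surge); alignment of the two series to calendar months, and truncation of the actual series to the planned horizon, are assumptions.

```python
# Sketch: compare planned vs. actual programmer staffing month by month.
# Month alignment is assumed; the actual series is truncated to the first 16 months.

plan   = [2, 3, 3, 3, 9, 9, 11, 12, 22, 23, 23, 22, 22, 22, 22, 15]
actual = [3, 3, 4, 5, 5, 5, 6, 9, 9, 11, 11, 15, 18, 19, 19, 20]

for month, (p, a) in enumerate(zip(plan, actual), start=1):
    delta = a - p
    flag = "over" if delta > 0 else "under" if delta < 0 else "on plan"
    print(f"month {month:2d}: plan {p:2d}, actual {a:2d}, {flag} by {abs(delta)}")
```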
What We Learned
From Resources and Cost indicators:
• staffing did not follow the planned level
- too many at the beginning of the project
- testers and programmers were used to fill in for analysts and designers => high re-training costs
- high turnover rate => training and getting-up-to-speed costs
What does this mean?
• the cost overrun is due partly to staffing problems
Growth and Stability Indicators
Analysis/probing questions:
• Are the requirements stable?
• What is the code growth?
• Is functionality being transferred from build 1 to build 2? If so, how does this affect the delivery date?
Requirement Changes Information
(chart) Added and total requirements by month, 2002 through Jul 2003, split by Build 1 and Build 2 (as of Jul 03). Requirement change details by month (Feb-Sep and Dec 2002; Mar and Jul 2003):
• requirement changes: 10, 11, 6, 14, 10, 8, 4, 14, 4, 1, 5
• complexity: low in every month
• resources (staff-days): 4, 5, 3, 2, 4, 2, 3, 3, 2, 1, 2
Tool tip: this chart can be generated in Excel followed by manual editing using the drawing toolbar.
Growth and Stability Indicators
(charts) As of Jul 03:
• Size growth: planned vs. actual lines of code (in 100K) from Jan 02 through Sep 03, for Build 1 and Build 2.
• Requirements per build: for Build 1 and Build 2, the number of requirements in the original plan, in Replan 1, actually done, and not done.
Contractor's explanation: functions were deferred to the later build because of unanticipated complexity.
Tool tip: this chart was made in Excel and manually manipulated.
What We Learned
From Growth and Stability indicators:
• requirement changes are of low complexity but will have some ripple effect
• code production is below the planned value
• functionality being deferred from build 1 to build 2 is attributed by the contractor to unanticipated complexity
What does this mean?
• expect further cost and schedule growth due to low code production and the increased number of functions to be implemented in Build 2
• expect an impact on the completion date due to functions deferred to Build 2
• expect the possibility of a "Build 3" proposal
Product Quality Indicators
Analysis/probing questions:
• Are the defined processes being followed?
• What is the rate of closure for trouble reports?
• What types of trouble reports are being detected? In what phase?
Product Quality Indicators
(charts) As of Jul 03:
• "STR Status: Open vs. Closed": cumulative open and closed software trouble reports (STRs) by month across Builds 1 and 2, with the delta between them.
• "Open and Closed STRs": the 200 open and 400 closed STRs tallied on Jul 03, broken out by priority from 1 (showstopper) to 5 (nit).
Tool tip: this chart was made in Excel and manually manipulated.
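One way to produce the open-vs.-closed counts behind this indicator is sketched below; the trouble-report records and field names are invented for illustration.

```python
# Sketch: cumulative open vs. closed software trouble reports (STRs) by month,
# and the delta between them. The STR records and field names are illustrative.

strs = [
    {"id": 1, "opened": "2003-01", "closed": "2003-02"},
    {"id": 2, "opened": "2003-01", "closed": None},
    {"id": 3, "opened": "2003-02", "closed": "2003-04"},
    {"id": 4, "opened": "2003-03", "closed": None},
]

months = ["2003-01", "2003-02", "2003-03", "2003-04"]
for month in months:
    opened = sum(s["opened"] <= month for s in strs)
    closed = sum(s["closed"] is not None and s["closed"] <= month for s in strs)
    print(f"{month}: opened {opened}, closed {closed}, still open {opened - closed}")
```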
Classifying Trouble Report Defects
(chart) Percent of defects by defect type: documentation, syntax, build/package, assignment, interface, checking, data, function, system, environment. Several of these are types that code inspections would have been expected to catch.
What We Learned
From Product Quality indicators:
• STRs are being opened faster than they are being closed
• code inspections should have found several of the reported defect types
What does this mean?
• the code inspection process allowed a large number of defects to slip through
Development Performance Indicators
Analysis/probing questions:
• Are the defined processes being followed?
• Are any defined processes being skipped?
Development Performance Indicators
(charts) As of Jul 03:
• "Process Compliance" over time: the percentage of compliance by month from Jan 02 through 2003, for Build 1 and Build 2.
• Process compliance by process (% completion): requirements analysis, preliminary design, detailed design, inspections, code & unit testing, configuration management, functional integration & testing, CSCI integration & testing, SQA, documentation, and rework.
Tool tip: this chart was made in Excel and manually manipulated.
What We Learned
From Development Performance indicators:
• adherence to the defined process decreased over time
• the contractor stopped doing inspections
What does this mean?
• defects usually detected during code inspections were allowed to slip through
• there is an impact on cost and schedule due to rework
Reasons for Slippage
Staffing problems:
• too many staff at the beginning of the project
• below the planned level during most of development
- noting that productivity increased dramatically
• high turnover rate
Process compliance:
• stopped doing inspections
• allowed errors to leak to later phases
Requirements changes after Build 2 code and unit test
Conclusion:
• expect further cost and schedule growth due to low code production and the increased number of functions to be implemented in Build 2
Possible Actions
Developer actions:
• replan based on current performance
• get staffing under control
- verify the skills balance of resources
- do not decrease staffing to conform to "planned" staffing, particularly if that would decrease the number of programmers
• restart inspections
- code
- test cases
Acquirer decision options:
• use contract labor (additional costs)
• deliver a smaller size with less functionality
• accept the schedule slip
Focusing In
Earlier:
• trends, roles, models
• measurement methods
• evaluating deliverables
Key points:
• use the Performance Analysis Model as a causal analysis navigation aid
• always use multiple indicators
• couple data analysis with knowledge of your and your contractor's processes
What's in sight:
• What would YOU include in the contract?
• how to communicate using your measures
Writing Your Contract
Performance-based contracting:
• contractors are paid based on how they meet predefined metrics
General tips:
• consider project, product, and process measures
• specify the frequency of reporting
• specify target performance where known
- the "SMART" approach applies: specific, measurable, attainable, realistic, timely
Discussion: Write Your Contracts!
For the two-contract illustration just reviewed, what measures would YOU request in the contracts?
Which measures do you think would be readily available (or not)?
AnalysisIndicatorsWhat areresults ofspecific tasks?
SuccessIndicatorsHave the goalsbeen achieved?What is the impactof the tactics?
Tasks to Accomplish goal
Success Criteria
Goal: Establish Acquisition
Processes
Strategy to accomplish goal
ProgressIndicatorsHow well areplansproceeding?
Success Indicatorsprocess owners, training,CM, and documentation(future: procedural adherence)
Status of Software Engineering Processes
55.0
0%
62.0
0%
58.0
0%
70.0
0%
40.0
0%
45.0
0%
40.0
0% 55.0
0%
10.0
0%
25.0
0%
22.0
0%
25.0
0%
54 54
60
62
0.00%
10.00%
20.00%
30.00%
40.00%
50.00%
60.00%
70.00%
80.00%
Jan-
03
Feb-0
3
Mar-0
3
Apr-03
Per
cen
tag
e C
om
ple
ted
50
52
54
56
58
60
62
64
# o
f Pro
cess
es
Ow ner Identif ied
Documented
Under SCM
Total # Processes/Activities
Progress Indicatorsstart, finish dateswith progress noted(move toward EV)
TODAY = 29 JULYPlan Start
Plan Finish
Actual Start
Actual Finish
Days Late*
Identify owner of inspection process 1-Jan 12-Jan 1-Jan 12-Jan 0Enhance inspection process per ISO12207 -- Design it 13-Jan 28-Feb 13-Jan 28-Feb 0 -- Document it 1-Mar 31-Mar 1-Mar 31-Mar 0Review and update of new inspection 1-Apr 30-Apr 1-Apr 30-Apr 0
Establish configuration mgmt, change mgmt procedures for new inspection process 1-May 15-May 1-May 15-May 0Train personnel on new inspection process 16-May 30-Jun 16-May 10-Jul 10Establish process for routine monitoring of procedural adherence 16-May 30-Jun 31-May 15Create data storage mechanisms to hold success measures 16-May 30-Jun 30-Jun 45Create data storage mechanisms to hold inspection data 16-May 30-Jun 30-Jun 45
For this example:Days late = actual finish - plan finish if task completedDays late = actual start - plan start if task in progress
• Reference models: CMMI, SA CMM,IEEE/ISO 12207
• Leverage CMMI capabilities built inengineering: MA, REQM, RD, CAR
• Aim for CMMI capability in selected PAs:SAM, DAR, RSK, PP/PMC, CM, PPQA
• Reference all SA-CMM Level 2 kPAs,noting overlaps with CMMI
• Implement requirements managementprocess
• Tailor existing project monitoring processesfor acquisition managers
• …..
Middle Mgmt Dashboard• selected SPI plan EV data
Sr. Mgmt dashboard• quality trends• selected project EV data
Middle Mgmt dashboard• system documentation and
testing
Sr. Mgmt scorecard ;Middle Mgmt dashboard
Analysis IndicatorsReqts completeness –original, at inspection,approved (for contract 1)
Requirements Completeness by function, as process proceeds
[Chart: counts of requirements Complete vs. To Be Determined for Data, Document, Records, Reports, and User Interface requirements, at original, inspection 1, and approved stages]
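A small sketch of how such a completeness indicator might be tallied by function and inspection stage follows. The functions and stages come from the chart; the counts are illustrative assumptions only.

```python
# Hypothetical sketch: requirements completeness by function as the process
# proceeds (original -> inspection 1 -> approved). Counts are illustrative.
completeness = {
    # function: {stage: (complete, to_be_determined)}
    "Data":      {"orig": (30, 20), "insp1": (38, 12), "approved": (45, 5)},
    "Documents": {"orig": (25, 15), "insp1": (31, 9),  "approved": (36, 4)},
    "Records":   {"orig": (20, 18), "insp1": (28, 10), "approved": (34, 4)},
}

for function, stages in completeness.items():
    for stage, (complete, tbd) in stages.items():
        pct = 100.0 * complete / (complete + tbd)
        print(f"{function:9s} {stage:9s}: {complete:2d} complete, {tbd:2d} TBD ({pct:.0f}% complete)")
```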
Illustration: Goal Structure
Goals (each with a designated owner):
• Stabilize Current Systems
• Improve Product Delivery
  - provide “whole product” support
  - improve product field performance
• Meet Customers’ Needs
• Engineer the Future Systems
• Deliver Future Systems
• Establish Acquisition Processes
• Stabilize Software Engineering Processes
• Develop a quality team (right people, right time, right job)
Illustration: Success Indicators
Establish Acquisition Processes
Two key success indicators (excerpted from indicator templates):
• status of ownership, training, documentation, configuration management of processes (evolve into procedural adherence)
• status of training, using ISO 12207 to group processes
After processes established, monitor sustainment or adherence:
• use appraisal and/or audit results
Status of Software Acquisition Processes
[Chart: percentage completed (owner identified, documented, under CM) and total number of processes/activities (54, 54, 60, 62), Jan-03 through Apr-03]
Process Activity Training (Personnel Performing the Processes)
[Chart: total number of personnel trained vs. to be trained, by process area: Operations, Maintenance, Documentation, Configuration Mgmt, Quality Assurance, Verification, Validation, Joint Review, Audit, Problem Resolution]
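The tally behind a training-status indicator like the one above can be kept very simply. In this sketch the process areas come from the chart's axis, while the counts are illustrative assumptions.

```python
# Hypothetical sketch: personnel trained vs. to be trained per ISO 12207
# process area. Counts are illustrative, not read from the chart.
training = {
    "Operations":         {"trained": 2, "to_be_trained": 2},
    "Maintenance":        {"trained": 0, "to_be_trained": 3},
    "Documentation":      {"trained": 2, "to_be_trained": 1},
    "Configuration Mgmt": {"trained": 1, "to_be_trained": 1},
    "Quality Assurance":  {"trained": 2, "to_be_trained": 2},
}

for area, counts in training.items():
    total = counts["trained"] + counts["to_be_trained"]
    print(f"{area:18s}: {counts['trained']} of {total} personnel trained")
```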
Illustration: Senior Management Reporting
Required contractor metrics reported by all programs:
• size growth
• workforce size and qualifications
• selected earned value (EV)
• quality trends
• requirements fulfillment
Required acquirer metrics reported by all programs
Outline
Context
• state of the community
• changing perspectives
Background
• roles & responsibilities
• maturity models
• measurement & analysis methods
Scenario
• goal-setting and success, progress, analysis indicators
• inspecting the quality of deliverables: requirements
• monitoring and oversight: progress analysis
• measurement in the contract
• communicating with integrated measures
Summary
Roles and Information Exchange
Acquirer
• pre-award activities: RFP prep., contract award
• post-award activities: monitor project progress, evaluate deliverables
Supplier / Developer
• develop, customize, integrate: systems, software, COTS
• passes functional & quality requirements on to subcontractors
Contractual handshake: functional & quality requirements
From supplier to acquirer: status information; interim documents, tangibles; deliverables
From acquirer to supplier: directions, corrections
Measuring Project, Product, Process
Acquirer (pre-award and post-award activities) and supplier (develop, customize, integrate: systems, software, COTS) exchange indicators / information across the contractual handshake for tracking, monitoring, direction, etc.
What to measure:
• Processes
• Products: supplier produced; acquisition organization produced
• Project: schedule, cost, requirements satisfaction, quality (amount of rework), each reported as status, projection, and trend
• Relationship: roles / changes; invoicing / payment
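One way to picture the indicators exchanged across the contractual handshake is as simple status records that carry a status, projection, and trend for each measure, as the slide suggests. The sketch below is illustrative; every field name and value is an assumption rather than a prescribed format.

```python
# Hypothetical sketch of an indicator exchanged between supplier and acquirer:
# each project/product/process/relationship measure carries status, projection,
# and trend information.
from dataclasses import dataclass

@dataclass
class ExchangedIndicator:
    dimension: str   # "project", "product", "process", or "relationship"
    measure: str     # e.g., "schedule", "cost", "requirements satisfaction"
    status: str      # current value or state
    projection: str  # expected value at completion
    trend: str       # direction over recent reporting periods

schedule = ExchangedIndicator(
    dimension="project",
    measure="schedule",
    status="milestone 3 complete, 10 days late",
    projection="delivery slips two weeks without corrective action",
    trend="slippage growing over the last three reports",
)
print(schedule)
```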
Summary – Focus Points
Key acquisition responsibilities (after contract award):
• monitoring and oversight
• inspecting, reviewing, and understanding documents and other work products

Post-contract award success depends on pre-contract award activities:
• building measurement expectations into contracts
• establishing good partnerships and working relationships with contractors

Measure products, processes, projects, relationships:
• requirements development, management, products should not be exempt! They are measurable.
Contact Information
Wolf Goethert
Software Engineering Institute
Measurement & Analysis Initiative
Email: [email protected]

Jeannine Siviy
Software Engineering Institute
Measurement & Analysis Initiative
Email: [email protected]

Robert Ferguson
Software Engineering Institute
Measurement & Analysis Initiative
Email: [email protected]
References
Note: URLs valid as of tutorial delivery date.
[ASSIP 03] Information from a 2003 Survey of Army Acquisition Program Managers
[Barbour 03] Taken from a set of workshop slides.
[C-M-H 03] Carleton, Anita, Robert Mishler, and Watts Humphrey, The Integrated Software Acquisition Measurement (ISAM) Project, Interim Status Report
[DAD 03] Siviy, Jeannine and William Florac, Data Analysis Dynamics, Half-Day Tutorial Delivered at SEPG 2003, Boston, MA
[Diana 03] Diana, Alison, Outsourcing by the Numbers, http://www.ecommercetimes.com/perl/story/32114.html
[DZ 02] Zubrow, David, Putting ‘M’ in the Model: Measurement and Capability Maturity Model® Integration (CMMI(SM)), ICSQ, 29 October 2003, Ottawa, Canada
[DZ – P 03] Adapted from [DZ 02] by Mike Phillips for a client workshop
[Ferguson 03] Ferguson, Jack, Use of CMMI in an Acquisition Context, CMMI Users Group 2003, Denver, CO
[GQIM 96] Goal-Driven Software Measurement--A Guidebook, http://www.sei.cmu.edu/publications/documents/96.reports/96.hb.002.html
References
Note: URLs valid as of tutorial delivery date.
[Robb 02] Robb, Drew, 5 Top Trends in Offshore Outsourcing, http://itmanagement.earthweb.com/erp/article.php/1558431
[PSM 00] Practical Software and Systems Measurement: A Foundation for Objective Project Management, Guidebook, version 4.0b, Practical Software and Systems Measurement Support Center, U.S. TRACOM-ARDEC, AMSTA-AR-QA-A, Picatinny Arsenal, NJ, Website: www.psmsc.com, October 2000
[SWM 01] Software Magazine, Feb/March, 2001
Reading & Resources 1
Note: URLs valid as of tutorial delivery date.
Practical Software and Systems Measurement (PSM)
• reference for the Performance Analysis Model
• reference lists of measures to consider
• http://www.psmsc.com

Goal Driven Measurement (GDM) and Goal-Question-Indicator-Metric (GQIM)
• front end for selecting most relevant PSM measures
• used for developing context-specific indicators, particularly “success indicators”
• “Goal-Driven Software Measurement--A Guidebook”
  http://www.sei.cmu.edu/publications/documents/96.reports/96.hb.002.html
Reading & Resources 2
Note: URLs valid as of tutorial delivery date.
Defense Acquisition University (DAU) Deskbook
• http://deskbook.dau.mil/jsp/default.jsp
• provides information about regulatory references, mandatory and discretionary references by service branch, and several knowledge repositories

Guidelines for Successful Acquisition and Management of Software-Intensive Systems
• http://www.stsc.hill.af.mil/resources/tech_docs/index.html

Acquisition Centers of Excellence
• Air Force, for instance ESC Hanscom
  - http://esc.hanscom.af.mil/ESC-BP/
• Navy
  - http://www.ace.navy.mil/public/html/
Reading & Resources 3
Note: URLs valid as of tutorial delivery date.
Project Management Body of Knowledge (PMBOK®)
• proven, traditional project management practices and innovative, advanced practices with more limited use
• the Project Management Institute Guide to the PMBOK contains the generally accepted subset of knowledge and practices that are applicable to most projects most of the time
  - http://www.pmi.org/info/PP_StandardsExcerpts.asp
  - http://www.pmi.org/info/PP_PMBOK2000Excerpts.asp