TRANSCRIPT
Sponsored by the U.S. Department of Defense. © 2002 by Carnegie Mellon University. Version 1.0

Defining Acquisition Measures
The Integrated Software Acquisition Measurement Project (ISAM)
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890
Dr. John Mishler, NAVAIR Resident Affiliate, SEI, 2002-2003
Mr. Frank Sisti, Senior Member of the Technical Staff, SEI
Software Engineering Institute
DoD R&D Laboratory FFRDC
Sponsored by OUSD (AT&L)
Created in 1984
Under contract to Carnegie Mellon University
Offices in Arlington, VA and Pittsburgh, PA
Mission: Improve the practice of software engineering
JAC Challenge Problem for the SEI
The SEI Joint Advisory Committee (JAC)
• is a tri-service oversight board that guides the SEI
• establishes SEI goals and direction

One key challenge the JAC gave the SEI is to define acquisition measurements that
• measure and manage software-intensive systems
• promptly, accurately, and precisely describe project status and trends
• support DoD program managers
SEI’s Acquisition Support Program
The SEI has established the Acquisition Support Program (ASP) to address system acquisition issues.

The Integrated Software Acquisition Measurement (ISAM) project is SEI’s first step in addressing the acquisition measurement challenge.

The full Team Acquisition Process (TAP) effort will be a follow-on ASP project to address broader acquisition management needs.
Acquisition Measurement Objectives
The ISAM project aims to develop integrated measures that
• apply to all development and acquisition levels
• provide broad life-cycle coverage
• promptly and precisely portray program status
• accurately predict future program performance
• minimally intrude on the development work
• support cyclic development
• facilitate process improvement
• are a natural consequence of quality work

The goal is to build a measurement culture at all levels of development and acquisition organizations.
Metrics Program Requirements
To obtain useful measures, work must be precisely planned – without precise plans, work cannot be precisely tracked.

The development process must also be defined – undefined processes cannot be measured.

Process and product quality must be measured and managed – poor-quality work makes projects late and unpredictable.

A useful metrics program must have people who consistently gather accurate data and know how to use these data.
The SEI Team Software Process
Software-intensive programs will not improve until the behavior of the software professionals changes.

The SEI has developed and is now transitioning the Team Software Process (TSP) into general practice.

With the TSP, precise measures are a basic and normal part of engineering practice.

The TSP provides the management and engineering training needed for rapid deployment and effective use of measures.

TSP projects predictably deliver the safe, secure, and high-quality software-intensive systems needed for modern warfare.
Team Software Process and TSP are service marks of Carnegie Mellon University.
The TSP Is Widely Used
Some of the organizations that are introducing and using the TSP are
• ABB
• AIS
• Bechtel
• Boeing
• Comnet
• DFAS
• EDS
• Ericsson
• Honeywell
• Iomega
• Kaiser
• Litton
• Microsoft
• NASA Langley
• SAIC
• SDRC
• Teradyne
• USAF: Hill AFB
• USN: NAVAIR
• USN: NAVOCEANO
• USN: NUWC
• Xerox
TSP Measurements
With the TSP, developers measure all of their work:
• time spent by phase
• size of products produced
• defects found by phase and product element

From these data, all required engineering project management measures can be derived.

When using the TSP, development teams know precisely where their projects stand.

TSP teams regularly report on plan versus actual quality and schedule status, estimated project completion, and status of significant risks.
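The derivation described above can be sketched in a few lines. This is an illustration only: the task records, field names, and counts below are hypothetical, not the actual TSP data formats.

```python
# Sketch: deriving two project-management measures (earned value and
# defect density) from TSP-style raw data. All data here is hypothetical.

tasks = [
    # (name, planned_hours, actual_hours, completed)
    ("requirements", 40, 44, True),
    ("design",       60, 55, True),
    ("code",         80, 30, False),
    ("test",         50,  0, False),
]
defects_found = {"design review": 12, "code review": 18, "unit test": 5}
new_and_changed_loc = 4200

total_planned = sum(p for _, p, _, _ in tasks)

# Earned value: each task earns its planned share of the total when done.
earned_value = sum(p for _, p, _, done in tasks if done) / total_planned * 100

# Defect density per thousand lines of new and changed code.
defects_per_kloc = sum(defects_found.values()) / (new_and_changed_loc / 1000)

print(f"Earned value: {earned_value:.1f}%")
print(f"Defects/KLOC: {defects_per_kloc:.1f}")
```

The same raw time, size, and defect logs feed every chart in the slides that follow, which is why the slide calls these the only three things developers need to record.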
Predictable Schedules

[Chart: Schedule Deviation Individual Value Control Chart, Commercial Systems. Percent deviation (-150 to +350) plotted against date of project start (01/88 through 01/98), showing individual data points, the mean, upper and lower natural process limits, and one standard deviation. Markers indicate when the CMM and the TSP were introduced. Source: AIS]
Cost and Schedule Performance
Sound budgets and commitments require accurate size and resource estimates.

Data from 24 teams in 4 organizations show that TSP teams make accurate estimates.
[Chart: Box plots of effort deviation (percent, -75 to 125; N = 16 with TSP, 16 without) and schedule deviation (percent, -75 to 175; N = 15 with TSP, 15 without).]
TSP Quality Management
With the TSP, teams consistently improve product quality.
By doing quality work, TSP teams
• accelerate development schedules
• reduce program costs
• sharply cut testing time
• greatly reduce maintenance effort
TSP Quality Benefits - Boeing
[Chart: System test time for releases 6 through 9 of Boeing Pilot #1: 32, 41, and 28 days for releases 6, 7, and 8; release 9, built by PSP/TSP-trained engineers, took 4 days (94% less time) despite a 2.36X larger SLOC count.]
Higher Product Quality
By planning, measuring, and tracking quality, TSP projects have fewer defects and shorter testing times.
[Chart: Box plots of system test defect density (def/KLOC, 0 to 9; N = 12 with TSP, 10 without) and acceptance test defect density (def/KLOC, 0.0 to 2.0; N = 12 with TSP, 9 without).]
Reduced Cycle Time
[Chart: Box plots of test time per KLOC in days (0 to 8; N = 12 with TSP, 12 without).]

With the TSP, teams find and fix defects early in the development process. This sharply reduces test time. With shorter testing, cycle time is cut.

[Diagram: Phase timelines (requirements, design, implement, test) for a typical team versus a TSP team, with the shorter TSP test phase shown as schedule savings.]
Reduced Development Costs

Assume 100 engineers working 40 hours x 50 weeks = 200,000 hours per year.

Without TSP:
• 40% test time = 80,000 hours of test
• 120,000 hours for development; at 2 LOC/hour = 240 KLOC

With TSP:
• 10% test time = 20,000 hours of test
• 180,000 hours for development; at 2 LOC/hour = 360 KLOC

TSP = 50% productivity improvement.
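The arithmetic behind this slide can be checked directly. The staffing and productivity figures (100 engineers, 2 LOC/hour, 40% vs. 10% test time) come from the slide itself.

```python
# Sketch of the slide's development-cost arithmetic.

engineers, hours_per_week, weeks = 100, 40, 50
total_hours = engineers * hours_per_week * weeks  # 200,000 hours per year

def kloc_produced(test_fraction, loc_per_hour=2.0):
    """KLOC produced when test_fraction of total hours goes to testing."""
    dev_hours = total_hours * (1 - test_fraction)
    return dev_hours * loc_per_hour / 1000

without_tsp = kloc_produced(0.40)  # 240 KLOC
with_tsp = kloc_produced(0.10)     # 360 KLOC
improvement = (with_tsp / without_tsp - 1) * 100  # 50% more output

print(f"Without TSP: {without_tsp:.0f} KLOC; with TSP: {with_tsp:.0f} KLOC "
      f"({improvement:.0f}% productivity improvement)")
```

Note that the entire gain comes from shifting hours out of testing; the assumed coding rate of 2 LOC/hour is held constant in both cases.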
The ISAM Measurement Focus

Product Focus
• Customer Satisfaction (CUPRIMDSPS)
• Operational Capability
• Initial Operational Capability (IOC) Support
• Integrated Logistics Support (ILS) Plan
• System Integration (System of Systems)

Project Focus – ISAM
• Schedule (projection, status, trend)
• Cost (projection, status, trend)
• Defects (projection, status, trend)
• Project Cycle Time
• Total Ownership Cost (TOC)
• Cost of Quality
• Requirements Satisfaction

Process Focus
• Cycle time trends
• Support Capability
• Total Ownership Cost trends
• Requirements Management

Other Focus
• Legal
• Financial
• Contracts
• Customer
• Sponsor
Effective Program Measurement
To be useful for management, measures must be
• precisely defined, accurate, and traceable
• timely and predictive
• used by engineering
• a minimum complete and consistent set

The TSP accomplishes this by providing comprehensive measurements that are integral to the engineering work.

With TSP, measurement is natural. The TSP
• precisely and promptly measures the engineering work
• does not impose added measurement costs
• provides a defined and non-proprietary measurement system
Schedule Predictability
The schedule predictability measures are
• earned value
• completion projections
• completion projection trends

[Chart: Earned Value Schedule Tracking: cumulative earned value (0 to 120) over project weeks 1 to 28, with plan, projected, and actual curves.]

[Chart: Schedule Projections: weeks early (+) or late (-), from -25 to +10, over project weeks 1 to 29.]
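A completion projection of the kind charted here can be derived from the earned-value trend by extrapolating the recent rate of EV accrual. This is a generic illustration with made-up weekly data, not the TSP's defined projection procedure.

```python
# Sketch: projecting a completion week from a cumulative earned-value
# trend by extrapolating the average EV rate over recent weeks.
# The weekly figures are hypothetical.

weekly_ev = [0, 3, 7, 12, 18, 22, 28, 33, 40, 45]  # cumulative EV %, weeks 0-9

def projected_completion_week(cum_ev, lookback=4):
    """Extrapolate the average EV rate over the last `lookback` weeks."""
    current_week = len(cum_ev) - 1
    rate = (cum_ev[-1] - cum_ev[-1 - lookback]) / lookback  # EV % per week
    remaining = 100 - cum_ev[-1]
    return current_week + remaining / rate

print(f"Projected completion: week {projected_completion_week(weekly_ev):.1f}")
```

Plotting this projection week over week gives the "completion projection trend" listed above: a stable trend signals a credible plan, while a drifting one flags schedule risk early.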
Effort Predictability
The effort predictability measures are
• actual versus planned task hours
• billed hours versus task hours
• relationship of task hours and EV status
• effort estimate accuracy for completed tasks

[Chart: Planned and Actual Task Hours: cumulative task hours (0 to 400) over project weeks 1 to 23, plan versus actual.]

[Chart: Percent Task to Billed Hours: percentage (0 to 45) over project weeks 1 to 23, plan versus actual.]
Quality
The quality measures are
• defects per KLOC, planned and actual
• percent defect free (PDF)
• quality profiles

[Chart: Defect Removal Profile: planned and actual defects/KLOC (0 to 50) removed by project phase: requirements inspection, HLD inspection, DLD review, DLD inspection, code review, compile, code inspection, unit test, integration test, system test, and acceptance test.]

[Diagram: Component E Quality Profile, plotting design/code time, design-review time, code-review time, compile D/KLOC, and unit-test D/KLOC.]
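The first two measures named above are simple ratios over the defect and size data. The component names and counts below are hypothetical, used only to show the computation.

```python
# Sketch: computing defects/KLOC and percent defect free (PDF) for a
# set of components in one test phase. All data here is hypothetical.

components = {
    # name: (system-test defects, new-and-changed KLOC)
    "A": (0, 3.2),
    "B": (2, 5.1),
    "C": (0, 1.8),
    "D": (1, 4.4),
}

total_defects = sum(d for d, _ in components.values())
total_kloc = sum(k for _, k in components.values())

defects_per_kloc = total_defects / total_kloc

# Percent defect free: fraction of components with no defects in the phase.
pdf = sum(1 for d, _ in components.values() if d == 0) / len(components) * 100

print(f"System test defects/KLOC: {defects_per_kloc:.2f}")
print(f"Percent defect free: {pdf:.0f}%")
```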
Cycle Time Reduction
The cycle time measures are
• time in days from project initiation to initial operational capability
• time to first production article delivery
• percent of cycle time spent in test phases (after unit testing)

[Chart: Cycle Time vs. % Time in Test for an 8-month development job: delivery cycle in months (0 to 25) against percent of project schedule in testing (5 to 65), with typical TSP projects at the low end, typical projects in the middle, and large system projects at the high end.]
Other Measures
Total Ownership Cost
• cost of program development (program initiation to initial operational capability)
• cost of program maintenance (cost to maintain the product after initial operational capability)

Cost of Quality
• percent of total development time spent in appraisal (walkthroughs, reviews, and inspections)
• percent of total development time spent in rework (compile and test)

Requirements Satisfaction
• number of acceptance test defects in user acceptance or operational suitability tests
• acceptance test defects per KLOC
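The two cost-of-quality percentages follow directly from phase time logs. The phase names and hour counts below are hypothetical; the classification of review/inspection time as appraisal and compile/test time as rework follows the slide.

```python
# Sketch: cost-of-quality percentages from hypothetical phase-time data.

phase_hours = {
    "design": 120, "design review": 20, "design inspection": 15,
    "code": 150, "code review": 25, "code inspection": 18,
    "compile": 10, "unit test": 40, "system test": 60,
}

# Appraisal = walkthroughs, reviews, and inspections (per the slide).
appraisal_phases = {"design review", "design inspection",
                    "code review", "code inspection"}
# Rework = compile and test time (per the slide).
rework_phases = {"compile", "unit test", "system test"}

total = sum(phase_hours.values())
appraisal_cost = sum(phase_hours[p] for p in appraisal_phases) / total * 100
rework_cost = sum(phase_hours[p] for p in rework_phases) / total * 100

print(f"Appraisal COQ: {appraisal_cost:.1f}% of development time")
print(f"Rework COQ: {rework_cost:.1f}% of development time")
```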
Required Training
To adopt these measures, acquisition groups will require a few days of ISAM training.

The development groups must use the TSP.

An extensive program is available for transitioning the TSP into development and maintenance organizations.
• Executive kickoff and planning seminar: 1 1/2 days
• Management training: 2 days
• Engineer training: 2-week PSP course
• Internal transition agent training
  - PSP instructor: 5 days
  - TSP launch coach: 5 days

Training costs are recovered with the first 1,000 LOC developed.
Next Steps
Complete program management interviews.
Refine proposed measures.
Establish metrics-based management methods.
Define metrics prototype testing effort.
Conduct prototype tests.
• Program managers use ISAM measurements.
• Projects use the TSP.
Produce final report and transition plan.
Conclusions
Improved measurements are needed to manage software-intensive systems throughout their life cycle.

The TSP provides the foundation for precise and timely program measurements.

ISAM provides the measurement tools for effective and responsive program management.

With your help and support, this project will guide future program managers in meeting our military’s needs for reliable, safe, and secure software-intensive systems.
For Further Information
Please contact:
Anita D. Carlton
Senior Member of the Technical Staff
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890
Tel. (412) 268-7718