
Introduction

Wilson Rosa, AFCAA

[email protected]

CSSE Annual Research Review

March 8, 2010


Agenda

• Overview


Project Background

• Goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.

• Project led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School

• We will publish the AFCAA Software Cost Estimation Metrics Manual to help analysts and decision makers develop accurate software cost estimates quickly and easily for avionics, space, ground, and shipboard platforms.


Stakeholder Communities

• Research is collaborative across heterogeneous stakeholder communities, who have helped us refine our data definition framework and domain taxonomy and have provided project data:

– Government agencies

– Tool Vendors

– Industry

– Academia

Tool examples: SLIM-Estimate™, TruePlanning® by PRICE Systems


Research Objectives

• Establish a robust and cost-effective software metrics collection process and knowledge base that supports the data needs of the United States Department of Defense (DoD)

• Enhance the utility of the collected data for program oversight and management

• Support academic and commercial research into improved cost estimation of future DoD software-intensive systems


Software Cost Model Calibration

• Most program offices and support contractors rely heavily on software cost models

• These models may not have been calibrated with the most recent DoD data

• Calibration with recent data (2002-Present) will help increase program office estimating accuracy
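As an illustration of what calibration involves, the sketch below fits a power-law effort model, Effort = A × Size^B, to a handful of completed projects by least squares in log space; the data and resulting coefficients are hypothetical, not drawn from the SRDR repository.

```python
# Minimal calibration sketch (hypothetical data, not from the SRDR repository).
# Fits Effort = A * KESLOC**B in log space, the usual way parametric cost
# models are locally calibrated to recent completed projects.
import numpy as np

# (KESLOC, person-months) pairs for completed projects, 2002-present -- made up here.
kesloc = np.array([12.0, 25.0, 48.0, 90.0, 150.0])
pm     = np.array([55.0, 130.0, 260.0, 540.0, 980.0])

# Ordinary least squares on log(PM) = log(A) + B * log(KESLOC)
X = np.column_stack([np.ones_like(kesloc), np.log(kesloc)])
coef, *_ = np.linalg.lstsq(X, np.log(pm), rcond=None)
A, B = np.exp(coef[0]), coef[1]

print(f"Calibrated model: PM = {A:.2f} * KESLOC^{B:.2f}")
print(f"Estimate for a 60 KESLOC project: {A * 60**B:.0f} PM")
```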


AFCAA Software Cost Estimation Metrics Manual Table of Contents

Chapter 1: Software Estimation Principles
Chapter 2: Product Sizing
Chapter 3: Product Growth
Chapter 4: Effective SLOC
Chapter 5: Historical Productivity
Chapter 6: Model Calibration
Chapter 7: Calibrated SLIM-ESTIMATE
Chapter 8: Cost Risk and Uncertainty Metrics
Chapter 9: Data Normalization
Chapter 10: Software Resource Data Report
Chapter 11: Software Maintenance
Chapter 12: Lessons Learned


Manual Special Features

• Augments the NCCA/AFCAA Software Cost Handbook with:

– Default Equivalent Size Inputs (DM, CM, IM, SU, AA, UNFM); a worked sizing sketch follows this list

– Productivity Benchmarks by Operating Environment, Application Domain, and Software Size

– Empirical Code, Effort, and Schedule Growth Measures derived from SRDRs

– Empirically Based Cost Risk and Uncertainty Analysis Metrics

– Calibrated SLIM-Estimate™ using most recent SRDR data

– Mapping between COCOMO, SEER, True S cost drivers

– Empirical Dataset for COCOMO, True S, and SEER Calibration

– Software Maintenance Parameters
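As a reference point for the equivalent-size inputs above, the sketch below applies the COCOMO II reuse model, one common way of combining DM, CM, IM, SU, AA, and UNFM into equivalent SLOC. The numeric inputs are illustrative only; the manual's own default values are not reproduced here.

```python
# COCOMO II reuse model for equivalent SLOC (illustrative inputs; the manual's
# default DM/CM/IM/SU/AA/UNFM values are not reproduced here).
def equivalent_sloc(adapted_sloc, dm, cm, im, su, aa, unfm):
    """adapted_sloc : SLOC of the adapted (modified or reused) code
    dm, cm, im      : % design, code, and integration modified (0-100)
    su              : software understanding increment, % (10-50)
    aa              : assessment and assimilation increment, % (0-8)
    unfm            : programmer unfamiliarity factor (0.0-1.0)
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im          # adaptation adjustment factor
    if aaf <= 50:
        mult = aa + aaf * (1 + 0.02 * su * unfm)
    else:
        mult = aa + aaf + su * unfm
    return adapted_sloc * mult / 100.0

# Example: 100 KSLOC of modified code, 20% design / 30% code / 40% integration changed.
print(equivalent_sloc(100_000, dm=20, cm=30, im=40, su=30, aa=4, unfm=0.4))  # ~39,960
```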


Manual Special Features (Cont.)

• Guidelines for reconciling inconsistent data

• Standard Definitions (Application Domain, SLOC, etc.)

• Guidance on issues related to incremental development (overlaps, early-increment breakage, integration complexity growth, deleted software, relations to maintenance) and version management (a form of product line development and evolution); a simple breakage-accounting sketch follows this list.

• Impact of Next Generation Paradigms – Model Driven Architecture, Net-Centricity, Systems of Systems, etc.
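To illustrate the early-increment breakage issue, here is a hypothetical accounting sketch (an illustrative simplification, not the manual's method), assuming breakage is reported as the fraction of each earlier increment's delivered code reworked while building the current increment.

```python
# Hypothetical sketch of early-increment breakage accounting (not the manual's method).
def increment_equivalent_size(new_sloc, prior_increment_sloc, breakage_fraction):
    """new_sloc             : SLOC newly developed in the current increment
    prior_increment_sloc    : delivered SLOC of each earlier increment
    breakage_fraction       : fraction of each earlier increment reworked (e.g. 0.10)
    """
    reworked = breakage_fraction * sum(prior_increment_sloc)
    return new_sloc + reworked

# Example: increment 3 adds 40 KSLOC and reworks 10% of the 90 KSLOC
# delivered in increments 1 and 2.
print(increment_equivalent_size(40_000, [50_000, 40_000], 0.10))  # 49000.0
```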


Agenda

• Overview (Barry Boehm)

• Data Analysis

• Software Sizing

• Recent Workshop Results

• Conclusions


DoD Empirical Data

• Data quality and standardization issues

– No reporting of Equivalent Size Inputs – CM, DM, IM, SU, AA, UNFM, Type

– No common SLOC reporting – logical, physical, etc.

– No standard definitions – Application Domain, Build, Increment, Spiral,…

– No common effort reporting – analysis, design, code, test, CM, QA,…

– No common code counting tool

– Product size only reported in lines of code

– No reporting of quality measures – defect density, defect containment, etc.

• Limited empirical research within DoD on other contributors to productivity besides effort and size:

– Operating Environment, Application Domain, and Product Complexity

– Personnel Capability

– Required Reliability

– Quality – Defect Density, Defect Containment

– Integrating code from previous deliveries – Builds, Spirals, Increments, etc.

– Converting to Equivalent SLOC

• Categories like Modified, Reused, Adopted, Managed, and Used add no value unless they translate into single or unique narrow ranges of DM, CM, and IM parameter values. We have seen no empirical evidence that they do…
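To make the last point concrete, the short illustration below uses hypothetical DM/CM/IM values, not measured data: two projects could both label code as "Modified" yet imply very different adaptation adjustment factors, and hence very different equivalent sizes.

```python
# Illustration of the point above (hypothetical ranges, not empirical data):
# unless a reuse category implies a narrow band of DM/CM/IM values, it does not
# pin down the adaptation adjustment factor.
def aaf(dm, cm, im):
    return 0.4 * dm + 0.3 * cm + 0.3 * im

low  = aaf(dm=5,  cm=10, im=15)   # lightly touched "Modified" code
high = aaf(dm=50, cm=70, im=80)   # heavily reworked "Modified" code
print(low, high)                  # 9.5 vs 65.0 -- roughly a 7x spread in the size multiplier
```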


SRDR Data Source

Software Resources Data Report: Final Developer Report – Sample (DD Form 2630-3, Page 1 of 2)

Page 1 covers Report Context, Project Description and Size:

• Report Context – system/element name (version/release), report as-of date, authorizing vehicle (MOU, contract/amendment, etc.), reporting event (contract/release end), and submission number (with the superseded submission, if applicable)

• Description of Actual Development Organization – development organization, certified CMM level (or equivalent), certification date, lead evaluator and affiliation, and precedents (up to five similar systems by the same organization or team), plus comments on Part 1 responses

• Product and Development Description – primary application type and its percent of product size, upgrade or new, primary language used and its percent, COTS/GOTS applications used, peak staff (maximum team size in FTE that worked on and charged to this project), and percent of personnel that was highly experienced in the domain, nominally experienced, or entry level with no experience, plus comments on Part 2 responses

• Product Size Reporting (actuals at final delivery) – number of software requirements (not including external interface requirements unless noted in the associated Data Dictionary), number of external interface requirements (i.e., not under project control), amount of requirements volatility encountered during development (1=Very Low .. 5=Very High), and amounts of new, modified, and unmodified reused code developed and delivered; code size measures are flagged as S (physical SLOC, carriage returns), Snc (noncomment SLOC only), LS (logical statements), or another abbreviation explained in the associated Data Dictionary, plus comments on Part 3 responses
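For analysts scripting against collected reports, a minimal record sketch such as the following may help. The field names are informal shorthand for the form items above, not an official schema, and the example values are made up.

```python
# Minimal record sketch for a parsed SRDR final developer report.
# Field names are informal shorthand for the DD Form 2630-3 items above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SrdrFinalReport:
    system_name: str                  # system/element name (version/release)
    report_as_of: str                 # report as-of date
    development_org: str
    cmm_level: Optional[int]          # certified CMM level (or equivalent)
    primary_application_type: str
    primary_language: str
    peak_staff_fte: float
    pct_highly_experienced: float
    requirements_count: int           # software requirements, excl. external interfaces
    external_interface_reqs: int
    requirements_volatility: int      # 1 = Very Low .. 5 = Very High
    new_sloc: int                     # size units as flagged on the form
    modified_sloc: int
    reused_sloc: int
    sloc_count_type: str              # "S", "Snc", "LS", or other per Data Dictionary

# Example record with made-up values.
r = SrdrFinalReport("Sample System v1.0", "2009-12-31", "ExampleCo", 3,
                    "Mission Planning", "C++", 25.0, 40.0,
                    1200, 150, 3, 80_000, 30_000, 45_000, "LS")
print(r.new_sloc + r.modified_sloc + r.reused_sloc)
```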


Data Collection and Analysis

• Approach

– Be sensitive to the application domain

– Embrace the full life cycle and Incremental Commitment Model

• Be able to collect data by phase, project and/or build or increment

• Items to collect

– SLOC reporting – logical, physical, NCSS, etc.

– Requirements Volatility and Reuse

• Modified or Adopted using DM, CM, IM; SU, UNFM as appropriate

– Definitions for Application Types, Development Phase, Lifecycle Model,…

– Effort reporting – phase and activity

– Quality measures – defects, MTBF, etc.


Data Normalization Strategy

• Interview program offices and developers to obtain additional information not captured in SRDRs…

– Modification Type – auto generated, re-hosted, translated, modified

– Source – in-house, third party, Prior Build, Prior Spiral, etc.

– Degree-of-Modification – %DM, %CM, %IM; SU, UNFM as appropriate

– Requirements Volatility – % of ESLOC reworked or deleted due to requirements volatility (a normalization sketch follows this list)

– Method – Model Driven Architecture, Object-Oriented, Traditional

– Cost Model Parameters – True S, SEER, COCOMO, SLIM
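A minimal normalization sketch follows, assuming the interview yields a degree-of-modification triple and a requirements-volatility percentage for each delivered component; the way they are combined below is illustrative, not the manual's prescribed method, and reuses the adaptation adjustment factor from the earlier sizing sketch.

```python
# Illustrative normalization pass (not the manual's prescribed method): combine
# interview-derived degree-of-modification and requirements-volatility figures
# into a single equivalent-size number per component.
def normalize_component(new_sloc, adapted_sloc, dm, cm, im, volatility_pct):
    """new_sloc       : SLOC written from scratch
    adapted_sloc      : SLOC carried over from a prior build/spiral or third party
    dm, cm, im        : % design / code / integration modified (0-100)
    volatility_pct    : % of ESLOC reworked or deleted due to requirements volatility
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im          # adaptation adjustment factor
    esloc = new_sloc + adapted_sloc * aaf / 100.0
    return esloc * (1 + volatility_pct / 100.0)   # credit rework driven by volatility

# Example: 40 KSLOC new, 60 KSLOC carried over and lightly modified, 15% volatility.
print(normalize_component(40_000, 60_000, dm=10, cm=20, im=25, volatility_pct=15))  # ~58,075
```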