
THE ANALYTIC SCIENCES CORPORATION

TR-9033-2

TASK ORDER 33

SOFTWARE MEASUREMENT PROCESS

Evaluation of Software Measurement Processes and

Software Measurement Support

Technical Report

3 May 1989


Prepared Under: Contract No. SD1084-88-C-0018

Prepared For:

STRATEGIC DEFENSE INITIATIVE ORGANIZATION
Office of the Secretary of Defense
Washington, D.C. 20301-7100

Approved for public release; distribution unlimited.

Prepared By:

SPARTA, Inc.
Teledyne Brown Engineering
The Analytic Sciences Corporation

THE ANALYTIC SCIENCES CORPORATION
1700 North Moore Street
Suite 1800
Arlington, VA 22209


UNCLASSIFIED
SECURITY CLASSIFICATION OF THIS PAGE

REPORT DOCUMENTATION PAGE (Form Approved, OMB No. 0704-0188)

1a. REPORT SECURITY CLASSIFICATION: UNCLASSIFIED
1b. RESTRICTIVE MARKINGS

2a. SECURITY CLASSIFICATION AUTHORITY
2b. DECLASSIFICATION/DOWNGRADING SCHEDULE
3. DISTRIBUTION/AVAILABILITY OF REPORT: Approved for Public Release; Distribution Unlimited

4. PERFORMING ORGANIZATION REPORT NUMBER(S): TR-9033-2
5. MONITORING ORGANIZATION REPORT NUMBER(S)

6a. NAME OF PERFORMING ORGANIZATION: The Analytic Sciences Corporation (Prime Contractor)
6b. OFFICE SYMBOL (if applicable)
6c. ADDRESS (City, State, and ZIP Code): 1700 North Moore Street, Suite 1800, Arlington, VA 22209
7a. NAME OF MONITORING ORGANIZATION: Strategic Defense Initiative Organization
7b. ADDRESS (City, State, and ZIP Code): Room 1E149, The Pentagon, Washington, D.C. 20301-7100

8a. NAME OF FUNDING/SPONSORING ORGANIZATION: Strategic Defense Initiative Organization
8b. OFFICE SYMBOL (if applicable)
9. PROCUREMENT INSTRUMENT IDENTIFICATION NUMBER
8c. ADDRESS (City, State, and ZIP Code): Room 1E149, The Pentagon, Washington, D.C. 20301-7100
10. SOURCE OF FUNDING NUMBERS: PROGRAM ELEMENT NO. / PROJECT NO. / TASK NO. / WORK UNIT ACCESSION NO.

11. TITLE (Include Security Classification): Evaluation of Software Measurement Processes and Software Measurement Support (U)

12. PERSONAL AUTHOR(S)

13a. TYPE OF REPORT
13b. TIME COVERED: FROM 5 Dec 88 TO 3 May 89
14. DATE OF REPORT (Year, Month, Day): 1989 May 3
15. PAGE COUNT: 125

16. SUPPLEMENTARY NOTATION: TASC Report No. TR-9033-2. Prepared by SPARTA, Inc., Teledyne Brown Engineering, and The Analytic Sciences Corporation.

17. COSATI CODES: FIELD / GROUP / SUB-GROUP
18. SUBJECT TERMS: SDS SOFTWARE, EVALUATION, MEASUREMENT PROCESS, METRICS, MEASUREMENT PLAN

19. ABSTRACT: This report presents an evaluation of current software measurement processes and software measurement support tools. The applicability of and requirements for the use of these tools in SDS software development were identified. The evaluation was performed in the following steps: (1) collection of extensive metrics data from industry, Government and academic sources; (2) evaluation and analysis of the collected information for specific relevancy within software domain applications; (3) review of tools databases and product information literature to identify potential metrics tools applicable to SDS software support; and (4) identification of deficiencies or domain limitations of existing methods and models.

The results of our review of available and ongoing metrics programs reveal that a metrics methodology is needed for the effective use of metrics, and each major system development/acquisition must develop and tailor its own metrics program. There is a need to emphasize predictive measurands in the earlier phases of the life cycle. (U)

20. DISTRIBUTION/AVAILABILITY OF ABSTRACT: UNCLASSIFIED/UNLIMITED
21. ABSTRACT SECURITY CLASSIFICATION: UNCLASSIFIED
22a. NAME OF RESPONSIBLE INDIVIDUAL
22b. TELEPHONE (Include Area Code): (202) 693-1600
22c. OFFICE SYMBOL: SDIO/SIPI

DD Form 1473, JUN 84. Previous editions are obsolete. SECURITY CLASSIFICATION OF THIS PAGE

UNCLASSIFIED


TABLE OF CONTENTS

Page

LIST OF TABLES iv

LIST OF FIGURES v

LIST OF ACRONYMS vi

EXECUTIVE SUMMARY ES-1

0. INTRODUCTION 1

1. REVIEW OF EXISTING METRICS 1-1
  1.1 Literature Survey 1-1
    1.1.1 Framework Discussions 1-1
    1.1.2 Management and Quality Indicators and Factors 1-3
    1.1.3 Management Methodology 1-3
    1.1.4 Reliability 1-5
    1.1.5 Aggregate Indicators vs. Simple Measurements 1-6
    1.1.6 Validation and Refinement Articles 1-6
    1.1.7 Summarizing the Validation Literature 1-8
  1.2 Databases 1-9
  1.3 Cross Referencing Models By Attributes And Parameters 1-10
    1.3.1 Software Quality Attribute Definitions 1-11
  1.4 Screening Inappropriate Software Metrics 1-16
  1.5 Mapping Software Metrics To Types And Processes 1-16
    1.5.1 Ranking Attributes by Software/Process Types 1-16
    1.5.2 Subfunction Metrics Applicabilities 1-16
    1.5.3 Development Process Applicability 1-16
  1.6 Life Cycle 1-20
    1.6.1 Life Cycle Models 1-20
    1.6.2 Life Cycle Experience Classification 1-26
    1.6.3 Summary 1-28

2. ANALYTICAL EVALUATION OF SOFTWARE METRICS 2-1
  2.1 Feature Analysis 2-1

3. EXPERIENCE ASSESSMENT 3-1
  3.1 The Naval Underwater Systems Command (NUSC) 3-1
  3.2 Naval Surface Warfare Center - Dahlgren 3-2
  3.3 IBM at NASA-Johnson Space Flight Center 3-2

4. CAPABILITIES, LIMITATIONS AND DEFICIENCIES OF MODELS 4-1
  4.1 General Limitations 4-1
  4.2 Reliability Models 4-1
  4.3 Complexity/Maintainability 4-2


TABLE OF CONTENTS (Continued)

Page

  4.4 Efficiency Models 4-3
  4.5 Efficiency/Productivity Models 4-3
  4.6 Integrity/Security Models 4-3
  4.7 Portability, Usability, Reusability 4-3
  4.8 Composite Metrics 4-4

5. AVAILABLE SOFTWARE METRIC SUPPORT TOOLS/ENVIRONMENTS 5-1
  5.1 Existing Tools and Environments 5-1
  5.2 Analytic Evaluation of Tools/Environments 5-2
    5.2.1 Software Management and Reporting Tool (SMART) 5-3
    5.2.2 Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) 5-5
    5.2.3 McCabe Complexity Tools 5-5
    5.2.4 Software Engineering Cost Model (SECOMO) 5-6
    5.2.5 Revised Enhanced Version of COCOMO "REVIC" 5-7
  5.3 Experience Assessments 5-7
  5.4 Requisite Environment/Toolset Features 5-8

References REF-1

Appendix A Proposed IEEE Metrics Terminology A-1

Appendix B Metrics Ranking Tables By Subfunction B-1

Appendix C Broad Selection of Software/Environment Support Tools C-1

Appendix D Classification Tools Matrix D-1


LIST OF TABLES

Table No.   Page No.

1.3-1 Attributes for Time Between Failure Models 1-12
1.3-2 Attributes for Complexity Models 1-12
1.3-3 Attributes for Failure Count Models 1-13
1.3-4 Attributes for Fault Seeding Models 1-13
1.3-5 Attributes for Input Domain Based Models 1-14
1.3-6 Attributes for Productivity Models 1-14
1.3-7 Parameters Associated with Metric Classes 1-15
1.5-1 Software Functions with Ranked Attributes 1-17
1.5-2 Deriving Metrics Rankings from Subfunction Attributes 1-18
1.5-3 Metrics Ranking Derivation Rules 1-19
1.6-1 Comparing MIL-STD-2167A & RADC-TR-87-171 1-22
1.6-2 Comparing MIL-STD-2167A & Waterfall Phases 1-22
1.6-3 Comparing RADC-TR-87-171 & Waterfall Phases 1-23
1.6-4 RADC Life Cycle 1-24
1.6-5 Software Reliability Measurement Model 1-25
1.6-6 Software Metrics Experience Classification 1-27
5-1 Overview of "SMART" Software Data 5-3
5-2 List of Sample Graphics for "SMART" 5-4


LIST OF FIGURES

Figure No.   Page No.

1-1 Refined Software Quality Metrics Framework 1-2

1-2 Software Management/Quality Indicators and Factors 1-4


LIST OF ACRONYMS

AMMCOM U.S. Army Armaments Command

CSC Computer System Component

CSCI Computer System Configuration Item

IDA Institute for Defense Analyses

IEEE Institute of Electrical and Electronics Engineers

ISEC U.S. Army Information Systems Engineering Command

N-SITE Near-Term System Integration, Test and Evaluation

RADC Rome Air Development Center

SDC Strategic Defense Command

SDIO Strategic Defense Initiative Organization

SDLC Software Development Life Cycle

SDS Strategic Defense System

SIE SATKA (Surveillance, Acquisition, Tracking and Kill Assessment) Integrated Experiment

SOA State of the Art

SOW Statement of Work

SPO System Program Office

TASQ Technology for the Assessment of Software Quality

USNPGS U.S. Naval Postgraduate School

V&V Verification and Validation

EXECUTIVE SUMMARY

OBJECTIVES

The work presented in this report was performed under Subtasks 2 and 3 of Task Order 33 of the SDIO Systems SETA contract. The purpose of these subtasks was to evaluate current software measurement processes and current software measurement support tools. Teledyne Brown Engineering was the technical lead on the subtasks, with support from SPARTA and TASC.

The subtasks consisted of the following activities:

o collection of extensive metrics data from industry, Government and academic sources;

o evaluation and analysis of the collected information for specific relevancy within the SDIO software domain applications;

o review of tools databases and product information literature to identify potential metrics tools applicable to SDI software support;

o identification of deficiencies or domain limitations of existing methods and models.

CONCLUSIONS

There are several major conclusions from this subtask. The results of our review of available and ongoing metrics programs reveal that a metrics methodology is needed for the effective use of metrics, and that each major system development/acquisition must develop and tailor its own metrics program. SDI is no exception. There is a need to emphasize predictive measurands in the earlier phases of the life cycle (i.e., requirements and design). In the near term, available metrics tool packages can be helpful, but greater use of formal notation to support requirements and design synthesis is required if metrics information rigor and application is to occur in a much more disciplined and predictive manner.

OPEN ISSUES

The application data used with metrics models must be carefully selected, scaled and scoped. Formal metrics models for use in the early life cycle phases must be developed. In turn, a life cycle model, iterative in nature, must be identified that can successfully be used for SDI development. Without a proper life cycle and metric requirements model, the state of the art in predictive metrics will continue to lag behind post-facto metrics for some time to come. Predictive metrics are essential if higher productivity yields and better-quality reusable software are to become a reality.


0. INTRODUCTION

This report provides an evaluation of current software measurement processes and tools currently used in the Government, industry and academia.

Section 1 lists, summarizes and analyzes the metrics models from available databases and software documents. Section 2 presents an assessment of how each applicable software metric can be applied within each SDS software subfunction. For each candidate software metric identified, a validation analysis summary is presented. Section 3 presents the results of analyzing the field experience, with two initiatives highlighted. Section 4 identifies capabilities and limitations in current methods. Section 5 presents a literature and database survey identifying existing metrics tools and environments.

The collection of metrics data proved to be both skewed and elusive. Skewed in the sense that much of the existing metrics information that is found to exist and is formalized via models and mathematical relationships is encountered late in the life cycle (i.e., occurring in the implementation phase and beyond). Elusive due to the fact that early life cycle (predictive) metrics are virtually non-existent and, when identified, for the most part have no formal foundations (mathematical or relational) to support them. Furthermore, consensus is still evolving on the relative value of specific metrics (e.g., complexity) among metrics "experts." Essentially, each program must establish and implement its own metrics and quality program to derive maximum benefit from it.

A section on life cycle models (1.6) has been incorporated into this report. This section is intended to support the need for a new life cycle model consistent with the MIL-STD-2167A requirements. It is also intended to provide further insight on the need for metrics emphasis and their relationships in the initial (early) life cycle phases, and on the development of more formal predictive metrics. Such emphasis and identification appears to provide a key component to achieving higher productivity and quality thresholds.

Some of the data collected and conclusions arrived at, while disappointing due to the state of the metric art, are not surprising and confirm expectations. However, the life cycle section, together with the evidence identified in Section 2.1 giving support to a multi-attribute, composite or vector metric representation, presents a potential breakthrough in the SOA. The latter metrics form is the subject of the 1990 International Conference on Metrics, being held in the United States for the first time in several years.

With respect to a tools/environment database, the Army's TASQ database contains a wealth of information, as evidenced by Appendices C and D, and has served as an extremely valuable source of information, representing the latest and most extensive survey in the marketplace.

This document is driven by the software measurement requirements established in Task Order 33, TR-9033-1, and is consistent in its use of the software domain and function identifications of that document. The document also extends and complements the information of TR-9033-1.


1. REVIEW OF EXISTING METRICS

The overriding theme for this survey and evaluation of applicable software metrics is to point up practical tools for acquisition and development program managers to make appropriate estimations and useful assessments on real-world problems. As Dr. J. Short of the Naval Underwater Systems Command indicated, "we get results from our metrics or we change them."

1.1 LITERATURE SURVEY

Prioritizing the metrics literature reading of a project manager or V&V planner is a first step in managing a practical metrics program. The key concepts of quality and productivity factors and criteria within the project life cycle and framework provide the fundamentals for solid planning.

1.1.1 Framework Discussions

An understanding of the structure of software metrics is facilitated by the definition of the software quality metrics framework. Though the framework was introduced by [RADC8502], the IEEE draft Standard for Software Quality Metrics has refined it.

The quality metrics framework provides an open-ended hierarchy for organizing the conceptual elements of the quality metrics domain. These elements are quality attributes, criteria and measurements suitable for organizing quality control rooms and management tracking. The IEEE refinements include the concepts of "direct metrics" (for quality factors like reliability) and "subfactors" (e.g., for correctness). (Refer to Figure 1-1.)
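The hierarchy the framework defines can be pictured as a small tree; the factor, metric and subfactor names below are illustrative placeholders, not the IEEE draft's normative lists.

```python
from dataclasses import dataclass, field

@dataclass
class Subfactor:
    name: str                                    # e.g., a correctness subfactor
    metrics: list = field(default_factory=list)  # measurements collected for it

@dataclass
class Factor:
    name: str            # quality factor, e.g., "reliability"
    direct_metric: str   # the direct metric quantifying the factor
    subfactors: list = field(default_factory=list)

# Illustrative instance of the framework for a hypothetical "System X"
quality = [
    Factor("reliability", direct_metric="mean time to failure",
           subfactors=[Subfactor("maturity", ["fault density"])]),
    Factor("correctness", direct_metric="defects per KSLOC",
           subfactors=[Subfactor("completeness", ["traceability score"])]),
]

for f in quality:
    print(f.name, "->", f.direct_metric)
```

A real instantiation would attach validated measurements at the leaves and roll results up to the factor level for management tracking.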

[IEEE (draft) P1061] Standard for a Software Quality Metrics Methodology. Section 3 presents management-oriented objectives. Section 4 presents the refined Software Quality Metrics Framework.

A good, comprehensive graduate text on software quality metrics was written by S.D. Conte, H.E. Dunsmore and V.Y. Shen in a collaborative effort between Purdue University and the Microelectronics and Computer Technology Corporation (MCC). The authors present many measurement and analysis examples.

[CONT86] Software Engineering Metrics and Models, Benjamin Cummings, 1986.

Also, more particular to SDIO and the SDS, the draft IDA paper P-2132 [IDA8812] presents a terser introduction that would be suitable for an SDS management pamphlet.

[IDA P-2132] SDS Software Testing and Evaluation: A Review of the State-of-the-Art in Software Testing and Evaluation. Chapter 6, "Software Measurement Technology," presents a current review of the state of the art of software quality metrics methodology.


REFINED SOFTWARE QUALITY METRICS FRAMEWORK

[Figure: the software quality of System X decomposes into factors, each quantified by a direct metric; each factor decomposes further into sub-factors. Source: IEEE P1061 (draft).]

FIGURE 1-1


Chapter 7, "Software Reliability Assessment Technology," presents a concise yet relevant discussion of the potential shortfall in current reliability metrics relative to the SDS software metrics requirements.

[RADC-TR-85-37, 3 vols] Specification of Software Quality Attributes. Volume 1 introduces the framework and concepts of quality specification and evaluation. Volume 2 is a detailed guidebook for specification. Volume 3 is the evaluation guide, with sample checklists and worksheets.

[RADC-TR-87-171] Methodology for Software Reliability Prediction and Assessment. Section 3.1, "Software Quality Measurement Framework," includes the original framework discussion with definitions of all reliability concepts.

1.1.2 Management and Quality Indicators and Factors

The Air Force pamphlets AFSCP-800-14 and AFSCP-800-43 present a solid introduction to management (performance) and quality indicators. The pamphlets correlate the management and quality indicators, and then each quality indicator to its applicable software quality factors (as introduced by [RADC8502]). One disturbing statement, "...there are no widespread tools available...", depends on the writer's concept of "widespread" and is misleading -- tools are available.

The quality indicators pamphlet [-800-14] buries an important concept for management between two graphs and a chart:

4-13. ...the degree of management insight can be significantly increased by using the software quality indicators. This combination of management and quality indicators should enable the contractor and the SPO to manage software development activities actively, instead of reacting to software crises as they arise...

Note: The Army has replicated the Air Force pamphlets with few variations.

[AFSCP-800-14] / [AMCP-70-14]  Software Quality Indicators
[AFSCP-800-43] / [AMCP-70-13]  Software Management Indicators

Figure 1-2 presents the correlations of factors and indicators as per the pamphlets.

1.1.3 Management Methodology

As their draft evolves, Norman Schneidewind (USNPGS) and others on the IEEE Quality Metrics Standard Committee have emphasized the development of an extensible metrics management method for each software acquisition or evaluating organization. The approach taken by Dr. John Short and his metrics staff at NUSC is similar, emphasizing clear, well-understood metrics objectives and planning at the single-project level, with SOWs tailored to meet life cycle assessment points. The Air Force methodology guide emphasizes iterative planning, predicting and assessment. [IEEE(d)P1061], [RADC878A/B]


Software Management and Quality Indicators [AFSCP 800-14]

Quality Indicators vs. Management Indicators (CRU, SDM, RD&S, SPDT, C/SD, SDT):

Completeness      X X X X X X
Design Structure  X X X X
Defect Density    X X X X
Fault Density     X X X X
Test Coverage     X X X X
Test Sufficiency  X X X X X
Documentation     X X X X

CRU  = Computer resource utilization
SDM  = Software development manpower
RD&S = Requirements definition & stability
SPDT = Software progress - development & test
C/SD = Cost/schedule deviations
SDT  = Software development tools

Software Quality Indicators and Factors [AFSCP 800-14]

Quality Indicators vs. Software Quality Factors (Corr, Effc, Flex, Intg, Into, Main, Port, Reli, Reus, Test, Usab):

Completeness      X X X X
Design Structure  X X X X
Defect Density    X X X X X
Fault Density     X X X X X
Test Coverage     X X X X
Test Sufficiency  X X X X
Documentation     X X X X X

Corr = Correctness        Main = Maintainability
Effc = Efficiency         Port = Portability
Flex = Flexibility        Reli = Reliability
Intg = Integrity          Reus = Reusability
Into = Interoperability   Test = Testability
Usab = Usability

Figure 1-2. Software Management/Quality Indicators & Factors


(Further discussion of life cycle importance is in Paragraph 1.6. The NUSC environment is discussed in Paragraph 3.1.)

1.1.4 Reliability

As Musa, Iannino and Okumoto introduce, "Reliability is probably the most important of the characteristics" of software quality. They define reliability traditionally as "the probability that the software will work without failure for a period of time to meet customer requirements." This subsumes many properties of software quality (correctness, friendliness, safety, ...) but excludes modifiability and readability (maintainability), which are less quantifiable.
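Under the simplest constant-failure-intensity assumption (a special case, not Musa's full execution-time model, in which the intensity falls as faults are removed), this traditional definition reduces to R(t) = exp(-λt); a minimal sketch:

```python
import math

def reliability(failure_intensity: float, t: float) -> float:
    """Probability of failure-free operation for duration t,
    assuming a constant failure intensity (failures per unit time)."""
    return math.exp(-failure_intensity * t)

# e.g., 0.002 failures per CPU-hour over a 100 CPU-hour mission
print(round(reliability(0.002, 100.0), 4))  # e^-0.2, about 0.8187
```

Setting a user-oriented objective then amounts to choosing the mission duration t and the tolerable failure intensity.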

[MUSA87] Software Reliability - Measurement, Prediction, Application, McGraw-Hill, 1987.

This comprehensive text provides practical measurement application guidance, problems and solutions for managers and practicing software engineers, as well as the theoretical discussions needed for a college course. It uses the traditional hardware reliability approach, taking advantage of the body of systems and control reliability knowledge. There is a reliance on probabilistic projections using random/stochastic techniques.

A key point about this approach: it is based upon the unpredictability of programmer errors and the unpredictability of execution conditions, complicated by machine states. This does not account for the "known error" category (non-fatal, moderately reducing faults). Random implies "unpredictable," not "uniform." The use of techniques based upon this randomness assumption should not preclude other techniques which presume some predictability about the failures, faults or defects.

Some further points made by John Musa in his articles should also be noted:

a. Reliability based on failure statistics is user-oriented, more significant, and suitable for setting (user-oriented) objectives (for prudent business services and typical information systems).

b. Defect/correction-based quality measures are only significant in terms of contractor performance toward target goals.

Other key works on software reliability are from the Air Force RADC and the IEEE. The RADC work provides detailed profiling of reliability and testing measurements. The more recent work of the IEEE Software Reliability Measurement Working Group is providing a refined dictionary and guidebook for reliability, including detailed directory lookup for individual software metric parameters, a cohesive notation, instructions and evaluation references.


[RADC-TR-87-171] Methodology (Vol. 1) and Guidebook (Vol. 2) for Software Reliability Prediction and Estimation

[IEEE 982.1] (Draft) Standard Dictionary of Measures to Produce Reliable Software

[IEEE 982.2] (Draft) Guide for the Use of the Standard Dictionary of Measures to Produce Reliable Software

As previously mentioned, the IEEE Quality Metrics Standard Committee is providing a standard methodology with appendices of measurements, experience reports and validation guides [IEEE P1061].

1.1.5 Aggregate Indicators vs. Simple Measurements

The Air Force pamphlets, [CARD8707] and [GOIC89RC] indicate that primitive measures are to be aggregated with weighting factors based upon the relative value of the functions or modules affected when building the quality indicators. In the Waterfall SDLC these weights can be derived by requirements prioritization at the top level, and then by functional factoring as the requirements are delineated. However, in a prototyping development, functional and performance goals may be used to establish relative system functional values.
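Such an aggregation can be sketched as a weighted average of primitive measures; the module names, normalized scores and criticality weights below are invented for illustration, not taken from the pamphlets.

```python
def aggregate_indicator(scores: dict, weights: dict) -> float:
    """Weighted average of primitive, module-level measures.
    Weights reflect the relative value of the functions affected."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

# Hypothetical per-module scores, normalized so 1.0 is best
scores  = {"tracking": 0.9, "display": 0.6, "logging": 0.8}
# Criticality weights derived from requirements prioritization
weights = {"tracking": 5.0, "display": 2.0, "logging": 1.0}

print(aggregate_indicator(scores, weights))  # (4.5 + 1.2 + 0.8) / 8 = 0.8125
```

The highly weighted tracking module dominates the indicator, which is the intended effect of goal-oriented weighting.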

Apart from the goal-oriented weights, the characteristics of the software architecture affect the relationships between software elements. John Musa has recently alerted us that the criticality of functions within the evolving system software architecture is a paramount consideration when considering software reliability, especially in regard to high-use servo functions or boundary control elements (e.g., the Three Mile Island fault).

1.1.6 Validation and Refinement Articles

The technical literature is primarily concerned with the detailed validations of particular reliability and maintainability assessment measures, and more recently has been focusing upon the validity of predictions. There seems to be agreement among metrics experts that users should carefully plan and build understanding of the unique characteristics of their software and environment. Most software measures that include balances or weights for development experience and people factors have been shown to be more significant than unadjusted measures of the software alone.

We observed that complexity and productivity models and tools do exist, but their interpretations and validations vary. Not much validation has been done on maintainability, although tools do exist. Other issues such as software integrity, portability, usability, and reusability lack tools as well as validations. There are numerous validations of reliability models.

Some of the more significant results reached are listed below:

[TAKA8901] The results of this article indicate that models based on factors such as frequency of program specification change, programmers' skills, and volume of program design documentation are more reliable than conventional error prediction methods based only on program size.


[CARD8812] The results of this article indicate that complexity indicators based on structural (intermodule) complexity and local (intramodule) complexity seemed to be a useful tool for evaluating design quality before committing the design to code.

[LEW8811] The results of this article indicate that complexity measures are useful in quantifying the design and provide a guide to designing reliable software.

[BOEH8810] Barry Boehm and Philip Papaccio present some cost and quality observations specifically applicable to planned rapid-deployment prototyping and other hybrid life cycle notions: low-quality, low-reliability software costs much more to maintain; high-quality software tends to reduce its costs. High quality and low costs are promoted by personnel incentives, work environment and tool enhancements, and by software reuse with low rework. The authors cite use of their database of projects to validate the COCOMO model(s).
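For context, the basic COCOMO effort relation being validated has the form Effort = a·(KDSI)^b person-months, where KDSI is thousands of delivered source instructions; the coefficients below are the published basic-model (1981) values, not the articles' recalibrations.

```python
# Basic COCOMO effort estimate: person-months from KDSI.
COEFFS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kdsi: float, mode: str = "organic") -> float:
    a, b = COEFFS[mode]
    return a * kdsi ** b

# The same 32-KDSI system costs far more in embedded mode than organic
print(round(cocomo_effort(32, "embedded"), 1))
print(round(cocomo_effort(32, "organic"), 1))
```

The superlinear exponent is what makes early size prediction so consequential: underestimating size compounds into a larger effort underestimate.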

[WEIS8810] Weiss and Weyuker established an extended domain model of software reliability and used the Iannino, Musa, Okumoto rating criteria [IANN84]. The significance is that the notion of tolerable discrepancies of user service is introduced into a refined definition of software reliability, and the domain of program performance is enforced. This work was analytical and preliminary and needs followup, but it promises applicability to large systems like the SDS, where the criticality of failures, the severity of defects, and the tolerance levels should be factored into reliability measures. The second author, Elaine J. Weyuker, has independent software reliability work. Both were at NYU.

[DAVI8809] This short article analyzed the applicability of complexity measures (McCabe, Halstead) and lines of code, in contrast to a group of new measures which analyze computer programs in terms of cognitive "chunks" as proposed by "the software psychologists" (Lamergan, Mayer, Shneiderman and Soloway, in various papers). The validating scope was limited to a set of small Fortran programs. Some significant observations are:

a. Halstead's length-oriented "E" measure is more robust than anticipated.

b. Good debug-time predictors also predicted the error counts well.

c. Constructive time for new programs depends more on size than complexity.

d. Maintenance time is strongly related to program complexity and less so to overall size.

e. Data structure complexity is more significant to maintenance effort.

f. Control flow complexity is more significant to constructive effort.

g. Reinforces the view of Conte, Dunsmore and Shen that early SDLC-phase metadata is fairly accurate at defining the actual data complexity developed. [CONTE8401]

[RAMA8808] This was a successful combination of Halstead's Software Science measures with McCabe's cyclomatic complexity measure, using a set of weighting factors for the length and volume primitives. However, the combined results have no greater validity than the simple combination of separate measures.
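The article's specific weighting factors are not reproduced here; as a hedged sketch of the general idea, the following combines Halstead's volume with McCabe's cyclomatic number under hypothetical weights (the weight values are illustrative, not Ramamurthy and Melton's).

```python
import math

def halstead_volume(n1: int, n2: int, N1: int, N2: int) -> float:
    """Halstead volume V = N * log2(n), where N = N1 + N2 is the total
    count of operator and operand occurrences and n = n1 + n2 is the
    count of distinct operators and operands."""
    return (N1 + N2) * math.log2(n1 + n2)

def weighted_complexity(volume: float, cyclomatic: int,
                        w_vol: float = 0.5, w_cyc: float = 1.0) -> float:
    """Hypothetical linear blend of the two measures; [RAMA8808]
    derives its own weighting factors for the primitives."""
    return w_vol * volume + w_cyc * cyclomatic
```

As the article observes, such a blend carries no more information than reporting the two measures separately; it only packages them on one scale.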


[CARD8712] This article demonstrated that the basic relation of software science lacks empirical as well as theoretical support. Future models of the program construction process should be based more closely on observations of what programmers actually do.

[JEFF8707] This article found that the complex relationship between effort (productivity) and elapsed-time variation cannot be measured by any of the current time-sensitive cost models.

[PERK8703] This article investigated Navy-supplied Ada software and showed that an automatable, hierarchical, Ada-specific software metrics framework is an effective aid for improving the quality of Ada software.

[ABDE8609] This article examined 10 models: Jelinski and Moranda, Bayesian Jelinski-Moranda, Littlewood, Bayesian Littlewood, Littlewood and Verrall, Keiller and Littlewood, Weibull Order Statistics, Duane, the Goel-Okumoto model, and the Littlewood NHPP model. The goal of the paper is to present an approach to deciding which is the most appropriate model to use. The results showed that there is no "best buy" among the 10 models, because predictive measures perform with varying degrees of accuracy on different software being studied. Therefore, users need to be able to select a prediction model and verify that it is performing well for the type of prediction required.
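The paper's conclusion, that no single model dominates, implies an empirical selection step. A minimal sketch of one such step (not the paper's method; the failure data and candidate predictions below are fabricated for illustration) compares candidates by mean absolute prediction error on observed inter-failure times:

```python
def mean_abs_error(predicted, observed):
    """Average |predicted - observed| inter-failure time; smaller is better."""
    return sum(abs(p - o) for p, o in zip(predicted, observed)) / len(observed)

# Fabricated inter-failure times and two candidate models'
# one-step-ahead predictions for the same data:
observed = [10.0, 12.0, 15.0, 21.0]
candidates = {
    "model_a": [11.0, 13.0, 14.0, 20.0],
    "model_b": [8.0, 16.0, 11.0, 28.0],
}
best = min(candidates, key=lambda name: mean_abs_error(candidates[name], observed))
```

In practice the comparison would be repeated per project, since, as the article shows, a model that predicts well for one software system may predict poorly for another.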

[ALBR8411] This article demonstrates the equivalence between Albrecht's methodology to estimate the amount of "function" or "function points" the software is to perform, and Halstead's "software science" model of the "SLOC" measure. It also demonstrates the high degree of correlation between "function points" and the "SLOC" (source lines of code) of the program, and between "function points" and the work-effort required to develop the code.
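The correlation claim can be checked on any project dataset with a standard Pearson coefficient; a minimal sketch follows (the five sample projects are invented for illustration, not Albrecht's data):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented example: function points vs. delivered SLOC for five projects
fp = [120, 250, 400, 610, 800]
sloc = [13000, 26000, 41000, 60000, 83000]
r = pearson_r(fp, sloc)  # close to 1 for strongly linear data
```

A coefficient near 1, as in this constructed example, is what the article reports for real function-point and SLOC data.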

[BASI8311A] This paper attempts to validate Halstead's Software Science metrics, McCabe's cyclomatic complexity, and other standard program measures. The results of this article indicate that the Software Science metrics, cyclomatic complexity and other predictive metrics do not satisfactorily measure the errors that occurred during development, and none of these metrics really measures or predicts effort or quality.

[HENR81] The article showed that the measurement of software quality for large-scale systems using information flow to represent the system interconnectivity is an important and viable technique.

1.1.7 Summarizing the Validation Literature

[LEW8811] and [CARD8812] agreed that complexity measures are useful in measuring and quantifying the design effort and provide a guide to designing reliable and high-quality software. In contrast, [BASI8311A] showed that the software science, cyclomatic complexity, and other predictive metrics do not satisfactorily measure the errors incurred during the development phase, and none of these metrics really measures or predicts design effort or software quality. Ramamurthy and Melton [RAMA8808] examined software science and cyclomatic complexity and observed that combining these measures was as effective as making the separate assessments. They proposed a family of weighted measures which simultaneously detect the software characteristics that are detected by the software science measures and the cyclomatic number. [GIBS8903] studied the effect of system structure/complexity on system maintainability; specifically, what effect do structural differences have on maintenance performance, and are structural differences measurable? The results indicate that the structural differences do impact performance and that the metrics being validated (i.e., Halstead's E, McCabe's v(G), Woodward's K, Gaffney's Jumps, Chen's MIN and Benyon-Tinker's C2) can be used as project management tools.

From these articles' results, some conclusions can be drawn:

a. There is an evolving consensus among complexity metrics validators that the application data used with the complexity models must be carefully selected and the conclusions must be scoped and scaled.

b. The validity of metrics models for estimation and prediction requires continual updating.

c. There is a need to emphasize predictive measurands in the earlier phases of the life cycle (i.e., requirements and design).

d. The validation process must continue so that metrics can be effectively used to characterize and evaluate software products and processes.

e. A metrics methodology is needed for the effective use of metrics, and each major system development/acquisition must develop and tailor its own metrics program.

1.2 DATABASES

Technical interchanges and queries have been performed with the following information sources:

a. The TASQ Repository maintained for the U.S. Army AMCCOM Product Assurance Directorate in New Jersey. This database contains information on several thousand tools, characteristics and companies. The information is available via a series of in-depth classification matrices and vendor descriptors. Appendix C contains selected portions of the TASQ database with potential direct or indirect support for the SDS. A complete TASQ expansion is separately bound in single copy as Appendix D (too physically large for reproduction).

b. The Air Force Systems Command Rome Air Development Center at Griffiss Air Force Base, New York. A number of technical reports from this source are referenced in the document list. The Data Analysis Center for Software (DACS) contains an extensive database and was extremely productive in obtaining information. (see references section)

c. The Institute of Electrical and Electronics Engineers (IEEE) maintains an extensive set of standards, guidelines and draft standards. In addition, other technical IEEE publications are identified that served as an extensive source of information (e.g., Transactions on Software Engineering, Computer Magazine, Software Magazine, Spectrum Magazine, and others). The Association for Computing Machinery (ACM) provided additional metrics sources and related information; the "Communications of the ACM" publication served as one of the source documents. (see references section)


d. Portland State University's metrics database and Dr. Wayne Harrison provided much assistance in acquiring information. (see references section)

e. George Mason University's metrics repository and Dr. Ambrose Goicoechea provided direct assistance in acquiring metrics information. Professor Goicoechea was also instrumental in acquiring European surveys on the state-of-the-practice in metrics. (see references section)

f. The NASA Goddard Space Flight Center Software Engineering Laboratory and the University of Maryland have an extensive amount of metrics information. Meetings have been held with NASA's Frank McGarry and the University of Maryland (Dr. M. Zelkowitz). The Research Institute in Computer and Information Sciences (RICIS), University of Houston (Dr. C. McKay) was also contacted. RICIS is currently engaged in Mission and Safety Critical software initiatives with NASA's Johnson Space Center, Houston, on the Space Station Software Support Environment (SSE). Software environment concerns for the SSE are similar to those expected for the SDI's (see references on Basili (Univ. of Md.)).

g. The Naval Surface Weapons Center (NSWC) at Dahlgren, Virginia provided details about the Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) and summary data on Navy software support tools.

h. The Naval Underwater Systems Center (NUSC) provided descriptive data about their management methods for successful software metrics application.

i. The Headquarters Command at Kirtland AFB described their enhanced productivity model "REVIC" (outlined in Section 5).

1.3 CROSS REFERENCING MODELS BY ATTRIBUTES AND PARAMETERS

The following tables (1.3-1 through 1.3-6) summarize the metric models identified in the literature. Each table groups a set of models by the type of measurement employed. Currently the groups fall into six measurement categories: 1) Time Between Failures, 2) Complexity, 3) Failure Count, 4) Fault Seeding, 5) Input Domain Based, and 6) Productivity. Each table indicates what specific software attributes are addressed by each model in the table. This identification establishes the framework for the mapping of software metrics to software types, processes, and domains or SDS subfunctions. Five of the six tables use the same attribute set, while the table on productivity uses an entirely different set. The models dealing with productivity, while concerned with performance and design attributes, are focused on project management issues such as cost, schedule, and the overall development process. For this effort, the software metric models addressing the nine attributes identified in Subtask 1, TR-9033-1 will be covered. In addition to identifying metric models, this section also contains a table (1.3-7) listing the most commonly used parameters in the construction of software metrics.

The majority of tables use nine software attributes. These are the attributes identified in Subtask 1, TR-9033-1. Thirteen attributes (factors) were originally classified by Rome Air Development Center (RADC); however, based on familiarity with SDI component software requirements, several attributes were combined to form the remaining set of eight. In addition, a ninth attribute was proposed in TR-9033-1. This attribute, "throughput," addresses the user concerns of computational and communication throughputs. The following alignment shows the current nine attributes and the previously defined attributes that have been combined.

CURRENT ATTRIBUTE       OLD ATTRIBUTE

1. Reliability          Reliability
2. Survivability        Survivability
3. Integrity            Integrity
4. Efficiency           Efficiency
5. Maintainability      Maintainability
                        Correctness
                        Verifiability
                        Flexibility
                        Expandability
6. Usability            Usability
7. Portability          Portability
8. Reuse                Reusability
                        Reuse
9. Throughput           None

1.3.1 Software Quality Attribute Definitions

The following definitions are included here for completeness.

Reliability. The probability that software will not cause the failure of a system for a specified time under specified conditions.

Survivability. The built-in capability of the software to perform its required function when a portion of the system is inoperative.

Integrity. The degree to which the software controls unauthorized access to, or modification of, system software and data.

Efficiency. The degree to which the software performs its intended functions with minimum consumption of computer time and storage resources.

Throughput. The degree to which the software demonstrates its capability to process data identified as computational and/or communications related.

Maintainability. The effort required to locate and correct an error in the software.

Usability. The effort required to learn the human interface with the software, to prepare input, and to interpret output of the software.

Portability. The effort required to transfer the software from one hardware or software environment to another.


Table 1.3-1 Attributes for Time Between Failure Models

                                        ATTRIBUTES
                                   Performance              Design
MODEL                              Reli Surv Intg Effc Thru Usab Main Port Reus
-------------------------------------------------------------------------------
Jelinski/Moranda                    X    X
Schick/Wolverton (Linear)           X    X
Schick/Wolverton (Parabolic)        X    X
Moranda (Geometric De-Eutroph.)     X
Moranda (Hybrid Geometric Poisson)  X
Goel/Okumoto                        X
Littlewood/Verrall                  X
Lloyd/Lipow                         X

Table 1.3-2 Attributes for Complexity Models

                                        ATTRIBUTES
                                   Performance              Design
MODEL                              Reli Surv Intg Effc Thru Usab Main Port Reus
-------------------------------------------------------------------------------
Halstead                            X                            X
McCabe                              X                            X
Woodward (Knot Counts)              X                            X
Chen (Nested Decision Stmts)        X                            X
Gaffney                             X                            X
Benyon-Tinker                       X                            X
Gilb's (Binary Decision)            X                            X
Chapin's Q                          X                            X
Segment-Global Usage Pair           X                            X
Myers (McCabe extension)            X                            X
Hansen's (McCabe/Halstead)          X                            X
Oviedo's (Data/Ctrl Flows)          X                            X

Effc = Efficiency        Reus = Reusability
Intg = Integrity         Surv = Survivability
Main = Maintainability   Thru = Throughput
Port = Portability       Usab = Usability
Reli = Reliability


Table 1.3-3 Attributes for Failure Count Models

                                        ATTRIBUTES
                                   Performance              Design
MODEL                              Reli Surv Intg Effc Thru Usab Main Port Reus
-------------------------------------------------------------------------------
Goel/Okumoto                        X
Schneidewind                        X
Goel                                X
Musa                                X
Shooman                             X
Moranda                             X  %
Jelinski/Moranda                    X  %
Moranda                             X  %
Goel/Okumoto (Gen. Poisson)         X  %
Brooks/Motley                       X  %
IBM Poisson                         X  %

%  Each model contains a different representation for a hazard
   function that may be adaptable to support survivability.

Table 1.3-4 Attributes for Fault Seeding Models

                                        ATTRIBUTES
                                   Performance              Design
MODEL                              Reli Surv Intg Effc Thru Usab Main Port Reus
-------------------------------------------------------------------------------
Mills                               X

Effc = Efficiency        Reus = Reusability
Intg = Integrity         Surv = Survivability
Main = Maintainability   Thru = Throughput
Port = Portability       Usab = Usability
Reli = Reliability


Table 1.3-5 Attributes for Input Domain Based Models

                                        ATTRIBUTES
                                   Performance              Design
MODEL                              Reli Surv Intg Effc Thru Usab Main Port Reus
-------------------------------------------------------------------------------
Nelson                              X
Ho                                  X
Ramamoorthy/Bastani                 X

Effc = Efficiency        Reus = Reusability
Intg = Integrity         Surv = Survivability
Main = Maintainability   Thru = Throughput
Port = Portability       Usab = Usability
Reli = Reliability

Table 1.3-6 Attributes for Productivity Models

                                 ATTRIBUTES
MODEL                            Size Cost Comp Devp
----------------------------------------------------
Software Size                     X
Personnel                              X
Volatility                                       X
Resource Utilization                   X
Complexity                        X         X
Schedule Progress                 X    X    X    X
Design Progress                             X
CSU Development Progress          X    X    X
Testing Progress                       X    X
Incremental Release Content       X    X    X    X

Size = Size of the Program         Cost = Total Cost
Comp = Completion Date             Devp = Effect of the Development Process


Table 1.3-7 Parameters Associated with Metric Classes

                                                    METRIC CLASSES
PARAMETERS                                       TBF  CPX  FC   FS   IDB
------------------------------------------------------------------------
Observed time between failures                    X
Calculated hazard function                        X
Subjective random variable                        X
Cum. no. of observed failures                     X         X
No. of faults detected during a time period                 X    X
No. of faults seeded into program                                X
Failure rate                                      X         X
Expected no. of faults to be detected                       X    X
Amount of execution time                                    X
No. of control transfers                               X    X
No. of instructions in the program                     X    X
Debugging time since time of integration                    X
No. of faults corrected                                     X
No. of entries/exits per module                        X
Software science measures                              X
Design structure                                       X
Data flow complexity                                   X
Requirements traceability                              X
Software documentation                                 X
No. of operands & operators                            X
"Binary" decisions in program logic                    X
No. of external interfaces                             X
Segment use of global variables                        X
No. of intersecting control statements                 X
No. of compound predicates                             X
Inputs by subdomains                                             X

Metric Classes:  TBF = Time Between Failures   CPX = Complexity
                 FC  = Failure Count           FS  = Fault Seeding
                 IDB = Input Domain Based


Reusability. The degree to which the software can be used in multiple applications.

1.4 SCREENING INAPPROPRIATE SOFTWARE METRICS

A review of the metrics contained in paragraph 1.3 reveals that no single metric is capable of supporting decisions across an entire software domain as defined in Subtask 1, TR-9033-1. Each metric supports measurements in one or two software attributes, e.g., Reliability, Maintainability, etc. The implication here is that perhaps several metrics will need to be combined to form a composite model capable of supporting decisions on the 36 SDS subfunctions or the defined software domains. The analysis concludes that no available metric should be dropped from consideration.

1.5 MAPPING SOFTWARE METRICS TO TYPES AND PROCESSES

Each of the software metrics classes presented is mapped into its applicable major function and subfunction, using the attribute rankings developed in TR-9033-1.

1.5.1 Ranking Attributes by Software/Process Types

First, the characteristics of each subfunction were studied to determine the relative weights of the attributes. This analysis was performed in Subtask 1 and is explained in SDIO Task 33, TR-9033-1, SDS Software Measurement Requirements, Para. 2.3, "Characteristics of Software." The following characteristics were postulated: a) Criticality, b) Embedded vs. general purpose, c) Space/Ground based, d) Life cycle, e) Algorithmic content, f) Size, g) Risk, and h) Intended use.

Then, in TR-9033-1, Para. 2.4, the characteristics of each subfunction were used to determine the attribute rankings for each SDS subfunction.

Table 1.5-1 summarizes the subfunction attribute rankings, concentrating upon the "high" and "medium" rankings; the remaining subfunction-attribute correlations were all "low."

1.5.2 Subfunction Metrics Applicabilities

The applicability of the metrics classes and models to each of the quality attributes was presented in Para. 1.3 and Tables 1.3-1 through 1.3-6. These applicabilities are used as a basis to derive the subfunction metrics applicabilities.

Table 1.5-2 presents the extrapolations to metrics classes from the attribute and applicability rankings, and Table 1.5-3 presents the derivation rules used.

1.5.3 Development Process Applicability

Prior to the analytical evaluation of the metrics in regard to the subfunction requirements (presented in Para. 2), we need to consider how the development process can affect metrics applicabilities. The possibility of employing off-the-shelf and preused software components presents a different software metrics environment than the single-vendor prototype concept and the traditional requirements-based, multi-phased development.


Table 1.5-1. Software Functions with Ranked Attributes (from TR-9033-1)

(Rankings are listed in column order Reli Surv Intg Thru Usab Main Port
Reus Effc; only "high" (H) and "medium" (M) rankings are shown.)

Major Function     Subfunction             Rankings
1  Detect           1  Plumes              H H H H M
                    2  Coldbodies          H H H H
                    3  RF                  H H H H M M M
2  Identify         4  Resolve Obj.        H H H M M
                    5  Discr               H H H M M
                    6  Assess Kills        H H H M M M
3  Track            7  Correlate           H H H H M
                    8  Initiate            H H H H M
                    9  Estimate            H H H H M M
                   10  Predict I&I         H H H M M
4  Communicate     11  Interplatform       H H H H M H
                   12  Ground-Space        H H H M M M M
                   13  Ground              M M M M M H M H
5  Assess          14  Threat              H H H
                   15  SDS                 H H H
6  Wpn Ctrl        16  SBI Asgn&Ctrl       H H H H M
                   17  GBI Asgn&Ctrl       H H H M H M
                   18  SBI Guide&Ctrl      H H H H H
                   19  GBI Guide&Ctrl      H H H M H M
7  Platfm Mgt      20  Cmd Env Ctrl        H M H M M H M M
                   21  Ctrl Onbd Env       H H H H M
                   22  Cmd Attit&Pos       H M H M M H M M
                   23  Ctrl Attit&Pos      H H H H M
                   24  Sense Status        H H H M M
                   25  Assess Status       H M H M M H M M
                   26  Cmd Reconfig        H M H M M H M M
                   27  Reconfigure         H H H H M
8  Spprt Dev       28  Tools               M M M M M M M
9  Simulate        29  HWIL                M H H M M M M
                   30  Demonstration       M H M M M M M M
10 Spprt Acqu      31  Developer Test      M M M M H H H H M
                   32  Developer Env       H M H M H M M
                   33  Factory Test        M M M H H H H
                   34  Acceptance Test     M M M M M M
11 Spprt Mgt       35  MIS DB Maint        M M H M H H H H M
                   36  Track Mgt Info      M H H H H H M

ATTRIBUTES: Reli = Reliability      Main = Maintainability
            Surv = Survivability    Usab = Usability
            Intg = Integrity        Reus = Reusability
            Effc = Efficiency       Port = Portability
            Thru = Throughput


Table 1.5-2 Deriving Metrics Rankings from Subfunction Attributes

(Attribute rankings are listed in column order Reli Surv Intg Thru Usab
Main Port Reus Effc; metrics class rankings in column order TBF Cpx FC
FS IDB.)

No. Subfunction        | Attributes          | Metrics Classes
Detect
 1  Plumes             | H H H H M           | H H H H H
 2  Coldbodies         | H H H H             | H H H H H
 3  RF                 | H H H H M M M       | H H H H H
Identify
 4  Resolve Obj.       | H H H M M           | H H H H H
 5  Discr              | H H H M M           | H H H H H
 6  Assess Kills       | H H H M M M         | H H H H H
Track
 7  Correlate          | H H H H M           | H H H H H
 8  Initiate           | H H H H M           | H H H H H
 9  Estimate           | H H H H M M         | H H H H H
10  Predict I&I        | H H H M M           | H H H H H
Communicate
11  Interplatform      | H H H H M H         | H H H H H
12  Ground-Space       | H H H M M M M       | H H H H H
13  Ground             | M M M M M H M H     | M M M M M
Assess
14  Threat             | H H H               | H H H H H
15  SDS                | H H H               | H H H H H
Wpn Control
16  SBI Asgn&Ctrl      | H H H H M           | H H H H H
17  GBI Asgn&Ctrl      | H H H M H M         | H H H H H
18  SBI Guide&Ctrl     | H H H H H           | H H H H H
19  GBI Guide&Ctrl     | H H H M H M         | H H H H H
Platfm Mgt
20  Cmd Env Ctrl       | H M H M M H M M     | H H H H H
21  Ctrl Onbd Env      | H H H H M           | H H H H H
22  Cmd Attit&Pos      | H M H M M H M M     | H H H H M
23  Ctrl Attit&Pos     | H H H H M           | H H H H H
24  Sense Status       | H H H M M           | H H H H M
25  Assess Status      | H M H M M H M M     | H H H H M
26  Cmd Reconfig       | H M H M M H M M     | H H H H M
27  Reconfigure        | H H H H M           | H H H H H
Spprt Dev
28  Tools              | M M M M M M M       | M M M M M
Simulate
29  HWIL               | M H H M M M M       | H M H M M
30  Demonstration      | M H M M M M M M     | H M H M M
Spprt Acqu
31  Developer Test     | M M M M H H H H M   | M H M M M
32  Developer Env      | H M H M H M M       | H H H H H
33  Factory Test       | M M M H H H H       | M H M M M
34  Acceptance Test    | M M M M M M         | M M M M M
Spprt Mgt
35  MIS DB Maint       | M M H M H H H H M   | M H M M M
36  Track Mgt Info     | M H H H H H M       | M H M M M

METRIC CLASSES: "TBF" = Time Between Failures   "Cpx" = Complexity
                "FC"  = Failure Count Models    "FS"  = Fault Seeding
                "IDB" = Input Domain Based


Table 1.5-3 Metrics Ranking Derivation Rules

ATTRIBUTE RANKING             METRIC CLASS/MODEL RANKING

<RELIABILITY "RELI">    ==>   <TIME BETWEEN FAILURES "TBF">
                              & <COMPLEXITY "CPX">
                              & <FAILURE COUNT "FC">
                              & <FAULT SEEDING "FS">
                              & <INPUT DOMAIN BASED "IDB">

<SURVIVABILITY "SURV">  ==>   <TBF> & <FC>

HIGH "H"                ==>   H
MEDIUM "M"              ==>   M
H & M                   ==>   H
H & H                   ==>   H
M & M                   ==>   M
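The derivation rules above are mechanical and can be sketched directly in code. The attribute-to-class mapping below covers only the two attributes the table spells out; the other attributes would be added in the same style.

```python
ATTRIBUTE_TO_CLASSES = {
    "RELI": ["TBF", "CPX", "FC", "FS", "IDB"],  # reliability drives all five
    "SURV": ["TBF", "FC"],                      # survivability drives two
}

def combine(a: str, b: str) -> str:
    """Table 1.5-3 combination rules: H&M -> H, H&H -> H, M&M -> M."""
    return "H" if "H" in (a, b) else "M"

def metric_class_ranks(attr_ranks: dict) -> dict:
    """Derive metric-class rankings from H/M attribute rankings."""
    ranks: dict = {}
    for attr, rank in attr_ranks.items():
        for cls in ATTRIBUTE_TO_CLASSES.get(attr, []):
            ranks[cls] = combine(ranks[cls], rank) if cls in ranks else rank
    return ranks

# Example: a subfunction ranked H for reliability and M for survivability
ranks = metric_class_ranks({"RELI": "H", "SURV": "M"})
```

Because H dominates in every combination rule, a single "high" attribute is enough to rank a metric class "high," which matches the preponderance of H entries in Table 1.5-2.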


Two classes of metrics assume access to source code: the complexity models (Halstead, McCabe, Woodward, etc.) and the Mills fault-seeding model, which also requires alteration of the source or the software adaptation data (parameters, etc.). The failure-sensitive (TBF and FC), input domain (IDB), and many productivity metrics models are independent of the source code and do appear applicable to hybrid development processes.
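For instance, computing McCabe's v(G) requires a control-flow graph extracted from the source; once the graph is available, the measure itself is a one-line formula, sketched below with arbitrary example graph sizes.

```python
def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
    """McCabe's v(G) = E - N + 2P for a control-flow graph with E edges,
    N nodes, and P connected components (one per analyzed routine)."""
    return edges - nodes + 2 * components

# Example: a routine whose flow graph has 9 edges and 8 nodes
v_g = cyclomatic_complexity(9, 8)  # v(G) = 9 - 8 + 2 = 3
```

A straight-line routine (no decisions) gives v(G) = 1, and each binary decision adds one, so v(G) counts independent paths through the code, which is why source access is unavoidable for this class of metric.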

Other development process implications are discussed next in regard to the software development life cycle (SDLC).

1.6 LIFE CYCLE

1.6.1 Life Cycle Models

This section on life cycles has a fourfold purpose:

a. To establish a reference life cycle to support the categorization of metrics, and the particular phase with which they are associated or used;

b. To establish a foundation and reference point from which subsequent and different life cycle forms can be derived or related;

c. To establish a reference model that can be used in the identification of new metric forms and types, and the phase in which they will be employed;

d. To provide the basis for establishing new metrics methodologies and usage emphasis.

To support the categorization of metrics, a reference or standard life cycle required identification, together with its identified phases. The software life cycle used is that defined in [MIL-STD-2167A]. A number of other life cycle variations were found (RADC878A, IEEE Std. P982.2/D6, [BOEH81]) and examined. Life cycles were found to vary in the number and definition of phases. However, these differences for the most part were found to be organizational, with the content essentially the same, as Tables 1.6-1, -2, and -3 illustrate. Life cycle variations were minor, and all of these models can be mapped into each other with relative ease, as well as into the waterfall model of TR-9033-1.

Newer life cycle models (i.e., spiral, technology, iterative) that more appropriately support complex development activities such as prototyping and reusability paradigms were not considered for evaluation. The newer models have not been used to the extent that the classical waterfall model has, and thus not much experience and data are available. However, their iterative nature ("build a little, test a little, correct/adjust, repeat the process again"), coupled with the formality of some of the prototypes used in this iterative process, is more appropriately suited to modern-day developments. Large and complex system developments that must endure changing requirements, as well as undergo evolutionary or incremental changes, must be supported by iterative processes that provide the ability to revisit and reexamine design incursions and baseline changes. It is, therefore, recommended that a more in-depth examination of iterative life cycle models be made in the context of this report.


Items a. and d. are considered within the scope of this task; items b. and c. are outside the scope of this effort. The basis for these scope statements is the pragmatic task emphasis of identifying existing and available software and system measurement, and variably reliable predictors of high utilitarian value to managers and technical personnel.

A few observations about the referenced software development life cycles contained in Tables 1.6-1, -2, and -3:

- User and developer bidirectional communications and information flow are not well supported (e.g., human engineering and software engineering). This is indicative of the lack of feedback paths within the life cycles required to resolve deficiencies, issues and problems as they arise;

- Events and activities are sequential in nature, with very little support for iterative processes. The 2167A life cycles support prototyping and design reusability. Iterative processes are required to tune or refine prototypes as design information evolves and is elaborated upon;

- Resulting products are limited and constrained. The varying degrees of prototype and reuse formalisms will require tailoring of existing specifications, as well as new ones. Additionally, associated design reviews will also require customizing (e.g., reuse preliminary design review, prototype specification design review).

More sophisticated configuration, delivery and requirements packaging, such as incremental development, incremental deployment, evolutionary development, transactional threading, and technology insertion, have given rise to newer, less well defined models. Models to support various advanced life cycle requirements have had slow acceptance (formal prototype, executable prototype, spiral, automation-based model).

The applicability of each metric to the life cycle phases must be clearly identified. This is essential in planning for effective visibility and predictability in advance of development. A metric used to assess code structure is one that is invoked late in the development life cycle, where much time and resources have already been consumed (post facto). A metric or predictor used in the early or initial life cycle phases (i.e., requirements/design synthesis) can provide anticipatory or predictive insight before long-term resources are committed, when it is much more cost-effective [BOEH81] to make design tradeoffs and correct problems.


TABLE 1.6-1

COMPARING MIL-STD-2167A & RADC-TR-87-171

MIL-STD-2167A                     RADC-TR-87-171
No. Phase                         No. Phase

1   S/W Reqts. Analysis           1   S/W Reqts. Analysis
2   Preliminary Design            2   Preliminary & Detailed Design
3   Detailed Design
4   Coding & Unit Test            3   Coding & Unit Test
5   CSC Integration & Test        4   CSC Integration & Test
6   CSCI Testing                  5   CSCI Testing
7   System Integ. & Testing       6   System Integ. & Testing
8   Oper. Test & Evaluation       7   Oper. Test & Evaluation
9   Deployment & Distribution     8   Production & Deployment

TABLE 1.6-2

COMPARING MIL-STD-2167A & [BOEH81] WATERFALL PHASES

MIL-STD-2167A                     [BOEH81] WATERFALL LIFE CYCLE
No. Phase                         No. Phase

1   S/W Reqts. Analysis           1   Software Reqts./Validation
2   Preliminary Design            2   Product Design/Verification
3   Detailed Design               3   Detailed Design/Verification
4   Coding & Unit Test            4   Code/Unit Test
5   CSC Integration & Test        5   Integ./Product Verification
6   CSCI Testing
7   System Integ. & Testing       6   Implementation System Test
8   Oper. Test & Evaluation       7   Op's & Maint./Revalidation
9   Deployment & Distribution


TABLE 1.6-3
COMPARING RADC-TR-87-171 & [BOEH81] WATERFALL PHASES

    RADC-87 LIFE CYCLE                     [BOEH81] WATERFALL LIFE CYCLE
    No.  Phase                             No.  Phase
    1    S/W Reqts. Analysis               1    Software Reqts./Validation
    2    Preliminary & Detailed Design     2    Product Design/Verification
                                           3    Detailed Design/Verification
    3    Coding & Unit Test                4    Code/Unit Test
    4    CSC Integration & Test            5    Integ./Product Verification
    5    CSCI Testing
    6    System Integ. & Testing           6    Implementation System Test
    7    Oper. Test & Evaluation           7    Op's & Maint./Revalidation
    8    Production & Deployment

As indicated in Subtask 1, TR-9033-1, approximately 70% of all "software errors" actually occur early in the development cycle. This gives great importance to identifying and implementing those metrics which can be applied in the early design phases. An examination of a well defined characteristic life cycle model (e.g., RADC's) as presented in Table 1.6-4, with its associated reliability measure model (Table 1.6-5), reveals the following:

- The identification of a prototyping phase and an extensible, reusability-dependent life cycle (i.e., reusable architecture, design, specifications and code) requires the identification of new metrics or the redefinition of others. Thus, the entry for reuse metrics (Table 1.6-4) would also appear in the preliminary and detailed design phase, while a new entry would be made in this same phase for prototyping.

- Secondly, the associated reliability measurement model of Table 1.6-5 would require similar adjustments in its metrics formula representation.

- Thirdly, a shift in emphasis would occur, from estimation to predictive metrics, if emphasis were placed on prototyping a system in order to obtain early visibility into the design and ferret out initial design issues.

The impact on the reliability model's predictive and estimation metrics relationships (Table 1.6-5; see the reference for formula details) can be altered significantly if new terms, such as NR and NP, are added to this established model.

Furthermore, in both the RADC and IEEE models, predictive metrics (category 1) used in the early life cycle phases are the ones where the least amount of information is available and where formal relationships are least understood.

"it

1 -2,

...... .. - - .==.=,lINfill lWA M

Page 33: THE ANALYTIC SCIENCES CORPORATION · THE ANALYTIC SCIENCES CORPORATION TR-9033-2 ... collect-ion of extensive metrics data from industry, Government and academic urces, (2) evaluaton

ITHE ANALYTIC SCIENCES CORPORATIONI

" _ x ×x x x ×

*X XXX XXI> x xxx.., iI X XXX X

I X .XXX,.X

I U) 'Ba,:

cS

i -)

.. .. . .... I . I l l I I I I I I

-I0:xxx

-I - - - - - - - - - - - - -II ii) a xxIu oI

----! ---!!!-- ----------- &

F cCc x x1-24


TABLE 1.6-5
SOFTWARE RELIABILITY MEASUREMENT MODEL
RADC-TR-87-171 PREDICTIVE AND ESTIMATION METRICS

PART 1: PREDICTIVE METRICS

    Application Type                            A
    Development Environment                     D
    Software Characteristics                    S
      Requirements and Design Representation    S1
        Anomaly Management                      SA
        Traceability                            ST
        Quality Review Results                  SQ
      Software Implementation                   S2
        Language Type                           SL
        Program Size                            SS
        Modularity                              SM
        Extent of Reuse                         SU
        Complexity                              SX
        Standards Review Results                SR

    Rp = A x D x S, where
      S  = S1 x S2
      S1 = SA x ST x SQ
      S2 = SL x SS x SM x SU x SX x SR

PART 2: ESTIMATION METRICS

    Failure Rate During Testing                 F
    Test Environment                            T
      Test Effort                               TE
      Test Methodology                          TM
      Test Coverage                             TC
    Operating Environment                       E
      Workload                                  EW
      Input Variability                         EV

    RE = F x T during testing, where T = TE x TM x TC
    RE = F x E during OT&E, where E = EW x EV
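The multiplicative structure of the model above can be sketched directly. The function names below are ours, and the factor values are hypothetical placeholders, not calibrated multipliers from RADC-TR-87-171:

```python
# Sketch of the RADC-TR-87-171 multiplicative reliability model.
# All factor values below are hypothetical placeholders; calibrated
# multipliers come from the tables in the report itself.

def predicted_fault_density(A, D, f):
    """Rp = A x D x S, with S = S1 x S2, S1 = SA x ST x SQ,
    S2 = SL x SS x SM x SU x SX x SR."""
    S1 = f["SA"] * f["ST"] * f["SQ"]
    S2 = f["SL"] * f["SS"] * f["SM"] * f["SU"] * f["SX"] * f["SR"]
    return A * D * S1 * S2

def estimated_failure_rate(F, TE, TM, TC):
    """RE = F x T during testing, where T = TE x TM x TC."""
    return F * TE * TM * TC

# Neutral multipliers (1.0) leave the baseline unchanged; values
# below 1.0 reduce the predicted fault density.
unity = dict(SA=1.0, ST=1.0, SQ=1.0, SL=1.0, SS=1.0,
             SM=1.0, SU=1.0, SX=1.0, SR=1.0)
print(predicted_fault_density(1.0, 1.0, unity))   # 1.0
print(estimated_failure_rate(2.0, 0.5, 1.0, 1.0)) # 1.0
```

The point of the structure is that each factor acts as an independent multiplier on a baseline, so a single poor factor degrades the entire prediction.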


Focusing on MIL-STD-2167A thus establishes the need for the definition of a more extensive and formal life cycle that can address the activities and processes for prototyping and reusability, among others.

1.6.2 Life Cycle Experience Classification

Table 1.6-6 represents an experience classification matrix of measures and the life cycle phase in which their use is appropriate. The table divides the measures into three categories:

Category 1 - identifies measures that have been formalized (e.g., by a mathematical foundation) and have limited or insufficient operational validation

Category 2 - identifies measures that have been formalized and have moderate experience validation

Category 3 - identifies measures that have been formalized and have an extensive experience basis.

The table serves to identify measures (category 1) that may have a higher utilitarian value than the currently utilized metrics of category 3 [IEEE P982] that are well understood and industry recognized (e.g., McCabe, Halstead). It is interesting to note that category 3 measures fall into the later phases of the life cycle, further supporting the claim that they are, for the most part, post-facto measures (i.e., occurring from the coding phase to the operations and maintenance phase) applied where significant resources have already been consumed and committed. While category 3 measures represent approximately 20% of the total and are supported by the majority of tools that exist in the marketplace, category 1 measures account for approximately 50% and are not supported by many tools and environments. The remainder (approximately 30%) fall into category 2.

A contributing factor to the lack of rigorous metrics information in the early phases of the life cycle is the fact that formal notation to support requirements and design synthesis is, for the most part, not available or utilized within the industry. The sparse use of formal notation within the life cycle reveals a serious technology and communications gap between the systems and software engineering communities. Use of a formal syntax (e.g., BNF notation) or system design language, with the same level of formalism as those that exist in the implementation phase of the life cycle (i.e., a programming language), has the benefit of being machine processable and analyzable.

The coexistence of both a formal system design language and a programming language provides the basis for applying complexity, efficiency, reliability, and similar measures in a more consistent and predictive manner. The existence of a formal system design language allows an early assessment of design integrity and completeness that can in turn be used to predict code and structural complexity, and ease of integration. Formal notation would provide metrics measurements and models with a more extensive and effective range of application.


Table 1.6-6: SOFTWARE METRICS EXPERIENCE CLASSIFICATION
(The original matrix marks each metric against the phases Conc, Rqmts, Des, Implem, Test, Intg/Test, and O&M; the per-phase x-marks are not legible in this copy.)

Part 1: LIGHT EXPERIENCE METRICS BY LIFE CYCLE PHASE (Category 1):
Cumulative Failure Profile; Functional/Modular Test Coverage; Defect Indices; Error Distributions; S/W Maturity Index; Number of Entries/Exits per Module; Graph-Theoretic Complexity for Architecture; Design Structure; Software Purity Level; Requirements Compliance; Data or Information Flow Complexity; Residual Faults Count; Testing Sufficiency; Required Software Reliability; Software Release Readiness; Test Accuracy; Independent Process Reliability; Combined HW/SW Operational Reliability

Part 2: MODERATE EXPERIENCE METRICS BY LIFE CYCLE PHASE (Category 2):
Fault Density; Cause & Effect Graphing; Manhours per Major Defect Detected; Number of Conflicting Rqmts; Minimal Unit Test Case Determination; Run Reliability; Estimated Number of Faults Remaining; Test Coverage; Reliability Growth Function; Software Documentation & Source Listings; Completeness; System Performance Reliability

Part 3: GREATER EXPERIENCE METRICS BY LIFE CYCLE PHASE (Category 3):
Defect Density; Requirements Traceability; Software Science Measures; Cyclomatic Complexity; Mean Time To Discover The Next K Faults; Failure Analysis Using Elapsed Time; Mean Time To Failure; Failure Rate


1.6.3 Summary

Paragraphs 1.6.1 and 1.6.2, together with the other sections of this report, are intended to provide the basis for the following:

a. Much more metric information and formalism is concentrated on the later phases of the life cycle (i.e., from coding/implementation to the operations and maintenance phases). The reference section provides additional strong support for this statement.

b. The cost to correct errors or deficiencies in the later phases increases by at least one to two orders of magnitude. This is attributable to the consumption of resources and labor-intensive activities that have occurred by the time that implementation or coding is reached [BOEH81].

c. The coding and related phases (e.g., CSC phase) of the life cycle are the least significant cost drivers. It is recognized, thus specific references are not required, that maintenance is the costliest phase, as a result of redesign, requirements changes, and poor design visibility and specifications.

d. Current development efforts that have focused on the early phases of the life cycle, predictive metrics, and the early detection of problems have had very good productivity (cost and schedule) and have produced quality software with fewer catastrophic errors. IBM's Space Shuttle Program, NUSC's Submarine Combat System, the Unisys Trident Submarine Navigation Program, and Teledyne Brown's SDI SIE and N-SITE programs are examples. Though these efforts are few in number, a consistent theme is emerging.

Indications are that focusing on predictive/early life cycle use of metrics, and on the early life cycle phases themselves, will provide a better quality and more cost-effective design. Yet predictive metrics and the measures of category 1 (Table 1.6-6) are the ones with the least experience and understanding.


2. ANALYTIC EVALUATION OF SOFTWARE METRICS

2.1 FEATURE ANALYSIS

As outlined in Subtask 1, TR-9033-1, several software domains can be constructed using combinations of characteristics and factors applied to 36 SDS subfunctions. An assessment of how each applicable software metric can be applied within each SDS subfunction will be presented. Each table in Appendix B identifies the SDS subfunction and the nine software attributes, along with their relative rankings as presented in Subtask 1, TR-9033-1, for that subfunction. Each relative ranking is assigned a score. The scores are arbitrarily assigned a value from 1, for low, to 5, for high.

A vector is created, for each identified metric, using the scores assigned from the relative rankings given in TR-9033-1. For example, the first subfunction identified is "Detect Plumes". Associated with that subfunction are the following attributes with their relative rankings. Beside each relative ranking is its associated score.

    Quality Attribute     Relative Ranking     Score
    Reliability           High                 5
    Survivability         High                 5
    Integrity             High                 5
    Efficiency            Moderate             3
    Throughput            High                 5
    Usability             Low                  1
    Maintainability       Low                  1
    Portability           Low                  1
    Reuse                 Low                  2

Taking the McCabe metric from Table 1.3-2, note that this metric is used in measuring reliability and maintainability. For Detect Plumes, the need for reliability as defined in TR-9033-1 is high, while the need for maintainability is low. Since these are the only two attributes measured by this metric, the following vector is created:

Detect Plumes
TR-9033-1 Rankings

              Reli  Surv  Integ  Thru  Usab  Main  Port  Reus  Effc
               H     H     H      H     L     L     L     L     M
    McCabe     5     0     0      0     0     1     0     0     0

Recognize that each subfunction has a different set of attribute rankings and therefore creates a different vector set.
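The vector construction described above can be sketched as follows. The score table, attribute ordering and helper name are our own conveniences; the rankings and McCabe's coverage (reliability and maintainability) follow the Detect Plumes example:

```python
# Sketch of the attribute-vector construction described above.
# SCORE, ATTRS and the helper name are our own; the rankings and the
# McCabe coverage (reliability and maintainability) follow the
# Detect Plumes example.

SCORE = {"High": 5, "Moderate": 3, "Low": 1}
ATTRS = ["Reli", "Surv", "Integ", "Thru", "Usab",
         "Main", "Port", "Reus", "Effc"]

def metric_vector(rankings, measured):
    """Score each attribute the metric measures; zero elsewhere."""
    return [SCORE[rankings[a]] if a in measured else 0 for a in ATTRS]

detect_plumes = {"Reli": "High", "Surv": "High", "Integ": "High",
                 "Thru": "High", "Usab": "Low", "Main": "Low",
                 "Port": "Low", "Reus": "Low", "Effc": "Moderate"}

# McCabe measures reliability and maintainability only.
print(metric_vector(detect_plumes, {"Reli", "Main"}))
# [5, 0, 0, 0, 0, 1, 0, 0, 0]
```

Repeating this for each metric against each of the 36 subfunction ranking sets yields the vectors tabulated in Appendix B.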

3"2-1 %aW% U'r* IIW

Page 39: THE ANALYTIC SCIENCES CORPORATION · THE ANALYTIC SCIENCES CORPORATION TR-9033-2 ... collect-ion of extensive metrics data from industry, Government and academic urces, (2) evaluaton

THE ANALYTIC SCIENCES CORPORATION

Using the McCabe metric for another SDS subfunction, e.g., "Command Environment Control", the vector is different, and the metric appears more relevant to this subfunction than to Detect Plumes.

Command Environment Control
TR-9033-1 Rankings

              Reli  Surv  Integ  Thru  Usab  Main  Port  Reus  Effc
               H     M     H      M     M     H     M     L     M
    McCabe     5     0     0      0     0     5     0     0     0

The McCabe metric, though still deficient in seven of the nine attributes, matches the requirements of the Command Environment Control subfunction better, since it would be able to measure 2 of the 3 attributes designated as a high need.

The reader should note the large number of zeroes included in the vectors shown in Appendix B. This indicates that the established metrics do not support many of the SDS-required attributes. It appears that much work still needs to be accomplished in creating metrics that address those attributes. One method of attacking this deficiency would be to use an emerging class of metrics designated as multi-attribute metrics. Some pioneering work is currently ongoing at George Mason University and other academic institutions around the country. In fact, the National Science Foundation sponsors a Special Interest Group on Multi-Criteria Decision Making (MCDM) which meets periodically to discuss the state of the art in software measurement.

An alternative approach would be to investigate the feasibility of creating metrics for those attributes not currently addressed. After those metrics are created, a composite (or collective) set of metrics could be applied to each SDS subfunction. The greatest payback would be achieved by applying those composite or multi-attribute metrics that address the attributes with high or moderate relative rankings. If funds still remained in the quality assurance purse after covering the higher-ranked attributes, then the attributes designated as low ranking could be addressed.

Caution must be exercised in interpreting the values (scores) in the created vectors. No one should judge that one metric is "better" than another because a vector value is larger, e.g., 5 vs. 1. Remember that the scores are highly dependent on the relative ranking derived in TR-9033-1 for each SDS subfunction. Even the same metric can have several different vectors, depending again on the SDS subfunction being evaluated. The assessment that can be made for a vector with higher scores is that the application of that metric to its specific SDS subfunction will provide information about an attribute assessed, in TR-9033-1, to be more critical. This report does not claim to rank the metrics. It attempts to show where specific metrics can best be applied effectively across all 36 SDS subfunctions.

The well-defined metrics analyzed in this section are generally applied late in the software development life cycle (SDLC). They are normally classified as estimation-type metrics rather than predictive. The current trend is to develop and apply predictive metrics early in the SDLC to aid not only program managers, but also development personnel, in identifying


unfavorable trends early in development, where corrective action can be taken with minimum cost and schedule impacts. As previously presented in Table 1.6-4, the more research and development done to develop predictive metrics for the preliminary and detailed design phase of the life cycle, the better the control will be over future development efforts.


3. EXPERIENCE ASSESSMENT

A number of major programs were identified as potential sources that would provide field experience and relevant insight into software metrics. Several programs were eliminated initially due to their states of maturity or information availability. NASA's Space Station initiatives are in their initial phases, thus a shift to other programs with a greater experiential base was merited (e.g., the NASA/IBM Shuttle program). However, that is not to say that the Space Station program does not have a quality or metrics program. Some of the RICIS literature and effort clearly focuses on mission- and safety-critical software and related issues for the Space Station.

The Joint Tactical Fusion program has a viable metrics initiative. Though the requested information is forthcoming, the classified nature of the program precludes the screening of relevant metrics information in time for this report. However, the models used and metrics initiatives are consistent with and support the conclusions reached in this report.

Formally instituted metrics programs on major systems acquisitions are not readily identifiable, with the exception of those noted in the following sections. For the most part, metrics efforts are contained within larger quality assurance programs or exist as smaller independent research initiatives within DoD. However, where metrics programs have been highlighted as part of major program acquisitions, much success has followed those developments/acquisitions.

3.1 THE NAVAL UNDERWATER SYSTEMS COMMAND (NUSC)

The NUSC facility at Newport, Rhode Island, has been using a suite of software metrics to manage submarine combat system acquisitions for the last four years. Dr. John Short, Dept. Manager, was instrumental in setting the management guidance for a cohesive method of contractor oversight using software metrics throughout the life cycles of each of the emerging developments.

The management concept at work is risk management: to manage project and resource risks by assessments, predictions, validations and risk mitigations. The process starts with RFP development and proceeds apace with the life cycle. Key players within NUSC and the contractors are given training and briefings by the NUSC software metrics staff.

The NUSC database and experience are focused on the Navy Submarine Combat System Applied Software Metrics Program. NUSC has approximately 10 million source lines of code (MSLOC) in process, with projects varying between 0.5 and 4.0 MSLOC.

The NUSC software metrics program applications have initially centered on CMS-2 language applications, with two recent exceptions, the CCS MK 2 and AN/BSY-2 programs, which use Ada and MIL-STD-2167. Key tools used in the NUSC program are LOTUS 1-2-3 as the spreadsheet for collection of such items as faults, problem trouble reports (PTRs), and PTR testing, supported by an ORACLE database; at least 4 development models (SLIM, COCOMO, SECOMO, SASET); 2 reliability models (MUSA, DUANE); a complexity analyzer and code analyzer (AdaMat); a variety of PC-based management tools; and other non-metrics tools. In summary, the Navy's program is focused on applied, tailorable and practical applications-oriented metrics. Although the implementation phase is monitored and evaluated, initial life cycle phases are strongly emphasized. The collection and analysis of problem trouble reports, together with their statistical evaluation (rate of occurrence, rate of resolution, resolution-to-occurrence correlation), plays a critical role in


predicting software quality for NUSC. The effort has potentially significant relevance to the SDI initiative, and parts and processes of their program may be exported/reused directly to become cornerstones of SDI metrics/quality programs.

The submarine systems environment has several characteristics similar to SDI component systems, including austere survivability and reliability requirements. In certain cases, the environment and operating conditions may be even more severe, since maintenance corrections and fault repairs during operational conditions at sea are not allowed. In many instances, reliability and MTBFs are pushed to the limit. Large megabit transmission links to effect downloading of new or enhanced software versions, as may be found in SDI space-based subsystems, are non-existent. The information exchange with NUSC is in its initial phase, and further technical interchanges are envisioned.

3.2 NAVAL SURFACE WARFARE CENTER - DAHLGREN

Dr. William Farr of the NSWC Strategic Systems Dept. has been active on the IEEE Software Reliability Measurement Working Group Steering Committee and is a contributor to the IEEE draft standard P982.1 committee. Dr. Farr has developed a package of 8 metrics models dubbed "SMERFS" (Statistical Modeling and Estimation of Reliability Functions for Software) and last reported over 50 different users, one of which, IBM at NASA-Johnson Space Flight Center, is discussed next. (A module package with usage documents has been received; refer to Section 5.5.2.)

3.3 IBM AT NASA-JOHNSON SPACE FLIGHT CENTER

David Hamilton of IBM at NASA in Houston has been a SMERFS user and reports highly satisfactory results using the SMERFS-based Schneidewind Model. He has had one year's use of the SMERFS package. It should be noted that SMERFS is a post-facto metrics application.

The IBM Space Shuttle Program's "On Board Primary Software Reliability Prediction" effort source information was examined for insight into the metrics and models used. Paraphrasing their approach: "The emphasis of this program centers on providing good software via the removal of failure mode causes based on the early identification of design flaws at the 'front end' of the process (i.e., requirements and design), rather than 'defense (e.g., redundancy)' against them." A key IBM approach to detecting software errors is to use statistical modeling of detection history data in a configuration management database. The methodology is to apply SMERFS, a multi-model interactive computer program, to appropriate discrepancy data in the database. Model validation centers on predicting reliability. Much of the data used for analysis is derived from previously developed software that is repeatedly analyzed, catalogued and statistically evaluated. The SMERFS models are then used to find corresponding "form and fit representations". The model that best fit their data is the Schneidewind Model, which uses exponentially decreasing error detection rates and a Poisson failure detection process.
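The exponentially decreasing detection-rate assumption behind the Schneidewind model family can be sketched as follows. The functional form shown is the standard one for this family; alpha and beta are made-up illustrative values, not fitted Shuttle parameters:

```python
import math

# Sketch of the exponentially decreasing detection-rate assumption
# behind the Schneidewind model family used with SMERFS. alpha and
# beta are made-up illustrative values, not fitted Shuttle parameters.

def expected_faults(alpha, beta, i):
    """Expected fault detections in test interval i (i = 1, 2, ...)."""
    return (alpha / beta) * (math.exp(-beta * (i - 1)) - math.exp(-beta * i))

def cumulative_faults(alpha, beta, t):
    """Expected cumulative detections after t intervals."""
    return (alpha / beta) * (1.0 - math.exp(-beta * t))

alpha, beta = 30.0, 0.25
# Each interval is expected to surface fewer faults than the last,
# which is what "exponentially decreasing error detection rate" means.
rates = [expected_faults(alpha, beta, i) for i in (1, 2, 3)]
assert rates[0] > rates[1] > rates[2]
print(round(cumulative_faults(alpha, beta, 10), 1))  # 110.1
```

Fitting alpha and beta to observed discrepancy counts, and then extrapolating the cumulative curve, is what allows remaining-fault and failure-rate predictions from the historical database.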

The validity of this approach, apart from a front-end emphasis, depends upon:

a. the application of software engineering methodologies;

b. well-defined processes;


c. performing labor-intensive and extensive multi-path activities (e.g., code-to-function/vice-versa correlation, backward chaining analysis);

d. "(a) software error history sufficiently analyzed to allow application of (the)reliability model".

Note: In the domain of SDI software, c. and d. are not practical due to the amount and complexity of the software, and the lack of historical data available relative to it. Many of the SDI software applications are being developed for the first time.


4. CAPABILITIES, LIMITATIONS AND DEFICIENCIES OF MODELS

The prioritization of SDS software metrics requirements appeared to be high for the Reliability, Survivability, Integrity, and Efficiency software attributes. There was moderate or medium priority for Maintainability, Portability, Usability, and Reusability, in that order. The assessment of software metrics maturity presents a different ordering: Efficiency, Reliability, and Maintainability being most mature, with considerably less maturity for Survivability, and negligible, if any, support for the Portability, Usability and Reusability attributes. The rest of this section summarizes the capabilities of the field of software metrics models to support each of these areas.

4.1 GENERAL LIMITATIONS

[CONT86] makes some general caveats after introducing the notion of a (composite) "Quality of Software" metric (1.2.2). The book gives the following warnings:

a. Like items must be measured and compared together ("apples with apples").

b. When metrics are moved from one environment to another, they should be re-calibrated.

c. Metrics are to support, not replace, management savvy.

d. Software metrics are poor personnel evaluators.

e. Any metric model's output is no better than its input.

f. A metrics program's cost should be less than its benefits.

4.2 RELIABILITY MODELS

Both the Time Between Failures (TBF) and the Failure Count models have had considerable evaluation and study. Recently, John Musa stated that failure count and time are the correct basis for reliability assessment and prediction, as opposed to fault or defect counting and projection [MUSA8903]. However, as stated in [IDA P-2132, 7.3], "Current software reliability technology suffers from some fundamental problems and limitations". This paper states that the convenient adoption of hardware reliability measures obscures the fundamental difference between hardware and software reliability: though hardware is subject to physical aging, software is not. Fresh analysis into the nature of software reliability and trustworthiness by Parnas suggests that a combinatorial/probabilistic indicator may be needed: the probability that no critical fault remains.

[IDA P-2132] also raises doubts that studies based upon non-critical fault rates in software targets of average reliability are deterministic toward predictions of criticality in highly reliable software. The IDA paper goes on to quote Goel at the IDA Testing and Evaluation Workshop that available approaches are too simplistic, and it casts the very pessimistic statement "Because of the fundamental limitations of current software reliability assessment methodologies ... further work on enhancing current methodolgies (sic) is unlikely to yield satisfactory results." (This assumes that the Parnas criterion is used -- determination of the probability of a failure-causing fault.)


This pessimistic cast of the IDA paper was presaged back in 1986 by Abdel-Ghaly, Chan and Littlewood in their review of ten reliability models. The authors pointed out that predictive quality is the only consideration for a user of software reliability models and that a good "model" is not sufficient to produce good predictions. They found that the inference rules and the predictive procedure were significant to the relative success of the various models [ABDE8609].

The authors present an approach for validating candidate models for a specific application. The results showed that there was no single "best buy" among them. The ten models were: Jelinski-Moranda, Bayesian Jelinski-Moranda, Littlewood, Bayesian Littlewood, Littlewood-Verrall, Keiller-Littlewood, Weibull Order Statistics, Duane, Goel-Okumoto, and Littlewood NHPP.
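To make the first entry on this list concrete: the Jelinski-Moranda model assumes N initial faults, each contributing an equal amount phi to the failure rate, so the rate steps down by phi with every fault removed. A minimal sketch with made-up parameters (the helper names are ours):

```python
import random

# Sketch of the Jelinski-Moranda model, the first entry in the list
# above: with N initial faults each contributing phi to the failure
# rate, the rate after (i-1) fixes is lambda_i = phi * (N - i + 1).
# N and phi are made-up values for illustration.

def hazard(N, phi, i):
    """Failure rate while awaiting the i-th failure (i = 1..N)."""
    return phi * (N - i + 1)

def inter_failure_times(N, phi, seed=0):
    """Draw the N exponential times between failures the model implies."""
    rng = random.Random(seed)
    return [rng.expovariate(hazard(N, phi, i)) for i in range(1, N + 1)]

N, phi = 20, 0.05
# The hazard declines linearly as faults are removed, so later
# inter-failure times tend to lengthen.
assert hazard(N, phi, 1) > hazard(N, phi, N)
print(len(inter_failure_times(N, phi)))  # 20
```

The other nine models differ chiefly in how the hazard evolves between failures and in the inference rules used to fit it, which is exactly where Abdel-Ghaly et al. found the predictive differences.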

4.3 COMPLEXITY/MAINTAINABILITY

More recent analysis of maintainability predictors, using such measures as program specification change rates, programmer skill levels, and volume of design documentation, has shown a considerable effect on error rates. Both [TAKA8901] and [GIBS8903] showed that structural differences do impact programmer performance. Specifically, system improvements result in considerable improvements in programmer performance. Gibson and Senn's study investigated this by asking three experienced professional programmers to perform three maintenance tasks on three functionally equivalent versions of a COBOL system that differed in complexity.

Six metrics were used in this study: Halstead's E, McCabe's cyclomatic complexity, Woodward's K, Gaffney's Jumps, Chen's MIN (Maximum Intersect Number), and Benyon-Tinker's Cx. The metrics reflected both the improvements in the system (in terms of ease of maintenance as system complexity is decreased) and in programmer maintenance performance. Halstead's E, McCabe's cyclomatic complexity, Woodward's K, and Gaffney's Jumps reflect a decrease in complexity with improvement in system structure (control flow complexity). Chen's MIN and Benyon-Tinker's Cx, on the other hand, reflect the offsetting increase in complexity due to IF nesting and number of modules.

This study was consistent with earlier work by Evangelist, and with work by Basili, Selby and Philips, investigating the validity of complexity measures as independent predictors of effort and/or psychological complexity [EVAN84, EVAN83, BASI83]. These investigators concluded that the metrics by themselves were no better than lines of code, but when the personnel being studied were held constant (a single programmer studied across multiple projects), there appeared to be some relative correlation with actual work and times to complete.

Since the metrics have been shown to be meaningful in studies where the work object is changed but the programmer performing the work is fixed, further work studying programmer variables is needed. A detailed problem-solving and work-trait profiling system may make it possible to account for significant programmer differences, and be a basis for increasing the validity of complexity metrics for maintainability predictions.


4.4 EFFICIENCY MODELS

The use of modeling to predict sequential software efficiency is quite mature. Typically a simulation language is used to model the intended software architecture. As development proceeds, more data is fed into the simulator, and the accuracy of the predictions improves. General-purpose simulators like Simscript and GPSS have been available for many years, as have texts and articles [ADKI7704], [SCHN7704].

However, as pointed out by the IDA team in their review of testing technology available for concurrent and real-time programs, the methodology is relatively new. One recent technology is the TAGS Simulation Compiler offered by Teledyne Brown. Another is Auto-G from Advanced Systems Architectures of England. A third is Statemate by i-Logix, an Israeli vendor. However promising these packages are, they require further investigation and validation.

4.5 EFFICIENCY/PRODUCTIVITY MODELS

Halstead's original "Software Science E" was a model to forecast programming effort based upon complexity and size work parameters. The COCOMO and SECOMO models are based on lines of code and other intermediate cost drivers. These models have been in use on many programs and should be considered mature [BOEH81]; however, David Card and others have found that the factors of change in the software maintenance environment may not be so predictable.
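For concreteness, the basic COCOMO effort and schedule equations from [BOEH81] (organic mode; the 32 KLOC project size below is hypothetical) can be evaluated directly:

```python
# Basic COCOMO, organic mode [BOEH81]:
#   effort   PM   = 2.4 * KLOC**1.05   (person-months)
#   schedule TDEV = 2.5 * PM**0.38     (months)
def basic_cocomo_organic(kloc):
    effort = 2.4 * kloc ** 1.05
    schedule = 2.5 * effort ** 0.38
    return effort, schedule

pm, months = basic_cocomo_organic(32.0)  # hypothetical 32 KLOC project
print(round(pm, 1), round(months, 1))    # roughly 91.3 PM over 13.9 months
```

Intermediate COCOMO, and SECOMO after it, multiply this nominal effort by cost-driver ratings; the maintenance results discussed next are a caution that such size-driven estimates transfer poorly to maintenance work.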

The maintenance cost model was studied in [CARD8807]. The results showed that the standard maintenance cost models based on lines-of-code measures proved inadequate in estimating maintenance cost. This is due to the different productivity rates of maintenance and the accumulation effect of program changes.

4.6 INTEGRITY/SECURITY MODELS

The existence of an integrity model is not obvious, but both the Air Force and Navy have developed systems security risk assessment methods [NEUG85, pg. 4-74, 75], which may be used to assess system integrity factors in terms of annual levels of loss. Both methods are fairly inaccurate, using "fuzzy metrics" (approximated orders of magnitude). This concept of fuzzy metrics may have some application to the assessment of integrity, as may other elements of the security models: vulnerability inventories, threat catalogues, service and data asset valuation, and time-based exposure projections. Fairly recent research into fuzzy systems [NEGO81] and modeling [NEGO87] theory has developed the mathematical foundations needed to refine such applied risk models, but much more study is needed.
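As an illustration only (not the Air Force or Navy method), an order-of-magnitude loss projection of the kind these fuzzy-metric assessments produce can be sketched as follows; the exposure pairs are invented:

```python
# "Fuzzy-metric" sketch: impact and frequency are estimated only to an
# order of magnitude, i.e., as exponents of ten (hypothetical values).
def annual_loss_exposure(assets):
    """assets: (impact_exp, freq_exp) pairs; a pair (e, f) means a loss
    of ~10**e dollars per event occurring ~10**f times per year."""
    return sum(10.0 ** e * 10.0 ** f for e, f in assets)

# A ~$10K loss ~10x/year plus a ~$1M loss ~once a decade
print(annual_loss_exposure([(4, 1), (6, -1)]))
```

The point of working in exponents is honesty about precision: the result is meaningful to an order of magnitude, no more.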

4.7 PORTABILITY, USABILITY, REUSABILITY

Models to support assessments of portability, usability and reusability are not available. This realization was also derived by the SPARTA team [SPART8903]. These attributes were ranked as medium in the aggregate rankings, so some assessment of their relative value for research and development is warranted.



4.8 COMPOSITE METRICS

Though both [CONT86] and [RADC878A] present a grand accumulation formula for assessing the many quality factors and arriving at a single coefficient, there is considerable concern that such a calculation would obscure the meaningfulness to seasoned managers. Another method is outlined in [GOIC81] and [SING87] -- Multicriteria Modeling and/or Fuzzy Multicriteria Modeling.

[SING87, pg. 1827] presents a good introductory example. For a software quality metrics application, refer to the discussions about the Naval Underwater Systems Center.
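The mechanics of the simplest multicriteria aggregation, a weighted sum over quality factors, are easy to show; the factors and weights below are hypothetical, and the fuzzy multicriteria methods in [SING87] layer membership functions and outranking relations on top of this idea:

```python
# Minimal weighted-sum multicriteria aggregation (hypothetical factor
# scores on [0, 1] and weights summing to 1; not the [CONT86] formula).
def composite_score(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[f] * weights[f] for f in scores)

scores  = {"reliability": 0.9, "maintainability": 0.6, "efficiency": 0.7}
weights = {"reliability": 0.5, "maintainability": 0.3, "efficiency": 0.2}
print(round(composite_score(scores, weights), 2))
```

The skepticism about single coefficients applies here as well: the aggregate hides which factor is weak, so the per-factor scores should be reported alongside it.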


5. AVAILABLE SOFTWARE METRICS SUPPORT

This section presents the Subtask 3 survey, features assessment, and recommendation of available software metrics tools and environments.

5.1 EXISTING TOOLS AND ENVIRONMENTS

An extensive list of tools and environments has been extracted from TBE's TASQ database for examination. A more extensive list of tools and environments reviewed, and others that are under review, is contained in Appendix C.

From this task's point of view, two categories of tools have been established: tools directly supporting a metric (e.g., McCabe, Halstead, Complexity) and ancillary support tools (i.e., tools that can support or assist in the collection or analysis of a metric). Ancillary support tools consist of the following categories:

- Program Management & Support Tools
- Computer-Aided Software Engineering Tools
- Linkers, Loaders, Debuggers
- Test Generators
- Spreadsheets
- Databases
- Environments

Thus, while spreadsheets such as LOTUS 1-2-3 can be extremely useful in collecting and classifying metrics and measurements, spreadsheets are not considered metrics tools.


5.2 ANALYTIC EVALUATION OF TOOLS/ENVIRONMENTS

A number of metrics tools have been identified:

Tool                                    Source

SMART                                   USA-CECOM/TBE
SMERFS                                  NSWC-Dahlgren/Wm. Farr
McCabe Metric                           Intermetrics
Halstead Metric                         Intermetrics
Complexity Measures Tool                EVB
Complexity Metric                       Computer Systems Design
Analyze                                 Autometric, Inc.
Complexity Measure PD                   SIMTEL 20
ADADL                                   Software Systems Design, Inc.
Byron                                   Intermetrics
Analyze                                 dof
Adamat                                  Dynamics Research Corp.
Complexity Analysis Tool                McCabe Associates
Ada Complexity Analysis Tool
  (ACAT), (McCabe)                      Teledyne Brown Engineering
AdaPIC                                  Arcadia Consortium
Logiscope                               Verilog
COCOMO                                  TRW-Redondo Beach
SECOMO                                  RADC-DACS

With the emphasis for the Strategic Defense Initiative to capitalize on modern software engineering methods and approaches, and the Ada Technologies initiatives, only a few of these tools will be discussed further.


5.2.1 Software Management and Reporting Tool (SMART)

The SMART package is targeted for deployment at the U.S. Army Communications-Electronics Command (CECOM) in July-August 1989. The first program it will be applied on is the Advanced Field Artillery Tactical Data System (AFATDS).

The software metrics goal is to implement the Software Management and Quality Indicators per AFSCP-800-43 and AFSCP-800-14 in an efficient, extensible architecture. The software metrics indicators and data management will be based on IBM PC-compatible machines using the popular "dBase" formats with the "Clipper" data base language system.

Table 5-1 shows some of the data tracked by SMART.

TABLE 5-1
OVERVIEW OF SMART SOFTWARE DATA

TYPE INDICATOR

Deviations
Waivers
Software Trouble Reports
Test Incident Forms
Engineering Change Proposals
Software Improvement Reports
Software Discrepancy Reports
Computer Resource Utilization
Software Development Manpower
Requirements Definition and Stability
Software Progress - Development and Test
Cost/Schedule Deviations
Software Development Tools
Completeness
Design Structure
Defect Density
Fault Density
Test Coverage
Software Maturity
Documentation


TABLE 5-2
LIST OF SAMPLE GRAPHICS FOR SMART

COMPUTER RESOURCE UTILIZATIONS (3)
    Primary Memory
    Secondary Memory
    CPU Utilization/P Utilization

SOFTWARE DEVELOPMENT MANPOWER (WBS)

REQUIREMENTS DEFINITION AND STABILITY (2 LEVELS)
    System; CSCI

DEVELOPMENT AND TEST
    Scheduled/Actual; CSCI & CSC

COST/SCHEDULE DEVIATIONS

SOFTWARE DEVELOPMENT TOOLS

COMPLETENESS
    Requirements %/month
    Implementation %/month

DESIGN STRUCTURE
    Modularity
    Complexity/Dependency

DEFECT DENSITIES
    Requirements
    Design
    Code

FAULT (FAILURE) DENSITY

TEST COVERAGE

SOFTWARE MATURITY

DOCUMENT TROUBLES (BY MODULE)


5.2.2 Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS)

SMERFS was developed several years ago as an aid in the evaluation of software reliability. In its original design it was targeted for mainframe and minicomputer environments. Since then it has also been adapted to operate on microcomputers, specifically IBM PC/XT compatibles.

The current version of SMERFS incorporates eight software reliability models: (1) Musa's Execution Model, (2) the Goel-Okumoto Non-Homogeneous Poisson Model, (3) the Adapted Goel-Okumoto Non-Homogeneous Poisson Model, (4) Moranda's Geometric Model, (5) Schafer's Generalized Poisson Model, (6) Schneidewind's Model, (7) the Littlewood-Verrall Bayesian Model, and (8) the Brooks-Motley Model.
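The flavor of these models can be seen in the Goel-Okumoto NHPP, whose mean-value function m(t) = a(1 - e^(-bt)) gives the expected cumulative failures by test time t. The parameter values below are assumed for illustration; SMERFS fits them to observed failure data:

```python
import math

def go_expected_failures(a, b, t):
    """Goel-Okumoto NHPP mean-value function m(t) = a * (1 - exp(-b*t));
    a = total expected failures, b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

a, b = 100.0, 0.05   # hypothetical parameters, normally fit by MLE
print(round(go_expected_failures(a, b, 10.0), 1))   # early in testing
print(round(go_expected_failures(a, b, 200.0), 1))  # approaching a
```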

SMERFS contains a driver which is claimed to make it machine independent. The driver is a subset of the American National Standards Institute (ANSI) specifications for the FORTRAN 77 compiler. Several user-selectable options are available within the driver and allow the system to be configured to produce better predictions, output plots, and catalogued output files. Currently SMERFS is operational on three main computer groups at the Naval Surface Weapons Center (NSWC), Dahlgren, VA: the CDC CYBER 170/875, the VAXcluster 11/785, and a large number of IBM-compatible PCs. Dr. William H. Farr of NSWC and Mr. Oliver D. Smith of EG&G Washington Analytical Services Center, Inc. both claim that transferring SMERFS to other computers should be very easily accomplished. [FARR8812]

Besides containing operating instructions within its interactive mode, two additional pieces of documentation are available for use with SMERFS. The two supplemental reports are: (1) the SMERFS Library Access Guide (NSWC-TR-84-371, Rev. 1), and (2) the SMERFS User's Guide (NSWC-TR-84-373, Rev. 1). These two publications allow a potential user to preview the system. Examples are provided throughout the User's Guide, allowing a potential user to acquire an overview of SMERFS processing. In addition, the guide also shows actual software reliability analyses performed on the CDC CYBER 170/875.

The SMERFS system shows tremendous potential for use in the SDI environment with a minimum of modification.

5.2.3 McCabe Complexity Tools

Tools based on McCabe complexity analysis are of limited use in the Ada domain, unless SDI applications use older implementation languages (e.g., COBOL, CMS-2, FORTRAN). Technical Report MC87-McCabe II-0003, Extending McCabe's Cyclomatic Complexity Metric for Analysis of Ada Software, [TBE8703] U.S. Army AMCCOM; Dover, N.J., under contract DAAA21-85-D-0010, identifies extensions that are required of McCabe's theorems if Ada, or another modern language, is the implementation language. McCabe complexity applications are limited to pre-Ada higher order languages that do not contain modern software engineering abstraction processes or language constructs such as packages, generics or tasking mechanisms. Thus, pre-Ada tools that fall into this category (which is most of them) will require updating.
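The base metric itself [MCCA7612] is simple: for a control-flow graph with E edges, N nodes, and P connected program units, v(G) = E - N + 2P. A minimal sketch over a hypothetical graph (one IF plus one loop):

```python
# Cyclomatic complexity v(G) = E - N + 2P over an explicit edge list.
# The Ada extensions discussed above must first reduce packages, generics
# and tasks to such flow graphs, which is the hard part this sketch skips.
def cyclomatic(edges, num_nodes, num_components=1):
    return len(edges) - num_nodes + 2 * num_components

# Hypothetical CFG, nodes 0-5: an if/else (1->2, 1->3) and a loop 4->1
edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 1), (4, 5)]
print(cyclomatic(edges, num_nodes=6))  # 7 - 6 + 2 = 3
```

The result, 3, matches the rule of thumb "decision points plus one" (one IF, one loop).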

The above referenced report has resulted in technical report MC87-McCabe II-0005, Modified A-Level Software Design Specification for the Ada Complexity Analysis Tool which Automates the Extended McCabe's Cyclomatic Complexity Metric, [TBE8704] U.S. Army AMCCOM; Dover, N.J., contract DAAA21-85-D-0010, establishing the requirements for an Ada Complexity Analysis Tool (ACAT) to support Ada implementations.

A number of Ada-based or -derived program design language (PDL) analysis tools have emerged over the past several years. Two of those listed, Byron and ADADL, are mature enough to be employed to perform code analysis completeness and consistency checking, cross referencing, structure checking, code verification completeness and correctness, and interface analysis.

ADADL has both a McCabe and an ADADL complexity metric for both the pseudocode and the Ada code of each program unit. Neither of these complexity metrics has been evaluated for degree of appropriateness.

These are estimation metrics tools used on pseudocode or Ada (late life cycle phase application). It should also be noted that a standard DoD Ada PDL does not and may never exist; thus PDL tool invocation is at the implementer's or application's discretion. Automatic code generation, or a shift in design emphasis to an SADMT [IDA8804A&B] or like representation, may obviate the need for such tools in the future.

5.2.4 Software Engineering Cost Model (SECOMO)

SECOMO is an interactive software cost estimation tool, based on the COCOMO cost model, for calculating the total technical and support manpower requirements of a Life Cycle Software Engineering (LCSE) Center. SECOMO is maintained and distributed by the Rome Air Development Center's Data and Analysis Center for Software (DACS) at a $50 charge.

SECOMO includes a "Care" cost limit for the fire-up phase of an LCSE Center.

Developed and maintained by the IIT Research Institute and RADC, it is kept current with the TRW/Barry Boehm COCOMO Users' Day recommendations.

Significant enhancements are:

- An Ada parameters set
- Pull-down menus and user efficiencies
- Site-timing parameters

Army Materiel Command sites which are users include:

- CECOM
- AVSCOM
- AMCCOM
- MICOM

Navy sites include: NSWC - Dahlgren; NUSC - New London; NCSC - Panama City, FL; NTS - Orlando.


5.2.5 Revised Enhanced Version of COCOMO "REVIC"

REVIC is another COCOMO enhancement, maintained by an Air Force center. The headquarters command at Kirtland AFB, New Mexico has developed an extensive set of project performance data which it uses to tune the REVIC model. Updates and enhancements to baseline COCOMO and to SECOMO are brought into REVIC. (No charge.) HQCMP/EPR, Kirtland AFB, NM 87117-5000.

The REVIC User Group "RUG" is chaired by Dr. George Hozier at the University of New Mexico.

The active users include many Air Force sites and contractors:

Air Force Commands
    ESD, RADC, ACS-Pentagon

Contractors
    The Aerospace Corporation
    Boeing
    Hughes
    GE
    TRW
    Lockheed
    Textron

5.3 EXPERIENCE ASSESSMENTS

Adamat, both versions of Analyze, and Logiscope are mature metrics tools for extensive complexity analysis. Logiscope, at this time, does not support Ada source code; the others do. Adamat appears to have (based on literature review) the most extensive set of metrics criteria. A partial list is identified:

Anomaly Management: Default initialization, basic loops containing a conditional exit or return, strong type checking and constraint checking, raising user-defined exceptions, range checking, etc.

Independence: Isolation of input/output routines, isolation of tasking statements and declarations, independence from system-dependent modules, access types, package system, use of length clauses, etc.

Modularity: Use of private and limited private types, single type declarations in package specifications, etc.

Self-Descriptions: Use of comments, use of identifier names, etc.

Simplicity: Boolean expressions, labels, decision points, branches, branch constructs, nesting levels, use of literals, procedure calls, etc.


System Clarity: Parenthesized expressions, no default mode parameters, access types declared private, loops-modules-blocks names, etc.
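To make the Simplicity criteria concrete, a toy scanner (not Adamat's actual algorithm; the Ada fragment is invented) can count decision points and the maximum nesting level:

```python
# Toy "Simplicity"-style scan of an Ada-like fragment: count decision
# points (if/loop) and the deepest nesting level reached.
def simplicity_stats(lines):
    depth = max_depth = decisions = 0
    for line in lines:
        s = line.strip().lower()
        if s.startswith("end if") or s.startswith("end loop"):
            depth -= 1
        elif s.startswith("if ") or s.endswith("loop"):
            decisions += 1
            depth += 1
            max_depth = max(max_depth, depth)
    return decisions, max_depth

snippet = [
    "for I in Buf'Range loop",
    "   if Buf(I) = 0 then",
    "      Count := Count + 1;",
    "   end if;",
    "end loop;",
]
print(simplicity_stats(snippet))  # 2 decision points, nesting depth 2
```

A production tool would of course work from a parsed syntax tree rather than line prefixes, but the counted quantities are the same.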

The TASQ and ISEC [TBE8609] tools/environment databases are extensive and robust (containing several thousand tools). Initial metrics queries have resulted in a sparse tool identification. A formal and more extensive categorization will be completed by mid-April, and will be categorized per the matrix contained in Appendix D for completeness. All tools identified in Appendix C will also be matrix-categorized as such. A metrics tools expansion list much beyond that identified in this section is not expected. Recent dialogue with Ada environment developers and tool vendors at such exhibits/expositions as TRI-Ada, the 7th Annual National Ada Conference, CASE EXPO, NSIA, etc., reveals that metrics tools are sparse and not the focus of their development efforts. This is understandable in light of the state and maturity of Ada technology. The latter may account for the individual DoD service initiatives aimed at software quality programs and tool initiatives such as those identified in this report and focusing on metrics. The SDI will in all probability require its own tailored metrics and quality control program similar to those of NUSC, NASA and AMCCOM. Much of the present developer/vendor effort is directed at environments, compiler maturation and efficiency, and related support tools (e.g., linkers, loaders, editors).

Many of the Ada developers at this time are looking for third-party vendors to provide them with complementary/synergistic metrics tools (Rational and Alsys are examples of such).

The emphasis on predictive metrics and the early life cycle phases has identified major metrics tool deficiencies in these areas -- such tools are virtually non-existent. Well-defined metrics, measurand relationships, and formalisms in the early phases supported by tools have yet to appear to provide the productivity impact they are expected to have. The AdaPIC tool set (a futures project) ongoing within the Arcadia consortium holds some promise, but these new tools have yet to be developed [WOLF8903]. Similarly, TBE's ACAT is under development.

Unfortunately, many of the AdaPIC tools will not address the early life cycle phases, but are aimed at complementing the emerging development environment. Specifically, more systems engineering metrics tools are required to support system synthesis and couple with the early software engineering processes.

5.4 REQUISITE ENVIRONMENT/TOOLSET FEATURES

As stated in Section 5.3, the proposed TASQ format (identified in the matrix of Appendix D) will be utilized to provide the next elaboration update of tools contained in Appendix C.


REFERENCES

ALPHABETICAL DOCUMENT INDEX

[ABDE8609] Abdel-Ghaly, Chan & Littlewood; Evaluation of Competing Software Reliability Predictions; IEEE Transactions, 9/86.

[ADKI7704] Adkins & Pooch; Computer Simulation: A Tutorial; Computer, 4/77.

[ALBR8411] Albrecht, A.J.; AD/M Productivity Measurement and Estimate Validation; IBM CIS&A Guideline 313; 11/1/84.

[ALBR8311] Albrecht & Gaffney; Software Function, Source Lines of Code and Development Effort Prediction: A Software Science Validation; IEEE Transactions, 11/83.

[ARMY8408] U.S. Army/Computer Systems Command; Software Quality Engineering Handbook; Subcontract No. 7202, 8/84.

[ARNO86] Arnold, Robert S.; Tutorial: Software Restructuring; IEEE Computer Society Press, 1986. Includes [YAU8011], [HARR8209].

[BALZ8311] Balzer, Cheatham & Green; Software Technology in the 1990's: Using a New Paradigm; Computer, 11/83.

[BASI8311A] Basili & Hutchens; An Empirical Study of a Syntactic Complexity Family; IEEE Transactions, 11/83.

[BASI8709] Basili & Rombach; Implementing Quantitative SQA: A Practical Model; IEEE Transactions, 9/87.

[BASI8703] Basili & Rombach; TAME: Tailoring an Ada Measurement Environment; Proceedings from 5th National Joint Ada Conference, 3/87.

[BASI8311B] Basili, Selby, Phillips; Metric Analysis & Data Validation Across Fortran Projects; IEEE Transactions; 11/83.

[BEHR8311] Behrens, Charles; Measuring the Productivity of Computer Systems Development Activities with Function Points; IEEE Transactions, 11/83.

[BEIZ84] Beizer, Boris; Software System Testing and Quality Assurance; Van Nostrand Reinhold Co., 1984.

[BELZ8603] Belz, Frank C.; Applying the Spiral Model: Observations on Developing System Software in Ada; Proceedings from 4th National Ada Conference, 3/86.


[BEND8703] Bendifallah & Scacchi; Understanding Software Maintenance Work; IEEE Transactions, 3/87.

[BOEH81] Boehm, Barry W.; Software Engineering Economics; Prentice-Hall, New Jersey, 1981.

[BOEH8810] Boehm & Papaccio; Understanding and Controlling Software Costs; IEEE Transactions, 10/88.

[BOWE8610] Bowen, John B.; Application of a Multi-Model Approach to Estimating Residual Software Faults and Time Between Failures; Quality & Reliability Engineering International, Vol. 3; 10/22/86.

[BRIT8811] Britcher, R.N.; Using Inspections to Investigate Program Correctness; Computer, 11/88.

[BROW8407] Browne, J.C.; Understanding Execution Behavior of Software Systems; Computer, 7/84.

[CARD8508] Card, Page & McGarry; Criteria for Software Modularization; Proceedings IEEE Conference on Software Engineering; 8/85. (see SEI section)

[CARD8602] Card, Church & Agresti; An Empirical Study of Software Design Practices; IEEE Transactions; 2/86. (see SEI section)

[CARD8707] Card, McGarry & Page; Evaluating Software Engineering Technologies; IEEE Transactions, 7/87.

[CARD8709] Card, Cotnoir, Goorevich; Managing Software Maintenance Cost & Quality; Proceedings IEEE Conference on Software Maintenance, 9/87. (see SEI section)

[CARD8812] Card & Agresti; Measuring Software Design Complexity; Journal of Systems and Software, 12/88. (see SEI section)

[CARD87] Card & Agresti; Resolving the Software Science Anomaly; Journal of Systems and Software, 1987. (see SEI section)

[CARD8807] Card, David; The Role of Measurement in Software Engineering; Proceedings 2nd IEE/BCS Software Engineering Conf., 7/88. (see SEI section)

[CARR8603] Carrio, Miguel A.; The Technology Life Cycle and Ada; Proceedings from 4th National Ada Conference, 3/86.

[CHAL87] Chalam, V.V.; Adaptive Control Systems - Techniques & Applications; Marcel Dekker, Inc., 1987.

[CHEU8706] Cheung & Li; An Empirical Study of Software Metrics; IEEE Transactions, 6/87.


[CHIE8607] Chien, Y-T; Expert Systems in the SDI Environment; Computer, 7/86.

[CHUB89R] Chubb, Bruce; Lessons Learned in the Setup and Management of Productivity Improvement Methods.

[CINL75] Cinlar, Erhan; Introduction to Stochastic Processes; Prentice-Hall, Inc., 1975.

[COHE85] Cohen, Paul R.; Heuristic Reasoning about Uncertainty: An Artificial Intelligence Approach; Pitman Publishing, Inc., 1985.

[CONT86] Conte, Dunsmore, Shen; Software Engineering Metrics & Models; Benjamin/Cummings Publishing Co.; 1986.

[DAVI8810] Davis, Bersoff, Comer; A Strategy for Comparing Alternative Software Development Life Cycle Models; IEEE Transactions, 10/88.

[DAVI8809] Davis & LeBlanc; A Study of the Applicability of Complexity Measures; IEEE Transactions, 9/88.

[DELO8807] DeLoach, Scott A.; An Interface-Based Ada Programming Support Environment; ACM Press-Ada Letters; July/Aug. 1988.

[DEMU8807] Demurjian & Hsiao; Towards a Better Understanding of Data Models Through the Multilingual Database System; IEEE Transactions, 7/88.

[DOD8807A] Department of Defense; Report of the Defense Science Board Task Force on Military Software-9/87; ACM Press-Ada Letters; July/Aug. 1988.

[DOD8807B] Department of Defense; Ada Board Response to Report of Defense Science Board Task Force on Military Software-2/88; ACM Press-Ada Letters; July/Aug. 1988.

[DORN75] Dorny, C. Nelson; A Vector Space Approach to Models & Optimization; John Wiley & Sons; 1975.

[DUNS8805] Dunsmore, H.E.; Evidence Supports some Truisms, Belies Others; Quality Time-IEEE Software, 5/88.

[ELLI88] Ellis & Wygant; A System-Oriented Methodology to Support Software Testability; ITEA Journal, 1988.

[EVAN84] Evangelist, Michael; An Analysis of Control Flow Complexity; COMPSAC Proceedings, 1984.

[EVAN83] Evangelist, Michael; Software Complexity Metric Sensitivity to Program Structuring Rules; Journal of Systems & Software; 1983.


[FARR88] Farr, Smith & Schimmelpfenneg; A PC Tool for Software Reliability Measurement; Proceedings from Institute of Environmental Sciences; 1988.

[FARR8812A] Farr & Smith; Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) User's Guide; NSWC-TR-84-373, 12/88.

[FARR8812B] Farr & Smith; Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) Library Access Guide; NSWC-TR-84-371, 12/88.

[FARR8309] Farr, William H.; A Survey of Software Reliability Modeling and Estimation; NSWC-TR-82-171; 9/83.

[FREE87] Freeman, Peter; Tutorial: Software Reusability; IEEE Computer Society Press, 1987.

[GADO88] Gados, Ronald; Guidelines for Certification of Strategic Defense System Simulation Models; ITEA Journal, 1988.

[GELP8806] Gelperin & Hetzel; The Growth of Software Testing; Communications of the ACM, 6/88.

[GIBS8903] Gibson & Senn; System Structure and Software Maintenance Performance; Communications of the ACM, 3/89.

[GOIC89RA] Goicoechea, Burden, Sanchez; Evaluation & Ranking of Alternate Systems with Ariadne: An Application to the Economy Car Selection Problem. (see GMU Info. section)

[GOIC81] Goicoechea, Hansen & Henrickson; A Hierarchical-Multilevel Approach to the Development & Ranking of Systems in Tanning Industry; Computing & Industrial Engineering, 1981. (see GMU Info. section)

[GOIC89RB] Goicoechea, Ambrose; Metrics and Trade-Off Analysis in Software Design, Production, and Evaluation.

[GOIC89RC] Goicoechea, Ambrose; Software Metrics: Their Measurement and Use in Models. (see GMU Info. section)

[HARR89RA] Harrison, Warren; Applying McCabe's Complexity Measure to Multiple-Exit Programs. (see PSU Info. section)

[HARR8209] Harrison, Magel, Kluczny, DeKock; Applying Software Complexity Metrics to Program Maintenance; Computer Magazine, 9/82. (see PSU Info. section)

[HARR86] Harrison & Cook; Are Deeply Nested Conditionals Less Readable?; Journal of Systems and Software, 1986. (see PSU Info. section)

[HARR89RB] Harrison & Magel; A Complexity Measure Based on Nesting Level. (see PSU Info. section)


[HARR88] Harrison, Warren; MAE: A Syntactic Metric Analysis Environment; Journal of Systems and Software; 1988. (see PSU Info. section)

[HARR87] Harrison & Cook; A Micro/Macro Measure of Software Complexity; Journal of Systems and Software; 1987. (see PSU Info. section)

[HARR8602] Harrison & Cook; A Note on the Berry-Meekings Style Metric; Communications of the ACM, 2/86. (see PSU Info. section)

[HARR89RC] Harrison, Warren; Software Science and Weyuker's Fifth Property. (see PSU Info. section)

[HARR8804] Harrison, Warren; Using Software Metrics to Allocate Testing Resources; Journal of Management Information Systems, Spring '88. (see PSU Info. section)

[HENR8109] Henry & Kafura; Software Structure Metrics Based on Information Flow; IEEE Transactions, 9/81.

[HUMP8703] Humphrey, Watts S.; Software Process Management; 9th International Conference on Software Engineering, 3/87.

[IDA8804A] IDA-Deliverable to SDIO; Strategic Defense Initiative Architecture Dataflow Modeling Technique, Version 1.5; Report # P-2035, 4/88.

[IDA8804B] IDA-Deliverable to SDIO; A Simple Example of an SADMT Architecture Specification, Version 1.5; Report # P-2036, 4/88.

[IDA8812] IDA-Deliverable to SDIO; SDS Software Testing & Evaluation: A Review of the State-of-the-Art in Software Testing and Evaluation; Report # P-2132, 12/88.

[JEFF8707] Jeffery, D. Ross; Time-Sensitive Cost Models in the Commercial MIS Environment; IEEE Transactions, 7/87.

[KAFU8703] Kafura & Reddy; The Use of Software Complexity Metrics in Software Maintenance; IEEE Transactions, 3/87.

[KARI8812] Karimi & Kuo; User Interface Design from a Real Time Perspective; Communications of the ACM, 12/88.

[KAVI8710] Kavi, Buckles and Bhat; Isomorphisms Between Petri Nets and Dataflow Graphs; IEEE Transactions, 10/87.

[LEAC8703] Leach, Ronald J.; Ada Software Metrics and Their Limitations; Proceedings from 5th National Joint Ada Conference, 3/87.

[LEE8710] Lee, Tony; An Information-Theoretic Analysis of Relational Databases--Part I: Data Dependencies and Information Metric; IEEE Transactions, 10/87.


[LEW8811] Lew, Dillon & Forward; Software Complexity and its Impact on Software Reliability; IEEE Transactions, 11/88.

[LICH8601] Lichtman, Zavdi; Generation and Consistency Checking of Design and Program Structures; IEEE Transactions, 1/86.

[MART8407] Martin & Brice; Effect of Hardware-Software Interaction on Performance; Computer, 7/84.

[MCCA7612] McCabe, Thomas; A Complexity Measure; IEEE Transactions, 12/76.

[MCCL7805] McClure, Carma; A Model for Program Complexity Analysis; Proceedings from 3rd International Conference on S/W Engineering; 5/78.

[MCKA89R] McKay, Charles; A Proposed Framework for the Tools & Rules to Support the Life Cycle of the Space Station.

[MEND8205] Mendis, Kenneth S.; Quantifying Software Quality; Computers, 5/82.

[MILL7612] Mills, Harlan D.; Software Development; IEEE Transactions, 12/76.

[MUSA8903] Musa, John D.; Faults, Failures, and a Metrics Revolution; IEEE Software, 3/89.

[MUSA87] Musa, Iannino, Okumoto; Software Reliability: Measurement, Prediction, Application; McGraw-Hill, New York; 1987.

[MUSA8902] Musa, John D.; Tools for Measuring Software Reliability; IEEE Spectrum, 2/89.

[MYER8611] Myers, W.; Can Software for the Strategic Defense Initiative Ever be Error-Free?; Computer, 11/86.

[NELS87] Nelson & Carroll; Tutorial: Fault-Tolerant Computing; IEEE Computer Society Press; 1987.

[NEUG8510] Neugent, Gilligan, Hoffman & Ruthberg; Technology Assessment: Methods for Measuring the Level of Computer Security; NBS Special Publication 500-133, 10/85.

[NEUM8609] Neumann, Peter Gabriel; On Hierarchical Design of Computer Systems for Critical Applications; IEEE Transactions, 9/86.

[NYBE89] Nyberg, Karl; Ada: Sources & Resources 1989 Edition.

[OSTR8806] Ostrand & Balcer; The Category-Partition Method for Specifying & Generating Functional Tests; Communications of the ACM, 6/88.

Ii REF-7


[PERK8703] Perkins & Gorzela; Experience Using an Automated Metrics Framework to Improve the Quality of Ada Software; Proceedings from 5th National Joint Ada Conference, 3/87.

[RAMA8808] Ramamurthy & Melton; A Synthesis of Software Science Measures and the Cyclomatic Number; IEEE Transactions, 8/88.

[RICH8808] Rich & Waters; Automatic Programming: Myths and Prospects; Computer, 8/88.

[ROLA8606] Roland, Jon; Software Metrics; Computer Language, 6/86.

[ROMB8703] Rombach, H. Dieter; A Controlled Experiment on the Impact of Software Structure on Maintainability; IEEE Transactions, 3/87.

[RADC8205] Rome Air Development Center; Software Testing Measures; RADC-TR-82-135, 5/82.

[RADC8308] Rome Air Development Center; A Guidebook for Software Reliability Assessment; RADC-TR-83-176, 8/83.

[RADC8403] Rome Air Development Center; Software Test Handbook - Software Test Guidebook; RADC-TR-84-53, Vol. II; 3/84.

[RADC8502] Rome Air Development Center; Specification of Software Quality Attributes; RADC-TR-85-37, Vol. I, II, III; 2/85.

[RADC878A] Rome Air Development Center; Methodology for Software Reliability Prediction and Assessment; RADC-TR-87-171, Vol. I; 8/87.

[RADC878B] Rome Air Development Center; Software Reliability Prediction and Estimation Guidebook; RADC-TR-87-171, Vol. II; 8/87.

[SCHN8904] Schneidewind, Norman; Distributed System Software Design Paradigm with Application to Computer Networks; IEEE Transactions, 4/89.

[SCHN8703] Schneidewind, Norman; The State of Software Maintenance; IEEE Transactions, 3/87.

[SCHN7704] Schneidewind, Norman; The Use of Simulation in the Evaluation of Software; Computer, 4/77.

[SCHU8805] Schultz, Herman P.; Software Management Metrics; ESD-TR-88-001, 5/88.

[SELB8703] Selby, Richard W.; Incorporating Metrics into a Software Environment; Proceedings from 5th National Joint Ada Conference, 3/87.

[SHAT8808] Shatz, Sol; Towards Complexity Metrics for Ada Tasking; IEEE Transactions, 8/88.



II

[SHUM88] Shumskas, Anthony; Why Higher Reliability Software Should Result from Reduced Test and Increased Evaluation; ITEA Journal, 1988.

[SIEG8807] Siegrist, Kyle; Reliability of Systems with Markov Transfer of Control; IEEE Transactions, 7/88.

[SING87] Singh, Madan; Systems & Control Encyclopedia - Theory, Technology, Applications; Vol. 3, 6, 8; Pergamon Press; 1987.

[SLON8703] Slonaker, Smith, Prizant & Giles; Development of Multi-Tasking Software in Ada - A Case Study; Proceedings from 5th National Joint Ada Conference, 3/87.

[SPAR8808] SPARTA, TBE & TASC Deliverable to SDIO; Task Order 5 BM/C3 Architectures - (Tools) Subtask I Architectural Representation Methodology; Contract No. SDIO84-88-C-0018, 8/15/88.

[SRIV8801] Srivastava & Farr; The Use of Software Reliability Models in the Analysis of Operational Software System; Proceedings from the Institute of Environmental Sciences, 1/13/88.

[STAN85] Stankovic, John A.; Reliable Distributed System Software; IEEE Computer Society Press; 1985.

[STEL8807] Stelovsky & Sugaya; A System for Specification and Rapid Prototyping of Application Command Languages; IEEE Transactions, 7/88.

[TAKA8901] Takahashi & Kamayachi; An Empirical Study of a Model for Program Error Prediction; IEEE Transactions, 1/89.

[TBE8609] Teledyne Brown Engineering Deliverable to ISEC; Generic Tools Requirements for ISEC Ada Transition; MJ86-85-C0244, Contract No. F49642-85-C0244, 9/30/86.

[TBE8703] Teledyne Brown Engineering Deliverable to AMCCOM; Extending McCabe's Cyclomatic Complexity Metric for Analysis of Ada Software; MC87-McCabe II-0003, Contract No. DAAA21-85-D-0010, 3/87.

[TBE8704] Teledyne Brown Engineering Deliverable to AMCCOM; Modified A-Level Software Design Specification for the Ada Complexity Analysis Tool Which Automates the Extended McCabe's Cyclomatic Metric; MC87-McCabe II-0005; Contract No. DAAA21-85-D-0010, 4/87.

[TBE8710] Teledyne Brown Engineering Deliverable to CECOM; Software Methodology Catalog; MC87-COMM/ADP-0036, Contract No. DAAB07-86-D-R001, 10/87.



[TBE8808A] Teledyne Brown Engineering Deliverable to SDIO; BM/C3 Architecture Tools, Tool Set Recommendation - TAGS and SADMT and Software Test Plan for the SADMT-Based IORL Simulation Compiler; Contract No. SDIO84-88-C-0018, 8/19/88.

[TBE8808B] Teledyne Brown Engineering Deliverable to SDIO; Technical Report for the Development of an IORL Software Translator to SADMT; Contract No. SDIO84-88-C-0018, 8/26/88.

[TBE8811] Teledyne Brown Engineering Deliverable to CECOM; Implementation Guidelines for Software Management & Quality Indicators; Contract No. DAAB07-86-D-R001, Draft 9/88, Revised 11/88.

[TENN8809] Tenny, Ted; Program Readability: Procedures Versus Comments; IEEE Transactions, 9/88.

[TITA8809] Titan Systems Deliverable to SDIO; Preliminary Software Test and Evaluation Assessment and Recommendation; 9/30/88.

[TOHM8903] Tohma, Tokunaga, Nagase & Murata; Structural Approach to the Estimation of the Number of Residual Software Faults Based on the Hyper-Geometric Distribution; IEEE Transactions, 3/89.

[TRAC88] Tracz, Will; Tutorial: Software Reuse: Emerging Technology; IEEE Computer Society Press; 1988.

[WEID8602] Weiderman, Nelson; Evaluation of Ada Environments; Report # SEI-86-MR-2, 2/5/86.

[WEIS8810] Weiss & Weyuker; An Extended Domain-Based Model of Software Reliability; IEEE Transactions, 10/88.

[WEYU8809] Weyuker, Elaine; Evaluating Software Complexity Measures; IEEE Transactions, 9/88.

[WEYU8806] Weyuker, Elaine; The Evaluation of Program-Based Software Test Data Adequacy Criteria; Communications of the ACM, 6/88.

[WOLF8903] Wolf, Clarke & Wileden; The AdaPIC Tool Set: Supporting Interface Control and Analysis Throughout the Software Development Process; IEEE Transactions, 3/89.

[WOOD79] Woodward, Hennell & Hedley; A Measure of Control Flow Complexity in Program Text; IEEE Transactions, 1979.

[WU8703] Wu, Basili & Reed; A Structure Coverage Tool for Ada Software Systems; Proceedings from 5th National Joint Ada Conference, 3/87.

[YAU8808] Yau, Nichol, Tsai, Liu; An Integrated Life-Cycle Model for Software Maintenance; IEEE Transactions, 8/88.



[YAU8011] Yau & Collofello; Some Stability Measures for Software Maintenance; IEEE Transactions, 11/80.

[YAU8606] Yau & Tsai; A Survey of Software Design Techniques; IEEE Transactions, 6/86.

[YU8809] Yu, Shen, Dunsmore; An Analysis of Several Software Defect Models; IEEE Transactions, 9/88.



REF.2 CATEGORY INDICES

METRICS PROGRAMS

1. NUSC - Navy Submarine Combat System Applied Software Metrics Program - 3/15/89
2. IBM - Space Shuttle Programs - 7/29/88
3. USAF R&M 2000 Program - 1/1/89
4. NSWC - Statistical Modeling & Estimation of Reliability Functions for Software (SMERFS) - Dahlgren, VA, 12/88

SDS POLICIES AND PROCEDURES

1. Strategic Defense System Software Policy, Draft 7/19/88

TECHNOLOGY FOR ASSESSMENT OF SOFTWARE QUALITY (TASQ)

1. Database Reports
2. AMC S/W Task Force Brief - 12/14/88
3. POC Cross Reference Draft Report to AMCCOM; Contract No. DAAA21-88-D-0012, Technical Report MC89-TASQ-0003, 1/27/89

SOFTWARE ENGINEERING INSTITUTE (SEI) (received from)

1. The Role of Measurement in Software Engineering; David Card; Proceedings 2nd IEE/BCS Software Engineering Conf., 7/88 [CARD8807]

2. Resolving the Software Science Anomaly; Card & Agresti; Journal of Systems & Software, 1987 [CARD87]

3. Measuring Software Design Complexity; Card & Agresti; Journal of Systems & Software, 1988 [CARD88]

4. Managing Software Maintenance Cost & Quality; Card, Cotnoir, Goorevich; Proceedings IEEE Conference on Software Maintenance, 9/87 [CARD8709]

5. An Empirical Study of Software Design Practices; Card, Church, Agresti; IEEE Transactions, 2/86 [CARD8602]

6. Criteria for Software Modularization; Card, Page, McGarry; Proceedings IEEE Conf. on Software Engineering, 8/85 [CARD8508]

INSTITUTE FOR DEFENSE ANALYSES (IDA)

1. SDS Software Testing & Evaluation: A Review of the State-of-the-Art in Software Testing and Evaluation; IDA Deliverable to SDIO, Report # P-2132, 12/88 [IDA8812]

2. Strategic Defense Initiative Architecture Dataflow Modeling Technique; IDA Deliverable to SDIO, Report # P-2035, 4/88 [IDA8804A]

3. A Simple Example of an SADMT Architecture Specification; IDA Deliverable to SDIO, Report # P-2036, 4/88 [IDA8804B]



MIL-STDS - IEEE/ANSI/ACM/IDA/Directives/Guidelines/etc.

1. AFSCP-800-14 S/W Quality Indicators 1/20/87
2. AFSCP-800-43 S/W Management Indicators 1/31/86
3. IEEE Std. for Software Productivity Metrics 9/1/88
4. IEEE P1061 Std. for Software Quality Metrics, Draft 2/23/87 & 11/17/88
5. IEEE P982.1/D3.0 Std, A Standard Dictionary of Measures to Produce Reliable Software, Draft 5/88
6. IEEE P982.2/D6, Draft Guide for the Use of Standard Dictionary of Measures to Produce Reliable Software, 5/88
7. IEEE P1044/D3 Draft Standard of a Standard Classification for Software Errors, Faults & Failures, 12/87
8. Orlando II, Panel V, Management Indicators and Quality Metrics, 1/30/87
9. AR 1000-1 Department of the Army, Basic Policies for Systems Acquisition, 5/1/81
10. Draft DoDD 5000.29, Management of Computer Resources in Major Defense Systems, 1/15/86
11. DoDD 3405.1, Computer Programming Language Policy, 4/2/87
12. DoDD 3405.2, Use of Ada in Weapon Systems, 3/30/87
13. Draft DoDD 3405.xx, Computer Software Policy, 10/29/85
14. MIL-STD-2167A, Defense System Software Development, 2/29/88

GEORGE MASON UNIVERSITY

1. Tutorial paper: A User-Oriented Listing of Multiple Criteria Decision Methods; Despontin, Moscarola, Spronk; Belgian Journal of Statistics

2. Metrics & Trade-Off Analysis in Software Design, Production, & Evaluation; Ambrose Goicoechea, GMU [GOIC89RB]

3. A Hierarchical-Multilevel Approach to the Development & Ranking of Systems in Tanning Industry; Goicoechea, Hansen & Henrickson; Computing & Industrial Engineering, 1981 [GOIC81]

4. Evaluation & Ranking of Alternate Systems with Ariadne: An Application to the Economy Car Selection Problem; Goicoechea, Burden, Sanchez; GMU [GOIC89RA]

5. Software Metrics: Their Measurement and Use in Models; Ambrose Goicoechea, GMU[GOIC89RC]

PORTLAND STATE UNIVERSITY

1. Tool Info. (PC-Metric - Developer's Toolbox)
2. Software Science and Weyuker's Fifth Property; Warren Harrison, PSU [HARR89RC]
3. A Note on the Berry-Meekings Style Metric; Warren Harrison & Curtis Cook; Communications of the ACM, 2/86 [HARR8602]
4. MAE: A Syntactic Metric Analysis Environment; Warren Harrison; Journal of Systems and Software, 1988 [HARR88]
5. Applying McCabe's Complexity Measure to Multiple-Exit Programs; Harrison, PSU [HARR89RA]
6. Using Software Metrics to Allocate Testing Resources; Harrison; Journal of Management Information Systems, Spring 88 [HARR8804]
7. Applying Software Complexity Metrics to Program Maintenance; Harrison, Magel, Kluczny, DeKock; Computer Magazine, 9/82 [HARR8209]



8. A Complexity Measure Based on Nesting Level; Harrison & Magel; Univ. of Mo. [HARR89RB]

9. Are Deeply Nested Conditionals Less Readable?; Harrison & Cook; Journal of Systems & Software, 1986 [HARR86]

10. A Micro/Macro Measure of Software Complexity; Harrison & Cook; Journal of Systems & Software, 1987 [HARR87]

TOOLS, COURSES AND NEWSLETTERS

1. Automated Systems Department (TBE-HSV) S/W Tools
2. TBE Accomplishments & Initiatives in Productivity, Training & Automated IV&V Tools - April 1988
3. IV&V Methodology & Tools Update - Computer Applications Directorate - April 1988
4. Timing & Sizing Simulation Evaluation - DP-Sim & Network II.5 - Terry Morris - April 1988
5. TBE 1988 BMD V&V Tools Matrix
6. Logiscope Descriptions
7. McCabe & Associates Tools
8. AT&T Tools for Measuring Software Reliability
9. Dynamics Research Corporation - AdaMat & Ada Quality Quarterly Newsletter
10. EVB, Software Engineering courses for 1989
11. Ada Information Clearinghouse newsletters

CONFERENCE INFORMATION

1. Proceedings of the Joint Ada Conference - Fifth National Conference on Ada Technology and Washington Ada Symposium; March 16-19, 1987

2. Agenda, NSIA Software Quality and Productivity; March 10-12, 1987
3. Agenda, 9th Annual Conference on Multiple-Criteria Decision Making; August 1990
4. Agenda, Ada Project Management Course by Don Firesmith; Spring 1989



APPENDIX A

PROPOSED IEEE METRICS TERMINOLOGY



PROPOSED IEEE STANDARD QUALITY AND RELIABILITY METRICS TERMINOLOGY

Term/Phrase - Definition

Concept Phase - The period of time in the software life cycle during which system concepts and objectives needed by the user are identified and documented. Precedes the Requirements Phase.

Critical Range - Metric values used to classify software into categories of acceptable, marginal and unacceptable.

Critical Value - Metric value of a validated metric which is used to identify software which has unacceptable quality.

Defect - A product anomaly. Examples include such things as: 1) omissions and imperfections found during early life cycle phases and 2) faults contained in software sufficiently mature for test or operation. See also "fault".

Direct Metric - A metric that represents and defines a software quality factor, and which is valid by definition (e.g., mean time to software failure for the factor reliability).

Error - Human action that results in software containing a fault. Examples include omission or misinterpretation of user requirements in a software specification, incorrect translation or omission of a requirement in the design specification (ANSI/IEEE Std 729-1983).

Factor Sample - A set of factor values which is drawn from the metrics data base and used in metrics validation.

Factor Value - A value (see "metric value") of the direct metric that represents a factor.

Failure - 1) The termination of the ability of a functional unit to perform its required function (ISO; ANSI/IEEE Std 729-1983). 2) An event in which a system or system component does not perform a required function within specified limits. A failure may be produced when a fault is encountered (ANSI/IEEE Std 729-1983).



Fault - 1) An accidental condition that causes a functional unit to fail to perform its required function (ISO, ANSI/IEEE Std 729-1983). 2) A manifestation of an error in software. A fault, if encountered, may cause a failure. Synonymous with bug (ANSI/IEEE Std 729-1983).

Measure - 1) A quantitative assessment of the degree to which a software product or process possesses a given attribute (IEEE P982.1/D3.0). 2) To ascertain or appraise by comparing to a standard; to apply a metric (IEEE P1061).

Measurement - 1) The act or process of measuring. 2) A figure, extent, or amount obtained by measuring.

Metrics Framework - A tool used for organizing, selecting, communicating and evaluating the required quality attributes for a software system; a hierarchical breakdown of factors, subfactors and metrics for a software system.

Metrics Sample - A set of metrics values which is drawn from the metrics data base and used in metrics validation.

Metric Validation - The act or process of ensuring that a metric correctly predicts or assesses a quality factor.

Metric Value - An element from the range of a metric; a metric output.

Predictive Assessment - The process of using a predictor metric(s) to predict the value of another metric.

Predictive Metric - A metric which is used to predict the values of another metric.

Primitive - Data relating to the development or use of software that is used in developing measures or quantitative descriptions of software. Primitives are directly measurable or countable, or may be given a constant value or condition for a specific measure. Examples include error, failure, fault, time, time interval, date, number of noncommentary source code statements, edges, nodes.

Process Step - Any task performed in the development, implementation or maintenance of software (e.g., identify the software components of a system as part of the design).



Process Metric - Metrics used to measure characteristics of the methods, techniques, and tools employed in acquiring, developing, verifying, operating and changing the software system.

Product Metric - Metrics used to measure the characteristics of the documentation and code.

Quality Attribute - A characteristic of software; a generic term applying to factors, sub-factors, or metric values.

Quality Factor - An attribute of software that contributes to its quality.

Quality Requirement - A requirement that a software attribute be present in software to satisfy a contract, standard, specification, or other formally imposed document.

Quality Sub-factor - A decomposition of a quality factor or quality sub-factor.

Sample Software - Software selected from a current or completed project from which data can be obtained for use in preliminary testing of data collection and metric computation procedures.

Software Component - General term used to refer to an element of a software system, such as module, unit, data or document.

Software Quality Metric - A function whose inputs are software data and whose output is a single (numerical) value that can be interpreted as the degree to which software possesses a given attribute that affects its quality.
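The definition above is concrete enough to sketch: a quality metric is simply a function from software data to one number. The following is an illustrative sketch only (comment density is not a metric prescribed by the draft standard), assuming Ada-style "--" comments:

```python
def comment_density(source_lines):
    """Illustrative software quality metric: the fraction of non-blank
    source lines that carry an Ada-style "--" comment. Input is software
    data (source text); output is a single numerical value in [0, 1]."""
    code = [ln for ln in source_lines if ln.strip()]  # drop blank lines
    if not code:
        return 0.0
    commented = sum(1 for ln in code if "--" in ln)
    return commented / len(code)
```

Such a value would then be compared against a critical value or critical range (see the entries above) to classify the component as acceptable, marginal or unacceptable.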

Software Reliability - The probability that software will not cause the failure of a system for a specified time under specified conditions. The probability is a function of the inputs to and use of the system as well as a function of the existence of faults in the software. The inputs to the system determine whether existing faults, if any, are encountered (ANSI/IEEE Std 729-1983).
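As a worked illustration of this definition (not a procedure prescribed by the standard), assume the simplest constant-failure-rate model, with the rate estimated as observed failures per unit of execution time; the probability of no failure over a mission of length t is then exp(-lambda*t):

```python
import math

def reliability(failure_times, t):
    """Probability of no software-caused failure over a mission of length t,
    under an assumed constant failure rate: R(t) = exp(-lambda * t).
    lambda is estimated as observed failures per unit of execution time,
    i.e., the number of failures over the total observation period."""
    rate = len(failure_times) / max(failure_times)
    return math.exp(-rate * t)
```

For example, three failures observed over 50 hours give an estimated rate of 0.06 failures/hour, so the reliability over a 10-hour mission is exp(-0.6), roughly 0.55. The models catalogued in Appendix B (Jelinski/Moranda, Goel/Okumoto, etc.) refine this by letting the rate change as faults are removed.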



Software Reliability Management - The process of optimizing the reliability of software through a program which emphasizes software error prevention, fault detection and removal, and the use of measurements to maximize reliability in light of project constraints such as resources (cost), schedule, and performance.

Validated Metric - A metric whose values have been statistically associated with corresponding quality factor values.
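A minimal sketch of the "statistically associated" requirement, assuming simple linear correlation is the chosen association measure (the draft standard leaves the choice of statistic open): compute Pearson's r between a metrics sample and the corresponding factor sample, then compare |r| against a pre-chosen validation threshold.

```python
import math

def pearson_r(xs, ys):
    """Degree of linear association between a metrics sample (xs) and the
    corresponding factor sample (ys). A value of |r| above a pre-chosen
    threshold is one possible criterion for declaring the metric validated."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```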




< B-1 through B-36 >


Table B-1  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Detect Plumes

                                  SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS          Reli Surv Intg Thru Usab Main Port Reus Effc
Ranking                            H    H    H    H    L    L    L    L    M

Time Between Failure               5    5    5    5    1    1    1    1    3
-----------------------------------------------------------------------------
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)          5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)       5    5    0    0    0    0    0    0    0
Moranda (Geometric De-Eut)         5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)      5    0    0    0    0    0    0    0    0
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Littlewood/Verrall                 5    0    0    0    0    0    0    0    0
Lloyd/Lipow                        5    0    0    0    0    0    0    0    0

Complexity
-----------------------------------------------------------------------------
Halstead                           5    0    0    0    0    1    0    0    0
McCabe                             5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)             5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)       5    0    0    0    0    1    0    0    0
Gaffney                            5    0    0    0    0    1    0    0    0
Benyon-Tinker                      5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)           5    0    0    0    0    1    0    0    0
Chapin's Q                         5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair          5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)          5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)         5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)         5    0    0    0    0    1    0    0    0

Failure Count
-----------------------------------------------------------------------------
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Schneidewind                       5    0    0    0    0    0    0    0    0
Goel                               5    0    0    0    0    0    0    0    0
Musa                               5    0    0    0    0    0    0    0    0
Shooman                            5    0    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Schick/Wolverton                   5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)         5    5    0    0    0    0    0    0    0
Brooks/Motley                      5    5    0    0    0    0    0    0    0
IBM Poisson                        5    5    0    0    0    0    0    0    0

Fault Seeding
-----------------------------------------------------------------------------
Mills                              5    0    0    0    0    0    0    0    0

Input Domain Based
-----------------------------------------------------------------------------
Nelson                             5    0    0    0    0    0    0    0    0
Ho                                 5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani                5    0    0    0    0    0    0    0    0
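Several of the complexity models scored in these tables have simple closed forms. For instance, McCabe's cyclomatic number, which recurs throughout the Complexity class, is V(G) = E - N + 2P over a subprogram's control-flow graph. A minimal sketch follows; in practice the graph counts would come from an analysis tool such as those listed in the Tools index, not be supplied by hand:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic number for a control-flow graph:
    V(G) = E - N + 2P, where E is the number of edges, N the number of
    nodes, and P the number of connected components (normally 1 for a
    single subprogram)."""
    return edges - nodes + 2 * components
```

A straight-line sequence of 4 nodes and 3 edges yields V(G) = 1; adding one if-then-else (4 nodes, 4 edges) yields V(G) = 2, i.e., one extra independent path per decision.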


Table B-2  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Detect Cold Bodies

                                  SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS          Reli Surv Intg Thru Usab Main Port Reus Effc
Ranking                            H    H    H    H    L    L    L    L    H

Time Between Failure               5    5    5    5    1    1    1    1    5
-----------------------------------------------------------------------------
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)          5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)       5    5    0    0    0    0    0    0    0
Moranda (Geometric De-Eut)         5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)      5    0    0    0    0    0    0    0    0
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Littlewood/Verrall                 5    0    0    0    0    0    0    0    0
Lloyd/Lipow                        5    0    0    0    0    0    0    0    0

Complexity
-----------------------------------------------------------------------------
Halstead                           5    0    0    0    0    1    0    0    0
McCabe                             5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)             5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)       5    0    0    0    0    1    0    0    0
Gaffney                            5    0    0    0    0    1    0    0    0
Benyon-Tinker                      5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)           5    0    0    0    0    1    0    0    0
Chapin's Q                         5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair          5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)          5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)         5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)         5    0    0    0    0    1    0    0    0

Failure Count
-----------------------------------------------------------------------------
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Schneidewind                       5    0    0    0    0    0    0    0    0
Goel                               5    0    0    0    0    0    0    0    0
Musa                               5    0    0    0    0    0    0    0    0
Shooman                            5    0    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Schick/Wolverton                   5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)         5    5    0    0    0    0    0    0    0
Brooks/Motley                      5    5    0    0    0    0    0    0    0
IBM Poisson                        5    5    0    0    0    0    0    0    0

Fault Seeding
-----------------------------------------------------------------------------
Mills                              5    0    0    0    0    0    0    0    0

Input Domain Based
-----------------------------------------------------------------------------
Nelson                             5    0    0    0    0    0    0    0    0
Ho                                 5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani                5    0    0    0    0    0    0    0    0


Table B-3  Subfunction Ranked Attributes & Metric Models

Subfunction Title: RF Detect

                                  SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS          Reli Surv Intg Thru Usab Main Port Reus Effc
Ranking                            H    H    H    H    L    M    M    L    M

Time Between Failure               5    5    5    5    1    3    3    1    3
-----------------------------------------------------------------------------
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)          5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)       5    5    0    0    0    0    0    0    0
Moranda (Geometric De-Eut)         5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)      5    0    0    0    0    0    0    0    0
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Littlewood/Verrall                 5    0    0    0    0    0    0    0    0
Lloyd/Lipow                        5    0    0    0    0    0    0    0    0

Complexity
-----------------------------------------------------------------------------
Halstead                           5    0    0    0    0    3    0    0    0
McCabe                             5    0    0    0    0    3    0    0    0
Woodward (Knot Counts)             5    0    0    0    0    3    0    0    0
Chen (Nested Decision Stmts)       5    0    0    0    0    3    0    0    0
Gaffney                            5    0    0    0    0    3    0    0    0
Benyon-Tinker                      5    0    0    0    0    3    0    0    0
Gilb's (Binary Decision)           5    0    0    0    0    3    0    0    0
Chapin's Q                         5    0    0    0    0    3    0    0    0
Segment-Global Usage Pair          5    0    0    0    0    3    0    0    0
Myer's (McCabe extension)          5    0    0    0    0    3    0    0    0
Hansen's (McCabe/Halstead)         5    0    0    0    0    3    0    0    0
Oviedo's (Data/Ctrl Flows)         5    0    0    0    0    3    0    0    0

Failure Count
-----------------------------------------------------------------------------
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Schneidewind                       5    0    0    0    0    0    0    0    0
Goel                               5    0    0    0    0    0    0    0    0
Musa                               5    0    0    0    0    0    0    0    0
Shooman                            5    0    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Schick/Wolverton                   5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)         5    5    0    0    0    0    0    0    0
Brooks/Motley                      5    5    0    0    0    0    0    0    0
IBM Poisson                        5    5    0    0    0    0    0    0    0

Fault Seeding
-----------------------------------------------------------------------------
Mills                              5    0    0    0    0    0    0    0    0

Input Domain Based
-----------------------------------------------------------------------------
Nelson                             5    0    0    0    0    0    0    0    0
Ho                                 5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani                5    0    0    0    0    0    0    0    0


Table B-4  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Resolve Objects

                                  SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS          Reli Surv Intg Thru Usab Main Port Reus Effc
Ranking                            H    H    H    M    L    L    L    L    M

Time Between Failure               5    5    5    3    1    1    1    1    3
-----------------------------------------------------------------------------
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)          5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)       5    5    0    0    0    0    0    0    0
Moranda (Geometric De-Eut)         5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)      5    0    0    0    0    0    0    0    0
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Littlewood/Verrall                 5    0    0    0    0    0    0    0    0
Lloyd/Lipow                        5    0    0    0    0    0    0    0    0

Complexity
-----------------------------------------------------------------------------
Halstead                           5    0    0    0    0    1    0    0    0
McCabe                             5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)             5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)       5    0    0    0    0    1    0    0    0
Gaffney                            5    0    0    0    0    1    0    0    0
Benyon-Tinker                      5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)           5    0    0    0    0    1    0    0    0
Chapin's Q                         5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair          5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)          5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)         5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)         5    0    0    0    0    1    0    0    0

Failure Count
-----------------------------------------------------------------------------
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Schneidewind                       5    0    0    0    0    0    0    0    0
Goel                               5    0    0    0    0    0    0    0    0
Musa                               5    0    0    0    0    0    0    0    0
Shooman                            5    0    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Schick/Wolverton                   5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)         5    5    0    0    0    0    0    0    0
Brooks/Motley                      5    5    0    0    0    0    0    0    0
IBM Poisson                        5    5    0    0    0    0    0    0    0

Fault Seeding
-----------------------------------------------------------------------------
Mills                              5    0    0    0    0    0    0    0    0

Input Domain Based
-----------------------------------------------------------------------------
Nelson                             5    0    0    0    0    0    0    0    0
Ho                                 5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani                5    0    0    0    0    0    0    0    0


Table B-5  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Discriminate

                                  SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS          Reli Surv Intg Thru Usab Main Port Reus Effc
Ranking                            H    H    H    M    L    L    L    L    M

Time Between Failure               5    5    5    3    1    1    1    1    3
-----------------------------------------------------------------------------
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)          5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)       5    5    0    0    0    0    0    0    0
Moranda (Geometric De-Eut)         5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)      5    0    0    0    0    0    0    0    0
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Littlewood/Verrall                 5    0    0    0    0    0    0    0    0
Lloyd/Lipow                        5    0    0    0    0    0    0    0    0

Complexity
-----------------------------------------------------------------------------
Halstead                           5    0    0    0    0    1    0    0    0
McCabe                             5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)             5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)       5    0    0    0    0    1    0    0    0
Gaffney                            5    0    0    0    0    1    0    0    0
Benyon-Tinker                      5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)           5    0    0    0    0    1    0    0    0
Chapin's Q                         5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair          5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)          5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)         5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)         5    0    0    0    0    1    0    0    0

Failure Count
-----------------------------------------------------------------------------
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Schneidewind                       5    0    0    0    0    0    0    0    0
Goel                               5    0    0    0    0    0    0    0    0
Musa                               5    0    0    0    0    0    0    0    0
Shooman                            5    0    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Schick/Wolverton                   5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)         5    5    0    0    0    0    0    0    0
Brooks/Motley                      5    5    0    0    0    0    0    0    0
IBM Poisson                        5    5    0    0    0    0    0    0    0

Fault Seeding
-----------------------------------------------------------------------------
Mills                              5    0    0    0    0    0    0    0    0

Input Domain Based
-----------------------------------------------------------------------------
Nelson                             5    0    0    0    0    0    0    0    0
Ho                                 5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani                5    0    0    0    0    0    0    0    0


Table B-6  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Assess Kills

                                  SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS          Reli Surv Intg Thru Usab Main Port Reus Effc
Ranking                            H    H    H    M    M    M    M    L    M

Time Between Failure               5    5    5    3    3    3    3    1    3
-----------------------------------------------------------------------------
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)          5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)       5    5    0    0    0    0    0    0    0
Moranda (Geometric De-Eut)         5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)      5    0    0    0    0    0    0    0    0
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Littlewood/Verrall                 5    0    0    0    0    0    0    0    0
Lloyd/Lipow                        5    0    0    0    0    0    0    0    0

Complexity
-----------------------------------------------------------------------------
Halstead                           5    0    0    0    0    3    0    0    0
McCabe                             5    0    0    0    0    3    0    0    0
Woodward (Knot Counts)             5    0    0    0    0    3    0    0    0
Chen (Nested Decision Stmts)       5    0    0    0    0    3    0    0    0
Gaffney                            5    0    0    0    0    3    0    0    0
Benyon-Tinker                      5    0    0    0    0    3    0    0    0
Gilb's (Binary Decision)           5    0    0    0    0    3    0    0    0
Chapin's Q                         5    0    0    0    0    3    0    0    0
Segment-Global Usage Pair          5    0    0    0    0    3    0    0    0
Myer's (McCabe extension)          5    0    0    0    0    3    0    0    0
Hansen's (McCabe/Halstead)         5    0    0    0    0    3    0    0    0
Oviedo's (Data/Ctrl Flows)         5    0    0    0    0    3    0    0    0

Failure Count
-----------------------------------------------------------------------------
Goel/Okumoto                       5    0    0    0    0    0    0    0    0
Schneidewind                       5    0    0    0    0    0    0    0    0
Goel                               5    0    0    0    0    0    0    0    0
Musa                               5    0    0    0    0    0    0    0    0
Shooman                            5    0    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Jelinski/Moranda                   5    5    0    0    0    0    0    0    0
Moranda                            5    5    0    0    0    0    0    0    0
Schick/Wolverton                   5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)         5    5    0    0    0    0    0    0    0
Brooks/Motley                      5    5    0    0    0    0    0    0    0
IBM Poisson                        5    5    0    0    0    0    0    0    0

Fault Seeding
-----------------------------------------------------------------------------
Mills                              5    0    0    0    0    0    0    0    0

Input Domain Based
-----------------------------------------------------------------------------
Nelson                             5    0    0    0    0    0    0    0    0
Ho                                 5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani                5    0    0    0    0    0    0    0    0


Table B-7 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Correlate

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0

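The tables in this appendix all follow one pattern: each subfunction attribute (reliability, survivability, integrity, and so on) is ranked High, Medium, or Low, the rank maps to a numeric score of 5, 3, or 1, and a metric model receives the attribute's score for each attribute its metric class addresses and 0 otherwise. A minimal sketch of that scheme, inferred from the tables rather than stated in the report (the `MODEL_COVERAGE` sets below are illustrative examples read off Table B-7, not an exhaustive catalog):

```python
# Apparent ranking-to-score scheme used by the Appendix B tables
# (an inference from the printed tables, not a stated procedure).
RANK_SCORE = {"H": 5, "M": 3, "L": 1}

ATTRIBUTES = ["Reli", "Surv", "Intg", "Thru", "Usab", "Main", "Port", "Reus", "Effc"]

# Hypothetical coverage sets, read off the score patterns in Table B-7:
# time-between-failure models score on reliability and survivability,
# complexity models on reliability and maintainability, etc.
MODEL_COVERAGE = {
    "Jelinski/Moranda": {"Reli", "Surv"},
    "Halstead": {"Reli", "Main"},
    "Mills": {"Reli"},
}

def model_scores(rankings: dict, covered: set) -> list:
    """Score row for one model: attribute score where covered, else 0."""
    return [RANK_SCORE[rankings[a]] if a in covered else 0 for a in ATTRIBUTES]

# Rankings for the "Correlate" subfunction (Table B-7): H H H H L L L L M.
correlate = dict(zip(ATTRIBUTES, ["H", "H", "H", "H", "L", "L", "L", "L", "M"]))

print(model_scores(correlate, MODEL_COVERAGE["Halstead"]))
# [5, 0, 0, 0, 0, 1, 0, 0, 0] -- matches the Halstead row of Table B-7
```

Under this reading, the only quantities that change from table to table are the attribute rankings; the model rows follow mechanically.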

Table B-8 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Initiate Track

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    1    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-9 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Estimate State

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    M    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    3    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-10 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Predict Intercept and Impact Points

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    M    L    L    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    3    1    1    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-11 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Interplatform Data Communications

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    L    M    H
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    1    3    5

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-12 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Ground - Space Communications

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    M    L    M    M    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    3    1    3    3    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    3    0    0    0
McCabe                          5    0    0    0    0    3    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    3    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    3    0    0    0
Gaffney                         5    0    0    0    0    3    0    0    0
Benyon-Tinker                   5    0    0    0    0    3    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    3    0    0    0
Chapin's Q                      5    0    0    0    0    3    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    3    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    3    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    3    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    3    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-13 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Ground Communications

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 M    M    M    M    M    H    M    H    L
----------------------------------------------------------------------------
Time Between Failure            3    3    3    3    3    5    3    5    1

Jelinski/Moranda                3    3    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       3    3    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    3    3    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      3    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   3    0    0    0    0    0    0    0    0
Goel/Okumoto                    3    0    0    0    0    0    0    0    0
Littlewood/Verrall              3    0    0    0    0    0    0    0    0
Lloyd/Lipow                     3    0    0    0    0    0    0    0    0

Complexity

Halstead                        3    0    0    0    0    5    0    0    0
McCabe                          3    0    0    0    0    5    0    0    0
Woodward (Knot Counts)          3    0    0    0    0    5    0    0    0
Chen (Nested Decision Stmts)    3    0    0    0    0    5    0    0    0
Gaffney                         3    0    0    0    0    5    0    0    0
Benyon-Tinker                   3    0    0    0    0    5    0    0    0
Gilb's (Binary Decision)        3    0    0    0    0    5    0    0    0
Chapin's Q                      3    0    0    0    0    5    0    0    0
Segment-Global Usage Pair       3    0    0    0    0    5    0    0    0
Myer's (McCabe extension)       3    0    0    0    0    5    0    0    0
Hansen's (McCabe/Halstead)      3    0    0    0    0    5    0    0    0
Oviedo's (Data/Ctrl Flows)      3    0    0    0    0    5    0    0    0

Failure Count

Goel/Okumoto                    3    0    0    0    0    0    0    0    0
Schneidewind                    3    0    0    0    0    0    0    0    0
Goel                            3    0    0    0    0    0    0    0    0
Musa                            3    0    0    0    0    0    0    0    0
Shooman                         3    0    0    0    0    0    0    0    0
Moranda                         3    3    0    0    0    0    0    0    0
Jelinski/Moranda                3    3    0    0    0    0    0    0    0
Moranda                         3    3    0    0    0    0    0    0    0
Schick/Wolverton                3    3    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      3    3    0    0    0    0    0    0    0
Brooks/Motley                   3    3    0    0    0    0    0    0    0
IBM Poisson                     3    3    0    0    0    0    0    0    0

Fault Seeding

Mills                           3    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          3    0    0    0    0    0    0    0    0
Ho                              3    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             3    0    0    0    0    0    0    0    0


Table B-16 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Assign and Control SBI Weapons

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-17 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Assign and Control GBI Weapons

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    M    L    H    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    3    1    5    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    5    0    0    0
McCabe                          5    0    0    0    0    5    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    5    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    5    0    0    0
Gaffney                         5    0    0    0    0    5    0    0    0
Benyon-Tinker                   5    0    0    0    0    5    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    5    0    0    0
Chapin's Q                      5    0    0    0    0    5    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    5    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    5    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    5    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    5    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-18 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Guide and Control SBI Weapons

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    L    L    H
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    1    1    5

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-19 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Guide and Control GBI Weapons

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    M    L    H    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    3    1    5    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    5    0    0    0
McCabe                          5    0    0    0    0    5    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    5    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    5    0    0    0
Gaffney                         5    0    0    0    0    5    0    0    0
Benyon-Tinker                   5    0    0    0    0    5    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    5    0    0    0
Chapin's Q                      5    0    0    0    0    5    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    5    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    5    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    5    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    5    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-20 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Command Environment Control

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    M    H    M    M    H    M    L    M
----------------------------------------------------------------------------
Time Between Failure            5    3    5    3    3    5    3    1    3

Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    3    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    3    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    5    0    0    0
McCabe                          5    0    0    0    0    5    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    5    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    5    0    0    0
Gaffney                         5    0    0    0    0    5    0    0    0
Benyon-Tinker                   5    0    0    0    0    5    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    5    0    0    0
Chapin's Q                      5    0    0    0    0    5    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    5    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    5    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    5    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    5    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Schick/Wolverton                5    3    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    3    0    0    0    0    0    0    0
Brooks/Motley                   5    3    0    0    0    0    0    0    0
IBM Poisson                     5    3    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-21 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Control Onboard Environment

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-22 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Command Attitude and Position Control

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    M    H    M    M    H    M    L    M
----------------------------------------------------------------------------
Time Between Failure            5    3    5    3    3    5    3    1    3

Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    3    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    3    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    5    0    0    0
McCabe                          5    0    0    0    0    5    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    5    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    5    0    0    0
Gaffney                         5    0    0    0    0    5    0    0    0
Benyon-Tinker                   5    0    0    0    0    5    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    5    0    0    0
Chapin's Q                      5    0    0    0    0    5    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    5    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    5    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    5    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    5    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Schick/Wolverton                5    3    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    3    0    0    0    0    0    0    0
Brooks/Motley                   5    3    0    0    0    0    0    0    0
IBM Poisson                     5    3    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-23 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Control Onboard Attitude and Position

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    H    L    L    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    5    1    1    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-24 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Sense Onboard Status

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    H    H    M    L    L    L    L    M
----------------------------------------------------------------------------
Time Between Failure            5    5    5    3    1    1    1    1    3

Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    5    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    5    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    1    0    0    0
McCabe                          5    0    0    0    0    1    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    1    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    1    0    0    0
Gaffney                         5    0    0    0    0    1    0    0    0
Benyon-Tinker                   5    0    0    0    0    1    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    1    0    0    0
Chapin's Q                      5    0    0    0    0    1    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    1    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    1    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    1    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    1    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Jelinski/Moranda                5    5    0    0    0    0    0    0    0
Moranda                         5    5    0    0    0    0    0    0    0
Schick/Wolverton                5    5    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    5    0    0    0    0    0    0    0
Brooks/Motley                   5    5    0    0    0    0    0    0    0
IBM Poisson                     5    5    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-25 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Assess Status

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    M    H    M    M    H    M    L    M
----------------------------------------------------------------------------
Time Between Failure            5    3    5    3    3    5    3    1    3

Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    3    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    3    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    5    0    0    0
McCabe                          5    0    0    0    0    5    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    5    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    5    0    0    0
Gaffney                         5    0    0    0    0    5    0    0    0
Benyon-Tinker                   5    0    0    0    0    5    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    5    0    0    0
Chapin's Q                      5    0    0    0    0    5    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    5    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    5    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    5    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    5    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Schick/Wolverton                5    3    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    3    0    0    0    0    0    0    0
Brooks/Motley                   5    3    0    0    0    0    0    0    0
IBM Poisson                     5    3    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


Table B-26 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Command Reconfiguration

                                SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
METRIC CLASSES & MODELS         Reli Surv Intg Thru Usab Main Port Reus Effc
                                 H    M    H    M    M    H    M    L    M
----------------------------------------------------------------------------
Time Between Failure            5    3    5    3    3    5    3    1    3

Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Schick/Wolverton (Linear)       5    3    0    0    0    0    0    0    0
Schick/Wolverton (Parabolic)    5    3    0    0    0    0    0    0    0
Moranda (Geometric De-eut)      5    0    0    0    0    0    0    0    0
Moranda (Hybrid Geomet Poiss)   5    0    0    0    0    0    0    0    0
Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Littlewood/Verrall              5    0    0    0    0    0    0    0    0
Lloyd/Lipow                     5    0    0    0    0    0    0    0    0

Complexity

Halstead                        5    0    0    0    0    5    0    0    0
McCabe                          5    0    0    0    0    5    0    0    0
Woodward (Knot Counts)          5    0    0    0    0    5    0    0    0
Chen (Nested Decision Stmts)    5    0    0    0    0    5    0    0    0
Gaffney                         5    0    0    0    0    5    0    0    0
Benyon-Tinker                   5    0    0    0    0    5    0    0    0
Gilb's (Binary Decision)        5    0    0    0    0    5    0    0    0
Chapin's Q                      5    0    0    0    0    5    0    0    0
Segment-Global Usage Pair       5    0    0    0    0    5    0    0    0
Myer's (McCabe extension)       5    0    0    0    0    5    0    0    0
Hansen's (McCabe/Halstead)      5    0    0    0    0    5    0    0    0
Oviedo's (Data/Ctrl Flows)      5    0    0    0    0    5    0    0    0

Failure Count

Goel/Okumoto                    5    0    0    0    0    0    0    0    0
Schneidewind                    5    0    0    0    0    0    0    0    0
Goel                            5    0    0    0    0    0    0    0    0
Musa                            5    0    0    0    0    0    0    0    0
Shooman                         5    0    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Jelinski/Moranda                5    3    0    0    0    0    0    0    0
Moranda                         5    3    0    0    0    0    0    0    0
Schick/Wolverton                5    3    0    0    0    0    0    0    0
Goel/Okumoto (Gen Poisson)      5    3    0    0    0    0    0    0    0
Brooks/Motley                   5    3    0    0    0    0    0    0    0
IBM Poisson                     5    3    0    0    0    0    0    0    0

Fault Seeding

Mills                           5    0    0    0    0    0    0    0    0

Input Domain Based

Nelson                          5    0    0    0    0    0    0    0    0
Ho                              5    0    0    0    0    0    0    0    0
Ramamoorthy/Bastani             5    0    0    0    0    0    0    0    0


ITable B-27 Subfunction Ranked Attributes & Metric Models

I Subfunction Title: Reconfiguration

3 SUBFUNCTION ATTRIBUTES, RANKINGS & SCORESMETRIC CLASSES & MODELS

Reli Surv Intg Thru Usab Main Port Reus Effc

I H H H H L L L L MTime Between Failure 5 5 5 5 1 1 1 1 3

Jelinski/Moranda 5 5 0 0 0 0 0 0 0Schick/Wolverton (Linear) 5 5 0 0 0 0 0 0 0Schick/Wolverton (Parabolic) 5 5 0 0 0 0 0 0 0Moranda (Geometric De-eut) 5 0 0 0 0 0 0 0 0Moranda (Hybrid Geomet Poiss) 5 0 0 0 0 0 0 0 0Goel/Okumoto 5 0 0 0 0 0 0 0 0Littlewood/Verrall 5 0 0 0 0 0 0 0 03 Lloyd/Lipow 5 0 0 0 0 0 0 0 0

Complexity

Halstead 5 0 0 0 0 1 0 0 0

Woodward (Knot Counts) 5 0 0 0 0 1 0 0 0Chen (Nested Decision Stmts) 5 0 0 0 0 1 0 0 0Gaffney 5 0 0 0 0 1 0 0 0Benyon-Tinker 5 0 0 0 0 1 0 0 0Gilb's (Binary Decision) 5 0 0 0 0 1 0 0 0Chapin's Q 5 0 0 0 0 1 0 0 0Segment-Global Usage Pair 5 0 0 0 0 1 0 0 0Myer's (McCabe extension) 5 0 0 0 0 1 0 0 0Hansen's (McCabe/Halstead) 5 0 0 0 0 1 0 0 0Oviedo's (Data/Ctrl Flows) 5 0 0 0 0 1 0 0 0

Failure Count

Goel/Okumoto  5 0 0 0 0 0 0 0 0
Schneidewind  5 0 0 0 0 0 0 0 0
Goel  5 0 0 0 0 0 0 0 0
Musa  5 0 0 0 0 0 0 0 0
Shooman  5 0 0 0 0 0 0 0 0
Moranda  5 5 0 0 0 0 0 0 0
Jelinski/Moranda  5 5 0 0 0 0 0 0 0
Moranda  5 5 0 0 0 0 0 0 0
Schick/Wolverton  5 5 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  5 5 0 0 0 0 0 0 0
Brooks/Motley  5 5 0 0 0 0 0 0 0
IBM Poisson  5 5 0 0 0 0 0 0 0

Fault Seeding

Mills  5 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  5 0 0 0 0 0 0 0 0
Ho  5 0 0 0 0 0 0 0 0

Ramamoorthy/Bastani  5 0 0 0 0 0 0 0 0
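The scoring convention running through Tables B-27 to B-36 is uniform: an attribute ranked High contributes a score of 5, Medium 3, and Low 1, and each metric model inherits the attribute's score only in the columns it addresses (0 elsewhere). A minimal sketch of that convention, with illustrative names that are not part of the report:

```python
# Sketch of the ranking-to-score convention in Tables B-27 through B-36.
# H = 5, M = 3, L = 1; a model receives an attribute's score only where
# the model applies to that attribute, and 0 otherwise.
ATTRIBUTES = ["Reli", "Surv", "Intg", "Thru", "Usab", "Main", "Port", "Reus", "Effc"]
RANK_SCORE = {"H": 5, "M": 3, "L": 1}

def model_row(rankings, covered):
    """Build one table row: the rank score where the model covers the
    attribute, zero in every other column."""
    return [RANK_SCORE[r] if a in covered else 0
            for a, r in zip(ATTRIBUTES, rankings)]

# Table B-27 (Reconfiguration): rankings H H H H L L L L M.
rankings = ["H", "H", "H", "H", "L", "L", "L", "L", "M"]

# Jelinski/Moranda is scored under Reliability and Survivability:
print(model_row(rankings, {"Reli", "Surv"}))  # [5, 5, 0, 0, 0, 0, 0, 0, 0]
# Halstead (Complexity class) under Reliability and Maintainability:
print(model_row(rankings, {"Reli", "Main"}))  # [5, 0, 0, 0, 0, 1, 0, 0, 0]
```

The printed rows reproduce the Jelinski/Moranda and Halstead entries of Table B-27 above.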



Table B-28  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Development Tools

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES

Reli Surv Intg Thru Usab Main Port Reus Effc

M M M L M M M M M
Time Between Failure  3 3 3 1 3 3 3 3 3

Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  3 3 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  3 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  3 0 0 0 0 0 0 0 0
Goel/Okumoto  3 0 0 0 0 0 0 0 0
Littlewood/Verrall  3 0 0 0 0 0 0 0 0
Lloyd/Lipow  3 0 0 0 0 0 0 0 0

Complexity

Halstead  3 0 0 0 0 3 0 0 0
McCabe  3 0 0 0 0 3 0 0 0
Woodward (Knot Counts)  3 0 0 0 0 3 0 0 0
Chen (Nested Decision Stmts)  3 0 0 0 0 3 0 0 0
Gaffney  3 0 0 0 0 3 0 0 0
Benyon-Tinker  3 0 0 0 0 3 0 0 0
Gilb's (Binary Decision)  3 0 0 0 0 3 0 0 0
Chapin's Q  3 0 0 0 0 3 0 0 0
Segment-Global Usage Pair  3 0 0 0 0 3 0 0 0
Myer's (McCabe extension)  3 0 0 0 0 3 0 0 0
Hansen's (McCabe/Halstead)  3 0 0 0 0 3 0 0 0
Oviedo's (Data/Ctrl Flows)  3 0 0 0 0 3 0 0 0

Failure Count

Goel/Okumoto  3 0 0 0 0 0 0 0 0
Schneidewind  3 0 0 0 0 0 0 0 0
Goel  3 0 0 0 0 0 0 0 0
Musa  3 0 0 0 0 0 0 0 0
Shooman  3 0 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton  3 3 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  3 3 0 0 0 0 0 0 0
Brooks/Motley  3 3 0 0 0 0 0 0 0
IBM Poisson  3 3 0 0 0 0 0 0 0

Fault Seeding

Mills  3 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  3 0 0 0 0 0 0 0 0
Ho  3 0 0 0 0 0 0 0 0
Ramamoorthy/Bastani  3 0 0 0 0 0 0 0 0




Table B-29 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Hardware-in-the-Loop (HWIL) Simulation

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES

Reli Surv Intg Thru Usab Main Port Reus Effc

M H H M M M L L M
Time Between Failure  3 5 5 3 3 3 1 1 3

Jelinski/Moranda  3 5 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  3 5 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  3 5 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  3 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  3 0 0 0 0 0 0 0 0
Goel/Okumoto  3 0 0 0 0 0 0 0 0
Littlewood/Verrall  3 0 0 0 0 0 0 0 0
Lloyd/Lipow  3 0 0 0 0 0 0 0 0

Complexity

Halstead  3 0 0 0 0 3 0 0 0
McCabe  3 0 0 0 0 3 0 0 0
Woodward (Knot Counts)  3 0 0 0 0 3 0 0 0
Chen (Nested Decision Stmts)  3 0 0 0 0 3 0 0 0
Gaffney  3 0 0 0 0 3 0 0 0
Benyon-Tinker  3 0 0 0 0 3 0 0 0
Gilb's (Binary Decision)  3 0 0 0 0 3 0 0 0
Chapin's Q  3 0 0 0 0 3 0 0 0
Segment-Global Usage Pair  3 0 0 0 0 3 0 0 0
Myer's (McCabe extension)  3 0 0 0 0 3 0 0 0
Hansen's (McCabe/Halstead)  3 0 0 0 0 3 0 0 0
Oviedo's (Data/Ctrl Flows)  3 0 0 0 0 3 0 0 0

Failure Count

Goel/Okumoto  3 0 0 0 0 0 0 0 0
Schneidewind  3 0 0 0 0 0 0 0 0
Goel  3 0 0 0 0 0 0 0 0
Musa  3 0 0 0 0 0 0 0 0
Shooman  3 0 0 0 0 0 0 0 0
Moranda  3 5 0 0 0 0 0 0 0
Jelinski/Moranda  3 5 0 0 0 0 0 0 0
Moranda  3 5 0 0 0 0 0 0 0
Schick/Wolverton  3 5 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  3 5 0 0 0 0 0 0 0
Brooks/Motley  3 5 0 0 0 0 0 0 0
IBM Poisson  3 5 0 0 0 0 0 0 0

Fault Seeding

Mills  3 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  3 0 0 0 0 0 0 0 0
Ho  3 0 0 0 0 0 0 0 0
Ramamoorthy/Bastani  3 0 0 0 0 0 0 0 0



Table B-30  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Demonstration Simulation

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES

Reli Surv Intg Thru Usab Main Port Reus Effc

M H M M M M M L M
Time Between Failure  3 5 3 3 3 3 3 1 3

Jelinski/Moranda  3 5 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  3 5 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  3 5 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  3 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  3 0 0 0 0 0 0 0 0
Goel/Okumoto  3 0 0 0 0 0 0 0 0
Littlewood/Verrall  3 0 0 0 0 0 0 0 0
Lloyd/Lipow  3 0 0 0 0 0 0 0 0

Complexity

Halstead  3 0 0 0 0 3 0 0 0
McCabe  3 0 0 0 0 3 0 0 0
Woodward (Knot Counts)  3 0 0 0 0 3 0 0 0
Chen (Nested Decision Stmts)  3 0 0 0 0 3 0 0 0
Gaffney  3 0 0 0 0 3 0 0 0
Benyon-Tinker  3 0 0 0 0 3 0 0 0
Gilb's (Binary Decision)  3 0 0 0 0 3 0 0 0
Chapin's Q  3 0 0 0 0 3 0 0 0
Segment-Global Usage Pair  3 0 0 0 0 3 0 0 0
Myer's (McCabe extension)  3 0 0 0 0 3 0 0 0
Hansen's (McCabe/Halstead)  3 0 0 0 0 3 0 0 0
Oviedo's (Data/Ctrl Flows)  3 0 0 0 0 3 0 0 0

Failure Count

Goel/Okumoto  3 0 0 0 0 0 0 0 0
Schneidewind  3 0 0 0 0 0 0 0 0
Goel  3 0 0 0 0 0 0 0 0
Musa  3 0 0 0 0 0 0 0 0
Shooman  3 0 0 0 0 0 0 0 0
Moranda  3 5 0 0 0 0 0 0 0
Jelinski/Moranda  3 5 0 0 0 0 0 0 0
Moranda  3 5 0 0 0 0 0 0 0
Schick/Wolverton  3 5 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  3 5 0 0 0 0 0 0 0
Brooks/Motley  3 5 0 0 0 0 0 0 0
IBM Poisson  3 5 0 0 0 0 0 0 0

Fault Seeding

Mills 3 0 0 0 0 0 0 0 0

Input Domain Based

Nelson 3 0 0 0 0 0 0 0 0

Ho  3 0 0 0 0 0 0 0 0
Ramamoorthy/Bastani  3 0 0 0 0 0 0 0 0



Table B-32  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Provide Development Environment

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
Reli Surv Intg Thru Usab Main Port Reus Effc

H M H M H M L L M

Time Between Failure 5 3 5 3 5 3 1 1 3

Jelinski/Moranda  5 3 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  5 3 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  5 3 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  5 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  5 0 0 0 0 0 0 0 0
Goel/Okumoto  5 0 0 0 0 0 0 0 0
Littlewood/Verrall  5 0 0 0 0 0 0 0 0
Lloyd/Lipow  5 0 0 0 0 0 0 0 0

Complexity

Halstead  5 0 0 0 0 3 0 0 0
McCabe  5 0 0 0 0 3 0 0 0
Woodward (Knot Counts)  5 0 0 0 0 3 0 0 0
Chen (Nested Decision Stmts)  5 0 0 0 0 3 0 0 0
Gaffney  5 0 0 0 0 3 0 0 0
Benyon-Tinker  5 0 0 0 0 3 0 0 0
Gilb's (Binary Decision)  5 0 0 0 0 3 0 0 0
Chapin's Q  5 0 0 0 0 3 0 0 0
Segment-Global Usage Pair  5 0 0 0 0 3 0 0 0
Myer's (McCabe extension)  5 0 0 0 0 3 0 0 0
Hansen's (McCabe/Halstead)  5 0 0 0 0 3 0 0 0
Oviedo's (Data/Ctrl Flows)  5 0 0 0 0 3 0 0 0

Failure Count

Goel/Okumoto  5 0 0 0 0 0 0 0 0
Schneidewind  5 0 0 0 0 0 0 0 0
Goel  5 0 0 0 0 0 0 0 0
Musa  5 0 0 0 0 0 0 0 0
Shooman  5 0 0 0 0 0 0 0 0
Moranda  5 3 0 0 0 0 0 0 0
Jelinski/Moranda  5 3 0 0 0 0 0 0 0
Moranda  5 3 0 0 0 0 0 0 0
Schick/Wolverton  5 3 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  5 3 0 0 0 0 0 0 0
Brooks/Motley  5 3 0 0 0 0 0 0 0
IBM Poisson  5 3 0 0 0 0 0 0 0

Fault Seeding

Mills 5 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  5 0 0 0 0 0 0 0 0

Ho 5 0 0 0 0 0 0 0 0

Ramamoorthy/Bastani 5 0 0 0 0 0 0 0 0



Table B-33  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Support Factory Test

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES

Reli Surv Intg Thru Usab Main Port Reus Effc
M M M L H H H H L

Time Between Failure 3 3 3 1 5 5 5 5 1

Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  3 3 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  3 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  3 0 0 0 0 0 0 0 0
Goel/Okumoto  3 0 0 0 0 0 0 0 0
Littlewood/Verrall  3 0 0 0 0 0 0 0 0
Lloyd/Lipow  3 0 0 0 0 0 0 0 0

Complexity

Halstead  3 0 0 0 0 5 0 0 0
McCabe  3 0 0 0 0 5 0 0 0
Woodward (Knot Counts)  3 0 0 0 0 5 0 0 0
Chen (Nested Decision Stmts)  3 0 0 0 0 5 0 0 0
Gaffney  3 0 0 0 0 5 0 0 0
Benyon-Tinker  3 0 0 0 0 5 0 0 0
Gilb's (Binary Decision)  3 0 0 0 0 5 0 0 0
Chapin's Q  3 0 0 0 0 5 0 0 0
Segment-Global Usage Pair  3 0 0 0 0 5 0 0 0
Myer's (McCabe extension)  3 0 0 0 0 5 0 0 0
Hansen's (McCabe/Halstead)  3 0 0 0 0 5 0 0 0
Oviedo's (Data/Ctrl Flows)  3 0 0 0 0 5 0 0 0

Failure Count

Goel/Okumoto  3 0 0 0 0 0 0 0 0
Schneidewind  3 0 0 0 0 0 0 0 0
Goel  3 0 0 0 0 0 0 0 0
Musa  3 0 0 0 0 0 0 0 0
Shooman  3 0 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton  3 3 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  3 3 0 0 0 0 0 0 0
Brooks/Motley  3 3 0 0 0 0 0 0 0
IBM Poisson  3 3 0 0 0 0 0 0 0

Fault Seeding

Mills 3 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  3 0 0 0 0 0 0 0 0
Ho  3 0 0 0 0 0 0 0 0
Ramamoorthy/Bastani  3 0 0 0 0 0 0 0 0



Table B-34 Subfunction Ranked Attributes & Metric Models

Subfunction Title: Support Acceptance Test

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
Reli Surv Intg Thru Usab Main Port Reus Effc

M M M M L M L L M

Time Between Failure 3 3 3 3 1 3 1 1 3

Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  3 3 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  3 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  3 0 0 0 0 0 0 0 0
Goel/Okumoto  3 0 0 0 0 0 0 0 0
Littlewood/Verrall  3 0 0 0 0 0 0 0 0
Lloyd/Lipow  3 0 0 0 0 0 0 0 0

Complexity

Halstead  3 0 0 0 0 3 0 0 0
McCabe  3 0 0 0 0 3 0 0 0
Woodward (Knot Counts)  3 0 0 0 0 3 0 0 0
Chen (Nested Decision Stmts)  3 0 0 0 0 3 0 0 0
Gaffney  3 0 0 0 0 3 0 0 0
Benyon-Tinker  3 0 0 0 0 3 0 0 0
Gilb's (Binary Decision)  3 0 0 0 0 3 0 0 0
Chapin's Q  3 0 0 0 0 3 0 0 0
Segment-Global Usage Pair  3 0 0 0 0 3 0 0 0
Myer's (McCabe extension)  3 0 0 0 0 3 0 0 0
Hansen's (McCabe/Halstead)  3 0 0 0 0 3 0 0 0
Oviedo's (Data/Ctrl Flows)  3 0 0 0 0 3 0 0 0

Failure Count

Goel/Okumoto  3 0 0 0 0 0 0 0 0
Schneidewind  3 0 0 0 0 0 0 0 0
Goel  3 0 0 0 0 0 0 0 0
Musa  3 0 0 0 0 0 0 0 0
Shooman  3 0 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton  3 3 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  3 3 0 0 0 0 0 0 0
Brooks/Motley  3 3 0 0 0 0 0 0 0
IBM Poisson  3 3 0 0 0 0 0 0 0

Fault Seeding

Mills 3 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  3 0 0 0 0 0 0 0 0
Ho  3 0 0 0 0 0 0 0 0
Ramamoorthy/Bastani  3 0 0 0 0 0 0 0 0



Table B-35  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Maintain and Control Management Information Database

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES

Reli Surv Intg Thru Usab Main Port Reus Effc

M M H M H H H H M
Time Between Failure  3 3 5 3 5 5 5 5 3

Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  3 3 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  3 3 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  3 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  3 0 0 0 0 0 0 0 0
Goel/Okumoto  3 0 0 0 0 0 0 0 0
Littlewood/Verrall  3 0 0 0 0 0 0 0 0
Lloyd/Lipow  3 0 0 0 0 0 0 0 0

Complexity

Halstead  3 0 0 0 0 5 0 0 0
McCabe  3 0 0 0 0 5 0 0 0
Woodward (Knot Counts)  3 0 0 0 0 5 0 0 0
Chen (Nested Decision Stmts)  3 0 0 0 0 5 0 0 0
Gaffney  3 0 0 0 0 5 0 0 0
Benyon-Tinker  3 0 0 0 0 5 0 0 0
Gilb's (Binary Decision)  3 0 0 0 0 5 0 0 0
Chapin's Q  3 0 0 0 0 5 0 0 0
Segment-Global Usage Pair  3 0 0 0 0 5 0 0 0
Myer's (McCabe extension)  3 0 0 0 0 5 0 0 0
Hansen's (McCabe/Halstead)  3 0 0 0 0 5 0 0 0
Oviedo's (Data/Ctrl Flows)  3 0 0 0 0 5 0 0 0

Failure Count

Goel/Okumoto  3 0 0 0 0 0 0 0 0
Schneidewind  3 0 0 0 0 0 0 0 0
Goel  3 0 0 0 0 0 0 0 0
Musa  3 0 0 0 0 0 0 0 0
Shooman  3 0 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Jelinski/Moranda  3 3 0 0 0 0 0 0 0
Moranda  3 3 0 0 0 0 0 0 0
Schick/Wolverton  3 3 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  3 3 0 0 0 0 0 0 0
Brooks/Motley  3 3 0 0 0 0 0 0 0
IBM Poisson  3 3 0 0 0 0 0 0 0

Fault Seeding

Mills  3 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  3 0 0 0 0 0 0 0 0
Ho  3 0 0 0 0 0 0 0 0
Ramamoorthy/Bastani  3 0 0 0 0 0 0 0 0



Table B-36  Subfunction Ranked Attributes & Metric Models

Subfunction Title: Management Information Tracking

METRIC CLASSES & MODELS    SUBFUNCTION ATTRIBUTES, RANKINGS & SCORES
Reli Surv Intg Thru Usab Main Port Reus Effc

M L H L H H H H M

Time Between Failure 3 1 5 1 5 5 5 5 3

Jelinski/Moranda  3 1 0 0 0 0 0 0 0
Schick/Wolverton (Linear)  3 1 0 0 0 0 0 0 0
Schick/Wolverton (Parabolic)  3 1 0 0 0 0 0 0 0
Moranda (Geometric De-eut)  3 0 0 0 0 0 0 0 0
Moranda (Hybrid Geomet Poiss)  3 0 0 0 0 0 0 0 0
Goel/Okumoto  3 0 0 0 0 0 0 0 0
Littlewood/Verrall  3 0 0 0 0 0 0 0 0
Lloyd/Lipow  3 0 0 0 0 0 0 0 0

Complexity

Halstead  3 0 0 0 0 5 0 0 0
McCabe  3 0 0 0 0 5 0 0 0
Woodward (Knot Counts)  3 0 0 0 0 5 0 0 0
Chen (Nested Decision Stmts)  3 0 0 0 0 5 0 0 0
Gaffney  3 0 0 0 0 5 0 0 0
Benyon-Tinker  3 0 0 0 0 5 0 0 0
Gilb's (Binary Decision)  3 0 0 0 0 5 0 0 0
Chapin's Q  3 0 0 0 0 5 0 0 0
Segment-Global Usage Pair  3 0 0 0 0 5 0 0 0
Myer's (McCabe extension)  3 0 0 0 0 5 0 0 0
Hansen's (McCabe/Halstead)  3 0 0 0 0 5 0 0 0
Oviedo's (Data/Ctrl Flows)  3 0 0 0 0 5 0 0 0

Failure Count

Goel/Okumoto  3 0 0 0 0 0 0 0 0
Schneidewind  3 0 0 0 0 0 0 0 0
Goel  3 0 0 0 0 0 0 0 0
Musa  3 0 0 0 0 0 0 0 0
Shooman  3 0 0 0 0 0 0 0 0
Moranda  3 1 0 0 0 0 0 0 0
Jelinski/Moranda  3 1 0 0 0 0 0 0 0
Moranda  3 1 0 0 0 0 0 0 0
Schick/Wolverton  3 1 0 0 0 0 0 0 0
Goel/Okumoto (Gen Poisson)  3 1 0 0 0 0 0 0 0
Brooks/Motley  3 1 0 0 0 0 0 0 0
IBM Poisson  3 1 0 0 0 0 0 0 0

Fault Seeding

Mills 3 0 0 0 0 0 0 0 0

Input Domain Based

Nelson  3 0 0 0 0 0 0 0 0
Ho  3 0 0 0 0 0 0 0 0
Ramamoorthy/Bastani  3 0 0 0 0 0 0 0 0



APPENDIX C

BROAD SELECTION OF SOFTWARE
ENVIRONMENT SUPPORT TOOLS

C-1 through C-20


LEGEND

<COLUMN HEADING>   DESCRIPTION
  "Field Value"

<APPL-N>           Major Category
  1. "SWAT"        o Software Analysis Tool
  2. "PSM"         o Project Setup/Monitor
  3. "OLEA"        o On Line Expert Assistance

<TYPE>             Sub Category
                   (Meaningful Functional Descriptors are Used)

<TOOL-NAM>         Tool Name

<COMPANY>          (Self Explanatory)

<CITY>             "
<STATE>            "
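The legend above fixes the record layout of each Appendix C tool listing entry. A minimal sketch of that layout, with the sample entry and field values invented for illustration (only the three APPL-N codes come from the legend):

```python
# Sketch of one Appendix C tool-listing record per the legend above.
# The major-category codes are from the legend; the sample entry below
# is hypothetical.
from dataclasses import dataclass

MAJOR_CATEGORY = {
    "SWAT": "Software Analysis Tool",
    "PSM": "Project Setup/Monitor",
    "OLEA": "On Line Expert Assistance",
}

@dataclass
class ToolRecord:
    appl_n: str    # <APPL-N>   major category code
    type: str      # <TYPE>     sub-category functional descriptor
    tool_nam: str  # <TOOL-NAM> tool name
    company: str   # <COMPANY>
    city: str      # <CITY>
    state: str     # <STATE>

    def category(self) -> str:
        """Expand the major-category code to its legend meaning."""
        return MAJOR_CATEGORY[self.appl_n]

# Hypothetical entry (not from the report):
rec = ToolRecord("SWAT", "Complexity Analyzer", "ExampleTool",
                 "Example Corp", "Arlington", "VA")
print(rec.category())  # Software Analysis Tool
```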


[Appendix C tool listing, pages C-1 through C-20: columns APPL-N, TYPE, TOOL-NAM, COMPANY, CITY, STATE; the entries are unrecoverable in this scan.]


APPENDIX D

[appendix title unrecoverable in this scan]

D-1 through D-30