
RISK MANAGEMENT GUIDE FOR DOD ACQUISITION

Fifth Edition

Department of Defense
Defense Acquisition University

June 2002

PUBLISHED BY THE DEFENSE ACQUISITION UNIVERSITY PRESS
FORT BELVOIR, VIRGINIA 22060-5565

For sale by the U.S. Government Printing Office
Superintendent of Documents, Mail Stop: SSOP, Washington, DC 20402-9328

Please e-mail comments or recommended changes to:

[email protected]

OFFICE OF THE UNDER SECRETARY OF DEFENSE
3000 DEFENSE PENTAGON
WASHINGTON, DC 20301-3000

RISK MANAGEMENT GUIDE

Acquisition reform has changed the way the Department of Defense (DoD) designs, develops, manufactures, and supports systems. Our technical, business, and management approach for acquiring and operating systems has evolved, and continues to evolve. For example, we no longer can rely on military specifications and standards to define and control how our developers design, build, and support our new systems. Today we use commercial hardware and software, promote open systems architecture, and encourage streamlining processes, just to name a few of the initiatives that affect the way we do business. At the same time, the Office of the Secretary of Defense (OSD) has reduced the level of oversight and review of programs and manufacturers' plants.

While the new acquisition model gives government program managers and their contractors broader control and more options than they have enjoyed in the past, it also exposes them to new risks. OSD recognizes that risk is inherent in any acquisition program and considers it essential that program managers take appropriate steps to manage and control risks.

This document is a product of a joint effort by the Under Secretary of Defense (Acquisition, Technology and Logistics) (USD (AT&L)) staff and the Defense Acquisition University. It is based on the material developed by the DoD Risk Management Working Group. Material in this Guide is also reflected in the Risk Management Focus Area of the Program Management Community of Practice (PMCOP) (http://www.pmcop.dau.mil) and in the Defense Acquisition Deskbook, which can be accessed via the Acquisition Support Center Website (http://center.dau.mil).

Frank J. Anderson, Jr.
President
Defense Acquisition University


PREFACE

In 1996, the USD (AT&L) established a Risk Management Working Group composed of members of the Office of the Secretary of Defense (OSD) staff, representatives of the Military Services, and members of other DoD agencies involved in systems acquisition. This group reviewed pertinent DoD directives (DoDD) and regulations, examined how the Services managed risk, studied various examples of risk management by industry, and looked at DoD training and education activity in risk management. Other sources of information were the Software Engineering Institute Risk Initiative, the Open Systems Initiative, and the safety and cost estimating communities. The findings and results of the Working Group investigation were presented to the USD (AT&L) and are summarized below:


Industry

• Focus of efforts is to get a product to market at a competitive price.

• Industry has either a structured or an informal Risk Management process.

• Evolutionary approaches help avoid or minimize risk.

• Most approaches employ risk avoidance, early planning, continuous assessment, and problem-solving techniques.

• Structured approaches, when they exist, are similar to DoD’s approach to Risk Management.

The Working Group concluded that industry has no magic formula for Risk Management.

The Military Services

• The Services differ in their approaches to Risk Management.

• Each approach has its strengths, but no one approach is comprehensive.

• Consolidation of the strengths of each approach could foster better Risk Management in DoD.

The Working Group recommended that the Defense Acquisition Deskbook contain a set of guidelines for sound risk management practices and, further, that it contain a set of risk management definitions that are comprehensive and useful to all the Components.

DoD Policy*

• The risk management policy contained in DoDD 5000.1 is not comprehensive.

The Working Group recommended that DoDD 5000.1 be amended to include a more comprehensive set of risk management policies that focuses on:

• The relationship between the Cost As an Independent Variable (CAIV) concept and Risk Management.

• Requirement that risk management be prospective (forward-looking).

• Establishment of risk management as a primary management technique to be used by Program Managers (PMs).

*Note: The DoD 5000 policy documents referred to in the 1996 Report have since been superseded by a new set of DoD 5000 policy documents issued in the 2000-2002 time frame.

DoD Procedures

• Risk Management procedures in DoD 5000.2-R are inadequate to fully implement the risk management policy contained in DoDD 5000.1.

Procedures are lacking regarding:

– Scope of Risk Management

– Purpose of Risk Management

– Role of Milestone Decision Authorities

– Risk Management's support of CAIV

– Risk assessment during early acquisition phases.

• Some key procedures may have been lost in the transition from DoD 5000.2-M to DoD 5000.2-R.

The Working Group recommended that procedures in DoD 5000.2-R be expanded, using the Defense Acquisition Deskbook as the means of expansion, in order to provide comprehensive guidance for the implementation of risk management policy.

DoD Risk Management Training

• Risk management training for the DoD Acquisition Corps needs to be updated and expanded, and Integrated Product Team (IPT) and Overarching IPT (OIPT) personnel need to be educated on the new and expanding role of risk management in DoD systems acquisition.

• The Risk Management knowledge level needs improvement.

• Education is a key to obtaining the support of OIPTs and PMs.

The Working Group recommended that the Defense Acquisition University (DAU) include training for Risk Management in all functional courses and develop a dedicated risk management course for acquisition corps personnel.

Following that guidance, Working Group members wrote the risk management portions of the Defense Acquisition Deskbook. The Defense Acquisition Deskbook is accessible from the DAU Acquisition Support Center (http://center.dau.mil).

The Risk Management part of the Defense Acquisition Deskbook forms the basis for this Guide. The goal of the Risk Management Guide is to provide acquisition professionals and program management offices with a practical reference for dealing with system acquisition risks. It has also been designed to be used as an aid in DAU course offerings.

This Guide reflects the efforts of many people. Mr. Mark Schaeffer, former Deputy Director, Systems Engineering, who chaired the Risk Management Working Group, and Mr. Mike Zsak and Mr. Tom Parry, formerly of the AT&L Systems Engineering Support Office, were the original driving force behind the risk management initiative. Mr. Paul McMahon and Mr. Bill Bahnmaier from the DAU/DSMC faculty; Mr. Greg Caruth, Ms. Debbie Gonzalez, and Ms. Frances Battle from the DAU Press; Ms. Patricia Bartlett from Bartlett Communications; and Mr. Norman Bull guided the composition of the Guide. Assistance was also provided by Mr. Jeff Turner of the DAU Publications Distribution Center. Special recognition goes to the Institute for Defense Analyses team composed of Mr. Louis Simpleman, Mr. Ken Evans, Mr. Jim Lloyd, Mr. Gerald Pike, and Mr. Richard Roemer, who compiled the data and wrote major portions of the text. Also, special thanks to Ms. Margaret Adcock of the Navy Acquisition Reform Office for her detailed comments and support.

Charles B. Cochrane
Director
DAU Center for Program Management

William W. Bahnmaier
Editor
DAU Center for Program Management


CONTENTS

Chapter 1 INTRODUCTION
1.1 Purpose and Scope
1.2 Organization of the Guide
1.3 Approach to Risk Management
1.4 DoD Risk Management Policies and Procedures

Chapter 2 RISK AND RISK MANAGEMENT
2.1 Introduction
2.2 Overview
2.3 Risk Management Structure and Definitions
2.4 Risk Discussion
  2.4.1 Characteristics of Acquisition Risk
  2.4.2 Program Products, Processes, Risk Areas, and Risk Events
2.5 Risk Planning
  2.5.1 Purpose of Risk Plans
  2.5.2 Risk Planning Process
2.6 Risk Assessment
  2.6.1 Purpose of Risk Assessments
  2.6.2 Risk Assessment Process
  2.6.3 Timing of Risk Assessments
  2.6.4 Conducting Risk Assessments
2.7 Risk Handling
  2.7.1 Purpose of Risk Handling
  2.7.2 Risk-Handling Process
2.8 Risk Monitoring
2.9 Risk Documentation

Chapter 3 RISK MANAGEMENT AND THE DOD ACQUISITION PROCESS
3.1 Introduction
3.2 Overview
3.3 DoD Acquisition Process
3.4 Characteristics of the Acquisition Process
  3.4.1 Integrated Product and Process Development (IPPD)
  3.4.2 Continuous Risk Management
  3.4.3 Program Stability
  3.4.4 Reduction of Life-Cycle Costs
  3.4.5 Event-Oriented Management
  3.4.6 Modeling and Simulation
3.5 Risk Management Activities during Acquisition Phases
  3.5.1 Concept and Technology Development (CTD) Phase
  3.5.2 Subsequent Phases
3.6 Risk Management and Milestone Decisions
3.7 Risk Management and the Acquisition Strategy
3.8 Risk Management and CAIV

Chapter 4 RISK MANAGEMENT AND PROGRAM MANAGEMENT
4.1 Introduction
4.2 Overview
4.3 Program Manager and Risk Management
  4.3.1 Risk Management Is a Program Management Tool
  4.3.2 Risk Management Is a Formal Process
  4.3.3 Risk Management Is Forward-Looking
  4.3.4 Risk Management Is Integral to Integrated Product and Process Development (IPPD)
4.4 Risk Management Organization in the PMO
  4.4.1 Risk Management Organizational Structure
  4.4.2 Risk Management Responsibilities
4.5 Contractor Risk Management
  4.5.1 Contractor View of Risk
  4.5.2 Government/Contractor Relationship
4.6 Risk Management and the Contractual Process
  4.6.1 Risk Management: Pre-Contract Award
  4.6.2 Early Industry Involvement: Industrial Capabilities Review
  4.6.3 Developing the Request for Proposal
  4.6.4 The Offeror's Proposal
  4.6.5 Basis for Selection
  4.6.6 Source Selection
4.7 Risk Management: Post-Contract Award
4.8 Risk Management Reporting and Information System
4.9 Risk Management Training

Chapter 5 RISK MANAGEMENT TECHNIQUES
5.1 Introduction
5.2 Overview
5.3 Risk Planning Techniques
  5.3.1 Description
  5.3.2 Procedures
5.4 Risk Assessment Techniques
  5.4.1 Product (WBS) Risk Assessment
  5.4.2 Process (DoD 4245.7-M) Risk Assessment
  5.4.3 Program Documentation Evaluation Risk Identification
  5.4.4 Threat and Requirements Risk Assessment
  5.4.5 Cost Risk Assessment
  5.4.6 Quantified Schedule Risk Assessment
  5.4.7 Expert Interviews
  5.4.8 Analogy Comparison/Lessons-Learned Studies
5.5 Risk Prioritization
  5.5.1 Description
  5.5.2 Procedures
5.6 Risk-Handling Techniques
  5.6.1 General
  5.6.2 Risk Control
  5.6.3 Risk Avoidance
  5.6.4 Risk Assumption
  5.6.5 Risk Transfer
5.7 Risk Monitoring
  5.7.1 General
  5.7.2 Earned Value Management
  5.7.3 Technical Performance Measurement
  5.7.4 Integrated Planning and Scheduling
  5.7.5 Watch List
  5.7.6 Reports
  5.7.7 Management Indicator System
5.8 Risk Management Information Systems and Documentation
  5.8.1 Description
  5.8.2 Risk Management Reports
5.9 Software Risk Management Methodologies
  5.9.1 Software Risk Evaluation (SRE)
  5.9.2 Boehm's Software Risk Management Method
  5.9.3 Best Practices Initiative Risk Management Method

APPENDIX A – DOD RISK MANAGEMENT POLICIES AND PROCEDURES
  DoD Directive 5000.1, The Defense Acquisition System (w/chg 1 dtd 4 Jan 2001)
  DoD Instruction 5000.2, Operation of the Defense Acquisition System, 5 April 2002
  DoD Regulation 5000.2-R, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs, 5 April 2002
  DoD Directive 5000.4, OSD Cost Analysis Improvement Group (CAIG), November 24, 1992
  DoD 5000.4-M, Cost Analysis Guidance and Procedures, December 1992

APPENDIX B – GENERIC RISK MANAGEMENT PLAN
  Sample Risk Management Plan
  Preface
  Sample Format for Risk Management Plan
  Sample Risk Management Plan for the XYZ Program (ACAT I, II)
    1.0 Introduction
      1.1 Purpose
      1.2 Program Summary
        1.2.1 System Description
        1.2.2 Acquisition Strategy
        1.2.3 Program Management Approach
      1.3 Definitions
        1.3.1 Risk
        1.3.2 Risk Event
        1.3.3 Technical Risk
        1.3.4 Cost Risk
        1.3.5 Schedule Risk
        1.3.6 Risk Ratings
        1.3.7 Independent Risk Assessor
        1.3.8 Templates and Best Practices
        1.3.9 Metrics
        1.3.10 Critical Program Attributes
    2.0 Risk Management
      2.1 General Approach and Status
      2.2 Risk Management Strategy
      2.3 Organization
        2.3.1 Risk Management Coordinator
        2.3.2 Program Level Integrated Product Team (PLIPT)
        2.3.3 PIPTs
        2.3.4 XYZ Independent Risk Assessors
        2.3.5 Other Risk Assessment Responsibilities
        2.3.6 User Participation
        2.3.7 Risk Training
    3.0 Risk Management Process and Procedures
      3.1 Overview
      3.2 Risk Planning
        3.2.1 Process
        3.2.2 Procedures
      3.3 Risk Assessment
        3.3.1 Process
        3.3.2 Procedures
      3.4 Risk Handling
        3.4.1 Process
        3.4.2 Procedures
      3.5 Risk Monitoring
        3.5.1 Process
        3.5.2 Procedures
    4.0 Risk Management Information System and Documentation
      4.1 Risk Management Information System (RMIS)
      4.2 Risk Documentation
        4.2.1 Risk Assessment Documentation
        4.2.2 Risk Handling Documentation
        4.2.3 Risk Monitoring Documentation
      4.3 Reports
        4.3.1 Standard Reports
        4.3.2 Ad Hoc Reports
    Annex A to XYZ Risk Management Plan – Critical Program Attributes
    Annex B to XYZ Risk Management Plan – Program Risk Reduction Schedule
    Annex C to XYZ Risk Management Plan – Program Metric Examples
    Annex D to XYZ Risk Management Plan – Management Information System and Documentation
      1.0 Description
      2.0 Risk Management Reports – XYZ Program
        2.1 Risk Information Form
        2.2 Risk Assessment Report
        2.3 Risk Handling Documentation
        2.4 Risk Monitoring Documentation
      3.0 Database Management System (DBMS)
  Sample Risk Management Plan for the ABC Program (ACAT III, IV)
    1.0 Introduction
      1.1 Purpose
    2.0 Program Summary
      2.1 Description
      2.2 Acquisition Strategy
      2.3 Program Management Approach
    3.0 Risk-Related Definitions
      3.1 Technical Risk
      3.2 Cost Risk
      3.3 Risk Ratings
    4.0 Risk Management Status and Strategy
      4.1 Risk Management Status
      4.2 Risk Management Strategy
    5.0 Organization
      5.1 Program Office
    6.0 Risk Management Structures and Procedures
      6.1 Risk Planning
      6.2 Risk Assessment
        6.2.1 Risk Identification
        6.2.2 Risk Analysis
        6.2.3 Risk Rating
        6.2.4 Risk Prioritization
      6.3 Risk Handling
      6.4 Risk Monitoring
      6.5 Risk Management Information System (RMIS), Documentation, and Reports
    Annex A to ABC Risk Management Plan – Critical Program Attributes
    Annex B to ABC Risk Management Plan – Program Risk Reduction Schedule
      1.0 Description
      2.0 Risk Management Forms and Reports
        2.1 Risk Information Form
        2.2 Risk Monitoring Documentation
        2.3 PIPT Risk Summary Report

APPENDIX C – GLOSSARY

APPENDIX D – QUANTIFYING EXPERT JUDGMENT

APPENDIX E – BIBLIOGRAPHY

FIGURES

2-1. Risk Management Structure
2-2. Critical Process Areas and Templates
2-3. A Risk Management Plan Outline/Format
2-4. Risk Assessment
2-5. Example of a WBS Dependent Evaluation Structure
2-6. Overall Risk Rating (Example)
4-1. Decentralized Risk Management Organization
5-1. Risk Planning Technique Input and Output
5-2. Sample Format for Risk Management Plan
5-3. Product (WBS) Risk Assessment Techniques Input and Output
5-4. Process (DoD 4245.7-M) Risk Assessment Technique Input and Output
5-5. Plan Evaluation Technique Input and Output
5-6. Concept and Technology Development (CTD) Phase Correlation of Selected Documents (Example)
5-7. Threat and Requirement Risk Assessment Technique Input and Output
5-8. Cost Risk Assessment Top-Level Diagram
5-9. Schedule Risk Assessment Technique Input and Output
5-10. Expert Interview Technique Input and Output
5-11. Analogy Comparison/Lessons-Learned Studies Top-Level Diagram
5-12. Risk Prioritization Technique Input and Output
5-13. Risk Aggregation Techniques Input and Output
5-14. List of Aggregated Risks
5-15. Example Showing Detailed List of Top-Level Risk Information
5-16. Example of More Complex Combination of Risk Level and Scheduled Tasks
5-17. Conceptual Risk Management and Reporting System
B-1. Risk Management and the Acquisition Process
B-2. XYZ Risk Management Organization
B-3. Risk Management Structure
B-4. Risk Assessment Process
B-5. XYZ Program Risk Reduction Schedule (Example)
B-6. Conceptual Risk Management and Reporting System
B-7. Risk Information Form
B-8. Risk Tracking Report Example
B-9. ABC Risk Management Organization
B-10. Risk Assessment Process
B-11. Risk Tracking Report Example

TABLES

2-1. Risk Assessment Approaches
2-2. Probability/Likelihood Criteria (Example)
2-3. Consequences/Impacts Criteria (Example)
2-4. Overall Risk Rating Criteria (Example)
2-5. Risk Ratings (Example)
4-1. Notional Description of Risk Management Responsibilities
4-2. Significant Risks by Critical Risk Areas
4-3. Risk Management Reference Documents
5-1. Critical Risk Areas and Example Elements
5-2. Examples of Demonstration Events
5-3. Watch List Example
5-4. Examples of Product-Related Metrics
5-5. Examples of Process Metrics
5-6. Examples of Cost and Schedule Metrics
5-7. Database Management System Elements
5-8. Software Risk Management Steps
5-9. Top 10 Software Risks
5-10. Best Practices Initiative Risk Management Method
5-11. Software Risk Grouping
B-1. Critical Program Attributes
B-2. Examples of Product-Related Metrics
B-3. Examples of Process Metrics
B-4. Examples of Cost and Schedule Metrics
B-5. DBMS Elements
B-6. Watch List Example
B-7. Probability/Likelihood Levels
B-8. Risk Consequence
B-9. Critical Program Attributes
B-10. DBMS Elements
B-11. Sample Watch List
B-12. Example PIPT Risk Summary Report
B-13. Examples of Process Metrics
B-14. Examples of Cost and Schedule Metrics

Chapter 1

INTRODUCTION

Risk has always been a concern in the acquisition of Department of Defense (DoD) systems. The acquisition process itself is designed, to a large degree, to allow risks to be controlled from conception to delivery of a system. Unfortunately, in the past, some Program Managers (PMs) and decision makers have viewed risk as something to be avoided. Any program that had risk was subject to intense review and oversight. This attitude has changed. DoD managers recognize that risk is inherent in any program and that it is necessary to analyze future program events to identify potential risks and take measures to handle them.

Risk management is concerned with the outcome of future events, whose exact outcome is unknown, and with how to deal with these uncertainties, i.e., a range of possible outcomes. In general, outcomes are categorized as favorable or unfavorable, and risk management is the art and science of planning, assessing, and handling future events to ensure favorable outcomes. The alternative to risk management is crisis management, a resource-intensive process that is normally constrained by a restricted set of available options.

1.1 PURPOSE AND SCOPE

This Risk Management Guide is designed to provide acquisition professionals and program management offices (PMOs) with a reference book for dealing with system acquisition risks. It is intended to be useful as an aid in classroom instruction and as a reference book for practical applications. Most of the material in this Guide is derived from the Defense Acquisition Deskbook. Readers should refer to Paragraph 2.5.2 of the Defense Acquisition Deskbook for any new risk management information that is disseminated between publications of updated Guide editions.

1.2 ORGANIZATION OF THE GUIDE

The Risk Management Guide discusses risk and risk management, defines terms, and introduces basic risk management concepts (Chapter 2).

Chapter 3 examines risk management concepts relative to the DoD acquisition process. It illustrates how risk management is an integral part of program management, describes interaction with other acquisition processes, and identifies and discusses the various types of acquisition risks.

Chapter 4 discusses the implementation of a risk management program from the perspective of a PMO. This chapter focuses on practical application issues such as risk management program design options, PMO risk management organizations, and criteria for a risk management information system (MIS).

Chapter 5, the final chapter, describes a number of techniques that address the aspects (phases) of risk management, i.e., planning, assessment, handling, and monitoring.


This Guide is a source of background information and provides a starting point for a risk management program. None of the material is mandatory. PMs should tailor the approaches and techniques to fit their programs.

The Risk Management Guide also contains appendices that are intended to serve as reference material and examples, and provide backup detail for some of the concepts presented in the main portion of the Guide.

1.3 APPROACH TO RISK MANAGEMENT

Based on the DoD model contained in the Defense Acquisition Deskbook (described in Chapter 2), this Guide emphasizes a risk management approach that is disciplined, forward-looking, and continuous.

In 1986, the General Accounting Office (GAO), as part of an evaluation of DoD policies and procedures for technical risk assessments, developed a set of criteria as an approach to good risk assessments. These criteria, with slight modification, apply to all aspects of risk management and are encompassed in the Guide's approach. They are:

(1) Planned Procedures. Risk management is planned and systematic.

(2) Prospective Assessment. Potential future problems are considered, not just current problems.

(3) Attention to Technical Risk. There is explicit attention to technical risk.

(4) Documentation. All aspects of the risk management program are recorded and data maintained.

(5) Continual Process. Risk assessments are made throughout the acquisition process; handling activities are continually evaluated and changed if necessary; and critical risk areas are always monitored.

While these criteria are not by themselves sufficient to determine the "health" of a program, they are important indicators of how well a risk management process is being implemented. A proactive risk management process is a good start toward a successful program.

1.4 DOD RISK MANAGEMENT POLICIES AND PROCEDURES

DoD policies and procedures that address risk management for acquisition programs are contained in five key DoD documents. DoD Directive (DoDD) 5000.1 (The Defense Acquisition System) contains overall acquisition policy, with a strong basis in risk management. The policy on risk management is amplified further by the information in DoD Instruction (DoDI) 5000.2 (Operation of the Defense Acquisition System) and DoD 5000.2-R (Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs). These documents integrate risk management into the acquisition process, describe the relationship between risk and various acquisition functions, and establish some reporting requirements. DoDD 5000.4 and DoD 5000.4-M address risk and cost analysis guidance as they apply to the Office of the Secretary of Defense. Appendix A is an extract of existing risk management policies and procedures from all of these documents.

The DoD 5000 series contains strong statements on risk management but requires elaboration to help the PM establish an effective risk management program. The information furnished in the Risk Management section of the Defense Acquisition Deskbook supports and expands the contents of the DoD 5000 series.

The DoD risk management policies and procedures provide the basis for this Guide, which complements the Defense Acquisition Deskbook by elaborating on risk management concepts and by providing greater detail for applying techniques.


Chapter 2

RISK AND RISK MANAGEMENT

2.1 INTRODUCTION

This Chapter introduces the concepts of risk and risk management by explaining the DoD risk-related definitions and by identifying the characteristics of acquisition risks. It also presents and discusses a structured concept for risk management and its five subordinate processes.

2.2 OVERVIEW

The DoD risk management concept is based on the principles that risk management must be forward-looking, structured, informative, and continuous. The key to successful risk management is early planning and aggressive execution. Good planning enables an organized, comprehensive, and iterative approach for identifying and assessing the risk and handling options necessary to refine a program acquisition strategy. To support these efforts, assessments should be performed as early as possible in the life cycle to ensure that critical technical, schedule, and cost risks are addressed with mitigation actions incorporated into program planning and budget projections.

PMs should update program risk assessments and tailor their management strategies accordingly. Early information gives them data that helps when writing a Request for Proposal and assists in Source Selection planning. As a program progresses, new information improves insight into risk areas, thereby allowing the development of effective handling strategies. The net result promotes executable programs.

Effective risk management requires involvement of the entire program team and also requires help from outside experts knowledgeable in critical risk areas (e.g., threat, technology, design, manufacturing, logistics, schedule, and cost). In addition, the risk management process should cover hardware, software, the human element, and integration issues. Outside experts may include representatives from the user, laboratories, contract management, test, logistics, and sustainment communities, and industry. Users, essential participants in program trade analyses, should be part of the assessment process so that an acceptable balance among cost, schedule, performance, and risk can be reached. A close relationship between the Government and industry, and later with the selected contractor(s), promotes an understanding of program risks and assists in developing and executing the management efforts.

Successful risk management programs generally have the following characteristics:

• Feasible, stable, and well-understood user requirements and threat;

• A close relationship with user, industry, and other appropriate participants;


• A planned and structured risk management process, integral to the acquisition process;

• An acquisition strategy consistent with risk level and risk-handling strategies;

• Continual reassessment of program and associated risks;

• A defined set of success criteria for all cost, schedule, and performance elements, e.g., Acquisition Program Baseline (APB) thresholds;

• Metrics to monitor the effectiveness of risk-handling strategies;

• An effective Test and Evaluation Program; and

• Formal documentation.

PMs should follow the guidelines below to ensure that a management program possesses the above characteristics.

• Assess program risks, using a structured process, and develop strategies to manage these risks throughout each acquisition phase.

• Identify early and intensively manage those design parameters that critically affect cost, capability, or readiness.

• Use technology demonstrations/modeling/simulation and aggressive prototyping to reduce risks.

• Use test and evaluation as a means of quantifying the results of the risk-handling process.

• Include industry and user participation in risk management.

• Use Developmental Test and Evaluation (DT&E) and early operational assessments when appropriate.

• Establish a series of "risk assessment reviews" to evaluate the effectiveness of risk handling against clearly defined success criteria.

• Establish the means and format to communicate risk information and to train participants in risk management.

• Prepare an assessment training package for members of the program office and others, as needed.

• Acquire approval of accepted risks at the appropriate decision level.

In general, management of software risk is the same as management of other types of risk, and techniques that apply to hardware programs are equally applicable to software-intensive programs. However, some characteristics of software make this type of risk management different, primarily because it is difficult to:

• Identify software risk.

• Estimate the time and resources required to develop new software, resulting in potential risks in cost and schedule.

• Test software completely because of the number of paths that can be followed in the logic of the software.

• Develop new programs because of the rapid changes in information technology and an ever-increasing demand for quality software personnel.


2.3 RISK MANAGEMENT STRUCTURE AND DEFINITIONS

Although each risk management strategy depends upon the nature of the system being developed, research reveals that good strategies contain the same basic processes and structure, shown in Figure 2-1. This structure is sometimes also referred to as the Risk Management Process Model. The application of these processes varies with acquisition phase and the degree of system definition; all should be integrated into the program management function. The elements of the structure are discussed in the following paragraphs of this Chapter; however, in order to form a basis for discussion, the Defense Acquisition Deskbook definitions for the processes and elements of risk management are as follows:

Risk is a measure of the potential inability to achieve overall program objectives within defined cost, schedule, and technical constraints. It has two components: (1) the probability/likelihood of failing to achieve a particular outcome, and (2) the consequences/impacts of failing to achieve that outcome.


Risk events, i.e., things that could go wrong for a program or system, are elements of an acquisition program that should be assessed to determine the level of risk. The events should be defined to a level of detail at which an individual can comprehend the potential impact and its causes. For example, a potential risk event for a turbine engine could be turbine blade vibration. A program typically faces a series of potential risk events that should be selected, examined, and assessed by subject-matter experts.

The relationship between the two components of risk (probability and consequence/impact) is complex. To avoid obscuring the results of an assessment, the risk associated with an event should be characterized in terms of its two components. As part of the assessment there is also a need for backup documentation containing the supporting data and assessment rationale.

Risk management is the act or practice of dealing with risk. It includes planning for risk, assessing (identifying and analyzing) risk areas, developing risk-handling options, monitoring risks to determine how risks have changed, and documenting the overall risk management program.

[Figure 2-1. Risk Management Structure: risk management comprises Risk Planning, Risk Assessment (Risk Identification and Risk Analysis), Risk Handling, and Risk Monitoring, all supported by Risk Documentation.]


Risk planning is the process of developing and documenting an organized, comprehensive, and interactive strategy and methods for identifying and tracking risk areas, developing risk-handling plans, performing continuous risk assessments to determine how risks have changed, and assigning adequate resources.

Risk assessment is the process of identifying and analyzing program areas and critical technical process risks to increase the probability/likelihood of meeting cost, schedule, and performance objectives. Risk identification is the process of examining the program areas and each critical technical process to identify and document the associated risk. Risk analysis is the process of examining each identified risk area or process to refine the description of the risk, isolate the cause, and determine the effects. It includes risk rating and prioritization, in which risk events are defined in terms of their probability of occurrence, severity of consequence/impact, and relationship to other risk areas or processes.

Risk handling is the process that identifies, evaluates, selects, and implements options in order to set risk at acceptable levels given program constraints and objectives. This includes the specifics on what should be done, when it should be accomplished, who is responsible, and associated cost and schedule. The most appropriate strategy is selected from these handling options. For purposes of the Guide, risk handling is an all-encompassing term, whereas risk mitigation is one subset of risk handling.

Risk monitoring is the process that systematically tracks and evaluates the performance of risk-handling actions against established metrics throughout the acquisition process and develops further risk-handling options, as appropriate. It feeds information back into the other risk management activities of planning, assessment, and handling, as shown in Figure 2-1. A similar feedback mechanism is discussed by Dr. Edmund Conrow in his book Effective Risk Management: Some Keys to Success.

Risk documentation is recording, maintaining, and reporting assessments, handling analysis and plans, and monitoring results. It includes all plans, reports for the PM and decision authorities, and reporting forms that may be internal to the PMO.
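Taken together, these definitions describe a closed loop: planning scopes the effort, assessment identifies and analyzes risk events, handling selects responses, monitoring measures their effect and feeds results back, and documentation records every step. For illustration only, the skeleton of that loop can be sketched in a few lines of Python; the class and method names below are hypothetical and are not drawn from the Guide or from any DoD system.

    # Illustrative sketch of the Figure 2-1 loop; hypothetical names only.
    from dataclasses import dataclass, field

    @dataclass
    class RiskEvent:
        name: str                # e.g., "turbine blade vibration"
        probability: int         # assessed likelihood level
        consequence: int         # assessed impact level
        handling_plan: str = ""  # selected handling option, if any
        is_open: bool = True     # set False once the risk is retired

    @dataclass
    class RiskProgram:
        plan: str                                     # documented planning strategy
        risks: list[RiskEvent] = field(default_factory=list)
        log: list[str] = field(default_factory=list)  # risk documentation

        def assess(self, event: RiskEvent) -> None:
            # Risk assessment: identify a risk event and record it.
            self.risks.append(event)
            self.log.append(f"assessed: {event.name}")

        def handle(self, event: RiskEvent, option: str) -> None:
            # Risk handling: select and document a handling option.
            event.handling_plan = option
            self.log.append(f"handling {event.name} via {option}")

        def monitor(self) -> list[RiskEvent]:
            # Risk monitoring: report open risks so they can be fed back
            # into planning, assessment, and handling.
            return [r for r in self.risks if r.is_open]

Each monitoring pass returns the open risks to the front of the loop, which is the feedback relationship shown in Figure 2-1.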

2.4 RISK DISCUSSION

Implicit in the definition of risk is the concept that risks are future events, i.e., potential problems, and that there is uncertainty associated with the program if these risk events occur. Therefore, there is a need to determine, as much as possible, the probability of a risk event occurring and to estimate the consequence/impact if it occurs. The combination of these two factors determines the level of risk. For example, an event with a low probability of occurring, yet with severe consequences/impacts, may be a candidate for handling. Conversely, an event with a high probability of happening, but the consequences/impacts of which do not affect a program, may be acceptable and require no handling.
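The combination of the two factors can be made concrete with a simple rating matrix of the kind shown later in Tables 2-2 through 2-5. The sketch below, in the same illustrative Python as above, assumes five-level probability and consequence scales and invented cell boundaries; it is not the Guide's official rating criteria.

    # Illustrative only: the 1-5 scales and the matrix boundaries are
    # assumptions for this example, not official DoD rating criteria.
    def risk_level(probability: int, consequence: int) -> str:
        """Combine probability and consequence levels (1 = low, 5 = high)
        into an overall risk rating."""
        if not (1 <= probability <= 5 and 1 <= consequence <= 5):
            raise ValueError("levels run from 1 (low) to 5 (high)")
        if probability >= 4 and consequence >= 4:
            return "High"      # likely and severe: priority for handling
        if probability <= 2 and consequence <= 2:
            return "Low"       # unlikely and benign: may be acceptable
        return "Moderate"      # all other combinations warrant attention

    # The two boundary cases discussed above both rate as "Moderate":
    print(risk_level(1, 5))  # low probability, severe consequence
    print(risk_level(5, 1))  # high probability, negligible consequence

A real matrix would distinguish these boundary cases further; the point of the sketch is simply that neither component alone determines the rating.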

To reduce uncertainty and apply the definition of risk to acquisition programs, PMs must be familiar with the types of acquisition risks, understand risk terminology, and know how to measure risk. These topics are addressed in the next several sections.

2.4.1 Characteristics of Acquisition Risk

Acquisition programs tend to have numerous, often interrelated, risks. They are not always obvious; relationships may be obscure; and they may exist at all program levels throughout the life of a program. Risks are in the PMO (program plans, etc.); in support provided by other Government agencies; in threat assessment; and in prime contractor processes, engineering and manufacturing processes, and technology. The interrelationship among risk events may cause an increase in one because of the occurrence of another. For example, a slip in schedule for an early test event may adversely impact subsequent tests, assuming a fixed period of test time is available.

Another important risk characteristic is the time period before a risk event occurs, because time is critical in determining risk-handling options. If an event is imminent, the PMO must resort to crisis management. An event that is far enough in the future to allow management actions may be controllable. The goal is to avoid the need to revert to crisis management and problem solving by managing risk up front.

An event's probability of occurrence and consequences/impacts may change as the development process proceeds and information becomes available. Therefore, throughout the development phase, PMOs should reevaluate known risks on a periodic basis and examine the program for new risks.

2.4.2 Program Products, Processes, Risk Areas, and Risk Events

Program risk includes all risk events and their relationships to each other. It is a top-level assessment of impact to the program when all risk events at the lower levels of the program are considered. Program risk may be a roll-up of all low-level events; however, most likely, it is a subjective evaluation of the known risks by the PMO, based on the judgment and experience of experts. Any roll-up of program risks must be carefully done to prevent key risk issues from "slipping through the cracks." Identifying program risk is essential because it forces the PMO to consider relationships among all risks and may identify potential areas of concern that would have otherwise been overlooked. One of the greatest strengths of a formal, continuous risk management process is the proactive quest to identify risk events for handling and the reduction of uncertainty that results from handling actions.

A program office has continuous demands on its time and resources. It is, at best, difficult, and probably impossible, to assess every potential area and process. To manage risk, the PMOs should focus on the critical areas that could affect the outcome of their programs. Work Breakdown Structure (WBS) product and process elements and industrial engineering and manufacturing processes contain most of the significant risk events. Risk events are determined by examining each WBS element and process in terms of sources or areas of risk. Broadly speaking, these sources generally can be grouped as cost, schedule, and performance, with the latter including technical risk. Following are some typical risk areas:

• Threat. The sensitivity of the program to uncertainty in the threat description, the degree to which the system design would have to change if the threat's parameters change, or the vulnerability of the program to foreign intelligence collection efforts (sensitivity to threat countermeasure).

• Requirements. The sensitivity of the program to uncertainty in the system description and requirements, except for those caused by threat uncertainty.

• Design. The ability of the system configuration to achieve the program's engineering objectives based on the available technology, design tools, design maturity, etc.

• Test and Evaluation (T&E). The adequacy and capability of the T&E program to assess attainment of significant performance specifications and determine whether the systems are operationally effective and suitable.

• Modeling and Simulation (M&S). The adequacy and capability of M&S to support all phases of a program using verified, valid, and accredited M&S tools.

• Technology. The degree to which the technology proposed for the program has been demonstrated as capable of meeting all of the program's objectives.

• Logistics. The ability of the system configuration to achieve the program's logistics objectives based on the system design, maintenance concept, support system design, and availability of support resources.

• Production. The ability of the system configuration to achieve the program's production objectives based on the system design, manufacturing processes chosen, and availability of manufacturing resources such as facilities and personnel.

• Concurrency. The sensitivity of the program to uncertainty resulting from the combining or overlapping of life-cycle phases or activities.

• Capability of Developer. The ability of the developer to design, develop, and manufacture the system. The contractor should have the experience, resources, and knowledge to produce the system.

• Cost/Funding. The ability of the system to achieve the program's life-cycle cost objectives. This includes the effects of budget and affordability decisions and the effects of inherent errors in the cost estimating technique(s) used (given that the technical requirements were properly defined).

• Management. The degree to which program plans and strategies exist and are realistic and consistent. The Government's acquisition team should be qualified and sufficiently staffed to manage the program.

• Schedule. The adequacy of the time allocated for performing the defined tasks, e.g., developmental, production, etc. This factor includes the effects of programmatic schedule decisions, the inherent errors in the schedule estimating technique used, and external physical constraints.

Critical risk processes are the developer's engineering and production processes which, historically, have caused the most difficulty during the development and/or production phases of acquisition programs. These processes include, but are not limited to, design, test, production, facilities, logistics, and management. These processes are included in the critical risk areas and are addressed separately to emphasize that they focus on processes. DoD 4245.7-M, Transition from Development to Production, describes them using templates. See Figure 2-2 for an example of the templates for product development. The templates are the result of a Defense Science Board task force, composed of Government and industry experts, who identified engineering processes and control methods to minimize risk in both Government and industry. The task force defined these critical events in terms of the templates, which are briefly discussed later. A copy of DoD 4245.7-M may be obtained at the Defense Technical Information Center (DTIC) Website: http://www.dtic.mil/whs/directives.

Additional areas, such as manpower, environmental impact, systems safety and health, and systems engineering, that are analyzed during program plan development provide indicators for additional risk. The PMO should consider these areas for early assessment since failure to do so could cause dire consequences/impacts in the program's latter phases.

In addition, PMs should address the uncertainty associated with security, an area sometimes overlooked by developers but addressed in the Acquisition System Protection (ASP) section of the Defense Acquisition Deskbook and Air Force Pamphlet ASPWG PH-1, Acquisition System Protection Program Work Book, September 1994. However, in addition to the guidance given there, PMs must recognize that, in the past, classified programs have experienced difficulty in access, facilities, clearances, and visitor control. Failure to manage these aspects of a classified program could adversely affect cost and schedule.

Figure 2-2. Critical Process Areas and Templates

    PRODUCT
    Funding:    Money Phasing
    Design:     Design Ref. Mission Profile; Design Requirements; Trade Studies; Design Policy; Design Studies; Design Analysis; Parts & Materials Selection; Software Design; Computer-Aided Design (CAD); Design for Testing; Built-in Test; Configuration Control; Design Reviews
    Test:       Integrated Test; Failure Reporting System; Uniform Test Report; Software Test; Design Limit; Life; Test, Analyze and Fix (TAAF); Field Feedback
    Production: Manufacturing Plan; Quality Mfg. Process; Piece Part Control; Subcontractor Control; Defect Control; Tool Planning; Special Test Equipment (STE); Computer-Aided Mfg. (CAM); Manufacturing Screening
    Facilities: Modernization; Factory Improvements; Productivity Center
    Logistics:  Logistics Support Analysis; Manpower & Personnel; Support & Test Equipment; Training Materials & Equipment; Spares
    Management: Manufacturing Strategy; Personnel Requirements; Data Management; Technical Risk Assessment; Production Breaks; Transition Plan

2.5 RISK PLANNING

2.5.1 Purpose of Risk Plans

Risk planning is the detailed formulation of a program of action for the management of risk. It is the process to:

• Develop and document an organized, comprehensive, and interactive risk management strategy.

• Determine the methods to be used to execute a PM's risk management strategy.

• Plan for adequate resources.

Risk planning is iterative and includes describing and scheduling the activities and process to assess (identify and analyze), handle, monitor, and document the risk associated with a program. The result is the Risk Management Plan (RMP).

2.5.2 Risk Planning Process

The PMO should periodically review the plan and revise it, if necessary. Some events, such as: (1) a change in acquisition strategy, (2) preparation for a major decision point, (3) technical audits and reviews, (4) an update of other program plans, and (5) preparation for a Program Objective Memorandum (POM) submission, may drive the need to update an existing plan.

Planning begins by developing and documenting a risk management strategy. Early efforts establish the purpose and objective, assign responsibilities for specific areas, identify additional technical expertise needed, describe the assessment process and areas to consider, delineate procedures for consideration of handling options, define a risk rating scheme, dictate the reporting and documentation needs, and establish report requirements and monitoring metrics. This planning should also address evaluation of the capabilities of potential sources, as well as early industry involvement in the program.

The PM's strategy to manage risk provides the program team with direction and a basis for planning. Initially formalized during a program's Concept Exploration Phase and updated for each subsequent program phase, the strategy should be reflected in the program's acquisition strategy, which, together with requirement and threat documents, known risks, and system and program characteristics, is a source of information the PMO can use to devise a strategy and begin developing a Risk Management Plan. Since the program's risks are affected by the Government and contractor team's ability to develop and manufacture the system, industry can provide valuable insight into this area of consideration.

The plan is the road map that tells the Government and contractor team how to get from where the program is today to where the PM wants it to be in the future. The key to writing a good plan is to provide the necessary information so the program team knows the objectives, goals, and the PMO's risk management process. Since it is a map, it may be specific in some areas, such as the assignment of responsibilities for Government and contractor participants and definitions, and general in other areas to allow users to choose the most efficient way to proceed. For example, a description of techniques that suggests several methods for evaluators to use to assess risk is appropriate, since every technique has advantages and disadvantages depending on the situation.

Figure 2-3. A Risk Management Plan Outline/Format

    Introduction
    Program Summary
    Definitions
    Risk Management Strategy and Approach
    Organization
    Risk Management Process and Procedures
        Risk Planning
        Risk Assessment
        Risk Handling
        Risk Monitoring
    Risk Management Information System, Documentation and Reports

Appendix B contains two examples of a risk plan, and a summary of the format is shown in Figure 2-3.

In a decentralized PMO risk management organization, the program's risk management coordinator may be responsible for risk management planning. See Sections 4.4, Risk Management Organization in the PMO, and 5.3, Risk Planning Techniques.

2.6 RISK ASSESSMENT

2.6.1 Purpose of Risk Assessments

The primary objective of assessments is to identify and analyze program risks so that the most critical among them may be controlled. Assessments are factors that managers should consider in setting cost, schedule, and performance objectives because they provide an indication of the probability/likelihood of achieving the desired outcomes.

2.6.2 Risk Assessment Process

Risk assessment is the problem definition stage of management that identifies and analyzes (quantifies) prospective program events in terms of probability and consequences/impacts. The results form the basis for most risk management actions. It is probably the most difficult and time-consuming part of the management process. There are no quick answers or shortcuts. Tools are available to assist evaluators in assessing risk, but none are totally suitable for any program and may be highly misleading if the user does not understand how to apply them or interpret the results. Despite its complexity, risk assessment is one of the most important phases of the risk process because the caliber and quality of assessments determine the effectiveness of a management program.

The components of assessment, identification and analysis, are performed sequentially, with identification being the first step.

Risk identification begins by compiling the program's risk events. PMOs should examine and identify program events by reducing them to a level of detail that permits an evaluator to understand the significance of any risk and identify its causes, i.e., risk drivers. This is a practical way of addressing the large and diverse number of potential risks that often occur in acquisition programs. For example, a WBS level 4 or 5 element may generate several risk events associated with a specification or function, e.g., failure to meet turbine blade vibration requirements for an engine turbine design.

Risk events are best identified by examining each WBS product and process element in terms of the sources or areas of risk, as previously described in Paragraph 2.4.2.

Risks are those events that evaluators (after examining scenarios, WBS, or processes) determine would adversely affect the program. Evaluators may initially rank events by probability and consequence/impact of occurrence before beginning analysis to focus on those most critical.

Risk analysis is a technical and systematic process to examine identified risks, isolate causes, determine the relationship to other risks, and express the impact in terms of probability and consequences/impacts.


In practice, the distinction between risk identification and risk analysis is often blurred because there is some risk analysis that occurs during the identification process. For example, if, in the process of interviewing an expert, a risk is identified, it is logical to pursue information on the probability of it occurring, the consequences/impacts, the time associated with the risk (i.e., when it might occur), and possible ways of dealing with it. The latter actions are part of risk analysis and risk handling, but often begin during risk identification.

Prioritization is the ranking of risk events to determine the order of importance. It serves as the basis for risk-handling actions. Prioritization is part of risk analysis.

Figure 2-4. Risk Assessment (flows from the Planning Phase, through the Assessment Phase shown below, to the Handling Phase, where risk handling supports key events)

    Identification of Risk Events:
    • List WBS product/process elements
    • Examine each in terms of risk sources/areas
    • Determine what could go wrong
    • Compile list of "Risk Events"

    Pre-Risk Assessment Activity:
    • Determine Needs to Conduct Assessment
    • Train the Teams
    • Define Evaluation Structure
    • Identify Outside Experts

    Risk Identification Activity:
    • Identify Risk Events
    • Examine Events for Consequences
    • Preliminary Analysis
    • Document the Results

    Risk Analysis Activity:
    • Develop Probability and Consequence Scales
    • Perform Supporting Analysis
    • Determine Probability and Consequence Levels/Ratings
    • Document the Results
    • Rate, Prioritize, and Aggregate Risks
    • Establish Watch List

Integrated Product Teams (IPTs) typically perform risk assessments in a decentralized risk management organization as described in Paragraph 4.4. If necessary, the team may be augmented by people from other program areas or outside experts. Paragraph 5.4, Risk Assessment Techniques, elaborates on this for each of the described assessment techniques.

2.6.3 Timing of Risk Assessments

The assessment process begins during the last half of the Concept and Technology Development (CTD) Phase and continues throughout the subsequent acquisition phases. The PMO should continually reassess the program at increasing levels of detail as the program progresses through the acquisition phases and more information becomes available. There are, however, times when events may require new assessments, e.g., a major change in the acquisition strategy. Paragraph 2.5.2 lists other events that could cause risk assessments to be performed.

2.6.4 Conducting Risk Assessments

There is no standard approach to assessing risk because methods vary according to the technique employed, the phase of the program, and the nature of the program itself; however, some top-level actions are typically common to all methods. They are grouped in Figure 2-4 into pre-risk assessment activities, risk identification activities, and risk analysis activities. Each risk category or area, e.g., cost, schedule, and performance, includes a core set of assessment tasks and is related to the other two categories. This relationship requires supportive analysis among areas to ensure the integration of the assessment process. For example, a technical assessment probably should include a cost and schedule analysis in determining the technical risk impact. The results of the assessments, normally conducted by IPTs, follow:

Performance/Technical Assessment (includes technical areas of risk shown in Paragraph 2.4.2)

• Provides technical foundation,

• Identifies and describes program risks, i.e., threat, technology, design, manufacturing, etc.,

• Prioritizes risks with relative or quantified weight for program impact,

• Analyzes risks and relates them to other internal and external risks,

• Quantifies associated program activities with both time duration and resources,

• Quantifies inputs for schedule assessment and cost estimate,

• Documents technical basis and risk definition for the risk assessment.

Schedule Assessment

• Evaluates baseline schedule inputs,

• Incorporates technical assessment and schedule uncertainty inputs to program schedule model,

• Evaluates impacts to program schedule based on technical team assessment,

• Performs schedule analysis on program integrated master schedule,

• Quantifies schedule excursions reflecting effects of cost risks, including resource constraints,

• Provides Government schedule assessment for cost analysis and fiscal year planning,

• Reflects technical foundation, activity definition, and inputs from technical and cost areas,

• Documents schedule basis and risk impacts for the risk assessment. (An illustrative simulation sketch follows this list.)
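As a purely illustrative sketch of how schedule uncertainty inputs might feed a schedule model, the Python fragment below runs a small Monte Carlo simulation over hypothetical three-point activity duration estimates. The Guide does not prescribe this particular method, and real schedule risk analysis operates on the integrated master schedule network rather than a serial chain of activities.

    # Illustrative only (hypothetical durations, serial logic): Monte Carlo
    # sampling of three-point activity estimates to quantify schedule
    # excursions as percentiles of total duration.
    import random

    # (optimistic, most likely, pessimistic) durations in weeks
    activities = [(10, 12, 18), (6, 8, 14), (20, 24, 36)]

    def total_duration() -> float:
        # random.triangular takes (low, high, mode)
        return sum(random.triangular(lo, hi, mode) for lo, mode, hi in activities)

    samples = sorted(total_duration() for _ in range(10_000))
    print("50th percentile (weeks):", round(samples[len(samples) // 2], 1))
    print("80th percentile (weeks):", round(samples[int(len(samples) * 0.8)], 1))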

Cost Estimate and Assessment

• Builds on technical and schedule assessment results,

• Translates technical and schedule risks into cost,

• Derives cost estimate by integrating technical risk and schedule risk impacts with resources,

• Establishes budgetary requirements consistent with fiscal year planning,

• Determines if the phasing of funds supports technical and acquisition approach,

• Provides program cost excursions from:
  – Near-term budget execution impacts,
  – External budget changes and constraints.

• Documents cost basis and risk impacts.

2.6.4.1 Pre-Risk Assessment Activities. The Risk Management Plan may describe the actions that compose this activity. Typically, a program-level IPT may conduct a quick-look assessment of the program to identify the need for technical experts (who are not part of the team) and to examine areas that appear most likely to contain risk. The program's risk coordinator, or an outside expert, may train the IPTs, focusing on the program's risk strategy, definitions, suggested techniques, documentation, and reporting requirements. Paragraph 4.9, Risk Management Training, provides some suggestions for training.

2.6.4.2 Risk Identification Activity. To identify risk events, IPTs should break down program elements to a level where they, or subject-matter experts, can perform valid assessments. The information necessary to do this varies according to the phase of the program. During the early phases, requirement and threat documents and acquisition plans may be the only program-specific data available. They should be analyzed to identify events that may have adverse consequences/impacts. A useful initial identification exercise is to perform a mission profile for the system as suggested in DoD 4245.7-M, Transition from Development to Production. Using this methodology, the developer creates a functional and environmental profile for the system and examines the low-level requirements that the system must meet to satisfy its mission requirements. The IPTs may then study these requirements to determine which are critical. For example, in an aircraft profile, it may be apparent that high speed is critical. If the speed requirement is close to that achieved by existing aircraft, this may not be a concern. However, if the speed is greater than that achieved by today's aircraft, it may be a critical risk area. Since aircraft speed depends, among other things, on weight and engine thrust, it would be desirable to enlist the help of a materials expert to address weight and an engine expert to assess engine-associated risk.

Figure 2-5. Example of a WBS Dependent Evaluation Structure (WBS levels: Aircraft System, Airframe, Wings; risk event: wings too heavy; goal/objective: weight budget)

Another method of decomposition is to create a WBS as early as possible in a program. Figure 2-5 is a simple example of a decomposition based on the WBS for an aircraft. The figure shows an important requirement of the decomposition process, the establishment of goals (e.g., don't exceed the weight budget or objective). Risk events are determined by matching each WBS element and process to sources or areas of risk. Risk areas/sources are described in Paragraph 2.4.2 and Table 4-2.

During decomposition, risk events are identified from experience, brainstorming, lessons learned from similar programs, and guidance contained in the risk management plan. A structured approach previously discussed matches each WBS element and process in terms of sources or areas of risk. The examination of each element against each risk area is an exploratory exercise to identify the critical risks. The investigation may show that risks are interrelated. For example, the weight of an aircraft affects its speed, but also impacts the payload, range, and fuel requirements. These have design and logistics consequences/impacts and may even affect the number of aircraft that must be procured to meet objectives.
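A minimal data-structure sketch of the decomposition idea follows, assuming nothing about any particular PMO tool; the element names and the risk event are hypothetical, echoing the Figure 2-5 example.

    # A minimal sketch of WBS decomposition with risk events attached at the
    # level where they are identified; no specific PMO tool is implied.
    from dataclasses import dataclass, field

    @dataclass
    class WBSElement:
        name: str
        risk_events: list[str] = field(default_factory=list)
        children: list["WBSElement"] = field(default_factory=list)

        def all_risk_events(self):
            # Roll up risk events from this element and all lower levels.
            yield from ((self.name, event) for event in self.risk_events)
            for child in self.children:
                yield from child.all_risk_events()

    wings = WBSElement("Wings", risk_events=["Exceeds weight budget"])
    airframe = WBSElement("Airframe", children=[wings])
    aircraft = WBSElement("Aircraft System", children=[airframe])

    for element, event in aircraft.all_risk_events():
        print(element, "->", event)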

Critical risks need to be documented as specified in the Risk Management Plan; the documentation may include the scenario that causes the risk, planned management controls and actions, etc. It may also contain an initial assessment of the consequences/impacts to focus the risk assessment effort. A risk watch list should be initiated as part of risk identification. It is refined during handling, and monitored/updated during the monitoring phase. Watch lists provide a convenient and necessary form to track and document activities and actions resulting from risk analysis. Watch lists frequently evolve from the input of each "expert" functional manager on a program. (See Paragraph 5.7.5.)
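As an illustration only, a watch list entry might be represented as a simple record like the following Python sketch; the field names are hypothetical, since the Guide does not prescribe a watch list format.

    # Hypothetical watch-list record; the Guide does not prescribe these
    # exact fields (see Paragraph 5.7.5 for the watch list technique).
    from dataclasses import dataclass

    @dataclass
    class WatchListItem:
        risk_event: str
        area: str              # risk area/source, e.g., "Design"
        wbs_element: str
        handling_actions: str  # planned management controls and actions
        action_office: str     # who is responsible
        status: str            # updated during risk monitoring

    item = WatchListItem(
        risk_event="Design not completed on time",
        area="Design",
        wbs_element="WBS 3.1",
        handling_actions="Add design reviews; track drawing release rate",
        action_office="Design IPT",
        status="Open",
    )
    print(item)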

2.6.4.3 Risk Analysis Activity. Analysis begins with a detailed study of the critical risk events that have been identified. The objective is to gather enough information about the risks to judge the probability of occurrence and the impact on cost, schedule, and performance if the risk occurs.

Impact assessments are normally subjective and based on detailed information that may come from:

• Comparisons with similar systems,

• Relevant lessons-learned studies,

• Experience,

• Results from tests and prototype development,

• Data from engineering or other models,

• Specialist and expert judgments,


• Analysis of plans and related documents,

• Modeling and simulation,

• Sensitivity analysis of alternatives.

Depending on the particular technique and the risk being analyzed, some supporting analysis may be necessary, e.g., analysis of contractor processes, such as design, engineering, fault tree analysis, engineering models, simulation, etc. Analyses provide the basis for subjective assessments.

Table 2-1. Risk Assessment Approaches

    Risk Assessment Technique              | Applicable Acquisition Phases                                | Applicable Risk Areas & Processes
    Plan Evaluation/Risk Identification    | All phases                                                   | Program plans and critical communications with the developer
    Product (WBS) Risk Assessment          | All phases starting with the completion of the Contract WBS | All critical risk areas except threat, requirements, cost, and schedule
    Process (DoD 4245.7-M) Risk Assessment | All phases, but mainly late SDD                              | All critical risk processes
    Cost Risk Assessment                   | All phases                                                   | Cost critical risk areas
    Schedule Risk Assessment               | All phases                                                   | Schedule critical risk areas

Table 2-2. Probability/Likelihood Criteria (Example)

    Level | What Is the Likelihood the Risk Event Will Happen?
    a     | Remote
    b     | Unlikely
    c     | Likely
    d     | Highly Likely
    e     | Near Certainty

Table 2-3. Consequences/Impacts Criteria (Example)
(Given the risk is realized, what is the magnitude of the impact?)

    Level | Performance                                     | Schedule                                                  | Cost
    a     | Minimal or no impact                            | Minimal or no impact                                      | Minimal or no impact
    b     | Acceptable with some reduction in margin        | Additional resources required; able to meet need dates    | <5%
    c     | Acceptable with significant reduction in margin | Minor slip in key milestones; not able to meet need date  | 5-7%
    d     | Acceptable; no remaining margin                 | Major slip in key milestone or critical path impacted     | 7-10%
    e     | Unacceptable                                    | Can't achieve key team or major program milestone         | >10%

A critical aspect of risk analysis is data collection. Two primary sources of data are interviews of subject-matter experts and analogy comparisons with similar systems. Paragraph 5.4 contains a procedure for collecting both types of data for use in support of the techniques listed in Table 2-1. Periodically, sets of risks need to be prioritized in preparation for risk handling, and aggregated to support program management reviews. Paragraph 5.5, Risk Prioritization, describes methods for accomplishing this.

2.6.4.3.1 Risk Rating and Prioritization/Ranking

Risk ratings are an indication of the potential impact of risks on a program; they are a measure of the probability/likelihood of an event occurring and the consequences/impacts of the event. They are often expressed as high, moderate, and low. Risk rating and prioritization/ranking are considered integral parts of risk analysis.

A group of experts who are familiar with each risk source/area (e.g., design, logistics, production, etc.) and product WBS element are best qualified to determine risk ratings. They should identify rating criteria for review by the PMO, who includes them in the Risk Management Plan. In most cases, the criteria will be based on the experience of the experts, as opposed to mathematically derived, and should establish levels of probability/likelihood and consequences/impacts that will provide a range of possibilities large enough to distinguish differences in risk ratings. At the program level, consequences/impacts should be expressed in terms of impact on cost, schedule, and performance. Tables 2-2 and 2-3 are examples of probability/likelihood and consequence/impact criteria, and Table 2-4 contains an example of overall risk rating criteria, which considers both probability/likelihood and consequences/impacts. Table 2-5 provides a sample format for presenting risk ratings.

Table 2-4. Overall Risk Rating Criteria (Example)

    Risk Rating | Description
    High        | Major disruption likely
    Moderate    | Some disruption
    Low         | Minimum disruption

Using these risk ratings, PMs can identify events requiring priority management (high or moderate risk probability/likelihood or consequences/impacts). The document prioritizing the risk events is called a Watch List. Risk ratings also help to identify the areas that should be reported within and outside the PMO, e.g., at milestone decision reviews. Thus, it is important that the ratings be portrayed as accurately as possible.

Table 2-5. Risk Ratings (Example)

    Priority | Area/Source Process | Location | Risk Event                   | Probability   | Consequence                 | Risk Rating
    1        | Design              | WBS 3.1  | Design not completed on time | Highly Likely | Can't achieve key milestone | High
    2        |                     |          |                              |               |                             |
    3        |                     |          |                              |               |                             |

A simple method of representing the risk rating for risk events, i.e., a risk matrix, is shown in Figure 2-6. In this matrix, the PM has defined high, moderate, and low levels for the various combinations of probability/likelihood and consequences/impacts.

Figure 2-6. Overall Risk Rating (Example)

                     Consequence
                  a    b    c    d    e
    Likelihood e  L    M    H    H    H
               d  L    M    M    H    H
               c  L    M    M    M    H
               b  L    L    L    M    M
               a  L    L    L    L    M
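The following Python sketch directly encodes the example matrix above as a lookup table; it is illustrative only, since each program defines its own matrix in its Risk Management Plan.

    # Direct encoding of the example matrix in Figure 2-6 as a lookup table.
    RISK_MATRIX = {  # likelihood level -> consequence level -> rating
        "e": {"a": "L", "b": "M", "c": "H", "d": "H", "e": "H"},
        "d": {"a": "L", "b": "M", "c": "M", "d": "H", "e": "H"},
        "c": {"a": "L", "b": "M", "c": "M", "d": "M", "e": "H"},
        "b": {"a": "L", "b": "L", "c": "L", "d": "M", "e": "M"},
        "a": {"a": "L", "b": "L", "c": "L", "d": "L", "e": "M"},
    }

    def rate(likelihood: str, consequence: str) -> str:
        return RISK_MATRIX[likelihood][consequence]

    # The Table 2-5 example: likelihood "d" (Highly Likely), consequence "e"
    # (can't achieve key milestone) -> "H", matching its High rating.
    print(rate("d", "e"))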

There is a common tendency to attempt to develop a single number to portray the risk associated with a particular event. This approach may be suitable if both probability/likelihood and consequences/impacts have been quantified using compatible cardinal scales or calibrated ordinal scales whose scale levels have been determined using accepted procedures (e.g., the Analytical Hierarchy Process). In such a case, mathematical manipulation of the values may be meaningful and provide some quantitative basis for the ranking of risks.

In most cases, however, risk scales are actually just raw (uncalibrated) ordinal scales, reflecting only relative standing between scale levels and not actual numerical differences. Any mathematical operations performed on results from uncalibrated ordinal scales, or a combination of uncalibrated ordinal and cardinal scales, can provide information that will at best be misleading, if not completely meaningless, resulting in erroneous risk ratings. Hence, mathematical operations should generally not be performed on scores derived from uncalibrated ordinal scales. (Note: risk scales that are expressed as decimal values (e.g., a 5-level scale with values 0.2, 0.4, 0.6, 0.8, and 1.0) still retain the ordinal scale limitations discussed above.) For a more detailed discussion of risk scales, see Appendix G of the reference Effective Risk Management: Some Keys to Success.

One way to avoid this situation is to simply show each risk event's probability/likelihood and consequences/impacts separately, with no attempt to mathematically combine them. Other factors that may significantly contribute to the risk rating, such as time sensitivity or resource availability, can also be shown. The prioritization or ranking, done after the rating, should also be performed using a structured risk rating approach (e.g., Figure 2-6) coupled with expert opinion and experience. Prioritization or ranking is achieved through integration of risk events from lower to higher WBS levels. This means that the effect of risk at lower WBS elements needs to be reflected cumulatively at the top or system level.
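A small, hypothetical illustration of why arithmetic on uncalibrated ordinal scales misleads: the sketch below assumes an invented mapping from scale levels to underlying probabilities, purely to show that averaging level numbers does not average the quantities they stand for.

    # Hypothetical illustration of the caution above. Assume (invented
    # numbers) that the real likelihoods behind an uncalibrated 5-level
    # ordinal scale are:
    true_probability = {1: 0.02, 2: 0.10, 3: 0.25, 4: 0.55, 5: 0.90}

    # Averaging the ordinal scores of two risks rated 1 and 5 yields 3.0 ...
    ordinal_average = (1 + 5) / 2
    # ... but the average of the underlying likelihoods is 0.46, which is
    # nowhere near level 3's 0.25; arithmetic on the levels misleads.
    probability_average = (true_probability[1] + true_probability[5]) / 2
    print(ordinal_average, true_probability[3], probability_average)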

2.7 RISK HANDLING

2.7.1 Purpose of Risk Handling

Risk handling includes specific methods and techniques to deal with known risks and a schedule for accomplishing tasks, identifies who is responsible for the risk area, and provides an estimate of the cost and schedule associated with handling the risk, if any. It involves planning and execution with the objective of handling risks at an acceptable level. The IPTs that assess risk should begin the process to identify and evaluate handling approaches to propose to the PM, who selects the appropriate ones for implementation.

2.7.2 Risk-Handling Process

The risk-handling phase must be compatible with the risk management plan and any additional guidance the PM provides. Paragraph 5.3 describes a technique that concentrates on planning. A critical part of planning involves refining and selecting the most appropriate handling options.

The IPTs that evaluate the handling options may use the following criteria as a starting point for assessment:

• Can the option be feasibly implemented and still meet the user's needs?

• What is the expected effectiveness of the handling option in reducing program risk to an acceptable level?

• Is the option affordable in terms of dollars and other resources (e.g., use of critical materials, test facilities, etc.)?

• Is time available to develop and implement the option, and what effect does that have on the overall program schedule?

• What effect does the option have on the system's technical performance?

Risk-handling options can include risk control, risk avoidance, risk assumption, and risk transfer. An acronym used to identify these options is "CAAT." Although the control risk-handling option is commonly used in defense programs, it should not automatically be chosen. All four options should be evaluated, and the best one chosen for a given risk issue.

Risk Control does not attempt to eliminate the source of the risk but seeks to reduce or mitigate the risk. It monitors and manages the risk in a manner that reduces the probability/likelihood and/or consequence/impact of its occurrence or minimizes the risk's effect on the program. This option may add to the cost of a program; however, the selected approach should provide an optimal balance among the candidate approaches of risk reduction, cost effectiveness, and schedule impact. A sampling of the types of risk control actions available to the PMO is listed below. Paragraph 5.6.2 discusses them in more detail.

• Multiple Development Efforts. Create competing systems in parallel that meet the same performance requirements.

• Alternative Design. Create a backup design option that uses a lower risk approach.

• Trade Studies. Arrive at a balance of engineering requirements in the design of a system.

• Early Prototyping. Build and test prototypes early in the system development.

• Incremental Development. Design with the intent of upgrading system parts in the future.

• Technology Maturation Efforts. Normally, technology maturation is used when the desired technology will replace an existing technology which is available for use in the system.

• Robust Design. This approach, while it could be more costly, uses advanced design and manufacturing techniques that promote quality through design.

• Reviews, Walk-throughs, and Inspections. These three actions can be used to reduce the probability/likelihood and potential consequences/impacts of risks through timely assessment of actual or planned events.

• Design of Experiments. This engineering tool identifies critical design factors that are sensitive, therefore potentially high risk, to achieve a particular user requirement.

• Open Systems. Carefully selected commercial specifications and standards whose use can result in lower risks.

• Use of Standard Items/Software Reuse. Use of existing and proven hardware and software, where applicable, can substantially reduce risks.

• Two-Phase Development. Incorporation of formal risk reduction into System Development and Demonstration (SDD). The first part of SDD is System Integration (SI), where prototypes are developed and tested. In the second part, System Demonstration (SD), Engineering Development Models (EDMs) are developed and tested.

• Use of Mock-ups. The use of mock-ups, especially man-machine interface mock-ups, can be used to conduct early exploration of design options.

• Modeling/Simulation. Modeling and simulation can be used to investigate various design options and system requirement levels.

• Key Parameter Control Boards. The practice of establishing a control board for a parameter may be appropriate when a particular feature (such as system weight) is crucial to achieving the overall program requirements.

• Manufacturing Screening. For programs in SDD, various manufacturing screens (including environmental stress screening (ESS)) can be incorporated into test article production and low rate initial production (LRIP) to identify deficient manufacturing processes. ESS is a manufacturing process for stimulating parts and workmanship defects in electronic assemblies and units.

• Test, Analyze, and Fix (TAAF). TAAF is the use of a period of dedicated testing to identify and correct deficiencies in a design.

• Demonstration Events. Demonstration events are points in the program (normally tests) that determine if risks are being successfully abated.

• Process Proofing. Similar to Program Metrics, but aimed at manufacturing and support processes which are critical to achieving system requirements. Proofing simulates actual production environments and conditions to ensure repeatedly conforming hardware and software.

As you can see, there are numerous means that can be used to actively control risks.

Risk Avoidance involves a change in the concept, requirements, specifications, and/or practices that reduces risk to an acceptable level. Simply stated, it eliminates the sources of high or possibly medium risk and replaces them with a lower risk solution, and it may be supported by a cost/benefit analysis. Generally, this method may be done in parallel with the up-front requirements analysis, supported by cost/requirement trade studies, which can include Cost As an Independent Variable (CAIV) trades.

Risk Assumption. Risk assumption is an acknowledgment of the existence of a particular risk situation and a conscious decision to accept the associated level of risk, without engaging in any special efforts to control it. However, a general cost and schedule reserve may be set aside to deal with any problems that may occur as a result of various risk assumption decisions. This method recognizes that not all identified program risks warrant special handling; as such, it is most suited for those situations that have been classified as low risk. The key to successful risk assumption is twofold:

• Identify the resources (time, money, people, etc.) needed to overcome a risk if it materializes. This includes identifying the specific management actions (such as retesting, additional time for further design activities) that may occur.

• Ensure that necessary administrative actions are taken to identify a management reserve to accomplish those management actions.

Risk-handling options have broad cost implications. The magnitude of these costs is circumstance-dependent. The approval and funding of handling options should be part of the process that establishes the program cost and performance goals. This should normally be done by the Program-Level Risk Management IPT or Risk Management Board. The selected handling option should be included in the program's acquisition strategy.

Once the acquisition strategy includes risk-handling approaches, the PMO can derive the schedule and identify cost, schedule, and performance impacts to the basic program.

Risk Transfer. This action may reallocate risk during the concept development and design processes from one part of the system to another, thereby reducing the overall system risk, or it may redistribute risks between the Government and the prime contractor, within Government agencies, or between members of the contractor team. It is an integral part of the functional analysis process. Risk transfer is a form of risk sharing and not risk abrogation on the part of the Government, and it may influence cost objectives. An example is the transfer of a function from hardware implementation to software implementation or vice versa. The effectiveness of risk transfer depends on the use of successful system design techniques. Modularity and functional partitioning are two design techniques that support risk transfer. In some cases, risk transfer may concentrate risk areas in one area of the design. This allows management to focus attention and resources on that area.

2.8 RISK MONITORING

The monitoring process systematically tracks and evaluates the effectiveness of risk-handling actions against established metrics. Monitoring results may also provide a basis for developing additional handling options and identifying new risks. The key to the monitoring process is to establish a cost, schedule, and performance management indicator system over the entire program that the PM uses to evaluate the status of the program. The indicator system should be designed to provide early warning of potential problems to allow management actions. Risk monitoring is not a problem-solving technique, but rather a proactive technique to observe the results of risk handling and identify new risks. Some monitoring techniques can be adapted to become part of a risk indicator system:

• Test and Evaluation (T&E). A well-defined T&E program is a key element in monitoring the performance of selected risk-handling options and developing new risk assessments.

• Earned Value (EV). This uses standard DoD cost/schedule data to evaluate a program's cost and schedule performance in an integrated fashion. As such, it provides a basis to determine if risk-handling actions are achieving their forecasted results. (A brief worked example follows this list.)

• Technical Performance Measurement (TPM). TPM is a product design assessment which estimates, through engineering analysis and tests, the values of essential performance parameters of the current design as affected by risk-handling actions.

• Program Metrics. These are used for formal, periodic performance assessments of the various development processes, evaluating how well the system development process is achieving its objective. This technique can be used to monitor corrective actions that emerged from an assessment of the critical risk processes.

• Schedule Performance Monitoring. This is the use of program schedule data to evaluate how well the program is progressing to completion.
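The worked example below uses the standard earned value quantities (BCWS, BCWP, ACWP) with hypothetical figures to show how EV data can flag whether risk-handling actions are achieving forecasted results; it is a minimal sketch, not a full EV management system.

    # Hypothetical figures. BCWS = budgeted cost of work scheduled, BCWP =
    # budgeted cost of work performed (earned value), ACWP = actual cost of
    # work performed.
    bcws, bcwp, acwp = 1_000_000, 900_000, 1_100_000

    cost_variance = bcwp - acwp        # -200,000: over cost
    schedule_variance = bcwp - bcws    # -100,000: behind schedule
    cpi = bcwp / acwp                  # ~0.82: cost performance index
    spi = bcwp / bcws                  # 0.90: schedule performance index

    # Indices below 1.0 are early-warning indicators that risk-handling
    # actions may not be achieving their forecasted results.
    print(f"CV={cost_variance} SV={schedule_variance} CPI={cpi:.2f} SPI={spi:.2f}")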

Paragraph 5.7 describes several monitoring techniques, e.g., earned value.

The indicator system and periodic reassessments of program risk should provide the PMO with the means to incorporate risk management into the overall program management structure.

2.9 RISK DOCUMENTATION

A primary criterion for successful management is formally documenting the ongoing risk management process. This is important because:

• It provides the basis for program assessments and updates as the program progresses.

• Formal documentation tends to ensure more comprehensive risk assessments than would occur if the process were not documented.

• It provides a basis for monitoring risk-handling actions and verifying the results.

• It provides program background material for new personnel.

• It is a management tool for the execution of the program.

• It provides the rationale for program decisions.

The documentation should be done by those responsible for planning, collecting, and analyzing data, i.e., the IPT level in most cases.

Risk management reports vary depending on the size, nature, and phase of the program. Examples of some risk management documents and reports that may be useful to a PM are:

• Risk management plan,

• Risk information form,

• Risk assessment report,

• Prioritized list of risks,

• Risk handling plan,

• Aggregated risk list,

• Risk monitoring documentation:
  – Program metrics,
  – Technical reports,
  – Earned value reports,
  – Watch list,
  – Schedule performance report,
  – Critical risk processes reports.

Most PMOs can devise a list of standard reports that will satisfy their needs most of the time; however, since there will always be a need for ad hoc reports, briefings, and assessments, it is advisable to store risk information in a management information system (MIS). This allows the creation of both standard and ad hoc reports, as needed. Paragraphs 4.8 and 5.8 discuss an MIS to support a risk management program.
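As a minimal sketch of the standard-versus-ad-hoc reporting idea, the following Python fragment queries an in-memory list of risk records; the field names are hypothetical, and an actual MIS would be database-backed.

    # In-memory stand-in for an MIS; a real system would use a database.
    risks = [
        {"id": 1, "area": "Design", "rating": "High", "status": "Open"},
        {"id": 2, "area": "Logistics", "rating": "Low", "status": "Open"},
        {"id": 3, "area": "Production", "rating": "Moderate", "status": "Closed"},
    ]

    # Standard report: prioritized list of open risks.
    order = {"High": 0, "Moderate": 1, "Low": 2}
    standard_report = sorted(
        (r for r in risks if r["status"] == "Open"),
        key=lambda r: order[r["rating"]],
    )

    # Ad hoc report: all risks in a given area, regardless of status.
    ad_hoc_report = [r for r in risks if r["area"] == "Design"]
    print(standard_report, ad_hoc_report)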

Acquisition reform discourages Government oversight; therefore, formal contractor-produced risk documentation may not be available for most programs. However, program insight is encouraged, and PMOs can obtain information about program risk from contractor internal documentation such as:

• Risk Management Policy and Procedures. This is a description of the contractor's corporate policy for the management of risk. The procedures describe the methods for risk identification, analysis, handling, monitoring, and documentation. It should provide the baseline planning document for the contractor's approach to risk management.

• Corporate Policy and Procedures Documents. Corporations have policy and procedures documents that address the functional areas that are critical to the design, engineering, manufacture, test and evaluation, quality, configuration control, etc., of a system. These documents are based on what the company perceives as best practices, and although they may not specifically address risk, deviation from these policies represents risk to a program. Internal company reports that address how well programs comply with policy may be required and will provide valuable information.

• Risk Monitoring Report. Contractors should have internal tracking metrics and reports for each moderate- or high-risk item. These metrics may be used to determine the status of risk reduction programs.


3

RISK MANAGEMENT AND THE
DOD ACQUISITION PROCESS

3.1 INTRODUCTION

This Chapter discusses the relationship between risk and the acquisition process, describes how risk is considered in design of the Acquisition Plan, and expresses the need to consider risk as early in the program as possible. Appendix A is a summary of the risk management requirements that are contained in DoDD 5000.1, DoDI 5000.2, DoD 5000.2-R, DoD 5000.4, and DoD 5000.4-M.

3.2 OVERVIEW

The DoD acquisition process for the management of programs consists of a series of phases designed to reduce risk, ensure affordability, and provide adequate information for decision making. Acquisition officials are encouraged to tailor programs to eliminate phases or activities that result in little payoff in fielding time or cost savings. To effectively tailor a program, one needs to understand the risks present in the program and to develop a plan for managing these risks. DoD policy calls for the continual assessment of program risks, beginning with the initial phase of an acquisition program, and the development of management approaches before any decision is made to enter all subsequent phases.

The application of risk management processes (planning, assessment, identification, analysis, handling, and monitoring) is particularly important during the Concept and Technology Development (CTD) Phase of any program, when alternatives are evaluated, program objectives are established, and the acquisition strategy is developed. All of these activities require acceptance of some level of risk and development of plans to manage the risk.

As a program evolves into subsequent phases, the nature of the risk management effort will change. New assessments will be built on previous ones. Risk areas will become more specific as the system is defined.

Risk management should also be an integral part of any Source Selection process, from request for proposal (RFP) preparation, through proposal evaluation, and after contract award. Throughout the program life, IPTs will play a key role in risk management activities.

3.3 DOD ACQUISITION PROCESS

The phases and milestones of the acquisition process provide a streamlined structure that emphasizes risk management and affordability. The phases are a logical means of progressively translating broadly-stated mission needs into well-defined system-specific requirements, and ultimately into operationally effective, suitable, and survivable systems. It is important to remember that the term "system" includes hardware, software, and the human element. Each phase is designed, among other things, to manage risks. Milestones are points in time that allow decision makers to evaluate the program status and determine if the program should proceed to the next phase. The Milestone Decision Authority (MDA) and PM tailor milestones and phases so that each milestone decision point allows assessment of program status and the opportunity to review plans for the next phase and beyond. The MDA should explicitly address program risks and the adequacy of risk management planning during the milestone reviews and establish exit criteria for progression to the next phase.

The contract schedule normally allows time for milestone decisions before spending begins in subsequent phases and should also permit demonstration of the exit criteria in time to support the milestone review. There are exceptions to this, driven by funding availability and option award dates. However, the objective is to provide proper fiscal control without delaying the acquisition decisions or contracts while adequately considering risk.

The acquisition strategy defines the business and technical management approach to meet objectives within program constraints, with a primary goal to minimize the time and cost of satisfying a valid need, consistent with common sense and sound business practices. A PM prepares a preliminary acquisition strategy at Milestone A (that includes CTD Phase activities that focus on identifying risk and handling options). Later, the PM updates the strategy to support each milestone decision by describing activities and events planned for the upcoming phase and relating the accomplishments of that phase to the program's overall, long-term objectives. The risk associated with a program will significantly influence the acquisition strategy.

3.4 CHARACTERISTICS OF THE ACQUISITION PROCESS

The acquisition process that has evolved can be characterized in terms of the following concepts that are particularly relevant to the management of risk in programs.

3.4.1 Integrated Product and Process Development (IPPD)

IPPD integrates all acquisition activities in order to optimize system development, production, and deployment. Key to the success of the IPPD concept are the IPTs, which are composed of qualified and empowered representatives from all appropriate functional disciplines who work together to identify and resolve issues. As such, IPTs are the foundation for organizing for risk management.

3.4.2 Continuous Risk Management

PMs should focus on risk management throughout the life of the program, not just in preparation for program and milestone reviews. Program risks should be continuously assessed, and the risk-handling approaches developed, executed, and monitored throughout the acquisition process. Both the Government and contractors must understand risks as a program progresses through the various phases and milestone decision points, and must modify the management strategy and plan accordingly. While specific Government and contractor risk management processes may likely be different, it is important that each party have a common and complete set of process steps (regardless of their names), and be able to exchange and clearly understand the other party's risk management documentation.


3.4.3 Program Stability

Once a program is initiated, program stability is a top priority. Keys to creating program stability are realistic investment planning and affordability assessments. They must reflect an accurate and comprehensive understanding of existing or expected program risks. A risk management strategy must be developed early in the process, before actually initiating the program, to ensure it is a stable one, recognizing that key issues affecting program stability may be external.

3.4.4 Reduction of Life-Cycle Costs

DoD considers the reduction of total cost to acquire and operate systems, while maintaining a high level of performance for the user, to be of highest priority. This is reflected, in part, through the introduction of the "Cost As an Independent Variable" (CAIV) concept. CAIV entails setting aggressive, realistic cost objectives early in an acquisition program and then managing all aspects of the program to achieve those objectives, while still meeting the user's performance and schedule needs. Inherent in the CAIV concept is the realization that risks must be understood, taken, and managed in order to achieve cost, schedule, and performance objectives. An understanding of risk is essential to setting realistic cost objectives. The PM and user representatives should identify risk and cost driving requirements during the generation of the Operational Requirements Document (ORD) in order to know where tradeoffs may be necessary.

3.4.5 Event-Oriented Management

Event-oriented management requires that decision makers base their decisions on significant events in the acquisition life cycle, rather than on arbitrary calendar dates. This management process emphasizes effective acquisition planning and embodies sound risk management.

Decisions to proceed with a program should be based on demonstration of performance, through test and evaluation, and on verification that program risks are well-understood and are being managed effectively. Attainment of agreed-upon exit criteria is an indication that the PMO is managing risk effectively.

3.4.6 Modeling and Simulation

Properly used, models and simulations can reduce time, resources, and acquisition risk and may increase the quality of the systems being developed. Users of these models and simulations must have a good understanding of their capabilities and limitations and their applicability to the issues being addressed.

From a risk perspective, modeling and simulation may be used to develop alternative concepts during system design; predict performance in support of trade-off studies; evaluate system design and support preliminary design reviews during design development; predict system performance and supplement live tests during testing; examine the military value of the system; determine the impact of design changes; hone requirements; and develop life-cycle support requirements and assessments.

However, a key limitation of models and simulations is that the results are only as accurate and certain as the quality of the underlying relationships and input data. The output from models and simulations should never be blindly believed or used.

3.5 RISK MANAGEMENT ACTIVITIES DURING ACQUISITION PHASES

Risk management activities should be applied continuously throughout all acquisition process phases and in the technology opportunities and requirements activities that feed into the process. However, because of the difference in available information, the level of application and detail will vary for technology opportunity activities and for each phase. For technological opportunity activities, DoD uses three mechanisms to transition concepts and technology to user and acquisition customers: Advanced Technology Demonstrations (ATDs), Advanced Concept Technology Demonstrations (ACTDs), and Experiments. When assessing the risk of these mechanisms, descriptors called Technology Readiness Levels (TRLs) are used. TRLs provide consistent, uniform descriptions of technical maturity across different types of technologies. Appendix 6 of DoD 5000.2-R (also see Appendix A, page A-11 of this Guide) contains guidance on use of TRLs.

In the CTD Phase, management focuses on assessing the risks in the alternative concepts available to satisfy users' needs and on planning a strategy to address those risks. For each of the subsequent phases, all four risk management activities may be applied, with increasing focus on risk handling and monitoring.

The PM identifies objectives, alternatives, and constraints at the beginning of each phase of a program and then evaluates alternatives, identifies sources of project risk, and selects a strategy for resolving the risks. The PMO updates the acquisition strategy, risk assessments, and other aspects of program planning, based on analyses, for the phase of the acquisition.

Developers should become involved in the risk management process at the beginning, when users define performance requirements, and continue during the acquisition process until the system is delivered. The early identification and assessment of critical risks allow PMs to formulate handling approaches and to streamline the program definition and the RFP around critical product and process risks.

The following paragraphs address risk management in the different phases in more detail.

3.5.1 Concept and Technology Development (CTD) Phase

DoDI 5000.2 describes the CTD Phase as normally consisting of studies that define and evaluate the feasibility of alternative concepts and provide the basis for the assessment of these alternatives in terms of their advantages, disadvantages, and risk levels at the Milestone (MS) B decision point. In addition to providing input to the Analysis of Alternatives, the PM develops a proposed acquisition program baseline (APB) and exit criteria for the System Integration (SI) part of the System Development and Demonstration (SDD) Phase.

The APB documents the most important performance, cost, and schedule objectives and thresholds for the selected concepts. The parameters selected are such that a re-evaluation of alternative concepts is appropriate if thresholds are not met. Exit criteria are events or accomplishments that allow managers to track progress in critical technical, cost, or schedule risk areas. They must be demonstrated to show that a program is on track.
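The objective/threshold structure of an APB parameter lends itself to a simple data model. The sketch below is hypothetical (the field names and values are illustrative, not from this Guide); it flags any parameter whose current estimate breaches its threshold, the condition under which re-evaluation of alternative concepts is appropriate.

```python
from dataclasses import dataclass

@dataclass
class APBParameter:
    name: str
    objective: float        # the value the program is striving for
    threshold: float        # the minimum acceptable (or maximum allowable) value
    current_estimate: float
    higher_is_better: bool = True  # e.g., range vs. unit cost

    def breached(self) -> bool:
        """True if the current estimate falls outside the threshold."""
        if self.higher_is_better:
            return self.current_estimate < self.threshold
        return self.current_estimate > self.threshold

baseline = [
    APBParameter("Range (km)", objective=1000, threshold=800, current_estimate=750),
    APBParameter("Unit cost ($M)", objective=10, threshold=12,
                 current_estimate=11.5, higher_is_better=False),
]

for p in baseline:
    status = "BREACH - re-evaluate alternatives" if p.breached() else "within threshold"
    print(f"{p.name}: {status}")
```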

In defining alternative concepts, PMs should pay particular attention to the threat and the user's requirements, which are normally stated in broad terms at this time. Risks can be introduced if the requirements are not stable, or if they are overly restrictive and contain specific technical solutions. Requirements can also be significant cost and schedule risk drivers if they require a level of performance that is difficult to achieve within the program budget and time constraints. Such drivers need to be identified as early in the program as possible.

The acquisition strategy should address the known risks for each alternative concept, and the plans to handle them, including specific events intended to control the risks. Similarly, the T&E strategy should reflect how T&E, with the use of M&S, will be used to assess risk levels and identify new or suspected risk areas.

A risk management strategy, derived in concert with the acquisition strategy, should be developed during this phase and revised and updated continually throughout the program. This strategy should include risk management planning that clearly defines roles, responsibilities, authority, and documentation for program reviews, risk assessments, and risk monitoring.

3.5.2 Subsequent Phases

During subsequent phases, concepts, technological approaches, and/or design approaches (selected at the previous milestone decisions) are pursued to define the program and program risks. Selected alternative concepts continue to be analyzed, and the acquisition strategy, and the various strategies and plans derived from it, continue to be refined.

Risk management efforts in these phases focus on understanding critical technology, manufacturing, and support risks, along with cost, schedule, and performance risks, and on demonstrating that they are being controlled before moving to the next milestone. Note that the accuracy of cost, schedule, and performance risk assessments should improve with each succeeding program phase (e.g., more information, better design documentation, etc.). Thus, particular attention should be placed on handling and monitoring activities. Planning and assessment should continue as new information becomes available and new risk events are identified.

During these phases, the risk management program should be carried out, to the extent possible, in an integrated Government-contractor framework that allows the Government to manage program risks, with the contractor responsible to the PM for product and process risks and for maintaining design accountability. Both the Government and contractors need to understand the risks clearly and jointly plan management efforts. In any event, risk management needs to be tailored to each program and contract type.

3.6 RISK MANAGEMENT AND MILESTONE DECISIONS

Before a milestone review, the PM should update risk assessments, explicitly addressing the risks in the critical areas, such as threat, requirements, technology, etc., and identify areas of moderate or high risk.

Each critical technical assessment should be supported by subsystems' risk assessments, which should be supported by design reviews, test results, and specific analyses.

The PM should present planned risk-handling actions for moderate- or high-risk areas at the milestone review to determine their adequacy and to ensure the efficient allocation of resources.

3.7 RISK MANAGEMENT AND THE ACQUISITION STRATEGY

In addition to providing the framework for program planning and execution, the acquisition strategy serves several purposes that are important to risk management:

• Provides a master schedule for research, development, test, production, deployment, and critical events in the acquisition cycle.

• Gives a master checklist of the important issues and alternatives that must be addressed.

• Assists in prioritizing and integrating functional requirements, evaluating alternatives, and providing a coordinated approach to integrate diverse functional issues, leading to the accomplishment of program objectives.

• Documents the assumptions and guidelines that led to the initiation and direction of the program.

• Provides the basis for the development and execution of the various subordinate functional strategies and plans.

The strategy structure should ensure a sound program through the management of cost, schedule, and performance risk. A good acquisition strategy acknowledges and identifies program risks and forms the basis for implementing a forward-looking, rather than reactive, effective risk management effort.

The acquisition strategy should describe how risk is to be handled and identify which risks are to be shared with the contractor and which are to be retained by the Government. The key concept here is that the Government shares risk with the contractor, but does not transfer risk to the contractor. The PMO always has a responsibility to the system user to develop a capable system and can never absolve itself of that responsibility. Therefore, all program risks, whether primarily managed by the PMO or by the contractor, must be assessed and managed by the PMO.

Once the program office has determined how much of each risk is to be shared with the contractor, it should assess the total risk assumed by the developing contractor (including subcontractors). The Government should not require contractors to accept financial risks that are inconsistent with their ability to handle them. Financial risks are driven, in large measure, by the underlying technical and programmatic risks inherent in a program. The Government contracting officer should, therefore, select the proper type of contract based on an appropriate risk assessment, to ensure a clear relationship between the selected contract type and program risk. An example would be the use of cost-reimbursable-type contracts for development projects.

3.8 RISK MANAGEMENT AND CAIV

The intention of CAIV is to establish a balance between cost, schedule, performance, and risk early in the acquisition process and to manage to a cost objective. CAIV requires that PMs establish aggressive cost objectives, defined to some degree by the maximum level of acceptable risk. Risks in achieving both performance and aggressive cost goals must be clearly recognized and actively managed through:

(1) continuing iteration of cost/performance/schedule/risk tradeoffs,

(2) identifying key performance and manufacturing process uncertainties, and

(3) demonstrating solutions before production.

Whereas DoD has traditionally managed performance risk, equal emphasis must be placed on managing cost and schedule risks. An underlying premise of CAIV is that if costs are too great, and there are ways to reduce them, then the user and developer may reduce performance requirements to meet cost objectives. Cost control and effective risk management involve planning and scheduling events and demonstrations to verify solutions to cost, schedule, and performance risk issues.

User participation in the trade-off analysis is essential to attain a favorable balance between cost, schedule, performance, and risk. The PM and user representatives should identify risk and cost driving requirements during the generation of the Operational Requirements Document (ORD) to know where tradeoffs may be possible. Risk assessments are critical to the CAIV process since they provide users and developers with essential data to assist in the cost, schedule, performance, and risk trade decisions.

Cost for risk management is directly related to the level of risk and affects a program in two ways. First, costs are associated with specific handling activities, for example, a parallel development. Second, funds are needed to cover the known risks of the selected system approach (i.e., funds to cover cost uncertainty). PMs must include the anticipated expense of managing risk in their estimates of program costs. Decision makers must weigh these costs against the level of risk in reaching program funding decisions. CAIV requires that program funds support the level of accepted program risk and that risk management costs are included in setting cost objectives.
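One common way to size the second component, funds to cover cost uncertainty, is an expected-value risk reserve. The sketch below is illustrative only (the Guide does not prescribe a formula, and the probabilities and costs are hypothetical): it sums each risk's probability-weighted cost impact and adds the budgeted cost of specific handling activities.

```python
# Hypothetical risk list: (name, probability of occurrence, cost impact in $M).
risks = [
    ("Engine integration slips", 0.30, 12.0),
    ("Software defect escape rate", 0.20, 5.0),
    ("Supplier qualification delay", 0.10, 8.0),
]

# Budgeted handling activities, e.g., a parallel development ($M).
handling_costs = {"Parallel backup seeker development": 6.0}

expected_reserve = sum(p * cost for _, p, cost in risks)
total_risk_dollars = expected_reserve + sum(handling_costs.values())

print(f"Expected-value reserve for known risks: ${expected_reserve:.1f}M")
print(f"Total risk management cost to include in the estimate: ${total_risk_dollars:.1f}M")
```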


4 RISK MANAGEMENT AND PROGRAM MANAGEMENT

4.1 INTRODUCTION

Risk management as a program management responsibility can be a comprehensive and responsive management tool if it is properly organized and monitored at the PM level. A formalized risk management program should be well-planned and forward-looking by identifying, analyzing, and resolving potential problem areas before they occur, and by incorporating monitoring techniques that accurately portray the status of risks and the efforts to mitigate them. Introduction of risk management early in a program emphasizes its importance and encourages contractors and members of the Government team to consider risk in the daily management functions.

This Chapter addresses the relationship between risk management and program management and suggests methods of introducing risk management in a program, organizing for risk, and training.

4.2 OVERVIEW

A PMO should organize for risk management, using existing IPTs. The PM may also want to use contractors to support management efforts or have experts not involved with the program perform independent assessments.

To use risk management as a program management tool, the information resulting from each of the risk processes should be documented in a usable form and available to members of the Government/industry program team. This information will provide the basis for reporting risk and overall program information, both internally and externally. Managing collection and dissemination of risk information can be enhanced through the use of a Management Information System (MIS).
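As an illustration of what such an MIS might store, the sketch below defines a minimal risk record. The fields are a hypothetical composite drawn from the reporting needs discussed in this chapter (probability, consequence, handling plan, owner, status), not a format prescribed by the Guide.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskRecord:
    """Minimal MIS entry for one risk event (illustrative fields only)."""
    risk_id: str
    title: str
    wbs_element: str          # product/process element where the risk lives
    probability: float        # assessed likelihood of occurrence, 0.0-1.0
    consequence: str          # e.g., "low", "moderate", "high"
    handling_approach: str    # e.g., avoid, control, transfer, assume
    owner: str                # the individual assigned responsibility
    status: str = "open"
    history: list[str] = field(default_factory=list)

    def log(self, note: str) -> None:
        """Append a dated monitoring note so the record shows risk trends."""
        self.history.append(f"{date.today().isoformat()}: {note}")

r = RiskRecord("R-014", "Seeker integration", "WBS 1.2.3",
               probability=0.4, consequence="moderate",
               handling_approach="control", owner="PIPT lead")
r.log("Bench test slipped two weeks; monitoring.")
```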

4.3 PROGRAM MANAGER AND RISK MANAGEMENT

All PMs are responsible for establishing and executing a risk management program that satisfies the policies contained in DoDD 5000.1. A PM must balance program-unique requirements or circumstances (e.g., size of the PMO staff) against the demands of proven risk management principles and practices. This section addresses these principles and practices and provides a basis for establishing a PMO's risk management organization and related procedures. The following guidelines define an approach to risk management.


4.3.1 Risk Management Is a Program Management Tool

Risk management should be integral to a program's overall management. PMs must take an active role in the process to ensure that their approach leads to a balanced use of program resources, reflects their overall management philosophy, and includes Government and contractors. Past DoD practices have generally treated risk management solely as a system engineering function, a cost-estimating technique, or possibly as an independent function distinct from other program functions. Today, risk management is recognized as a vital integrated program management tool that cuts across the entire acquisition program, addressing and interrelating cost, schedule, and performance risks. The goal is to make everyone involved in a program aware that risk should be a consideration in the design, development, and fielding of a system. It should not be treated as someone else's responsibility. Specific functional areas, such as system engineering, could be charged with implementing risk management, as long as they take the program management view towards it.

4.3.2 Risk Management Is a Formal Process

Formal risk management refers to a structured process whereby risks are systematically identified, analyzed, handled, and monitored. (A recommended structure is described in Section 2 of this Guide.) A structured risk management process, which is applied early, continuously, and rigorously, provides a disciplined environment for decision making and for the efficient use of program resources. Through a disciplined process, PMs can uncover obscure and lower-level risks that collectively could pose a major risk.

The need for a formal risk management process arises from the nature of risk and the complexity of acquisition programs. The numerous risks in an acquisition program are often interrelated and obscure and change in the course of the development process. A formal approach is the only effective method to sort through numerous risk events, to identify the risks and their interrelationships, to pinpoint the truly critical ones, and to identify cost-effective ways to reduce those risks, consistent with overall program objectives.

A structured process can reduce the complexity of an acquisition program by defining an approach to assess, handle, monitor, and communicate program risk. The systematic identification, analysis, and mitigation of risks also offers a reliable way to ensure objectivity, that is, to minimize unwarranted optimism, prejudice, ignorance, or self-interest. Further, structure reduces the impact of personnel turnover and provides a basis for training and consistency among all the functional areas of a program. A structured risk program may also promote teamwork and understanding and improve the quality of the risk products.
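The structured cycle described here can be expressed as a simple control loop. The sketch below is a hypothetical outline (the function bodies are stubs, not the Guide's procedure) showing the identify-analyze-handle-monitor sequence applied early and continuously, rather than as a one-time gate.

```python
def identify(program_state):
    """Stub: scan WBS elements and process areas for candidate risk events."""
    return ["R-001: immature seeker technology"]

def analyze(risk):
    """Stub: assess probability and consequence; returns a rating."""
    return "moderate"

def handle(risk, rating):
    """Stub: select avoid/control/transfer/assume and plan specific actions."""
    return f"control plan for {risk}"

def monitor(risk, plan):
    """Stub: track metrics against the handling plan; True if still open."""
    return False

# The loop repeats every reporting period, so new risks surface continuously
# and closed risks drop out, rather than assessment being a single event.
open_risks = {}
for period in range(1, 4):
    for risk in identify(program_state=None):
        rating = analyze(risk)
        open_risks.setdefault(risk, handle(risk, rating))
    open_risks = {r: p for r, p in open_risks.items() if monitor(r, p)}
```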

4.3.3 Risk Management Is Forward-Looking

Effective risk management is based on the premise that PMs must identify potential problems, referred to as risk events, long before they can occur and develop strategies that increase the likelihood of a favorable outcome to these problems. Application of this philosophy occurs primarily by using analytical techniques that give forward-looking assessments.

Typically, the early identification of potential problems is concerned with two types of events. The first are relevant to the current or imminent acquisition phase of a program (intermediate-term), such as satisfying a technical exit criterion in time for the next milestone review.


The second are concerned with the future phase(s) of a program (long-term), such as potential risk events related to transitioning a system from development to production.

By analyzing critical events, certain risks can be determined. To do this, one should consider the range of potential outcomes and the factors that determine those outcomes. Through risk handling, a PM then develops approaches that minimize risk factors. Paragraph 5.6 of this Guide describes some handling approaches.

Choosing the proper risk-handling options requires that a balance be struck between the resources required to implement those options and their payoffs (both intermediate and long-term) and the resources realistically available.
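One way to make that balance explicit is to score each candidate option by expected risk reduction per unit of resource. The sketch below is hypothetical (the option names, payoffs, and costs are invented, and the Guide prescribes no such formula); it simply ranks options that fit within the available budget.

```python
# Candidate handling options: (name, expected risk-exposure reduction $M, cost $M).
options = [
    ("Additional qualification testing", 4.0, 1.0),
    ("Parallel development of backup design", 9.0, 6.0),
    ("Transfer risk via fixed-price subcontract", 2.5, 0.5),
]
available_budget = 5.0  # $M realistically available

affordable = [o for o in options if o[2] <= available_budget]
ranked = sorted(affordable, key=lambda o: o[1] / o[2], reverse=True)

for name, payoff, cost in ranked:
    print(f"{name}: {payoff / cost:.1f} $ reduced per $ spent (cost ${cost}M)")
# The ranking informs, but does not replace, the PM's judgment on
# intermediate versus long-term payoffs.
```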

4.3.4 Risk Management Is Integral to Integrated Product and Process Development (IPPD)

One of the tenets of IPPD is multidisciplinary teamwork through IPTs, which are an integral part of the defense acquisition oversight and review process. The Integrating IPT (IIPT) is a valuable resource to assist in developing a risk management plan and should be used accordingly. The PM should ensure that the requirements of the Overarching IPT (OIPT) are reflected in the plan.

Working with the OIPT, the PM can establish the type and frequency of risk management information that an OIPT requires, and refine the management organization and procedures. This should be done during the initial OIPT meetings. OIPTs will most likely require information concerning the following (a sketch of one common rating scheme follows the list):

• Known risks and their characteristics, e.g., probability of occurrence and consequences/impacts,

• Planned risk-handling actions, funded and unfunded,

• Achievements in controlling risks at acceptable levels.
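Probability and consequence are commonly combined into a single reportable rating. The matrix below is one widely used 5x5 style of mapping, shown as an illustrative sketch; it is not a scale prescribed by this Guide, and each program defines its own reporting scales (a coordinator responsibility noted in Table 4-1).

```python
# Illustrative 5x5 mapping of (likelihood level, consequence level) -> rating.
# L = low, M = moderate, H = high; rows: likelihood 1-5, cols: consequence 1-5.
MATRIX = [
    ["L", "L", "L", "M", "M"],
    ["L", "L", "M", "M", "H"],
    ["L", "M", "M", "H", "H"],
    ["M", "M", "H", "H", "H"],
    ["M", "H", "H", "H", "H"],
]

def rating(likelihood: int, consequence: int) -> str:
    """Map 1-5 likelihood and consequence levels to a low/moderate/high rating."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("levels must be 1-5")
    return {"L": "low", "M": "moderate", "H": "high"}[MATRIX[likelihood - 1][consequence - 1]]

print(rating(4, 3))  # -> 'high'
```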

IIPTs and OIPTs may also require details on the PM's risk management program, access to the risk management plan, and the results of specific risk assessments. In addition, PMs may want to present selected information to IIPTs and OIPTs to help substantiate a position or recommendation, e.g., help support a budget request.

4.4 RISK MANAGEMENT ORGANIZATION IN THE PMO

The PM, after determining a preferred management approach, must organize the program office and establish outside relationships in order to manage risk. No particular organizational structure is superior; however, experience provides some insights into the development of effective risk management organizations. PMs should consider the following discussion in the context of their unique requirements and circumstances and apply those that are suitable to their specific needs.

4.4.1 Risk Management Organizational Structure

A major choice for each PM is whether to have a centralized or decentralized risk management organization. The PM may choose a centralized organizational structure until team members become familiar with both the program and the risk management process. In a centralized approach, the PM establishes a team that is responsible for all aspects of risk management. The team would write a plan, conduct assessments, evaluate risk-handling options, and monitor progress. Although this approach may be necessary early in a program, it tends to minimize the concept that risk management is a responsibility shared by all members of the acquisition team, whether Government or contractor.

The PM may also choose to decentralize. The degree of decentralization depends on the assignment of responsibilities. Some level of centralization is almost always essential for prioritizing risk across the program. A program level IPT (see Figure 4-1) or a Risk Management Board (RMB) may be appropriate for this integrating function.

The decentralized risk management organization is the most widely used approach, which is compatible with the DoD's IPPD policy and generally results in an efficient use of personnel resources. In this approach, risk management is delegated to Program IPTs (PIPTs).

The following guidelines apply to all risk management organizations:

• The PM is ultimately responsible for planning, allocating resources, and executing risk management. This requires the PM to oversee and participate in the risk management process.

• The PM must make optimal use of available resources, i.e., personnel, organizations, and funds. Personnel and organizational resources include the PMO, functional support offices of the host command, the prime contractor, independent risk assessors, and support contractors.

• Risk management is a team function. This stems from the pervasive nature of risk and the impact that risk-handling plans may have on other program plans and actions. In the aggregate, risk planning, risk assessment, risk handling, and risk monitoring affect all program activities and organizations. Any attempt to implement an aggressive forward-looking risk management program without the involvement of all PMO subordinate organizations could result in confusion, misdirection, and wasted resources. The only way to avoid this is through teamwork among the PMO organizations and the prime contractor. The management organizational structure can promote teamwork by requiring strong connectivity between that structure, the various PMO organizations, and the prime contractor. The teams may use independent assessments to assist them, when required.

Figure 4-1. Decentralized Risk Management Organization

[Figure: organization chart. The PM sits at the top, assisted by a Risk Management Coordinator and a Program Level IPT (or Risk Management Board), which oversees Sub-Tier Program IPTs (PIPTs) and PMO Functional Offices. Dotted lines indicate as-needed coordination and support provided by non-PMO organizations: Independent Risk Assessors, the Prime Contractor, Functional Support Offices, and Support Contractors.]

Figure 4-1 portrays a decentralized risk management organization. This example includes the entire PMO and selected non-PMO organizations, e.g., the prime contractor, who are members of the IPTs. The figure shows that risk management is an integral part of program management and not an additional or separate function to perform. Hence, separate personnel are not designated to manage risk; rather, all individuals are required to consider risk management as a routine part of their jobs. In the figure, the risk coordinator reports to the PM, but works in coordination with the PIPT, functional offices, and the Program Level IPT. As shown, this organizational structure is suited to Acquisition Category (ACAT) I programs, but PMs can tailor it to satisfy their specific requirements. The details are dependent upon the contract type, statement of work, and other variables.

The organizational structure shows that the PM is ultimately responsible for risk management. There is a coordinator to assist with this responsibility and act as an "operations" officer. This may be a full-time position or an additional duty as the PM deems appropriate. The coordinator should have specific training and experience in risk management to increase the chance of successful implementation and to avoid common problems. A support contractor may assist the coordinator by performing administrative tasks associated with that office.

The Program Level IPT, composed of individuals from the PMO and prime contractor, ensures that the PM's risk management program is implemented and program results are synthesized into a form suitable for decision making by the PM and OIPT.

The inclusion of both Sub-Tier IPTs and PMO functional offices simply reflects that not all program management functions will be assigned to Sub-Tier IPTs for execution.

Independent risk assessors are typically hired when the PM has specific cost, schedule, or performance concerns with a hardware or software product or engineering process and wants an independent assessment from an expert in a particular field. The duration of their services is normally short, and tailored to each program.

4.4.2 Risk Management Responsibilities

This section identifies the primary responsibilities that could be associated with a decentralized risk management organization. In assigning the responsibilities to the various organizational elements, the PM should strike a balance between a concentration of responsibilities at the higher levels and pushing them too far down the organizational structure.

The development of these responsibilities, in part, is based on the premise that risk management activities must be specific and assigned to individuals, not groups. The responsibilities listed below are assigned to the leader of each organizational element, recognizing that the composition of each element will be program unique, i.e., number of assigned PMO personnel, prime contractor personnel, etc. The task of further assigning these responsibilities, along with tailoring them to satisfy the needs and requirements of each program, remains for PMs and their staffs to accomplish.

Table 4-1 provides a description of the responsibilities associated with the decentralized risk management structure, sorted by notional organizational elements that may make up the risk management structure.

4.5 CONTRACTOR RISK MANAGEMENT

Experience has shown that managing a program's risks requires a close partnership between the PMO and the prime contractor(s). PMs must determine the type of support they need from their prime contractor, communicate these needs through the Request for Proposal (RFP) for each acquisition phase, and then provide for them in the contract. Preparation of the RFP and source selection are discussed in subsequent sections.

4.5.1 Contractor View of Risk

Contractors treat risk differently from the Government because each views risk from a different perspective. The PM, in executing his risk management program, needs to understand the contractor viewpoint.

Contractors typically divide risks into two basic types: business risks and program risks. Business risk, in the broadest sense, involves the inherent chance of making a profit or incurring a loss on any given contract. Program risk involves, among other things, technical, requirement, and design uncertainties. A contractor's efforts to minimize business risks may conflict with a Government PM's efforts to lower program risk.

While the Government and contractors may have different views on specific cost, schedule, and performance risk levels/ratings, they generally have (or should have) similar views of the risk management process. One exception may be the requirements placed by corporate management, which could conflict with the Government view of program risk. The similarity, however, does not necessarily lead to the contractor having a competent internal risk management program. As a Project Management Institute (PMI) handbook points out, "On most (contractor) projects, responsibility for Project Risk is so pervasive that it is rarely given sufficient central attention." At a minimum, it is important that the PMO write the RFP to ask the contractor to describe its risk management process, including its approach to managing any specific areas.

4.5.2 Government/Contractor Relationship

The prime contractor's support and assistance is required even though the ultimate responsibility for risk management rests with the Government PM. Often, the contractor is better equipped to understand the program technical risks than the Government program office is. Both the Government and contractor need to share information, understand the risks, and develop and execute management efforts. The Government must involve the contractor early in program development, so that effective risk assessment and reduction can occur.

Therefore, risk management must be a key part of the contractor's management scheme. Although the Government does not dictate how the contractor should manage risk, some characteristics of a good Government/contractor relationship include:

• Clear definition of risks and their assignment.


Table 4-1. Notional Description of Risk Management Responsibilities

Program Manager
• Plan, organize, direct, and control risk management.
• Comply with DoDD 5000.1, DoDI 5000.2, DoD 5000.2-R, DoDD 5000.4, and DoD 5000.4-M risk management guidance.
• Ensure that funds are available to support approved risk-handling plans.
• Inform and advise MDA, Cost Analysis Improvement Group (CAIG), and OIPT on program risk and its mitigation.

Risk Management Coordinator
• Develop and maintain risk management plans.
• Provide risk management training.
• Define the risk reporting scales to be used by the program.
• Develop and maintain a risk management information system.
• Prepare risk management reports.
• Monitor compliance with DoDD risk management requirements.
• Ensure that risk management functions and tasks performed by the Sub-Tier IPTs and the PMO functional offices are fully integrated and in compliance with assigned tasks.
• Advise the PM and Program Level IPT on the use of risk management sources, i.e., host command functional support offices, etc.
• Evaluate risk assessments, risk-handling plans, and risk monitoring results as directed and recommend appropriate actions.
• Advise the PM on the use of independent risk assessors.

Program Level IPT (some PMOs use a Risk Management Board (RMB) for this responsibility)
• Ensure that the risk management program is implemented, risk reduction is accomplished in conformance with the PM's strategy, and the risk management efforts of the Sub-Tier IPTs are integrated.
• Report risk events to the risk management coordinator.
• Evaluate whether Sub-Tier IPTs and PMO functional offices have identified critical risks and proposed risk-handling plans.
• Ensure that cost, schedule, and performance risks are compatible.
• Ensure that cost, schedule, and performance risks are combined in a manner consistent with the plan.

PMO Sub-Tier PIPTs & Functional Offices (Process) and System Elements (Products)
• Assess risks, recommending appropriate risk-handling strategies for each identified moderate and high risk, and implementing and documenting all risk management analyses and findings within the team's product area.
• Coordinate all risk management findings and decisions with other Sub-Tier IPTs, PMO functional offices, the Program Level IPT, and the risk-management coordination office.
• Identify funding requirements to implement risk-handling plans.
• Identify the need for risk management training.
• Report risk events to the Program Level IPT and risk coordinator.

Independent Risk Assessors
• Perform independent risk assessment on critical risk areas or contractor engineering processes that the PM has specified.
• Report the results of those assessments to the PM.
• Work with the risk management coordinator.


• Flexibility for assignment of risks and risk management responsibilities among the teams.

• Strong emphasis on best management and technical practices which, if followed, avoid unnecessary risks.

Information on how these characteristics should be addressed is provided in the discussion of RFP development later in this section.

The Government/contractor partnership can be forged in at least two ways. First, the PMO should include the prime contractor(s) in the top-level risk planning and assessment activities. This includes understanding and factoring in such issues as user requirements, affordability constraints, and schedule limitations. Second, the PMO should include in advance specific risk assessment and handling tasks as key contractual efforts during the concept exploration and program definition and risk reduction phases.

Forming a joint Government/contractor evaluation team is a good way of fostering an effective partnership. This is especially true in a program's early stages when uncertainty is high and both parties must frequently assess risks. These assessments, properly handled, involve multidisciplinary efforts requiring subject-matter experts from both the prime contractor and Government. This joint team should evaluate the proposed program in detail and explore the inherent program risks, the proposed handling strategies, the detailed development schedule, and the contractor's developmental resources (people, facilities, processes, tools, etc.).

A management approach using multiple teams, e.g., Sub-Tier IPTs, is the best approach to use. Joint team(s) should be established at the beginning of each development phase to assess the risks to be overcome in that phase and to determine the handling technique(s) to be used.

Requirements for contractor participation on the team(s) should be identified in the RFP and subsequent contract.

4.6 RISK MANAGEMENT AND THE CONTRACTUAL PROCESS

4.6.1 Risk Management: Pre-Contract Award

The contractor's developmental and manufacturing processes and tools, the availability and skill of personnel, and the previous experience of the Government and contractor team all influence their ability to handle the proposed system development and production. Therefore, an effective risk management process includes an evaluation of the capabilities of the potential contractors.

4.6.2 Early Industry Involvement: Industrial Capabilities Review

An Industrial Capabilities Review is a powerful tool available to PMs for determining general industrial capabilities. To avoid potential problems in the subsequent competitive process and to ensure that a "level playing field" is maintained, an announcement in the Commerce Business Daily should be made to inform all potential offerors that the Government plans to conduct an Industrial Capabilities Review and to request responses from all interested parties. Below is a general approach that PMOs may find readily adaptable to any type of capability review. The basic steps in the process are to:

• Obtain the Source Selection Authority's approval to conduct the review.

• Establish the criteria for the capability.

• Identify the potential contractors who will participate in the review.


• Provide an advance copy of the review material to those contractors.

• Select the review team, ensuring that it has the necessary mix of talent.

• Train the team on the purpose of the review and review criteria.

• Conduct the review and evaluate the results.

• Provide feedback to each contractor on the results of their review and assessment.

• Provide the results to the PM.

This review is an appraisal of general industrial capabilities and supports identifying potential program risks and best practices rather than evaluating specific contractors.

Regardless of the approach, the PMO should determine what specific information is needed. DoD 4245.7-M is a good guide to help tailor a set of questions for the contractors. The questions generally focus on two areas consistent with protection of contractor proprietary information.

• What is the state-of-the-art of the technology proposed for use in the system?

• What are the general developmental/manufacturing capabilities of the potential contractors (including experience, tools, processes, etc.) as compared to industry best practices?

Table 4-2 shows some of the specific areas or sources for risk identification. It includes a number of areas (threat, requirements, design, etc.) that experience has shown to contain risk events that tend to be more critical than others and that should receive the most management attention. Risk events are determined by examining WBS element products and processes in terms of risk areas. Process areas are specifically addressed in DoD 4245.7-M. The areas are general in that risks could be present in any program from either source (WBS or process). They are intended as a list of "top-level" risk sources that will focus attention on a specific area. The PMO and contractor(s) will have to examine lower levels to understand the actual risks that are present in their program and to develop an effective management plan. The risks shown are not intended to serve as a simple checklist that one should apply directly, then consider the program risk-free if none of the listed risks are present.
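The examination described above, walking WBS elements against the risk areas, is easy to mechanize as a checklist generator. The sketch below is a hypothetical illustration (the WBS element names and prompt wording are invented); it simply pairs each WBS element with each risk area so no combination is overlooked, leaving the real judgment to the PMO and contractor.

```python
# Risk areas from Table 4-2; WBS elements are illustrative placeholders.
RISK_AREAS = ["Threat", "Requirements", "Design", "Test and Evaluation",
              "Simulation", "Technology", "Logistics", "Production/Facilities",
              "Concurrency", "Capability of Developer", "Cost/Funding",
              "Schedule", "Management"]

wbs_elements = ["1.1 Air Vehicle", "1.2 Propulsion", "1.3 Fire Control Software"]

def candidate_prompts(elements, areas):
    """Yield one screening prompt per (WBS element, risk area) pair."""
    for element in elements:
        for area in areas:
            yield f"For {element}: any {area} risk events? (examine lower levels)"

for prompt in candidate_prompts(wbs_elements, RISK_AREAS[:2]):
    print(prompt)
```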

An examination of the program in these areas can help to develop the final program acquisition strategy and the risk-sharing structure between the Government and industry. The PMO can also use the results to adjust the RFP for the next phase of the program.

4.6.3 Developing the Request for Proposal

The RFP should communicate to all offerorsthe concept that risk management is an essentialpart of the Government’s acquisition strategy.

Before the draft RFP is developed using the results of the Industrial Capabilities Review, the PMO should conduct a risk assessment to ensure that the program described in the RFP is executable within the technical, schedule, and budget constraints. Based on this assessment, a program plan, an integrated master schedule, and a life-cycle cost (LCC) estimate may be prepared. The technical, schedule, and cost issues should be discussed in the pre-proposal conference(s) before the draft RFP is released. In this way, critical risks inherent in the program can be identified and addressed in the RFP. In addition, this helps to establish key risk-management contractual conditions. The RFP should encourage offerors to extend the contract WBS (CWBS) to reflect how they will identify all elements at any level that are expected to be high cost or high risk. The RFP should also encourage offerors to cite any elements of the CWBS provided in the draft RFP that are not consistent with their planned approach.


Table 4-2. Significant Risks by Critical Risk Areas

Threat
• Uncertainty in threat accuracy.
• Sensitivity of design and technology to threat.
• Vulnerability of system to threat and threat countermeasures.
• Vulnerability of program to intelligence penetration.

Requirements
• Operational requirements not properly established or vaguely stated.
• Requirements are not stable.
• Required operating environment not described.
• Requirements do not address logistics and suitability.
• Requirements are too constrictive and identify specific solutions that force high cost.

Design
• Design implications not sufficiently considered in concept exploration.
• System will not satisfy user requirements.
• Mismatch of user manpower or skill profiles with system design solution or human-machine interface problems.
• Increased skills or more training requirements identified late in the acquisition process.
• Design not cost effective.
• Design relies on immature technologies or "exotic" materials to achieve performance objectives.
• Software design, coding, and testing.

Test and Evaluation
• Test planning not initiated early in program (CTD Phase).
• Testing does not address the ultimate operating environment.
• Test procedures do not address all major performance and suitability specifications.
• Test facilities not available to accomplish specific tests, especially system-level tests.
• Insufficient time to test thoroughly.

Simulation
• Same risks as contained in the Significant Risks for Test and Evaluation.
• M&S are not verified, validated, or accredited for the intended purpose.
• Program lacks proper tools and modeling and simulation capability to assess alternatives.

Technology
• Program depends on unproved technology for success; there are no alternatives.
• Program success depends on achieving advances in state-of-the-art technology.
• Potential advances in technology will result in a less than optimal cost-effective system or make system components obsolete.
• Technology has not been demonstrated in required operating environment.
• Technology relies on complex hardware, software, or integration design.

Logistics
• Inadequate supportability late in development or after fielding, resulting in need for engineering changes, increased costs, and/or schedule delays.
• Life-cycle costs not accurate because of poor logistics supportability analyses.
• Logistics analyses results not included in cost-performance tradeoffs.
• Design trade studies do not include supportability considerations.

Production/Facilities
• Production implications not considered during concept exploration.
• Production not sufficiently considered during design.
• Inadequate planning for long lead items and vendor support.
• Production processes not proven.
• Prime contractors do not have adequate plans for managing subcontractors.
• Sufficient facilities not readily available for cost-effective production.
• Contract offers no incentive to modernize facilities or reduce cost.

Concurrency
• Immature or unproven technologies will not be adequately developed before production.
• Production funding will be available too early, before the development effort has sufficiently matured.
• Concurrency established without clear understanding of risks.

Capability of Developer
• Developer has limited experience in specific type of development.
• Contractor has poor track record relative to costs and schedule.
• Contractor experiences loss of key personnel.
• Prime contractor relies excessively on subcontractors for major development efforts.
• Contractor will require significant capitalization to meet program requirements.

Cost/Funding
• Realistic cost objectives not established early.
• Marginal performance capabilities incorporated at excessive costs; satisfactory cost-performance tradeoffs not done.
• Excessive life-cycle costs due to inadequate treatment of support requirements.
• Significant reliance on software.
• Funding profile does not match acquisition strategy.
• Funding profile not stable from budget cycle to budget cycle.

Schedule
• Schedule not considered in trade-off studies.
• Schedule does not reflect realistic acquisition planning.
• APB schedule objectives not realistic and attainable.
• Resources not available to meet schedule.

Management
• Acquisition strategy does not give adequate consideration to various essential elements, e.g., mission need, test and evaluation, technology, etc.
• Subordinate strategies and plans are not developed in a timely manner or based on the acquisition strategy.
• Proper mix (experience, skills, stability) of people not assigned to PMO or to contractor team.
• Effective risk assessments not performed or results not understood and acted upon.



In the solicitation, PMs may ask offerors to include a risk analysis and a description of their management plans, and also to develop a supporting program plan and an integrated master schedule in their proposals. These proposals will support the Government's source selection evaluation and the formulation of a most probable cost estimate for each proposal. In addition, the RFP may identify the requirement for periodic risk assessment reports that would serve as inputs to the PM's assessment and monitoring processes, thereby ensuring that risks are continuously assessed.

4.6.4 The Offeror’s Proposal

The offerors should develop the proposed program plans and documentation at a level that is adequate to identify risks, develop associated management activities that they will use throughout the program, and integrate resources, technical performance measures, and schedule in the proposed program plans. Program plans should extend the CWBS to reflect the offeror's approach and include the supporting activities, critical tasks, and processes in the CWBS dictionary. The associated schedules for each should be incorporated into an integrated master schedule. Plans should also have an estimate of the funds required to execute the program and include a breakout of resource requirements for high-risk areas.

The information required and the level of detail will depend on the acquisition phase, the category, and criticality of the program, as well as on the contract type and value. However, the detail submitted with the proposal must be at a sufficiently low level to allow identification of possible conflicts in the planned acquisition approach and to support the Government's proposal evaluation. Generally, the CWBS should be defined below level 3, by the contractor, only to the extent necessary to capture those lower level elements that are high cost, high risk, or of high management interest.
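That selective-extension rule can be illustrated with a small filter. The sketch below is hypothetical (the element data and the cost threshold are invented): it extends a CWBS below level 3 only where an element is flagged high cost, high risk, or of high management interest.

```python
# Illustrative CWBS elements: (id, level, cost $M, risk rating, mgmt interest).
cwbs = [
    ("1.2.3",   3, 40.0, "moderate", False),
    ("1.2.3.1", 4, 25.0, "high",     False),
    ("1.2.3.2", 4,  2.0, "low",      False),
    ("1.4.1.1", 4, 30.0, "low",      True),
]

COST_THRESHOLD_M = 20.0  # assumed cutoff for "high cost"

def extend(element):
    """Keep a below-level-3 element only if it is high cost, high risk,
    or of high management interest."""
    _, level, cost, risk, interest = element
    if level <= 3:
        return True  # levels 1-3 are always defined
    return cost >= COST_THRESHOLD_M or risk == "high" or interest

reported = [e[0] for e in cwbs if extend(e)]
print(reported)  # -> ['1.2.3', '1.2.3.1', '1.4.1.1']
```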

4.6.5 Basis for Selection

DoD acquisition management must focus on balancing cost, schedule, performance, and risk by selecting the contractor team that provides the best value to the user within acceptable risk limits. Therefore, the RFP/Source Selection process must evaluate each offeror's capability for meeting product and process technical, cost, and schedule requirements while addressing and controlling the risks inherent in a program.

The evaluation team should discriminate among offerors based upon the following:

• Risks determined by comparison with the best practices baseline.

• Ability to perform with a focus on the critical risk elements inherent in the program.

• Adherence to requirements associated with any mandatory legal items.

• Past performance on efforts similar to the proposed program being evaluated.

The process of choosing among offerors may be enhanced if the evaluation team includes risk management as a "source selection discriminator." Risk management then becomes an important factor in the Source Selection Authority's determination of who provides the most executable program.


4.6.6 Source Selection

The purpose of a source selection is to select the contractor whose cost, schedule, and performance can best be expected to meet the Government's requirements at an affordable price. To perform this evaluation, the Government must assess both proposal risk and performance risk for each proposal. These risk assessments must be done entirely within the boundaries of the source selection process. Previous assessments of any of the offerors may not be applicable or allowable.

4.6.6.1 Proposal Risk. This refers to the risk associated with the offeror's proposed approach to meet the Government cost, schedule, and performance requirements. The evaluation of proposal risk includes an assessment of proposed time and resources and recommended adjustments. This assessment should be performed according to the definitions and evaluation standards developed for the source selection. Proposal risk is, in essence, a moderate expansion of past evaluation processes. Historically, evaluators selected contractors who demonstrated that they understood the requirements and offered the best value approach to meeting the Government's needs. The expansion on this concept is the specific consideration of risk.

Technical and schedule assessments are primary inputs to the most probable cost estimate for each proposal. It is important to estimate the additional resources needed to control any risks that have moderate or high risk ratings. Offerors may define them in terms of additional time, personnel loading, hardware, or special actions such as additional tests. However, whatever the type of the required resources, it is essential that cost estimates be integrated and consistent with the technical and schedule evaluations.
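A most probable cost estimate of this kind can be sketched as the proposed cost plus probability-weighted adjustments for each moderate- or high-rated risk. The figures and structure below are hypothetical, purely to show how cost is integrated with the technical and schedule evaluations.

```python
proposed_cost_m = 250.0  # offeror's proposed cost ($M), illustrative

# Risk-driven adjustments identified by evaluators:
# (description, probability, added cost $M if realized).
adjustments = [
    ("Extra qualification tests for high-risk seeker", 0.9, 4.0),
    ("Schedule slip: 3 added months of personnel loading", 0.5, 9.0),
    ("Replacement hardware for immature process", 0.3, 6.0),
]

most_probable_cost = proposed_cost_m + sum(p * c for _, p, c in adjustments)
print(f"Most probable cost: ${most_probable_cost:.1f}M "
      f"(proposed ${proposed_cost_m:.1f}M)")
```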

4.6.6.2 Performance Risk. A performance risk assessment is an evaluation of the contractor's past and present performance record to establish a level of confidence in the contractor's ability to perform the proposed effort. Such an evaluation is not limited to programmatic technical issues, but also includes assessment of critical vendor financial viability. Financial capability analyses and industrial capability assessments, conducted in accordance with DoD Handbook 5000.60H, provide insight into a contractor's ability to perform the proposed effort.

A range of methods is available to the PM to evaluate performance risk. The Performance Risk Assessment Group (PRAG) is a group of experienced Government personnel who are appointed by the source selection advisory council Chairperson to permit performance risk to be used, if appropriate. Performance risk may be separately assessed for each evaluation factor or as a whole, with the assessment provided directly to the source selection advisory council/authority for final decision or indirectly through the Source Selection Evaluation Board. The assessment relies heavily (although not exclusively) on the contractor performance evaluations and surveys submitted by the PMO and Defense Contract Management Agency (DCMA).

4.7 RISK MANAGEMENT: POST-CONTRACT AWARD

Post-contract award risk management builds on the work done during the pre-contract award phase. With the award of the contract, the relationship between the Government and the contractor changes as teams are formed to address program risk. These teams should validate pre-contract award management plans by reviewing assessments, handling plans, and monitoring intentions. The extent of assessments increases as the contractor develops and refines his design, test and evaluation, and manufacturing plans. The Government PMO should work with the contractor to refine handling plans.

The process begins with an Integrated Baseline Review (IBR) after contract award to ensure that reliable plans and performance measurement baselines capture the entire scope of work, are consistent with contract schedule requirements, and have adequate resources assigned to complete program tasks. The IBR could be conducted to incorporate other steps identified below. These steps suggest an approach that the PMO might take to initiate the program's risk management plans and activities after contract award. They are intended to be a starting point, and the PMO should tailor the plan to reflect each program's unique needs.

• Conduct initial meeting with the contractor to describe the program's objectives and approach to managing risks. The PM may also present the risk management plan.

• Train members of the PMO and the contractor's organization on risk management basics, incorporating the program's management plan and procedures into the training.

• Review the pre-contract award risk plan with the PMO and contractor, revise it as necessary, and share results with the contractor.

• Conduct in-depth review of the pre-contract award risk assessments and expand the review to include any new information obtained since the award of the contract.

• Review and revise risk-handling plans to reflect the reassessment of risks.

• Review the program's documentation requirements with the contractor. Ensure that the PMO and contractor understand the purpose, format, and contents of various risk reports.

• Initially, it may be necessary to establish a formalized PMO-contractor risk management organization for the program, consistent with the terms of the contract.


• Establish the program reporting requirements with the contractor. Describe the risk management information system that the program has established, including procedures for providing information for data entry, and identify reports for the PMO and contractor.

• In conjunction with the contractor, identify other risk-management activities that need to be performed.

• Manage the program risk in accordance with the risk management plan and contract.

• Working with the contractor, refine the risk-monitoring plans and procedures and develop appropriate measures and metrics to track moderate- and high-risk items.

4.8 RISK MANAGEMENT REPORTING AND INFORMATION SYSTEM

The PMO should have a practical method for risk-management reporting, and an information system that supports a risk management program. The reporting needs of the PM establish the type, format, and frequency of information sharing. The IPT concept suggests that the entire acquisition program team needs access to the risk management information, and the prime contractor(s) should have access to information, consistent with acquisition regulations. The reporting and information system chosen may be Government- or contractor-owned. See Chapter 5 for an example of an MIS.


4.9 RISK MANAGEMENT TRAINING

A successful management program depends, to a large extent, on the level of risk management training the PMO members and the functional area experts receive. The training will prepare them for critical tasks, such as risk assessments. DoD schools offer some risk-management training; however, PMs will need to organize and conduct principal training for the program office. A three-part framework for training covers program-specific risk management issues, general structure and process, and techniques.

(1) The program-specific training should ensure that everyone has a common vision. It should cover the acquisition strategy, the companion risk management plan, the PM's risk-management structure and associated responsibilities, and the MIS.

(2) The following topics provide a starting point for general training syllabus development. The final syllabus should be tailored to meet the program's specific needs. Table 4-3 provides a list of references that will be useful in developing the syllabus and lesson plans.

• Concept of Risk,

• Risk Planning,

• Risk Identification,

• Risk Analysis (as applicable),

• Risk Handling, and

• Risk Monitoring.

(3) The third area of training concerns risk-management techniques, concentrating on the techniques the PMO plans to employ. The training should focus on how to use the techniques and should include examples of their use. Chapter 5, Risk Management Techniques, of this Guide provides a starting point. It contains a general discussion of a set of techniques that address all elements of the risk management process. The discussion of each technique contains a list of references that provide a more in-depth description of the technique. The set of techniques is not exhaustive, and the program office should add to the list, if necessary.


Table 4-3. Risk Management Reference Documents

DoD 4245.7-M, Transition from Development to Production, September 1985.
Provides a structure for identifying technical risk areas in the transition from a program's development to production phases. The structure is geared toward development programs but, with modifications, could be used for any acquisition program. The structure identifies a series of templates for each of the development contractor's critical engineering processes. The template includes potential areas of risk and methods for reducing risk in each area.

Risk Management Concepts and Guidance, Defense Systems Management College, March 1989. (Superseded by this Risk Management Guide.)
Devoted to various aspects of risk management.

Systems Engineering Management Guide, Defense Acquisition University Press, January 2001, Section 15.
Devoted to risk analysis and management and provides a good overview of the risk management process.

Continuous Risk Management Guide, Software Engineering Institute, Carnegie Mellon University, 1996.
Provides a risk management methodology similar to the one described in the Defense Acquisition Deskbook. Its value is that it subdivides each process into a series of steps; this provides useful insights. Appendix A describes 40 risk-management techniques, the majority of which are standard management techniques adapted to risk management. This makes them a useful supplement to the Defense Acquisition Deskbook identified techniques.

A Systems Engineering Capability Maturity Model, Version 1.0, Software Engineering Institute (Carnegie Mellon University), Handbook SECMM-94-04, December 1994.
Describes one approach to conducting an Industry Capabilities Review. Section PA 10 (pp. 4-72 to 4-76) discusses software risk management. The material presented in this handbook also can be tailored to apply to system and hardware risk.

A Software Engineering Capability Maturity Model, Version 1.01, Software Engineering Institute (Carnegie Mellon University), Technical Report, December 1996.
Describes an approach to assess the software acquisition processes of the acquiring organization and identifies areas for improvement.

Capability Maturity Model for Software (SW-CMM), Version 1.1, CMU/SEI-93-TR-24, February 1993.
A tool that allows an acquiring organization to assess the software capability maturity of an organization.

Taxonomy-Based Risk Identification, Software Engineering Institute, Carnegie Mellon University, CMU/SEI-93-TR-6 (ESC-TR-93-183), June 1993.
Describes a method for facilitating the systematic and repeatable identification of risks associated with the development of a software-intensive project. This method has been tested in active Government-funded defense and civilian software development projects. The report includes macro-level lessons learned from the field tests.

NAVSO P-6071.
Navy "best practices" document with recommended implementations and further discussion of the material in DoD 4245.7-M.

Risk Management, AFMC Pamphlet 63-101, July 1997.
An excellent pamphlet on risk management that is intended to provide PMs and the PMO with a basic understanding of the terms, definitions, and processes associated with effective risk management. It is very strong on how to perform pre-contract award risk management.

Defense Acquisition Deskbook (http://www.deskbook.osd.mil).
Primary reference tool for the defense acquisition work force; contains over 1,000 mandatory and discretionary publications and documents which promulgate acquisition policy and guidance.

Acquisition Software Development Capability Evaluation, AFMC Pamphlet 63-103, 15 June 94.
Describes one approach to conducting an Industry Capabilities Review. This two-volume pamphlet was generated from material originated at Aeronautical Systems Center. The concepts support evaluations during source selection and when requested by IPTs. The material presented in this pamphlet also can be tailored to apply to system and hardware risk management.

Risk Management Critical Process Assessment Tool, Air Force SMC/AXD, Version 2, 9 June 1998.
Provides guidance and extensive examples for developing RFP Sections "L" and "M," plus source selection standards for risk management. Also includes technical evaluation and review questions, which are helpful for assessing a risk management process; and risk trigger questions, which are helpful for risk identification.

NAVSO P-3686, Top Eleven Ways to Manage Technical Risk, October 1998.
Contains the Navy approach to risk management with baseline information, explanations, and best practices that contribute to a well-founded technical risk management program.

Risk Focus Area of the Program Management Community of Practice (www.pmcop.dau.mil).
Provides a comprehensive and ready source of current tools, papers, and practices in the risk management field.

52

53

5
RISK MANAGEMENT TECHNIQUES

5.1 INTRODUCTION

This chapter provides top-level information on a number of techniques currently used in DoD, along with a combination of techniques used by the Services, industry, and academia. Collectively, they focus on the components of the risk management process and address critical risk areas and processes. The write-ups describe the techniques and give information on their application and utility. The descriptions are at a level of detail that should permit potential users to evaluate the suitability of the techniques for addressing their needs; however, the material does not, in most cases, provide all the information required to use a technique. If a particular technique looks promising, readers can obtain enough information from the references and tools cited to enable their program offices to apply it. The descriptions are in a format that aids comparison with other approaches.

5.2 OVERVIEW

Techniques are available to support risk management activities. None are required by DoD, but some have been used successfully in the past by DoD PMs. Many of the techniques support processes that are part of sound management and systems engineering, and they give Government and contractor PMs the tools for considering risk when making decisions on managing the program.

Several tools have been developed to support each of the components of the risk management process, i.e., planning, assessing, handling, and monitoring and documenting. Although tool developers may claim otherwise, none are integrated enough to satisfy all the needs of a PM. Most likely, a PM will choose an overall risk strategy, write a plan to reflect that strategy, review the list of proven techniques that support the components of risk management, assess the techniques against the program's needs and available resources, tailor the techniques to suit the needs of the program, and train program office members to implement the plan.

5.3 RISK PLANNING TECHNIQUES

5.3.1 Description

This technique suggests an approach to risk planning: the process of developing and documenting an organized, comprehensive approach to managing program risk. It also suggests an interactive strategy and methods for identifying and tracking risk drivers, developing risk-handling plans, performing continuous assessments to determine how risks have changed, and planning adequate resources. The risk planning technique is applicable to all functional areas in the program, especially critical areas and processes. Using the acquisition strategy as a starting point results in the development of a program risk management strategy, from which flows a management plan that provides the detailed information and direction necessary to conduct an effective risk management program. This risk management plan provides the PM with an effective method to define a program, one that fixes responsibility for the implementation of its various aspects and supports the acquisition strategy.

The technique should first be used in the Concept and Technology Development (CTD) Phase, following the development of the initial acquisition strategy. Subsequently, it may be used to update the risk management plan on the following occasions: (1) whenever the acquisition strategy changes, or there is a major change in program emphasis; (2) in preparation for major decision points; (3) in preparation for, and immediately following, technical audits and reviews; (4) concurrent with the review and update of other program plans; and (5) in preparation for a POM submission.

The PMO risk management coordinator, if assigned, develops the risk management plan based on guidance provided by the PM and in coordination with the Program-Level IPT. To be effective, the PM must make risk management an important program management function and must be actively involved in the risk planning effort. Planning requires the active participation of essentially the entire PMO and contractor team.

Figure 5-1. Risk Planning Technique Input and Output

Inputs: acquisition strategy; prior risk management plan (if any); known risks; system description; program description; key ground rules and assumptions. PM guidance governs the process.
Activities: evaluate risk planning requirements; evaluate the program's current risk situation; develop a risk management strategy; determine the tasks and guidance required to implement the risk management strategy; develop the PMO's approach to risk management in general; provide application guidance for risk management component processes; develop inputs for other acquisition strategies and program processes.
Outputs: Risk Management Plan; risk management training.
Participants: Program-Level IPT (or equivalent, such as a Risk Management Board); risk management coordinator.


5.3.2 Procedures

Figure 5-1 graphically depicts the process to be followed in applying this technique. The procedure consists of a number of iterative activities that result in the development of the risk management strategy and a Risk Management Plan.

The acquisition strategy and related management planning efforts (program management and systems engineering), program constraints, and any existing risk management planning are integrated and evaluated in the context of the PM's guidance, which provides the direction for the planning process. Typical types of PM guidance are concerns about certain categories of risk, guidance on the funding of handling activities, the emphasis to be placed on risk management training, and the frequency and type of internal reports.

The integration and evaluation of the primary inputs establish the requirements and scope of the planning effort through an assessment of the program's current risk situation. The results of the assessment provide the basis for development of the risk management strategy. The strategy should reflect the level of risk that the PM is prepared to accept and should provide guidance on how and when known risks will be reduced to acceptable levels. It should also describe the risk management process the PMO will employ and the organization and structure of the management program, addressing such things as risk ratings, the use of an MIS, policy and procedures on sharing risk management information, and training.

The PMO should create an MIS early in the planning process. It will serve as a planning source, and its data may be used for creating reports. It will also become the repository for all current and historical information related to risk. Eventually, this information may include risk assessment documents, contract deliverables, if appropriate, and other risk-related reports.

Based on the management strategy, the plan identifies specific tasks to be accomplished and assigns responsibility for their execution. The timing of these tasks should be incorporated into an integrated critical path master schedule or equivalent. Guidance for task execution and control should also be developed, covering such things as the suggested techniques to be used for each component, any assistance available to Sub-Tier IPTs, the use of funds, the policy on the use of independent risk assessors, etc. This information may be documented in a risk management plan. A sample format is shown in Figure 5-2. Appendix B contains two examples of a Risk Management Plan.

The contents of the risk management strategy and plan should be consistent with the acquisition strategy and other program plans derived from the acquisition strategy. Hence, the plan should be tailored to each program, rather than applying the same process and implementation to all programs. This will help to ensure that risk is considered in all program activities and that risk management does not become a "stove pipe" function.

5.4 RISK ASSESSMENT TECHNIQUES

5.4.1 Product (WBS) Risk Assessment

5.4.1.1 Description. This technique identifies the risks associated with a given system concept and design. The difference between the process (DoD 4245.7-M) technique and this approach is that DoD 4245.7-M addresses the contractor's engineering and manufacturing processes, while this technique focuses on the resulting product. This technique is used to identify and analyze risks in the following critical risk areas: design and engineering, technology, logistics, production, and concurrency, plus others as needed, for both hardware and software.


Figure 5-2. Sample Format for Risk Management Plan

INTRODUCTION. This section should address the purpose and objective of the plan and provide a brief summary of the program, to include the approach being used to manage the program and the acquisition strategy.

PROGRAM SUMMARY. This section contains a brief description of the program, including the acquisition strategy and the program management approach. The acquisition strategy should address its linkage to the risk management strategy.

DEFINITIONS. Definitions used by the program office should be consistent with DoD definitions for ease of understanding and consistency. However, the DoD definitions allow program managers flexibility in constructing their risk management programs. Therefore, each program's risk management plan may include definitions that expand the DoD definitions to fit its particular needs. For example, each plan should include, among other things, definitions for the ratings used for technical, schedule, and cost risk.

RISK MANAGEMENT STRATEGY AND APPROACH. Provide an overview of the risk management approach, to include the status of the risk management effort to date, and a description of the program risk management strategy.

ORGANIZATION. Describe the risk management organization of the program office and list the responsibilities of each of the risk management participants.

RISK MANAGEMENT PROCESS AND PROCEDURES. Describe the program risk management process to be employed, i.e., risk planning, assessment, handling, monitoring, and documentation, and give a basic explanation of these components. Also provide guidance for each of the risk management steps in the process. The guidance should be general enough to allow the program's risk management organization (e.g., IPTs) flexibility in managing the program risk, yet specific enough to ensure a common and coordinated approach to risk management. It should address how the information associated with each element of the risk management process will be documented and made available to all participants in the process, and how risks will be tracked, to include the identification of specific metrics if possible.

RISK PLANNING. This section describes the risk planning process and provides guidance on how it will be accomplished, and on the relationship between continuous risk planning and this RMP. Guidance on updates of the RMP and the approval process to be followed should also be included.

RISK ASSESSMENT. This section of the plan describes the assessment (identification and analysis) process. It includes procedures for examining the critical risk areas and processes to identify and document the associated risks. It also summarizes the analysis process for each of the risk areas leading to the determination of a risk rating. This rating is a reflection of the potential impact of the risk in terms of its variance from known Best Practices or its probability of occurrence, its consequence, and its relationship to other risk areas or processes. This section may include:

• Overview and scope of the assessment process
• Sources of information
• Information to be reported and formats
• Description of how risk information is retained
• Assessment techniques and tools.

RISK HANDLING. This section describes the risk-handling options and identifies tools that can assist in implementing the risk-handling process. It also provides guidance on the use of the various handling options for specific risks.

RISK MONITORING. This section describes the process and procedures that will be followed to monitor the status of the various risk events identified. It should provide criteria for the selection of risks to be reported on, and the frequency of reporting. Guidance on the selection of metrics should also be included.

RISK MANAGEMENT INFORMATION SYSTEM, DOCUMENTATION, AND REPORTS. This section describes the MIS structure, rules, and procedures that will be used to document the results of the risk management process. It also identifies the risk management documentation and reports that will be prepared; specifies the format and frequency of the reports; and assigns responsibility for their preparation.


The WBS is the starting point for describing the contract work to be done and the resulting product, and it is the basis for determining risk events in each critical risk area. The risk events (events that might have a detrimental impact on the system, subsystems, or components) are evaluated to identify and characterize specific risks and to support their rating and prioritization.

This technique should be used shortly after the completion of the prime contractor's WBS. Thereafter, it should be used regularly up to the start of production. The technique can be used independently or in conjunction with other risk assessment techniques, such as the Process (DoD 4245.7-M) Risk Assessment technique. It may, if appropriate, also be used in conjunction with the Integrated Baseline Review (IBR), which is conducted within 6 months of contract award. See Section 1.4.2.4.3 of the Defense Acquisition Deskbook (http://www.deskbook.osd.mil) for a discussion of an IBR. A Website that discusses the IBR process is also available at www.acq.osd.mil./pm/ibrmats/ibrmats.htm.

To apply this technique, joint Government and industry evaluation teams should examine the appropriate WBS levels in each Sub-Tier IPT's product area. If necessary, complementary industry-only teams may take an in-depth look at selected areas at lower WBS levels. At times, it may be desirable to include outside industry experts on the teams to aid in the examination of specific WBS elements or functional areas.

5.4.1.2 Procedures. Figure 5-3 depicts the process used in this technique. The first step is to review the WBS elements down to the level being considered and to identify risk events. This review should consider the critical areas (design and engineering, technology, logistics, etc.) that may help to describe risk events. Table 5-1 shows a partial listing of these elements.

Using information from a variety of sources, such as program plans, prior risk assessments, expert interviews, etc., the WBS elements are examined to identify specific risks in each critical area. The risk events are then analyzed to determine their probability of occurrence and consequences/impacts,

Figure 5-3. Product (WBS) Risk Assessment Technique Input and Output

Inputs: program plans; past project data; lessons learned; expert interview data; test results; Integrated Baseline Review; WBS; Integrated Master Schedule (or equivalent); critical area evaluation criteria.
Activities: examine WBS elements and identify risk events; analyze risk events (includes rating and prioritizing risk events).
Outputs: Risk Information Forms; prioritized list of risks; list of aggregated risks; watch lists.
Participants: Sub-Tier IPT evaluation teams; "outside" industry experts.

along with any interdependencies and risk event priorities. Several techniques and tools are available to accomplish this, including, among others, technology assessments, modeling and simulation, hazard analysis, and fault tree analysis.

The results of this analysis should be documented in a program-specific standard format, such as a Risk Information Form (RIF). These risks, along with those identified using other techniques, can be prioritized and aggregated using the techniques described later in this chapter.
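Because the RIF is a structured record, a program office that keeps RIF entries in electronic form can sort, filter, and roll them up mechanically. The following is a minimal sketch in Python of one way such a record might be captured; the field names are illustrative assumptions, not a mandated RIF layout, and the sample entry is notional (loosely echoing the aircraft-weight example in Figure 5-14).

    # A sketch of an electronic Risk Information Form record;
    # fields shown are illustrative, not a prescribed RIF format.
    from dataclasses import dataclass, field

    @dataclass
    class RiskInformationForm:
        risk_id: str                  # program-unique identifier
        wbs_element: str              # WBS element the risk event maps to
        critical_area: str            # e.g., "Design and Engineering"
        risk_event: str               # description of the potential event
        probability: str              # rating, e.g., "Low", "Moderate", "High"
        consequence: str              # rating, e.g., "Low", "Moderate", "High"
        interdependencies: list = field(default_factory=list)
        status: str = "Open"

    # Notional example entry.
    rif = RiskInformationForm(
        risk_id="XY-001",
        wbs_element="1.2.3 Airframe",
        critical_area="Design and Engineering",
        risk_event="Exceed aircraft weight budget by 10%",
        probability="High",
        consequence="High",
    )
    print(rif)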

5.4.2 Process (DoD 4245.7-M) Risk Assessment

5.4.2.1 Description. This technique is used to assess (identify and analyze) program technical risks resulting from the contractor's processes. It is based on the application of the technical risk area templates found in DoD 4245.7-M. These templates describe the risk areas contained in the various technical processes (e.g., design, test, production, etc.) and specify methods for reducing risks in each area. The success of any risk reduction effort associated with this technique will depend on the contractor's ability and willingness to make

Table 5-1. Critical Risk Areas and Example Elements

Design and Engineering:
• Design/technology approach
• Operational environments
• External/internal interfaces
• Use of standard parts/program parts list
• System/subsystem critical design requirement
• Integration requirements
• Human-machine interface
• Design growth capacity
• Design maturity
• Safety and health hazards

Logistics:
• Operations and Maintenance (O&M) concept
• System diagnostic requirement
• Repairability and Maintainability (R&M) requirements
• Supply support requirements
• Built-in Test (BIT) requirements
• Manpower, training, and skill profiles
• Support equipment requirements
• Maintenance interfaces
• Level of repair decisions
• Training equipment design

Testing:
• Integrated test
• Qualification testing
• Subsystem test limits
• Test environmental acceleration
• Supportability test results

Manufacturing:
• Design producibility
• Manufacturing capability requirements
• Parts/assemblies availability
• Special tooling/test equipment planning
• Personnel availability
• Process/tooling proofing
• Production equipment availability

Concurrency:
• Program schedule adequacy
• Development phases concurrency

a concerted effort to replace any deficient engineering practices and procedures with best industrial practices.

One of the primary benefits of this technique is that it addresses pervasive and important sources of risk in most DoD acquisition programs and uses fundamental engineering principles and proven procedures to reduce technical risks. The technique is accepted by many aerospace companies in normal business activities and, in fact, was developed by a group of Government and aerospace experts.

The technique is primarily applicable during the Concept and Technology Development (CTD) Phase and the System Demonstration part of the System Development and Demonstration (SDD) Phase of program development. In the CTD Phase it provides a detailed checklist of processes that the contractor needs to address; in the System Demonstration part of the SDD Phase, the processes are being implemented in preparation for Low Rate Initial Production (LRIP). The description of each template in DoD 4245.7-M shows the phases in which the template should be applied. The specific timing of the application within the phases should be determined based on the type of program, the acquisition strategy and plans, and the judgment of program officials. The technique should also be used in preparation for milestone decisions and when preparing for source selection. It may be used independently or in conjunction with other risk assessment techniques. When feasible, a Government-industry evaluation team should be formed early in the program to apply this technique.

Figure 5-4. Process (DoD 4245.7-M) Risk Assessment Technique Input and Output

Inputs: DoD 4245.7-M templates; combined Government/industry acquisition flow chart; known best practices; past project data; Best Practices Database (PMWS); corporate policies, practices, and procedures; contract requirements, specifications, and modifications.
Activities: identify the program's critical technical processes; develop a technical baseline for the critical technical processes; develop a program baseline; measure variances between the baselines; report risks.
Outputs: technical baseline; program baseline; Risk Information Forms; technical risk assessment summary; prioritized list of risks; watch lists.
Participants: Government-industry evaluation team; "outside" industry experts.

5.4.2.2 Procedures. Figure 5-4 shows the basic approach used in this technique. The DoD 4245.7-M templates are used in conjunction with the contract requirements and specifications to identify the technical processes critical to the program and to establish a program baseline of contractor processes. When possible, the program baseline should be determined by evaluating actual contractor performance, as opposed to stated policy. For example, design policy should be determined by interviewing designers, not simply by reviewing written corporate policies.

This program baseline should then be compared to a baseline of industry-wide processes and practices that are critical to the program. That baseline should be developed by reviewing and compiling known best practices in use by various companies in both the defense and non-defense sectors. One source of best practices information is the Program Manager's Work Station (PMWS), a series of PC expert systems designed to aid in the implementation of DoD 4245.7-M. The point of contact for the PMWS is the Best Manufacturing Practices Center of Excellence (http://www.bmpcoe.org).

The differences between the two baselines are a reflection of the technical process risk present. These results should be documented in a standard format, such as a program-specific Risk Information Form (see the MIS discussion in this section), to facilitate the development of risk-handling and risk-reporting plans.
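The baseline comparison lends itself to a simple mechanical check. The sketch below, in Python, assumes each template area has been scored on a notional 1 (matches best practice) to 5 (seriously deficient) scale; the template names are drawn from DoD 4245.7-M, but the scores and the variance threshold are illustrative assumptions, not values from this Guide.

    # Notional scores: 1 = matches best practice, 5 = deficient.
    best_practice_baseline = {"Design Reference Mission Profile": 1,
                              "Trade Studies": 1,
                              "Manufacturing Plan": 1}

    program_baseline = {"Design Reference Mission Profile": 2,
                        "Trade Studies": 1,
                        "Manufacturing Plan": 4}

    # The variance between the two baselines indicates technical
    # process risk; larger gaps warrant a Risk Information Form.
    for template, best in best_practice_baseline.items():
        variance = program_baseline[template] - best
        if variance >= 2:
            print(f"Document on RIF: {template} (variance {variance})")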

5.4.3 Program Documentation Evaluation Risk Identification

5.4.3.1 Description. This technique provides a methodology for comparing key program documents and plans to ensure that they are consistent and traceable to one another. Program documents and plans are hierarchical in nature. If the contents (activities, events, schedules, requirements, specifications, etc.) of a document or plan do not flow from or support the contents of those above, below, or adjacent to it, there is a strong chance that risk will be introduced into the program or that known risks will not be adequately addressed. This technique reduces those risks and improves the quality of program documentation.

Figure 5-5. Plan Evaluation Technique Input and Output

Inputs: program plans; requirements documents; other program documents; WBS; SOW; baselines.
Activities: evaluate each document; evaluate the correlation among documents.
Outputs: list of documentation inconsistencies; Risk Information Forms.
Participants: PMO team.

This technique can be used in any acquisition phase as documents or plans are being developed or updated. The comparison of program documentation and plans should be performed by a small team of experienced, knowledgeable personnel who are intimately familiar with the total program.

5.4.3.2 Procedures. Figure 5-5 shows the process used in this technique. The primary inputs to the process are the PMO documents that detail the steps involved in executing the program. These include, for example, the Mission Need Statement (MNS), Operational Requirements Document (ORD), acquisition plan, any master management plan, Test and Evaluation Master Plan (TEMP), manufacturing plan, etc. Another set of key input documents are those used to communicate with the prime contractor, e.g., the WBS, specifications, and Statement of Work (SOW) or equivalent, such as a Statement of Objectives. Before any comparison, the PMO should review all documents for accuracy and completeness. Figure 5-6 shows an example of the type of correlation that should exist among the MNS, ORD, and TEMP during the CTD Phase.

If the comparison shows any gaps or inconsistencies, reviewers should identify them as possible risks on a RIF, the output of this process.
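In practice, much of this comparison reduces to traceability checks. A minimal sketch in Python follows, assuming each document has been reduced to the set of requirement identifiers it addresses; the document pair, identifiers, and check shown are hypothetical illustrations of the kind of gap a review team looks for.

    # Hypothetical requirement identifiers extracted from two documents.
    ord_requirements = {"R-101", "R-102", "R-103"}   # stated in the ORD
    temp_test_coverage = {"R-101", "R-103"}          # requirements the TEMP tests

    # Requirements with no planned test are documentation gaps and
    # candidate entries on a Risk Information Form.
    untested = ord_requirements - temp_test_coverage
    for req in sorted(untested):
        print(f"Possible risk: ORD requirement {req} has no TEMP coverage")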

5.4.4 Threat and Requirements Risk Assessment

5.4.4.1 Description. This technique describes an approach to assess risks associated with requirements and threat, and to identify the requirements and threat elements that are risk drivers. Because operational needs, environmental demands, and threat determine system performance requirements to a large degree, they are a major factor in driving the design of the system and can introduce risk into a program. Further, with the introduction of CAIV, PMs and users are directed to examine performance requirements and identify areas that are not critical and are available for trade to meet cost objectives. Risk is a factor in CAIV considerations.

The requirements risk assessment process focuses on: determining if operational requirements are properly established and clearly stated for each

Figure 5-6. Concept and Technology Development (CTD) Phase Correlation of Selected Documents (Example)

The figure links the MNS, ORD, and TEMP through three correlation questions: Does the ORD satisfy the needs specified in the MNS? Will testing determine if mission needs are satisfied? Are high-risk performance specifications being tested in a manner that supports risk reduction?

program phase; ensuring that requirements are stable and the operating environment is adequately described; addressing logistics and suitability needs; and determining whether requirements are so constrictive that they dictate a specific solution. The threat risk assessment process addresses: uncertainty in threat accuracy and stability, sensitivity of design and technology to threat, vulnerability of the system to threat countermeasures, and vulnerability of the program to intelligence penetration. PMs should view requirements in the context of the threat and should accurately reflect operational, environmental, and suitability requirements in design documents.

PMs should use threat and requirements assessments during the early phases of program development and, as necessary, as the program advances through development. Early and complete understanding of the requirements and threat precludes misunderstandings between the requirements and development communities, helps to identify risk areas, and allows early planning to handle risk. Consequently, the user should be actively involved in this process from the beginning.

5.4.4.2 Procedures. Figure 5-7 depicts the process used in this technique. The basic approach is to conduct a thorough review of the documents containing performance requirements and threat information, e.g., the ORD, TEMP, System Specification, System Threat Assessment (STA), Design Reference Mission Profile, etc., to determine the stability, accuracy, operating environment, and logistics and suitability requirements, and the consistency between these requirements and the threat considerations cited above. There should be an understanding between the users and the developers on Key Performance Parameters (KPPs) in order to identify the requirements that are most important and critical to program success. The Design Reference Mission Profile and Design Requirements templates in DoD 4245.7-M, and the Program Documentation Evaluation Risk Identification technique, may be useful in support of this technique.

Figure 5-7. Threat and Requirement Risk Assessment Technique Input and Output

Inputs: MNS; ORD; STA; TEMP; past project data; concept development studies; test and simulation results; functional baseline.
Activities: extract critical requirements and threat areas to be assessed; assess technical maturity and complexity of system concepts; evaluate requirements and threat process maturity; identify, analyze, and evaluate requirements and threat risks.
Outputs: Risk Information Forms; prioritized list of risks; list of aggregated risks; watch list.
Participants: Government-industry evaluation team; subject-matter experts.


Requirements should be thoroughly reviewed to identify those that drive performance. This will require the "flow down" of performance requirements to components and subassemblies and the identification of technologies/techniques to be used in these components/subassemblies that may significantly affect the system's ability to meet users' needs.

Designers should determine the sensitivity of system performance to the requirements and threat, and identify the risk drivers. Models and simulations are useful tools to determine this sensitivity. For example, the U.S. Army Materiel Systems Analysis Activity (AMSAA) has such an analytic model, the AMSAA Risk Assessment Methodology.

The PMWS can also be useful. The risks identified with this technique should be documented in a program-specific format, such as a RIF (see Appendix B).

5.4.5 Cost Risk Assessment

5.4.5.1 Description. This technique provides a program-level cost estimate at completion (EAC) that is a function of performance (technical) and schedule risks. It uses the results of previous assessments of WBS elements and the cost probability distributions developed for each of those elements. The individual WBS element distributions are aggregated using a Monte Carlo simulation to obtain a program-level cost EAC probability distribution function. These results are then analyzed to determine the actual risk of cost overruns and to identify the cost drivers.

The use of these cost probability distributions as the basis for the program-level cost estimate results in a more realistic EAC than the commonly used single point estimates for WBS elements, since the distributions address both the probability of occurrence and the consequences/impacts of potential risk events. Their use also eliminates a major cause of underestimating (the use of point estimates) and permits the evaluation of performance (technical) or schedule causes of cost risk. Thus, this technique provides a basis for the determination of an "acceptable" level of cost risk.

This technique can be used in any of the acquisition phases, preferably at least once per phase beginning in the CTD Phase, although in some cases suitable data may not exist until the System Integration (SI) part of the SDD Phase. It should be used in conjunction with performance (technical) and schedule risk assessments and may be performed by small Government-industry teams consisting of risk analysts, cost analysts, schedule analysts, and technical experts who understand the significance of previous performance and schedule risk assessments. These teams should report to the Program IPT. This technique requires close and continuous cooperation among cost analysts and knowledgeable technical personnel, as well as the support of the prime contractor's senior management to help obtain valid cost data.

5.4.5.2 Procedures. Figure 5-8 depicts the process used in applying this technique. The first step is to identify the lowest WBS level for which cost probability distributions will be constructed. The level selected will depend on the program phase; e.g., during the CTD Phase, it may not be possible to go beyond level 2 or 3, simply because the WBS has not yet been developed to lower levels. As the program advances into subsequent phases and the WBS is expanded, it will be possible and necessary to go to lower levels (4, 5, or lower). Specific performance (technical) and schedule risks are then identified for these WBS elements.

To develop the WBS element cost probability distributions, the team, working with the prime contractor's WBS element managers, determines the cost range for each element being investigated. The cost range encompasses cost estimating uncertainty, schedule risk, and technical risk. The validity of the cost data used to construct the distributions is critical. In fact, collecting good data is the largest part of the cost risk job. Consequently, PMOs should place major emphasis on this effort.

The element cost probability distributions are aggregated and evaluated using a Monte Carlo simulation program. All Monte Carlo processes contain limitations, but they are more informative than point estimates. A number of these simulations are readily available to perform this aggregation, and one that meets the specific needs of the program should be selected. The results of this step will be a program-level cost EAC and a cost distribution that shows the cumulative probability associated with different cost values. These outputs are then analyzed to determine the level of cost risk and to identify the specific cost drivers. Cost risk is determined

by comparing the EAC with the cost baseline developed as part of the acquisition program baseline. Since the EAC and program cost distribution are developed from WBS element risk assessments, it is possible to determine the cost risk drivers. The cost drivers can also be related back to the appropriate performance and schedule risks. The results of the analysis (cost risks and drivers) should be documented in RIFs.
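To make the aggregation step concrete, the following is a minimal sketch in Python of a Monte Carlo roll-up, assuming triangular (low/most likely/high) cost ranges for three notional WBS elements. The element names, dollar values, trial count, and the 80 percent confidence level are illustrative assumptions; an actual program would use the element distributions developed with the contractor and a simulation package selected for the purpose.

    import random

    wbs_cost_ranges = {                      # $M: (low, most likely, high)
        "1.1 Air Vehicle":      (120.0, 150.0, 210.0),
        "1.2 Propulsion":        (45.0,  55.0,  80.0),
        "1.3 Training System":   (10.0,  12.0,  20.0),
    }

    trials = 10000
    totals = []
    for _ in range(trials):
        # random.triangular takes (low, high, mode).
        total = sum(random.triangular(lo, hi, ml)
                    for lo, ml, hi in wbs_cost_ranges.values())
        totals.append(total)

    totals.sort()
    mean_eac = sum(totals) / trials
    p80 = totals[int(0.80 * trials)]         # 80th-percentile cost
    print(f"Mean EAC: ${mean_eac:.1f}M; 80% confidence cost: ${p80:.1f}M")

Comparing the resulting percentile costs against the cost baseline in the acquisition program baseline indicates the level of cost risk.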

5.4.6 Quantified Schedule Risk Assessment

5.4.6.1 Description. This technique provides a means to determine program-level schedule risk as a function of the risk associated with the various activities that compose the program. It estimates the program-level schedule by developing probability distributions for each activity duration and aggregating these distributions using a Monte Carlo simulation or other analytical tools. The resulting program-level schedule is then analyzed to determine the actual schedule risk and to identify the schedule drivers.

Figure 5-8. Cost Risk Assessment Top-Level Diagram

Inputs: performance risks (including technical risk); schedule risks; cost risk; WBS; prior cost analysis results.
Activities: identify key performance and schedule risks; select cost analysis level of detail; construct WBS-level probability distributions; analyze Monte Carlo simulation results.
Outputs: EAC; cost risk; cost risk baseline; Risk Information Forms; list of aggregated risks (by type of cost); watch list; program cost estimate probability density function (PDF); program cost estimate cumulative distribution function (CDF).
Participants: program cost estimating team; industry WBS element managers.

This technique expands the commonly used Critical Path Method (CPM) of developing a program schedule to obtain a realistic estimate of schedule risk. The basic CPM approach uses single point estimates for the duration of program activities to develop the program's expected duration and schedule. It invariably leads to underestimating the time required to complete the program, and to schedule overruns, primarily because the point estimates do not adequately address the uncertainty inherent in individual activities. That uncertainty can be caused by a number of factors and may be a reflection of the risk present in the activity.

The quantified schedule technique accounts for uncertainty by using a range of time that it will take to complete each activity instead of single point estimates. These ranges are then combined to determine the program-level schedule estimate. This approach enables PMs to estimate early in a program whether there is a significant probability/likelihood of overrunning the program schedule, and by how much. It also identifies high-risk program activities that may or may not be on the program "critical path."

This technique can be used in any acquisition phase, beginning with the completion of the first statement of work. The schedule probability distribution function for each key activity should be developed as soon as the activity is included in the master schedule. The distribution functions should be periodically reviewed and revised, if necessary, at least once per phase. The technique should be applied by a small Government-industry team consisting of schedule analysts and technical experts who understand the significance of prior performance risk assessments.

5.4.6.2 Procedures. Figure 5-9 shows the process used in this technique. The first step is to identify the lowest activity level for which duration/schedule probability distribution functions will be constructed. The WBS should be used as the starting point for identifying activities and constructing a network of activities. The WBS level selected will depend on the program phase.

Next, the contractor should construct a CPM schedule for these activities. To develop the

Figure 5-9. Schedule Risk Assessment Technique Input and Output

Inputs: performance risks; probability density functions of WBS elements; prior schedule analysis results; schedule risks external to the program.
Activities: identify activities and develop relationships between them; identify key performance risks; construct probability distributions; analyze Monte Carlo simulation results.
Outputs: schedule duration; schedule risk; schedule risk baseline; Risk Information Forms; list of aggregated risks (critical path versus non-critical path); watch list; program schedule probability density function (PDF); program schedule cumulative distribution function (CDF).
Participants: program schedule estimating team; industry WBS element managers.

activity duration probability distribution functions, the team, working with the prime contractor's WBS element managers, determines and analyzes the duration range for each activity being investigated. This analysis should be done by schedule analysts working closely with knowledgeable technical people.

The activity duration probability distributions are aggregated using a Monte Carlo simulation program, such as @Risk, Risk+ for Microsoft Project, or Crystal Ball. The result of this step is a program-level schedule and a distribution function that shows the cumulative probability associated with different duration values. These outputs are then analyzed to determine the level of schedule risk and to identify the specific schedule drivers. Risk is determined by comparing the program-level schedule with the deterministic schedule baseline developed as part of the acquisition program baseline. The fact that the schedule and distribution are developed from WBS element risk assessments makes it possible to determine the schedule risk drivers. These drivers can also be related back to the appropriate performance risks. The results of the analysis (schedule risks and drivers) should be documented in RIFs. The analysis requires continued close cooperation between the schedule analysts and technical personnel familiar with the details of the program.
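The same Monte Carlo machinery applies here; only the aggregation rule changes, because durations combine along network paths rather than as a simple sum. Below is a minimal sketch in Python for a three-activity network (A then B, with C in parallel, so the finish time is the longer of the two paths). The activities, duration ranges, and baseline are notional assumptions for illustration; a real analysis would run against the full CPM network in one of the scheduling tools named above.

    import random

    # (low, most likely, high) duration estimates per activity, in days.
    durations = {"A": (20, 25, 40), "B": (30, 35, 60), "C": (45, 50, 90)}

    trials = 10000
    finishes = []
    for _ in range(trials):
        d = {k: random.triangular(lo, hi, ml)      # (low, high, mode)
             for k, (lo, ml, hi) in durations.items()}
        finishes.append(max(d["A"] + d["B"], d["C"]))   # network finish time

    baseline = 85   # deterministic CPM baseline (days), notional
    overrun_prob = sum(f > baseline for f in finishes) / trials
    print(f"Probability of exceeding the {baseline}-day baseline: {overrun_prob:.0%}")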

5.4.7 Expert Interviews

5.4.7.1 Description. A difficult part of the risk management process is data gathering. This technique provides a means for collecting risk-related data from subject-matter experts and from people who are intimately involved with the various aspects of the program. It relies on "expert" judgment to identify and analyze risk events, develop alternatives, and provide "analyzed" data. It is used almost exclusively in a support role to help develop technical data, such as probability and consequences/impacts information, required by a primary risk assessment technique. It can address all the functional areas that make up the critical risk areas and processes, and can be used in support of risk handling.

Expert judgment is a sound and practical way of obtaining necessary information that is not available elsewhere or is not practical to develop using engineering or scientific techniques. However, interviewers should be aware that expert opinions may be biased because of over-reliance on certain information and neglect of other information; unwarranted confidence; the tendency to recall the most frequent and most recent events; a tendency to neglect rare events; and motivation. Results may have to be tempered because of these biases.

5.4.7.2 Procedures. Figure 5-10 depicts the process used in this technique. The first step in the process is to identify the risk areas and processes that are to be evaluated using the expert interview technique. Other techniques described in this section (e.g., WBS Risk Assessment, Process Risk Assessment, etc.) can be used for this purpose.

Once the areas and processes are known, subject-matter experts and program/contractor personnel knowledgeable of the areas and processes should be identified to be interviewed. Similarly, qualified interviewers should be selected for each area and process.

Interviewers should prepare by developing an interview strategy and selecting a methodology for the analysis and quantification of the data. The references list sources of practical techniques for quantifying expert judgment. (See Appendix D for additional guidance in this area.)

After the interview, evaluators analyze the data for consistency, resolve any issues, and document the results. Commercial "Groupware" software is available to assist in compiling and documenting the results of interviews.


5.4.8 Analogy Comparison/Lessons-Learned Studies

5.4.8.1 Description. This technique uses lessons learned and historical information about the risks associated with programs similar to the new system to identify the risks associated with a new program. It is normally used to support other primary risk assessment techniques, e.g., Product (WBS) Risk Assessment, Process Risk Assessment, etc. The technique is based on the concept that "new" programs originate or evolve from existing programs or simply represent a new combination of existing components or subsystems. This technique is most appropriate when systems engineering and systems integration issues, plus software development, are minimal. A logical extension of this premise is that key insights can be gained concerning aspects of a current program's risks by examining the successes, failures, problems, and solutions of similar existing or past programs. This technique addresses all the functional areas that make up the critical risk areas and processes.

5.4.8.2 Procedures. Figure 5-11 depicts the process used in this technique. The first step in this approach is to select or develop a baseline comparison system (BCS) that closely approximates the characteristics of the new system/equipment, to as low a level as possible, and that uses processes similar to those needed to develop the new system. For processes, industry-wide best practices should be used as the baseline. The PMWS is a useful tool for identifying these best practices.

Relevant BCS data are then collected, analyzed, and compared with the new system requirements. The BCS data may require adjustment to make a valid comparison; for example, applying appropriate inflation indices for cost comparisons, adjusting the design schedule for software evolution versus software development, etc. The comparisons can be a major source of risk assessment data and provide some indication of areas that should be investigated further. This technique is especially useful as a front-end analysis for a new-start program.

Figure 5-10. Expert Interview Technique Input and Output

Inputs: critical risk areas; critical risk processes; WBS; Integrated Master Schedule (IMS); related analyses, i.e., cost analysis, technical risk analyses, etc.; primary risk assessment technique requirements and constraints.
Activities: identify, select, and verify subject/area experts; develop interview material; prepare for the interview; conduct the interview; analyze the results.
Outputs: qualitative judgments; quantitative judgments; interview reports; Risk Information Forms; list of aggregated risks; watch list.
Participants: Government-industry evaluation team; subject/area experts.

5.5 RISK PRIORITIZATION

5.5.1 Description

This technique provides a means to prioritize the risks present in a program. It is a part of risk analysis. The prioritized list provides the basis for developing handling plans, preparing a handling task sequence list, and allocating handling resources.

When using this technique, PMs establish definitive criteria to evaluate the risks, such as the probability/likelihood of failure (PF) and the consequence/impact of failure (CF), along with any other factors considered appropriate. The risks are evaluated using qualitative expert judgment and multi-voting methods to prioritize and aggregate risks. (See References: SEI, Continuous Risk Management, 1996, for a discussion of multi-voting methods.)

A qualitative approach using subject-matter experts is generally preferred in this technique because of the tendency to rely on ordinal values to describe PF and CF, and because of the inherent inaccuracies that result from any attempt to use quantitative methods derived from raw (uncalibrated) ordinal scales.

This technique should be used, as appropriate, during the CTD Phase and the SI and SD parts of the SDD Phase; at the conclusion of a major risk assessment undertaking; when there has been a significant change in the acquisition strategy; when risk monitoring indicates significant changes in the status of a number of risks; and prior to a milestone review.

The PMO risk management coordinator (if assigned) may function as a facilitator and support the program IPT in applying this technique.

Figure 5-11. Analogy Comparison/Lessons-Learned Studies Top-Level Diagram

Inputs: description and program characteristics of the new system and its components; information needs; primary risk assessment technique requirements and constraints.
Activities: identify baseline comparison system (BCS) candidates; break the new system down into logical components for comparison; select the BCS; collect detailed data on the BCS and the new system; analyze the BCS and new system data; conduct comparisons and develop conclusions.
Outputs: risk assessment data (risk areas for further investigation).
Participants: Government-industry evaluation team.

5.5.2 Procedures

Figure 5-12 depicts the process used to prioritize the risks present in a program. The inputs to this process are the risks that have been identified.

The evaluation team, through consensus or as directed by the Risk Management Plan, selects the prioritization criteria. PF and CF should always be part of the criteria, along with any other appropriate factors. Urgency, an indication of the time available before the procedures for handling the specific risk must be initiated, is often considered in the evaluation. The PM may also choose to rank-order the prioritization criteria, e.g., consequence/impact is more important than probability.

A multi-voting method is useful to prioritize risks (see References: Scholtes, 1988; Linstone, 1975). The Delphi method is a simple and effective way of arriving at a consensus among a group of experts. The procedure is for team members to vote on the priority of each risk and tally the results, which are fed back to the team. Team members vote again, and the process is repeated until no changes occur in the results. The final outcome is normally reached within a few voting sessions. If there are a large number of risks, they may be broken into smaller groups for ranking. As a general rule, no more than 10 items should be prioritized per vote. The results of the series of votes are documented in the prioritized list of risks.
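The tally itself is mechanical. Below is a minimal sketch in Python of one round of such a vote, using a Borda-style count (each risk earns more points the higher a member ranks it); the ballots and risk titles are notional, and Borda scoring is one common tally, not a method prescribed by this Guide. In an actual Delphi session, the tallied order would be fed back to the team and the vote repeated until the order stabilizes.

    from collections import defaultdict

    risks = ["Aircraft weight", "Engine qualification", "Software integration"]

    # Each ballot ranks the risks: position 0 = highest priority.
    ballots = [
        ["Aircraft weight", "Software integration", "Engine qualification"],
        ["Aircraft weight", "Engine qualification", "Software integration"],
        ["Software integration", "Aircraft weight", "Engine qualification"],
    ]

    # Borda-style scoring: higher rank earns more points.
    scores = defaultdict(int)
    for ballot in ballots:
        for position, risk in enumerate(ballot):
            scores[risk] += len(risks) - position

    # Report the tallied order back to the team.
    for risk, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{score:2d}  {risk}")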

PM guidance, which operates as a technique control function, can be used, for example, to specify prioritization criteria and to prescribe the format of the prioritized list of risks.

5.5.2.1 Risk Aggregation. Figure 5-13 shows the process for this technique, which relies on qualitative judgment and multi-voting methods to summarize risks at the critical risk area and process level in terms of PF and CF. The risks identified in the RIFs and the prioritized list of risks are first grouped according to critical risk areas and processes and listed in priority sequence.

Within each area and process, the individual risks are evaluated against a set of established criteria to determine the overall aggregate risk rating for the area/process. Aggregation criteria need to be established separately for PF and CF; PF and CF should not be combined into a single index,

Figure 5-12. Risk Prioritization Technique Input and Output

Inputs: Risk Information Forms. PM guidance governs the process.
Activities: review risks for understanding; select prioritization criteria; select voting criteria; vote.
Outputs: prioritized list of risks.
Participants: Program-Level IPT; risk management coordinator.

e.g., an overall "moderate risk" label. Examples of aggregation criteria include: (1) the most undesirable PF and CF of all the risks within a risk area or process become the aggregated values for the area or process, or (2) the PF and CF for each area or process represent the mean values for that area or process.

The team then votes on each risk area and process to determine its rating for PF and CF, and the results are documented. In addition to the PF and CF ratings for each critical risk area and process, the risks that tend to "drive" the aggregate risk rating for the area/process (e.g., all risks in which either PF or CF is rated as high) should be included in a list of aggregated risks to give substance to the aggregated ratings. Figure 5-14 provides a sample list of aggregated risks.

Figure 5-14. List of Aggregated Risks

Program XY Risk Status

Risk Area Status: Design PF: Hi CF: Hi

Significant Design Risks:

1. Risk Title: Aircraft Weight PF: Hi CF: Hi

Risk Event: Exceed aircraft weight budget by 10%. Decrease range-payload by 4%.

Action: Developing risk-handling plan. User reviewing requirements.

Risk Area Status: Logistics PF: Hi CF: Mod/Hi

Significant Logistics Risks: etc.

Figure 5-13. Risk Aggregation Technique Input and Output

Inputs: Risk Information Forms; risk prioritization list. PM guidance governs the process.
Activities: sort risks by critical risk areas and processes; review risks for understanding; select aggregation criteria; select voting criteria; vote; identify reporting criteria.
Outputs: list of aggregated risks.
Participants: Program-Level IPT; risk management coordinator.
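The roll-up under aggregation criterion (1) above is straightforward to mechanize. The following is a minimal sketch in Python: the most undesirable PF and CF among a notional set of RIF entries become the area's aggregated values, kept as two separate indices rather than one combined score, and the high-rated risks are carried along as the drivers. All names and ratings are illustrative.

    # Ordinal ordering of the ratings, used only for comparison.
    ORDER = {"Low": 0, "Moderate": 1, "High": 2}

    rifs = [
        {"area": "Design",    "risk": "Aircraft weight",     "PF": "High",     "CF": "High"},
        {"area": "Design",    "risk": "Radar cross-section", "PF": "Moderate", "CF": "High"},
        {"area": "Logistics", "risk": "Spares availability", "PF": "High",     "CF": "Moderate"},
    ]

    areas = {}
    for r in rifs:
        a = areas.setdefault(r["area"], {"PF": "Low", "CF": "Low", "drivers": []})
        # Keep PF and CF as separate indices; never combine them.
        a["PF"] = max(a["PF"], r["PF"], key=ORDER.get)
        a["CF"] = max(a["CF"], r["CF"], key=ORDER.get)
        if "High" in (r["PF"], r["CF"]):
            a["drivers"].append(r["risk"])     # risks that drive the rating

    for name, a in areas.items():
        print(f"{name}: PF={a['PF']} CF={a['CF']}; drivers: {', '.join(a['drivers'])}")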

Risk Matrix is a software tool designed to aid in managing the identification, rating, and prioritization of key risks that might affect a project. It provides a structured method for prioritizing project risks and for tracking the status and effects of risk-handling efforts. The major feature that Risk Matrix offers the program office is a means to both rate and rank program risks. This is helpful in differentiating among risks that have the same rating. For example, if a program has eight risks that the program office has evaluated/rated as high, Risk Matrix provides the means to rank them in order of severity. The user can use this ranking as a guide to help focus risk-handling efforts. Risk Matrix was developed by the Air Force Electronic Systems Center (ESC) and The MITRE Corporation and is available to program offices at no cost. Another useful software tool for voting on risks is Expert Choice, which is based on the Analytic Hierarchy Process (AHP). Whatever software tool is used, the analyst should recognize that a number of inherent limitations exist with such tools (e.g., unintentionally biasing the voting process) that can lead to erroneous results.
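The rate-then-rank idea can be illustrated with a small sketch in Python: three risks all rated High are further ordered by a rank aggregation across separate probability and consequence estimates (summing each risk's rank under the two criteria, so the raw values are never multiplied together). The data are notional, and this is one plausible tie-breaking scheme for illustration, not the algorithm of the Risk Matrix tool itself.

    risks = {
        # title: (probability estimate, consequence estimate), notional
        "Aircraft weight":      (0.7, 5),
        "Engine qualification": (0.9, 4),
        "Software integration": (0.8, 5),
    }

    def ranks(values):
        # Higher value gets a better (lower) rank number.
        order = sorted(values, key=values.get, reverse=True)
        return {k: i for i, k in enumerate(order)}

    p_rank = ranks({k: v[0] for k, v in risks.items()})
    c_rank = ranks({k: v[1] for k, v in risks.items()})

    # A risk's combined count is the sum of its ranks; smaller = more severe.
    combined = {k: p_rank[k] + c_rank[k] for k in risks}
    for i, k in enumerate(sorted(combined, key=combined.get), 1):
        print(f"{i}. {k} (rank sum {combined[k]})")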

5.6 RISK-HANDLING TECHNIQUES

5.6.1 General (e.g., Moderate and High Risk-Rated Items)

After the program's risks have been assessed, the PM must develop approaches to handle the significant ones by analyzing the various handling techniques and selecting those best fitted to the program's circumstances. The PM should reflect these approaches in the program's acquisition strategy and include the specifics of what is to be done to deal with the risk, when it should be accomplished, who is responsible, and the cost and schedule impact.

As described in Chapter 2, there are essentially four risk-handling techniques, or options. Risk avoidance eliminates the sources of high risk and replaces them with a lower-risk solution. Risk transfer is the reallocation of risk from one part of the system to another, or the reallocation of risk between the Government and the prime contractor or within Government agencies. Risk control manages the risk in a manner that reduces the probability/likelihood of its occurrence and/or minimizes the risk's effect on the program. Risk assumption is the acknowledgment of the existence of a particular risk situation and a conscious decision to accept the associated level of risk without engaging in any special efforts to control it. There is a tendency on many programs to select "control" as the risk-handling option without seriously evaluating assumption, avoidance, and transfer. This is unwise, since control may not be the best, or even an appropriate, option in some cases. An unbiased assessment of risk-handling options should be performed to determine the most appropriate option.

In determining the "best" overall risk-handling strategy to be adopted, a structured approach should be taken.

A structured approach for developing a risk-handling strategy has been described by Dr. Edmund Conrow in his book Effective Risk Management: Some Keys to Success (see References). A risk-handling strategy is composed of the selected risk-handling option and the specific implementation activity. The risk-handling option is chosen first; then the best implementation activity is picked for the selected option. This avoids a common mistake: choosing the implementation activity without first evaluating all four (generic) risk-handling options. In cases where a relatively high risk exists, or where other circumstances dictate, one or more backup risk-handling strategies may be needed. In those cases, the selection process is used again to choose the option and implementation activity. The backup strategy may employ a different option than the primary risk-handling strategy, and it will certainly have a different implementation activity.
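The two-step selection can be expressed as a simple weighted evaluation, sketched below in Python against the criteria discussed after this example (feasibility, cost and schedule implications, and effect on technical performance). The four options come from the Guide; the weights, scores, and candidate activities are notional assumptions, and a weighted score is only one possible way to structure the comparison.

    # Notional criterion weights for one specific risk event.
    criteria_weights = {"feasibility": 0.4, "cost_schedule": 0.3, "performance": 0.3}

    option_scores = {   # 1 (poor) to 5 (good) against each criterion, notional
        "assumption": {"feasibility": 5, "cost_schedule": 4, "performance": 2},
        "avoidance":  {"feasibility": 2, "cost_schedule": 2, "performance": 4},
        "transfer":   {"feasibility": 3, "cost_schedule": 3, "performance": 3},
        "control":    {"feasibility": 4, "cost_schedule": 3, "performance": 5},
    }

    def weighted(scores):
        return sum(criteria_weights[c] * s for c, s in scores.items())

    # Step 1: evaluate all four options before considering activities.
    best_option = max(option_scores, key=lambda o: weighted(option_scores[o]))

    # Step 2: choose an implementation activity for the selected option.
    activities = {"control": ["early prototyping", "technology maturation"],
                  "assumption": ["fund schedule reserve"],
                  "avoidance": ["adopt lower-risk design"],
                  "transfer": ["reallocate requirement to another segment"]}
    print(f"Selected option: {best_option}; candidate activities: {activities[best_option]}")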

For each evaluated risk event, all potentially applicable options or techniques should be identified and evaluated using the following criteria:

• Feasibility – Feasibility is the ability to implement the handling technique/option and includes an evaluation of the potential impact of the technique/option in the following areas:

– Technical considerations, such as testing, manufacturing, and maintainability, caused by design changes resulting from risk-handling techniques.

– Adequacy of budget and schedule flexibility to apply the technique.

– Operational issues, such as usability (man-machine interfaces), transportability, and mobility.

– Organizational and resource considerations, e.g., manpower, training, and structure.

– Environmental issues, such as the use of hazardous materials to reduce technical risk.

– External considerations beyond the immediate scope of the program, such as the impact on other complementary systems or organizations.

• Cost and schedule implications – The risk-handling techniques have a broad range of cost implications in terms of dollars, as well as other limited resources, e.g., critical materials and national test facilities. The magnitude of the cost and schedule implications will depend on circumstances and can be assessed using such techniques as cost-benefit analyses and the cost and schedule assessment techniques previously described. The approval and funding of risk-handling techniques should be part of the trade-off process that establishes and refines the CAIV cost and performance goals.

• Effect on the system's technical performance – The risk-handling techniques may affect the system's capability to achieve the required technical performance objectives. This impact must be clearly understood before adopting a specific technique. As the risk-handling techniques are assessed, the PMO should attempt to identify any additional parameters that may become critical to technical performance as a result of implementing them. Trade studies and sensitivity analyses can be useful in determining the expected effectiveness of this approach.

Once the risk-handling technique is selected, a set of program management indicators should be developed to provide feedback on program progress, the effectiveness of the selected risk-handling options, and the information necessary to manage the program. These indicators should consist of cost and scheduling data, technical performance measures, and program metrics.

Subsequent paragraphs in this section describe the various risk-handling techniques: Risk Control, Avoidance, Assumption, and Transfer (CAAT).

5.6.2 Risk Control

5.6.2.1 Description. In this risk-handling technique, the Government and the contractor take active steps to reduce the probability/likelihood of a risk event occurring and to reduce the potential impact of the risk on the program. The common name for the control option is "mitigation." Most risk-control steps share two features: they require a commitment of program resources, and they may require additional time to accomplish.

Thus, the selection of risk-control actions will undoubtedly require some tradeoff between resources and the expected benefit of the actions. Risk-control actions include the following:

Multiple Development Efforts – The use of two or more independent design teams (usually two separate contractors, although it could also be done internally) to create competing systems in parallel that meet the same performance requirements.

Alternative Design – Sometimes, a design option may include several risky approaches, of which one or more must come to fruition to meet system requirements. However, if the PMO studies the risky approaches, it may be possible to discover a lower-risk approach (with a lower performance capability). These lower-risk approaches could be used as backups for those cases where the primary approach(es) fail to mature in time. This option presumes there is some trading room among requirements. Close coordination between the developer and the user is necessary to implement lower capability options.

Trade Studies – Systems engineering decision analysis methods include trade studies to solve a complex design problem. The purpose of the trade studies is to integrate and balance all engineering requirements in the design of a system. A properly done trade study considers the risks associated with alternatives.

Early Prototyping – The nature of a risk can be evaluated by a prototype of a system (or its critical elements) built and tested early in the system development. The results of the prototype can be factored into the design and manufacturing process requirements. In addition to full-up systems, prototyping is very useful in software development and in determining a system’s man-machine interface needs. The key to making prototyping successful as a risk-control tool is to minimize the addition of new requirements to the system after the prototype has been tested (i.e., requirement changes not derived from experience with the prototype). Also, the temptation to use the prototype design and software without doing the necessary follow-on design and coding/manufacturing analyses should be avoided.

Incremental Development – Incremental development is completion of the system design and deployment in steps, relying on pre-planned product improvements (P3I) or software improvements after the system is deployed to achieve the final system capability. Usually, these added capabilities are not included originally because of the high risk that they will not be ready along with the remainder of the system. Hence, development is split, with the high-risk portion given more time to mature. The basic system, however, incorporates the provisions necessary to include the add-on capabilities. In incremental development, the initial system requirements are achieved by the basic system.

Technology Maturation Efforts – Technology maturation is an off-line development effort to bring an element of technology to the necessary level so that it can be successfully incorporated into the system (usually done as part of the technology transition process). Normally, technology maturation is used when the desired technology will replace an existing technology, which is available for use in the system. In those cases, technology maturation efforts are used in conjunction with P3I efforts. However, it can also be used when a critical, but immature, technology is needed. In addition to dedicated efforts conducted by the PMO, Service or DoD-wide technology improvement programs and advanced technology demonstrations by Government laboratories as well as industry should be considered.


Robust Design – This approach uses advanced design and manufacturing techniques that promote achieving quality through design. It normally results in products with little sensitivity to variations in the manufacturing process.

Reviews, Walk Throughs, and Inspections – These three risk-control actions can be used to reduce the probability/likelihood and potential consequences/impacts of risks through timely assessments of actual or planned events in the development of the product. They vary in the degree of formality, level of participants, and timing.

Reviews are formal sessions held to assess the status of the program, the adequacy and sufficiency of completed events, and the intentions and consistency of future events. Reviews are usually held at the completion of a program phase, when significant products are available. The team conducting the review should have a set of objectives and specific issues to be addressed. The results should be documented in the form of action items to be implemented by the PMO or contractor. The type of review will dictate the composition of the review team, which may include developers, users, managers, and outside experts.

A walk through is a technique that can be very useful in assessing the progress in the development of high- or moderate-risk components, especially software modules. It is less formal than a review, but no less rigorous. The person responsible for the development of the component “walks through” the product development (to include perceptions of what is to be done, how it will be accomplished, and the schedule) with a team of subject-matter experts. The team reviews and evaluates the progress and plans for developing the product and provides immediate and less formal feedback to the responsible person, thus enabling improvements or corrective actions to be made while the product is still under development. This technique is applied during the development phases, as opposed to reviews, which are normally held at the completion of a phase or product.

Inspections are conducted to evaluate the correctness of the product under development in terms of its design, implementation, test plans, and test results. They are more formal and rigorous than either reviews or walk throughs and are conducted by a team of experts following a very focused set of questions concerning all aspects of the product.

Design of Experiments – This is an engineering tool that identifies critical design factors that are difficult to meet.

Open Systems – This approach involves the use of widely accepted commercial specifications and standards for selected system interfaces, products, practices, and tools. It provides the basis for reduced life-cycle costs, improved performance, and enhanced interoperability, especially for long-life systems with short-life technologies. Properly selected and applied commercial specifications and standards can result in lower risk through increased design flexibility; reduced design time; more predictable performance; and easier product integration, support, and upgrade. However, a number of challenges and risks are associated with the use of the open systems approach and must be considered before implementation. These include such issues as: the maturity and acceptability of the standard, and its adequacy for military use; the loss of control over the development of products used in the system; the amount of product testing done to ensure conformance to standards; and the higher configuration management workload required.

See Defense Acquisition Deskbook Section 1.2.2.2.5 for a more detailed discussion of the use of open systems. (Additional information is also available at the Open Systems Joint Task Force Website at www.acq.osd.mil/osjtf/.)

Use of Standard Items/Software Module Reuse – The use of standard items and software module reuse should be emphasized to the extent possible to minimize development risk. Standard items range from components and assemblies to full-up systems. A careful examination of the proposed system option will often find more opportunities for the use of standard items or existing software modules than first considered. Even when the system must achieve unprecedented requirements, standard items can find uses. A strong program policy emphasizing the use of standard items and software reuse is often the key to taking advantage of this source of risk control. Standard items and software modules have proven characteristics that can reduce risk. However, the PMO must be cautious when using standard items in environments and applications for which they were not designed. A misapplied standard item often leads to problems and failure. Similarly, if the life cycle of a fielded product extends for many years, it is possible that key software tools and products will become obsolete or will no longer be supported. If this occurs, costly redesign may result if software re-development is necessary.

Two-Phase Development – This risk-control approach incorporates a formal risk-reduction effort in the initial part of the SDD phase. It may involve using two or more contractors with a down-select occurring at a predefined time (normally after the preliminary design review). A logical extension of this concept is the “spiral” development model, which emphasizes the evaluation of alternatives and risk assessments throughout the system’s development and initial fielding.

Use of Mockups – Mockups, especially man-machine interface mockups, can be used to conduct early exploration of design options. They can assist in resolving design uncertainties and providing users with early views of the final system configuration.

Modeling/Simulation – The use of modeling and simulation can provide insights into a system’s performance and effectiveness sensitivities. Decision makers can use performance predictions to assess a system’s military worth not only before any physical prototypes are built, but also throughout the system life cycle. Modeling and simulation can help manage risk by providing information on design capabilities and failure modes during the early stages of design. This allows initial design concepts to be iterated without having to build hardware for testing. The T&E community can use predictive simulations to focus the use of valuable test assets on critical test issues. They can also use extrapolated simulations to expand the scope of evaluation into areas not readily testable, thus reducing the risk of having the system fail in the outer edges of the “test envelope.” Additionally, a model can serve as a framework to bridge the missing pieces of a complete system until those pieces become available.

Although modeling and simulation can be a very effective risk-handling tool, it requires resources, commitment to refine models as the system under development matures, and a concerted verification and validation effort to ensure that decisions are based on credible information.

Key Parameter Control Boards – When a particular parameter (such as system weight) is crucial to achieving the overall program requirements, a control board for that parameter may be appropriate. This board has representatives from all affected technical functions and may be chaired by the PM. It provides management focus on the parameter and signals the importance of achieving the parameter to the technical community. If staffed properly by all affected disciplines, it can also help avoid sacrificing other program requirements to achieve that requirement.

Manufacturing Screening – For programs in late SDD and early production and deployment, various manufacturing screens (including environmental stress screening (ESS)) can be incorporated into test article production and low-rate initial production to identify deficient manufacturing processes. ESS is a manufacturing process for stimulating parts and workmanship defects in electronic assemblies and units. These data can then be used to develop the appropriate corrective actions.

Test, Analyze, and Fix (TAAF) – TAAF is the use of a period of dedicated testing to identify and correct deficiencies in a design. It was originally conceived as an approach to improve reliability; it can also be used for any system parameter whose development could benefit from a dedicated period of testing and analysis. Although a valuable aid in the development process, TAAF should not be used in lieu of a sound design process.

Demonstration Events – Demonstration events are points in the program (usually tests) that are used to determine if risks are being successfully abated. Careful review of the planned development of each risk area will reveal a number of opportunities to verify the effectiveness of the development approach. By including a sequence of demonstration events throughout the development, PMO and contractor personnel can monitor the process and identify when additional efforts are needed. Demonstration events can also be used as information-gathering actions, as discussed before, and as part of the risk-monitoring process. Table 5-2 contains examples of demonstration events.

Process Proofing – When particular processes, especially those of manufacturing and support, are critical to achieving system requirements, an early process proof demonstration is useful to abate risk. If the initial proof is unsuccessful, time is still available to identify and correct deficiencies or to select an alternative approach.

Table 5-2. Examples of Demonstration Events

Rocket Motor (by completion of preliminary design):
• Three Case Burst Tests
• Propellant Characterization
• Thermal Barrier Bond Tests
• Ignition and Safe/Arm Tests
• Nozzle Assembly Tests

Rocket Motor (by completion of final design):
• 10 Development Motor Firings – temperature and altitude cycle; vibration and shock; aging

Central Computer (by completion of preliminary design):
• Test Breadboard
• Develop/Test Unique Microcircuits

Central Computer (by completion of final design):
• Build/Test Prototype

No single technique or tool is capable of providing a complete answer; a combination must be used. In general, risk-monitoring techniques are applied to follow through on the planned actions of the risk-handling program. They track and evaluate the effectiveness of handling activities by comparing planned actions with what is actually achieved. These comparisons may be as straightforward as actual versus planned completion dates, or as complex as detailed analysis of observed data versus planned profiles. In any case, the differences between planned and actual data are examined to determine status and the need for any changes in the risk-handling approach.

PMO personnel should also ensure that the indicators/metrics selected to monitor program status adequately portray the true state of the risk events and handling actions. Otherwise, indicators of risks that are about to become problems will go undetected. Subsequent sections identify specific techniques and tools that will be useful to PMOs in monitoring risks and provide information on selecting metrics that are essential to the monitoring effort. The techniques focus primarily at the program level, addressing cost, schedule, and performance risks.

5.6.2.2 Procedures. Risk control involves developing a risk-reduction plan, with actions identified, resourced, and scheduled. Success criteria for each of the risk-reduction events should also be identified. The effectiveness of these actions must be monitored using the types of techniques described in Section 5.7.

5.6.3 Risk Avoidance

5.6.3.1 Description. This technique reduces risk through the modification or elimination of those operational requirements, processes, or activities that cause the risks. Eliminating operational requirements requires close coordination with the users. Since this technique results in the reduction of risk, it should generally be initiated in the development of a risk-handling plan. It can be done in parallel with the initial operational requirements analysis and should be supported by a cost-benefit analysis.

5.6.3.2 Procedures. Analyzing and reviewing the proposed system in detail with the user is essential to determine the drivers for each operational requirement. Operational requirements scrubbing involves eliminating those that have no strong basis. This also provides the PMO and the user with an understanding of what the real needs are and allows them to establish accurate system requirements for the critical performance. Operational requirements scrubbing essentially consists of developing answers to the following questions:

• Why is the requirement needed?

• What will the requirement provide?

• How will the capability be used?

• Are the requirements specified in terms of functions and capabilities, rather than a specific design?

Cost/requirement trade studies are used to support operational requirements scrubbing. These trades examine each requirement and determine the cost to achieve various levels of the requirement (e.g., different airspeeds, ranges, payloads). The results are then used to determine, with the user, whether a particular requirement level is worth the cost of achieving that level. Trade studies are an inherent part of the systems engineering process. (See Deskbook Section 2.6.1 for details on the systems engineering process.)


5.6.4 Risk Assumption

5.6.4.1 Description. This technique is used in every program and acknowledges the fact that, in any program, risks exist that will have to be accepted without any special effort to control them. Such risks may be either inherent in the program or may result from other risk-controlling actions (residual risks). The fact that risks are assumed does not mean that they are ignored. In fact, every effort should be made to identify and understand them so that appropriate management action can be planned. Also, risks that are assumed should be monitored during development; this monitoring should be well-planned from the beginning.

5.6.4.2 Procedures. In addition to the identification of risks to be assumed, the following steps are key to successful risk assumption:

• Identify the resources (time, money, people, etc.) needed to overcome a risk if it materializes. This includes identifying the specific management actions that will be used, for example, redesign, retesting, requirements review, etc.

• Whenever a risk is assumed, a schedule and cost risk reserve should be set aside to cover the specific actions to be taken if the risk occurs. If this is not possible, the program may proceed within the funds and schedule allotted to the effort. If the program cannot achieve its objectives, a decision must be made to allocate additional resources, accept a lower level of capability (lower the requirements), or cancel the effort.

• Ensure that the necessary administrative actions are taken to report occurrences of the risk event quickly and to implement the planned management actions, such as contracts for industry expert consultants, arrangements for test facilities, etc.
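As a loose illustration of sizing such a reserve (one common technique, not a method prescribed by this Guide), the sketch below computes an expected-monetary-value cost reserve for assumed risks, i.e., probability of occurrence times cost if the risk occurs. The risks and figures are hypothetical:

# Illustrative only: sizing a cost risk reserve for assumed risks using
# expected monetary value (probability x consequence cost). Figures are in $K.

assumed_risks = [
    # (risk name, probability of occurrence, cost if the risk occurs)
    ("retest after environmental failure", 0.30, 400),
    ("GFE delivery slip forces redesign", 0.15, 1_200),
]

# Sum of probability-weighted costs across all assumed risks
reserve = sum(prob * cost for _, prob, cost in assumed_risks)
print(f"Suggested cost risk reserve: ${reserve:,.0f}K")  # $300K in this example

A schedule reserve can be sized the same way against the time impact of each risk; either way, the reserve covers the specific actions planned if the risk occurs, not general overruns.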

5.6.5 Risk Transfer

5.6.5.1 Description. This technique involves the reduction of risk exposure by the reallocation of risk from one part of the system to another, or the reallocation of risks between the Government and the prime contractor, or between the prime contractor and its sub-contractors.

5.6.5.2 Procedures. In reallocating risk, design requirements that are risk drivers are transferred to other system elements, which may result in lower system risk while still meeting system requirements. For example, a high risk caused by a system timing requirement may be lowered by transferring that requirement from a software module to a specially designed hardware module capable of meeting those needs. The effectiveness of requirements reallocation depends on good system engineering and design techniques. In fact, efficient allocation of those requirements that are risk drivers is an integral part of the systems engineering process. Modularity and functional partitioning are two design techniques that can be used to support this type of risk transfer. In some cases, this approach may be used to concentrate risk areas in one area of the system design. This allows management to focus attention and resources on that area.

For the Government/contractor risk-transfer approach to be effective, the risks transferred to the contractor must be those that the contractor has the capacity to control and manage. These are generally risks associated with technologies and processes used in the program, i.e., those for which the contractor can implement proactive solutions. The types of risks that are best managed by the Government include those related to the stability of and external influences on program requirements, funding, and schedule, for example. The contractor can support the management of these risks through the development of flexible program plans, and the incorporation of performance margins in the system and flexibility in the schedule. A number of options are available to implement risk transfer from the Government to the contractor: warranties, cost incentives, product performance incentives, and various types of fixed-price contracts. A similar assessment of prime contractor versus sub-contractor allocation of risks can also be developed and used to guide risk transfer between these parties.

5.7 RISK MONITORING

5.7.1 General

Risk monitoring is a continuous process to systematically track and evaluate the performance of risk-handling actions against established metrics throughout the acquisition process. It should also include results of periodic reassessments of program risk to evaluate both known and new risks to the program. If necessary, the PMO should reexamine the risk-handling approaches for effectiveness while conducting assessments. As the program progresses, the monitoring process will identify the need for additional risk-handling options.

An effective monitoring effort provides information to show if handling actions are not working and which risks are on their way to becoming actual problems. The information should be available in sufficient time for the PMO to take corrective action. The functioning of IPTs is crucial to effective risk monitoring. They are the “front line” for obtaining indications that handling efforts are achieving their desired effects.

The establishment of a management indicator system that provides accurate, timely, and relevant risk information in a clear, easily understood manner is key to risk monitoring. Early in the planning phase of the process, PMOs should identify specific indicators to be monitored and information to be collected, compiled, and reported. Usually, documentation and reporting procedures are developed as part of risk management planning before contract award and should use the contractor’s reporting system. Specific procedures and details for risk reporting should be included in the risk management plans prepared by the Government and the contractor.

To ensure that significant risks are effectively monitored, handling actions (which include specific events, schedules, and “success” criteria) developed during previous risk management phases should be reflected in integrated program planning and scheduling. Identifying these handling actions and events in the context of WBS elements establishes a linkage between them and specific work packages, making it easier to determine the impact of actions on cost, schedule, and performance. The detailed information on risk-handling actions and events should be contained in various risk management documentation (both formal and informal). Experience has shown that the use of an electronic on-line database that stores and permits retrieval of risk-related information is almost essential to effective risk monitoring. The database selected or developed will depend on the program. A discussion of risk management information systems and databases, and suggested data elements to be included in the databases, is contained later in this chapter.

5.7.2 Earned Value Management

5.7.2.1 Description. Earned value (EV) is a management technique that relates resource planning to schedules and to technical performance requirements. It is useful in monitoring the effectiveness of risk-handling actions in that it provides periodic comparisons of the actual work accomplished in terms of cost and schedule with the work planned and budgeted. These comparisons are made using a performance baseline that is established by the contractor and the PM at the beginning of the contract period. This is accomplished through the Integrated Baseline Review (IBR) process. The baseline must capture the entire technical scope of the program in detailed work packages. The baseline also includes the schedule to meet the requirements as well as the resources to be applied to each work package. Specific risk-handling actions should be included in these packages. See Defense Acquisition Deskbook Section 2.B.2.1 for a more detailed discussion of Earned Value and IBR.

5.7.2.2 Procedures. The periodic EV data can provide indications of risk and the effectiveness of handling actions. When variances in cost or schedule begin to appear in work packages containing risk-handling actions, or in any work package, the appropriate IPTs can analyze the data to isolate causes of the variances and gain insights into the need to modify or create handling actions.
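As an illustration, the variance and index calculations that underlie this analysis can be sketched as follows. The relationships (CV, SV, CPI, SPI) are standard earned value formulas; the work-package figures are hypothetical:

# Standard earned value relationships. BCWS = budgeted cost of work scheduled,
# BCWP = budgeted cost of work performed (earned value), ACWP = actual cost of
# work performed. Work-package figures below are hypothetical.

def cost_variance(bcwp, acwp):
    """CV = BCWP - ACWP; a negative value signals a cost overrun."""
    return bcwp - acwp

def schedule_variance(bcwp, bcws):
    """SV = BCWP - BCWS; a negative value signals work behind plan."""
    return bcwp - bcws

def performance_indices(bcwp, acwp, bcws):
    """CPI = BCWP/ACWP and SPI = BCWP/BCWS; values below 1.0 flag work
    packages worth IPT analysis, including those with risk-handling actions."""
    return bcwp / acwp, bcwp / bcws

cv = cost_variance(bcwp=80_000, acwp=95_000)       # -15,000: overrunning
sv = schedule_variance(bcwp=80_000, bcws=100_000)  # -20,000: behind schedule
cpi, spi = performance_indices(80_000, 95_000, 100_000)  # 0.84, 0.80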

5.7.3 Technical Performance Measurement

5.7.3.1 Description. Technical performance measurement (TPM) is a technique that compares estimated values of key performance parameters with achieved values, and determines the impact of any differences on system effectiveness. This technique can be useful in risk monitoring by comparing planned and achieved values of parameters in areas of known risk. The periodic application of this technique can provide early and continuing predictions of the effectiveness of risk-handling actions or the detection of new risks before irrevocable impacts on the cost or schedule occur.

5.7.3.2 Procedures. The technical performance parameters selected should be those that are indicators of progress in the risk-handling action employed. They can be related to system hardware, software, human factors, and logistics – any product or functional area of the system. Parameter values to be achieved through the planned handling action are forecast in the form of planned performance profiles. Achieved values for these parameters are compared with the expected values from the profile, and any differences are analyzed to get an indication of the effectiveness of the handling action. For example, suppose a system requires the use of a specific technology that is not yet mature and the use of which has been assessed as high risk. The handling technique selected is risk control, and an off-line technology maturation effort will be used to get the technology to the level where the risk is acceptable. The technology is analyzed to identify those parameters that are key drivers, and performance profiles that will result from a sufficiently mature technology are established. As the maturation effort progresses, the achieved values of these parameters are compared with the planned profile. If the achieved values meet the planned profile, it is an indicator that the risk-handling approach is progressing satisfactorily; if the achieved values fall short of the expected values, it is an indicator that the approach is failing to meet expectations and corrective action may be warranted.
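A minimal sketch of such a profile comparison follows, assuming a hypothetical key parameter whose planned performance profile rises as the maturation effort progresses; the profile values and tolerance are illustrative only:

# Hypothetical TPM comparison of achieved values against the planned
# performance profile for a maturing technology. Values are illustrative.

PLANNED_PROFILE = {3: 0.60, 6: 0.72, 9: 0.85, 12: 0.95}  # month -> planned value

def tpm_status(month, achieved, tolerance=0.02):
    """Compare an achieved value with the planned profile for that month."""
    planned = PLANNED_PROFILE[month]
    if achieved >= planned - tolerance:
        return "on profile: handling approach progressing satisfactorily"
    shortfall = planned - achieved
    return f"below profile by {shortfall:.2f}: corrective action may be warranted"

print(tpm_status(month=6, achieved=0.65))  # falls short of the planned 0.72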

5.7.4 Integrated Planning and Scheduling

5.7.4.1 Description. Once a contract has been awarded, techniques such as integrated planning and scheduling (integrated master plans and integrated master schedules) can become invaluable program baseline and risk-monitoring tools. Integrated planning identifies key events, milestones, reviews, all integrated technical tasks, and risk-reduction actions for the program, along with accomplishment criteria to provide a definitive measure that the required maturity or progress has been achieved. Integrated scheduling describes the detailed tasks that support the significant activities identified in integrated planning and the timing of tasks. Also, the integrated schedule can include the resources planned to complete the tasks. The events, tasks, and schedule resulting from integrated planning are linked with contract specification requirements, the WBS, and other techniques such as TPM. When the events and tasks are related to risk-reduction actions, this linkage provides a significant monitoring tool, giving specific insights into the relationships among cost, schedule, and performance risks.

5.7.4.2 Procedures. In integrated planning, the Government and contractor (or other performing activity) should identify key activities of the program, to include risk-handling actions and success criteria. The contractor should then prepare the integrated schedule reflecting the planned completion of tasks associated with these activities. As the program progresses, the PMO can monitor the effectiveness of handling activities included in the integrated planning events and schedule by comparing observed activity results with their criteria and determining any deviations from the planned schedule. Any failures of handling actions to meet either the event criteria or schedule should be analyzed to determine the deviation’s impact, causes, and the need for any modifications to the risk-handling approach.

5.7.5 Watch List

5.7.5.1 Description. The watch list is a listing of critical areas to which management should pay special attention during program execution. It is a straightforward, easily prepared document that is derived from a prioritized list of risks. It may include such things as the priority of the risk, how long it has been on the watch list, handling actions, planned and actual completion dates for handling actions, and explanations for any differences. See Table 5-3 for an example watch list.

Table 5-3. Watch List Example

Potential Risk Area: Accurately predicting the shock environment that shipboard equipment will experience.
• Risk Reduction Action: Use multiple finite element codes and simplified numerical models for early assessments. (Action Code: SEA 03P31; Due Date: 31 Aug 01)
• Risk Reduction Action: Shock test a simple isolated deck and the proposed isolated structure to improve confidence in predictions. (Action Code: SEA 03P31; Due Date: 31 Aug 02)

Potential Risk Area: Evaluating the acoustic impact of ship systems that are not similar to previous designs.
• Risk Reduction Action: Concentrate on acoustic modeling and scale testing of technologies not demonstrated successfully in large-scale or full-scale tests. (Action Code: SEA 03TC; Due Date: 31 Aug 01)
• Risk Reduction Action: Factor acoustic signature mitigation from isolated modular decks into system requirements; continue model tests to validate predictions for isolated decks. (Action Code: SEA 03TC; Due Date: 31 Aug 02)

(The Date Completed and Explanation columns are filled in as each action is completed.)

5.7.5.2 Procedures. Watch list development is based on the results of the risk assessment. It is common to keep the number of risks on the watch list relatively small, focusing on those that can have the greatest impact on the program. Items can be added as the program unfolds and periodic reassessments are conducted. If a considerable number of new risks are significant enough to be added to the watch list, it may be an indicator that the original assessment was not accurate and that program risk is greater than initially thought. It may also indicate that the program is on the verge of becoming out of control. If a risk has been on the watch list for a long time because of a lack of risk-handling progress, a reassessment of the risk or the handling approach may be necessary. Items on the watch list should be reviewed during the various program reviews/meetings, both formal and informal.
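As an illustration, a watch-list entry mirroring the columns of Table 5-3 might be represented as below, with a simple check for overdue handling actions; the record layout is a hypothetical sketch, not a prescribed format:

# Hypothetical watch-list record and a staleness check.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class WatchListItem:
    risk_area: str
    reduction_action: str
    action_code: str
    due_date: date
    date_completed: Optional[date] = None
    explanation: str = ""

def overdue_items(watch_list, today):
    """Open actions past their due date: candidates for reassessing the risk
    or the handling approach."""
    return [item for item in watch_list
            if item.date_completed is None and item.due_date < today]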

5.7.6 Reports

5.7.6.1 Description. Reports are used to convey information to decision makers and program team members on the status of risks and the effectiveness of risk-handling actions. Risk-related reports can be presented in a variety of ways, ranging from informal verbal reports when time is of the essence to formal summary-type reports presented at milestone reviews. The level of detail presented will depend on the audience.

Figure 5-15. Example Showing Detailed List of Top-Level Risk Information

[Figure: a risk management status matrix listing, for each risk plan number (e.g., 98-12-9 through 98-12-16), the risk issue (Non-stock Listed Spares, Engineering Updates, Spares & Support, Long Lead Requisitions, T.O. Validation, Lack of LSA Records for GFE*, Program Parts Obsolescence, Design Maturity, and System Y Interface Definition), a High/Moderate/Low risk rating, and a status/comment entry such as “Closed” or “Contractor LSA plan submitted for approval.”]

(* Detail of highlighted item described in Figure 5-16.)

5.7.6.2 Procedures. Successful risk management programs include timely reporting of results of the monitoring process. Reporting requirements and procedures, to include format and frequency, are normally developed as part of risk management planning and are documented in the risk management plan. Reports are normally prepared and presented as part of routine program management activities. They can be effectively incorporated into program management reviews and technical milestones to indicate any technical, schedule, and cost barriers to the program objectives and milestones being met. One example of a status presentation is shown in Figure 5-15. It shows some top-level risk information that can be useful to the PMO as well as others external to the program.

Although this level of reporting can provide a quick review of overall risk status for identified problems, more detailed risk planning and status can be provided on individual risk items. For example, some program IPTs have combined risk level and scheduled activities to provide a graphical overview of risk status for either internal or external review. One method for graphically showing risk status for an individual item is shown in Figure 5-16.

Figure 5-16. Example of More Complex Combination of Risk Level and Scheduled Tasks

[Figure: a timeline chart (December 1997 through December 1998) for the risk item “Lack of Support Records for GFE,” plotting risk level (High/Moderate/Low) against scheduled actions, each marked open or completed: (1) PMO previously authorized contractor to use “similar to” data when GFE support data is unavailable, documented in plan under development; (2) contractor submits plan to PMO for approval; (3) PMO reviewing plan; (4) plan approved (note: plan will be approved early); the risk then declines toward issue closure.]

5.7.7 Management Indicator System

5.7.7.1 Description. A management indicator system is a set of indicators or metrics that provide the PMO with timely information on the status of the program and risk-handling actions, and is essential to risk monitoring and program success. To be meaningful, these metrics should have some objective value against which observed data can be measured, reflecting trends in the program or lack thereof. Metrics should be developed jointly by the PMO and the contractor. The contractor’s approach to metrics should be a consideration in the proposal evaluation process. If the contractor does not have an established set of metrics, this may be an area of risk that will need to be addressed.

5.7.7.2 Procedures. Metrics can be categorized as relating to technical performance, cost, and schedule. Technical performance metrics can be further broken down into categories such as engineering, production, and support, and within these groups as either product- or process-related. Product-related metrics pertain to characteristics of the system being developed; they can include such things as planned and demonstrated values of the critical parameters monitored as part of the TPM process and system-unique data pertaining to the different steps in the development and acquisition processes. Table 5-4 provides examples of product-related metrics.

Table 5-4. Examples of Product-Related Metrics

Engineering:
• Key Design Parameters – weight, size, endurance, range
• Design Maturity – open problem reports, number of engineering change proposals, number of drawings released, failure activities
• Computer Resource Utilization

Requirements:
• Requirements Traceability
• Requirements Stability

Production:
• Manufacturing Yields
• Incoming Material Yields
• Unit Production Cost
• Process Proofing
• Special Tools and Test Equipment

Support:
• Delinquent Requisitions
• Support Infrastructure Footprint
• Manpower Estimates

Process metrics pertain to the various processes used in the development and production of the system. For each program, certain processes are critical to the achievement of program objectives. Failure of these processes to achieve their requirements is symptomatic of significant problems. Metrics data can be used to diagnose and aid in problem resolution. They should be used in formal, periodic performance assessments of the various development processes and to evaluate how well the system development process is achieving its objectives. DoD 4245.7M, Transition from Development to Production, and other supporting documents such as NAVSO P-6071, Best Practices, identify seven process areas: funding, design, test, production, facilities, logistics, and management. Within each of these areas, a number of specific processes are identified as essential to assess, monitor, and establish program risk at an acceptable level; the documents also provide risk indicators that can be used as the basis for selecting specific process metrics. Another document, Methods and Metrics for Product Success, July 1994, published by the Office of the Assistant Secretary of the Navy (RD&A), Product Integrity Directorate, provides a set of metrics for use in assessing and monitoring the design, test, and production risk areas. Table 5-5 provides examples of process-related metrics.

Table 5-5. Examples of Process Metrics

Design Requirements:
• Development of requirements traceability plan
• Development of specification tree
• Specifications reviewed for definition of all use environments and of all functional requirements for each mission performed

System Trade Studies:
• Users’ needs prioritized
• Alternative system configurations selected
• Test methods selected

Design Process:
• Design requirements stability
• Producibility analysis conducted
• Design analyzed for cost, parts reduction, manufacturability, and testability

Integrated Test Plan:
• All developmental tests at system and subsystem level identified
• Identification of who will test (Government, contractor, supplier)

Failure Reporting:
• Contractor corporate-level management involved in failure reporting and corrective action process
• Responsibility for analysis and corrective action assigned to a specific individual with a close-out date

Manufacturing Plan:
• Plan documents the methods by which the design is to be built
• Plan contains a sequence and schedule of events at contractor and subcontractor levels that defines the use of materials, fabrication flow, test equipment, tools, facilities, and personnel
• Plan reflects manufacturing inclusion in the design process, including identification and assessment of design facilities

Cost and schedule metrics can be used to depict how the program is progressing toward completion. The information provided by the contractor in the earned value management system can serve as these metrics, showing how the actual work accomplished compares with the work planned in terms of schedule and cost. Other sources of cost and schedule metrics include the contractor’s cost accounting information and the integrated master schedule. Table 5-6 provides examples of cost and schedule metrics.

Table 5-6. Examples of Cost and Schedule Metrics

Cost:
• Cost variance
• Cost performance index
• Estimate at completion
• Management reserve

Schedule:
• Schedule variance
• Schedule performance index
• Design schedule performance
• Manufacturing schedule performance
• Test schedule performance


5.8 RISK MANAGEMENT INFORMATION SYSTEMS AND DOCUMENTATION

5.8.1 Description

To manage risk, PMs should have a database management system that stores and allows retrieval of risk-related data. The risk-management information system provides data for creating reports and serves as the repository for all current and historical information related to risk. This information may include risk assessment documents, contract deliverables, if appropriate, and any other risk-related reports. The PM should consider a number of factors in establishing the management information system and developing rules and procedures for the reporting system:

• Assign management responsibility for the reporting system;

• Publish any restrictions for entering data into the database;

• Identify reports and establish a schedule, if appropriate;

• Use standard report formats as much as possible;

• Ensure that the standard report formats support all users, such as the PM, IPTs, and IIPTs; and

• Establish policy concerning access to the reporting system and protect the database from unauthorized access.

With a well-structured information system, a PMO may create reports for senior management and retrieve data for day-to-day program management. Most likely, the PM will choose a set of standard reports that suits specific needs on a periodic basis. This eases definition of the contents and structure of the database. In addition to standard reports, the PMO will need to create ad hoc reports in response to special queries, etc. Commercial database programs now available allow the PMO to create reports with relative ease. Figure 5-17 shows a concept for a management and reporting system.

Figure 5-17. Conceptual Risk Management and Reporting System

[Figure: a conceptual diagram in which IPTs, functional organizations, the contractor, and others submit data for entry through a Risk Coordinator to a database management system holding historical data; users request reports or information (with controlled access) and receive standard or ad hoc reports.]

5.8.2 Risk Management Reports

The following are examples of basic reports that a PMO may use to manage its risk program. Each office should tailor and amplify them, if necessary, to meet specific needs.

Risk Information Form (RIF). The PMO needs a document that serves the dual purpose of a source of data entry information and a report of basic information for the IPTs. The RIF serves this purpose. It gives members of the project team, both Government and contractors, a format for reporting risk-related information. The RIF should be used when a potential risk event is identified and updated over time as information becomes available and the status changes. As a source of data entry, the RIF allows the database administrator to control entries. To construct the database and ensure the integrity of data, the PMO should design a standard format for a RIF.
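As an illustration of such a standard format, a RIF might be represented as a structured data-entry record drawing on a subset of the database elements listed in Table 5-7; the field names and types below are hypothetical, not a prescribed layout:

# Hypothetical standard RIF record using a subset of the Table 5-7 elements.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskInformationForm:
    risk_id: str                  # constructed to identify responsible organization
    risk_event: str               # descriptive name for the risk event
    priority: int                 # 1 = highest priority relative to all other risks
    category: str                 # technical/performance, cost, and/or schedule
    statement_of_risk: str        # concise one- or two-sentence statement
    probability: str              # per definitions in the Risk Management Plan
    consequence: str              # per definitions in the Risk Management Plan
    date_submitted: date = field(default_factory=date.today)
    status: str = "open"          # updated as handling activities progress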

Risk Assessment Report (RAR). Risk assessments form the basis for many program decisions, and the PM will probably need a detailed report of any assessment of a risk event. A RAR is prepared by the team that assessed a risk event and amplifies the information in the RIF. It documents the identification and analysis process and results. The RAR provides information for the summary contained in the RIF, is the basis for developing risk-handling plans, and serves as a historical record of program risk assessment. Since RARs may be large documents, they may be stored as files. RARs should include information that links them to the appropriate RIF.

Risk-Handling Documentation. Risk-handling documentation may be used to provide the PM with the information he needs to choose the preferred mitigation option and is the basis for the handling plan summary that is contained in the RIF. This document describes the examination process for the risk-handling options and gives the basis for the selection of the recommended choice. After the PM chooses an option, the rationale for that choice may be included. There should be a plan for each risk-mitigation task. Risk-handling plans are based on results of the risk assessment. This document should include information that links it to the appropriate RIF.

Risk Monitoring Documentation. The PM needs a summary document that tracks the status of high and moderate risks. He can produce a risk-tracking list, for example, that uses information that has been entered from the RIF. Each PMO should tailor the tracking list to suit its needs. If elements of needed information are not included in the RIF, they should be added to that document to ensure entry into the database.

Database Management System (DBMS). The DBMS that the PM chooses may be commercial, Government-owned, or contractor-developed. It should provide the means to enter and access data, control access, and create reports. Many options are available to users.

Key to the MIS are the data elements that reside in the database. The items listed in Table 5-7 are examples of risk information that might be included in a database that supports risk management. They are a compilation of several risk reporting forms used in current DoD programs and other risk document sources. “Element” is the title of the database field; “Description” is a summary of the field contents. PMs should tailor the list to suit their needs.

Table 5-7. Database Management System Elements

• Risk Identification (ID) Number – Identifies the risk and is a critical element of information, assuming that a relational database will be used by the PMO. (Construct the ID number to identify the organization responsible for oversight.)

• Risk Event – States the risk event and identifies it with a descriptive name. The statement and risk identification number will always be associated in any report.

• Priority – Reflects the importance of this risk compared to all other risks, as assigned by the PMO, e.g., a one (1) indicates the highest priority.

• Date Submitted – Gives the date that the RIF was submitted.

• Major System/Component – Identifies the major system/component based on the WBS.

• Subsystem/Functional Area – Identifies the pertinent subsystem or component based on the WBS.

• Category – Identifies the risk as technical/performance, cost, or schedule, or a combination of these.

• Statement of Risk – Gives a concise statement (one or two sentences) of the risk.

• Description of Risk – Briefly describes the risk. Lists the key processes that are involved in the design, development, and production of the particular system or subsystem. If technical/performance, includes how it is manifested (e.g., design and engineering, manufacturing, etc.).

• Key Parameters – Identifies the key parameter, minimum acceptable value, and goal value, if appropriate. Identifies associated subsystem values required to meet the minimum acceptable value and describes the principal events planned to demonstrate that the minimum value has been met.

• Assessment – States if an assessment has been done. Cites the Risk Assessment Report, if appropriate.

• Analyses – Briefly describes the analysis done to assess the risk. Includes rationale and basis for results.

• Probability of Occurrence – States the likelihood of the event occurring, based on definitions in the program’s Risk Management Plan.

• Consequence – States the consequence of the event, if it occurs, based on definitions in the program’s Risk Management Plan.

• Time Sensitivity – Estimates the relative urgency for implementing the risk-handling option.

• Other Affected Areas – If appropriate, identifies any other subsystem or process that this risk affects.

• Risk Handling Plans – Briefly describes plans to mitigate the risk. Refers to any detailed plans that may exist, if appropriate.

• Risk Monitoring Activity – Measures, using metrics, progress in implementing risk-handling plans and achieving planned results for risk reduction.

• Status – Briefly reports the status of the risk-handling activities and outcomes relevant to any risk-handling milestones.

• Status Due Date – Lists the date of the status report.

• Assignment – Lists the individual assigned responsibility for mitigation activities.

• Reported By – Records the name and phone number of the individual who reported the risk.

5.9 SOFTWARE RISK MANAGEMENT METHODOLOGIES

The management of risk in software-intensive programs is essentially the same as for any other type of program. A number of methodologies specifically focus on the software aspects of developmental programs and can be useful in identifying and analyzing risks associated with software. Several of these methodologies are described in the U.S. Air Force publication, Guide to Software Acquisition and Management. Three of these methodologies are described below.

5.9.1 Software Risk Evaluation (SRE)

This is a formal approach developed by the Software Engineering Institute (SEI) using a risk management paradigm that defines a continuous set of activities to identify, communicate, and resolve software risks. These activities are to identify, analyze, plan, track, and control. (The SEI activities are analogous to the activities of the risk management process defined in this section.)

This methodology is initiated by the PM, who tasks an independent SRE team to conduct a risk evaluation of the contractor’s software development effort. The team executes the following SRE functions in performing this evaluation, and prepares findings that will provide the PM with the results of the evaluation:

• Detection of the software technical risks present in the program. An SEI Taxonomy-Based Questionnaire is used to ensure that all areas of potential risk are identified. This questionnaire is based on the SEI Software Development Risk Taxonomy, which provides a systematic way of organizing and eliciting risks within a logical framework.

• Specification of all aspects of identified technical software risks, including their conditions, consequences/impacts, and source.

• Assessment of the risks to determine the probability of risk occurrence and the severity of its consequences/impacts.

• Consolidation of the risk data into a concise format suitable for decision making.

A detailed discussion of the SRE methodology is found in Software Engineering Institute Technical Report CMU/SEI-94-TR-19, Software Risk Evaluation Model, Version 1.0, December 1994.

5.9.2 Boehm’s Software Risk Management Method

This risk management methodology, developed by Barry W. Boehm and described in IEEE Software, Software Risk Management: Principles and Practices, January 1991, consists of two primary steps, each with three subordinate steps. This risk management structure is shown in Table 5-8.

Table 5-8. Software Risk Management Steps

Primary Step: Risk Assessment
• Risk Identification – Produces lists of project-specific risk events
• Risk Analysis – Assesses the probability and consequences of each risk event; assesses compound risk resulting from risk event interaction
• Risk Prioritization – Produces a rank-ordered list of identified and analyzed risk events

Primary Step: Risk Control
• Risk Management Planning – Produces a plan for addressing each risk event; integrates the individual risk event plans with each other and the overall plan
• Risk Resolution – Establishes the environment and actions to resolve or eliminate risks
• Risk Monitoring – Tracks progress in resolving risks; provides feedback for refining prioritization and plans

Boehm provides a number of techniques that can be used to accomplish each of the steps in the methodology. For example, to assist in risk identification, he includes the top 10 top-level software risks, based on surveys of experienced software project managers. These risks are shown in Table 5-9, along with recommended techniques to manage them. Using this list as a starting point, managers and engineers can then develop lists of lower-level risks to be assessed and resolved.


Table 5-9. Top 10 Software Risks

• Personnel shortfalls – Staffing with top talent; job matching; team building; key personnel agreements; cross training

• Unrealistic schedules and budgets – Detailed multisource cost and schedule estimation; design-to-cost; incremental development; software reuse; requirements scrubbing

• Developing the wrong software functions – Organizational analysis; mission analysis; operations concept formulation; user surveys; prototyping; early users’ manuals

• Developing the wrong user interface – Task analysis; prototyping; scenarios; user characterization (functionality, style, workload)

• Gold plating – Requirements scrubbing; prototyping; cost/benefit analysis; design-to-cost

• Continuing stream of requirements changes – High change threshold; information hiding; incremental development (defer changes to later increments)

• Shortfalls in externally furnished components – Benchmarking; inspections; reference checking; compatibility analysis

• Shortfalls in externally performed tasks – Reference checking; pre-award audits; award-fee contracts; competitive design or prototyping; team building

• Real-time performance shortfalls – Simulation; benchmarking; modeling; prototyping; instrumentation; tuning

• Straining computer science capabilities – Technical analysis; cost-benefit analysis; prototyping; reference checking




5.9.3 Best Practices Initiative Risk Management Method

The Software Acquisition Best Practices Initiative was instituted in 1994 to improve and restructure the software acquisition management process through the identification of effective practices used in successful software developments. One result of this effort was the publication of the Program Manager’s Guide to Software Acquisition Best Practices by the Software Program Managers Network (SPMN). This document identified nine principal best practices that are essential to the success of any large-scale software development. The first of these nine is formal risk management. To assist in implementing this top practice, SPMN developed a three-part methodology consisting of the following steps: address the problem; practice essentials; and check status. Specific activities associated with these steps are shown in Table 5-10.

Table 5-10. Best Practices Initiative Risk Management Method

Address the Problem:
• Recognize that all software has risk
• Attempt to resolve risk as early as possible, when the cost impact is less than it will be later in development

Practice Essentials:
• Identify risks
• Decriminalize risk
• Plan for risk
• Formally designate a Risk Officer
• Include in budget and schedule a risk reserve buffer of time, money, and other resources
• Compile a database for all non-negligible risks
• Prepare a profile for each risk showing probability and consequences
• Include all risks over the full life cycle
• Provide frequent risk status reports that include: top 10 risk items; number of risk items resolved; number of new risk items; number of risk items unresolved; unresolved risk items on the critical path; probable costs for unresolved risks

Check Status:
• Risk Officer appointed?
• Risk databases set up?
• Risk assessments have clear impact on program plans and decisions?
• Frequency and timeliness of risk assessment updates consistent with decision updates?
• Objective criteria used to identify, assess, and manage risk?
• Information flow patterns and reward criteria support identification of risk by all program personnel?
• Risks identified throughout entire life cycle?
• Risk management reserve exists?
• Risk profile for every risk, and components updated regularly?
• Risk management plan has explicit provisions for alerting decision makers when a risk becomes imminent?


SPMN provides PMOs with specialized training programs covering the core disciplines and techniques for implementing this formal risk management practice, as well as the other best practices. SPMN also has available (or under development) a number of guidebooks designed to provide software developers and PMs with practical guidance for planning, implementing, and monitoring their programs. SPMN can be accessed on the Internet at http://spmn.com/.

Table 5-11. Software Risk Grouping

Project-Level:
1. Excessive, immature, unrealistic, or unstable requirements
2. Lack of involvement
3. Underestimation of project complexity or dynamic nature

Project Attributes:
4. Performance shortfalls (includes errors and quality)
5. Unrealistic cost or schedule (estimates and/or allocated amounts)

Management:
6. Ineffective project management (possible at multiple levels)

Engineering:
7. Ineffective integration, assembly and test; quality control; specialty engineering; or systems engineering (possible at multiple levels)
8. Unanticipated difficulties associated with the user interface
9. Immature or untried design, processes, or technologies selected

Work Environment:
10. Inadequate work plans or configuration control
11. Inappropriate methods or tool selection, or inaccurate metrics
12. Poor planning
13. Inadequate or excessive documentation or review process

Other:
14. Legal or contractual issues (e.g., litigation, malpractice, ownership)
15. Obsolescence (includes excessive schedule length)
16. Unanticipated difficulties with subcontracted items
17. Unanticipated maintenance and/or support costs

In addition to the studies by Barry Boehm and information on the SPMN, a survey was conducted by Conrow and Shishido (see Reference), which evaluated 10 prior studies and categorized the resulting risk issues across the studies into six categories and 17 total issues, as shown in Table 5-11. The very high degree of overlap between risk issues identified in the 10 underlying studies suggests that some risk issues are common to many software-intensive projects.


APPENDIX A

DOD RISK MANAGEMENT POLICIES AND PROCEDURES

DoD policies and procedures that address risk management for acquisition programs are contained in five key documents:

1. DoD Directive (DoDD) 5000.1, The Defense Acquisition System;

2. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System;

3. DoD Regulation 5000.2-R, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs;

4. DoDD 5000.4, OSD Cost Analysis Improvement Group; and

5. DoD Manual 5000.4-M, Cost Analysis Guidance and Procedures.

The relevant sections of each document are referenced in the Defense Acquisition Deskbook under Mandatory Direction and are displayed under DoD-Wide Practices. They present strong statements on the need for risk management but collectively are not sufficient to enable the establishment of an effective risk management program. The following are verbatim extracts of sections of the DoD 5000 series of documents that address risk management as part of acquisition policy and procedures. The reader should be aware that changes to the 5000 series could result in different paragraph numbers.

1. DoDD 5000.1, The Defense Acquisition System, 23 October 2000, Change 1, 4 January 2001

Para 4.5.1. Tailoring

There is no one best way to structure an acquisition program so that it accomplishes the objectives of the Defense Acquisition System. Decision makers and program managers shall tailor acquisition strategies to fit the particular conditions of an individual program, consistent with common sense, sound business management practice, applicable laws and regulations, and the time-sensitive nature of the user's requirement. Proposed programs may enter the acquisition process at various decision points, depending on concept and technology maturity. Tailoring shall be applied to various aspects of the acquisition system, including program documentation, acquisition phases, the timing and scope of decision reviews, and decision levels. Milestone decision authorities shall promote flexible, tailored approaches to oversight and review based on mutual trust and a program's dollar value, risk, and complexity.

Para 4.5.4. Simulation-Based Acquisition

Program managers shall plan and budget for effective use of modeling and simulation to reduce the time, resources, and risk associated with the entire acquisition process; increase the quality, military worth and supportability of fielded systems; and reduce total ownership costs throughout the system life cycle.


2. DoD Instruction 5000.2, Operation of the Defense Acquisition System, 5 April 2002

Para 4.6.2.2.4. Information Superiority

All programs shall be managed and engineered using best processes and practices to reduce security risks; ensure programs are synchronized; be designed to be mutually compatible with other electric or electronic equipment and the operational electromagnetic environment; identify Critical Program Information that requires protection to prevent unauthorized disclosure or inadvertent transfer of leading-edge technologies and sensitive data or systems; require hardening, redundancy, or other physical protection against attack; be certified for spectrum supportability; and comply with the provisions of the Clinger-Cohen Act (CCA) (reference (m))….

Para 4.7.1.5. The Defense Acquisition Management Framework – General

…Acquisitions shall be structured in such a way that undue risk (such as through the use of firm fixed price options that cover more than five years) is not imposed on contractors, and so that contractor investment (beyond normal working capital and investments for plant, equipment, etc.) is not required….

Para 4.7.1.9. The Defense Acquisition Management Framework – General

…Milestone decision authorities shall promote flexible, tailored approaches to oversight and review based on mutual trust and a program's dollar value, risk, and complexity.

Para 4.7.2.3. Technological Opportunity Activities

…The S&T Program is uniquely positioned to reduce the risks of promising technologies before they are assumed in the acquisition process….

Para 4.7.2.3.2.4. Technology Transition Objectives

For those technologies with the most promise for application to weapon systems or AISs, be responsible for maturing technology to a readiness level that puts the receiving MDA at low risk for systems integration and acceptable to the cognizant MDA, or until the MDA is no longer considering that technology.

Para 4.7.2.4.3.1. Concept Exploration

Concept Exploration typically consists of competitive, parallel, short-term concept studies. The focus of these efforts is to define and evaluate the feasibility of alternative concepts and to provide a basis for assessing the relative merits (i.e., advantages and disadvantages, degree of risk, etc.) of these concepts….

Para 4.7.2.4.6. Component Advanced Development

The project shall exit Component Advanced Development when a system architecture has been developed and the component technology has been demonstrated in the relevant environment or the MDA decides to end this effort. This effort is intended to reduce risk on components and subsystems that have only been demonstrated in a laboratory environment and to determine the appropriate set of subsystems to be integrated into a full system….

Para 4.7.3.2.1.1. Begin Development and Develop and Demonstrate Systems – General

The purpose of the System Development and Demonstration phase is to develop a system, reduce program risk, ensure operational supportability, design for producibility, ensure affordability, ensure protection of Critical Program Information, and demonstrate system integration, interoperability, and utility….

Para 4.7.3.2.3.1.2. Milestone Approval Considerations

…For shipbuilding programs, the lead ship engineering development model shall be authorized at Milestone B. Critical systems for the lead and follow ships shall be demonstrated given the level of technology maturity and the associated risk prior to ship installation….

Para 4.7.3.2.3.4.1. Entry into System Development and Demonstration

Milestone B approval can lead to System Integration or System Demonstration. Regardless of the approach recommended, PMs and other acquisition managers shall continually assess program risks. Risks must be well understood, and risk management approaches developed, before decision authorities can authorize a program to proceed into the next phase of the acquisition process. Risk management is an organized method of identifying and measuring risk and developing, selecting, and managing options for handling these risks. The types of risk include, but are not limited to, schedule, cost, technical feasibility, threat, risk of technical obsolescence, security, software management, dependencies between a new program and other programs, and risk of creating a monopoly for future procurements.

Para 4.7.3.2.4.2. System Integration

This effort is intended to integrate the subsystems and reduce system-level risk….

Para 4.7.3.3.2.1. Entrance Criteria

Technology maturity (with an independent technology readiness assessment), system and relevant mission area (operational) architectures, mature software capability, demonstrated system integration or demonstrated commercial products in a relevant environment, and no significant manufacturing risks.

3. DoD Regulation 5000.2-R, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs, 5 April 2002

Para C1.3.4.2. Management Incentives

The PM, via the Contracting Officer, shall structure Requests for Proposal (RFPs) and resulting contracts to provide an incentive to the contractor to meet or beat program objectives. Whenever applicable, risk reduction through use of mature processes shall be a significant factor in source selection….

Para C1.4.3.3.2. Cost

Cost figures shall reflect realistic estimates of the total program, including a thorough assessment of risk….

Para C2.3.1. Program Structure

…The acquisition strategy shall specifically address the benefits and risks associated with reducing lead-time through concurrency and the risk mitigation and tests planned if concurrent development is used….

Para C2.5. Risk

The acquisition strategy shall address risk management. The PM shall identify the risk areas of the program and integrate risk management within overall program management. The strategy shall explain how the risk management effort shall reduce system-level risk to acceptable levels by the interim progress review preceding system demonstration and by Milestone C.


Para C2.6.2. Information Sharing and DoD Oversight

…DoD oversight activities (i.e., contract management offices, contracting offices, technical activities, and program management offices) shall consider all relevant and credible information that might mitigate risk and reduce the need for DoD oversight before defining and applying direct DoD oversight of contractor operations….

Para C2.8.1. Support Strategy

As part of the acquisition strategy, the PM shall develop and document a support strategy for life-cycle sustainment and continuous improvement of product affordability, reliability, and supportability, while sustaining readiness…. The support strategy shall continue to evolve toward greater detail, so that by Milestone C, it contains sufficient detail to define how the program will address the support and fielding requirements that meet readiness and performance objectives, lower TOC, reduce risks and avoid harm to the environment and human health. The support strategy shall address all applicable support requirements to include, but not be limited to, the following elements:…

Para C2.8.1.7.3. Support Strategy

Contract service risk assessments over the life of the system.

Para C2.8.4.2.2. Supply Source of Support

The PM shall use a competitive process to select the best value supply support provider. Access to multiple sources of supply is encouraged to reduce the risks associated with a single source….

Para C2.8.6. Environment, Safety, and Occupational Health (ESOH) Considerations

As part of risk reduction, the PM shall prevent ESOH hazards, where possible, and shall manage ESOH hazards where they cannot be avoided. The support strategy shall contain a summary of the Programmatic ESOH Evaluation (PESHE) document, including ESOH risks, a strategy for integrating ESOH considerations into the systems engineering process, identification of ESOH responsibilities, a method for tracking progress, and a compliance schedule for National Environmental Policy Act (NEPA) (42 U.S.C. 4321-4370d (reference (x)) and Executive Order (E.O.) 12114 (reference (y))). (See subparagraph C5.2.3.5.10.)

Para C2.8.9. Post Deployment Evaluation

The PM shall select the parameters for evaluations based on their relevance to future modifications or evolutionary block upgrades for performance, sustainability, and affordability improvements, or when there is a high level of risk that a KPP will not be sustained over the life of the system….

Para C2.9.1.3.2.3 Sub-Tier Competition

During early exchanges of information with industry (e.g., the draft request for proposal process), PMs shall identify the critical product and technology areas that the primes plan to provide internally or through exclusive teaming. The PM shall assess the possible competitive effects of these choices. The PM shall take action to mitigate areas of risk….

Para C2.9.1.4.2.2 Commercial and Non-Developmental Items

…If acquiring products with closed interfaces, the PM shall conduct a business case analysis to justify acceptance of the associated economic impacts on TOC and risks to technology insertion and maturation over the service life of the system.

Para 2.9.1.4.4.1 Industrial Capability

The acquisition strategy shall summarize an analysis of the industrial base capability to design, develop, produce, support, and, if appropriate, restart the program (10 U.S.C. 2440 (reference (an))) as appropriate for the next program phase. This analysis (see DoD Directive 5000.60 (reference (ao)) and DoD 5000.60-H (reference (ap))) shall identify DoD investments needed to create or enhance certain industrial capabilities, and the risk of industry being unable to provide program design or manufacturing capabilities at planned cost and schedule….

Para 2.9.1.4.4.2 Industrial Capability

In many cases, commercial demand now sustains the national and international technology and industrial base. The PM shall structure the acquisition strategy to promote sufficient program stability to encourage industry to invest, plan, and bear risks….

Para C2.9.3.2. Contract Type

For each major contract, the acquisition strategy shall identify the type of contract planned (e.g., firm fixed-price (FFP); fixed price incentive, firm target; cost plus incentive fee; or cost plus award fee) and the reasons it is suitable, including considerations of risk assessment and reasonable risk-sharing by the Government and the contractor(s)….

Para C2.9.3.5. Integrated Baseline Reviews

PMs and their technical staffs or IPTs shall evaluate contract performance risks inherent in the contractor's planning baseline. This evaluation shall be initiated within 6 months after contract award or intra-Government agreement is reached for all contracts requiring EVMS or C/SSR compliance.

Para C2.9.3.8. Component Breakout

The PM shall consider component breakout on every program and break out components when there are significant cost savings (inclusive of Government administrative costs), the technical or schedule risk of furnishing government items to the prime contractor is manageable, and there are no other overriding Government interests (e.g., industrial capability considerations or dependence on contractor logistics support)….

Para C3.1.1. Test and Evaluation (T&E) Overview

…The T&E strategy shall provide information about risk and risk mitigation, provide empirical data to validate models and simulations, evaluate technical performance and system maturity, and determine whether systems are operationally effective, suitable, and survivable against the threat detailed in the System Threat Assessment. (See paragraph C6.2.4)….

Para C3.2.1.1. Evaluation Strategy

…The evaluation strategy shall primarily address M&S, including identifying and managing the associated risk, and early T&E strategy to evaluate system concepts against mission requirements….

Para C3.2.3.2.1. T&E Guidelines

Early T&E activities shall harmonize MOEs, MOPs, and risk with the needs depicted in the MNS, and with the objectives and thresholds addressed in the analysis of alternatives, and defined in the ORD, APB, and TEMP, as these documents become available….


Para C3.2.3.2.2.8. T&E Guidelines

The concept of early and integrated T&E shall emphasize prototype testing during system development and demonstration and early OAs to identify technology risks and provide operational user impacts….

Para C3.4.1.2. Developmental Test and Evaluation (DT&E)

Identify and describe design technical risks. Assist in the design of a system at the component, sub-system, and system level by reducing technical risk prior to transitioning to the next level;

Para C3.4.1.6. Developmental Test and Evaluation (DT&E)

Assess progress toward meeting KPPs and other ORD requirements, COIs, mitigating acquisition technical risk, and achieving manufacturing process requirements and system maturity;

Para C3.5.1. Certification of Readiness for Operational Test & Evaluation (OT&E)

The developing agencies (i.e., materiel and combat developers) shall complete the following tasks before starting OT&E: Define risk management measures and indicators, with associated thresholds, to address performance and technical adequacy of both hardware and software.

Para C3.6.1.3. Operational Test and Evaluation (OT&E)

Information assurance testing shall be conducted on information systems to ensure that planned and implemented security measures satisfy ORD and System Security Authorization Agreement (SSAA) requirements when the system is installed and operated in its intended environment. The PM, OT&E test authority, and designated approving authority shall coordinate and determine the level of risk associated with operating the system and the extent of security testing required. (See section C6.6.)…

Para C3.6.1.13. Operational Test and Evaluation (OT&E)

All weapon, Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR), and information programs that are dependent on external information sources, or that provide information to other DoD systems, shall be assessed for information assurance. The level of information assurance testing depends on the system risk and importance. Systems with the highest importance and risk shall be subject to penetration-type testing prior to the beyond LRIP decision. Systems with minimal risk and importance shall be subject to normal National Security Agency security and developmental testing, but shall not be subject to field penetration testing during OT&E.

Para C4.3. Analysis of Alternatives

Analyzing alternatives is part of the CAIV process. Alternatives analysis shall broadly examine multiple elements of project or program alternatives including technical risk and maturity, and costs.

Para C4.5.1.2. Resource Estimates

The DoD Component cost agency shall prepare an independent LCCE and associated report for the decision authority for all ACAT IC programs, except those reviewed by the CAIG, for all major decision points as specified in enclosure 3 of reference (a), or as directed by the MDA. For programs with significant cost risk or high visibility, the CAE may request an additional DoD Component cost analysis estimate.


Para C4.5.2.1. Life-Cycle Cost Estimates (LCCEs)

The estimating activity shall explicitly base the LCCE (or EA for ACAT IA programs) on program objectives; operational requirements; contract specifications; careful risk assessments; and, for ACAT I programs, a DoD program work breakdown structure (WBS), or, for ACAT IA programs, a life-cycle cost and benefit element structure agreed upon by the IPT….

Para C4.5.4.1.1. Manpower Considerations

For all programs regardless of acquisition category, the DoD Components shall determine the source of support for all new, modified, and replacement systems based on the procedures, manpower mix criteria, and risk assessment instructions in Deputy Under Secretary of Defense (Program Integration), Office of the Under Secretary of Defense (Personnel & Readiness) (OUSD(P&R)), and Deputy Under Secretary of Defense (Installations), Office of USD(AT&L) annual memo, “DoD Inventory of Commercial and Inherently Governmental Activities Data Call….”

Para C4.5.4.1.2. Manpower Considerations

The DoD Components shall determine manpower and contract support based on both peacetime and wartime requirements, and establish manpower authorizations at the minimum necessary to achieve specific vital objectives (DoD Directive 1100.4 (reference (bv))). As part of this process, the DoD Components shall assess the risks (DoD Instruction 3020.37 (reference (bw))) involved in contracting support for critical functions in-theater, or in other areas expecting hostile fire. Risk mitigation shall take precedence over cost savings in high-risk situations or when there are highly sensitive intelligence or security concerns.

Para C4.5.4.2.4. Manpower Estimate

The manpower estimate shall address whether there are any personnel issues that would adversely impact full operational deployment of the system. It shall clearly state the risks associated with and the likelihood of achieving manpower numbers reported in the estimate. It shall briefly assess the validity of the manpower numbers, stating whether the DoD Component used validated manpower methodologies and manpower mix criteria, and assessed all risks….

Para C5.2.1. Systems Engineering

…Systems engineering shall permeate design, manufacturing, T&E, and support of the product. Systems engineering principles shall influence the balance between performance, risk, cost, and schedule.

Para C5.2.2.3. Systems Engineering

The systems engineering process shall… Characterize and manage technical risks.

Para C5.2.2.4. Systems Engineering

The systems engineering process shall: …Apply scientific and engineering principles, using the system security engineering process, to identify security vulnerabilities and minimize or contain information assurance and force protection risks associated with these vulnerabilities. (See DoD 5200.1-M (reference (bx)).)

Para C5.2.3.2. Functional Analysis/Allocation

…The design approach shall partition a system into self-contained, functionally cohesive, interchangeable, and adaptable elements to enable ease of change, achieve technology transparency and mitigate risk of obsolescence….

Para C5.2.3.4.2. System Analysis and Control

The overall risk management effort shall include technology transition planning and shall establish transition criteria.

Para C5.2.3.4.3. System Analysis and Control

The establishment of a risk management process (including planning, assessment (identification and analysis), handling, and monitoring) to be integrated and continuously applied throughout the program, including, but not limited to, the design process. The risk management effort shall address risk planning, the identification and analysis of potential sources of risks including but not limited to cost, performance, and schedule risks based on the technology being used and its related design, manufacturing capabilities, potential industry sources, and test and support processes; risk handling strategies, and risk monitoring approaches. The overall risk management effort shall interface with technology transition planning, including the establishment of transition criteria for such technologies.

Para C5.2.3.4.7. System Analysis and Control

Performance metrics to measure technical development and design, actual versus planned; and to measure meeting system requirements in terms of performance, cost, schedule, and progress in implementing risk handling. Performance metrics shall be traceable to performance parameters identified by the operational user.

Para C5.2.3.5.5.2.10. Open Systems Design

PMs shall use an open systems approach to achieve the following objectives:… To mitigate the risks associated with technology obsolescence, being locked into proprietary technology, and reliance on a single source of supply over the life of a system;

Para C5.2.3.5.6. Software Management

The PM shall manage and engineer software-intensive systems using best processes and practices known to reduce cost, schedule, and performance risks.

Para C5.2.3.5.6.1.3. Software Management

The PM shall base software systems design and development on systems engineering principles, to include the following: …Select the programming language in context of the systems and software engineering factors that influence overall life-cycle costs, risks, and the potential for interoperability;

Para C5.2.3.5.6.1.5. Software Management

…However, if the prospective contractor does not meet full compliance, risk mitigation planning shall describe, in detail, the schedule and actions that will be taken to remove deficiencies uncovered in the evaluation process. Risk mitigation planning shall require PM approval.

Para C5.2.3.5.6.1.7. Software Management

Assess information operations risks (DoD Directive S-3600.1 (reference (bz))) using techniques such as independent expert reviews;


Para C5.2.3.5.6.2.2.3. Software Spiral Development

The PM shall consider the risks and extent of change impacts to enable a cost-effective, yet rigorous T&E process.

Para C5.2.3.5.6.4.7. Software Security Considerations

When employing COTS software, the contracting process shall give preference during product selection/evaluation to those vendors who can demonstrate that they took efforts to minimize the security risks associated with foreign nationals that have developed, modified, or remediated the COTS software being offered.

Para C5.2.3.5.7.2.5. Commercial, Off-the-Shelf (COTS) Considerations

The PM shall develop an appropriate T&E strategy for commercial items to include evaluating potential commercial items in a system test bed, when practical; focusing test beds on high-risk items; and testing commercial-item upgrades for unanticipated side effects in areas such as security, safety, reliability, and performance.

Para C5.2.3.5.7.2.6. Commercial, Off-the-Shelf (COTS) Considerations

Programs are encouraged to use code-scanning tools, within the scope and limitations of the licensing agreements, to ensure both COTS and Government off-the-shelf software do not pose any information assurance or security risks.

Para C5.2.3.5.10.2. Environment, Safety, and Occupational Health (ESOH)

The PM shall prepare a Programmatic ESOH Evaluation (PESHE) document early in the program life cycle (usually Milestone B). The PESHE shall identify ESOH risks, contain a strategy for integrating ESOH considerations into the systems engineering process, delineate ESOH responsibilities, provide a method for tracking progress, and provide a completion schedule for NEPA (reference (x)) and E.O. 12114 (reference (y))….

Para C5.2.3.5.10.4. ESOH Compliance

To minimize the cost and schedule risks over the system's life cycle that changing ESOH requirements and regulations represent, the PM shall regularly review ESOH regulatory requirements and evaluate their impact on the program's life-cycle cost, schedule, and performance.

Para C5.2.3.5.10.6.1. Safety and Health

The PM shall identify and evaluate safety and health hazards, define risk levels, and establish a program that manages the probability and severity of all hazards associated with development, use, and disposal of the system. The PM shall use and require contractors to use the industry and DoD standard practice for system safety, consistent with mission requirements. This standard practice manages risks encountered in the acquisition life cycle of systems, subsystems, equipment, and facilities. These risks include conditions that create significant risks of death, injury, acute or chronic illness, disability, and/or reduced job performance of personnel who produce, test, operate, maintain, support, or dispose of the system.

Para C5.2.3.5.10.6.2. Safety and Health

The following policy applies to the acceptance of risk: …The PM shall formally document each management decision accepting the risk associated with an identified hazard.


Para C5.2.3.5.10.6.2.2. Safety and Health

“High Risk” hazards shall require CAE approval (Lead Executive Component authority prevails for joint programs).

Para C5.2.3.5.10.6.2.3. Safety and Health

The acceptance of all risks involving explosivessafety (see subparagraph C5.2.3.5.10.9. below)shall require the appropriate risk acceptance au-thority to consult with the DoD Component’stechnical authority managing the explosivessafety program.

Para C5.2.3.5.10.6.2.4. Safety and Health

“Serious Risk” hazards shall require PEO approval.

Para C5.2.3.5.10.6.2.5. Safety and Health

“Medium Risk” and “Low Risk” hazards shall require PM approval.

Para C5.2.3.5.10.8.1. Pollution Prevention

The PM shall identify and evaluate environmental and occupational health hazards and establish a pollution prevention program. The PM shall identify the impacts of the system on the environment during its life (including disposal), the types and amounts of pollution from all sources (air, water, noise, etc.) that will be released to the environment, actions needed to prevent or control the impacts, ESOH risks associated with using the new system, and other information needed to identify source reduction, alternative technologies, and recycling opportunities….

Para C5.2.3.5.13 Mission Assuredness

…The PM shall include the considerations in the risk benefit analysis of system design and cost….

Para C5.2.3.5.15.1. Anti-Tamper Provisions

…Because of its function, anti-tamper should not be regarded as an option or a system capability that may later be traded off without a thorough operational and acquisition risk analysis. To accomplish this, the PM shall identify critical technologies, identify system vulnerabilities, and, with assistance from counter-intelligence organizations, perform threat analyses to the critical technologies. The PM shall research anti-tamper measures and determine which best fit the performance, cost, schedule, and risk of the program.

Para C5.3.1. Work Breakdown Structure (WBS)

…The PM shall normally specify contract WBS elements only to level three for prime contractors and key subcontractors. Only low-level elements that address high risk, high value, or high technical interest areas of a program shall require detailed reporting below level three….

Para C5.3.2.2.1.5. Implementing a Performance-Based Business Environment (PBBE)

The PM shall structure the PBBE to accomplish the following: …Encourage life-cycle risk management versus risk avoidance;

Para C5.3.2.2.1.6. Implementing a Performance-Based Business Environment (PBBE)

The PM shall structure the PBBE to accomplish the following: …Simplify acquisition and support operating methods by transferring tasks to industry where cost effective, risk-acceptable, commercial capabilities exist; and


Para C6.2.2. Intelligence Support

Users shall assess and evaluate information superiority requirements. They shall determine the vulnerability of IT, including NSS, supporting infrastructures, and the effectiveness of risk mitigation methods to reduce vulnerability to an acceptable level.

Para C6.6.1. Information Assurance

PMs shall manage and engineer information systems using the best processes and practices known to reduce security risks, including the risks to timely accreditation.

Para C6.6.2.1. Information Assurance

Accordingly, for each information system development, PMs shall: …Conduct a system risk assessment based on system criticality, threat, and vulnerabilities;

Para C6.7.2.4. Technology Protection

Technology protection planning and development of the program protection plan shall begin early in the acquisition life cycle. The following considerations apply: …Security organizations shall identify system vulnerabilities and recommend cost-effective security measures using risk management evaluations.

Para C7.2. Decision Points

There are three types of decision points: milestones, decision reviews, and interim progress reviews. Each decision point results in a decision to initiate, continue, advance, or terminate a project or program work effort or phase. The review associated with each decision point shall typically address program progress and risk, affordability, program trade-offs, acquisition strategy updates, and the development of exit criteria for the next phase or effort….

Para C7.3.1.4. Defense Acquisition Board (DAB) Review

The PM shall brief the acquisition program to the DAB and specifically emphasize technology maturity, risk management, affordability, critical program information, technology protection, and rapid delivery to the user….

Para C7.4.2. Exit Criteria

Phase-specific exit criteria normally track progress in important technical, schedule, or management risk areas….

Para C7.5.1. Technology Maturity

Technology maturity shall measure the degree to which proposed critical technologies meet program objectives. Technology maturity is a principal element of program risk. A technology readiness assessment shall examine program concepts, technology requirements, and demonstrated technology capabilities to determine technological maturity.

Para C7.5.4. Technology Maturity

TRLs enable consistent, uniform discussions of technical maturity across different types of technologies. Decision authorities shall consider the recommended TRLs (or some equivalent assessment methodology, e.g., Willoughby templates) when assessing program risk….

Para C7.12.1. Cost Analysis Improvement Group (CAIG) Procedures

…The DoD Component responsible for acquisition of a system shall cooperate with the CAIG and provide the cost, programmatic, and technical information required to estimate costs and appraise cost risks….

Para C7.15.7. Contract Management Reports

…Except for high-cost or high-risk elements, the required level of reporting detail shall be limited to level three of the contract WBS.

Para C7.15.7.1.2. Contractor Cost Data Reporting (CCDR)

…CCDR reporting is not required for contracts priced below $6.5 million. The CCDR requirement on high-risk or high-technical-interest contracts priced between $6.5 and $42 million is left to the discretion of the Cost WIPT.

Para C7.15.7.1.8.1. Level of Cost Reporting

Routine reporting shall be at the contract WBS level three for prime contractors and key subcontractors. Only low-level elements that address high-risk, high-value, or high-technical-interest areas of a program shall require detailed reporting below level three….

4. DoD Directive (DoDD) 5000.4, OSD Cost Analysis Improvement Group (CAIG), November 24, 1992

Para 4.1.8 Risk Assessment

The CAIG Chair report, in support of a milestone review, shall include quantitative assessments of the risk in the estimate of life-cycle costs. In developing an assessment of cost risk, the CAIG shall consider the validity of such programmatic assumptions of the CARDs as EMD schedules, rates of utilization of test assets, production ramp rates, and buy rates, consistent with historical information. The CAIG shall also consider uncertainties in inputs to any cost estimating relationships used in its estimates, as well as the uncertainties inherent in the calibration of the CERs, and shall consider uncertainties in the factors used in making any estimates by analogy. The CAIG shall consider cost and schedule risk implications of available assessments of the program's technical risks, and may include the results in its cost-risk assessments. The CAIG may consider information on risk provided by any source, although primary reliance will be on the technical risk assessments that are the responsibility of the sponsoring DoD components, and of other OSD offices, in accordance with their functional responsibilities.

5. DoD 5000.4-M, Cost Analysis Guidance and Procedures, December 1992

Chapter 1: (Outline of CARD Basic Structure)

Para 1.2.1.x (..x..) Subsystem Description

This series of paragraphs (repeated for each subsystem) describes the major equipment (hardware/software) WBS components of the system. The discussion should identify which items are off-the-shelf. The technical and risk issues associated with development and production of individual subsystems also must be addressed.

Para 2.0 Technical and Physical Description

This section identifies the program manager's assessment of the risks of the program and the measures being taken or planned to reduce those risks. Relevant sources of risk include: design concept, technology development, test requirements, schedule, acquisition strategy, funding availability, contract stability, or any other aspect that might cause a significant deviation from the planned program. Any related external technology programs (planned or on-going) should be identified, their potential contribution to the program described, and their funding prospects and potential for success assessed. This section should identify these risks for each acquisition phase (DEM/VAL, EMD, production and deployment, and O&S). (Phase terminology changed in DoD 5000.2-R, 5 April 2002.)

Chapter 2: (Presentation of Cost Analysis to OSD CAIG)

Para C2.2.9. Sensitivity Analysis

The sensitivity of projected costs to critical program assumptions shall be examined. Aspects of the program to be subjected to sensitivity analysis shall be identified in the DoD CCA of program assumptions. The analysis shall include factors such as learning curve assumptions; technical risk, i.e., the risk of more development and/or production effort, changes in performance characteristics, schedule alterations, and variations in testing requirements; and acquisition strategy (multiyear procurement, dual sourcing, etc.).

Para C2.3.3 PM Presentation

The Program Manager's designated representative shall present the CAIG with the POE for each alternative under consideration and explain how each is derived. This presentation shall cover the estimates and estimating procedures at the major subcomponent level (e.g., airframe, engine, major avionics subsystem, etc.). The presentation should focus on the items that are cost drivers and/or elements of high cost risk.


APPENDIX B

GENERIC RISK MANAGEMENT PLAN

SAMPLE RISK MANAGEMENT PLAN

PREFACE

DoDI 5000.2 requires that “PMs and other acquisition managers shall continually assess program risks.” Further, DoD 5000.2-R states that for ACAT I Programs, “The PM shall identify the risk areas of the program and integrate risk management within overall program management.” Although the need for a risk management program and a risk management process are addressed throughout this regulation, there is no requirement for a formal Risk Management Plan (RMP). However, Program Managers (PMs) have found such a plan necessary to focus properly on the assessment and handling of program risk, a core acquisition management issue that Milestone Decision Authorities (MDAs) must rigorously address at appropriate milestones before making program decisions.

Attached is a sample format for a RMP that is a compilation of several good risk plans and the results of the DoD Risk Management Working Group Study. It represents the types of information and considerations that a plan, tailored to a specific program, might contain. There are also two examples of Risk Management Plans—one for an ACAT I or II Program, the other for an ACAT III or IV Program. The Defense Acquisition Deskbook, Section 2.5.2, has general guidance and advice in all areas of risk management. Section 2.5.2.4 of the Defense Acquisition Deskbook contains information concerning the development of a risk management plan. The information in this Guide is consistent with, and in most cases identical to, the Defense Acquisition Deskbook.

There is a danger in providing a sample document. First of all, because it is written as a guide for a general audience, it does not satisfy all of the needs of any particular program. Second, there is the possibility that some prospective user will simply adopt the plan as written, despite the fact that it does not fit his or her program. We discourage this.

The reason for providing this sample format is to give PMs and their staffs a starting point for their own planning process. It should stimulate thought about what has to be done and give some ideas on how to begin writing a plan. The sample plan contains more information than most program offices should need. Few PMs have the resources for a dedicated risk management effort as depicted in the plan. The key to using the sample plan is to keep things simple and tailor the plan to suit your needs, focusing on the management of risk in the key critical areas of your program.

The following text reflects the outline of a risk management plan found in the Defense Acquisition Deskbook section 2.5.2.4, Figure 2.5.2.4-2.

SAMPLE FORMAT FOR RISK MANAGEMENT PLAN

Introduction. This section should address the purpose and objective of the plan, and provide a brief summary of the program, to include the approach being used to manage the program, and the acquisition strategy.

Program Summary. This section contains a brief description of the program, including the acquisition strategy and the program management approach. The acquisition strategy should address its linkage to the risk management strategy.

Definitions. Definitions used by the program office should be consistent with DoD definitions for ease of understanding and consistency. However, the DoD definitions allow program managers flexibility in constructing their risk management programs. Therefore, each program's risk management plan may include definitions that expand the DoD definitions to fit its particular needs. For example, each plan should include, among other things, definitions for the ratings used for technical, schedule and cost risk. (Discussion of risk rating is contained in the Defense Acquisition Deskbook Section 2.5.2.1.)

Risk Management Strategy and Approach. Provide an overview of the risk management approach, to include the status of the risk management effort to date, and a description of the program risk management strategy. See the Defense Acquisition Deskbook Sections 2.5.2.1 and 2.5.2.3.

Organization. Describe the risk management organization of the program office and list the responsibilities of each of the risk management participants. See the Defense Acquisition Deskbook Section 2.5.2.3.

Risk Management Process and Procedures. Describe the program risk management process to be employed; i.e., risk planning, assessment, handling, monitoring and documentation, and a basic explanation of these components. See the Defense Acquisition Deskbook Section 2.5.2.1. Also provide application guidance for each of the risk management functions in the process. If possible, the guidance should be as general as possible to allow the program's risk management organization (e.g., IPTs) flexibility in managing the program risk, yet specific enough to ensure a common and coordinated approach to risk management. It should address how the information associated with each element of the risk management process will be documented and made available to all participants in the process, and how risks will be tracked, to include the identification of specific metrics if possible.

Risk Planning. This section describes the risk planning process and provides guidance on how it will be accomplished, and the relationship between continuous risk planning and this RMP. Guidance on updates of the RMP and the approval process to be followed should also be included. See Section 2.5.2.1 of the Defense Acquisition Deskbook for information on risk planning.

Risk Assessment. This section of the plan describes the assessment process and procedures for examining the critical risk areas and processes to identify and document the associated risks. It also summarizes the analysis process for each of the risk areas leading to the determination of a risk rating. This rating is a reflection of the potential impact of the risk in terms of its variance from known Best Practices or probability of occurrence, its consequence/impact, and its relationship to other risk areas or processes. This section may include:

• Overview and scope of the assessment process;

• Sources of information;

• Information to be reported and formats;

• Description of how risk information is documented; and

• Assessment techniques and tools (see Section 2.5.2.4 of the Defense Acquisition Deskbook).

Risk Handling. This section describes the procedures that can be used to determine and evaluate various risk-handling options, and identifies tools that can assist in implementing the risk-handling process. It also provides guidance on the use of the various handling options for specific risks.

Risk Monitoring. This section describes the process and procedures that will be followed to monitor the status of the various risk events identified. It should provide criteria for the selection of risks to be reported on, and the frequency of reporting. Guidance on the selection of metrics should also be included.

Risk Management Information System, Documentation and Reports. This section describes the MIS structure, rules, and procedures that will be used to document the results of the risk management process. It also identifies the risk management documentation and reports that will be prepared; specifies the format and frequency of the reports; and assigns responsibility for their preparation.


SAMPLE RISK MANAGEMENT PLAN FOR THE XYZ PROGRAM (ACAT I, II)

1.0 INTRODUCTION

1.1 PURPOSE

This Risk Management Plan (RMP) presents the process for implementing proactive risk management as part of the overall management of the XYZ program. Risk management is a program management tool to assess and mitigate events that might adversely impact the program. Therefore, risk management increases the probability/likelihood of program success. This RMP will:

• Serve as a basis for identifying alternatives to achieve cost, schedule, and performance goals,

• Assist in making decisions on budget and funding priorities,

• Provide risk information for Milestone decisions, and

• Allow monitoring the health of the program as it proceeds.

The RMP describes methods for identifying, analyzing, prioritizing, and tracking risk drivers; developing risk-handling plans; and planning for adequate resources to handle risk. It assigns specific responsibilities for the management of risk and prescribes the documenting, monitoring, and reporting processes to be followed.

This is the second edition of the Risk Management Plan for the XYZ program. The initial plan concentrated on the tasks and activities of the Concept and Technology Development (CTD) Phase leading to Milestone B; this plan concentrates on the tasks and activities of the System Integration part of the System Development and Demonstration (SDD) Phase. Subsequent updates to this RMP will shift focus to the later acquisition phases. There are changes in every area of the plan; they include refinement of the risk identification process. The PMO Risk Management Coordinator has been identified and training of IPT members has commenced.

1.2 PROGRAM SUMMARY

The XYZ program was initiated in response to Mission Need Statement (MNS) XXX, dated DD-MM-YYYY and Operational Requirements Document (ORD), dated DD-MM-YYYY. It is required to support the fundamental objective of U.S. defense policy as stated in Defense Planning Guidance (DPG) and the National Military Strategy. The XYZ system is based on the need for an integrated combat system to link battlefield decision makers. The XYZ mission areas are: (Delineate applicable areas).

The XYZ program will develop and procure 120 advanced platforms to replace the aging ABC platforms currently in the inventory. In order to meet force structure objectives, the XYZ system must reach Initial Operational Capability (IOC) (four platforms) by FY-07. The program is commencing an eight-year EMD phase that will be followed by a five-year procurement phase. The objectives of the EMD phase are to (discuss the specific objectives of this phase). The program has Congressional interest and is restricted to a research and development funding ceiling of $300 million.


1.2.1 System Description

The XYZ will be an affordable, yet capable, platform taking advantage of technological simplification and advancements. The XYZ integrated Combat System includes all non-propulsion electronics and weapons. Subsystems provide capabilities in combat control, electronic warfare support measures (ESM), defensive warfare, navigation, radar, interior communications, monitoring, data transfer, tactical support device, exterior communications, and Identification Friend or Foe (IFF). Weapons systems are to be provided by the program offices that are responsible for their development. The Mechanical and Electrical (M&E) system comprises.... The Combat System, M&E systems, and subsystems provide the XYZ system with the capability and connectivity to accomplish the broad range of missions defined in the MNS and ORD.

1.2.2 Acquisition Strategy

The XYZ program initial strategy is to contract with one prime contractor in the System Integration part of the System Development and Demonstration Phase for development of two prototype systems for test and design validation. Due to the technical complexity of achieving the performance levels of the power generation systems, the prime will use two subcontractors for the engine development and down select to one producer prior to low rate initial production, which is scheduled for FY-04. Various organizations, such as the Government Research Laboratory, will be funded to provide experts for assessment of specific areas of risk. The program has exit criteria, included in the list of Critical Program Attributes in Annex A, that must be met before progressing to the next phase.

1.2.3 Program Management Approach

The XYZ program is managed using the IPPD concept, with program integrated product teams (PIPTs) established largely along the hierarchy of the product work breakdown structure (WBS). There are also cost-performance and test Working IPTs (WIPTs) established for vertical coordination up the chain of command. The PM chairs a program level IPT (PLIPT) that addresses issues that are not resolved at the WIPT or PIPT level.

1.3 DEFINITIONS

1.3.1 Risk

Risk is a measure of the inability to achieve overall program objectives within defined cost, schedule, and technical constraints and has two components: (1) the probability of failing to achieve a particular outcome and (2) the consequences/impacts of failing to achieve that outcome. For processes, risk is a measure of the difference between actual performance of a process and the known best practice for performing that process.

1.3.2 Risk Event

Risk events are those events within the XYZ program that, if they go wrong, could result in problems in the development, production, and fielding of the system. Risk events should be defined to a level such that the risk and causes are understandable and can be accurately assessed in terms of probability/likelihood and consequence/impact to establish the level of risk. For processes, risk events are assessed in terms of process variance from known best practices and potential consequences/impacts of the variance.


1.3.3 Technical Risk

This is the risk associated with the evolution of the design and the production of the XYZ system affecting the level of performance necessary to meet the operational requirements. The contractor's and subcontractors' design, test, and production processes (process risk) influence the technical risk and the nature of the product as depicted in the various levels of the Work Breakdown Structure (product risk).

1.3.4 Cost Risk

This is the risk associated with the ability of the program to achieve its life-cycle cost objectives. Two risk areas bearing on cost are (1) the risk that the cost estimates and objectives are accurate and reasonable and (2) the risk that program execution will not meet the cost objectives as a result of a failure to handle cost, schedule, and performance risks.

1.3.5 Schedule Risk

These risks are those associated with the adequacy of the time estimated and allocated for the development, production, and fielding of the system. Two risk areas bearing on schedule risk are (1) the risk that the schedule estimates and objectives are realistic and reasonable and (2) the risk that program execution will fall short of the schedule objectives as a result of failure to handle cost, schedule, or performance risks.

1.3.6 Risk Ratings

This is the value that is given to a risk event (or the program overall) based on the analysis of the probability/likelihood and consequences/impacts of the event. For the XYZ program, risk ratings of Low, Moderate, or High will be assigned based on the following criteria. See Section 3.3.2 of this appendix for guidance on determining probability/likelihood and consequences/impacts. When rating process variance from best practices, there is no rating of probability/likelihood; rather, the level would be a measure of the variance from best practices (see Paragraph 3.3.2.3).

• Low Risk: Has little or no potential for increase in cost, disruption of schedule, or degradation of performance. Actions within the scope of the planned program and normal management attention should result in controlling acceptable risk.

• Moderate Risk: May cause some increase in cost, disruption of schedule, or degradation of performance. Special action and management attention may be required to handle risk.

• High Risk: Likely to cause significant increase in cost, disruption of schedule, or degradation of performance. Significant additional action and high priority management attention will be required to handle risk.
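The numeric criteria behind these ratings are defined in Section 3.3.2 of the plan, which is not reproduced in this extract, so the Python sketch below is only an illustration: it assumes 1-to-5 ordinal scores for probability/likelihood and consequence/impact combined multiplicatively, with thresholds that are assumptions rather than the XYZ plan's actual criteria.

def risk_rating(likelihood, consequence):
    # Illustrative only: 1-5 ordinal scales and assumed thresholds,
    # not the criteria defined in Section 3.3.2 of the XYZ plan.
    score = likelihood * consequence
    if score >= 15:
        return "High"
    if score >= 6:
        return "Moderate"
    return "Low"

print(risk_rating(4, 4))  # score 16: "High" under these assumed thresholds
print(risk_rating(2, 2))  # score 4: "Low"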

1.3.7 Independent Risk Assessor

An independent risk assessor is a person who is not in the management chain or directly involved in performing the tasks being assessed. Use of independent risk assessors is a valid technique to ensure that all risk areas are identified and that the consequence/impact and probability/likelihood (or process variance) are properly understood. The technique can be used at different program levels, e.g., Program Office, Service Field Activities, Contractors, etc. The Program Manager will approve the use of independent assessors, as needed.

1.3.8 Templates and Best Practices

A “template” is a disciplined approach for the application of critical engineering and manufacturing processes that are essential to the success of most programs. DoD 4245.7-M, Transition from Development to Production: Solving the Risk Equation, provides a number of such templates. For each template process described in DoD 4245.7-M, Best Practice Information is described in NAVSO P-6071. These documents outline the ideal or low risk approach and thus serve as a baseline from which risk for some XYZ processes can be assessed.

1.3.9 Metrics

These are measures used to indicate progress or achievement.

1.3.10 Critical Program Attributes

Critical Program Attributes are performance, cost, and schedule properties or values that are vital to the success of the program. They are derived from various sources, such as the Acquisition Program Baseline, exit criteria for the next program phase, Key Performance Parameters, test plans, the judgment of program experts, etc. The XYZ program will track these attributes to determine the progress in achieving the final required value. See Annex A for a list of the XYZ Critical Program Attributes.
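Because each attribute pairs a current value with a final required value, tracking reduces to a simple comparison. A minimal Python sketch with hypothetical attributes and values; the real list lives in Annex A, which is not reproduced here.

from dataclasses import dataclass

@dataclass
class CriticalProgramAttribute:
    # Hypothetical record; the authoritative attribute list is Annex A.
    name: str
    required: float            # final required value
    current: float             # latest demonstrated or estimated value
    higher_is_better: bool = True

    def met(self):
        # Progress test toward the final required value.
        if self.higher_is_better:
            return self.current >= self.required
        return self.current <= self.required

# Hypothetical examples, not the XYZ program's actual attributes.
attributes = [
    CriticalProgramAttribute("detection range (nm)", required=50.0, current=52.0),
    CriticalProgramAttribute("unit cost ($M)", required=2.5, current=2.8, higher_is_better=False),
]
for a in attributes:
    print(a.name, "met" if a.met() else "not yet met")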

2.0 RISK MANAGEMENT APPROACH

2.1 GENERAL APPROACH AND STATUS

DoDI 5000.2 states: “Risks must be well understood, and risk management approaches developed, before decision authorities can authorize a program to proceed into the next phase of the acquisition process.” This policy is implemented in DoD Regulation 5000.2-R, with more detailed guidance provided in the individual Service regulation. The Defense Acquisition Deskbook (Section 2.5.2) provides additional guidance, advice, and wisdom on the management of risk. Figure B-1 shows how the XYZ program risk management fits into the phases and milestones of the acquisition process.

Figure B-1. Risk Management and the Acquisition Process (the figure pairs each acquisition phase and milestone with a risk management cycle: plans (program plans, exit criteria), assessments of cost, schedule, and performance, and current status (baseline and execution status))

The XYZ program will use a centrally developed risk management strategy throughout the acquisition process and decentralized risk planning, assessment, handling, and monitoring. XYZ risk management is applicable to all acquisition functional areas.

The results of the Concept Exploration Phase of the program identified potential risk events and the Acquisition Strategy reflects the program's risk-handling approach. Overall, the risk of the XYZ program for Milestone B was assessed as moderate, but acceptable. Moderate risk functional areas were threat, manufacturing, cost, funding, and schedule. The remaining functional areas of technology, design and engineering (hardware and software), support, (schedule) concurrency, human systems integration, and environmental impact were assessed as low risk.

2.2 RISK MANAGEMENT STRATEGY

The basic risk management strategy is intended to identify critical areas and risk events, both technical and non-technical, and take necessary action to handle them before they can become problems, causing serious cost, schedule, or performance impacts. This program will make extensive use of modeling and simulation, technology demonstrations, and prototype testing in handling risk.

Risk management will be accomplished using the integrated Government-Contractor IPT organization. These IPTs will use a structured assessment approach to identify and analyze those processes and products that are critical to meeting the program objectives. They will then develop risk-handling options to mitigate the risks and monitor the effectiveness of the selected handling options. Key to the success of the risk management effort is the identification of the resources required to implement the developed risk-handling options.

Risk information will be captured by the IPTs in a risk management information system (RMIS) using a standard Risk Information Form (RIF). The RMIS will provide standard reports, and is capable of preparing ad hoc tailored reports. See Annex B for a description of the RMIS and RIF.
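Annex B, which is not reproduced in this Guide, defines the actual RIF and RMIS, so the Python record below only suggests the kind of fields such a form typically carries; every field name is an assumption.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskInformationForm:
    # Illustrative RIF record; the authoritative layout is in Annex B.
    risk_id: str
    risk_event: str
    originating_ipt: str
    likelihood: int            # e.g., a 1-5 ordinal score
    consequence: int           # e.g., a 1-5 ordinal score
    rating: str                # Low, Moderate, or High
    handling_option: str       # e.g., avoid, control, accept, or transfer
    status: str = "open"
    date_submitted: date = field(default_factory=date.today)

Under this kind of structure, a standard RMIS report is simply a query over the stored records, for example all open RIFs rated High, grouped by originating IPT.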

Risk information will be included in all program reviews, and as new information becomes available, the PMO and contractor will conduct additional reviews to ascertain if new risks exist. The goal is to be continuously looking to the future for areas that may severely impact the program.

Figure B-1. Risk Management and the Acquisition Process

(Figure B-1 depicts risk management within the overall acquisition process: in each phase, risk management draws on the plans (program plans and exit criteria) and the current status (the cost, schedule, and performance baseline and execution status) to assess cost, schedule, and performance risk, and the results carry forward across each milestone into the next phase against a refined baseline.)

2.3 ORGANIZATION

The risk organization for the XYZ program is shown in Figure B-2. This is not a separate organization; rather, the figure shows how risk management is integrated into the program's existing organization and shows the risk relationships among members of the program team.

Figure B-2. XYZ Risk Management Organization

(Figure B-2 shows the PM, supported by the Risk Management Coordinator, over the Program Level IPT (PLIPT), the sub-tier Program IPTs (PIPTs), and the PMO Functional Offices, with as-needed coordination with the Independent Risk Assessors, Functional Support Offices, the Prime Contractor, and Support Contractors; dotted lines denote support provided by non-PMO organizations.)

2.3.1 Risk Management Coordinator

The Risk Management Coordinator, the XYZ Technology Assessment and R&D Manager, is the overall coordinator of the Risk Management Program. The Risk Management Coordinator is responsible for:

• Maintaining this Risk Management Plan;

• Maintaining the Risk Management Database;

• Briefing the PM on the status of XYZ program risk;

• Tracking efforts to reduce moderate and high risk to acceptable levels;

• Providing risk management training;

• Facilitating risk assessments; and

• Preparing risk briefings, reports, and documents required for Program Reviews and the acquisition Milestone decision processes.

2.3.2 Program Level Integrated Product Team (PLIPT)

The PLIPT is responsible for complying with the DoD risk management policy and for structuring an efficient and useful XYZ risk management approach. The Program Manager is the Chair of the PLIPT. The PLIPT membership may be adjusted but is initially established as the chairs of the Program IPTs, designated sub-tier IPTs, and the Heads of PMO Functional Offices.

2.3.3 PIPTs

The Program IPTs are responsible for imple-menting risk management tasks per this plan.This includes the following responsibilities:

• Review and recommend to the Risk Management Coordinator changes to the overall risk management approach based on lessons learned.

• Quarterly, or as directed, update the program risk assessments made during the System Integration (SI) part of the System Development and Demonstration (SDD) Phase.


• Review and be prepared to justify the risk assessments made and the risk-handling plans proposed.

• Report risks to the Program Manager/Program Director, with information to the Risk Management Coordinator via Risk Information Forms (RIFs).

• Ensure that risk is a consideration at each Program and Design Review.

• Ensure Design/Build Team responsibilities incorporate appropriate risk management tasks.

2.3.4 XYZ Independent Risk Assessors

Independent Assessors made a significant contribution to the XYZ Milestone B risk assessments. The use of independent assessments as a means of ensuring that all risk areas are identified will continue, when necessary.

2.3.5 Other Risk Assessment Responsibilities

The risk assessment responsibilities of other Systems Command codes, Service Field Activities, Design/Build Teams, and contractors will be as described in Memoranda of Agreement (MOAs), Memoranda of Understanding (MOUs), Systems Command tasking, or contracts. This RMP should be used as a guide for XYZ risk management efforts.

2.3.6 User Participation

The Requirements Organization (specific code) is the focal point for providing the Program Executive Officer or the Project Manager with user-identified risk assessments.

2.3.7 Risk Training

The key to the success of the risk effort is the degree to which all members of the team, both Government and contractor, are properly trained. The XYZ Program Office will provide risk training, or assign members to training classes, during the SDD Phase. Key personnel with XYZ risk management or assessment responsibilities are required to attend. All members of the team will receive, at a minimum, basic risk management training. XYZ-sponsored training is planned to be presented according to the schedule provided in Annex X (not provided).

3.0 RISK MANAGEMENT PROCESS AND PROCEDURES

3.1 OVERVIEW

This section describes the XYZ program's risk management process and provides an overview of the XYZ risk management approach. The Defense Acquisition Deskbook defines risk management as "the act or practice of controlling risk. It includes risk planning, assessing risk areas, developing risk-handling options, monitoring risks to determine how risks have changed, and documenting the overall risk management program." Figure B-3 shows, in general terms, the overall risk management process that will be followed in the XYZ program. This process follows DoD and Service policies and guidelines and incorporates ideas found in other sources. Each of the risk management functions shown in Figure B-3 is discussed in the following paragraphs, along with specific procedures for executing them.

3.2 RISK PLANNING

3.2.1 Process

Risk planning consists of the up-front activities necessary to execute a successful risk management program. It is an integral part of normal program planning and management. The planning should address each of the other risk management functions, resulting in an organized and thorough approach to assess, handle, and monitor risks. It should also assign responsibilities for specific risk management actions and establish risk reporting and documentation requirements. This RMP serves as the basis for all detailed risk planning, which must be continuous.

3.2.2 Procedures

3.2.2.1 Responsibilities. Each IPT is responsible for conducting risk planning, using this RMP as the basis. The planning will cover all aspects of risk management, to include assessment, handling options, and monitoring of risk-handling activities. The Program Risk Management Coordinator will monitor the planning activities of the IPTs to ensure that they are consistent with this RMP and that appropriate revisions to this plan are made when required to reflect significant changes resulting from the IPT planning efforts.

Each person involved in the design, production, operation, support, and eventual disposal of the XYZ system or any of its systems or components is a part of the risk management process. This involvement is continuous and should be considered a part of the normal management process.

3.2.2.2 Resources and Training. An effective risk management program requires resources. As part of its planning process, each IPT will identify the resources required to implement the risk management actions. These resources include time, material, personnel, and cost. Training is a major consideration. All IPT members should receive instruction on the fundamentals of risk management and special training in their area of responsibility, if necessary.

3.2.2.3 Documentation and Reporting. This RMP establishes the basic documentation and reporting requirements for the program. IPTs should identify any additional requirements that might be needed to effectively manage risk at their level. Any such additional requirements must not conflict with the basic requirements in this RMP.

Figure B-3. Risk Management Structure (also referred to as the Risk Management Process Model)

(Figure B-3 shows the four risk management functions operating continuously throughout the execution phase, all feeding common documentation: Risk Planning (requirements, responsibilities, definitions, resources, procedures); Risk Assessment (risk events, analysis, update assessments, document findings); Risk Handling (mitigation tasks, metrics, report); and Risk Monitoring (metrics, track status, report).)


3.2.2.4 Metrics. Each IPT should establish metrics that will measure the effectiveness of its planned risk-handling options. See Annex C for examples of metrics that may be used.

3.2.2.5 Risk Planning Tools. The following tools can be useful in risk planning. It may be useful to provide this information to the contractors to help them understand the XYZ program's approach to managing risk. This list is not meant to be exhaustive.

• DoD Manual 4245.7-M, a DoD guide for assessing process technical risk.

• The Navy's Best Practices Manual, NAVSO P-6071, provides additional insight into each of the Templates in DoD 4245.7-M and a checklist for each template.

• Program Manager's Work Station (PMWS) software may be useful to some risk assessors. PMWS has a Risk Assessment module based on the Template Manual and the Best Practices Manual.

• Commercial and Government-developed risk management software.

The latter includes Government software such as Risk Matrix, developed by Mitre Corporation for the Air Force, and the New Attack Submarine Program's On-Line Risk Data Base (OLRDB).

3.2.2.6 Plan Update. This RMP will be updated, if necessary, on the following occasions: (1) whenever the acquisition strategy changes, or there is a major change in program emphasis; (2) in preparation for major decision points; (3) in preparation for and immediately following technical audits and reviews; (4) concurrent with the review and update of other program plans; and (5) in preparation for a POM submission.

3.3 RISK ASSESSMENT

The risk assessment process includes the identification of critical risk events/processes that could have an adverse impact on the program, and the analysis of these events/processes to determine the probability/likelihood of occurrence/process variance and the consequences/impacts. It is the most demanding and time-consuming activity in the risk management process.

3.3.1 Process

3.3.1.1 Identification. Risk identification is the first step in the assessment process. The basic process involves searching through the entire XYZ program to determine the critical events that would prevent the program from achieving its objectives. All identified risks will be documented in the RMIS, with a statement of the risk and a description of the conditions or situations causing concern and the context of the risk.

Risks will be identified by all Program IPTs and by any individual in the program. The lower-level IPTs can identify significant concerns earlier than otherwise might be the case and can identify those events in critical areas that must be dealt with to avoid adverse consequences/impacts. Likewise, individuals involved in the detailed, day-to-day technical, cost, and scheduling aspects of the program are most aware of the potential problems (risks) that need to be managed.

3.3.1.2 Analysis. This process involves:

• Identification of WBS elements

• Evaluation of the WBS elements using the risk areas to determine risk events


• Assignment of probability/likelihood and consequence/impact to each risk event to establish a risk rating

• Prioritization of each risk event relative to other risks.

Risk analysis should be supported by a study, test results, modeling and simulation, a trade study, the opinion of a qualified expert (to include justification of his or her judgment), or any other accepted analysis technique. The Defense Acquisition Deskbook, Section 2524.2, describes a number of analysis techniques that may be useful. Evaluators should identify all assumptions made in assessing risk. When appropriate, a sensitivity analysis should be done on assumptions.

Systems engineering analysis, risk assessments, and manpower risk assessments provide additional information that must be considered. This includes, among other things, environmental impact, system safety and health analysis, and security considerations. Classified programs may experience difficulties in access, facilities, and visitor control that can introduce risk and must be considered.

The analysis of an individual risk will be the responsibility of the IPT identifying the risk, or the IPT to which the risk has been assigned. They may use external resources for assistance, such as field activities, Service laboratories, and contractors. The results of the analysis of all identified risks must be documented in the RMIS.

3.3.2 Procedures

3.3.2.1 Assessments – General. Risk assessment is an iterative process, with each assessment building on the results of previous assessments. The current baseline assessment is a combination of the risk assessment delivered by the contractors as part of the Concept and Technology Development (CTD) Phase, the program office risk assessment done before Milestone B, and the post-award Integrated Baseline Review (IBR) performed in the SI part of SDD.

For the program office, unless otherwise directed in individual tasking, program-level risk assessments will be presented at each Program Review meeting, with a final update not later than 6 months before the next scheduled Milestone decision. The primary sources of information for the next assessment will be the current assessment baseline and existing documentation, such as: the Concept and Technology Development (CTD) Phase study results; the design mission profile developed as part of the CTD Phase; the IBR, which will be conducted immediately after the System Integration (SI) part of the System Development and Demonstration (SDD) Phase contract award; the contract WBS that is part of the IBR; industry best practices as described in the PMWS Knowledge Base; the ORD; the Acquisition Program Baseline (APB); and any contractor design documents.

IPTs should continually assess the risks in their areas, reviewing risk-handling actions and the critical risk areas whenever necessary to assess progress. For contractors, risk assessment updates should be made as necessary.

The risk assessment process is intended to be flexible enough that field activities, Service laboratories, and contractors may use their judgment in structuring the procedures considered most successful in identifying and analyzing all risk areas.

3.3.2.2 Identification. The following is a description of step-by-step procedures that evaluators may use as a guide to identify program risks.

• Step One – Understand the requirements and the program performance goals, which are defined as thresholds and objectives (see 5000.2-R). Describe the operational (functional and environmental) conditions under which the values must be achieved by referring or relating to design documents. The ORD and APB contain the Key Performance Parameters (KPPs).

• Step Two – Determine the engineering and manufacturing processes that are needed to design, develop, produce, and support the system. Obtain industry best practices for these processes.

• Step Three – Identify contract WBS elements (to include products and processes).

• Step Four – Evaluate each WBS element against the sources/areas of risk described in Table 4-2 of the DSMC Risk Management Guide, plus other sources/areas as appropriate.

• Step Five – Assign a probability and consequence/impact to each risk event.

• Step Six – Prioritize the risk events.

The following are indicators that IPTs may find helpful in identifying and assessing risk:

• Lack of Stability, Clarity, or Understanding of Requirements: Requirements drive the design of the system. Changing or poorly stated requirements guarantee the introduction of performance, cost, and schedule problems.

• Failure to Use Best Practices virtually assures that the program will experience some risk. The further a contractor deviates from best practices, the higher the risk.

• New Processes should always be suspect, whether they are related to design, analysis, or production. Until they are validated, and until the people who implement them have been trained and have experience in successfully using the process, there is risk.

• Any Process Lacking Rigor should also be suspect; it is inherently risky. To have rigor, a process should be mature and documented, it should have been validated, and it should be strictly followed.

• Insufficient Resources: People, funds, schedule, and tools are necessary ingredients for successfully implementing a process. If any are inadequate, to include the qualifications of the people, there is risk.

• Test Failure may indicate that corrective action is necessary. Some corrective actions may not fit available resources or the schedule, and (for other reasons as well) may contain risk.

• Qualified Supplier Availability: A supplier not experienced with the processes for designing and producing a specific product is not a qualified supplier and is a source of risk.

• Negative Trends or Forecasts are cause for concern (risk) and may require specific actions to turn around.

There are a number of techniques and tools available for identifying risks. Among them are:

• Best Judgment: The knowledge and experience of the collective, multi-disciplined Integrated Product Team (IPT) members and the opinions of subject-matter experts (SMEs) are the most common sources of risk identification.

• Lessons Learned from similar processes can serve as a baseline for the successful way to achieve requirements. If there is a departure from the successful way, there may be risk.

• DoD 4245.7-M, Transition from Development to Production, is often called the "Templates" book because it identifies technical risk areas and provides, in "bullet" form, suggestions for avoiding those risks. It focuses on the technical details of product design, test, and production to help managers proactively manage risk. It also includes chapters on facilities, logistics, and management, which make it a useful tool for identifying weak areas of XYZ planned processes early enough to implement the actions needed to avoid adverse consequences/impacts. A copy of this manual is available at http://www.dtic.mil/whs/directives.

• The NAVSO P-6071 Best Practices Manual was developed by the Navy to add depth to the Templates book, DoD 4245.7-M.

• Critical Program Attributes are metrics that the program office developed to measure progress toward meeting our objectives. Team members, IPTs, functional managers, contractors, etc., may develop their own metrics to support these measurements. The attributes may be specification requirements, contract requirements, or measurable parameters from any agreement or tasking. The idea is to provide a means to measure whether we are on track in achieving our objectives.

• Methods and Metrics for Product Success is a manual published by the Office of the Assistant Secretary of the Navy (RDA) Product Integrity Directorate. It highlights areas related to design, test, and production processes where problems are most often found, along with metrics for measuring the effectiveness of those processes. It also describes the software tool, Program Manager's Work Station (PMWS). (See the next item.)

• PMWS contains the risk management software "Technical Risk Identification and Mitigation System (TRIMS) and Knowledgebase," which provides a tailorable management system based on NAVSO P-6071 and DoD 4245.7-M. PMWS is delivered on a compact disk (CD) that contains the necessary programs for assessing a program's risk and software for program management. PMWS can be obtained by calling the Best Manufacturing Program (BMP) Office at (301) 403-8100.

• The New Attack Submarine (NSSN) On-Line Risk Database (OLRDB) is a software tool that may be used to support the XYZ Risk Management Process. The tool helps IPTs in the identification and assessment of risk and in the management of handling efforts.

• Risk Matrix is another candidate for use by the PMO. It is an automated tool, developed by Mitre Corporation, that supports a structured approach for identifying risk and assessing its potential program impact. It is especially helpful for prioritizing risks.

• Requirements Documents describe the output of our efforts. IPT efforts need to be monitored continuously to ensure requirements are met on time and within budget. When they are not, there is risk.

• Contracting for Risk Management helps ensure the people involved with the details of the technical processes of design, test, and production are involved with managing risk. The principle here is that those performing the technical details are normally the first to know when risks exist.

• Quality Standards, such as ISO 9000, ANSI/ASQC Q9000, MIL-HDBK 9000, and others, describe processes for developing and producing quality products. Comparing our processes with these standards can highlight areas we may want to change to avoid risk.

• Use of Independent Risk Assessors is a method to help ensure all risk is identified. These knowledgeable, experienced people are independent of the management and execution of the processes and procedures being reviewed. Independent assessment promotes questions and observations not otherwise achievable.

3.3.2.3 Analysis. Risk analysis is an evaluation of the identified risk events to determine possible outcomes, critical process variance from known best practices, the probability/likelihood of those events occurring, and the consequences/impacts of the outcomes. Once this information has been determined, the risk event may be rated against the program's criteria and an overall assessment of low, moderate, or high assigned. Figure B-4 depicts the risk analysis process and procedures.

Critical Process Variance. For each process-related risk event identified, the variance of the process from known standards or best practices must be determined. As shown in Figure B-4, there are five levels (a-e) in the XYZ risk assessment process, with the corresponding criteria of Minimal, Small, Acceptable, Large, and Significant. If there is no variance, there is no risk.

Probability/Likelihood. For each risk area identified, the probability/likelihood that the risk will happen must be determined. As shown in Figure B-4, there are five levels (a-e) in the XYZ risk assessment process, with the corresponding subjective criteria of Remote, Unlikely, Likely, Highly Likely, and Near Certainty. If there is zero probability/likelihood of an event, there is no risk per our definition.

Consequence/Impact. For each risk area identified, the following question must be answered: Given that the event occurs, what is the magnitude of the consequence/impact? As shown in the figure, there are five levels of consequence/impact (a-e). Consequence/impact is a multifaceted issue. For this program, there are four areas that we will evaluate when determining consequence/impact: technical performance, schedule, cost, and impact on other teams. At least one of the four consequence/impact areas needs to apply for there to be risk; if there is no adverse consequence/impact in any of the areas, there is no risk.

• Technical Performance: This category includes all requirements that are not included in the other three metrics of the Consequence/Impact table. The wording of each level is oriented toward design processes, production processes, life-cycle support, and retirement of the system. For example, the word "margin" could apply to weight margin during design, safety margin during testing, or machine performance margin during production.

• Schedule: The words used in the Schedule column, as in all columns of the Consequence/Impact table, are meant to be universally applied. Avoid excluding a consequence/impact level from consideration just because it doesn't match your team's specific definitions. In other words, phrases such as need dates, key milestones, critical path, and key team milestones are meant to apply to all IPTs.

• Cost: Since costs vary from component to component and process to process, the percentage criteria shown in the figure may not strictly apply at the lower levels of the WBS. These team leaders can set the percentage criteria that best reflect their situation. However, when costs are rolled up at higher levels (e.g., Program), the following definitions will be used: Level 1 = no change; Level 2 = less than 5 percent; Level 3 = 5 to 7 percent; Level 4 = 7 to 10 percent; and Level 5 = greater than 10 percent.

• Impact on Other Teams: Both the consequence/impact of a risk and the mitigation actions associated with reducing the risk may impact another team. This may involve additional coordination or management attention (resources) and may therefore increase the level of risk. This is especially true of common technical processes.

Risk Rating. Probability/likelihood and consequence/impact should not always be considered equally; for example, there may be consequences/impacts so severe that the event is considered high risk even though the probability/likelihood of a particular outcome is low. After deciding on a level of process variance/probability/likelihood (a through e) and a level of consequence/impact (a through e, equivalently numbered 1 through 5), enter the Assessment Guide portion of Figure B-4 to obtain a risk rating (green = LOW, yellow = MODERATE, and red = HIGH). For example, level 2b corresponds to LOW risk, level 3d corresponds to MODERATE risk, and level 5c corresponds to HIGH risk. After obtaining the risk rating, make a subjective comparison of the risk event with the applicable rating definition in Figure B-4 (e.g., HIGH = unacceptable, major disruptions, etc.). There should be a close match. If there isn't, consider reevaluating the level of probability/likelihood or consequence/impact. Risk events that are assessed as moderate or high should be submitted to the XYZ Risk Management Coordinator on a RIF.

Figure B-4. Risk Assessment Process

Likelihood levels (What is the likelihood the risk event will happen?):

Level a – Remote
Level b – Unlikely
Level c – Likely
Level d – Highly likely
Level e – Near certainty

Consequence/impact levels (a level applies if it is met in technical performance and/or schedule and/or cost and/or impact on other teams):

Level | Technical Performance | Schedule | Cost | Impact on Other Teams
a | Minimal or no impact | Minimal or no impact | Minimal or no impact | None
b | Acceptable with some reduction in margin | Additional resources required; able to meet need dates | <5% | Some impact
c | Acceptable with significant reduction in margin | Minor slip in key milestones; not able to meet need date | 5-7% | Moderate impact
d | Acceptable; no remaining margin | Major slip in key milestone or critical path impacted | 7-10% | Major impact
e | Unacceptable | Can't achieve key team or major program milestone | >10% | Unacceptable

Assessment Guide (risk rating by likelihood row and consequence column; L = LOW, M = MODERATE, H = HIGH):

Likelihood e: L M H H H
Likelihood d: L M M H H
Likelihood c: L M M M H
Likelihood b: L L L M M
Likelihood a: L L L L M
Consequence:  a b c d e

Risk ratings:

HIGH (red) – Unacceptable. Major disruption likely. Different approach required. Priority management attention required.
MODERATE (yellow) – Some disruption. Different approach may be required. Additional management attention may be needed.
LOW (green) – Minimum impact. Minimum oversight needed to ensure risk remains low.

Note: Process variance refers to deviation from best practices; likelihood/probability refers to risk events.
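Where the lookup in Figure B-4 is automated (for example, in the RMIS), the Assessment Guide reduces to a small table. The following is a minimal sketch in Python, offered only as an illustration; the function and variable names are hypothetical and are not part of the XYZ RMIS or any DoD tool.

```python
# Illustrative encoding of the Figure B-4 Assessment Guide; names are
# hypothetical, not taken from any actual RMIS implementation.

# Rows are likelihood levels a-e; columns are consequence/impact levels
# a-e (equivalently 1-5). L = LOW, M = MODERATE, H = HIGH.
ASSESSMENT_GUIDE = {
    "a": "LLLLM",
    "b": "LLLMM",
    "c": "LMMMH",
    "d": "LMMHH",
    "e": "LMHHH",
}

RATINGS = {"L": "LOW", "M": "MODERATE", "H": "HIGH"}

def risk_rating(likelihood: str, consequence: str) -> str:
    """Return LOW/MODERATE/HIGH for likelihood and consequence levels a-e."""
    column = "abcde".index(consequence)
    return RATINGS[ASSESSMENT_GUIDE[likelihood][column]]

def cost_consequence_level(cost_growth_percent: float) -> str:
    """Map rolled-up (program-level) cost growth to a consequence level,
    per the Level 1-5 percentage criteria in Figure B-4 (the bands share
    endpoints, so 7 percent is treated here as Level 3)."""
    if cost_growth_percent <= 0:
        return "a"  # Level 1: no change
    if cost_growth_percent < 5:
        return "b"  # Level 2: less than 5 percent
    if cost_growth_percent <= 7:
        return "c"  # Level 3: 5-7 percent
    if cost_growth_percent <= 10:
        return "d"  # Level 4: 7-10 percent
    return "e"      # Level 5: greater than 10 percent

# The worked examples from the text: 2b is LOW, 3d is MODERATE, 5c is HIGH.
assert risk_rating("b", "b") == "LOW"
assert risk_rating("d", "c") == "MODERATE"
assert risk_rating("c", "e") == "HIGH"
```

As the matrix shows, a severe consequence can yield a HIGH rating even at a fairly modest likelihood (for example, likelihood c with consequence level 5).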


Figure B-4 is useful for conveying information to decision makers and will be used primarily for that purpose. The PMO will use the Risk Tracking Report and Watch List. (See Annex D.)

3.4 RISK HANDLING

3.4.1 Process

After the program's risks have been identified and assessed, the approach to handling each significant risk must be developed. There are essentially four techniques or options for handling risks: avoidance, control, transfer, and assumption. For all identified risks, the various handling techniques should be evaluated in terms of feasibility, expected effectiveness, cost and schedule implications, and the effect on the system's technical performance, and the most suitable technique selected. Section 2524.3 of the Defense Acquisition Deskbook contains information on the risk-handling techniques and the various actions that can be used to implement them. The results of the evaluation and selection will be documented in the RMIS using the RIF. This documentation will include: what must be done; the level of effort and materials required; the estimated cost to implement the plan; a proposed schedule showing the proposed start date, the time phasing of significant risk reduction activities, the completion date, and their relationship to significant program activities/milestones (an example is provided in Annex B); recommended metrics for tracking the action; a list of all assumptions; and the person responsible for implementing and tracking the selected option.

3.4.2 Procedures

The IPT that assessed the risk is responsible for evaluating and recommending to the PM the risk-handling options that are best suited to the program's circumstances. Once approved, these options are included in the program's acquisition strategy or management plans, as appropriate.

For each selected handling option, the responsible IPT will develop specific tasks that, when implemented, will handle the risk. The task descriptions should explain what has to be done, the level of effort, and the necessary resources. They should also provide a proposed schedule to accomplish the actions, including the start date, the time phasing of significant risk reduction activities, the completion date, and their relationship to significant program activities/milestones (an example is provided in Annex B), and a cost estimate. The description of the handling options should list all assumptions used in the development of the handling tasks. Assumptions should be included in the RIF. Recommended actions that require resources outside the scope of a contract or official tasking should be clearly identified, and the IPTs, the risk areas, or other handling plans that may be impacted should be listed.

Reducing requirements as a risk-avoidance technique will be used only as a last resort, and then only with the participation and approval of the user's representative.

The DoD 4245.7-M Templates and the NAVSO P-6071 Best Practices Manual are useful in developing risk-handling actions for design, test, or manufacturing process risks.


3.5 RISK MONITORING

3.5.1 Process

Risk monitoring systematically tracks and evaluates the performance of risk-handling actions. It is part of the PMO function and responsibility and will not become a separate discipline. Essentially, it compares the predicted results of planned actions with the results actually achieved to determine status and the need for any change in risk-handling actions. The effectiveness of the risk-monitoring process depends on the establishment of a management indicator system (metrics) that provides accurate, timely, and relevant risk information in a clear, easily understood manner. (See Annex D.) The metrics selected to monitor program status must adequately portray the true state of the risk events and handling actions. Otherwise, indicators of risks that are about to become problems will go undetected.

To ensure that significant risks are effectively monitored, risk-handling actions (which include specific events, schedules, and "success" criteria) will be reflected in integrated program planning and scheduling. Identifying these risk-handling actions and events in the context of Work Breakdown Structure (WBS) elements establishes a linkage between them and specific work packages, making it easier to determine the impact of actions on cost, schedule, and performance. The detailed information on risk-handling actions and events will be included in the RIF for each identified risk, and thus will be resident in the RMIS.

3.5.2 Procedures

The functioning of IPTs is crucial to effective risk monitoring. They are the "front line" for obtaining indications that risk-handling efforts are achieving their desired effects. Each IPT is responsible for monitoring and reporting the effectiveness of the handling actions for the risks assigned to it. Overall XYZ program risk assessment reports will be prepared by the XYZ Risk Management Coordinator working with the cognizant IPT.

Many techniques and tools are available for monitoring the effectiveness of risk-handling actions, and IPTs must ensure that they select those that best suit their needs. No single technique or tool is capable of providing a complete answer; a combination must be used. At a minimum, each IPT will maintain a watch list of identified high-priority risks. See Section 2524.4 of the Defense Acquisition Deskbook for information on specific techniques.

Risks rated as Moderate or High will be reported to the XYZ Risk Management Coordinator, who will also track them, using information provided by the appropriate IPT, until the risk is considered Low and recommended for "Close Out." The IPT that initially reported the risk retains ownership and cognizance for reporting status and keeping the database current. Ownership means implementing handling plans and providing periodic status of the risk and of the handling plans. Risk will be made an agenda item at each management or design review, providing an opportunity for all concerned to offer suggestions on the best approach to managing risk. Communicating risk increases the program's credibility and allows early actions to minimize adverse consequences/impacts.

The risk management process is continuous. Information obtained from the monitoring process is fed back for reassessment and for evaluation of the handling actions. When a risk is changed to Low, it is put into a "Historical File" by the Risk Management Coordinator and is no longer tracked by the XYZ PMO. The "owners" of all Low risk areas will continue monitoring them to ensure they stay Low.
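A minimal sketch of this watch list and close-out flow, assuming risks are held as simple records (the data shapes, names, and sample entries below are illustrative, not part of the XYZ RMIS):

```python
# Illustrative watch list / close-out flow; the records and field names
# are hypothetical examples, not actual XYZ program data.
risks = [
    {"id": "RIF-001", "event": "System weight growth", "rating": "HIGH"},
    {"id": "RIF-002", "event": "Late FMECA", "rating": "MODERATE"},
    {"id": "RIF-003", "event": "Training schedule slip", "rating": "LOW"},
]

# Moderate and High risks are reported to the Risk Management Coordinator
# and stay on the watch list until reassessed as Low.
watch_list = [r for r in risks if r["rating"] in ("MODERATE", "HIGH")]

# Risks reassessed as Low move to the Historical File; the PMO stops
# tracking them, while their owners continue to monitor them.
historical_file = [r for r in risks if r["rating"] == "LOW"]

print(len(watch_list), "risks on the watch list;",
      len(historical_file), "in the Historical File")
```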


The status of the risks and the effectiveness of the risk-handling actions will be reported to the Risk Management Coordinator:

• Quarterly;

• When the IPT determines that the status of the risk area has changed significantly (as a minimum, when the risk changes from high to moderate to low, or vice versa); and

• When requested by the Program Manager.

4.0 RISK MANAGEMENT INFORMATION SYSTEM AND DOCUMENTATION

The XYZ program will use the XXX database management system as its RMIS. The system will contain all of the information necessary to satisfy the program documentation and reporting requirements.

4.1 RISK MANAGEMENT INFORMATION SYSTEM (RMIS)

The RMIS stores and allows retrieval of risk-related data. It provides data for creating reports and serves as the repository for all current and historical information related to risk. This information will include risk assessment documents, contract deliverables (if appropriate), and any other risk-related reports. The PMO will use data from the RMIS to create reports for senior management and to retrieve data for day-to-day management of the program. The program produces a set of standard reports for periodic reporting and has the ability to create ad hoc reports in response to special queries. See Annex D for a detailed discussion of the RMIS.

Data are entered into the RMIS using the Risk Information Form (RIF). The RIF gives members of the project team, both Government and contractor, a standard format for reporting risk-related information. The RIF should be used when a potential risk event is identified and will be updated as information becomes available during execution of the assessment, handling, and monitoring functions.

4.2 RISK DOCUMENTATION

All program risk management information will be documented using the RIF as the standard RMIS data entry form. The following paragraphs provide guidance on documentation requirements for the various risk management functions.

4.2.1 Risk Assessment Documentation

Risk assessments form the basis for many program decisions. From time to time, the PM will need a detailed report of any assessment of a risk event. It is critical that all aspects of the risk management process be documented.

4.2.2 Risk Handling Documentation

Risk-handling documentation will be used to provide the PM with the information he needs to choose the preferred mitigation option.

4.2.3 Risk Monitoring Documentation

The PM needs a summary document that tracks the status of high and moderate risks. The Risk Management Coordinator will produce a risk tracking list, for example, using information that has been entered into the RMIS. This document will be produced on a monthly basis.

4.3 REPORTS

Reports are used to convey information to decision makers and team members on the status of the program and the effectiveness of the risk management program. Every effort will be made to generate reports using the data resident in the RMIS.

4.3.1 Standard Reports

The RMIS will have a set of standard reports. If IPTs or functional managers need additional reports, they should work with the Risk Management Coordinator to create them. Access to the reporting system will be controlled; however, any member of the Government or contractor team may obtain a password to gain access to the information. See Annex D for a description of the XYZ program reports.

4.3.2 Ad Hoc Reports

In addition to standard reports, the PMO will need to create ad hoc reports in response to special queries. The Risk Management Coordinator will be responsible for these reports.


ANNEX A TO XYZ RISK MANAGEMENT PLAN

— CRITICAL PROGRAM ATTRIBUTES —

Table B-1. Critical Program Attributes

(The Responsible IPT and Remarks columns are blank in this example.)

• Performance/Physical: Speed; Weight; Endurance; Crew Size; Survivability; Maneuverability; Size; Receiver Range; Transmitter Range; Data Link Operations; Recovery Time; Initial Setup; Identification Time; Accuracy of Location; Probability of Accurate ID; Reliability; Maintainability; Availability; etc.

• Cost: Operating and Support Costs; etc.

• Processes: Requirements Stable; Test Plan Approved.

• Exit Criteria: Engine Bench Test; Accuracy Verified by Test Data and Analysis; Toolproofing Completed; Logistics Support Reviewed by User.


ANNEX B TO XYZ RISK MANAGEMENT PLAN

— PROGRAM RISK REDUCTION SCHEDULE —
(EXAMPLE OF RISK HANDLING PLAN SCHEDULE)

Figure B-5. XYZ Program Risk Handling Plan Schedule (Example)

(Figure B-5 plots the risk rating (HIGH, MEDIUM, LOW) for CY 2000 through 2007 across the CTD, SDD, and P&D phases and key events (SRR, PDR, CDR, Milestone B, Milestone C, FCA, PCA, FRP), distinguishing accomplished from planned activities.) The risk reduction activities shown, in schedule order, are:

• Determine and flow down requirements; evaluate potential hardware and software solutions. Gather data on NDI capabilities and limitations; evaluate alternatives and pick lower-risk solutions.

• Simulations to evaluate subsystem interactions and timing issues; simulations to evaluate target sets and environmental effects.

• Preliminary design and trade studies to work issues such as temperature and shock environments. Develop baseline design. Reassess risk.

• Get hardware and software in place for pre-EMD simulations. Consolidate team structure and supplier.

• Hardware-in-the-Loop (HWIL) and performance prediction demonstration. Supporting analyses and design studies.

• Initiate detailed trade studies and identify alternatives. Validate and implement trade study decisions with the customer on IPD teams for lower-risk options. Reassess risk.

• Extensive simulations and HWIL testing. Developmental test program; supporting analyses, reviews, and decisions.

• Systems integration testing (supported by continued simulations) to verify the design. TAAF program with selected subsystems. Reassess risk.

• Qualification testing.

• Operational testing and simulations (LRIP items).

• Production (FRP).


ANNEX C TO XYZ RISK MANAGEMENT PLAN

— PROGRAM METRIC EXAMPLES —

Table B-2. Examples of Product-Related Metrics

• Engineering: Key Design Parameters (weight, size, endurance, range); Design Maturity (open problem reports, number of engineering change proposals, number of drawings released, failure activities); Computer Resource Utilization; etc.

• Requirements: Requirements Traceability; Requirements Stability; Threat Stability; Design Mission Profile.

• Production: Manufacturing Yields; Incoming Material Yields; Delinquent Requisitions; Unit Production Cost; Process Proofing; Waste; Personnel Stability; Special Tools and Test Equipment.

• Support: Support Infrastructure Footprint; Manpower Estimates.

Table B-3. Examples of Process Metrics

• Design Requirements: Development of a requirements traceability plan; development of a specification tree; specifications reviewed for definition of all use environments and of all functional requirements for each mission performed.

• System Trade Studies: User needs prioritized; alternative system configurations selected; test methods selected.

• Design Process: Design requirements stability; producibility analysis conducted; design analyzed for cost, parts reduction, manufacturability, and testability.

• Integrated Test Plan: All developmental tests at the system and subsystem levels identified; identification of who will do each test (Government, contractor, supplier).

• Failure Reporting: Contractor corporate-level management involved in the failure reporting and corrective action process; responsibility for analysis and corrective action assigned to a specific individual with a close-out date.

• Manufacturing Plan: Plan documents the methods by which the design is to be built; plan contains the sequence and schedule of events at the contractor and subcontractor levels that defines the use of materials, fabrication flow, test equipment, tools, facilities, and personnel; plan reflects manufacturing inclusion in the design process, including identification and assessment of design facilities.


Table B-4. Examples of Cost and Schedule Metrics

• Cost: Cost variance; Cost performance index; Estimate at completion; Management reserve.

• Schedule: Schedule variance; Schedule performance index; Design schedule performance; Manufacturing schedule performance; Test schedule performance.
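The cost and schedule metrics above follow from standard earned value relationships (general practice, not definitions specific to this plan): cost variance CV = BCWP - ACWP, schedule variance SV = BCWP - BCWS, cost performance index CPI = BCWP/ACWP, schedule performance index SPI = BCWP/BCWS, and one common estimate at completion, EAC = BAC/CPI. A minimal Python sketch with hypothetical names:

```python
# Illustrative earned value calculations behind the Table B-4 metrics,
# using the standard EVM definitions; the function name is hypothetical.

def evm_metrics(bcws: float, bcwp: float, acwp: float, bac: float) -> dict:
    """bcws = budgeted cost of work scheduled, bcwp = budgeted cost of
    work performed (earned value), acwp = actual cost of work performed,
    bac = budget at completion."""
    cpi = bcwp / acwp  # cost performance index
    spi = bcwp / bcws  # schedule performance index
    return {
        "cost_variance": bcwp - acwp,
        "schedule_variance": bcwp - bcws,
        "cost_performance_index": cpi,
        "schedule_performance_index": spi,
        "estimate_at_completion": bac / cpi,  # one common EAC formula
    }

# Example: $10M earned against $11M actuals and a $12M scheduled baseline
# on a $100M budget; CPI < 1 and SPI < 1 both flag risk.
print(evm_metrics(bcws=12.0, bcwp=10.0, acwp=11.0, bac=100.0))
```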


ANNEX D TO XYZ RISK MANAGEMENT PLAN

— MANAGEMENT INFORMATION SYSTEM AND DOCUMENTATION —

1.0 DESCRIPTION

In order to manage risk, we need a database management system that stores and allows retrieval of risk-related data. The Risk Management Information System provides data for creating reports and serves as the repository for all current and historical information related to risk. This information may include risk assessment documents, contract deliverables (if appropriate), and any other risk-related reports. The Risk Management Coordinator is responsible for the overall maintenance of the RMIS, and only he or his designee may enter data into the database.

The RMIS will have a set of standard reports. If IPTs or functional managers need additional reports, they should work with the Risk Management Coordinator to create them. Access to the reporting system will be controlled; however, any member of the Government or contractor team may obtain a password to gain access to the information.

In addition to standard reports, the PMO will need to create ad hoc reports in response to special queries. The Risk Management Coordinator will be responsible for these reports. Figure B-6 shows a concept for a management and reporting system.

2.0 RISK MANAGEMENT REPORTS—XYZ PROGRAM

The following are examples of basic reports that a PMO may use to manage its risk program. Each office should coordinate with the Risk Management Coordinator to tailor and amplify them, if necessary, to meet its specific needs.

Figure B-6. Conceptual Risk Management and Reporting System

(Figure B-6 shows IPTs, functional offices, contractors, and others submitting data for entry on RIFs to the Risk Coordinator, who maintains the database management system. Users request reports or information through controlled access, and the system returns standard reports, ad hoc reports, and historical data.)


2.1 RISK INFORMATION FORM

The PMO needs a document that serves the dual purpose of a source of data entry information and a report of basic information for the IPTs. The Risk Information Form (RIF) serves this purpose. It gives members of the project team, both Government and contractor, a format for reporting risk-related information. The RIF should be used when a potential risk event is identified and should be updated over time as information becomes available and the status changes. As a source of data entry, the RIF allows the database administrator to control entries. The format for a RIF is shown in Figure B-7.

2.2 RISK ASSESSMENT REPORT

Risk assessments form the basis for many program decisions, and the PM may need a detailed report of the assessments of a risk event that have been done. A Risk Assessment Report (RAR) is prepared by the team that assessed a risk event and amplifies the information in the RIF. It documents the identification, analysis, and handling processes and results. The RAR amplifies the summary contained in the RIF, is the basis for developing risk-handling plans, and serves as a historical record of program risk assessments. Since RARs may be large documents, they may be stored as files. Each RAR should include information that links it to the appropriate RIF.

2.3 RISK-HANDLING DOCUMENTATION

Risk-handling documentation may be used to provide the PM with the information he needs to choose the preferred mitigation option and is the basis for the handling plan summary contained in the RIF. This document describes the examination process for risk-handling options and gives the basis for the selection of the recommended choice. After the PM chooses an option, the rationale for that choice may be included. There should be a time-phased plan for each risk-handling task. Risk-handling plans are based on the results of the risk assessment. This document should include information that links it to the appropriate RIF.

2.4 RISK MONITORING DOCUMENTATION

The PM needs a summary document that tracks the status of high and moderate risks. The XYZ program will use a risk-tracking list that contains information that has been entered from the RIF. An example of the tracking report/list is shown in Figure B-8.

3.0 DATABASE MANAGEMENT SYSTEM (DBMS)

The XYZ Risk Management Information System (RMIS) provides the means to enter and access data, control access, and create reports. Key to the RMIS are the data elements that reside in the database. Listed below are the types of risk information that will be included in the database. "Element" is the title of the database field; "Description" is a summary of the field contents. The Risk Management Coordinator will create the standard reports, such as the RIF, Risk Monitoring, etc. The RMIS also has the ability to create ad hoc reports, which can be designed by users and the Risk Management Coordinator.


Table B-5. DBMS Elements

• Risk Identification (ID) Number – Identifies the risk and is a critical element of information, assuming that a relational database will be used by the PMO. (Construct the ID number to identify the organization responsible for oversight.)

• Risk Event – States the risk event and identifies it with a descriptive name. The statement and the risk identification number will always be associated in any report.

• Priority – Reflects the importance of this risk, as assigned by the PMO, compared to all other risks; e.g., a one (1) indicates the highest priority.

• Date Submitted – Gives the date that the RIF was submitted.

• Major System/Component – Identifies the major system/component based on the WBS.

• Subsystem/Functional Area – Identifies the pertinent subsystem or component based on the WBS.

• Category – Identifies the risk as technical/performance, cost, or schedule, or a combination of these.

• Statement of Risk – Gives a concise statement (one or two sentences) of the risk.

• Description of Risk – Briefly describes the risk and lists the key processes that are involved in the design, development, and production of the particular system or subsystem. If technical/performance, includes how it is manifested (e.g., design and engineering, manufacturing, etc.).

• Key Parameters – Identifies the key parameter, minimum acceptable value, and goal value, if appropriate. Identifies associated subsystem values required to meet the minimum acceptable value and describes the principal events planned to demonstrate that the minimum value has been met.

• Assessment – States whether an assessment has been done. Cites the Risk Assessment Report, if appropriate.

• Analyses – Briefly describes the analysis done to assess the risk, including the rationale and basis for the results.

• Probability of Occurrence – States the likelihood of the event occurring, based on the definitions in the program's Risk Management Plan.

• Consequence – States the consequence of the event, if it occurs, based on the definitions in the program's Risk Management Plan.

• Time Sensitivity – Estimates the relative urgency for implementing the risk-handling option.

• Other Affected Areas – If appropriate, identifies any other subsystem or process that this risk affects.

• Risk Handling Plans – Briefly describes the plans to mitigate the risk. Refers to any detailed plans that may exist, if appropriate.

• Risk Monitoring Activity – Identifies the metrics used for tracking progress in implementing risk-handling plans and achieving the planned results for risk reduction.

• Status – Briefly reports the status of the risk-handling activities and outcomes relevant to any risk-handling milestones.

• Status Date – Lists the date of the status report.

• Assignment – Lists the individual assigned responsibility for mitigation activities.

• Reported By – Records the name and phone number of the individual who reported the risk.
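For a program implementing the RMIS in software, the Table B-5 elements map naturally onto a single record type. A minimal sketch, assuming a Python implementation (the class and field names are illustrative, not an actual RMIS schema):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Illustrative record mirroring the Table B-5 DBMS elements; the field
# names are hypothetical, not taken from any fielded RMIS.
@dataclass
class RiskInformationForm:
    risk_id: str                     # Risk Identification (ID) Number
    risk_event: str                  # descriptive name of the risk event
    priority: int                    # 1 = highest priority
    date_submitted: date
    major_system: str                # major system/component, from the WBS
    subsystem: str                   # subsystem/functional area, from the WBS
    category: str                    # technical/performance, cost, schedule
    statement_of_risk: str           # one- or two-sentence statement
    description_of_risk: str
    key_parameters: str
    assessment_done: bool            # cites the RAR when one exists
    analysis: str
    probability_of_occurrence: str   # level a-e per the RMP definitions
    consequence: str                 # level a-e per the RMP definitions
    time_sensitivity: str
    other_affected_areas: str = ""
    risk_handling_plans: str = ""
    risk_monitoring_activity: str = ""
    status: str = ""
    status_date: Optional[date] = None
    assignment: str = ""             # individual responsible for mitigation
    reported_by: str = ""            # name and phone number of reporter
```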


Risk Information Form

Risk Identification Number:          Date:
Risk Event:                          Priority:
Major System/Component/Functional Area:
Category:
Statement of Risk:
Description of Risk:
Key Parameters:
Assessment:
Analysis:
Process Variance:
Probability of Occurrence:
Consequence:
Time Sensitivity:
Other Affected Areas:
Risk Handling Plans:
Risk Monitoring Activity:
Status:                              Status Date:
Assignment:                          Reported By:

Figure B-7. Risk Information Form


Risk Tracking Report (Example Report)

I. Risk Area Status: Design (PF: Hi, CF: Hi)

Significant Design Risks:

1. Title: System Weight (PF: Hi, CF: Hi)
Risk Event: Exceed system weight by 10 percent, decreasing the range and increasing fuel consumption.
Action: Examining subsystems to determine areas where weight may be reduced. Reviewing the requirement. Closely watching the effect on reliability and survivability.

2. Title: Design Analysis (PF: Hi, CF: Hi)
Risk Event: The Failure Modes, Effects and Criticality Analysis (FMECA) is planned too late to identify and correct any critical single-point failures prior to design freeze.
Action: Additional resources are being sought to expedite performance of the FMECA.

II. Risk Area Status: Supportability (PF: Hi, CF: Mod/Hi)

1. Title: Operational Support (PF: Hi, CF: Mod/Hi)
Risk Event: The power supply subcontractor is in financial trouble and may go out of business. No other known sources exist.
Action: Conducting a trade study to determine whether alternative designs have a broader power supply vendor base. The prime contractor is negotiating with the subcontractor to buy drawings for development of a second source.

Figure B-8. Risk Tracking Report Example


Table B-6. Watch List Example

Potential Risk Area: Accurately predicting the shock environment that shipboard equipment will experience.

• Risk Reduction Action: Use multiple finite element codes and simplified numerical models for early assessments. Due date: 31 Aug 01. Action code: SE03.

• Risk Reduction Action: Shock test a simple isolated deck and the proposed isolated structure to improve confidence in predictions. Due date: 31 Aug 02. Action code: SE03.

Potential Risk Area: Evaluating the acoustic impact of the ship systems that are not similar to previous designs.

• Risk Reduction Action: Concentrate on acoustic modeling and scale testing of technologies not demonstrated successfully in large-scale tests or full-scale trials. Due date: 31 Aug 01. Action code: SE031.

• Risk Reduction Action: Factor acoustic signature mitigation from isolated modular decks into system requirements; continue model tests to validate predictions for isolated decks. Due date: 31 Aug 02. Action code: SE032.

(The Date Completed and Explanation columns are blank in this example.)


SAMPLE RISK MANAGEMENT PLAN FOR THE ABC PROGRAM (ACAT III, IV)

1.0 INTRODUCTION

1.1 PURPOSE

This Risk Management Plan (RMP) presents the process for implementing the comprehensive and proactive management of risk as part of the overall management of the ABC Program. Risk management is a program management tool for handling events that might adversely impact the program, thereby increasing the probability/likelihood of success. This RMP describes a management tool that will:

• Serve as a basis for identifying alternatives to achieve cost, schedule, and performance goals;

• Assist in making decisions on budget and funding priorities;

• Provide risk information for Milestone decisions; and

• Allow monitoring of the health of the program as it proceeds.

The RMP describes methods for assessing (identifying and analyzing), prioritizing, and monitoring risk drivers; developing risk-handling approaches; and applying adequate resources to handle risk. It assigns specific responsibilities for these functions and prescribes the documenting, monitoring, and reporting processes to be followed.

If necessary, this RMP will be updated on the following occasions: (1) whenever the acquisition strategy changes, or there is a major change in program emphasis; (2) in preparation for major decision points; (3) in preparation for, and immediately following, technical audits and reviews; (4) concurrent with the review and update of other program plans; and (5) in preparation for a POM submission.

2.0 PROGRAM SUMMARY

2.1 DESCRIPTION

The ABC Program is an ACAT III level program that was initiated in response to the NEWCOM Operational Requirements Document (ORD) XXX, dated DD-MM-YYYY. The program will provide an ABC communications system that will be the common system (transmitter/receiver/controller) for all DoD components for UHF satellite communications. All DoD systems requiring UHF satellite communications procured subsequent to Initial Operational Capability (IOC) of the ABC system will incorporate it to meet their needs. The Bx Unmanned Air Vehicle is the lead system for integration. The program has completed the Systems Integration part of the System Development and Demonstration (SDD) Phase and is preparing for an Interim Progress Review.

The system will be acquired using off-the-shelf UHF satellite communications systems. During the System Integration (SI) part of the System Development and Demonstration (SDD) Phase of the program, two contractors delivered prototypes of their systems. One is a ruggedized commercial product and the other is built to military specifications. The Government tested both systems against functional and performance requirements and some environmental extremes. Although each failed portions of the tests, both were evaluated as mature enough to represent an acceptable risk for proceeding to the System Demonstration part of the SDD Phase of the program.


2.2 ACQUISITION STRATEGY

The Government will invite the contractors that participated in the System Integration (SI) part of the System Development and Demonstration (SDD) Phase of the program to submit proposals to refine their approaches into a stable, interoperable, producible, supportable, and cost-effective design; validate the manufacturing or production process; and demonstrate system capabilities through testing. The Government will select one of the two proposals for the System Demonstration part of the SDD Phase of the program. The contractor, upon demonstration of the exit criteria (see Annex A), will proceed with a Low Rate Initial Production (LRIP) of the system.

The IOC (20 systems) for the ABC system is required by FY-02 to support the fielding of the Bx UAV. Production capacity for the ABC system at IOC is expected to be 20 units per month to meet the demand of new systems.

2.3 PROGRAM MANAGEMENT APPROACH

The ABC Program Manager (PM) reports to the Program Director, Satellite Communications, who has responsibility for all satellite communications systems. The ABC Program Office (PO) is composed of the PM and one assistant, with matrix support from the systems command organizations and program management support from an external contractor. An integrated management approach will be used for this program. The Government and the selected contractor will have representation on Integrated Product Teams (IPTs) that will focus on cost, design, test, manufacturing, and support of the system. The PM chairs the Government IPT that develops strategies for acquisition and contracts.

3.0 RISK-RELATED DEFINITIONS

The Defense Acquisition Deskbook, Section 2521, contains the definitions for risk, risk management, risk events, and the terms associated with risk management that will be used by the ABC PO. Variations and clarifications of definitions that appear in the Defense Acquisition Deskbook, as they are used in the ABC program, are described below.

3.1 TECHNICAL RISK

This is the risk associated with the evolution of the design, production, and supportability of the ABC system affecting the level of performance necessary to meet the operational requirements. The contractor's and subcontractors' design, test, and production processes (process risk) influence the technical risk and the nature of the product as depicted in the various levels of the Work Breakdown Structure (product risk). Process risks are assessed in terms of process variance from known best practices and the potential consequences/impacts of the variance. Product risks are assessed in terms of technical performance measures and observed variances from established profiles.

3.2 COST RISK

This is the risk associated with the ability of the program to achieve its life-cycle cost objectives. Two risk areas bearing on cost are (1) the risk that the cost estimates and objectives are not accurate and reasonable, and (2) the risk that program execution will not meet the cost objectives as a result of a failure to mitigate technical risks.

3.3 RISK RATINGS

This is the value that is given to a risk event (or the program overall) based on the analysis of the probability/likelihood and consequences/impacts of the event. For the ABC program, risk ratings of low, moderate, or high will be assigned based on the criteria in Section 6.2.

4.0 RISK MANAGEMENT STATUS AND STRATEGY

4.1 RISK MANAGEMENT STATUS

As a result of the Program Definition and Risk Reduction Phase, the overall risk of the ABC Program for Milestone C is assessed as moderate, but acceptable. Moderate-risk functional areas are environmental requirements; form, fit, and function; integration; manufacturing; and cost.

4.2 RISK MANAGEMENT STRATEGY

The ABC Program risk management strategy is to handle program risks, both technical and non-technical, before they become problems causing serious cost, schedule, or performance impacts. This strategy is an integral part of the Acquisition Strategy and the program management approach, and will be executed primarily through the Government-Contractor PIPT organization. The PIPTs will continuously and proactively assess critical areas (especially those listed in the previous paragraph) to identify and analyze specific risks, and will develop options to mitigate all risks designated as moderate and high. The PIPTs will also identify the resources required to implement the developed risk-handling options. The PM, through the Program Level Integrated Product Team (PLIPT), will review and approve the PIPT options. Once approved, the options will be incorporated into the program integrated master plan (IMP) and integrated master schedule (IMS). The PIPTs will monitor the effectiveness of the selected handling options, and adjust the risk-handling approach as necessary.

IPTs will keep risk information current by using the risk management information system described in Paragraph 6.5. Risk status will be reported at all program reviews. As new information becomes available, the PO and contractor will conduct additional reviews to ascertain whether new risks exist. The goal is to be continuously looking to the future for areas that may severely impact the program.

5.0 RISK MANAGEMENT ORGANIZATION

5.1 PROGRAM OFFICE

The ABC Program risk management organization is shown in Figure B-9. This structure is integrated into the contractor's and Government's existing organizations. Program Integrated Product Teams (PIPTs) will be formed for the functional areas that are critical to the success of the program. All functional areas not covered by a PIPT will be assessed and reviewed by the PLIPT, co-chaired by the ABC PM and contractor PM, to ensure adequate vigilance against emerging risk areas. Independent risk assessors may conduct reviews, when directed by the PM, to ensure the interface requirements of user systems are being met by the ABC system design.

The PM is the overall coordinator of the Risk Management Program and is responsible for:

• Maintaining this Risk Management Plan;

• Maintaining the Risk Management Database;

• Approving risk-handling options;

• Incorporating risk-handling actions into the program master plan and schedule;

• Briefing the decision makers on the status of ABC Program risk efforts; and

• Preparing risk briefings, reports, and documents required for Program Reviews and the acquisition Milestone decision processes.

PLIPT

The PLIPT is responsible for complying with the DoD risk management policy, for structuring an efficient and useful ABC risk management approach, and for supporting the Risk Management Coordinator/PM in carrying out his responsibilities. The PM and contractor PM co-chair the PLIPT. The PLIPT membership may be adjusted, but is initially established as the chairs of the PIPTs, a representative from the joint requirements and users' office, and a representative from the contractor. Its main effort is the integration of risk assessments performed by the various program IPTs.

PIPTs

The program IPTs, or PIPTs, are the backbone of the program risk management efforts. They will execute the following responsibilities relative to their functional areas:

• Conduct risk assessments and develop risk-handling options, to include mitigation plans and resources required.

• Monitor effectiveness of risk-handling actions.

• Review and recommend to the PM changes in the overall risk management approach based on lessons learned.

• Update the risk assessments quarterly, or as directed.

• Ensure information in the Risk Management Database is current.

• Prepare risk status reports in their areas for all Program and Design Reviews.

• Ensure Design/Build Team responsibilities incorporate appropriate risk management tasks.

• Coordinate PIPT risk management activities with the PLIPT.

Figure B-9. ABC Risk Management Organization (organization chart: the PM, supported by an Independent Risk Assessor, oversees the PLIPT; the Cost, Design, Test, and Manufacturing PIPTs report to the PLIPT)


6.0 RISK MANAGEMENT STRUCTURE AND PROCEDURES

The ABC program will use a structured risk management approach consisting of four elements: planning, assessment, handling, and monitoring. These elements and the general procedures to be used for each of them are described in subsequent paragraphs of this section. A number of guidance documents are useful in addressing these risk management elements, and should be used as appropriate by each PIPT. Some of these documents are listed below. (This list is not meant to be complete.)

• Defense Acquisition Deskbook, Section 2.5.2, Risk Management,

• DSMC, Risk Management Guide, June 2002,

• AFMC Pamphlet 63-101, Risk Management, 9 July 1997, and

• The Navy's Best Practices Manual, NAVSO P-6071, and Top Eleven Ways to Manage Technical Risk, NAVSO P-3686, which provide insight into best practices within the Naval Service.

6.1 RISK PLANNING

Risk planning is essential for the execution of a successful risk management program. It will be done continuously by all PIPTs as an integral part of normal ABC program management. This RMP serves as the basis for all detailed risk planning, which must be continuous. The following paragraphs provide direction for the PIPTs on the conduct of risk planning for this program.

• PIPTs will develop an organized and thorough approach to assess, handle, and monitor risks, assigning responsibilities for specific risk management actions and establishing internal risk reporting and documentation requirements. The PLIPT will monitor the planning activities of the PIPTs to ensure that they are consistent with this RMP and that appropriate revisions to this plan are made when required to reflect significant changes resulting from the PIPT planning efforts.

• Each PIPT will establish metrics that will measure the effectiveness of its planned risk-handling options. See Annex C for an example of metrics that may be used.

• Each PIPT will identify the resources required to implement the risk management actions. These resources include time, material, personnel, and cost. Training is a major consideration. All PIPT members should receive instruction on the fundamentals of risk management and special training in their areas of responsibility, if necessary. General risk management training will be arranged by the PO; PIPT leaders will identify any specialized training needs.

• This RMP establishes the basic documentation and reporting requirements for the program. PIPTs should identify any additional requirements, consistent with this RMP, that might be needed to effectively manage risk at their level.

6.2 RISK ASSESSMENT

The risk assessment process includes the identification of critical risk events/processes, the analyses of these events/processes to determine the probability/likelihood of occurrence/process variance and consequences/impacts, and the priority of the risks. The output of this process provides the foundation for all the program risk-handling actions. Therefore, it is essential that all members of the ABC program team be as thorough as possible when identifying and analyzing risks. In addition to the normal areas of design, test, manufacturing, etc., PIPTs must identify and analyze the risks associated with such areas as manpower, environmental impact, system safety and health analysis, and security considerations. The Defense Acquisition Deskbook, Section 2524, provides information on various risk assessment techniques.

Risk assessments should be done by the PIPTs and the PLIPT with active participation of both Government and contractor personnel. When necessary or appropriate, the PIPTs and the PLIPT can direct a contractor-only assessment, or conduct a Government assessment. PIPTs and the PLIPT should continually assess the risks in their areas, reviewing critical risk areas, risk ratings and prioritization, and the effectiveness of risk-handling actions whenever necessary to assess progress. The assessment process will be iterative, with each assessment building on the results of previous assessments. PIPTs and the PLIPT will use the current assessment baseline as the starting point for their initial assessment during this phase. This baseline is a combination of the risk assessment delivered by the contractors as part of the Concept and Technology Development (CTD) Phase, the PMO process risk assessment done before Milestone B, and the post-award Integrated Baseline Review (IBR). Risk assessments will be updated and the results presented at all functional and program reviews, with a final update for this phase prepared not later than six months prior to the next scheduled Milestone decision.

6.2.1 Risk Identification

Each PIPT will review all aspects of their functional areas to determine the critical events that would prevent the program from achieving its objectives. They should apply the knowledge, best judgment, and experience of the PIPT members, lessons learned from similar programs, and the opinion of subject-matter experts (SMEs) to identify these risk events. PIPTs should follow these general procedures to identify risk events:

• Understand the requirements and the program performance goals, which are defined as thresholds and objectives (see DoD 5000.2-R). Understand the operational (functional and environmental) conditions under which the values must be achieved, as described in the Design Reference Mission Profile. The ORD and Acquisition Program Baseline (APB) contain Key Performance Parameters (KPPs).

• Determine technical/performance risks related to engineering and manufacturing processes. Identify those processes that are planned or needed to design, develop, produce, and support the system. Compare these processes with industry best practices and identify any variances or new, untried processes. These variances or untried practices are sources of risk. The contractor should review the processes to be used by its subcontractors to ensure they are consistent with best industry practices. Table 4-2 of the DSMC Risk Management Guide shows some of the specific sources of process risk, and should be used by the PIPTs. NAVSO P-6071, Best Practices Manual, which describes risks associated with design, test, production, facilities, logistics, management, and funding, should also be used by the PIPTs to identify risks.

• Determine technical/performance risks associated with the product (the ABC communications system) in the following critical risk areas: design and engineering, technology, logistics, concurrency, and manufacturing. The design and manufacturing PIPTs will identify the contract WBS elements down to level 3, and evaluate each of these elements to identify risk events. They will use a variety of methods to accomplish this: review of similar programs, existing program plans, expert opinion, etc.

• Identify schedule risk. Each PIPT will determine the schedule risk associated with its functional area. When identifying this schedule risk, they will consider the risk that the schedule estimate is accurate, and the risk that the established schedule can be met. The PLIPT will monitor the development of the schedule risk in each PIPT, and consolidate these risks to identify overall program schedule risk.

• Identify cost risk. Each PIPT will determine the cost risk associated with its functional area. They will identify risks associated with the accuracy of the cost estimates developed for their areas, and the risk that the established cost objectives will be met. The Cost PIPT will monitor the development of the other PIPT cost risk efforts, and consolidate their risks into a set of overall program cost risks.

• All identified risks will be documented in the RMIS, with a statement of the risk and a description of the conditions or situations causing concern and the context of the risk. See Paragraph 6.4 for guidance on documenting identified risks.

In identifying risks, PIPTs should be particularly alert for the following indicators. They are common sources of risk for all programs, and will be applicable to the ABC program.

• Requirements that are not clearly stated or stable,

• Failure to use Best Practices,

• Use of new processes, materials, or applications of existing technologies,

• Use of processes lacking rigor in terms of maturity, documentation of established procedures, and validation,

• Insufficient resources (the people, funds, schedule, and tools necessary for successful development, test, production, and support of the ABC program),

• Lack of a formalized failure reporting, analysis, and corrective action system (FRACAS),

• Use of suppliers or subcontractors who are inexperienced in the processes for designing and producing required products,

• Failure of the prime contractor to effectively monitor processes and establish quality requirements for suppliers and subcontractors.

6.2.2 Risk Analysis

Risk Analysis is an evaluation of the identified risk events to determine the probability/likelihood of the events occurring and their consequences/impacts, to assign a risk rating based on the program criteria, and to prioritize the risks. Each PIPT and the PLIPT are responsible for analyzing those risk events they identify. They may use subject-matter experts for assistance, such as Field Activities, Service Laboratories, contractors, or outside consultants. The use of external assets will be coordinated through the PMO. The results of the analysis of all identified risks must be documented in the RMIS.

There are a number of techniques available to support risk analysis, to include studies, test results, modeling and simulation, and the opinions of qualified experts (to include justification of their judgment). The Defense Acquisition Deskbook, Section 2524.2, describes a number of analysis techniques that may be useful. Regardless of the technique used, PIPTs and the PLIPT will identify all assumptions made in analyzing risk and, where appropriate, conduct a sensitivity analysis of assumptions.

For each risk event, the following risk analysis guidelines will be used:

• Probability/Likelihood

For each risk identified, determine the probability/likelihood that the event will occur. Five levels of probability/likelihood will be used for the ABC program. Table B-7 shows these levels and their definitions. PIPTs and the PLIPT will assign one of these values to each identified risk event based on their analysis of the event. For example, if it is known that there will be a variance between the soldering process to be used for component X and the industry standard, this process variance risk event will be assigned a probability/likelihood value of "e" (near certainty). Similarly, if the Manufacturing PIPT determines that the schedule estimate for the fabrication of component Y is overly optimistic, and will probably not be attained, it would assign a probability/likelihood level of "c" or "d" depending on its analysis of the schedule estimate.

• Consequence/Impact

For each risk identified, the following question must be answered: Given that the event occurs, what is the magnitude of the consequence/impact? For the ABC program, consequence/impact will be determined in each of four areas: technical performance, schedule, cost, and impact on other teams.

Technical Performance: This category relates to the risks associated with the processes to be used in the development, testing, and manufacturing of the ABC system, and the nature of the ABC communications system. It includes the form, fit, function, manufacturability, supportability, etc. Essentially, technical risk includes all requirements that are not part of cost and schedule. The wording of each consequence/impact level is oriented toward design and production processes, life-cycle support, and retirement of the system. For example, the word "margin" could apply to weight margin during design, safety margin during testing, or machine performance margin during production.

Schedule: The schedule consequence descriptions are self-explanatory. The need dates, key milestones, critical path, and key team milestones are meant to apply to all program areas and PIPTs.

Table B-7. Likelihood Levels

Level Likelihood of Occurrence

a Remote

b Unlikely

c Likely

d Highly likely

e Near certainty

Cost: Since costs vary from component to component and process to process, the percentage criteria shown in Table B-8 may not strictly apply at the lower levels of the WBS. PIPT and PLIPT leaders may set the percentage criteria that best reflect their situation. However, when costs are rolled up at higher levels (e.g., Program), the definitions shown will be used.

Impact on Other Teams: Both the consequences/impacts of a risk and the mitigation actions associated with handling the risk may impact another team. This may involve additional coordination or management attention (resources), and may therefore increase the level of risk. This is especially true of mitigation actions that involve the use of common manufacturing processes and/or equipment.

PIPTs and the PLIPT will evaluate each risk event in terms of these areas, and assign a level of consequence/impact (1-5). Table B-8 shows these five levels of consequence/impact and defines the levels for each area. This table will be used when assigning the consequence/impact magnitude.

6.2.3 Risk Rating

Each identified risk will be assigned a risk rating based on the joint consideration of event probability/likelihood and consequence/impact. This rating is a reflection of the severity of the risk and provides a starting point for the development of options to handle the risk. It is important to consider both the probability/likelihood and the consequences/impacts in establishing the rating, for there may be risk events that have a low probability/likelihood but whose consequences/impacts are so severe that the occurrence of the event would be disastrous to the program.

Figure B-10 describes the risk rating process that will be used in this program. PIPTs and the PLIPT will analyze each risk event to determine the probability/likelihood and consequence/impact values using the definitions in Tables B-7 and B-8; they will determine the consequence/impact for each of the four areas (technical performance, schedule, cost, and team impact). These values will be used to determine the risk rating using the Assessment Guide in Figure B-10. The Assessment Guide defines the risk rating associated with each combination of probability/likelihood and consequence/impact values, and will be used throughout the program. For example, consequence/impact and probability/likelihood level 1b corresponds to a risk rating of LOW, level 4b corresponds to MODERATE risk, and level 5c corresponds to HIGH risk.
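As a minimal sketch (not part of the plan itself), the Assessment Guide lookup can be expressed as a table keyed by likelihood and consequence; the function name and encodings below are illustrative assumptions, with consequence as 1-5 and likelihood as a-e:

```python
# Sketch of the Figure B-10 Assessment Guide lookup (names and encodings illustrative).
# Likelihood levels run a-e (Table B-7); consequence levels run 1-5.

ASSESSMENT_GUIDE = {
    # likelihood: ratings for consequence levels 1..5
    "a": ["LOW", "LOW", "LOW", "LOW", "MODERATE"],
    "b": ["LOW", "LOW", "LOW", "MODERATE", "MODERATE"],
    "c": ["LOW", "MODERATE", "MODERATE", "MODERATE", "HIGH"],
    "d": ["LOW", "MODERATE", "MODERATE", "HIGH", "HIGH"],
    "e": ["LOW", "MODERATE", "HIGH", "HIGH", "HIGH"],
}

def risk_rating(consequence: int, likelihood: str) -> str:
    """Return LOW, MODERATE, or HIGH for consequence 1-5 and likelihood a-e."""
    return ASSESSMENT_GUIDE[likelihood][consequence - 1]

assert risk_rating(1, "b") == "LOW"        # level 1b
assert risk_rating(4, "b") == "MODERATE"   # level 4b
assert risk_rating(5, "c") == "HIGH"       # level 5c
```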

Those risk events that are assessed as MODERATE or HIGH will be submitted to the ABC PM on a Risk Information Form (RIF). See Annex B for the RIF format. PIPTs and the PLIPT must actively manage these MODERATE and HIGH risks. They must also continuously assess the other identified risks in their areas to see if their ratings have become MODERATE or HIGH.

6.2.4 Risk Prioritization

PIPTs and the PLIPT will prioritize the MODERATE and HIGH risks in their areas. This prioritization will provide the basis for the development of risk-handling plans and the allocation of risk management resources. Prioritization will be accomplished using expert opinion within the PIPTs, and will be based on the following criteria:

• Risk Rating – HIGH risks before MODERATE risks.

• Consequence/Impact – Within each rating, the highest value of consequence/impact, e.g., "e."


Table B-8. Risk Consequence

Level a – Technical Performance: Minimal or no impact. Schedule: Minimal or no impact. Cost: Minimal or no impact. Impact on Other Teams: None.

Level b – Technical Performance: Acceptable with some reduction in margin. Schedule: Additional resources required; able to meet need dates. Cost: <5%. Impact on Other Teams: Some impact.

Level c – Technical Performance: Acceptable with significant reduction in margin. Schedule: Minor slip in key milestone; not able to meet need dates. Cost: 5-7%. Impact on Other Teams: Moderate impact.

Level d – Technical Performance: Acceptable; no remaining margin. Schedule: Major slip in key milestone or critical path impacted. Cost: 7-10%. Impact on Other Teams: Major impact.

Level e – Technical Performance: Unacceptable. Schedule: Can't achieve key team or major program milestone. Cost: >10%. Impact on Other Teams: Unacceptable.

Figure B-10. Risk Assessment Process

The figure combines the likelihood scale of Table B-7 (a – Remote; b – Unlikely; c – Likely; d – Highly likely; e – Near certainty) and the consequence/impact scale of Table B-8, and maps each combination to a rating. Process variance refers to deviation from best practices; likelihood/probability refers to risk events. The ratings are defined as:

HIGH (R) – Unacceptable. Major disruption likely. Different approach required. Priority management attention required.

MODERATE (Y) – Some disruption. Different approach may be required. Additional management attention may be needed.

LOW (G) – Minimum impact. Minimum oversight needed to ensure risk remains low.

ASSESSMENT GUIDE (rows are likelihood levels e down to a; columns are consequence levels a through e):

e | L  M  H  H  H
d | L  M  M  H  H
c | L  M  M  M  H
b | L  L  L  M  M
a | L  L  L  L  M
    a  b  c  d  e

• Urgency – How much time is available before risk-handling actions must be initiated.

• Probability/Likelihood – Within each rating, the highest value, e.g., "e."

The PLIPT will review the prioritized lists of PIPT-developed risks and integrate them into a single list of prioritized program risks, using the same criteria.
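The ordering above can be sketched as a sort key over the RMIS records; the field names, encodings, and sample risks are illustrative assumptions, not plan requirements:

```python
# Sketch of the Section 6.2.4 prioritization ordering (names and data illustrative).
RATING_ORDER = {"HIGH": 0, "MODERATE": 1}   # only MODERATE and HIGH risks are prioritized
LEVELS = "abcde"                            # "e" is the highest level

def priority_key(risk: dict):
    return (
        RATING_ORDER[risk["rating"]],            # HIGH before MODERATE
        -LEVELS.index(risk["consequence"]),      # highest consequence first
        risk["days_until_action_needed"],        # most urgent first
        -LEVELS.index(risk["likelihood"]),       # highest likelihood first
    )

risks = [
    {"id": "DES-01", "rating": "MODERATE", "consequence": "d",
     "likelihood": "b", "days_until_action_needed": 30},
    {"id": "MFG-02", "rating": "HIGH", "consequence": "e",
     "likelihood": "c", "days_until_action_needed": 90},
]
prioritized = sorted(risks, key=priority_key)    # MFG-02 ranks ahead of DES-01
```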

6.3 RISK HANDLING

After the program's risks have been identified, analyzed, and prioritized, PIPTs and the PLIPT must develop an approach for handling each MODERATE and HIGH risk. For all such risks, the various handling techniques should be evaluated in terms of feasibility, expected effectiveness, cost and schedule implications, and the effect on the system's technical performance, and the most suitable technique selected. The Defense Acquisition Deskbook, Section 2524.3, contains information on the risk-handling techniques and various actions that can be used to implement them. Reducing requirements as a risk avoidance technique will be used only as a last resort, and then only with the participation and approval of the user's representative at the PLIPT level.

The results of the evaluation and selection will be included and documented in the RMIS using the RIF. This documentation will include the following elements:

• What must be done,

• List of all assumptions,

• Level of effort and materials required,

• Resources needed that are outside the scope of the contract or official tasking,

• Estimated cost to implement the plan,

• Proposed schedule showing the proposed start date, the time phasing of significant risk reduction activities, the completion date, and their relationship to significant program activities/milestones,

• Recommended metrics for tracking risk-handling activity,

• Other PIPTs, risk areas, or other handling plans which may be impacted, and

• Person responsible for implementing and tracking the selected option.

Risk-handling actions will be integrated into program planning and scheduling, and incorporated into the IMP and IMS. PIPTs and the PLIPT will develop these risk-handling actions and events in the context of Work Breakdown Structure (WBS) elements, establishing a linkage between them and specific work packages that makes it easier to determine the impact of actions on cost, schedule, and performance. The detailed information on risk-handling actions and events will be included in the RIF for each identified risk, and thus be resident in the RMIS.

6.4 RISK MONITORING

Risk monitoring is the systematic tracking and evaluation of the progress and effectiveness of risk-handling actions by the comparison of predicted results of planned actions with the results actually achieved, to determine status and the need for any change in risk-handling actions. The PIPTs and the PLIPT will monitor all identified risks in their areas, with particular attention to those rated as HIGH or MODERATE. There are a number of techniques and tools available for monitoring the effectiveness of risk-handling actions. (See the Defense Acquisition Deskbook, Section 2524.4, for information on specific techniques.) PIPTs and the PLIPT must select those that best suit their needs. No single technique or tool is capable of providing a complete answer; a combination must be used. At a minimum, each PIPT and the PLIPT will use the Risk Tracking Report (RTR) and Watch List for day-to-day management and monitoring of risks. See Annex B for examples of an RTR and Watch List. The status of risk-handling actions for all MODERATE and HIGH risks will be an agenda item at each program or functional area review.

For each identified risk, the PIPTs and PLIPT will establish a management indicator system (metrics) that provides accurate, timely, and relevant risk monitoring information in a clear, easily understood manner. PIPTs and the PLIPT should select metrics that portray the true state of the risk events and handling actions. See Annex C for an example of metrics that may be used.

MODERATE or HIGH risks will also be monitored by the ABC PM through the PLIPT, using information provided by the appropriate PIPT, until the risk is considered LOW and recommended for "close out." PIPTs and the PLIPT will continue to monitor LOW risk events in their areas to ensure that appropriate risk-handling action can be initiated if there are indications that the rating may change.

The status of the risks and the effectiveness of the risk-handling actions will be agenda items for all functional area and program reviews, and will be reported to the PM on the following occasions:

• Quarterly,

• When the IPT determines that the status of the risk area has changed significantly (as a minimum, when the risk changes from high to moderate to low, or vice versa),

• When requested by the Program Manager.

6.5 RISK MANAGEMENT INFORMATION SYSTEM (RMIS), DOCUMENTATION, AND REPORTS

The ABC Program uses a modified version of Risk Matrix as its RMIS. The Risk Matrix database will contain all of the information necessary to satisfy the program documentation and reporting requirements. This information will include risk assessment documents, risk-handling plans, contract deliverables, if appropriate, and any other risk-related reports. The program office will use data from the RMIS to create reports for senior management and for day-to-day management of the program. The program produces a set of standard reports for periodic reporting and has the ability to create ad hoc reports in response to special queries.

Each PIPT and the PLIPT are responsible for entering and maintaining accurate risk management data in the RMIS. A standard-format Risk Information Form (RIF) will be used for data entry. A RIF will be completed and submitted when a potential risk event is identified, and will be updated as information becomes available and as the assessment, handling, and monitoring functions are executed. See Annex B for a sample of the RIF. Annex B also contains examples of reports to be used in the ABC Program.

ANNEX A TO ABC RISK MANAGEMENT PLAN

— CRITICAL PROGRAM ATTRIBUTES —

Table B-9. Critical Program Attributes (columns: Category; Description; Responsible IPT; Remarks)

Performance/Physical: Transmitter Power; Weight; MTBF; Receiver Gain; EMP Survivability; Heat Dissipation; Size; Receiver Range; Transmitter Range; Data Link Operations; Interface Commonality; Initial Setup; Identification Time; Location Accuracy; Bandwidth; Reliability; Maintainability; Availability; Etc.

Cost: Operating and Support Costs; Etc.

Processes: Requirements Stable; Test Plan Approved; Exit Criteria Bench Test; Accuracy Verified by Test Data and Analysis; Toolproofing Completed; Logistics Support Reviewed by User.

ANNEX B TO ABC RISK MANAGEMENT PLAN

— MANAGEMENT INFORMATION SYSTEM AND DOCUMENTATION —

1.0 DESCRIPTION

In order to manage risk, we need a database management system that stores and allows retrieval of risk-related data. The Risk Management Information System provides data for creating reports and serves as the repository for all current and historical information related to risk. The PM is responsible for the overall maintenance of the RMIS, and only the PM or his/her designee may enter data into the database.

The RMIS has a set of standard reports. If PIPTs or functional managers need additional reports, they should work with the PM to create them. Access to the reporting system will be controlled; however, any member of the Government or contractor team may obtain a password to gain access to the information.

In addition to standard reports, the PO will need to create ad hoc reports in response to special queries, etc. The PM will be responsible for these reports.

2.0 RISK MANAGEMENT FORMS AND REPORTS

The following are examples of basic reports and forms that are used in the ABC Program.

2.1 RISK INFORMATION FORM

The PO needs a document that serves the dual purpose of a source of data entry information and a report of basic information for the PIPTs, etc. The Risk Information Form (RIF) serves this purpose. It gives members of the project team, both Government and contractors, a format for reporting risk-related information. The RIF will be used when a potential risk event is identified, and updated over time as information becomes available and the status changes. As a source of data entry, the RIF allows the database administrator to control entries. The format and information required in a RIF are detailed in Table B-10.

2.2 RISK MONITORING DOCUMENTATION

The PM needs a summary document that tracks the status of HIGH and MODERATE risks. The ABC program will use a Risk Tracking Report (RTR) that contains information that has been entered from the RIF. An example of the RTR is shown in Figure B-11. The PM and PIPTs must also be aware of upcoming deadlines and events to ensure they are not caught unprepared. A Watch List will be used to track upcoming events and activities. A sample Watch List is contained in Table B-11.

2.3 PIPT RISK SUMMARY REPORT

In addition to the RTRs for individual HIGH and MODERATE risks, PIPTs will prepare a periodic summary of the ratings for all the risks in their areas. Figure B-12 provides an example of this report. The format for this summary is based on the Risk Assessment Guide shown in Figure B-10. The entries in each cell of the matrix represent the number of identified risks with the corresponding probability/likelihood and consequence/impact values.

Table B-10. DBMS Elements

Risk Identification (ID) Number – Identifies the risk and is a critical element of information, assuming that a relational database will be used by the PO. (Construct the ID number to identify the organization responsible for oversight.)

Risk Event – States the risk event and identifies it with a descriptive name. The statement and risk identification number will always be associated in any report.

Priority – Reflects the priority assigned to this risk by the PO compared to all other risks, e.g., a one (1) indicates the highest priority.

Date Submitted – Gives the date that the RIF was submitted.

Major System/Component or Process – Identifies the major system/component based on the WBS, or the process in which the risk event occurs.

Subsystem/Functional Area – Identifies the pertinent subsystem or component based on the WBS.

Category – Identifies the risk as technical/performance, cost, or schedule, or a combination of these.

Statement of Risk – Gives a concise statement (one or two sentences) of the risk.

Description of Risk – Briefly describes the risk; lists the key processes that are involved in the design, development, and production of the particular system or subsystem. If technical/performance, includes how it is manifested (e.g., design and engineering, manufacturing, etc.).

Key Parameters – Identifies the key parameter, minimum acceptable value, and goal value, if appropriate. Identifies associated subsystem values required to meet the minimum acceptable value and describes the principal events planned to demonstrate that the minimum value has been met.

Assessment – States if an assessment has been done. Cites the Risk Assessment Report, if appropriate.

Analysis – Briefly describes the analysis done to assess the risk; includes rationale and basis for results.

Process Variance – States the variance of critical technical processes from known standards or best practices, based on definitions in the program's Risk Management Plan.

Probability of Occurrence – States the likelihood of the event occurring, based on definitions in the program's Risk Management Plan.

Consequence – States the consequence of the event, if it occurs, based on definitions in the program's Risk Management Plan.

Risk Rating – Identifies the rating assigned to the risk based on the criteria established by the program.

Time Sensitivity – Estimates the relative urgency for implementing the risk-handling option.

Other Affected Areas – If appropriate, identifies any other subsystem or process that this risk affects.

Risk Handling Plans – Briefly describes plans to mitigate the risk. Refers to any detailed plans that may exist, if appropriate.

Risk Monitoring Activity – Measurement and metrics for tracking progress in implementing risk-handling plans and achieving planned results for risk reduction.

Status – Briefly reports the status of the risk-handling activities and outcomes relevant to any risk-handling milestones.

Status Due Date – Lists the date of the status report.

Assignment – Lists the individual assigned responsibility for mitigation activities.

Reported By – Records the name and phone number of the individual who reported the risk.
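As a rough sketch of how these elements might be carried as a single RMIS record, the structure below maps a subset of the Table B-10 elements to fields; all field names and types are illustrative assumptions:

```python
# Sketch of a RIF record mirroring selected Table B-10 elements (illustrative).
from dataclasses import dataclass
from typing import Optional

@dataclass
class RiskInformationForm:
    risk_id: str                  # encodes the organization responsible for oversight
    risk_event: str               # descriptive name and statement of the event
    priority: int                 # 1 = highest priority
    date_submitted: str
    major_system_or_process: str  # WBS element or process where the event occurs
    category: str                 # technical/performance, cost, schedule, or combination
    statement_of_risk: str
    description_of_risk: str
    probability_of_occurrence: Optional[str] = None  # level a-e per Table B-7
    consequence: Optional[str] = None                # consequence level per Table B-8
    risk_rating: Optional[str] = None                # LOW / MODERATE / HIGH
    risk_handling_plans: str = ""
    status: str = ""
    assignment: str = ""          # individual responsible for mitigation activities
    reported_by: str = ""         # name and phone number of the reporter
```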

Figure B-11. Example Risk Tracking Report

Risk Tracking Report (Example Report)

I. Risk Area Status: Design PF: High CF: High

Significant Design Risks:

1. Title: System Weight PF: High CF: High

Risk Event: Exceed system weight by 10%, decreasing the range and increasing fuel consumption.

Action: Examining subsystems to determine areas where weight may be reduced. Reviewing the requirement. Closely watching the effect on reliability and interoperability.

2. Title: Design Analysis PF: High CF: High

Risk Event: Failure Modes, Effects and Criticality Analysis (FMECA) is planned too late to identify and correct critical single-point failures prior to design freeze.

Action: Additional resources are being sought to expedite performance of the FMECA.

II. Risk Area Status: Supportability PF: High CF: Moderate/High

1. Title: Operational Support PF: High CF: Moderate/High

Risk Event: Power supply subcontractor is in financial trouble and may go out of business. No other known sources exist.

Action: Doing trade study to see if alternative designs have a broader power supply vendor base. Prime contractor is negotiating with the subcontractor to buy drawings for development of a second source.

Table B-11. Sample Watch List (columns: Potential Risk Area; Risk-Handling Actions; Due Date; Date Completed; Explanation; Action Code)

Potential Risk Area: Accurately predicting the shock environment shipboard equipment will experience.
• Risk-Handling Action: Use multiple finite element codes & simplified numerical models for early assessments. (Due 31 Aug 01; Action Code SE03)
• Risk-Handling Action: Shock test simple isolated deck and proposed isolated structure to improve confidence in predictions. (Due 31 Aug 02; Action Code SE03)

Potential Risk Area: Evaluating acoustic impact of the ship systems that are not similar to previous designs.
• Risk-Handling Action: Concentrate on acoustic modeling and scale testing of technologies not demonstrated successfully in large-scale tests or full-scale trials. (Due 31 Apr 01; Action Code SE031)
• Risk-Handling Action: Factor acoustic signature mitigation from isolated modular decks into system requirements; continue model tests to validate predictions for isolated decks. (Due 31 Aug 02; Action Code SE032)

Figure B-12. Example PIPT Risk Summary Report (rows are likelihood levels e down to a; columns are consequence levels a through e; each cell gives the number of identified risks with that combination):

e | 0  1  0  1  0
d | 0  0  1  1  2
c | 3  2  1  0  0
b | 4  3  5  2  1
a | 5  3  1  1  2
    a  b  c  d  e
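A short sketch of how the cell counts for such a summary might be tallied from the risk list; the sample (likelihood, consequence) pairs are invented for illustration:

```python
# Sketch of building the Figure B-12 matrix: count risks per (likelihood, consequence) cell.
from collections import Counter

risks = [("b", "c"), ("b", "c"), ("e", "d"), ("a", "a")]  # illustrative pairs

counts = Counter(risks)
for likelihood in "edcba":                            # rows from "e" down to "a"
    row = [counts[(likelihood, c)] for c in "abcde"]  # columns a through e
    print(likelihood, row)
```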

Table B-13. Examples of Process Metrics

Design Requirements:
• Development of requirements traceability plan
• Development of specification tree
• Specifications reviewed for: definition of all use environments; definition of all functional requirements for each mission performed

System Trade Studies:
• User needs prioritized
• Alternative system configurations selected
• Test methods selected

Design Process:
• Design requirements stability
• Producibility analysis conducted
• Design analyzed for: cost, parts reduction, manufacturability, testability

Integrated Test Plan:
• All developmental tests at system and subsystem level identified
• Identification of who will test (Government, contractor, supplier)

Failure Reporting:
• Contractor corporate-level management involved in failure reporting and corrective action process
• Responsibility for analysis and corrective action assigned to a specific individual with a close-out date

Manufacturing Plan:
• Plan documents the methods by which the design is to be built
• Plan contains sequence and schedule of events at contractor and subcontractor levels that defines use of materials, fabrication flow, test equipment, tools, facilities, and personnel
• Plan reflects manufacturing inclusion in the design process, including identification and assessment of design facilities

Table B-14. Example of Cost and Schedule Metrics

Cost metrics: cost variance; cost performance index; estimate at completion; management reserve.

Schedule metrics: schedule variance; schedule performance index; design schedule performance; manufacturing schedule performance; test schedule performance.


APPENDIX C

GLOSSARY

ACAT – Acquisition Category

AHP – Analytical Hierarchy Process

AMSAA – Army Materiel System Analysis Activity

APB – Acquisition Program Baseline

API/PM – Acquisition Program Integration/Program Management

ASP – Acquisition System Protection

BCS – Baseline Comparison System

BIT – Built-in Test

BMP – Best Manufacturing Program

CAIG – Cost Analysis Improvement Group

CAIV – Cost As an Independent Variable

CARD – Cost Analysis Requirements Description

CCA – Component Cost Analysis

CCDR – Contractor Cost Data Reporting

CDF – Cumulative Distribution Function

CDR – Critical Design Review

CER – Cost Estimating Relationship

CPM – Critical Path Method

CTD – Concept and Technology Development

CWBS – Contract Work Breakdown Structure

DAD – Defense Acquisition Deskbook

DAU – Defense Acquisition University

DBMS – Database Management System

DCMA – Defense Contract Management Agency

DFARS – Defense Federal Acquisition Regulation Supplement

DoD – Department of Defense

DoDD – DoD Directive

DoDI – DoD Instruction


DPG – Defense Planning Guidance

DR – Decision Review

DSMC – Defense Systems Management College

DT&E – Development, Test and Evaluation

DTSE&E – Director, Test, Systems Engineering, and Evaluation

EAC – Estimate At Completion

EMP – Electromagnetic Pulse

ESC – Electronic Systems Center

ESM – Electronic Warfare Support Measures

ESS – Environmental Stress Screening

EV – Earned Value

FMECA – Failure Mode, Effects and Criticality Analysis

FRACAS – Failure Reporting, Analysis, and Corrective Action System

GAO – General Accounting Office

GFE – Government Furnished Equipment

HWIL – Hardware-in-the-Loop

IBR – Integrated Baseline Review

IFF – Identification Friend or Foe

IIPT – Integrating Integrated Product Teams

IMP – Integrated Master Plan

IMS – Integrated Master Schedule

IOC – Initial Operational Capability

IPD – Integrated Product Development

IPPD – Integrated Product and Process Development

IPR – Interim Progress Review

IPT – Integrated Product Teams

KPP – Key Performance Parameters

LCC – Life-Cycle Cost

LFT&E – Live-Fire Test and Evaluation

LRIP – Low Rate Initial Production


M&E – Mechanical and Electrical

M&S – Modeling and Simulation

MAIS – Major Automated Information System

MDA – Milestone Decision Authority

MDAPs – Major Defense Acquisition Programs

MIS – Management Information System

MNS – Mission Need Statement

MOA – Memoranda of Agreement

MOU – Memoranda of Understanding

MS – Milestone

MTBF – Mean Time Between Failure

NDI – Non-Developmental Item

NSSN – New Nuclear Submarine

O&M – Operations and Maintenance

OIPT – Overarching Integrated Product Team

OLRDB – On-Line Risk Data Base

ORD – Operational Requirement Document

OSD – Office of the Secretary of Defense

OT&E – Operational Test and Evaluation

P&D – Production and Deployment

PDF – Probability Density Function

PIPT – Program Integrated Product Team

PLIPT – Program Level Integrated Product Team

PM – Program Manager

PMI – Project Management Institute

PMO – Program Management Office

PMWS – Program Manager’s Work Station

POE – Program Office Estimate

POM – Program Objective Memorandum

PRAG – Performance Risk Assessment Group

PRR – Production Readiness Review

PSR – Program Status Report


R&D – Research and Development

R&M – Reliability and Maintainability

RD&A – Research, Development and Acquisition

RAR – Risk Assessment Report

RFP – Request for Proposal

RIF – Risk Information Form

RMIS – Risk Management Information System

RMP – Risk Management Plan

RTR – Risk Tracking Report

SDD – System Development and Demonstration

SEI – Software Engineering Institute

SI – System Integration

SME – Subject-Matter Expert

SOW – Statement of Work

SPMN – Software Program Managers Network

SRE – Software Risk Evaluation

SRR – System Requirements Review

STA – Special Threat Assessment

STAR – Special Threat Assessment Report

T&E – Test and Evaluation

TAAF – Test, Analyze, and Fix

TEMP – Test and Evaluation Master Plan

TPM – Technical Performance Measurement

TRIMS – Technical Risk Identification and Mitigation Software

UAV – Unmanned Aerial Vehicle

UHF – Ultra-High Frequency

USC – United States Code

USD(AT&L) – Under Secretary of Defense, Acquisition, Technology, and Logistics

WBS – Work Breakdown Structure

APPENDIX D

QUANTIFYING EXPERT JUDGMENT

I. GENERAL

Most quantitative risk analysis techniques share a common need: the estimation of a probability of occurrence associated with a risk event. Often the estimation of probability data requires expert judgment, and inherent in judgment is a degree of uncertainty.

The challenge for the analyst is to obtain estimates in the areas of cost, schedule, and/or technical performance. These estimates often begin as qualitative information, which must then be converted to quantitative probability data so that the results can be represented as a probability density function (PDF), which is a key input to a number of different types of models (e.g., Monte Carlo simulations).

There are a number of methods which can be used to convert qualitative estimates into quantitative probability distributions. The remainder of this appendix will focus on a few of the most popular, practical, and accurate techniques for doing so. The techniques discussed were selected because they are relatively simple and easy to master. This factor is of paramount importance, because in most cases the analyst and those being interviewed will have neither the time nor the knowledge of more advanced techniques to accurately implement them. Finally, the use of these techniques does not preclude generating uncertain and/or erroneous PDFs; the quality of the resulting probability distributions will be no better than the interviewing technique used by the analyst, the level of knowledge of the experts interviewed, and the ability of the analyst to convert the information gleaned from participants into probability distributions.
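As a hedged illustration of that end use, the sketch below treats an expert's lowest/most-likely/highest estimates as a triangular PDF for each activity and feeds them into a simple Monte Carlo schedule model; the activity values are invented for the example:

```python
# Sketch: elicited three-point estimates as triangular PDFs in a Monte Carlo model.
import random

activities = [(10, 14, 27), (5, 8, 12), (20, 24, 40)]  # (low, mode, high) days, illustrative

durations = []
for _ in range(10_000):
    total = sum(random.triangular(low, high, mode) for low, mode, high in activities)
    durations.append(total)

durations.sort()
p80 = durations[int(0.80 * len(durations))]  # 80th-percentile program duration
print(f"80% confidence schedule: {p80:.1f} days")
```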

The following techniques will be discussed inthis appendix:

1. Diagrammatic

2. Direct

3. Betting

4. Modified Churchman/Ackoff technique

5. Delphi Approach

II. DESCRIPTION OF TECHNIQUES

1. Diagrammatic

Many analysts prefer the diagrammatic method as a way of capturing and representing an expert's judgment. This method is a simple way of describing an expert's uncertainty by presenting him with a range of PDF diagrams and having the expert select the shape of the PDF which is considered to reflect most accurately the schedule, cost, or technical parameter in question. Using this method, the analyst can ascertain whether the PDF is symmetric or skewed, the degree of variability, etc. For example, if the expert feels that there is a great amount of risk associated with completing an activity within a certain period of time, a PDF skewed to the right may be selected. Likewise, activities with little risk may be skewed to the left. If the expert feels that each value over a given range is equally likely to occur, a uniform distribution may be most appropriate. The analyst and the expert, working together, can select the PDF which most accurately reflects the schedule, cost, or technical item under question.

The diagrammatic method of obtaining PDFs is applicable when the expert has a sound understanding of probability concepts and can merge that understanding with his understanding of the parameters under question. In this way the expert can accurately identify the appropriate PDFs.

2. Direct

The direct method is a relatively simple technique which can be used to obtain subjective probability distributions by asking the expert to assign probabilities to a given range of values.

The direct method of obtaining PDFs is applicable (1) when questions can be phrased to the respondents in such a way that no confusion is likely to exist in the respondent's mind, and (2) when the results will not violate the axioms of probability. This method is applicable when time/resource constraints do not allow for more complex, resource-intensive methods.

The application of the direct method is quite simple. The analyst would define a relevant range and discrete intervals for the parameter for which the PDF is to be constructed. For example, the analyst might define the relevant time duration for a program activity (test of a piece of equipment) to be between 0 and 27 days. The analyst would then break this relevant range down into intervals, say intervals of three days; the resulting formulation would look as follows:

0 – 3 days
4 – 7 days
8 – 11 days
12 – 15 days
16 – 19 days
20 – 23 days
24 – 27 days

Given these intervals over the relevant range, the analyst would then query the expert to assign relative probabilities to each range. From this, the form of the PDF could be identified. It is imperative that the axioms of probability not be violated.

Besides the application already described, the analyst could request the expert to provide a lowest possible value, a most likely value, and a highest possible value. The analyst then makes an assumption about the form of the density function. That is, is the PDF uniform, normal, beta, triangular, etc.?
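A minimal sketch of the direct method for the 0-27 day example, assuming illustrative expert weights and checking that the axioms of probability are respected:

```python
# Sketch of the direct method: normalize expert-assigned relative likelihoods
# over the 0-27 day intervals into a discrete PDF (weights illustrative).
intervals = ["0-3", "4-7", "8-11", "12-15", "16-19", "20-23", "24-27"]
weights = [0, 3, 5, 6, 4, 2, 0]   # expert's relative likelihood for each interval

total = sum(weights)
pdf = {interval: w / total for interval, w in zip(intervals, weights)}
assert abs(sum(pdf.values()) - 1.0) < 1e-9  # probabilities must sum to one
```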

3. Betting

One method of phrasing questions to experts in order to obtain probabilities for ranges of values (cost/schedule) states the problem in terms of betting. A form of this method, which was described by Winkler (1967), helps the expert (assessor) assess probabilities of events which are in accordance with his judgment. The assumption with this method is that the judgment of the expert may be fully represented by a probability distribution, f(x), of a random variable x. This method offers the expert a series of bets.

Under ideal circumstances, the bets are actual, not hypothetical. That is, in each case the winner of the bet is determined and the amount of money involved actually changes hands. However, under our circumstances, this is not feasible (or legal!). In each case, the expert must choose between two bets (the expert is not allowed to refrain from betting): a bet with a fixed probability q of winning and 1–q of losing, and a bet dependent on whether or not some event E (a particular program activity duration range, or cost range) occurs. The bets can be depicted as follows:

Bet 1a – win $A if event E occurs; lose $B if event E does not occur.

Bet 1b – win $A with probability q; lose $B with probability 1–q.

The expected values of bets 1a and 1b to the expert are, respectively, Ap + Bp – B and Aq + Bq – B, where p is the probability of event E occurring. The following inferences may be drawn from the expert's decision: if bet 1a is chosen, Ap + Bp – B > Aq + Bq – B, so p > q; likewise, if 1b is selected, p < q.

By repeating the procedure, varying the value of q, the probability of event E can be ascertained: it is the point at which the expert is indifferent between bets 1a and 1b, where p = q. The degree of precision is dependent on the number of bets and the incremental changes of the value of q.

A way of avoiding the problem of a large number of bets to obtain p would be to assess the probabilities through the use of direct interrogation, and then to use the betting situation as a check on the assumed probabilities.

To complete a PDF, the analyst repeats this procedure over a relevant range of interval values. The analyst then plots the points at the center of the range for each event and smooths in a curve, so that the area under it equals one, as in Figure D-1. The analyst must ensure that all of the relevant axioms of probability are maintained.

Many people, when questioned one way, are likely to make probability statements that are inconsistent with what they will say when questioned in another, equivalent way, especially when they are asked for direct assignment of probabilities. As the number of events increases, so does the difficulty of assigning direct probabilities. Therefore, when this is a problem, the betting method is most appropriate.

To apply the betting technique, we will select one interval of the relevant range to demonstrate how this method can be used to obtain probability estimates and, hence, PDFs. The bet is established as follows:

Bet 1a – win $10,000 if cost is between $15,000 and $20,000; lose $5,000 if cost is not between $15,000 and $20,000.

Bet 1b – win $10,000 with probability q; lose $5,000 with probability 1–q.

The value of q is established initially, and the expert is asked which of the two bets he would take.

Figure D-1. Fitting a Curve to Expert Judgment (probability plotted against cost in $K over the range 10–30, with elicited interval probabilities from about 0.08 to 0.33 smoothed into a curve)

The value of q is then varied systematically, either increased or decreased. The point at which the expert is indifferent between the two bets (with the associated q value) provides the probability of the cost being between $15,000 and $20,000. This process is repeated for each interval, and the results are used to create the PDF associated with the cost of that particular program event.
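The indifference search can be sketched as a bisection on q: q is raised while the expert prefers Bet 1a (implying p > q) and lowered otherwise. The callback below stands in for the live interview, and the simulated judgment of p = 0.25 is illustrative:

```python
# Sketch of the betting method: bisect on q until the expert is indifferent
# between Bet 1a (event-dependent) and Bet 1b (fixed probability q).
def elicit_probability(prefers_event_bet, tol=0.01):
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        q = (lo + hi) / 2
        if prefers_event_bet(q):   # expert takes Bet 1a, so p > q
            lo = q
        else:                      # expert takes Bet 1b, so p < q
            hi = q
    return (lo + hi) / 2

# Simulated expert whose true judgment is p = 0.25 for the $15,000-$20,000 interval.
p_estimate = elicit_probability(lambda q: 0.25 > q)
print(round(p_estimate, 2))        # approximately 0.25
```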

4. Modified Churchman/Ackoff Technique

Another method which can be used to ascertain PDFs for cost, schedule, or performance parameters is the "Modified Churchman/Ackoff method." This technique builds upon procedures which were presented by Churchman and Ackoff in 1954. The technique was developed as a means to order events in terms of likelihood. The modification to the technique was performed so that, once the ordering of event likelihoods had been accomplished, relative probabilities could be assigned to the events and, finally, probability density functions developed. So as to be relevant for our purposes, events are defined as range values for cost, schedule, or performance (activity durations) relating to the outcome of a specific activity in a program.

The modified Churchman/Ackoff technique is most appropriate when there is one expert, and that expert has a thorough understanding of the relative ranking of cost/schedule ranges and a limited understanding of probability concepts. The remainder of this section was extracted and modified from the Compendium on Risk Analysis Techniques (1972, see references). Note that while the mathematical calculations appear to make this a very precise technique, it is still an approximation of an expert's judgment and should not be interpreted to be more exact than other similar techniques.

The first step in applying the modified Churchman/Ackoff technique is to define the relevant range of values. That is, the end points, along a range of values with zero probability of occurrence, must be specified. These values need only be any low and high values which the expert specifies as having zero probability of occurrence. Next, ranges of individual values within the relevant range must be determined. These ranges of values, which will form the set of comparative values for this technique, are specified by the following approach:

(1) Start with the low value in the relevant range.

(2) Progress upward on the scale of values until the expert is able to state a simple preference regarding the relative probabilities of occurrence of the two characteristic values. If he is able to say that he believes one value has either a greater chance or a lesser chance of occurring than the other of the two values, then it is inferred that the expert is able to discriminate between the two values.

(3) Using the higher of the two previously specified scale values as a new basis, repeat step (2) to determine the next value on the scale.

(4) Repeat steps (2) and (3) until the high end-point value of the range of parameter values is approached.

Employing this procedure for the duration required to successfully test a piece of equipment may yield the results shown in Table D-1.

Table D-1. Characteristic Values for Equipment Test Durations

01 = 0 – 3 days
02 = 4 – 7 days
03 = 8 – 11 days
04 = 12 – 15 days
05 = 16 – 19 days
06 = 20 – 23 days
07 = 24 – 27 days

The descending order of probability of occurrence can be determined by applying the following paired-comparison method.

Ask the expert to compare, one at a time, the first interval value (01) of the set to each of the other values (02, 03, etc.), stating a preference for the value in each group of two values that he believes has the greater chance of occurring (denoting a greater probability of occurrence by >, an equal chance by =, and a lesser chance by <). The following hypothetical preference relationships could result for a set of seven values: 01 < 02, 01 < 03, 01 < 04, 01 < 05, 01 < 06, 01 < 07.

Next, ask the expert to compare, one at a time, the second interval value (02) of the set to each of the other interval values succeeding it in the set (i.e., 03, 04, etc.). The following preference relationships might result: 02 < 03, 02 < 04, 02 < 05, 02 < 06, 02 < 07. Continue this process until all values (0i) have been compared.

Now total the number of times each value (0i) was preferred over the other values. The results of this procedure are listed in Table D-2.

probability (e.g., X1). Then, as in the first step,question the expert regarding the relative chanceof occurrence of each of the other values onthe ordinal scale in Table D-3 with respect tothe value at the top of the scale. Assigning X1 arating of 100 points, the expert is first interro-gated as to his feeling of the relative chance ofoccurrence of the second highest scale value(e.g., X2), with respect to X1. Does it have 25percent chance? 60 percent? 70 percent? 80percent? As much chance of realization as X1?The relative probability rating, based on 100points (i.e., 100 percent as much chance), willthen be posted for X2.

Next, question the expert about the relativechance of occurrence of the next highest scale(e.g., X3) first with respect to the most preferredvalue (X1), and then with respect to the secondmost preferred scale value (X2). The resultingnumerical ratings should concur. For example,if the expert decides that X2 has 8/10 as muchchance of occurring as does X1, and that X3has 1/2 as much chance as X1, and 5/8 as muchchance as X2, the ratings become X1 = 100points, X2 = 80 points, and X3 = 50 points.

This process continues for each successivelylower interval value on the ordinal scale asshown in Table D-3. Determine the relativenumber of points to be accorded each value with

List the values in descending order of simpleordinal probability preference and change thesymbols for each value from 0i to Xj as shownin Table D-3.

Arbitrarily assign a rating of 100 points to thecharacteristic value with the highest subjective

CharacteristicValue Preference New(Days) Rank Symbol

0 – 3 04 1 X14 – 7 03 2 X2

8 – 11 05 3 X312 – 15 02 4 X416 – 19 06 5 X520 – 23 01 6 X624 – 27 07 7 X7

Table D-3.Transformation

04 = 6 times03 = 5 times05 = 4 times02 = 3 times06 = 2 times01 = 0 times07 = 0 times

Table D-2.Summary of Preference Relationships
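The tallying and relabeling of Tables D-2 and D-3 can be expressed in a few lines of Python. This is a minimal sketch assuming the hypothetical paired-comparison results given in the text; the names and data structure are illustrative, not part of the Guide.

    from collections import Counter

    # Each pair (a, b) records a stated preference "a has a greater
    # chance of occurring than b"; the tie (01 = 07) produces no pair.
    preferences = [
        ("04", "01"), ("04", "02"), ("04", "03"), ("04", "05"),
        ("04", "06"), ("04", "07"),
        ("03", "01"), ("03", "02"), ("03", "05"), ("03", "06"), ("03", "07"),
        ("05", "01"), ("05", "02"), ("05", "06"), ("05", "07"),
        ("02", "01"), ("02", "06"), ("02", "07"),
        ("06", "01"), ("06", "07"),
    ]

    values = ["01", "02", "03", "04", "05", "06", "07"]
    wins = Counter(a for a, _ in preferences)        # Table D-2 tallies
    ranked = sorted(values, key=lambda v: -wins[v])  # descending preference
    new_symbol = {old: "X%d" % i for i, old in enumerate(ranked, start=1)}
    print(wins)        # 04: 6, 03: 5, 05: 4, 02: 3, 06: 2 (01, 07: 0)
    print(new_symbol)  # 04 -> X1, 03 -> X2, 05 -> X3, ..., 07 -> X7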

Arbitrarily assign a rating of 100 points to the characteristic value with the highest subjective probability (e.g., X1). Then, as in the first step, question the expert regarding the relative chance of occurrence of each of the other values on the ordinal scale in Table D-3 with respect to the value at the top of the scale. With X1 rated at 100 points, first ask the expert his feeling of the relative chance of occurrence of the second highest scale value (e.g., X2) with respect to X1. Does it have a 25 percent chance? 60 percent? 70 percent? 80 percent? As much chance of realization as X1? The relative probability rating, based on 100 points (i.e., 100 percent as much chance), is then posted for X2.

Next, question the expert about the relative chance of occurrence of the next highest scale value (e.g., X3), first with respect to the most preferred value (X1) and then with respect to the second most preferred scale value (X2). The resulting numerical ratings should concur. For example, if the expert decides that X2 has 8/10 as much chance of occurring as X1, and that X3 has 1/2 as much chance as X1 and 5/8 as much chance as X2, the ratings become X1 = 100 points, X2 = 80 points, and X3 = 50 points.

This process continues for each successively lower interval value on the ordinal scale shown in Table D-3: determine the relative number of points to be accorded each value with respect to the top scale value and with respect to all other values on down the scale that are above the characteristic value in question.

In the event of minor disparities between relative probability ratings for a given value, the average of all such ratings for that characteristic value might be computed. For example, X4 might be determined to be 3/10 as probable as X1, 1/4 as probable as X2, and 1/2 as probable as X3. The three absolute ratings for X4 are thus inferred to be 30, 20, and 25 points, respectively. The average of these ratings is 25. However, before averaging such figures, it might be beneficial to have the expert reevaluate his relative ratings for X4 with respect to X1, X2, and X3.

As a result of the above process, the relative probability ratings shown in Table D-4 might be attained.

R(X1) = 100 probability points
R(X2) = 80 probability points
R(X3) = 50 probability points
R(X4) = 25 probability points
R(X5) = 10 probability points
R(X6) = 0 probability points
R(X7) = 0 probability points

Table D-4. Relative Probability Ratings

Finally, the scale of relative probability values can be converted directly into a scale of actual probability density values by letting P(X1) equal the actual subjective probability of occurrence of the highest value. Then P(X2) is defined as:

P(X2) = [R(X2)/R(X1)] P(X1)

Similarly, P(Xi) is defined as:

P(Xi) = [R(Xi)/R(X1)] P(X1), for i = 2, 3, …, 7.

Assuming that the independent characteristic values evaluated represent all possible values attainable by the component characteristic, the respective probabilities must sum to 1.0 (i.e., P(X1) + P(X2) + P(X3) + P(X4) + P(X5) + P(X6) + P(X7) = 1.0). Substituting the expressions for P(Xi), i = 2, …, 7, it follows that:

P(X1) + [R(X2)/R(X1)] P(X1) + [R(X3)/R(X1)] P(X1) + [R(X4)/R(X1)] P(X1) + [R(X5)/R(X1)] P(X1) + [R(X6)/R(X1)] P(X1) + [R(X7)/R(X1)] P(X1) = 1.

Solving this equation for P(X1), the remaining P(Xi), i = 2, …, 7, can be determined using the relationship:

P(Xi) = [R(Xi)/R(X1)] P(X1).

As an illustration, consider the relative probability ratings in Table D-4. Using these values, the preceding equation becomes:

P(X1) + (80/100) P(X1) + (50/100) P(X1) + (25/100) P(X1) + (10/100) P(X1) + (0/100) P(X1) + (0/100) P(X1) = 1.

Solving this equation, P(X1) = 0.377.

This value can be used to determine the remaining probabilities as follows:

P(X2) = 0.80 (0.377) = 0.301
P(X3) = 0.50 (0.377) = 0.189
P(X4) = 0.25 (0.377) = 0.095
P(X5) = 0.10 (0.377) = 0.038
P(X6) = 0.0 (0.377) = 0.000
P(X7) = 0.0 (0.377) = 0.000

The resulting probability density appears in Table D-5.

Component Characteristic Value   Probability
X1                               0.377
X2                               0.301
X3                               0.189
X4                               0.095
X5                               0.038
X6                               0.000
X7                               0.000
Total                            1.000

Table D-5. Probability Density
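As an arithmetic cross-check, the conversion from the relative ratings of Table D-4 to the probability density of Table D-5 can be coded directly from the relationship P(Xi) = [R(Xi)/R(X1)] P(X1). This is a minimal sketch with illustrative names, not an implementation prescribed by the Guide; small differences from Table D-5 in the third decimal place are rounding effects.

    ratings = {"X1": 100, "X2": 80, "X3": 50, "X4": 25, "X5": 10,
               "X6": 0, "X7": 0}                    # Table D-4

    r1 = ratings["X1"]
    # Requiring the probabilities to sum to 1 gives
    # P(X1) = 1 / sum(R(Xi)/R(X1)) = 1 / 2.65 = 0.377.
    p1 = 1.0 / sum(r / r1 for r in ratings.values())

    density = {x: (r / r1) * p1 for x, r in ratings.items()}
    for x, p in density.items():
        print(x, round(p, 3))   # X1 0.377, X2 0.302, X3 0.189, ...
    print("total", round(sum(density.values()), 3))  # total 1.0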

5. Delphi Approach

In many cases, expert judgement does not reside solely with one individual but is spread among multiple experts. Committee approaches to obtaining a group assessment have been found to contain problems relating to interpersonal pressures, to a degree that caused researchers at the RAND Corporation to devise a method, known as the Delphi technique, to avoid those pressures.

The Delphi technique has become well known in management circles but is subject to misconception. Too often the term is used to identify a committee or multiple-interview process, and these do not share the advantages of the Delphi technique.

The Delphi technique has been extended in recent years to cover a wide variety of types of group interaction. The technique can be used for group estimation, that is, the use of a group of knowledgeable individuals to arrive at an estimate of an uncertain quantity. The quantity can be a cost, a time period associated with an event, or a performance level.

The Delphi technique is most appropriate when:

• The problem does not lend itself to precise analytical techniques but can benefit from subjective judgements on a collective basis.

• The individuals needed to contribute to the examination of a broad or complex problem have no history of adequate communication and may represent diverse backgrounds with respect to experience or expertise.

• More individuals are needed than can effectively interact in a face-to-face exchange.

• Time and cost make frequent group meetings infeasible.

• The efficiency of face-to-face meetings can be increased by a supplemental group communication process.

• Disagreements among individuals are so severe or politically unpalatable that the communication process must be refereed and/or anonymity assured.

• The heterogeneity of the participants must be preserved to assure validity of the results, i.e., avoidance of domination by quantity or by strength of personality (the "bandwagon effect").

The Delphi technique differs from other methods of obtaining a group opinion because it physically separates the group's members from one another in order to reduce irrelevant interpersonal influences. Properly carried out, the technique is facilitated by an analyst who obtains each panel member's reasons for his opinion and reduces the opinions and reasons to standard statements in order to preserve anonymity. The analyst then shows each panel member the aggregated opinions of the other panel members in statistical terms, provides the reasons justifying the opinions that differ from the member's own, and requests reevaluation and further substantiation. This iterative feeding back continues until no further substantial change results. At this point, the moderator takes the final individual opinions and computes a set of median values to represent the group opinion. The median value, rather than the average, is used as the central estimate to prevent the estimate from being overly influenced by extreme individual values.
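The preference for the median over the average is easy to see numerically. The toy example below uses hypothetical figures, not data from the Guide: a single extreme panel member pulls the mean well away from the group consensus, while the median barely moves.

    from statistics import mean, median

    # Hypothetical final-round Delphi estimates (e.g., months to complete
    # a test program); one panelist is far outside the consensus.
    estimates = [10, 11, 12, 12, 13, 14, 60]

    print(round(mean(estimates), 1))   # 18.9 - dragged by the extreme value
    print(median(estimates))           # 12   - robust central estimate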

One technique which holds much promise for the future as a means of capturing expert judgement is the "expert support system." Ideally, an expert support system would lead the expert(s) through a series of parameter-specific questions (cost, schedule, and possibly performance) and generate PDFs based on the responses.

III. RELIABILITY

The reliability of the PDFs obtained through these techniques is affected by a number of factors. Foremost is the degree to which the so-called "expert" is in fact an expert. The better the expert's understanding of the parameter being modeled, the more reliable the resulting PDFs will be. The burden also falls on the analyst to select the technique most appropriate for obtaining PDFs. For example, if expertise resides with more than one expert, a Delphi technique would result in much more reliable PDFs than would a direct method of asking only one expert. Likewise, if the expert has very little understanding of probability concepts, it would be inappropriate to ask him to select a PDF from a visual list of options. Under these circumstances, the modified Churchman-Ackoff method or a betting technique would most likely result in more reliable PDFs.

In summary, much of the reliability of the PDFs is predicated on the techniques the analyst selects for constructing them. It is therefore important that the analyst know when each technique is most appropriate, given the unique circumstances of the specific program office.


APPENDIX E

BIBLIOGRAPHY

AFMC 63-101 Update, Risk Management Handbook, Appendices to Augment AFMC Pamphlet 63-101, 4 June 1996.

AFMC Pamphlet 63-101, Risk Management, 9 July 1997.

AMC-P 70-27, Guidance for Integrated Product and Process Management, Volume 1, Concept Implementation, 15 March 1996.

AMSAA Special Publication 71, AMSAA Risk Assessment Primer, 1995.

AT&T and Department of the Navy, Practical Engineering Guides for Managing Risk: Moving a Design into Production, McGraw-Hill, 1993.

AT&T and Department of the Navy, Practical Engineering Guides for Managing Risk: Design To Reduce Technical Risk, McGraw-Hill, 1993.

AT&T and Department of the Navy, Practical Engineering Guides for Managing Risk: Design's Impact on Logistics, McGraw-Hill, 1993.

AT&T and Department of the Navy, Practical Engineering Guides for Managing Risk: Testing to Verify Design and Manufacturing Readiness, McGraw-Hill, 1993.

Biery, F., D. Hudak, and S. Gupta, "Improving Cost Risk Analysis," The Journal of Cost Analysis, Spring 1994, pp. 57-86.

Boehm, Barry W., "Software Risk Management: Principles and Practices," IEEE Software, January 1991.

Carnegie Mellon University, Software Engineering Institute, Continuous Risk Management Guidebook, 1996.

Charette, R. N., Software Engineering Risk Analysis and Management, McGraw-Hill, 1989.

Conrow, Edmund H., "The Use of Ordinal Risk Scales in Defense Systems Engineering," Acquisition Research Symposium Proceedings, Defense Systems Management College, June 1995.

Conrow, Edmund H., Some Limitations of Quantitative Risk Analysis Approaches Used in Project Management, developed for the Office of the Secretary of Defense, April 1998.

Conrow, Edmund H., and Patty S. Shishido, "Implementing Risk Management on Software-Intensive Projects," IEEE Software, May/June 1997.

Conrow, Edmund H., "Effective Risk Management: Some Keys to Success," American Institute of Aeronautics and Astronautics, 2000.

Conrow, Edmund H. (principal author), Risk Management Critical Process Assessment Tool, Version 2, 9 June 1998.

CVN 77, Risk Management Plan, November 1998.

DAU, Risk Focus Area of the Program Management Community of Practice (http://www.pmcop.dau.mil).

DAU, Scheduling Guide for Program Managers, 2001.

DoD 4245.7-M, Transition from Development to Production, September 1985.

DoD 5000.2-R, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs, 5 April 2002.

DoD 5000.4-M, Cost Analysis Guidance and Procedures, 1992.

DoD Defense Acquisition Deskbook, a compendium of acquisition-related mandatory and discretionary guidance, including risk management (http://www.deskbook.osd.mil).

DoD Inspector General Report 96-162, Audit Report on the Audit of Risk Management Programs for Defense Acquisition Systems, June 1996.

DoDD 5000.1, The Defense Acquisition System, w/change 1, 4 January 2001.

DoDI 5000.2, Operation of the Defense Acquisition System, 5 April 2002.

DoDD 5000.4, OSD Cost Analysis Improvement Group, 1992.

DSMC, Earned Value Management Textbook, December 1996.

DSMC, Program Management Teaching Note, Acquisition Program Protection, July 2001.

DSMC, Program Risk Management Teaching Note, January 2002.

DSMC, Systems Engineering Management Guide, 1990.

EIA Engineering Standard, IS-632, Systems Engineering, 1994.

Garvey, P. R., and Z. F. Lansdowne, "Risk Matrix: An Approach for Identifying, Assessing, and Ranking Program Risks," Air Force Journal of Logistics, Vol. 25, No. 1, pp. 16-19, 1998.

Garvey, P. R., Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, Marcel Dekker, Incorporated, 1999.

General Accounting Office, Technical Risk Assessment: The Status of Current DoD Efforts, PEMD-86-5, Washington, DC, April 1986.

Hayes, R. W., et al., Risk Management in Engineering Construction, Special SERC Report by Project Management Group, UMIST, Thomas Telford Ltd., London, December 1986.

Hulett, D. T., Project Cost Risk Assessment, Hulett & Associates, Los Angeles, CA, 1994.

Hulett, D. T., "Project Schedule Risk Assessment," Project Management Journal, March 1995.

Hulett, D. T., "Schedule Risk Analysis Simplified," Project Management Journal, July 1996.

JLC Joint Group on Systems Engineering, Practical Software Measurement, 27 March 1996.

Keeney, R., and D. von Winterfeldt, "On the Use of Expert Judgment on Complex Technical Problems," IEEE Transactions on Engineering Management, Vol. 36, No. 2, May 1989, pp. 83-86.

Linstone, H. A., and M. Turoff, The Delphi Method: Techniques and Applications, Addison-Wesley Publishing Company, Reading, MA, 1975.

MIL-STD-1388-1A, Logistics Support Analysis, Task 203, Comparative Analysis, 1988.

MIL-STD-961D, Department of Defense Standard Practice: Defense Specifications, March 1995.

NAVSO P-3686, Top Eleven Ways to Manage Technical Risk, October 1998.

NAVSO P-6071, Best Practices: How to Avoid Surprises in the World's Most Complicated Technical Process, March 1986.

New Attack Submarine Risk Management Plan, Draft, 15 April 1994.

Office of the Assistant Secretary of the Navy (RD&A), Methods and Metrics for Product Success, July 1994.

Pariseau, Richard, and Ivar Oswalt, "Using Data Types and Scales for Analysis and Decision Making," Acquisition Review Quarterly, Spring 1994, pp. 145-159.

PMI, Project and Program Risk Management: A Guide to Managing Project Risks and Opportunities, preliminary edition, PMBOK Handbook Series, Volume 6, 1992.

Project Management Institute, A Guide to the Project Management Body of Knowledge, Chapter 11, Project Risk Management, 2000.

SBIRS Risk Management Implementation, Lockheed Martin presentation to the Risk Management Symposium, Manhattan Beach, CA, 2-4 June 1997.

Scholtes, Peter R., The Team Handbook: How to Use Teams to Improve Quality, Joiner Associates, Inc., Madison, WI, 1988.

Smith, P. L., and S. A. Book, Reducing Subjective Guesswork and Maintaining Traceability When Estimating the 'Risks' Associated With a Cost Estimate, The Aerospace Corporation, 1992.

Software Engineering Institute (SEI), Continuous Risk Management Guidebook, 1996.

Space and Naval Warfare Systems Command, Open Systems Acquisition & Supportability Guide, NGCR Document No. AST 003, Version 1.0, 31 December 1996.

TR-6304-3-1, Air Force Warranty Cost/Benefit Analysis Handbook, 30 September 1993.

Tri-Service Technical Brief, 002-93-08, Environmental Stress Screening Guidelines, July 1993.

Tri-Service Technical Brief, The TAAF Process: A Technical Brief for TAAF Implementation, January 1989.

Tversky, A., and D. Kahneman, "Judgment Under Uncertainty: Heuristics and Biases," Science, Vol. 185, September 1974, pp. 1124-31.

Tversky, A., D. Kahneman, and P. Slovic, Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, 1982.

Under Secretary of Defense (Acquisition & Technology) [USD(A&T)] Memorandum, Reducing Life Cycle Costs for New and Fielded Systems.

U.S. Air Force, "Guidelines for Successful Acquisition and Management of Software Intensive Systems," Version 2, June 1996.

U.S. Air Force Materiel Command, Acquisition Risk Management Guide, 1997.

U.S. Navy, Methods and Metrics for Product Success: A Dual-Use Guide, 1994.
