NATO Code of Best Practice for C2 Assessment



A DoD CCRP/NATO Collaboration

This major revision to the Code of Best Practice for C2 Assessment is the product of a NATO Research and Technology Organisation (RTO) sponsored Research Group (SAS-026). It represents over a decade of work by many of the best analysts from the NATO countries. A symposium (SAS-039) was hosted by the NATO Consultation, Command and Control Agency (NC3A) and provided the venue for a rigorous peer review of this code.

This publication is the latest in a series produced by the Command and Control Research Program (CCRP) under the auspices of the Assistant Secretary of Defense (C3I). The CCRP has demonstrated the importance of having a research program focused on the national security implications of the Information Age. The research sponsored and encouraged by the CCRP contributes to the development of the theoretical foundations necessary to support the Information Age transformation of the Department. Other CCRP initiatives are designed to acquaint military and civilian leaders with emerging issues related to transformation. This CCRP Publication Series is a key element of this effort.

Check our Web site for the latest CCRP activities and publications.

This code is also available on our Web site.

www.dodccrp.org


DoD Command and Control Research Program

ASSISTANT SECRETARY OF DEFENSE (C3I) & CHIEF INFORMATION OFFICER

Mr. John P. Stenbit

PRINCIPAL DEPUTY ASSISTANT SECRETARY OF DEFENSE (C3I)

Dr. Linton Wells, II

SPECIAL ASSISTANT TO THE ASD(C3I) & DIRECTOR, RESEARCH AND STRATEGIC PLANNING

Dr. David S. Alberts

Opinions, conclusions, and recommendations expressed or implied within are solely those of the authors. They do not necessarily represent the views of the Department of Defense, or any other U.S. Government agency. Cleared for public release; distribution unlimited.

Portions of this publication may be quoted or reprinted without further permission, with credit to the DoD Command and Control Research Program, Washington, D.C. Courtesy copies of reviews would be appreciated.

Library of Congress Cataloging-in-Publication Data

NATO code of best practice for command and control assessment.—Rev. ed.
p. cm.
ISBN 1-893723-09-7 (pbk)
1. Command and control systems—North America—Evaluation. 2. Command and control systems—Europe—Evaluation. 3. North Atlantic Treaty Organization—Armed Forces—Equipment—Quality control. I. Title.
UB212 .N385 2002
355.3'304180973--dc21
2002013927

October 2002

NATO Code of Best Practice for C2 Assessment

Revised 2002


Table of Contents

Chapter 1—Introduction
Chapter 2—Preparing For Success: Assessment Participants, Relationships, and Dynamics
Chapter 3—Problem Formulation
Chapter 4—Solution Strategies
Chapter 5—Measures of Merit
Chapter 6—Human Organizational Factors
Chapter 7—Scenarios
Chapter 8—Methods and Tools
Chapter 9—Data
Chapter 10—Risk and Uncertainty
Chapter 11—Products
Annex A—References
Annex B—Acronyms
Annex C—The MORS Code of Ethics
Annex D—Human And Organisational Issues Checklist
Annex E—Alternate Definitions of Risk and Uncertainty
Annex F—SAS-026 History

List of Tables and Figures

Table 1.1—Comparison of Symmetric, Conventional Warfare and OOTW Missions & Principles
Figure 2.1—Example of the Mapping of Roles onto Players—for a Complex C2 Project (the Evaluation of Immediate Reaction Task Force (Land) C2 Concept)
Table 2.1—Knowledge, Capabilities and Skills Needed by OR/OA Cell Team Members
Figure 2.2—C2 Assessment Process
Figure 3.1—The Formulated Problem
Figure 3.2—Problem Formulation
Figure 4.1—From Problem Formulation to Solution Strategy
Figure 4.2—Solution Strategy
Figure 4.3—Overall Study Plan
Figure 5.1—Relationships of Measures of Merit
Table 5.1—Validity Criteria of Measures
Table 5.2—Reliability Criteria of Measures
Table 5.3—Other Useful Criteria
Table 5.4—Examples of Time- and Accuracy-Based Measures
Table 5.5—Examples of MoPs and MoCEs
Table 5.6—Normality Indicators
Figure 6.1—Representation of the C2 System, based on Mandeles et al. (1996)
Figure 6.2—Decisionmaking Drivers
Figure 7.1—The Scenario Framework
Figure 7.2—Intersections in Hierarchy
Figure 7.3—Segmentation Hierarchy Range
Table 8.1—Some Recommended Tools by Category Type
Figure 8.1—Model Linkage
Figure 8.2—Model-Tool Hierarchy
Figure 8.3—Modelling Guidelines
Figure 9.1—Data Taxonomy
Figure 10.1—Illustration of the Variety of Variables Relevant to Decisionmaking


CHAPTER 1

Introduction

So the principles which are set forth in this treatise will, when taken up by thoughtful minds, lead to many another more remarkable result; and it is to be believed that it will be so on account of the nobility of the subject, which is superior to any other in nature.

—Galileo Galilei (1638)

Command and Control Assessment Challenges

NATO and its member nations are in the midst of a revolution in military affairs. There are three major dimensions to this revolution—a geopolitical dimension, a technological dimension, and a closely coupled conceptual dimension. This multidimensional revolution poses significant new challenges for analysis in general and for command and control assessment in particular.

The changed geopolitical context is characterised by a shift from a preoccupation with a war involving NATO and the Warsaw Pact to a concern for a broad range of smaller military conflicts and Operations Other Than War (OOTW). Analysts will increasingly be called upon to provide insights into these non-traditional operations.

Advances in technology, particularly information-related technologies, offer military organisations unprecedented opportunities to significantly reduce the fog and friction traditionally associated with conflict. At the same time, they may prove to be challenges in themselves across a wide variety of realms—technical, organisational, and cultural.

To the extent that they can be achieved, significantly reduced levels of fog and friction offer an opportunity for the military to develop new concepts of operations, new organisational forms, and new approaches to Command and Control (C2), as well as to the processes that support it. Analysts will be increasingly called upon to work in this new conceptual dimension in order to examine the impact of new information-related capabilities coupled with new ways of organising and operating.

Definition of Command and Control

C2 has been defined by NATO as Military Function 01: "The Organisation, Process, Procedures and Systems necessary to allow timely political and military decisionmaking and to enable military commanders to direct and control military forces." (NATO 1996) C2 systems are further defined in NATO documents to include: headquarters facilities, communications, information systems, and sensors & warning installations. (NATO 1998).

Other terms are used in NATO member nations that are synonymous with, or closely related to, C2. These include Command, Control, and Communications or Consultation (C3), and Computers (C4), and Intelligence ([C3I] or [C4I]), and Surveillance and Reconnaissance (C4ISR). The term CIS is sometimes used to refer to command information systems. More recently the term "C2" has referred to the collaborative and consultative processes that are an inherent part of coalition operations.

For the purposes of this Code of Best Practice (COBP), the term C2 is intended to be an umbrella term that encompasses the concepts, issues, organisations, activities, processes, and systems associated with the NATO definition of C2 as well as the other terms enumerated above.

Uniqueness of C2 Analyses and Issues

The focus of military research and analysis has predominantly been on the physical domain. C2 issues differ in fundamental ways from physics-dominated problems. C2 deals with distributed teams of humans operating under stress and in a variety of other operating conditions. C2 problems are thus dominated by their information, behavioural, and cognitive aspects that have been less well researched and understood. This focus creates a multidimensional, complex analytic space.

Military operations involve multi-sided dynamics encompassing friendly, adversary, and other actors including:

• Action-reaction dynamics;


• Tightly connected interaction among subjective elements such as cultures, morale, doctrine, training, and experience and between those subjective elements and the combat arena;

• Non Governmental Organisations (NGO);

• Private Volunteer Organisations (PVO);

• International organisations;

• International corporations; and

• Transnational, subnational, criminal, and terrorist organisations.

C2 issues are difficult to decompose and recompose without committing errors of logic. Moreover, the composition rules, by which the various factors that impact C2 interact, are poorly understood except in arenas that have been previously studied in detail. Finally, the C2 arena is weakly bounded by issues that on initial examination appear quite finite, but prove to be linked to very high-level factors. For example, tactical performance may be tied to national culture.

Analyses of C2 are also often constrained by factors that are beyond the boundaries of the research. For example, security policies may restrict data availability and otherwise constrain the analysis. The availability of data often limits the scope of an analysis. Moreover, the time and resources available to conduct an analysis are often severely constrained because the decision processes being supported are being driven by outside planning, operational, or budget and decision processes. This should be seen as a challenge rather than a problem. Uncertainty and risk associated with a lack of appropriate data need to be embraced as part of the analytical approach. It is unreasonable to expect that data would be available for the performance of future systems and processes that do not yet exist. An experimental component and a modelling and simulation component need to be integrated into modern C2 analyses in order to close the gap in knowledge and data.

Finally, because of the complexity of C2 processes and systems, analysis in this area requires the ability to understand how Dimensional Parameters (DP), Measures of Performance (MoP), Measures of C2 Effectiveness (MoCE), Measures of Force Effectiveness (MoFE), and Measures of Policy Effectiveness (MoPE) are linked and impact on one another. The cumulative set of these measures is denoted as Measures of Merit (MoM) in the COBP. Determining the precise nature of these relationships nearly always proves to be an analytic challenge.
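
To make the notion of linked measures concrete, the short sketch below (not part of the original COBP text) shows one hypothetical way an assessment team might record a MoM hierarchy in code, with each measure holding references to the lower-level measures that feed it. The class name, levels, and example measures are invented for illustration only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Measure:
    """One Measure of Merit (MoM): a DP, MoP, MoCE, MoFE, or MoPE."""
    name: str
    level: str                                                # "DP", "MoP", "MoCE", "MoFE", or "MoPE"
    feeds_on: List["Measure"] = field(default_factory=list)   # lower-level measures it builds on

    def chain(self, depth: int = 0) -> str:
        """Render this measure and the measures that underpin it, one indent per level."""
        lines = ["  " * depth + f"{self.level}: {self.name}"]
        for lower in self.feeds_on:
            lines.append(lower.chain(depth + 1))
        return "\n".join(lines)

# Invented example of a linked hierarchy, from a dimensional parameter up to policy effectiveness:
bandwidth   = Measure("Available network bandwidth", "DP")
msg_delay   = Measure("Message delivery delay", "MoP", [bandwidth])
picture_age = Measure("Age of the common operational picture", "MoCE", [msg_delay])
engagement  = Measure("Targets engaged per hour", "MoFE", [picture_age])
objective   = Measure("Contribution to campaign objectives", "MoPE", [engagement])

print(objective.chain())
```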

Taken together, all these factors mean that C2 modelling and analysis are more uncertain and therefore more prone to risk than their equivalents in conventional weapon and platform analyses. Indeed, C2 issues have long been regarded as difficult to analyse. Many operational analysis (OA) studies have simply assumed perfect C2 in order to focus on other variables. As a result of these characteristics of C2 analysis, these endeavours will require a heavy element of research within each analysis. This COBP is intended to assist the community in dealing with, and overcoming, the barriers to effective analysis of C2.


Differences Across the Mission Spectrum

There are significant differences among the different parts of the mission spectrum (e.g., MoM) that the assessment team needs to take into consideration. Table 1.1 highlights the differences between traditional combat and OOTW.

Table 1.1. Comparison of Symmetric, Conventional Warfare and OOTW Missions & Principles

Mission

In symmetric conventional warfare, the mission tends to be relatively stable, there is a clear focus on the enemy, and the military has a common understanding and commitment.1 Conversely, in OOTW the mission is often more dynamic. This is captured by the often pejorative term "mission creep." In many of the operations in question there is no "enemy." This is obviously true for operations such as humanitarian assistance and disaster relief. In addition, peacekeeping activities involve protagonists who must be treated even-handedly if the operation is to be successful. In the latter missions, political-military ambiguities frequently result in uncertain understanding of the goals and objectives of the mission and a limited commitment.2 (Starr, Haut & Hughes, 1997)

Principles

Military theorists have frequently propounded basic principles of conventional warfare. Three often cited principles include the need for unity of command, the importance of hierarchical decisionmaking, and the criticality of achieving surprise in operations. A recent book has proposed alternative principles for OOTW. (Alberts & Hayes, 1995) It cites the need for unity of purpose, consensus decisionmaking, and transparency of operations.

Information

In conventional warfare, the issue of information gathering and management focuses on the issue of "known unknowns" (e.g. Where are the enemy's battalions?). For that case, the key question is how to get the needed information (e.g. What are the key signatures for the targets in question? What sensors should we task to exploit those signatures?). Clearly, the focus is on the enemy military and one objective is to assemble a complete, timely, and accurate common picture of the air-land-sea situation. The result is a very large, time-sensitive database, but one that is relatively well structured (e.g. enemy order of battle). Conversely, in OOTW, the problem of information gathering and management is dominated by "unknown unknowns." Thus, the primary question to address is what information to get. The information focus is much more diffuse because of the myriad of military, political, economic, and social factors that must be considered. Consequently, situation awareness is much more complex. Political considerations often make it prudent to limit the dissemination of information, creating a tension between the desire to create shared awareness by increasing information sharing and the need, for political and/or security reasons, to limit information sharing. The resulting databases are frequently larger and less structured.

Analysis

Over many years, the military operations research community has become relatively adept at analysing key aspects of symmetric conventional warfare. As an illustration, analyses of ground warfare often focus on battalion-level operations and techniques have emerged to integrate across those results to derive insights into campaign outcomes. The focus is on military systems and organisations, and the techniques in question involve a broad set of methods (e.g., mathematical programming, decision theoretic approaches) and tools (e.g., models and simulations). Analyses of C2 issues remain among the most challenging, even in warfare contexts. In addition, analyses of OOTW often require consideration of individual behaviour. It has proven very difficult to integrate across these results to derive a comprehensive understanding of the problem. The issue is compounded by the many factors that have to be considered in the analysis process (e.g. military, political, economic, social). This has led to the application of "softer" analytic approaches (e.g., extensive reliance on expert elicitation). Moreover, the very nature of warfare appears to be changing. For example, asymmetric threats are becoming more common, information technologies are impacting C2 processes, and organic structure and dynamics are changing rapidly and in ways we do not fully understand.

Types of C2 Assessments

The assessment team could be called upon to support a wide variety of sponsors (e.g. acquirers of C2 systems, long range planners and programmers, developers of requirements, operational commanders, and trainers). These sponsors will bring different problems to the assessment team (e.g. assessment of alternative systems or concepts, identification and selection of alternative courses of action in an operational context). Some of these will deal with a specific mission (e.g. air defence) while others will need to deal with the entire mission spectrum from forward presence to high intensity conventional war.

Specific problems that the team may be called upon to address:

Requirements Analysis

• Derivation of specific C2 requirements from broad statements of objectives; and

• The establishment of a minimum, standard, or expected level of performance.


Assessment of Alternatives

• Comparison and selection of alternative systems that may be very dissimilar but are designed to achieve a similar purpose;

• Assessment of the utilisation of a system in a new or unexpected application domain or mission;

• Trade-offs between C2 systems and combat systems;

• Analysis of the impact of an organisational change;

• Determination of the most cost-effective approach to achieving the desired objective; and

• Comparison of a replacement system or components of a system.

Research Issues

• Effectiveness of human decisionmaking as a function of system performance or other factors;

• Effectiveness of C2 training; and

• Impact of collaboration on C2 quality.

Support to Operations

• Course of action analysis;

• Real time assessment of mission effectiveness; and

• Rehearsal assessment.


Purpose and Scope of the COBP

This COBP offers broad guidance on the assessment of C2 for the purposes of supporting a wide variety of decisionmakers and the conduct of C2 research described above. It should be noted that this COBP is focused upon the assessment challenges associated with the nature of C2 and does not attempt to specifically address the unique properties and constraints associated with each of the many C2-related problem domains.

Given the increasing interdependence among the elements of a mission capability package3 (organisation; doctrine; C2 concepts, processes, systems; materiel; education; training; and forces), C2-related analysis cannot easily be done in isolation from a more comprehensive mission analysis. This COBP is meant to support analyses that go beyond the traditional boundaries of C2 analyses.

This new version of the COBP for C2 assessment was developed by SAS-026 building upon the initial version of the COBP produced by SAS-002. This new COBP is a synthesis of decades of expertise from various countries and hundreds of analyses. The COBP was developed using a set of case studies to test out the varied advice and guidance received, and incorporates feedback from users of the initial version. Lastly, SAS-039 provided a peer review of the final draft product.

The earlier version focused on the analysis of ground forces at a tactical echelon in mid to high intensity conflicts. Consequently, the initial version of the COBP did not completely address the full range of important issues related to C2. In developing this new version of the COBP, SAS-026 explicitly focused upon OOTW, the impact of significantly improved information-related capabilities, and their implications for military organisations and operations. In addition, SAS-026 was cognisant of the fact that NATO operations are likely to include coalitions of the willing, which might involve Partnership for Peace (PfP) nations, other partners outside of NATO, international organisations, and NGOs. NATO operations may also be "out of area."

Feedback from users of the original COBP also identified a number of ways in which the original COBP could be improved. These areas were addressed during the development of this version of the COBP.

Cost analyses continue to be explicitly excluded for two reasons. First, cost analysis is a mature discipline that experienced operational analysts already practice. Hence, C2 issues are not unique in this arena. Second, most nations have already developed approaches to cost analysis and cost effectiveness that are consistent with their national approaches to accounting. Because these national practices differ among NATO members, no single approach would be appropriate.

As this COBP is being drafted, novel experiments abound with new information-related capabilities, particularly networking, and with new ways to accomplish assigned tasks. Advances in technology are expected to continue at an increasing rate and spur both sustaining and disruptive innovation in military organisations. It is to be expected that this COBP will need to be periodically revisited in light of these developments.


Overview of COBP Assessment Philosophy

The COBP assumes that the objective of a C2 system is to exercise control over its environment, through either adaptive or reactive control mechanisms, or some combination of those two approaches. This provides the context and point of departure for the assessment of C2.

Analysis of C2 should consider all the relevant actors, military command levels, and functions involved and should investigate issues of integration across disparate organisations, military command levels, and functional domains over time. Consideration should also be given to the robustness and security of information systems and to human computer interface issues. Human behavioural, physiological, and cognitive factors, along with organisational and doctrinal issues, must be considered in C2 analyses.

C2 assessments must also consider a range of missions, adversary capabilities, and adversary behaviours. Moreover, it must be understood that adversaries will use asymmetric tactics and techniques to deny or exploit differences in technology, force size, information systems, or cultural factors. Hence, scenarios and analyses that deal with an appropriate set of all these dimensions should be considered in either the main research design or in the excursions to assess risks and uncertainty.

The evaluation of C2 issues depends in important ways on both distinguishing and linking dimensional parameters, measures of performance, measures of C2 effectiveness, and measures of force and policy effectiveness. Modelling and other tools must be designed to support this requirement.

Tools and data used in C2 analysis should conform to good OA processes and practices and, to the extent feasible, should be subject to Model Verification, Validation, and Accreditation (VV&A) and to Data Verification, Validation, and Certification (VV&C).

Interoperable analytical infrastructures (e.g. data dictionaries, glossaries, models, tools, data sets) are necessary to facilitate the efficient proliferation and reuse of study results and data within the broader interdisciplinary research community.

Because the complexity of C2 and the requirements for its analysis are often underestimated by decisionmakers, a continuing dialogue between analysts and those decisionmakers is necessary both to scope the problem properly and to ensure that the analytic results are properly understood. Part of this process includes performing sensitivity analyses and other common practices designed to ensure the validity and reliability of the results.

Changes to C2 systems will often lead to changes in military concepts, command approaches, doctrine, Tactics, Techniques, and Procedures (TTP), and related factors, which must also be considered in the analysis.

Current State of Practice in C2 Analysis

Assessment of C2 issues typically employs classic tools of OA. Relatively few specialised tools and methods have been developed for C2. Moreover, those specialised tools generated to deal with the unique aspects of C2-focused research are generally not as well understood as those used in more traditional warfare modelling domains. C2 analysts will often find themselves having to develop tools and approaches appropriate for their research agendas. However, a general analytic process can be identified that will enhance the likelihood that an OA analyst can conduct successful analyses.

Organisation of the COBP

This COBP is organised into four themes. The first theme deals with study dynamics, problem formulation, and the development of a solution strategy. The second theme identifies and discusses in depth the essential elements of assessment: measures of merit, scenarios, human and organisational issues, data, and tools. The third theme addresses issues related to risk and uncertainty while the final theme describes the range of assessment products.

This represents a significant enhancement of the initial COBP. In particular, the first, third, and fourth themes were not treated in detail in the initial version of the COBP. In addition, material has been added to the second theme to address the unique assessment challenges associated with OOTW.

Brief History of SAS-026

SAS-026 builds upon almost a decade of work that began with the formation of the Ad Hoc Working Group on the Impact of C3I on the Battlefield by Panel 7 of the NATO Defence Research Group in 1991 to assess the state of the art in C2 analysis. Based on the recommendations of the Ad Hoc Working Group, Panel 7 constituted Research Study Group-19 (RSG-19) to address issues of methodology, measures of merit, and tools and analysis. The panel also addressed issues of improving a nation's capability to examine C2 acquisition and decisionmaking. At the October 1995 RSG-19 planning meeting, the group determined that the primary product of RSG-19 was to be a Code of Best Practice for assessing C2. As part of selected RSG-19 meetings, workshops would be conducted to support the development of the major sections of the COBP. Workshops were conducted on Measures of Merit (Canada), Scenario Development (Netherlands), C3I Systems, Structures, Organisations, and Staff Performance Evaluations (Norway), and Models Used for C3 Systems and Analysis (US/UK). Representatives from the nations in parentheses took the lead in organising the workshops and summarising the results. The minutes of the workshops provide further illustrations of the techniques presented in the COBP.

At the October 1996 meeting, the group took up a request by Panel 7 to conduct a symposium on modelling and analysis of C3I, which was scheduled at the July 1997 meeting for January 1999. This symposium was a forum for presentation and discussion of the COBP and related topics.

At the July 1997 meeting, in response to a query by Panel 7, the group discussed, acknowledged, and agreed on the need for a follow-on group to SAS-002. An exploratory group on organisational change (SAS-E05) was formed to recommend a way ahead.


SAS-E05 recommended the formation of a follow-on activity to SAS-002 to accomplish four objectives:

• Demonstrate and assess the initial version ofthe COBP;

• Revise and extend the COBP;

• Identify research areas; and

• Facilitate the adoption of the COBP.

The SAS panel concurred in May 1999 and approved the formation of SAS-026, which began its 2 1/2-year plan of work in a symposium in January 2000.

Chapter 1 Acronyms

C2 – Command and Control

C3(I) – Command, Control, Communications (and Intelligence)

C4(I) – Command, Control, Communications, and Computers (and Intelligence)

C4ISR – Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance

CIS – Command Information Systems

COBP – Code of Best Practice

DP – Dimensional Parameters

MoCE – Measures of C2 Effectiveness

MoFE – Measures of Force Effectiveness

MoM – Measures of Merit


MoP – Measures of Performance

MoPE – Measures of Policy Effectiveness

NGO – Non Governmental Organisations

OA – Operational Analysis

OOTW – Operations Other Than War

PfP – Partnership for Peace

PVO – Private Volunteer Organisations

RSG-19 – Research Study Group-19

TTP – Tactics, Techniques, and Procedures

VV&A – Verification, Validation, and Accreditation

VV&C – Verification, Validation, and Certification

Chapter 1 References

Alberts, D. (1996). The unintended consequences of information age technologies: Avoiding the pitfalls, seizing the initiative. Washington, DC: National Defense University.

Alberts, D., & Hayes, R. E. (1995). Command arrangements for peace operations. Washington, DC: National Defense University.

North Atlantic Treaty Organisation (NATO). (1998). Bi-MNC C2 plan part 2 – Command and control requirements.

North Atlantic Treaty Organisation (NATO). (November 1996). Annex B to MC Guidance for Defence Planning. MC-299/5.


Starr, S. H., Haut, D., & Hughes, W. (1997). "Developing Intellectual Tools to Support C4ISR Analyses for Operations Other Than War," Proceedings of the Third International Symposium on Command and Control Research & Technology, National Defense University, Fort McNair, Washington, DC, pp. 600-608, June 17-20, 1997.

Chapter 1 Footnotes

1. As an illustration, General Colin Powell, then Chairman, Joint Chiefs of Staff, summarised the mission in Desert Storm by stating that "First, we will cut off the enemy and then we will kill it." (Pentagon Briefing, Wednesday, January 23, 1991.)
2. As an example, the US Congress has continually sought to impose an arbitrary deadline for US forces to withdraw from Bosnia.
3. Mission capability packages include all of the elements necessary for an operation. (Alberts, 1996)


CHAPTER 2

Preparing For Success: Assessment Participants, Relationships, and Dynamics

For hypotheses ought…to explain the properties of things and not attempt to predetermine them except in so far as they can be an aid to experiments.

—Isaac Newton (1687)

We have run out of Money—now we have to think.

—Ernest Rutherford (1871-1937)

Overview

This chapter is organised into three parts. The first discusses the roles played by the significant players associated with a Command and Control (C2) assessment and how these roles affect the design and conduct of the assessment. The second part identifies the major phases of a C2 assessment and their iterative nature. The concluding section addresses the subject of professional ethics and standards of conduct.

Assessment Participants

Like their subject, the organisation of C2 studies involves complex interrelationships. It is crucial for the analytical team to establish which individuals and organisations are involved at an early stage of the study. It is prudent for the analytical team to map the roles described below onto the individuals and organisations involved and to understand their interrelationships. An example of such a mapping is at Figure 2.1. Appendix 1 to this chapter provides a brief explanation of the organisations involved.

Due to the dynamic nature of such projects, those involved should not be surprised if the teams involved have to expand or change with time.

Assessment Team

The assessment team is working for the sponsor or client (sponsor). The team consists of a senior team leader (who may also be referred to as the project manager), a core set of analysts, subject matter experts including military officers, and supporting staff who are working on the study on a day to day basis. The team provides the legitimacy and authority for the study. The sponsor will provide the terms of reference, access to needed information, and identify the desired products. It is important for the analytical team to understand exactly why the sponsor wants the study and what the sponsor wants to do with the results.

Decisionmakers or Problem Owners

The decisionmakers are the individuals or organisations that are expected to make decisions wholly or partially based on the output or findings of the study. If there is no decision to be made (i.e. this is an exploratory study) then the decisionmakers could be referred to as problem-owners. It is important for the assessment team to understand exactly what type of assessment the decisionmakers want the study to support. The decisionmakers may or may not be in command of or part of the sponsor's organisation. Complex problems may arise when the decisionmaker is several steps laterally away in the organisation from the sponsor and study team.

Stakeholders

The stakeholders are the persons or organisations that are directly or indirectly affected by the study outcome. Stakeholders may also play other roles. The assessment team must be aware of the potential for conflict when the stakeholders do not include the sponsor or decisionmaker. Complex problems may arise in the provision of data for the study, as it is the stakeholders who may have to provide the data, set the security or releasability of that data (and hence the study), and/or agree that the data are representative. For these reasons it is essential that the analysts establish a working relationship with the stakeholders early in the process.


Bill Payer

The bill payer is the organisation or individual official paying for the study. It is important for the assessment team to know the level of resources available. Bill payers will normally have a direct interest in the outcome of the study and may be one or more of the other players. Contractual authorities have the legal authority to let contracts on behalf of the bill payer.

Existing Study Teams

The assessment team must be aware of and sensitive to the existence of teams in other related study areas. Should such teams exist, the assessment team should endeavour to exploit the work done and available skills and techniques. Such external teams may also be appropriate for membership in peer reviews.

Future Study Teams

The assessment team must be aware of and sensitive to the needs of future analyses and assessments. Data collection, method documentation, and the archiving of data, methods, models, and results are fundamental responsibilities of all professional analysts. Methods and data should be (as far as is practicable) disseminated and published.

Peer Reviewers

Outside experts brought in to look at the work and provide constructive criticisms are called peer reviewers. Peer review teams could be composed of specialists and other study teams in related subject areas and should include representatives from all key disciplines in the assessment.

Data Providers

Data providers are the individuals and organisations that possess data and information useful to the assessment team. Many of these will be stakeholders. The motivation to provide data to the study must be developed by the analytical team and the sponsor.

Assumption Providers

Assumption providers are the individuals or organisations that can provide "givens" such as future doctrine, performance data, force mixes, organisational structures, and scenarios. Creation of a positive relationship with these organisations is important to the study.

Data Collectors

In some C2 analyses, where data must be extracted from real world experiences, exercises, experiments, and wargames, teams of data collectors and subject matter experts will be required. The identification of people with the appropriate background and training as data collectors is an important element of such studies.


Relationships Among Participants and the Conduct of the Assessment

Relationships

Figure 2.1 illustrates how complicated the participant roles and relationships can be in a real C2 assessment. This particular figure represents the organisations involved in the recent Immediate Reaction Task Force (Land) (IRTF(L)) C2 Concept evaluation that completed at the end of 2001 (Candan & Lambert, 2002). Appendix 1 to this chapter provides a brief explanation of the organisations involved. Although not all projects will be this complex, many important C2 assessments will.

Through the prudent act of mapping the roles of the participants of the study, the potential conflicts of interest and complex interactions can be identified. One method to mitigate these is to present or conduct this activity openly and discuss it with all involved, so that all potentially affected participants are aware of the possibility of future conflict and the fact that all participants fall into one or more roles within a project.

In the event of conflict with other participants in the project, the assessment team should address the issues in a neutral and independent manner.


Figure 2.1. Example of the Mapping of Roles onto Players—for a Complex C2 Project (the Evaluation of Immediate Reaction Task Force (Land) C2 Concept)

Understanding the Context of the Assessment

The relationship among the assessment team, the key sponsor, and the stakeholders is of paramount importance and perhaps, more than any other single factor, will influence the course and success of the effort. Accordingly, adequate attention needs to be paid to understanding the situation facing the key sponsor and stakeholders as much as the subject under study.

The assessment team should be aware that the different participants will have divergent perspectives and may have divergent agendas.

Therefore the assessment team should understand the background of the individuals involved, their organisational settings, roles and responsibilities, their history, and their current situation. Contact with analysts who have worked with this sponsor and review of prior analyses for this sponsor facilitate this objective.

It is good practice to build and maintain long-term relationships with the sponsor and stakeholder organisations. This will yield substantial dividends in the form of easier communication, greater trust, and stronger support.

A Continuing Dialogue

It is important that a dialogue with the sponsor and stakeholders is maintained by the assessment team throughout the study. As there is no single "language" that will describe the study problem, it is important to spend time at the beginning of a study to establish a common "language" that both the assessment team and the sponsor and stakeholder can understand. This point may seem obvious in a NATO setting in which the participants speak many different natural languages. However, it is equally important in a single language setting because common words and phrases have different meanings for different organisations, services, and even individuals within a single organisation. Regular meetings and contact will minimise misunderstandings. From a professional point of view, Operations Research (OR) and Operations Analysis (OA) analysts will always wish to inform the sponsor and stakeholders of key developments and/or challenges as the study unfolds. Regular and routine interactions need to be built into the project plan. If there are multiple sponsors and stakeholders and other key actors, the assessment team should try to meet them jointly, particularly when decisions need to be made. Separate meetings will often lead to inconsistent guidance and will place the assessment team in a position of trying to accommodate differing interests.

The development of a collectively agreed upon Study Glossary that captures the definitions of words, phrases, and acronyms used in the study is a useful tool.

Terms of Reference

Good "terms of reference" cover goals, scope, products, schedule, and resources. These will determine the focus of the assessment and establish limits or freedoms granted to the assessment team within the sponsor's and stakeholder's organisations. Letters of introduction and instructions to actors within the sponsor and stakeholder organisations may also be useful.

Understanding How the Output of the Study Will be Used

It is important to know at an early stage in the project what the products of the study are to be used for by the sponsor and stakeholder organisations. The expected end product will set the tone and relative importance of the project in the eyes of the sponsor, stakeholder, and other actors. The assessment team needs to establish and understand the products needed or desired by the sponsor and stakeholder. For example, a study could be used to create a significant impact on the stakeholder's domain, gain a greater understanding of the issues, produce an improved capability to perform future work, and/or make contributions to the body of knowledge.

Budget

The sponsor will have limited resources with a study budget in mind. When the sponsor's resources are limited to a level below what is required to support a quality study, the assessment team will need to suggest strategies to address the shortfall. Alternative approaches include decomposing the problem and only undertaking the core part of the study that is affordable, linking the sponsor to other actors that have an interest in the same or a similar problem and who could contribute resources, and/or stretching the project over a longer time so that resources from future budget cycles become available. In developing strategies that involve doing only a part of the study to satisfy budget constraints, care must be taken to ensure that the product that will be produced provides a meaningful answer or contribution and does not depend upon a follow-on effort that may or may not be funded.

It may take a complete iteration of the assessment phases of the project to establish the complete scope of the project and the resources required. Therefore it is good practice in large C2 projects to allow the assessment team to perform a rapid first pass of all the phases of the project to help establish the budget required. This is contrary to the usual practice of setting the budget in stone immediately following the initial Problem Formulation or Solution Strategy phase (see Section 2-E below).



Building an Assessment Team

Skills Available to the Assessment Team

Following initial problem formulation (Chapter 3), the precise skills and experience required by the assessment team will need to be established. Typically, the assessment team will need to be interdisciplinary. The wide range of skills and experience required can be allocated between a full-time core team and a collection of consultants or part-time team members. As an example, an ideal breakdown of the skills available to the assessment team involved in the evaluation of the Immediate Reaction Task Force (Land) C2 concept study1 is given below:

Skills: Core Team

• Project management;

• OR/OA skills: simulation, wargaming, mathematical programming, database creation and management, brainstorming and problem structuring, scientific/military report writing/editing;

• Cross military experience—i.e., OR/OA personnel with military experience or military personnel with OR/OA knowledge;

• Organisational theory; and

• Data collection (e.g., questionnaire and form design).

Skills and Experience: Consultants and Part-time Team Members

• Military (or access to practical experience of the problem under study);

• Training and exercise planning (if an exercise is to be used as a vehicle for the study);

• Communications and information systems specialists for the systems of the organisation under study;

• Human computer interface expertise;

• Operations Other Than War (OOTW) related issues (e.g., C2/Headquarters [HQ], media, civil-military cooperation—theory and practice);

• Social scientists (e.g., political, psychological,economic, cultural, legal);

• Military history;

• Command experience;

• Deployment analysis;

• Intelligence/threat/area of operations expertise; and


• Legal/contracts/administration expertise.

As another example of the skills required for C2 assessment studies, the skills required to provide OR support to C2 elements (such as OR/OA support to an operational HQ) are analogous, as illustrated in Table 2.1 (RTO, 1999).

Table 2.1. Knowledge, Capabilities and Skills Needed by OR/OA Cell Team Members

Background of the Assessment Team

Building a C2 assessment team with this full breadth of knowledge, capabilities, and skill requires a long-term commitment by the mother OR/OA organisation to prepare a corpus of potential team members through recruitment, education, training, and opportunities for appropriate field experience.

Following the identification of the skills required for the team, those analysts made available for the team should ensure that they have a basic understanding of the military fields under consideration. Gaps in experience should be rapidly filled through background reading, short courses, field experience, or additional/alternative analysts with the appropriate experience and skills.

Forming the Assessment Team

In a study that involves dispersed and disparate organisations and teams, the effort to command and control the study group must be recognised and effort and time built into the study plan. This can be, for example, through maintenance of distributed working environments such as web portals, information campaign material, and travel time to meetings. In these cases the senior team leader will revert to a role more akin to a project manager.

It is one thing to assemble a group of people, quite another to forge them into a coherent, effective team. Sufficient time and a facilitating process should be built into the project plan for the group of individuals to coalesce into a team.

Interdisciplinary Assessment Team and Outside Relations

It follows that C2 analysis, particularly for OOTW issues, should be done by an interdisciplinary assessment team. Experienced analysts know that their work owes success in no small measure to efficient working relationships within the assessment team and with the customer of analysis. Building good working relationships among representatives of different scientific cultures, such as OR/OA and IT analysts grounded in (hard) physical sciences and mathematics on one hand and (soft) social scientists on the other, requires sufficient mutual understanding of methodologies and tools. In fact, differences in scientific cultures can outweigh differences in natural cultures provided that all members of the assessment team have sufficient command of a common language. Therefore, in addition to leadership and project management skills, the head of the assessment team must have a good general idea of the current state of all disciplines involved in order to compose an efficient team and facilitate interdisciplinary cooperation throughout the analysis.

Good personal and working relationships with the customer of the analysis are essential for understanding every aspect of the problem and being able to arrive at a problem structure and solution strategy that meets the customer's immediate needs in the light of the strategic objectives of the respective OOTW. Knowing the customer's position in the command hierarchy and the degree of influence he/she wields through informal relationships over stakeholders and actors, the co-operation of which might be essential for an implementation of analysis results, and understanding the respective consequences associated with alternative solutions is important for assessing their acceptability and organisational risk.

It is equally important for the assessment team to establish working relationships with the potential subjects of study in the early stages. This is essential for capturing the nature and problem relevance of formal and informal relations between all organisations, groups, and individuals that are subjects of the study, finding out about their motivations and agendas, and eliciting firsthand information that is critical for solving the problem such as their capabilities and the conditions attached for their employment. However, the analyst should be careful not to allow this effort to gain greater understanding of the problem to introduce bias.

Assessment Phases, Process, and Dynamics

It is important to realise that all of the elements of the C2 assessment are interrelated. Hence Problem Formulation, Solution Strategy, Measures of Merit, Scenarios, Human/Organisational Factors, Models/Tools/Data, and products are all interdependent. Figure 2.2 illustrates the major phases and the iterative nature required for C2 assessments. The Assessment Process diagram was the most difficult thing for the SAS-026 team to agree upon. In essence this diagram is at the heart of the COBP (Starr, 2001). The remainder of this chapter discusses the key points in this diagram.

Figure 2.2. C2 Assessment Process


Problem Formulation

The output of Problem Formulation (Chapter 3) specifies the "what" of the assessment. C2 studies tend to be complex and feature multiple attributes, some of which may not be apparent at the start of the study. Neither the assessment team nor the sponsor should be surprised if the issues initially presented for study are replaced by other issues that are closer to the underlying causes of the initial problem or, in some cases, symptoms presented. A consequence of the dynamic nature of problem formulation is that the solution strategy and any of the other elements of the assessment may change as the study progresses. Problem formulation should therefore be consciously and routinely iterated during a study—especially when new attributes start to appear. As a minimum, an iteration should occur immediately following the establishment of the initial solution strategy and the assessment of study risk. Additionally, the sponsor should be aware that the nature of the assessment team, sponsor, or assumption provider teams might also have to expand or change with time. This has implications for planning, budgeting, and tasking.

In nearly all C2 studies the assessment team will study only a subset of the whole problem space due to the sponsor's sphere of interest. This fact must be recognised by the assessment team. An initial study of the complete problem space is essential to establish this realisation. This will help the assessment team to understand the context of the study and provide advice to the sponsor on the actual underlying causes of his problem and consequently the requirement to involve other participants.


Solution Strategy

The next step is to develop a Solution Strategy (Chapter 4) that specifies the "how" of the assessment. Arising from the Solution Strategy agreed upon and adopted by the sponsor are a set of terms of reference (e.g., Statements of Work [SOW] for contracts) that will determine what work is to be conducted, the contractual obligations, deadlines, and resources. Although these must be established as an experimentation campaign plan3 and study management plan (project plan) before work on the project begins in earnest, flexibility must be built in due to the iterative nature of C2 assessment. The assessment team must be aware of any preconceived "solutions" that have been proposed by the sponsor, stakeholders, and/or decisionmakers and explicitly deal with these as appropriate, avoiding another pressure to be steered in a particular direction. The assessment team must note if its results are being steered in a particular direction and follow ethical behaviour in performing the study (see the end of this chapter). In many cases a risk-based approach to C2 assessments can usefully complement the more traditional cost-effectiveness approach. In particular, this helps decisionmakers to deal with the uncertainties of the real problem.

From a professional point of view, analysts should always defer the selection of a particular method until the problem has been formulated and a solution strategy has been defined. Recognise and beware of "preconceived" solutions that could influence the assessment.


Review

Once there has been a preliminary formulation of the problem and development of a solution strategy, it is imperative that an initial review be conducted. This review should be conducted from multiple perspectives (e.g. with respect to the sponsor's initial problem, the feasibility with respect to resources including team skills and schedule, soundness of the proposed analytic approach). As a result of this review, changes will usually be made in both the problem formulation and the solution strategy.

Measures of Merit, Scenarios, and Human/Organisational Factors

At this stage the assessment team must specify the hierarchy of Measures of Merit (MoM) (Chapter 5), incorporate and identify relevant human and organisational factors (Chapter 6), and specify the appropriate scenarios (Chapter 7). As suggested by the diagram, there is no unique sequence for doing these tasks. Iteration is required to ensure that these tasks are done in a coherent, consistent fashion. When all of these tasks have been completed, the team has specified the key variables to the necessary level of detail with adequate considerations for assessment validity and reliability.

When developing the MoM it is very valuable to involve the sponsor in establishing the linkages between the MoMs and the hierarchy of MoM, because the sponsor will then appreciate the dynamics of the problem and the requirement for breadth in the study. Although a full set of MoM must be derived in accordance with the best practice noted in Chapter 5, the MoM should be prioritised to focus on providing support to the objectives of the study and to be practical and cost effective.

When selecting appropriate scenarios it is good practice to utilise scenarios (if they exist) from a standard set approved for use within the assessment and sponsor organisation. The sponsor must always be approached for approval of the scenarios. It is bad practice to design a scenario to prove a point.

Models, Tools, and Data Requirements

The next step is to iteratively identify the methods and tools (Chapter 8) and data (Chapter 9) required to perform the assessment. One of the major challenges of the assessment is to identify and gain access to models, tools, and data that are appropriate for exploring the issues of interest. The challenges come in several dimensions:

• First, there is a limited set of tools that deal effectively with the C2 dimension of the problem.

• Second, for even this limited set, it is often difficult to access and modify the tools to reflect the variables of interest.

• Third, there is often a paucity of useful data and previously validated parameters.

As a result of the establishment of the MoM for the study and the data that underpin those MoM and models, a data collection and analysis plan should be formulated. The sponsor should also be made aware of the difficulties associated with getting appropriate data, the cost of the data collection and analysis plan, and the implications for the study if the required resources are not set aside and budgeted to collect, collate, process, and analyse the data.

Assess Study Risk

At this point in the process the assessment team should take a look at the risks and uncertainties (Chapter 10) associated with the decisions made with respect to all of the tasks performed to date (e.g., consistency between the scenarios and the data, models and availability of data, tools and analysis). The sponsor must be made aware of these risks and uncertainties and the strategies developed by the team to mitigate them. If the risks associated with the successful completion of the study are perceived as being too high, the solution strategies should be revisited and adjusted accordingly.

Peer Review

When the risks and uncertainties are perceived as manageable, a peer review should be conducted. Peer reviews are not used enough because they tend to be time-consuming, are seen as raising costs, or are perceived as threatening. In addition, research teams often want to perfect their results and methods before revealing them. The key is to build a peer review into the study from the outset. The sponsor should be informed as to the importance of the peer review. Peer reviews should be built into the budget, and reviewers should be invited to look at the terms of reference, interim products, and draft reports so that they are knowledgeable about the effort and motivated to support the project. In later stages of the study, the peer review can improve presentation and also act as a mechanism to make the results known to the professional communities. Over time the assessment team should develop a relationship with high quality peers and use them as a pool of reviewers.

Peer reviews are not a luxury but a necessity.

Conduct of the Study

At this point we are in a position to execute the assessment. The assessment team leader should keep a study notebook or journal in which all assumptions and decisions are documented so that they are available for detailed discussion. Detailed administrative records need to be kept regarding the data, metadata, models, and analytical and documentation tools. This will enable replication of parts of the C2 analysis should the need arise. An effort should be made to create data sets (not just the project results) that will be available to other researchers. The resources required to make such data available to external bodies need to be made clear to the sponsor. The conduct of the study will not usually be linear. It should be anticipated that multiple iterations will be conducted and that lessons learned from initial data collection and analysis efforts will inform subsequent activities.

Study Products

The team must recognise the importance of presenting the results of the assessment in a clear and comprehensive manner, taking into consideration the style of the decisionmaker (Chapter 11). It is particularly important that these results illuminate rather than obscure the uncertainties associated with the assessment.

Ethics

Professional operations research organisations, such as the Military Operations Research Society (MORS), have developed professional codes of ethics (Annex C). The assessment team should also be guided by a set of professional ethics and standards of conduct to ensure the integrity and quality of the analysis. This means that the assessment team should, inter alia:

• Maintain an open and honest dialogue with the sponsor and other key players within the project in order to minimise misunderstandings;

• Ensure that C2 assessments are organised and conducted in a balanced fashion that adequately identifies and represents all perspectives, options, and relevant evidence;

• Inform the sponsor and other key players of:

• any constraints, assumptions, or circumstances that threaten a balanced assessment;

• the risks and uncertainties associated with the methods and data used in the project; and

• strategies to minimise the risks.


Chapter 2 Acronyms

ACE Resources – Allied Command Europe Resources (Part of SHAPE)

AF(N) – Regional Command (North)

AMF(L) – ACE Mobile Force (Land)

ARRC – ACE Rapid Reaction Corps.

C2 – Command and Control

CDE – Concept Development and Experimentation

CPX – Command Post Exercise

FTX – Field Training Exercise

HB(A) – UK Historical Branch (Army)

HQ – Headquarters

IRTF(L) – Immediate Reaction Task Force (Land)

JCSC – Joint Sub-Regional Command South Centre

JCSE – Joint Sub-Regional Command South East

JFCOM – US Joint Forces Command

KIBOWI – NL Army Exercise Driver

MND(C) – Multinational Division (Central)

MoM – Measures of Merit

MORS – Military Operations Research Society

NC3A – NATO C3 (Consultation, Command & Control) Agency

NL MOD – Netherlands Ministry of Defence


OOTW – Operations Other Than War

OA – Operational Analysis

OR – Operations Research

PRL – SHAPE Policy Requirements Land

SACLANT OA – Supreme Allied Command Atlantic Operational Analysis Cell

SFS – Strike Force South

SHAPE – Supreme HQ Allied Powers Europe

SOW – Statements of Work

WPC – Warrior Preparation Center (Ramstein Germany)

Chapter 2 References

Research and Technology Organization (RTO) SAS-003 report on IFOR and SFOR data collection & analysis (6 AC/323(SAS)TP/11) [Technical Report]. (1999). Neuilly-Sur-Seine Cedex, France: North Atlantic Treaty Organisation (NATO) RTO. Available from http://www.rta.nato.int/RDP.asp?RDP=RTO-TR-006

Starr, S. H. (2001). Lessons recorded from applying the North Atlantic Treaty Organisation (NATO) code of best practice (COBP) for C2 assessment to operations other than war (OOTW). McLean, VA: MITRE Corporation. Available from http://www.mitre.org/support/papers/tech_papers_01/starr_nato/starr_nato.pdf


Candan, U., & Lambert, N. J. (2002). Methods used in the Analysis and Evaluation of the Immediate Reaction Task Force (Land) Command and Control Concept. TN 897. NATO C3 Agency, PO Box 174, 2501 CD, The Hague, The Netherlands.

Appendix 1 to Chapter 2: Participant Mapping of the Evaluation of the Immediate Reaction Task Force (Land) C2 Concept—An Explanation of Figure 2.1

Background

The Immediate Reaction Task Force (Land) (IRTF(L)) command and control concept was proposed in 1998 as a mechanism to modernise the ACE Mobile Force (Land) (AMF(L)). The IRTF(L) concept is predicated on the enlargement of AMF(L) from brigade size up to division size with a single streamlined headquarters and a chain of command using embedded mini-Task Group HQ cells. This was evaluated between 1999 and 2001 as a test case for the NATO Concept Development and Experimentation (CDE) process using a series of FTX, CPX, wargames, simulations, and historical analyses.

Assessment Team

In the case of the IRTF(L) study the assessment team was led by NC3A OR Division, with contracted experts and analytical and military support from KS Consultants and UK DERA. Free analytical support was also made available at peak periods (such as during the exercises of the experimentation campaign) from US JFCOM and SACLANT Operational Analysis.

The sponsor was SHAPE Policy and Requirements Land—which was tasked with the evaluation of the C2 Concept.

It was clear to the assessment team why the sponsor wanted the study—a straightforward evaluation of the military utility (to NATO) of the C2 Concept. However, at the end of the study the results were combined with other issues, and decisions were made on the future of the unit under study. This was something that was not foreseen by any of the participants at the start of the project.

Decisionmakers or Problem Owners

The sponsor’s task was to provide advice up the chain of command to SACEUR and ultimately the Military Committee on the efficacy of the C2 Concept. Although the HQ ACE Mobile Force (Land) was the subject of the study, it was also party to any decision regarding its own future modernisation. It is commanded directly by SACEUR via SHAPE.

SACLANT CDE, however, was not in the command chain, but was seen as a decisionmaker within the context of the study, as it was interested in the experience of the team in conducting the study as a test case to illustrate the value of NATO-centred CDE to the Alliance.


Stakeholders

The ACE Mobile Force (Land) was the main stakeholder as it was the subject of the study. As a decisionmaker, data and assumption provider, and also a possible member of a future study team, it was in a very powerful position, and was approached and treated with much respect by the assessment team. After a shaky start (where neither side was sure of the other’s intentions) a good working relationship was established over the period of the project.

The Netherlands MOD—in the form of the Royal Netherlands Army—was the provider of the Command Information System (ISIS) used as the digitisation vehicle for the evaluation of the concept. As such it was directly affected by the exercise program used for the experimentation and also by the future of the concept and AMF(L).

The Military Committee was also a stakeholder, representing the Nations of NATO that contribute troops and staff to the AMF(L); these nations would be directly affected by any decision on the concept.

ACE Resources at SHAPE were also a stakeholder as they were required to sanction and organise any manning changes proposed for the HQ—including the temporary additional manning required for the evaluation.

Bill Payer

Monies were mostly provided from the slice of the NC3A Scientific Program Of Work (SPOW) controlled by SHAPE PRL. In the initial stages of the project additional monies were also provided by SACLANT. Monies also had to be sought from the SHAPE Exercise budget to pay for movement of the exercise observers in order to attend the exercises.

Throughout the project the NC3A was the contractual authority to let contracts on behalf of the bill payer.

Existing Study Teams

An extensive literature search was conducted for the study—with the majority of recent references occurring within the UK and US. Exploratory trips (organised through US JFCOM) to US Battle Labs and UK facilities (through UK DERA) revealed the current state of knowledge with respect to measuring C2 performance in exercises and evaluating new C2 concepts. In response to this, the data collection methodology was based initially on the Fort Leavenworth, US Army Research Institute ACCES method.

Future Study Teams

It was identified at an early stage that there could be future related projects, in particular those relating to expeditionary and initial entry forces. The probable NATO organisations that could be involved in such studies were NC3A, SHAPE PRL, AMF(L), Multinational Division (Central), Strike Force South, and the ACE Rapid Reaction Corps. Of course there would probably be future study teams within the nations—but these plans are not visible to NATO. Consequently, as the assessment team was very likely to be involved in such future work, all data was archived and routinely written up and published.


Peer Reviewers

The assessment team was not able to arrange a formal Peer Review of the solution strategy adopted; this mechanism does not yet exist for NATO-centred studies. What was achieved was the submission of the problem to the SAS-026 panel as an example for testing the coverage of the revised COBP. This yielded some practical advice and helped the assessment team better understand the dynamics of the project.

Data Providers

Most of the data for the evaluation were derived from NATO training exercises run by or for AMF(L), MND(C), SFS, and the Joint Sub-Regional Commands South East and South Centre. In all cases relationships had to be cultivated by the assessment team and sponsor to allow access to the HQ and Exercise Control for the exercise observers, and for background material. In two cases national exercise training centres and exercise drivers were hired by the assessment team to support command post exercises (the Warrior Preparation Center and the KIBOWI exercise driver). Historical data for the study was also provided by the UK Historical Branch (Army)—which was approached via the contracted UK members of the assessment team.

Assumption Providers

The assessment team was in the fortunate position of actually being one of the assumption providers—through NC3A’s and the sponsor’s involvement in the NATO Defence Requirements Review. The owner of the C2 Concept, however, remained HQ AMF(L) itself, which therefore remained the authority as to its conceptual and physical implementation.

Data Collectors

In the case of the IRTF(L) study, data was largely extracted through observation of HQ activities during exercises and team-in-the-loop wargames. In all of these exercises Subject Matter Experts (SME) were used to observe functional and cross-functional activities in the HQ. Most of the military SMEs were provided by Regional Command AFNORTH and its subordinate commands across Allied Command Europe (ACE). Additional data collectors were also provided by the German University of the Federal Armed Forces and US JFCOM. UK DERA provided military analysts to lead some of the activities involved in capturing the HQ processes.

1 As developed independently by the SAS-026 panel in February 2001 in response to a presentation in the IRTF(L) project. In fact, this was fairly close to what was available to the team.
2 Linear programming, dynamic programming, queuing theory, inventory control, network analysis with PERT, game theory, and simulation.
3 Required if the C2 assessment makes use of a series of linked events such as seminars, wargames, command post exercises (CPX), field training exercises (FTX), etc.


CHAPTER 3

Problem Formulation

First find out what the question is—then find out what the real question is.

—Vince Roske

Definition of Problem Formulation

Effective problem formulation is fundamental to the success of all analysis, but particularly in Command and Control (C2) assessment because the problems are often ill-defined and complex, involving many dimensions and a rich context. Problem formulation involves decomposition of the analytic problem into appropriate dimensions such as structures, functions, mission areas, command echelons, and C2 systems. Problem formulation is an iterative process that evolves over the course of the study. It is essential even for small studies or where time is short—it will save time later and help ensure quality.

The problem formulation phase should identify the context of the study and aspects of the problem-related issues.

The context of the study includes:

• Geopolitical context that bounds the problem space;

• Political, social, historical, economic, geographic, technological environments;

• Actors;

• Threats;

• Aim and objectives of the analysis, including the decisions to be supported;

• Generic C2 issues;1

• Relevant previous studies; and

• Stakeholders and their organisational affiliation (including both stakeholders of the problem and stakeholders of the study).

The aspects of the problem include:

• Issues to be addressed;

• Assumptions;


• High-level Measures of Merit (MoM);

• Independent variables (controllable and uncontrollable);

• Constraints on the values of the variables (domain and range);

• Time constraints on delivery of advice to the decisionmaker; and

• Whether this is a single decision or (possibly one of) a chain of decisions to be made over time.

The problem is not formulated until the assessment team has addressed each aspect of the problem. In simple terms, problem formulation can be seen as an iterative process. First, the team must identify the variables that bound the problem space. Then they must determine which of these are outputs (dependent variables) and which of these are inputs (independent variables). The team proceeds by iterating to build an understanding of how these relate to each other. It should be viewed as a voyage of discovery. In most, if not all, cases of C2 assessment, the knowledge domain under study is in fact a system characterised by rich interaction and feedback among all the factors or variables of interest. The choice of dependent variables results from a clear specification of the issues and products needed to satisfy the terms of reference. Independent and intervening variables are also chosen based on the purpose of the analysis.

In the initial problem formulation iteration, it is critical to begin with an understanding of the REAL problem rather than a determination to apply readily available tools, scenarios, and data.


Within the NATO context, a number of documents are available or under development that may assist in understanding the study context. They are listed at the end of this chapter.

Principles of Problem Formulation

There is no universally acceptable approach to problem formulation. However, best practices exist that can be applied. The principles associated with problem formulation are addressed in two categories: those that are appropriate for all C2 assessments and those that are appropriate for assessments of C2 for Operations Other Than War (OOTW).

Principles Appropriate for C2 Assessments

Explicit problem formulation must precede construction of concepts for analysis or method selection. This is not a trivial exercise, especially in C2 assessments. Proper resourcing of problem formulation activities will improve the overall efficiency and quality of the study.

An understanding of the decisions to be supported by the analysis and the viewpoints of the various stakeholders (e.g., customers, users, and suppliers) is essential to clarifying the study issues. This understanding should be fed back to the stakeholders. A careful review of previous and current work must be carried out as a valuable source of ideas, information, and insight. This review should also serve to identify pitfalls and analytic challenges.

Problem formulation must not only provide problem segments amenable to analysis, but also a clear and valid mechanism for meaningful synthesis to provide coherent knowledge about the original, larger problem. The formulated problem is thus an abstraction of the real problem that can be defined in terms of dependent variables that relate to this real problem and coherent settings for the independent variables that can be interpreted in terms of decisions and actions by the customer.

Problem formulation must be broad and iterative in nature, accepting the minimum of a priori constraints and using methods to encourage creative and multi-disciplinary thinking, such as proposing a number of hypotheses for the expression of the problem. It must be recognised that change is inevitable in many dimensions (e.g., understanding of the problem, requirements, technologies, co-evolution of concepts of operation, command concepts, organisation, doctrine, systems). Thus the assessment process must anticipate and accommodate this change.

Practical constraints such as data availability, study resources (including time), and limitations of tools should be treated as modifiers of the problem formulation rather than initial drivers. Such constraints may, in the end, drive the feasible solutions, but it is important to recognise this as a compromise rather than an ideal. Proper problem formulation takes substantial time and effort!

It is important that problem formulation address risk from multiple perspectives. In addition to sensitivity analysis of the dependent variables, risk analysis techniques should be used to directly explore options to mitigate risk (Chapter 10).


C2 assessment often involves impacts on defense business outside the context of a particular campaign or operation. The study must address these impacts.

Principles Appropriate for OOTW C2 Assessments

Problem formulation must address the geopolitical context of the problem and seek to identify the “broad” C2 issues contained within the terms of reference for the study. There are no universal societal “norms.” Therefore, care must be taken in attempting to transfer the experience in one OOTW to another.

OOTW C2 assessments often involve policy-related impacts outside the context of a particular military operation. Therefore, MoM hierarchies must contain measures of policy effectiveness.

An historical perspective is critical to understanding OOTW because social conflict and structures often have roots far back in history. However, it must be remembered that present-day social behaviour is not driven by historical events themselves, but by present-day perceptions, processes, and prejudices which have evolved from the past.

A key risk in complex OOTW studies is allowing the problem formulation process to focus prematurely on subsets of the problem because they are: a) interesting; b) familiar; c) pre-judged to be critical; or d) explicitly called out by the customer. This requires great discipline by the study team, especially where the team’s previous experience is biased in favour of particular parts of the problem space. The assessment team needs access to subject matter experts from a broad range of disciplines (e.g., social scientists, historians, and regional experts in OOTW assessment).

Problem Formulation Process

During the early stages of problem formulation it is important to quickly cover the whole problem and produce an initial formulation (i.e., an explicit expression of the problem) (see Figure 3.1). This prevents premature narrowing of the assessment and serves as an aid to shared situation awareness within the study team.

Figure 3.1. The Formulated Problem

The process begins with the sponsor presenting the assessment team with a problem to assess and an articulation of broad constraints (e.g., schedule, resources). Based on a preliminary assessment of the problem, the team identifies the key issues to address.


This identification of key issues leads to a characterisation of the context for the study (e.g., relevant geopolitical factors, identification of the key actors and threats, identification of generic C2 issues, review of prior studies). Based on the results of this characterisation, the analysis team identifies what it perceives as the real issues to address. It is vital for the team to engage in a dialogue with the key sponsor and stakeholders to get “buy in” for these issues. Once that is achieved, the team must identify and characterise the remaining elements of the problem formulation phase. To facilitate that activity, the analysis team should identify/create and apply selected problem formulation tools and techniques (e.g., brainstorming, Delphi analyses, directed graphs, influence diagrams). The results of that activity will include a summary of the assumptions, high-level MoM, independent variables (both controllable and uncontrollable), and constraints on the variables. Once it is coordinated with the sponsor and stakeholders, the end product documents what is to be done in the analysis. The next key activity will be to develop a solution strategy that describes how the study is to be done.

Bounding the Problem/Issues and Assumptions

In dealing with fuzzy or uncertain boundaries, the problem formulation process needs to explore and understand the significance of each boundary before making (or seeking from customers) assumptions about it. This involves keeping an open mind, during the early stages of problem formulation, about where the boundaries lie and their dimensional nature. This is difficult because it makes the problem modelling process more complicated. A call for hard specification too early in the problem formulation process must be avoided. In the end, of course, the problem must be formulated in order to solve it, but formulation should be an output from the first full iteration, not an early input to it.

In formulating an OOTW problem, we are trying to bound a complex system. This is partly a process of understanding boundaries which exist in reality (e.g., mission statements, geographical areas, and the timing of a procurement process) and partly imposing artificial boundaries in order to illuminate the structure of the problem and constrain the scope of the analysis. To avoid the trap of over-specification, boundaries (especially self-imposed ones) should be kept porous, allowing for cause and effect chains to flow through the external environment of the portion of the complex system that the boundaries define.

While clear definitions and hard conceptual boundaries are ultimately necessary in order to create a manageable problem space, care must be taken to avoid coming to closure prematurely.

High-Level MoM

Identification of high-level MoM should start with ideal measures of the desired benefits before considering what can be practically generated by analysis (the latter may force the use of surrogate MoM, but these must be clearly related to the desired measures).

A structured analysis of potential benefits2 should be carried out as a basis for constructing appropriate MoM. Mapping techniques, such as cognitive and causal mapping, are a good way to express the various relationships within the problem space and to identify ‘chains’ of analysis (i.e., links among the independent variables and between the independent and dependent variables). These lead to a resultant structure in terms of independent and dependent variables, and hence to high-level MoM.
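Such a causal map can be captured in something as lightweight as an adjacency list and then traversed to enumerate the chains of analysis that connect independent variables to high-level MoM. The following Python sketch is purely illustrative; the variable names and linkages are hypothetical and would in practice be elicited from stakeholders and subject matter experts.

# Minimal causal-map sketch; variable names and links are hypothetical.
causal_map = {
    "sensor_coverage":     ["situation_awareness"],
    "staff_experience":    ["plan_quality", "decision_speed"],
    "network_bandwidth":   ["information_flow"],
    "information_flow":    ["situation_awareness"],
    "situation_awareness": ["plan_quality", "decision_speed"],
    "plan_quality":        ["mission_effectiveness"],   # a high-level MoM
    "decision_speed":      ["mission_effectiveness"],
}

def chains(var, target, path=None):
    """Enumerate every chain of analysis from a variable to a target MoM."""
    path = (path or []) + [var]
    if var == target:
        return [path]
    found = []
    for successor in causal_map.get(var, []):
        found.extend(chains(successor, target, path))
    return found

for chain in chains("network_bandwidth", "mission_effectiveness"):
    print(" -> ".join(chain))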

Problem Formulation Tools

It is useful to identify, develop (if necessary), and apply appropriate tools to support problem formulation. Representative tools and techniques include: techniques for supporting expert elicitation, influence diagrams, causal maps, system dynamic models, and agent-based models.

Problem Formulation is fundamentally a social process of developing a shared understanding. People skills, such as the ability to facilitate a ‘brainstorming session’ or to elicit information and context, are thus important. ‘Throwaway models’ (which may be simple simulation models, causal maps, system dynamic models, etc.) may be developed as part of the process, and then discarded as insight is gained.

Tools and approaches used for problem formulation must be consistent with other tools and techniques likely to be considered for the subsequent analysis in order to produce a sensible ‘multi-methodology’ approach to the entire problem and its solution.


Constraints on the Variables

The formulation of the problem is completed when the constraints on either the independent or dependent variables have been identified. Constraints on the dependent variables represent “acceptable” thresholds or limits. For example, one could place a constraint on blue loss, time to accomplish a mission, collateral damage, or some combination of factors. Constraints on the independent variables represent either feasible or acceptable limits on such factors as human performance, C2 system performance, or even supplies. They also could represent doctrinal or legal processes that act as constraints.
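As a minimal illustration (not part of the Code itself), such constraints on dependent variables can be recorded as acceptability tests and each study excursion checked against them. The variable names and thresholds in this Python sketch are hypothetical.

# Hypothetical acceptability thresholds on dependent variables.
constraints = {
    "blue_loss_fraction":   lambda v: v <= 0.05,   # at most 5% blue losses
    "mission_duration_hrs": lambda v: v <= 96.0,   # mission complete within 96 hours
    "collateral_incidents": lambda v: v == 0,      # no collateral damage accepted
}

def violated(results):
    """Return the names of any constraints that an excursion fails to satisfy."""
    return [name for name, ok in constraints.items()
            if name in results and not ok(results[name])]

excursion = {"blue_loss_fraction": 0.03,
             "mission_duration_hrs": 120.0,
             "collateral_incidents": 0}
print(violated(excursion))   # ['mission_duration_hrs']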

The Next Step

The next step in the C2 assessment process is the development of a solution strategy. It should be noted that the team is not finished with problem formulation at this point but is now ready to proceed to build a solution strategy. As work progresses on the development of a solution strategy, it will almost certainly be necessary to revisit the specification of high-level MoM and the constraints. This chapter concludes with a discussion of the products of problem formulation.

Products of Problem Formulation

Figure 3.2 depicts the essential elements of the formulated problem.


Figure 3.2. Problem Formulation

A checklist can be used to ensure that all the aspects described in the definition have been covered. These include:

• Precise statements of the question being researched;

• A list of independent variables;

• A list of high-level MoM; and

• A list of assumptions and constraints.

Diagrams

Typically, the problem formulation phase should also produce a number of diagrams, such as influence maps, which summarise the key issues and interactions.


Data Glossary

The problem formulation phase must begin to create a glossary of key data elements, metadata, information, and terms.

Chapter 3 Acronyms

C2 – Command and Control

MoM – Measures of Merit

OOTW – Operations Other Than War

TTP – Tactics, Techniques, and Procedures

Chapter 3 References

Checkland, P. (1999). Systems thinking, systems practice. Chichester, UK: John Wiley and Sons Ltd.

De Maio, J. (Ed.). (2000). The challenges of complementarity [Report on the fourth workshop on protection for human rights and humanitarian organisations]. Geneva, Switzerland: International Committee of the Red Cross.

Defence Science and Technology Laboratory (DSTL). (2000). A guide to best practice in C2 assessment: Version 1. Farnborough, United Kingdom.

Fisher, R. A. (1951). Design of experiments (6th ed.). Ontario, Canada: Oliver and Boyd, Ltd.

Flood, R. L., & Jackson, M. C. (1991). Creative problem solving: Total systems intervention (ISBN 0471930520). Chichester, UK: John Wiley and Sons Ltd.


Holt, J., & Pickburn, G. (2001). OA techniques for the future [Defence Evaluation and Research Agency (DERA) Unclassified Report]. Farnborough, United Kingdom: Defence Science and Technology Laboratory (DSTL).

Keppel, G. (1973). Design and analysis—A researcher’s handbook (ISBN 0-13-200030-X). Englewood Cliffs, NJ: Prentice-Hall.

Midgley, G. (1989). Critical systems and the problem of pluralism. Cybernetics and Systems, 20, 219-231.

Moffat, J., & Dodd, L. (2000). Bayesian decisionmaking and catastrophes. Defence Science and Technology Laboratory (DSTL) [Unpublished Report]. Available from [email protected].

Moffat, J., & Witty, S. (2000). Changes of phase in the command of a military unit. Defence Science and Technology Laboratory (DSTL) [Unpublished Report]. Available from [email protected].

NATO Open Systems Working Group (2000). NATO C3 Technical Architecture. AC/322(SC/5)WP/31, Brussels.

NATO Consultation, Command and Control Board (2000). NATO C3 System Architecture Framework. AC/322-WP/0125, Brussels.

NATO Consultation, Command and Control Board (2000). NATO Interoperability Management Plan (NIMP). AC/259-D/1247(rev), Brussels.


NATO Study Group on Modelling and Simulation (1998). NATO Modelling and Simulation Master Plan, Version 1.0. AC/323(SGMS)D/2, Brussels.

Pidd, M. (1996). Tools for thinking—Modelling in management science. Chichester, UK: John Wiley and Sons Ltd.

Prins, G. (1998). Strategy, force planning and diplomatic/military operations. London: The Royal Institute of International Affairs.

Rosenhead, J., & Mingers, J. (Eds.). (2001). Rational analysis for a problematic world revisited: Problem structuring methods for complexity, uncertainty and conflict (2nd ed.). Chichester, UK: John Wiley and Sons Ltd.

Windass, S. (2001). Just war and genocide—The Ottawa Papers. Woodstock, United Kingdom: Foundation for International Security.

Witty, S., & Moffat, J. (2000). Catastrophe theory and military decisionmaking. Defence Science and Technology Laboratory (DSTL) [Unpublished Report]. Available from [email protected].

The following additional documents are under development:

NATO C3 System Baseline Architecture

NATO C3 System Overarching Architecture

1 Broad C2 issues include key systems, doctrine, Tactics, Techniques, and Procedures (TTP), organisational structures, and key assumptions (e.g., system performance parameters).
2 The structured analysis of benefits is a logical process that seeks causally to map lower level MoM that can be related to investments or other actions to higher level MoM that can be valued directly by decisionmakers.


CHAPTER 4

Solution Strategies

Operations research is a scientific method. Executives have often in the past used some of the techniques…to help themselves arrive at decisions…But the term “scientific methods” implies more than sporadic application and occasional use of a certain methodology; it implies recognized and organized activity amenable to application to a variety of problems and capable of being taught.

—Philip M. Morse and George E. Kimball, Methods of Operations Research


The Study Plan

A conscientious effort is required to create and follow a study plan that guides data collection and analyses and prepares for the use of the insights and data to be collected to contribute to a solution to the problem at hand. The study plan consists of two inter-related parts—the formulated problem (the What) and the solution strategy (the How). The output of the initial problem formulation provides the assessment team with an operating definition of what needs to be done. The output of the solution strategy phase provides the team with an operating definition of how this will be accomplished. As the project unfolds, there will usually be a significant amount of iteration that modifies both the problem formulation and the solution strategy. Without a study plan, it is unlikely that needed efforts will be properly scoped, prioritised, scheduled, and resourced. Even if the way ahead seems clear, the articulation of a formal Solution Strategy is necessary.

The objective of this phase of the study is to develop a feasible approach to go from the specification of what is to be done to how it is to be done. This involves developing an approach that will result in the team’s ability to collect the data necessary to determine the values of the Measures of Merit (MoM) for specified values of the independent variables. The characteristics of data collection instruments and analysis tools and techniques will determine the resources required, the time needed, and the risks inherent in the solution approach. When compared to the study constraints and the problem formulation, it will be determined whether the solution approach is both feasible and satisfies the requirements of the problem formulation (e.g., measures the right things). Figure 4.1 depicts what is involved in moving from a problem formulation to a solution strategy.

Figure 4.1. From Problem Formulation to Solution Strategy

Key Definitions

Solution Strategy

A solution strategy consists of the specification of a set of sequential and parallel analytical steps, often involving several methodologies and tools. The solution strategy is designed to begin with what is known and, by execution of the specified steps, to lead to what one desires to know—an illumination of the issues. The strategy can be:

• simple—calculate mortgage payments by finding the input values for the payment equation and then evaluating the result (a sketch of such a calculation follows this list);


• moderately complicated—define input variables, output variables, and precision requirements, create a designed experiment, and run the experiment with appropriate measurements including the regression analysis; or

• extremely complex—identify the relevant variables and systems of variables, specify how they might be measured, hypothesize how they are related, and design research strategies that allow for complex adaptive systems or other “messy” structures or processes.
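The “simple” case can be made concrete in a few lines of code. The Python sketch below assumes the standard level-payment (annuity) formula for a repayment mortgage; the figures are illustrative only.

# Level monthly payment for a repayment mortgage (standard annuity formula).
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12.0          # monthly interest rate
    n = years * 12                  # number of monthly payments
    if r == 0:
        return principal / n
    return principal * r / (1.0 - (1.0 + r) ** -n)

print(round(monthly_payment(200_000, 0.06, 25), 2))   # approximately 1288.60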

The solution strategy must take the outputs from problem formulation, refine and operationalise them, and develop a plan to collect and analyse appropriate data (including the development and/or selection of models, the design of collection instruments, and the selection of analysis tools) to understand the relationships among the relevant variables associated with MoM, the scenarios, and human and organisational factors (Figure 4.2).

Measures of Merit (MoM)

MoM are a set of variables that focus the assessment on the issues of interest. In most analyses, these are the dependent variables. In many cases there are significant inter-relationships among the MoM.

Human Factors

Human factors consist of a set of variables that characterise concepts including beliefs, cultural norms, stress, fatigue, fear, arousal, morale, intelligence, and level of experience. In Command and Control (C2) assessments these are typically independent or intervening variables.

Organisational Factors

Organisational factors consist of a set of variables that characterise organisations, such as cohesion, command structure, explicit and tacit relationships, information flows, and organisational cultures. These are also typically independent or intervening variables in C2 analyses.

Scenarios

Scenarios consist of the evolution in time of several elements: a context (e.g., a characterisation of a geopolitical situation), the participants (e.g., intentions and capabilities of blue, red, and others), and the environment (e.g., natural factors such as weather, and manmade factors such as mines). In C2 assessments, the purpose of scenarios is to ensure that the analysis is performed within the appropriate range of opportunities to observe the relevant variables and their interrelationships.

Model

A model is a physical, analogue, or symbolic representation of relevant aspects of reality for a purpose. It is an abstraction of reality. A model emphasises particular aspects (a subset) of reality.

The assessment model of primary interest is the assessment team’s model (conceptual or in some analytic manifestation) of the C2 problem, including the variables of interest, their hypothesised relationships, and any prior assumptions about their values and linkages. The assessment team may also employ or develop other models or simulations in order to perform analysis or explore risks and uncertainties. Some teams will employ more than one of these analytic tools.

Tool

A tool facilitates the exploration of relationships among model variables and/or develops “solutions” (e.g., maximise value subject to constraints). A tool may be as simple as a checklist or an algorithm, or it may be an extremely large simulation. A simulation is the instantiation of a model that serves to facilitate the exploration of the relationships among the variables—it generates data for analysis and generally emphasises the passage of time. Models and simulations are frequently subdivided into categories of constructive, virtual, and live.

Occasionally, the distinctions among a tool, model, and data are subtle. For example, in a linear program the model is the set of formulas that specify the objective function and the constraints. The tool is the simplex method (or a similar algorithmic solution method). The data is an instantiation of the formulas (it provides values for the coefficients and constants). In the case of a simulation, the simulation environment and simulation engine are the tools, the coded simulation embodies the model, and the input values to the simulation comprise the data. Often, the simulation code and the data, together, are required for a complete definition of the model.
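The linear programming example can be made concrete in a short sketch. In the Python fragment below, which assumes SciPy is available and uses hypothetical coefficients, the formulas passed to the solver are the model, the solver itself is the tool, and the numerical arrays are the data.

# Toy linear program: maximise 40x + 30y subject to x + y <= 40, 2x + y <= 60, x, y >= 0.
from scipy.optimize import linprog

# DATA: an instantiation of the model's coefficients and constants.
c = [-40.0, -30.0]            # objective coefficients (negated because linprog minimises)
A_ub = [[1.0, 1.0],
        [2.0, 1.0]]
b_ub = [40.0, 60.0]

# MODEL: the formulas above, expressed in the standard form the tool expects.
# TOOL: the solver (here SciPy's HiGHS-based linear programming routine).
result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")

print(result.x, -result.fun)  # optimal decision variables and objective value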


Data

Data are the values associated with the variables. Data may be ratio, interval, ordinal, or nominal in scale. Data may originate from empirical observation; be derived from models, simulations, or analyses; be established from subject matter experts; or be established by assumption.

Developing a Solution Strategy

The development of a solution strategy is an iterative process that strikes an artful balance between what the team would like to do and what, given the state of the art, the available data, tools, schedule, and resources, is possible to do.

Prerequisites

The solution strategy should not be designed before the problem formulation process is substantially complete (Figure 4.1) and the problem formulation products specified in Chapter 3 are available to the team. This means that:

• The “real” question to be answered is known;

• The assumptions have been articulated;

• The high level MoM have been identified;

• The independent variables have been identified; and

• The constraints associated with the variables have been identified.


However, the assessment team should always remember the inherently iterative nature of the process. Adjustments may prove necessary in the problem formulation as the solution strategy matures.

Steps in Developing a Solution Strategy

As an initial step, the team should elaborate on the MoM to specify the detailed MoM that are to be evaluated. This is sometimes referred to as developing operational definitions for the MoM—definitions that specify the metric to be used, the instrument, and the context in which the measurement is to take place. Often the value of a particular MoM cannot easily be observed or measured and one or more surrogate measures are used in its place. In any event, the development of the set of MoM to be used in the study anchors the process that will eventually lead to a solution strategy.
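One lightweight way to record such an operational definition is as a simple structured record. The Python sketch below is illustrative only; the field names and the example MoM are hypothetical and not prescribed by this Code.

# Illustrative record for the operational definition of a single MoM.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperationalMoM:
    name: str                             # the measure of merit
    metric: str                           # how it is quantified
    instrument: str                       # how the value is captured
    context: str                          # where and when measurement takes place
    surrogate_for: Optional[str] = None   # set when a direct measure is impractical

mom = OperationalMoM(
    name="Speed of command",
    metric="Elapsed minutes from receipt of key information to issue of the order",
    instrument="Observer logs and time-stamped message traffic",
    context="Divisional HQ planning cycles during a CPX",
    surrogate_for="Timeliness of the C2 process",
)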

This process (Figure 4.2) revolves around the conceptual model that the assessment team builds; that model is at the heart of the process. It is best practice to make this model explicit and have it serve as the common picture that develops a high quality of shared understanding among the team, sponsors, stakeholders, and other key study participants. The initial conceptual model consists simply of the MoM, a first cut of the hypothesised relationships among them, assumptions about variables and their relationships, and constraints. Later iterations include additional independent variables that are known or assumed to affect the values of the MoM or the nature of the relationships among them, increasingly detailed specifications of relationships, and specific values or ranges for the independent variables.


Figure 4.2. Solution Strategy

The identification of human and organisational factors that impact model variables and relationships serves to flesh out the basic conceptual model generated in the problem formulation phase.

Scenarios then need to be derived to provide opportunities in an appropriate context for data collection and exploration of the variables and relationships contained in the conceptual model. The data the study requires are, in large measure, a derivative of the scenarios utilized and the design of the assessment.

The design of an assessment also requires specification of methods and tools and how they will be employed. Methods and tools are required to explore the relationships among the independent variables and between the independent and dependent variables. Complex solution strategies may be necessary. In these cases, multiple analyses will be implied. The problem must be divided into parts, each part requiring analysis with its own set of tools. Frequently the tools that are available do not provide interfaces from one part of the analysis to the next.

Taken together, the detailed specification of the MoM, the development of a conceptual model including the relevant human and organisational factors, the specification of a set of scenarios, and a data collection and analysis plan (that consists of the methods and tools to be used) constitute a solution strategy.

The solution strategy developed needs to be tested to see if it can be expected to address the issues at hand, within schedule and resource constraints, and with accepted levels of uncertainty and risk. However, uncertainty and risks are being continually assessed throughout the process of developing a solution strategy. The team should also consider the form of study output and its relevance to the decisionmaker. Iteration of these ideas with the stakeholders throughout the study helps to avoid surprises and to ensure that the basic assumptions underlying the study have not changed.

Iterating the Study Plan

Figure 4.3 depicts the iterative nature of the process involved in developing the Overall Study Plan, linking problem formulation and solution strategies together with the inputs from study sponsors and stakeholders.


Figure 4.3. Overall Study Plan

A first order feedback loop is shown between problem formulation and solution strategy, with both processes having iterative internal processes. An analysis of study risk and uncertainty provides the control mechanism that drives the iterative process to an acceptable result.

Study Management Plan

The team should also create and maintain a Study Management Plan (SMP) to guide the direction, management, and co-ordination of the project team. The SMP should include a detailed, time-phased execution plan for the study and a Work Breakdown Structure (WBS).1 The SMP should show the requirements for all the team products and their delivery dates, thereby creating delivery milestones for the execution of the study. It should show the planned dates for all scheduled meetings, including progress meetings and technical interchange meetings. The plan should also include a time-phased manning plan identifying the types, quantities, and period of performance for all members of the study team. The SMP should include details of the controls that will be applied to supervise any contractor performance. The team should maintain a current version of the SMP during the study period of performance. The SMP would typically include the following associated supporting plans:

• Study glossary;

• Analysis plan;

• Tool deployment and modelling and simulation plan;

• Data collection/engineering plan;

• Configuration management plan;

• Explicit consideration of risk and uncertainty;

• Quality assurance plan;

• Security plan;

• Review plan; and

• Plan of deliverables.

The elements of an ideal SMP are defined and discussed below.

Study Glossary

The assessment team should create and maintain a study glossary comprising all relevant definitions needed in the study. It should aim to create a general study glossary that is improved by every study that uses this glossary. As a starting point, the NATO AAP-6, “NATO Glossary of Terms and Definitions,” or the Joint Publication 1-02, “DOD Dictionary of Military and Associated Terms,” should be used.

Analysis Plan

The assessment team should also create and maintain an analysis plan for the study. The analysis plan should describe the analyses in detail. This description should include the analysis methodology, the tools to be used for analysis, the input data requirements, the Essential Elements of Analysis (EEA), the MoM to be used to evaluate the results, and any analysis assumptions.

Tool Deployment and Modelling and Simulation Plan

The team may create and maintain a tool deployment and modelling and simulation plan covering the needs of each task for numerical simulations or other applied means of operations analysis (OA). It should describe the use of tools and models and simulations in the feasibility study (FS). The plan should include a description of each tool to be used, a list of the key assumptions and caveats for each tool, an analysis of the suitability of each tool in addressing the functionality and performance issues in the FS, the source of input data for each tool, and the available output from each tool, and should detail any changes to these tools that are intended. The plan should indicate how data traceability from one tool to another will be maintained. The modelling and simulation plan may be included as part of the analysis plan in simple studies.

Data Collection/Engineering Plan

The team should create and maintain a Data Collection/Engineering Plan (DCEP) which covers data and metadata necessary to describe:

• The scenario;

• The essential elements of analysis in the analysis plan;

• The MoM to be used to evaluate the result (also in the analysis plan); and

• The input and output parameters of tools to be used within the study.

The DCEP describes and documents who owns the data, where the data can be found (including open sources like the Web), necessary methodologies and procedures to prepare the data, and assumptions and constraints connected to generated data, etc. The data definitions used in the DCEP have to be harmonised with the study glossary.
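A DCEP entry for a single data element might therefore look something like the following purely illustrative record; the field names and values are hypothetical and should be harmonised with the study glossary.

# Illustrative DCEP metadata record for one data element (hypothetical values).
dcep_record = {
    "data_element": "Order preparation time",
    "owner": "Exercise Control, host HQ",
    "source": "Time-stamped message logs collected during the CPX",
    "preparation": "Filter to operational orders; compute elapsed time per order",
    "assumptions": ["Log clocks synchronised to exercise time",
                    "Draft orders excluded from the count"],
    "constraints": "Releasable to the assessment team only",
}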

Configuration Management Plan

The team should create and maintain a configuration management plan. The plan should ensure identification, traceability, and control of the descriptions of the system elements, interfaces, and architectures considered in the FS, as well as the associated documentation. The plan should show how the study database must be controlled and updated. The plan should follow NATO STANAG 4159, “NATO Materiel Configuration Management Policy and Procedures for Multi-National Joint Projects” (1992), as a guideline. The plan should also ensure that the description of the system configuration being simulated or analysed can be precisely identified, as well as any system or technology improvements considered, with respect to an identified baseline configuration.

Study Risk Register

The team should also identify and assess the technical and schedule risks concerned with the successful completion of the FS. The list of risks identified should be maintained in a study risk register, which shows the probability of occurrence of each risk and its impact on the FS. The register should include the risk mitigation activity for each risk and the expected improvements to time and performance. The study risk register should be regularly maintained, should be available to the assessment team by arrangement, and should be presented at each progress meeting. The Generic Risk Register (GRR) developed during the SAS-026 efforts (Chapter 10) is recommended.
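A risk register can be kept in something as simple as a spreadsheet or a small script. The Python sketch below is illustrative only: it assumes a probability-times-impact exposure score for ranking entries, and its entries are hypothetical; the Generic Risk Register in Chapter 10 remains the recommended template.

# Minimal study risk register sketch with a hypothetical exposure score.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    probability: float        # chance of occurrence, 0..1
    impact: int               # consequence to the study, 1 (minor) .. 5 (severe)
    mitigation: str           # planned mitigation activity

    @property
    def exposure(self) -> float:
        return self.probability * self.impact

register = [
    Risk("Exercise cancelled, losing a data collection opportunity",
         0.3, 4, "Identify a fallback wargame event"),
    Risk("Key tool cannot represent the C2 variables of interest",
         0.5, 3, "Prototype the tool early against the conceptual model"),
]

for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
    print(f"{risk.exposure:.1f}  {risk.description}")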

Quality Assurance Plan

The team should create and maintain a quality assurance plan. The plan should declare all relevant quality standards and procedures that are to be applied in the course of the study, and should describe the quality organisation to be used, including the principal quality officers and their lines of authority. The policy and requirements for quality assurance in NATO are given in the following documents, which should be used as guidelines:

• STANAG 4107, “Mutual Acceptance of Government Quality Assurance and Usage of the Allied Quality Assurance Publications (AQAPs),” 6th Edition (March 1998).

• AQAP 100, “General Guidance on NATO Quality Assurance,” 2nd Edition (March 1995), or equivalent/comparable national standards.

The Security Plan

The team should create and maintain a security plan. This plan should contain the approach to the utilisation, storage, publication, dissemination, and control of classified and unclassified materials.

Review Plan

The team should create and maintain a review plan. For every critical phase of the study, preferably marked by respective milestones, reviews of the study have to be planned and executed. Participants should go beyond the members of the study team to include peer reviewers. The review results should be documented.

Plan of Deliverables

The team should also create and maintain a plan of deliverables for each phase of the study. This includes what is to be delivered, when it is to be delivered, to whom, in what form and format, and how many copies.

Chapter 4 Acronyms

C2 – Command and Control


DCEP – Data Collection/Engineering Plan

EEA – Essential Elements of Analysis

FS – Feasibility Study

GRR – Generic Risk Register

MoM – Measures of Merit

OA – Operations Analysis

SMP – Study Management Plan

WBS – Work Breakdown Structure

Chapter 4 References

With most textbooks, the bulk of the text is concerned with teaching about the available tools, not with how to use them as an ensemble. Each of the following references provides some information concerning solution strategies.

Clemen, R. T. (1990). Making hard decisions: An introduction to decision analysis. Pacific Grove, CA: Duxbury Press.

Gass, S. I., & Harris, C. M. (Eds.). Encyclopedia of operations research and management science (2nd ed.). New York: Kluwer Academic.

Hartley, D. S., III. (1996). Operations other than war: Requirements for analysis tools research report (K/DSRD-2098). Oak Ridge, TN: Lockheed Martin Energy Systems. Available from http://www.msosa.dmso.mil/ootw.html


Hartley, D. S., III. (2001). Military operations other thanwar. In S. I. Gass & C. M. Harris (Eds.),Encyclopedia of operations research andmanagement science (2nd ed.). New York:Kluwer Academic.

Hillier, F. S., & Lieberman, G. J. (1990). Introductionto operations research (5th ed.). New York:McGraw-Hill.

Hoeber, F. P. (Ed.). (1981). Military applications ofmodelling: Selected case studies. Alexandria,VA: Military Operations Research Society.

Hughes, W. P., Jr. (Ed.). (1989). Military modelling(2nd ed.). Alexandria, VA: Military OperationsResearch Society.

Jaiswal, N. K. (1997). Military operations research:Quantitative decisionmaking. New York: KluwerAcademic.

Olve, N. G., Roy, J., & Wetter, M. (2000). Performance drivers—A practical guide to using the balanced scorecard (ISBN 0-471-49542-5). Chichester, UK: John Wiley and Sons Ltd.

Quade, E. S. (Ed). (2000). Analysis for militarydecisions. Alexandria, VA: Military OperationsResearch Society.

Rosenhead, J. (2001). Problem structuring methods.In S. I. Gass & C. M. Harris (Eds.), Encyclopediaof operations research and managementscience (2nd ed.). New York: Kluwer Academic.


Wagner, H. M. (1975). Principles of managementscience (2nd ed.). Upper Saddle River, NewJersey: Prentice-Hall.

1A WBS is a decomposition of the effort into its constituent parts(or tasks) and the assignment of assets to the component tasks.Assets may be people’s time, facilities, or other elementsrequired to complete the task. WBS are common in constructionand engineering projects.


CHAPTER 5

Measures of Merit

It’s best to know what you are lookingfor, before you look for it.

—Winnie the Pooh, from A. A. Milne

In order to understand the impact of Command and Control (C2), it is necessary not only to analyse and measure the effect of C2 on military operations, but also the effects on the components of the constituent systems. No single measure or methodology exists that satisfactorily assesses the overall effectiveness of C2. Therefore, a multi-faceted and sometimes multi-phased approach is necessary. The benefits of C2 should be evaluated through their impact on the fulfilment of the military and policy objectives, and the impact of C2 should be measured in terms of specific qualities that are relevant to these objectives. A set of scenarios provides the contexts in which Measures of Merit (MoM) are determined.

MoM Challenges

During the last two decades, many new automatedC2 systems have been developed and fielded.However, the determination of both the performanceand the effectiveness of these systems has proven tobe a complex problem. Recognising this, the MilitaryOperations Research Society (MORS) has sponsoredseveral workshops on MoM since 1985. Theworkshops have led to the development of an analysisframework, Modular Command and Control EvaluationStructure (MCES), for the measurement ofperformance and effectiveness within a conceptualmodel for C2. Based on the MORS workshops, theUS Army’s Training & Doctrine Command AnalysisCenter (TRAC) developed the C2 Measures ofEffectiveness (MoE) Handbook in 1990. Thisdocument and the measurement tools developed forthe Headquarters Effectiveness Assessment System(HEAT) and the Army Command and ControlEvaluation System (ACCES) represented the thenestablished best practices.

The AC/243 Panel 7 Ad Hoc Working Group (AHWG) on the Impact of Command, Control, Communications & Intelligence (C3I) on the Battlefield acknowledged that the specification of measures of effectiveness is difficult. The 1992 final report recommended that a hierarchy of measures be established as an important step in understanding overall system effectiveness, and that systems be analysed at different levels of detail. The types of measures were grouped as relating to C2 system performance, force/commander effectiveness, and battle outcome. To quote from the final report, “Measures…are often inadequate and too model or scenario specific. In addition, they have often been generated in ad hoc ways, suggesting a lack of formal analysis in their development.” Since then, RSG-19, SAS-002, and SAS-026 have canvassed the field and have brought together the best ideas and practices in order to support MoM development and applications within C2 assessment.1 This version of the Code of Best Practice (COBP) extends this thinking and includes the Operations Other Than War (OOTW) domain.

Definitions

It has been recognised that a single definition formeasures of performance (MoP) and effectiveness(MoE) does not exist. MoM is recommended as ageneric term to encompass different classes ofmeasures. The measures are defined in hierarchicallevels related to each other, each in terms of its ownboundary. From the conceptual viewpoint, it isimportant to keep in mind the level of analysis and thecontext in which the measurements are made.

Within the MCES framework, MORS has developed a four-level hierarchy of measures, from high-level force effectiveness to low-level rudimentary measures of physical entities, which was adopted by RSG-19. In the context of OOTW, a fifth level is added, Measures of Policy Effectiveness (MoPE), to characterise the contribution of military actions to broader policy and societal outcomes. For OOTW, political factors are paramount, and considerations such as media coverage, local regional stability, and sustainment of community societal standards must be taken into account. Military missions may not directly achieve policy objectives, although they often strive to provide an environment more conducive to these objectives. However, MoE of military tasks should quantify performance against military missions, not the overall political aspirations.

The Code of Best Practice has adopted the followingfive levels of MoM:

• Measures of Policy Effectiveness (MoPE), whichfocus on policy and societal outcomes;

• Measures of Force Effectiveness (MoFE), whichfocus on how a force performs its mission or thedegree to which it meets its objectives;

• Measures of C2 Effectiveness (MoCE), whichfocus on the impact of C2 systems within theoperational context;

• Measures of Performance (MoP), which focus oninternal system structure, characteristics andbehaviour; and

• Dimensional Parameters (DP), which focus onthe properties or characteristics inherent in thephysical C2 systems.

Figure 5.1 emphasises the diminishing impact of aparticular MoM as the circle widens.


Figure 5.1. Relationships of Measures of Merit

Measurement Objectives

The important issues raised by decision-makers require a sense of the degree to which C2 performance may improve force and policy effectiveness. Therefore, C2 assessments are often called upon to provide convincing evidence of the expected improvements in mission effectiveness that can be attributed to improved C2. The ideal approach may be to define a single measure that reflects the military and political objectives of the missions under consideration. However, the determination of such a measure is generally not feasible, although not necessarily impossible for particular classes. In most decisionmaking problems, it is necessary to define several measures that together provide the necessary insights. A major reason for this is that a single measure may not provide sufficient scope and/or detail to analyse the impact of specific C2 elements, particularly second and third order effects or unintended consequences. Many analyses are conducted precisely in order to enable trade-offs between important equities, which can only be seen if a set of MoM is generated for analysis. The set of MoM selected must be comprehensive to ensure that all factors are considered.

MoM are used to compare different options on equalterms, and serve a wide range of purposes, including:

• Establishing a standard or expectation ofperformance (for new requirements);

• Establishing the bounds of performance of a systemas well as the effects of imposed constraints;

• Comparing and selecting alternative systems thatmay be very dissimilar but are designed toachieve a similar purpose;

• Assessing the utilisation of a system in new orunexpected application domains or missions;

• Identifying potential weaknesses in specific areasof an organisation or system (areas of high errorpotential or high user workload);

• Analysing the impacts of organisational changes;

• Analysing training effectiveness;


• Determining the most cost-effective approachesto achieve desired objectives;

• Comparing a replacement system, orcomponents of a system, against predecessorsor competitors;

• Assisting in generating and validatingrequirements and deriving specific C2requirements from broad statements of objectives;

• Evaluating the effectiveness of humandecisionmaking in the C2 cycle;

• Determining the degree of mission success; and

• Determining the return to normality in OOTW.

Characteristics of MoM

The development of an operational definition of the measure, the development of instruments, the application of the instruments to collect appropriate data, and the establishment of relationships among a group of MoM vary in difficulty, effort, cost, precision, generalizability, and other characteristics. These differences are often related to the type(s) of MoM involved and the domain in which the measurement needs to take place.

C2 assessments involve measurement of variables that exist in the physical, information, and cognitive domains.2 In general, the development of operational definitions, instrumentation, and data collection for variables to be measured in the physical and information domains is more straightforward and requires less effort and expense than dealing with variables that are measured in the cognitive domain. Furthermore, the former can be measured more “precisely,” are easier to comprehend, and are less subject to interpretation than the latter.

In general, DP, MoP related to systems, and MoFE tend to be measured in the physical domain (e.g., bandwidth, computing capacities, time to accomplish a task, force exchange ratios), while MoP related to measures of C2 effectiveness (quality of awareness, shared awareness, and trust) and MoPE (will of an adversary, public opinion) tend to be measured in the cognitive domain.

As one goes up the hierarchy of MoM, the measures tend to become more context, task, or mission specific. For example, the performance characteristics of systems (DP) apply to systems in general, but MoFE are usually limited to a set of tasks or missions. MoFE for combat are very different from MoFE for various OOTW. If done well, measures of C2 effectiveness will be scenario independent, so one can compare C2 effectiveness across a range of missions and circumstances.

Except for DP, any of the MoM can be either an independent or dependent variable in a given assessment, with any of the independent variables being either “controllable” or not. The difficulty in establishing relationships among the MoM varies as a function of the level of the independent variable.

The assessment team should recognise and plan for the difficulties associated with using various MoM and should avoid substituting easier to deal with, but less relevant, MoM. It is always better to try to measure (estimate, approximate) MoM that reflect first order effects than to precisely measure MoM that do not adequately reflect key aspects of the problem.

Measuring MoM

This section outlines measurement theory conceptsthat apply to ensure that the right measuringinstruments are selected and applied correctly. Bydefinition, measurement is the assignment of valuesto observation units that express properties of the units.Four levels of measures relate numbers to propertiesof interest: nominal (e.g., artillery vs. infantry), rank orordinal (e.g., worst to best), relative or interval value(e.g., change in temperature), and absolute value orratio (e.g., 2 kilobits per second). Analysts must ensurethat the specific MoM adopted are at the appropriatelevels of measurement.

The key properties for quality assurance are reliability and validity. Other significant properties include practical issues, such as the effort required to collect appropriate data and the convenience of measurement (e.g., whether the collection process itself interferes with the conduct of an exercise or experiment). Ideally, measurements should be easy to capture and easy to apply. There are clearly trade-offs to be made between MoM that may closely track the property of interest but that are costly and/or difficult to measure, and those that are less strongly related to the property of interest but that are easier to measure. The effort required for collection bears no direct relationship with validity, but reliability may be related to cost. Reliable measurements require repeated observations and appropriate sample sizes. Reliability represents accuracy and consistency. A cost-effective measurement plan provides enough data for useful and definitive conclusions. However, cost and/or convenience of measurement may be an overriding factor in system evaluation.

Failure to take validity and reliability into account raisesthe risk of generating false conclusions. Validity andreliability are not absolutes, but matters of degree.Validity is the degree to which a measure characterisesthe attribute of interest and only that attribute. Complexconcepts often require multiple measures to providevalid information. In order to make a valid link betweenthe performance of a system as a whole againstperformance of its components, the measures mustcorrespond to critical tasks. Reliability representsaccuracy and repeatability. A measure may be reliablebut not valid, or it may be valid but not reliable.

Validity

The properties of validity may be categorised into fivetypes: internal, construct, statistical, external, and expert.

• Internal validity is defined as the establishment of causal relationships between variables of interest. This is necessary to accept a hypothesis that a given measure is responsible for a specific effect on another measure;

• Construct (also referred to as content) validity means that the target objects, and only the target objects, are measured;

• Statistical validity implies that sufficient sensitivity is involved in order to determine relationships between independent and dependent variables. Statistical tests control two types of errors in measurement: Type I error, or alpha, is the probability of rejecting a hypothesis that is true; Type II error, or beta, is the probability of accepting a hypothesis that is not true;

• External validity implies that the results may beextended to other populations or environments; and

• Expert validity refers to the degree to whichmeasures are accepted by those knowledgeablein the field.

A MoM should meet the validity-related criteria outlinedin Table 5.1.

Table 5.1. Validity Criteria of Measures

Reliability

Reliability involves the expectation of errors associated with measurements. It is defined as the accuracy of a measurement, as reflected in the variance of repeated measurements of the same phenomenon. The key principles of reliability are consistency (repeatability) and accuracy. The variance associated with measurement must be known or estimated to interpret results and to discriminate between real effects and measurement effects.
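The following sketch illustrates the point numerically. The data values and the two-standard-error screening rule are illustrative assumptions, not prescribed practice.

```python
import statistics

# Repeated measurements of the same MoM (e.g., minutes to produce an updated
# plan) under nominally identical conditions.
baseline = [42.0, 45.5, 41.0, 44.0, 43.5, 46.0]   # current C2 arrangement
candidate = [38.0, 40.5, 37.0, 41.5, 39.0, 38.5]  # proposed C2 arrangement

def summarise(sample):
    mean = statistics.mean(sample)
    stdev = statistics.stdev(sample)         # measurement variability
    std_err = stdev / (len(sample) ** 0.5)   # uncertainty of the mean
    return mean, stdev, std_err

base_mean, _, base_se = summarise(baseline)
cand_mean, _, cand_se = summarise(candidate)
difference = base_mean - cand_mean
combined_se = (base_se ** 2 + cand_se ** 2) ** 0.5

print(f"Observed improvement: {difference:.1f} minutes (+/- {combined_se:.1f})")
# Crude screening rule: treat the effect as real only if it clearly exceeds
# the measurement variability (roughly two combined standard errors).
print("Likely a real effect" if difference > 2 * combined_se else "Within measurement noise")
```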


The reliability-related criteria that MoM should meet are outlined in Table 5.2. Additional criteria that have proven useful in the past are outlined in Table 5.3.

Table 5.2. Reliability Criteria of Measures

Table 5.3. Other Useful Criteria

Practical MoM Issues

The assessment of C2 requires the application of aframework to yield values for appropriate MoM.Analyses of C2 systems and processes often reveal acomplex hierarchical composition. A structuredresolution/functional decomposition approach may berelated to the organisational structure to yieldperformance measures for the organisation as a whole,individual components within the organisation, andspecific tasks within the organisational cells.

If the analyst assumes that C2 effectiveness is positively correlated with overall military unit effectiveness, MoM could be obtained by addressing the outcomes or products of such unit activities. Goal-level evaluation attempts to define the ability of the specific military formation to make the system state match the goal (directive) provided by the superior headquarters. These are measures of force effectiveness. The degree to which the system state matches the desired goal states indicates a level of effectiveness. Alternatively, C2 effectiveness may be viewed as dependent on the functional processes of the C2 system, with measures obtained mainly at the task level.

A C2 assessment framework encompasses severalfactors that must be considered iteratively, asdiscussed in the introductory chapter. Typical factorsimportant for the identification and selection of MoMinclude the:

• Assessment configuration, e.g., storyboard,testbed, constructive simulation, field trial;

• Assessment goal or purpose;

• Context, assumptions, and constraints;

• Scenarios or stimuli;

• Collection means, e.g., subject matter experts,automatic data logging;

• Analysis plan; and

• Interpretation of results.

Categories of Measures

A common thread in the approaches for C2assessment is the functional decomposition of the C2cycle. C2 effectiveness depends upon the functionalprocesses of the C2 system, and the evaluation offunctions may be determined by data measured at thetask level.

The evaluation of tasks provides the most detailed insight into C2 activities. The primary measures are expressed in terms of time consumed and accuracy. Task analysis must be performed prior to evaluation, with the identification of the task definition and the critical elements for successful task completion.

Measures of a C2 system’s behaviour may thus bereduced to measures based on time, accuracy, or acombination which may be interdependent. Timebased measures are quantitative, while accuracymeasures may be quantitative or qualitative.

For C2 tasks, time-based metrics include the:

• Time taken to react to an event (time to notice, process, and act upon new information);

• Time to perform a task (time to make a decision);

• Time horizon into the future for predictive analysis; and

• Rate of performing tasks (tempo).

Metrics for accuracy include:

• Precision of the observed system(s) performance;

• Reliability of the observed system(s) performance;

• Completeness (known unknowns, unknown unknowns);

• Errors (alpha, beta, omission, transposition,severity); and

• Quality of information produced.

Some accuracy measures may be calculated in units of time, e.g., the time taken to detect an error. Quality of decisions is difficult to evaluate objectively, except by focusing on outcomes. The processes involved may have to be examined to obtain objective measures, or subject matter experts may be consulted to make an evaluation. Accuracy of information implies both the accuracy of the data and the accuracy of the interpretation of the data.

Time-based and accuracy measures often bear an inverse relationship, implying a trade-off between speed of performance and accuracy of performance. Speed of performance must be specified in terms of minimum desired accuracy or completeness, and accuracy measurements in terms of time available. Therefore, the specification of thresholds or standards for metrics must be referenced in terms of imposed constraints.
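A minimal sketch of scoring a time-based measure against an accuracy constraint follows. The task log and the completeness threshold are invented for illustration.

```python
# Each record: (task, time_to_complete_minutes, completeness_fraction)
task_log = [
    ("situation update", 12.0, 0.95),
    ("situation update", 9.0, 0.70),   # fast, but below the required completeness
    ("situation update", 15.0, 0.90),
]

MIN_COMPLETENESS = 0.85  # minimum desired accuracy/completeness for this task

# Only observations meeting the accuracy constraint count towards the time-based MoM.
valid_times = [t for _, t, c in task_log if c >= MIN_COMPLETENESS]
if valid_times:
    mom = sum(valid_times) / len(valid_times)
    print(f"Mean time at >= {MIN_COMPLETENESS:.0%} completeness: {mom:.1f} min "
          f"({len(valid_times)} of {len(task_log)} observations qualify)")
else:
    print("No observations met the accuracy threshold; the time-based MoM is undefined.")
```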

Examples of time and accuracy based measures arecompiled in Table 5.4. Table 5.5 provides someadditional examples, specifically MoP and MoCE.

Table 5.4. Examples of Time- and Accuracy-Based Measures


Table 5.5. Examples of MoPs and MoCEs

Example Headquarters C2 MoM

C2 measures may also be divided into setscorresponding to the sequential steps of the C2 cycle.These include:

• Monitoring and understanding: informationtransmission, values, times, effect, comprehension;

• Planning: information exchange, coordination,impact, flexibility, process quality; and

• Directing and disseminating.

The MoM for C2 can also be focused on four levels: anetwork of headquarters, a single headquarters, theindividual cells within the headquarters, andperformance of specific tasks within the cells.


OOTW MoM

While national NATO policies require that military forces be prepared for high intensity conflict, forces have been increasingly involved in low-intensity conflicts, and C2 analyses for OOTW are therefore becoming important. OOTW include force deployment to create or maintain conditions for a political solution in order to avoid escalation into hostilities. Threats to international and national security may also unfold from natural disasters, terrorism, organised crime, civil unrest, migration, or other territorial intrusions. Most OOTW are inherently joint or combined operations.

While the determination of MoM has already been noted as difficult, OOTW offer an even greater challenge for MoM. Traditional MoFE such as loss exchange ratios, combat effectiveness, or duration of the campaign are rarely applicable to OOTW. In such operations, military forces may play important roles, but political concerns may limit the scope of imposable solutions. Public and political pressures may result in shifts in the selection of criteria for MoM; e.g., more emphasis may be placed on personnel casualties and less on equipment losses.

OOTW MoPE

While MoFE and MoCE provide measures of success for military operations, MoPE measure the degree of attaining political objectives. In some cases, such as humanitarian assistance or nation building, MoPE may measure the degree of improvement in the quality of life of the populace.


MoFE usually were the highest MoM used within the analysis of Article V missions,3 assuming that effectiveness is directly related to higher level MoPE such as “winning the war.” However, such an assumption may not always apply to OOTW. For example, military actions that would be highly effective in accomplishing mission objectives in war might be quite counterproductive in OOTW. In fact, the value of military actions in OOTW is not so much a question of physical effects, but rather of how military actions and their physical effects are perceived by the various actors and the population in the theatre, how the military actors interpret the behaviour of the other actors, and the critical mission task conditions such as, for example, political interest and media attention.

McCafferty and Lea developed low-level military-relatedmeasures (MoCE) to cover OOTW (McCafferty and Lea,1997). The MoM, which are classified as MoCE, include the:

• Time between the arrival of friendly forces in thearea and their deployment;

• Time between deployment of friendly forces andcontact with adversary forces;

• Length of time adversary forces were underobservation without posing a threat to friendly forces;

• Length of time friendly forces are in potentialdanger (i.e., adversary forces have theopportunity to fire on friendly forces); and

• Time horizon of friendly C2 processes (how farinto the future they are focused).


Mobility may be important for OOTW, as well assustainability and self-sufficiency in theatre, with theimplication of emphasising measures of reliability andmaintainability. Moreover, the perception of thecapabilities of deployed forces acts as deterrence orcoercion on the parties in conflict.

Some examples identified by McCafferty and Lea are:

• Opportunities to employ forces, which reflects therange of military capabilities available;

• Strategic deployment, which is related todeploying and recovering the right force totheatre efficiently and in time;

• Endurance, to maintain an effective force intheatre for an extended time;

• Mission objectives, to measure the success ofachieving military objectives in OOTW; and

• Successful termination, to deal with progress tothe desired end state (the criteria may be politicaland thus not measured by military activities).

One class of effectiveness indicators in OOTW isprovided by transition measures, which focus on theprogress in the transfer of responsibilities to the follow-on military force or civil agency. Transition measuresfocus on the degree that follow-on organisationsassume tasks and responsibilities.

Progress toward success may be tracked by normalityindicators, which are indirect measures of the effectsof military involvement in OOTW, although causalrelationships are difficult to prove (Lambert, 2000).


These MoPE may be obtained by evaluating the extentto which conditions have been restored.

Normality indicators measure the level of improvementin the quality of life of the general population, and maybe defined as “relative measures of the state ofnormalcy characterising an element of the civilenvironment, through data collected on a regular basisand assessed to have the frequency, quantity,consistency and coverage required to make a usefulobjective assessment of the changes occurring in thecivilian populace.” (Department of National Defence,Canada, 1999). Normality indicators can be groupedin categories and adapted to meet the changingrequirements: political, socio-economic levels ofdevelopment, cultural, legal and technological.

Table 5.6. Normality Indicators
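Purely as an illustration of how such indicators might be combined relative to a baseline, a minimal sketch follows. The categories, baseline values, and equal weighting are illustrative assumptions rather than prescribed practice.

```python
# Baseline ("normal") and currently observed values for a few illustrative
# normality indicators, collected on a regular basis for one locality.
baseline = {"schools open": 120, "price index": 100, "power hours per day": 20}
observed = {"schools open": 78,  "price index": 140, "power hours per day": 14}

# Indicators where a *lower* observed value is closer to normal.
lower_is_better = {"price index"}

def normality(indicator, value):
    base = baseline[indicator]
    ratio = base / value if indicator in lower_is_better else value / base
    return min(ratio, 1.0)  # cap at 1.0 = fully restored to the baseline

scores = {ind: normality(ind, val) for ind, val in observed.items()}
overall = sum(scores.values()) / len(scores)  # equal weights, for illustration only

for ind, score in scores.items():
    print(f"{ind:22s} {score:.2f}")
print(f"{'Overall normality index':22s} {overall:.2f}")
```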

Limitations of normality indicators include:

• Inexperienced personnel available for data collection and analysis. A mix of inexperienced civil and military personnel are often assigned to collect data in fields foreign to them; training may be necessary to assure reliable and valid data;

• Temporary effects due to military presence for data collection. The mere presence of military personnel collecting data may affect normality measurements;

• Difficulties in obtaining valid and reliable data calibrated against baselines. It may be difficult to establish the threshold for “normality” if archive data is not available;

• Extrapolation in space and time from a specificlocality. It may be inappropriate to extrapolate civilprogress to an entire region. Sampling is important;

• Limited resources and constraints for collectionand analysis. Data must be collectedconsistently but may be occasionallyunobtainable due to physical inaccessibility orlack of personnel;

• A “snapshot” which may not provide trends ifinfrequently obtained. Trend analysis requiressufficient data and sampling rates; and

• During OOTWs, the relevant MoM may changeover time, particularly the lower level MoCE andMoP. For example, during the earliest phases ofNATO operations in Bosnia, tracking weaponsystems and knowing how many of them were incantonments was a major MoM. Once the forceswere separated and most weapon systems undercontrol, emphasis shifted to the disruptive activitiessuch as road blocks and ethnic harassment. Asthese behaviours became less frequent, NATO’semphasis shifted to normality indicators.

MoM Hierarchies: Some Examples

Modular Command and Control Evaluation System

Evaluation of C2 effectiveness requires a comprehensive approach for the preparation of the evaluation process, the collection of data, and its interpretation. MCES addresses both the managerial and analytical aspects of evaluation and was originally developed for the systematic comparison of C2 systems. The objective of MCES is to guide analysts in the identification of appropriate measures for estimating the effects of C2 on combat.

MCES prescribes a process of measurement, but does not identify either a measurement system or a set of measures. Similarly, while calling for the collection of data, MCES does not provide details on how data are to be collected. MCES does provide guidance on how good measures and good collection procedures are characterised, but leaves the details of the measurement, data collection, and analysis plans to the analyst.

MCES considers C2 as consisting of three components:

• Physical entities (equipment, software, people);

• Structure (interrelationships of entities); and

• Processes (C2 functions).

The boundary of a C2 system here is defined as a delineationbetween the system studied and the environment. The USArmy’s Training and Doctrine Command (TRADOC) C2MoCE Handbook adds mission objective as the top layerof the hierarchy of C2 components.

MCES focuses on measures as opposed to models,but includes the cybernetic loop model of generic C2.It consists of seven procedural steps.


HEAT

While originally developed for theatre-level combat applications, the HEAT system has proven robust. For example, it has been used to assess US Department of State crisis task force performance; military operations in Grenada, Panama, and Haiti; and several exercises focused on peacekeeping and humanitarian assistance missions. The underlying C2 processes (monitoring; understanding; developing alternative actions; predicting the consequences of each alternative for each possible future under consideration; deciding; developing and promulgating plans, directives, and requests for support; seeking information; making reports; and responding to inquiries) are all relevant across the range of OOTW missions. HEAT has been modified since 2000 to include measures of the quality of collaboration, which is often a key process in OOTW.

ACCES

ACCES is a derivation of HEAT, which was developed primarily for joint theatre-level operations. ACCES reorganised HEAT concepts into Army doctrinal language and doctrine, but shares the same philosophy. ACCES has been applied to numerous division and corps command centre assessments. It represents a comprehensive set of practical and objective performance measures for C2 activities. The primary focus of ACCES is the overall performance of a command centre or network of command centres, at various stages of the C2 process, from the collection of data to the conversion of data to intelligence to the implementation of plans and directives. The underlying approach of ACCES is that C2 comprises interdependent sub-processes which can be observed and measured. ACCES considers C2 as an adaptive control process, where information collected from the outside is processed internally to generate plans that may be adapted to reflect new information. ACCES takes the view that the overall effectiveness of a command centre can be judged by the viability of its plans. A good plan is one that can be executed without the need for modification beyond the contingencies stated in the plan and that remains in effect throughout its intended life.

Multi-Attribute Utility Theory (MAUT)

MAUT is similar to ACCES in the sense that both use functional decomposition and function-specific evaluation metrics. The major difference is that MAUT can be used with any set of metrics (including those from ACCES), which must be specified by the analyst. MAUT assigns weights to the MoM at each level of the MoM hierarchy and utility values or scores at the lowest level. MAUT then aggregates the weighted scores upwards to provide composite scores of effectiveness. MAUT, if properly used with appropriate application of judgmental weights, will allow integrated analyses based on multiple MoM. While this is often satisfying to decisionmakers (it provides a single index of quality), analysts should always monitor the components of such indices, as they may provide insight into strengths and weaknesses of the C2 system. For example, many OOTW C2 problems involve a variety of objective functions and trade-offs. MAUT results should always be assessed by sensitivity analyses.
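A minimal sketch of this kind of weighted aggregation is shown below. The hierarchy, weights, and scores are invented for illustration, and the additive rule used here is only one possible composition form (see the caveat in the next paragraph).

```python
# A two-level MoM hierarchy: top-level MoM -> weighted lower-level MoM.
# Scores are utilities on a 0-1 scale, elicited at the lowest level.
hierarchy = {
    "C2 effectiveness": (0.6, {
        "quality of shared awareness": (0.5, 0.70),
        "plan viability":              (0.5, 0.80),
    }),
    "C2 performance": (0.4, {
        "timeliness of reporting":     (0.7, 0.90),
        "completeness of picture":     (0.3, 0.60),
    }),
}

def aggregate(node):
    """Additive weighted aggregation over (weight, child) pairs."""
    total = 0.0
    for weight, child in node.values():
        score = aggregate(child) if isinstance(child, dict) else child
        total += weight * score
    return total

# Report the components alongside the composite, as recommended in the text.
for name, (weight, children) in hierarchy.items():
    print(f"{name}: {aggregate(children):.2f} (weight {weight})")
print(f"Composite score: {aggregate(hierarchy):.2f}")
```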


Many applications of MAUT assume additive composition of MoM. However, this is a very restrictive assumption that needs to be validated in each case. In addition, MoM may interact, suggesting that the aggregation of MoM is at least partly multiplicative (Keeney and Raiffa, 1976; Sarin, 2001).

Collaboration C2 Metrics

The following collaboration metrics have evolved out of work done by Evidence Based Research, Inc. for the United States Office of Naval Research. These collaboration metrics focus on individual and team cognition/awareness, team behaviour, and team products. Individual cognitive metrics measure team members’ understandings about their mission and their team, and team cognitive metrics apply the individual cognitive metrics to quantify the level of awareness in a team. There are four classes of these metrics (illustrated in the sketch that follows this list):

• Averages of the understanding among team members;

• Extent of alignment of these understandings;

• Maximum level of understanding anywhere withinthe team; and

• Presence of gaps in understanding throughoutthe entire team.
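The four classes might be computed from individual scores as in the sketch below. The rating scale, staff roles, and gap threshold are illustrative assumptions only.

```python
# Each team member's understanding of the mission, scored 0 (none) to 1 (complete),
# e.g. from a structured questionnaire.
understanding = {"J2": 0.9, "J3": 0.8, "J4": 0.4, "J5": 0.85}
GAP_THRESHOLD = 0.5  # below this, a member is treated as having a gap

scores = list(understanding.values())
average = sum(scores) / len(scores)              # 1. average understanding in the team
alignment = max(scores) - min(scores)            # 2. spread (smaller = better aligned)
peak = max(scores)                               # 3. maximum understanding anywhere in the team
gaps = [m for m, s in understanding.items() if s < GAP_THRESHOLD]  # 4. gaps

print(f"Average understanding: {average:.2f}")
print(f"Alignment (range):     {alignment:.2f}")
print(f"Peak understanding:    {peak:.2f}")
print(f"Members with gaps:     {gaps or 'none'}")
```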

Team behaviour metrics measure the key behaviours indicative of effective teams. These behaviours include smooth and efficient synchronisation, efficient information exchange, adaptability, effective workload distribution, and team member engagement. Team product metrics measure product quality and team efficiency and are the bottom-line “proof of the pudding” metrics, applicable whether a team or a single individual produces the product.

Other metrics under consideration are: taskperformance, workload, level of engagement (buy-in),synchronisation, information needs workload andhandling, workload awareness and handling, andproblem awareness and handling.

Other Considerations in theSelection and Interpretation of MoM

Effects of Uncertainty

In order to state a level of confidence in theinterpretation of MoM, the underlying assumptionsmust be clearly stated and uncertainties recognised.Uncertainties manifest themselves in several ways thatmay affect MoM. They may be grouped as follows:

• Study assumptions—(uncertainties in thescenario, model input);

  Relevance to the purpose of the evaluation,uncertainties in the military objective,knowledge of enemy concept of operations,intentions, capabilities, weapon performance,uncertainties in terrain data, etc.

• Modelling assumptions (uncertainties in the model, structural uncertainty); and

  Human performance, parameters, objects, attributes, processes, effects of constraints, effects of aggregation and de-aggregation, deterministic (usually high hierarchical level but low resolution) versus stochastic models, especially in OOTW; and

  Uncertainties about the implications of value changes in lower-level MoM with respect to the values of higher-level MoM (e.g., increases in arrest rates at late stages of an OOTW may be negative, while increases in arrest rates at early stages may be positive).

• Model sensitivity (uncertainties in the outcome)

  Hypersensitivity to input variations (instability or chaos theory), effects of model non-linearities and non-monotonic behaviour (effects of thresholds), decisionmaking for local versus global optimisations, etc.; and

  Sensitivity analysis may be applied to identify uncertainty. By varying the assumptions and input data within their plausible ranges, excursions in the analysis, verified by subject matter experts, provide insight into the effects of uncertainty (a minimal excursion of this kind is sketched below).
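The following one-at-a-time sweep illustrates such an excursion. The surrogate model, parameters, and plausible ranges are purely illustrative; a real study would call its own simulation or analytic model.

```python
def mission_delay(report_latency, error_rate):
    """Toy surrogate model: predicted planning delay (minutes) as a function
    of two uncertain C2 inputs. A real study would call its simulation here."""
    return 30 + 4.0 * report_latency + 200.0 * error_rate

baseline = {"report_latency": 5.0, "error_rate": 0.05}
plausible_ranges = {"report_latency": (2.0, 10.0), "error_rate": (0.01, 0.15)}

print(f"Baseline outcome: {mission_delay(**baseline):.1f} min")

# Vary one assumption at a time across its plausible range and record the swing.
for name, (low, high) in plausible_ranges.items():
    outcomes = [mission_delay(**dict(baseline, **{name: value})) for value in (low, high)]
    swing = max(outcomes) - min(outcomes)
    print(f"{name}: outcome varies over {min(outcomes):.1f}-{max(outcomes):.1f} min "
          f"(swing {swing:.1f})")
```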

Impact of Technological Changes

The rapid pace of technological change involving information systems is causing major changes in the way C2 is perceived and executed, leading to potential changes in the way war fighting commands are organised. For example, the last decade has seen the emergence of collaborative technologies, which enable new ways to command and control military forces and for them to interface with other actors. To keep pace with and to evaluate the impacts of these changes, the nature of these changes and impacts needs to be understood so that the appropriate MoM can be developed.

One approach to doing this postulates the differences likely to occur between today’s C2 and future C2, and then describes an evaluation methodology, including MoM, to measure the impact of the changes. Other approaches are Social Construction of Technology (SCOT) (Bijker, 1989), Socio-Technical Networks (Elzen, Enserink and Smit, 1996), and Actor Network Theory (ANT) (Latour, 1997).

Conclusions

No single measure or methodology exists thatsatisfactorily assesses the overall effectiveness of C2systems. As a minimum, the following factors must beconsidered in conducting an analysis of C2:

• Determine the appropriate levels of MoM hierarchy;

• Identify specific MoM which are practically obtainable;

• Specify means of collection of MoM;

• Assure the validity and reliability of measures for correctinterpretation with quantifiable levels of confidence;

• Be aware that variation in measurements (e.g.,due to human factors) may well causeunacceptable levels of uncertainty. Hence theanalyst must pay particular attention tomeasurements related to the human element;

• Consider that while MoPE and MoFE may provide the most persuasive measures from the military perspective, MoCE and MoP are the most readily derivable by operations analysts; and

• Account for the principles of reliability and validity to avoid the risk of generating false conclusions.

Recommendations for Generation andSelection of MoM

The principal objective for MoM is to support judgements of the degree to which C2, or changes to C2, may improve force effectiveness, and to provide convincing arguments for the improvements. It is important to stress that the purpose is to assess the contributions of C2 in terms of how C2 improves the effectiveness of military missions, and not the quality of the C2 process itself. However, to arrive at these assessments and assign attribution to the C2 system, the C2 system must be included in the analysis. To achieve this objective, the following steps are required:

• The objectives for the assessment must beestablished and clearly stated;

• Selection of MoM should not be done in isolationfrom consideration of the assumptions,constraints, models, tools, scenarios, or otherelements of the analytic plan and assessmentprocess. The assumptions used in the modeland/or evaluation must be stated along with theirpotential impact on the results;

• A detailed assessment of reliability and validity ofthe selected measures needs to be made inorder to determine a level of confidence inmeasures; and


• For C2 acquisition analyses, the generation ofmeasures should occur in parallel with thedevelopment of the system, so that as thesystem is being matured, developers can knowthe standards to which they are being held.

Summary of the Challenges and Issuesin the Evaluation of C2

• Correlation of MoPE and MoFE with C2 processmeasures (e.g., battle outcome against lower-level measures) is difficult;

• Separation and linkage of the respectiverelationships between C2 and users,organisations and military objectives requiressome effort;

• Aggregated measures (e.g., as obtained fromACCES or MAUT) have limitations in thediagnosis of C2 success or failure. A carefulanalysis is required to provide a comprehensiveassessment of highly complex C2 systemsbased on a small number of summary measuresof outcome and process;

• The assessment of the reliability of measures inan environment where sample sizes are smallwill remain difficult and may require the use ofnon-parametric statistics;

• The analyst must pay attention to the complextask of establishing and measuring controlvariables in order to achieve correlation ofmeasures against a wide spectrum of scenariosand staff;


• Defining criteria to differentiate measures mustbe established;

• Verifying measurement criteria (e.g.,discrimination) must be ensured;

• For the near future, collecting data to support C2measures will remain labour intensive becauseC2 processes remain human intensive;

• Many of the measures for information processingconcern completeness of the information.Deciding what makes information completerequires coordination and cooperation betweenthe assessor and the user;

• The relationship between outcome and processmay be complex because C2 is an integratedsystem with continuous feedback;

• The analysis of uncertainties and measures ofcentral tendency and dispersion are bothsignificant when examining C2 issues; and

• A Command Post Exercise (CPX) is a usefulvenue for evaluation of C2. However, the costsinvolved generally preclude conducting a CPXsolely to evaluate the C2 process or a C2 system.Cost control is increasingly leading to the use oflaboratory and human in the loop experimentationto develop knowledge and insights.

Summary of the Challenges and Issuesfor OOTW MoM

• Cost/benefit: cost in time and effort for datacollection and analysis may outweigh benefit;


• Standardisation of data collection, evaluation,and analysis with many diverse non-governmental organizations (NGOs), UNagencies, and militaries;

• Factors outside military control: other agencies,policy restrictions;

• Creation of a rich and comprehensive set of MoMto preclude reliance on a limited number of MoMas the key to success;

• Availability and consistency of information (e.g.,in OOTW, a possible consequence of differentfactions controlling geographic areas);

• Merging of strategic, operational, and tacticaldomains; and

• Clear recognition of roles and responsibilities ofall participants.

Chapter 5 Acronyms

ACCES – Army Command and Control Evaluation System

AHWG – AC/243 Panel 7 Ad Hoc Working Group

ANT – Actor Network Theory

C2 – Command and Control

C3I – Command, Control, Communications & Intelligence

COBP – Code of Best Practice

CPX – Command Post Exercise

DP – Dimensional Parameters


HEAT – Headquarters Effectiveness Assessment System

MAUT – Multi-Attribute Utility Theory

MCES – Modular Command and Control Evaluation Structure

MoCE – Measures of C2 Effectiveness

MoE – Measures of Effectiveness

MoFE – Measures of Force Effectiveness

MoM – Measures of Merit

MoP – Measures of Performance

MoPE – Measures of Policy Effectiveness

MORS – Military Operations Research Society

NGO – Non-Governmental Organization

OOTW – Operations Other Than War

SCOT – Social Construction of Technology

TRAC – TRADOC Analysis Center

TRADOC – US Army Training & Doctrine Command

Chapter 5 References

This chapter is based on the results of the SAS-002 workshopon Measures of Merit. Additional references include:

Adelman, L. (1992). Evaluating decision support and expert systems. Chichester, UK: John Wiley and Sons Ltd.

Bijker, W.E. (1990). The Social Construction ofTechnology. PhD Thesis University of Twente,Eijsden, The Netherlands.


Bornman, L. G., Jr. (1993). Command and controlmeasures of effectiveness handbook (C2 MoCEhandbook) (TRAC-TD-0393). Fort Leavenworth,KS: U.S. Army Training & Doctrine Command(TRADOC) Analysis Center.

Brown, R. V., Kahr, A. S., & Peterson, C. (1974).Decision analysis: An overview. New York: Holt,Rinehart and Winston.

Canadian Department of National Defence DocumentB-GG-005-004/AF-023. (1999). Civil-Militarycooperation in peace, emergencies, crisis andwar. Available on-line at http://www.dnd.ca/eng/index.html

Defence Systems. (1982). Theatre headquarterseffectiveness: Its measurement and relationshipto size, structure, functions, and linkages; Vol.1: Measures of effectiveness and theheadquarters effectiveness assessment tool.McLean, VA: Author.

Defining and Measuring Success in the New StrategicEnvironment [Technical Cooperation ProgramPanel JSA-TP-3] Proceedings Report (1999)Ottawa, Ontario, Canada.

Duff, J. B., Konwin, K. C., Kroening, D. W., Sovereign,M. G., & Starr, S. H. (1992). In T. J. Pawlowski(Ed.), Command, control, communications,intelligence, electronic warfare measures ofeffectiveness [Final Report]. Fort Leavenworth,KS: Military Operations Research Society(MORS) workshop.


Elzen, B., Enserink, B., & Smit, W. A. (1996). Socio-technical networks: How a technology studies approach may help to solve problems related to technical change. Social Studies of Science, 26, 95-141.

Engel, R., & Townsend, M. (1991). Division CommandAnd Control Performance Measurement.Toronto, Ontario, Canada: Defence and CivilInstitute of Environmental Medicine (DCIEM).

Fliss, R., Feld, R., Geisweidt, L., Kalas, W. D., Kirzl,J., Layton, R., et al. (1987). Army commandand control evaluation system user’s handbook(MDA903-96-C-0407). Ft. Leavenworth, KS:Army Research Institute for the Behavioral andSocial Science.

Halpin, S. M. (1992). Army command and controlevaluation system (ACCES): A brief description(Working paper LV-92-01). Ft. Leavenworth,KS: US Army Research Institute for theBehavioral and Social Sciences.

Hawkins, G. (1984). Structural variance and otherrelated topics in the SHAPE Armour/Anti-Armour study. In R. K. Huber (Ed.), Systemsanalysis and modeling in defense—development, trends, and issues. New York:Plenum Press.

Human Systems. (1993). Survey of evaluationmethods for command and control systems.Ontario, Canada.


The impact of C3I on the battlefield Panel 7 on thedefence applications of operational research(AC243 panel-7 TR/4). [North Atlantic TreatyOrganisation (NATO) unclassified technicalreport] (1994). Ad Hoc Working Group on theimpact of C3I on the battlefield. Brussels,Belgium: NATO Defence Research Group.Available from www.nato.int

Keeney, R. L., & Raiffa, H. (1976). Decisions with multiple objectives: Preferences and value tradeoffs. Chichester, UK: John Wiley and Sons Ltd.

Lambert N. J. (1998). Operation Firm Endeavour:Campaign Monitoring; Factional complianceand the Return to Normality in Bosnia andHerzegovina. Operational Analysis Branch, HQARRC, Queens Avenue, JHQ Rheindahlen,41179 Moenchengladbach, Germany. Alsoavailable from the British Library DocumentSupply Centre, Boston Spa, Wetherby, WestYorkshire, LS23 7BQ, United Kingdom,Reference DSC Shelfmark: 99/20361 and 99/20367 (annexes).

Lambert, N. J. (1998). Operational analysis in the field:The utility of campaign monitoring to theimplementation force (IFOR) operation inBosnia and Herzegovina 1995—1996 .Proceedings from Cornwallis III, Analysis forPeace Operations. (ISBN 1-896551-22-X). TheLester B Pearson Canadian InternationalPeacekeeping Training Centre, CornwallisPark, PO Box 100, Clementsport, NS BOS1EO, Canada.


Lambert, N. J. (in press). Measuring the Success ofthe North Atlantic Treaty Organisation (NATO)Operation in Bosnia and Herzegovina 1995—2000 . European Journal of OperationsResearch, Special 2000 Edition.

Latour, B. (1997). Bruno Latour’s keynote speech: Onrecalling ANT. In Actor network and after. Centrefor Social Theory and Technology (CSTT),Department of Sociology and Social Anthropology,Keele University, United Kingdom. Available fromhttp://www.comp.lancs.ac.uk/sociology/stslatour1.html

Latour, B. (1999). On Recalling ANT. In J. Law & J.Hassard (Eds.), Actor network theory and after(pp. 15-25). Oxford, United Kingdom: BlackwellPublishers.

Leroy, D., & Hofmman, D. (1996). Criterion ofcommand, control, communications,intelligence (C3I’s) effectiveness. 1996Command & Control Research & TechnologySymposium Proceedings. Washington, DC:National Defense University.

McCafferty, M. C., & Lea, S. C. (1997). Measures ofeffectiveness for operations other than war(CDA/HLS/R9714/1.1). Farnborough, UnitedKingdom: Center for Defence Analysis.

National Research Council. (1998). Modeling humanand organizational behavior—Application tomilitary simulations (ISBN 0-309-06096-6).Washington, DC: National Academy Press.Available from http://www.nap.edu


1. The following reports can be ordered from http://www.rta.nato.int: RTO-MP-029 AC/323(SAS)TP/9, Workshop on C3I Modelling (April 1999); RTO-MP-038 AC/323(SAS)TP/12, Modelling and Analysis of Command and Control (June 1999); RTO-TR-9 AC/323(SAS)TP/4, COBP on Command and Control (March 1999).

2. David S. Alberts, John J. Garstka, Richard E. Hayes, and David A. Signori, Understanding Information Age Warfare (Washington, DC: CCRP Publication Series, August 2001).

3. Article V of the North Atlantic Treaty, which deals with classic military attacks on members.

Olve, N. G., Roy, J., & Wetter, M. (2000). Performance drivers—A practical guide to using the balanced scorecard (ISBN 0-471-49542-5). Chichester, UK: John Wiley and Sons Ltd.

Sarin, R. K., (2001). Multi-attribute utility theory. In S.I. Gass & C. M. Harris (Eds.), Encyclopedia ofoperations research and management science(Centennial ed.). Boston: Kluwer Academic.

Sovereign, M., Kemple, W., & Metzger, J. (1994).Command, control, communications,intelligence evaluation workshop (C3IEW).Measures Workshop, Phalanx, UnitedKingdom.

Sweet, R., Metersky, M., & Sovereign, M. (1985).Command and control evaluation workshop[Military Operations Research Society (MORS)C2 MoCE Workshop]. Monterey, CA: NavalPostgraduate School.

Tolk, A. (1995). Zur Reduktion struktureller Varianzen[On the reduction of structural variances].Unpublished doctoral dissertation, University ofthe Federal Armed Forces of Germany, ProUniversitate Verlag, Sinzheim.


CHAPTER 6

Human Organizational Factors

If I had time…to study, I think that I should concentrate almost entirely on the ‘actualities of war,’ the effect of tiredness, hunger, fear, lack of sleep, weather…It is the actualities that make war so complicated and so difficult, and are usually neglected by historians.

—Field Marshal Archibald Wavell, 1883–1950, Author of Soldiers and Soldiering


Importance of Human andOrganisational Factors

The human dimension largely distinguishes command and control (C2). Key differences between C2 analyses and traditional military operations analysis (OA) applications include the need to deal not only with military organisations, but also with distributed military teams (and organisations) under stress and their decisionmaking behaviour as well. Moreover, in operations other than war (OOTW), consideration must be paid to the behaviour of and interaction with non-military organisations, political groupings, and amorphous groups such as crowds and refugees. Thus, the formulation of the problem and the development of solution strategies cannot be completed without explicit consideration of both human and organisational issues.

The human factors of interest fall into three major categories:

• Human behaviour related to performancedegradation, such as stress and fatigue, and asa consequence of social interactions amongindividuals and members of groups;

• Decisionmaking behaviour (cognitive questions)including the cognitive complexity of the issuesand the capacities of the commanders or otherdecisionmakers of interest; and

• Command style.

By contrast, organisational factors deal withrelationships among groups of individuals, includingconnectivity, roles, and organisational structures.


Since both human and organisational factors can affectC2 performance, the operations analyst must considertheir impact early in the research design process andreview a priori assumptions about them in an iterativemanner throughout the entire analytical process.Human and organisational factors must be consideredas part of structuring the problem, selecting measuresof merit (MoM), defining scenarios, developing solutionstrategies, and selection of tools.

The first key consideration when structuring theproblem is whether individual decisionmaking andbehaviour of individuals or groups is important to theC2 processes under analysis. If the research questioncan be answered without considering differencesbetween individual decisionmakers and groups thenthe additional complexity that issue introduces shouldbe avoided. For example, in addressing a C2 issuethat deals, all other things being equal, with a simplechange in connectivity (which headquarters will havewhich linkages to others), human behaviour may notbe important to the analysis at least as long as combatmissions are involved. However, the same change inconnectivity might affect relations with non-militaryorganisations and individuals essential for missionsuccess in OOTW suggesting their response must beaccounted for in an appropriate manner. Thus,deciding on the importance of human issues in OOTWand identifying the issues at stake requires a goodgeneral understanding of human behaviour and itsunderlying motivations beyond military common senseas well as knowledge about relevant cultural factorsand the interests of the parties involved.


Human Factors

Human Performance and Behaviour

Human performance affects behaviour and vice versa.Human performance depends on psycho-physiologicalvariables (e.g., stress, fatigue, sleep deprivation, hunger,and alertness) and on ergonomic and external factorslimiting performance and behavioural freedom.Individual and group behaviour is the result of socialinteraction. It includes interactions by militarycommanders and their troops, underlying psychologicalprocesses and factors (e.g., fear, morale, and values),and the cultural, educational, and religious backgroundof individuals. There is significant historical evidencethat inferior combat potential as measured in terms ofnumbers of personnel and weapon systems may becompensated for by superior human performance inbattle (Dupuy, 1979).

Any time human performance and/or behaviour are at issue, parameters and/or models will be needed to reflect those issues. For example, systems that involve human activity, such as watch or command centres, need to be studied in ways that reflect differences in C2 performance that can be traced to human performance/behaviour issues. In addition, differences can arise from experience or training, coalition features (e.g., language, national doctrine, command style), or service/branch-unique doctrine and practice. These kinds of individual performance and behaviour issues may be modelled in two ways. They can be treated stochastically, in a manner that reflects their occurrence in “real” systems depending on situational factors (black box approach), or in terms of process-oriented behavioural models, describing the psychological processes behind the observable behaviour of individuals or groups in a given situation (Long Term Scientific Study SAS-017 on Human Behaviour Representation, Final Report, Chapter 2). The decision that human performance and behaviour may vary meaningfully will have a clear impact on the choice of models and analytic approaches (e.g., stochastic processes or action-theoretic models such as Norman’s Activation Trigger Scheme (ATS) for simulating the dynamics of actions or reactions) (Norman, 1981).
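
A minimal sketch of the "black box" stochastic treatment described above, written in Python with purely notional distributions and coefficients, might look like the following. The fatigue and workload effects shown are illustrative assumptions, not validated human factors data.

```python
import random

def sample_task_performance(base_minutes, hours_awake, workload, rng=random):
    """Sample (task time in minutes, error occurred?) for one staff task.

    The situational factors (hours awake, workload in [0, 1]) shift the
    distributions; all coefficients are notional assumptions.
    """
    fatigue_factor = 1.0 + 0.02 * max(0.0, hours_awake - 16)   # assumed
    workload_factor = 1.0 + 0.5 * workload                     # assumed
    mean_time = base_minutes * fatigue_factor * workload_factor
    task_time = rng.lognormvariate(0.0, 0.25) * mean_time

    # Error probability grows with fatigue and workload (illustrative, capped).
    p_error = min(0.9, 0.05 + 0.01 * max(0.0, hours_awake - 16) + 0.10 * workload)
    return task_time, rng.random() < p_error

if __name__ == "__main__":
    random.seed(1)
    runs = [sample_task_performance(10.0, hours_awake=20, workload=0.7)
            for _ in range(1000)]
    mean_time = sum(t for t, _ in runs) / len(runs)
    error_rate = sum(e for _, e in runs) / len(runs)
    print(f"mean task time {mean_time:.1f} min, error rate {error_rate:.2f}")
```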

Where human performance is considered to be a meaningful factor (e.g., C2 within command centres when wearing chemical protection gear), some experimentation may be necessary to develop realistic parameters for the impact on error rates or the pace of work. In other cases, such as simple fatigue, human factors specialists may be able to provide valid parameters from work in other contexts. Such specialists are often valuable members of research teams. In any case where the workflow and work rate within command centres is relevant, human performance parameters must be considered. In addition, human performance issues will have some effect on decisionmaking. Error rates increase as people become tired and overloaded, altering the way they work and the information they consider.

However, research on how to represent human behaviour and its impact on performance in models is still in an early stage. This is particularly true in the context of OOTW. Those operations require the co-operation of non-military actors. In addition, military mission objectives include providing security and assistance in the reconstruction of conditions in the theatre of operations and helping to stabilise the situation to a degree that ultimately permits the military forces to leave. The use of force in OOTW is limited in degree and kind to responding appropriately to threats for defensive and protective purposes and to coercive actions for enforcing compliance with agreements. Military actions that would be highly effective in accomplishing mission objectives in war might be counterproductive in OOTW. In fact, the value of military actions in OOTW is not so much a question of physical effects, but rather how military actions and their physical effects are perceived by the various actors and the population in the theatre, how the military actors interpret the behaviour of the other actors, and the critical mission task conditions (e.g., political interest and media attention). Thus, a careful analysis of mission tasks by human science experts is indispensable for modelling and assessment of C2 options in OOTW (Baeyer, 2001).

It follows that, in addition to the traditional composition of operations research/operations analysis (OR/OA) assessment teams, assessors of C2 for OOTW must possess hands-on experience with such operations and relevant non-military organisations (e.g., private aid organisations). The assessment team must also have access to experts from the fields of political science, cultural anthropology, demography, sociology (including media impact research), social psychology, and individual psychology. These experts will contribute the expertise for diagnosing the relevance of, and differences in, performance and behaviour of actors and for the formulation of hypotheses for assessing the “human issue risk” of analysis results. In addition, they may contribute to the modelling of behavioural processes in analysis tools used to test C2 system sensitivity via parametric variations of human performance and behaviour parameters.

However, well-documented empirical knowledge on human performance in military operations is scarce, and little is known about its relevance in circumstances other than those prevailing when the underlying data were collected. Similarly, experience and systematically compiled data on the behaviour and response of individuals and groups to actions and situations in OOTW are still limited, and theories on human behaviour are mostly untested in the context of military operations. Therefore, dealing with human issues in C2 analyses reduces to the problem of addressing decisions under risk and uncertainty, where each of the C2 design options is tested for the range of possible hypotheses on the implications of human issues for C2 effectiveness.

Human Decisionmaking

Increasingly, assessment teams have to deal with issues where individual decisions are important. This is especially true for OOTW, in which even tactical-level decisions by a lower-level military leader may have strategic implications because of media presence. This represents a major challenge because the variety of human behaviours involved makes modelling decisionmaking very difficult. Fortunately, there are some approaches that can be used to cope with these difficulties. The correct choice, however, will depend on the research issue(s).

In some cases the analyst is asked to assume that decisionmaking will follow established doctrine and tactics, techniques, and procedures (TTPs). In these cases, the challenge is to craft a set of rules or lookup tables that reflect the existing guidance correctly. Hence, a model that replicates the “correct” set of decisions would be useful for assessing simple C2 issues such as the impact of new communication and information technology or changes in connectivity or supporting relationships within the force. However, maximising the benefits of technical and organisational changes in a C2 system might require an appropriate adaptation of doctrine and TTPs. Therefore, testing the impact of established rules should be part of the analysis.

Testing the impact of rules is an indispensable prerequisite for models that have built-in sets of rules that are not driven by approved TTPs, but rather by opinions of subject matter experts or modellers whose rationales have been neither validated nor accredited. Considerable knowledge exists on how to organise and validate such expert elicitation. Here again, specialised team members may be helpful. Simple adoption of models developed from subject matter experts will put the assessment team at considerable risk of accepting false conclusions. When such models must be adopted, they should be explored in detail to uncover their driving assumptions and subjected to sensitivity analyses (Chapter 9). Where this cannot be done, these models are best avoided when C2 assessments are performed.
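
One simple form such a sensitivity analysis can take is a one-at-a-time sweep of a driving assumption embedded in an expert-derived rule, observing how a measure of merit responds. The Python sketch below uses an invented detection-threshold rule and notional probabilities purely for illustration:

```python
import random

def mission_success_rate(detection_threshold, trials=10_000, seed=1):
    """Estimate a notional measure of merit for one setting of the rule parameter."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        signal = rng.gauss(0.6, 0.2)               # assumed signal distribution
        detected = signal > detection_threshold    # expert-supplied rule under test
        if detected and rng.random() < 0.8:        # assumed engagement success rate
            successes += 1
    return successes / trials

if __name__ == "__main__":
    # Sweep the driving assumption across a plausible range and record the MoM.
    for threshold in (0.3, 0.5, 0.7, 0.9):
        print(f"threshold {threshold:.1f} -> success rate {mission_success_rate(threshold):.2f}")
```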

From human factors research on stimuli that influence human decisionmaking, we have learned that human decisionmaking capability is degraded in some situations and enhanced in others. These stimuli may originate from, inter alia:

• Biological/physiological processes (e.g., physical overexertion [fatigue], use of bio-chemical substances, and/or sensory deprivation);

• Cognitive sources (e.g., confusion arising in unfamiliar or unknown situations);

• Psychological processes (e.g., processes causing emotions and stress);

• Social processes (e.g., group dynamics that coerce or reinforce individual decisionmaking depending on accepted social norms, organisational infrastructure, and procedures);

• Environmental factors (e.g., darkness, austere and/or uncomfortable environments); and

• The decision support tools and technologies that humans use (e.g., information displays and decision support software). It is important to look beyond technology at whether or not human decisionmaking is improved, or even constrained in some cases, by using computerised decision aids, depending upon their functionality and configuration.

Types of Decisions

The nature of the decisions being supported by C2 systems will also enable the assessment team to make intelligent decisions about how they influence the analysis. Three useful decision types can be distinguished:

• Automatable decisions;

• Contingent decisions; and

• Complex decisions.

Automatable Decisions

Automatable decisions fall into the category of “simple decisions.” The range of decision options is finite and known, and the criteria for selecting among them are clear. Basic sensor-to-shooter decisions are simple decisions that are usually automated for the sake of timely response (e.g., anti-aircraft or missile defence systems). Similarly, the selection of patrol routes, inspection strategies, and many logistics decisions relevant for OOTW can be automated. For example, scheduling can be seen as an optimisation problem in which time, space, and priorities are traded off to generate a “best” answer. Even though the decision environment is constantly changing due to factors such as weather and mechanical problems, scheduling decisions are characterised by rules and algorithms. Models of automatable decisions of this kind can be built relatively easily.
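
As an illustration of how readily such decisions can be captured in rules, the Python sketch below assigns hypothetical patrol tasks to time slots using a simple priority rule; the tasks, priorities, and rule are invented for the example:

```python
def schedule_patrols(tasks, slots):
    """Assign tasks to time slots: highest priority first, earliest feasible slot.

    tasks: list of (name, priority, duration_in_slots); slots: number of slots.
    """
    free = [True] * slots
    plan = {}
    for name, _priority, duration in sorted(tasks, key=lambda t: -t[1]):
        # Rule: take the earliest run of consecutive free slots that is long enough.
        for start in range(slots - duration + 1):
            if all(free[start:start + duration]):
                for s in range(start, start + duration):
                    free[s] = False
                plan[name] = (start, start + duration)
                break
    return plan

if __name__ == "__main__":
    tasks = [("route A patrol", 3, 2), ("route B patrol", 5, 3), ("convoy escort", 4, 2)]
    print(schedule_patrols(tasks, slots=8))
```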

However, where the C2 system employs humans to make these choices, some error rate parameters will be needed if the results are to be meaningful. For example, error rates may increase if the time available to make a decision is insufficient or physical demands induce fatigue. Even where the operational concept calls for the use of automated systems, the analyst should explore the quality of the data, information, or knowledge used to drive the process and the likelihood that humans will be involved in collection or fusion.

In these fully automatable decisions the assumption is that “to know is to decide.” In these cases, if uncertainty were adequately reduced, the correct course of action or decision would be obvious. In that case, decision theory classifies the problem as a “decision under certainty.” These problems are trivial except for the determination of the utility function in case there are more than two selection criteria that need to be considered and/or when constraints need to be accounted for to complete the analysis (Keeney and Raiffa, 1976).
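
Assuming the multi-attribute formulation of Keeney and Raiffa, a minimal sketch of such a choice under certainty is an additive utility over criteria plus a feasibility constraint; the weights, scores, and constraint below are illustrative only:

```python
def choose(options, weights, feasible=lambda option: True):
    """Pick the feasible option with the highest additive utility."""
    candidates = [o for o in options if feasible(o)]
    utility = lambda o: sum(weights[c] * o["scores"][c] for c in weights)
    return max(candidates, key=utility)

if __name__ == "__main__":
    options = [
        {"name": "COA 1", "scores": {"timeliness": 0.9, "coverage": 0.5}, "cost": 3},
        {"name": "COA 2", "scores": {"timeliness": 0.6, "coverage": 0.9}, "cost": 5},
    ]
    weights = {"timeliness": 0.4, "coverage": 0.6}                       # assumed preferences
    best = choose(options, weights, feasible=lambda o: o["cost"] <= 4)   # assumed constraint
    print("selected:", best["name"])
```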

Contingent Decisions

The next level of decisionmaking complexity is best thought of as contingent decisions. These are cases where the commander has thought through the situation and developed a set of alternative actions or decisions that are appropriate to the situation, but further information on the operational environment will be needed to determine which is the proper course of action. In other words, “to know is to decide, but knowing is not yet possible.” In some NATO countries the research community terms this “opportunistic decisionmaking.”

In most cases a lack of clear, precise knowledge is unavoidable. For example, the commander in a defensive posture may recognise that the adversary has several potentially viable options. The adversary may not even know which alternative he will choose. In such a case, the defending commander would both develop courses of action to meet likely contingencies and also undertake a variety of information collection activities designed to provide as much warning as possible when the attacker selects a main attack option.

Modelling contingent decisions is much more difficult than modelling automatable decisions, but is similar in that an underlying set of rules or algorithms still drives the process. The added complexity comes from the need to find the time when information is adequate to select one of several actions. The best models for that purpose are essentially hypothesis testing models. They align information about the operational environment against a finite set of alternative futures and perform probability calculations to determine when the commander has enough confidence to act, or to estimate the information gain in terms of the expected value added to decisions as new information arrives (Sherrill and Barr, 1996). Information entropy is also a valuable measure of the information state of the commander. It can also be extended to a measure of information dominance (Perry and Moffat, 1997).
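
A minimal sketch of such a hypothesis-testing treatment (not the published models cited above) is Bayesian updating of a belief over adversary courses of action, with entropy summarising the commander's information state and a confidence threshold triggering commitment; the reports and likelihood values are invented:

```python
import math

def bayes_update(prior, likelihoods):
    """Update a belief over hypotheses given the likelihood of the latest report."""
    posterior = {h: prior[h] * likelihoods[h] for h in prior}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

def entropy(belief):
    """Information entropy (bits) of the commander's belief state."""
    return -sum(p * math.log2(p) for p in belief.values() if p > 0)

if __name__ == "__main__":
    belief = {"main attack north": 1 / 3, "main attack south": 1 / 3, "feint only": 1 / 3}
    reports = [  # P(report | hypothesis), invented values
        {"main attack north": 0.7, "main attack south": 0.2, "feint only": 0.4},
        {"main attack north": 0.8, "main attack south": 0.3, "feint only": 0.5},
        {"main attack north": 0.9, "main attack south": 0.2, "feint only": 0.3},
    ]
    for likelihoods in reports:
        belief = bayes_update(belief, likelihoods)
        print({h: round(p, 2) for h, p in belief.items()}, "entropy:", round(entropy(belief), 2))
        if max(belief.values()) > 0.8:
            print("confidence threshold met: execute the matching contingency plan")
            break
```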

Complex Decisions

Finally, “complex” decisions are very difficult to model. These require the decisionmaking system to:

• Recognise when a decision needs to be made;

• Identify the relevant set of options;

• Specify the criteria by which they will be judged; and

• Determine when the decision will be made.

Examples of complex decisions include the definition of missions at the operational level, decisions to change the fundamental activity of the organisation (e.g., shift from the offence to the defence), and the process that creates courses of action in response to events on the battlefield or in OOTW. Except when doctrinal answers are available, complex decisions are very difficult to model and even more difficult to validate or accredit. Most successful efforts dealing with complex decisions have used “human-in-the-loop” techniques and relied on the quality and variety of experts employed for reliability and validity. Some promising research on modelling complex decisions in military operations has been completed in the UK, and this is in the process of being incorporated as the core of the next generation of closed form simulation models of conflict being developed by the Defence Science and Technology Laboratories (DSTL) (Moffat, 2000; Moffat, 2002). Similar research on tactical decision automata is under way in Germany to improve the capability of simulation systems for analysis as well as for training and staff exercise support (Hofmann and Hofman, 2000; Baeyer, 2001).

Command Style

Assessment teams often encounter the argument that decisionmaking depends on the “commander’s style.” Moreover, they are told, systems must be designed to support commanders with differing styles. Because it is an elusive and multi-dimensional concept, command style represents a challenge to modelling. However, this factor can be accommodated if the analyst is able to develop a clear concept of the alternative command styles that must be recognised and their consequences for military decisionmaking.

Attributes of the Commander

Differences in command style may be reflected by appropriate attributes such as the background and training of commanders, their decision and order style, risk tolerance, and operational experience. For example, in conjunction with field experience, the background and training of commanders affect the richness of their understanding of the military situation and their capacity to influence it.

Organisational Style

Another, not totally unrelated, typology deals with the degree to which the commander uses a formal decomposition of the situation versus a holistic, integrated vision. The decomposition style of management is associated with hierarchical and segmented work, as in the Napoleonic or classic German general staff. This heavily structured process allows centralised control and tight coupling between the structure of the problem, the structure of the supporting staff, and the flow of information within and between command centres. The classic centralised commander imposes his style on the C2 process and impacts key organisational issues as well as decision style.

The alternative command style is an open and holistic one in which senior staff and commanders from related command levels are directly involved in a broad development of courses of action and implementation plans. This more open process also has implications for the information flow within and between command centres. While decisions are still made authoritatively at the centre (by the commander or senior staff), they tend to generate loose guidance (mission type orders) and to allow lower level commanders and their staffs more latitude in implementation.

Risk Style

Another typology applied by some practitioners is the degree to which individual commanders (and doctrines) are risk averse versus attracted to risk. Most military enterprises have some properties that impel commanders to minimise risk. The fact that lives, national treasure, and serious national interests are involved in warfare suggests that risk averse strategies will tend to dominate. However, some military commanders are more comfortable with greater risk. Indeed, outnumbered or otherwise disadvantaged forces must often take risks in order to prevail. To the extent that the relative risk aversion of commanders is relevant to the C2 analyses underway, assessment teams will need to define and model variables that represent this factor (Schultz, n.d.).

Recent research on Bayesian decisionmaking has indicated that the perception of risk, as measured by utility loss relative to a goal value, and the perception of future outcomes can in combination give insight into the way these factors affect command decisionmaking (Moffat, 2002).
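
One hedged way of making risk attitude explicit in a model, consistent with a utility-based treatment but not drawn from the cited research, is to give commanders utility functions of different curvature over the outcome value relative to a goal; the courses of action and numbers below are illustrative:

```python
def expected_utility(outcomes, utility):
    """outcomes: list of (probability, value) pairs."""
    return sum(p * utility(v) for p, v in outcomes)

def preferred(courses_of_action, utility):
    return max(courses_of_action, key=lambda c: expected_utility(c["outcomes"], utility))

if __name__ == "__main__":
    goal_value = 100.0
    courses_of_action = [
        {"name": "deliberate advance", "outcomes": [(0.9, 60.0), (0.1, 40.0)]},
        {"name": "bold thrust",        "outcomes": [(0.5, 100.0), (0.5, 10.0)]},
    ]
    risk_averse = lambda v: (v / goal_value) ** 0.5    # concave: penalises shortfalls
    risk_seeking = lambda v: (v / goal_value) ** 2.0   # convex: rewards the big payoff
    print("risk-averse commander prefers: ", preferred(courses_of_action, risk_averse)["name"])
    print("risk-seeking commander prefers:", preferred(courses_of_action, risk_seeking)["name"])
```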

Orders Style

Commanders, and national command styles, have also been shown to differ in the degree of detail contained in directives to subordinates. At one end of the spectrum is the commander who issues detailed orders that specify what is to be done, how it is to be accomplished, and when and where the specified activities are to occur. At the opposite end of the spectrum is the commander who issues “mission type orders,” which simply specify the mission to be accomplished and leave decisions about the detailed objectives, forces to be employed, critical terrain, and timing up to the subordinate commanders. In between are those who specify a series of linked objectives (cross the river, take the high ground in the north, and be prepared to defend or carry the attack north-east into the valley) and supporting detail (e.g., forces available, rough timetable keyed to the objectives) but leave subordinates with considerable discretion within that guidance. Both the speed of the C2 process and the distribution of C2 work across command centres (particularly planning and operations management) will vary greatly depending on the commander’s style on this dimension. National doctrine and practice may also influence this factor.

Other Typologies

Other typologies of command styles are, of course, possible and may be more relevant to particular C2 analyses. Human behaviour experts (e.g., cognitive and organisational psychologists and anthropologists) should be recruited to the project team if novel categories are developed. However, the most important issue when dealing with command style is whether it is included in the analysis at all. The C2 research related hypotheses under analysis should dictate the forms of command style examined, and the impact of command style should only be examined when it appears necessary to answer the analytic question(s) of interest. Otherwise it tends to introduce a level of complexity that may confound the other analyses underway.

Organisational Factors

All elements of the C2 system are ultimately related to one another. The linkage between human and organisational issues, however, is particularly direct and close. Properly done, organisational design reflects the interaction among the tasks to be done, the people available to perform them, and the systems or tools that support those people. Hence, the “proper” organisation of C2 depends in large measure on the capabilities, training, and experience of the people in the C2 system.

Organisation is a serious subject in military analyses. For centuries the military has sought to implement unambiguous relationships and responsibilities. Unity of command is a central principle of war. When it has been lost or comes into question, as in OOTW, the professional militaries of the world have found themselves very uncomfortable. Fortunately, military organisational issues are driven by a fairly small and finite list of principles. Assessment teams asked to work on C2 issues can use the known list of factors as a checklist about organisational differences to determine whether they need to build organisational matters into their research designs. This includes the issue of informal relationships that may have evolved in order to overcome organisational deficits and thus streamline day-to-day operations. In fact, an organisational design and command style that are supportive of building informal relationships may provide the flexibility for efficiently handling the manifold demands facing commanders in OOTW. Also, it should be recognised that organisational implications which are perceived as detrimental to the interests of the affected individuals and groups would inevitably jeopardise cooperation, technical and procedural improvements in C2 notwithstanding.

Organisational Differences

The principal differences between military organisations are related to structure, function, and capacity. Any change or innovation that can be introduced in a C2 organisation falls into one of these three categories, which therefore may be used to guide analysts when structuring a problem.

Structural differences include:

• The number of echelons or layers in the command structure;

• The span of control for nodes in the command structure;

• The pattern of linkages between those nodes (e.g., hierarchical, spokes of a wheel, multi-connected, networked);

• Permanent versus transitory organisational relationships; and

• Formal versus informal relationships.

Functional differences include:

• The distribution of responsibility: where functional activities are located (e.g., intelligence, logistics, command, civil-military cooperation [CIMIC]);

• The distribution of authority (ideally co-located with responsibility);

• The distribution of information;

• Functional specificity (e.g., fire support vs. infantry or close air support vs. defensive counter air) vs. general and integrated military capabilities (mission tailored task forces); and

• Degree of ambiguity in command relationships.

Differences in capacity are related to differences in:

• Personnel (e.g., quality, training, experience);

• Communications systems and architectures;

• Information processing systems and architectures; and

• Field training and operational experience.

All these dimensions can be modelled, some more easily than others. However, the assessment team’s challenge is to identify those organisational factors that are relevant to the C2 analyses underway. This issue must also be addressed knowing that organisational factors are interrelated: changing one may change others. For example, the decision to eliminate a level of hierarchy within a military organisation may have a profound influence on the span of control. Similarly, changing the distribution of information so that it no longer follows the chain of command may have profound implications for the ambiguity of command relationships. Similar effects can be expected when coalition operations involve ad hoc members.
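
To show how the structural dimensions listed above can be made computable, the Python sketch below holds a hypothetical command structure as a parent-to-subordinates mapping and derives the number of echelons and the span of control of each node:

```python
def spans_of_control(structure):
    """Span of control of every node that has subordinates."""
    return {node: len(subs) for node, subs in structure.items() if subs}

def echelons(structure, root):
    """Number of command layers from the root down to the lowest subordinate."""
    subordinates = structure.get(root, [])
    if not subordinates:
        return 1
    return 1 + max(echelons(structure, s) for s in subordinates)

if __name__ == "__main__":
    structure = {  # parent -> subordinates (hypothetical organisation)
        "JTF HQ": ["Land Component", "Air Component"],
        "Land Component": ["Brigade 1", "Brigade 2", "Brigade 3"],
        "Air Component": ["Wing 1"],
        "Brigade 1": [], "Brigade 2": [], "Brigade 3": [], "Wing 1": [],
    }
    print("echelons:", echelons(structure, "JTF HQ"))
    print("spans of control:", spans_of_control(structure))
```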

Treatment of Organisational Factors

Because of the large numbers of organisational variables that may be relevant to the analysis of C2 issues, they must be approached carefully and systematically. When possible, organisation theory expertise should be brought into the assessment team. Review of organisational issues should be treated as a two-step process guided by a hypothesis testing logic. The first step should assess whether any organisational variable is being manipulated directly. For example, a decision to move from warfare domain task forces1 to mission tailored task forces with air and land units planning and operating together under a joint commander would be seen as a direct manipulation of organisational factors and should be studied as such. The second step, a search for indirect effects of organisational factors, may be more difficult and will require that the assessment team use the list of possible factors as a checklist and think through whether they may be altered in a propositional (if, then) logic. An assessment team that posits a relationship between the C2 analysis and an organisational issue should be able to make a clear statement of the hypothesis and the causality anticipated. This will enable the research design to cover not only the gross effect anticipated, but also the underlying causal mechanism(s) that will be present if the proposition is correct. Adopting this hypothesis approach is also a safeguard against assuming that organisational issues are easily or well understood and can be treated by assumption. In fact, the organisational arena, like that of human factors, is one of the most difficult in C2 analysis and must be approached with care and rigour.

For example, the small group literature makes a clear prediction that multi-connected groups will be able to generate better answers to complex problems, but will take more time to do so than either hierarchies or star shaped groups. The causal mechanism in that theory is greater dialogue and the representation of more independent viewpoints. Moreover, these richer discussions are expected to take more time. All other things being equal, multi-connected groups that are found to generate better answers to complex problems should also engage in more dialogue and be found to have considered more information and/or solutions. Modellers who want to take advantage of this factor to explore alternatives to traditional hierarchical military decisionmaking must also include the negative features in their C2 models (e.g., demands for more time from already overburdened staffs and slower decisionmaking).

Roles

The concept of a role comes from sociology. A role is a set of behaviours expected by the self and others. For most military systems the roles of commander and key staff are well understood and arise from a combination of tradition, training, experience, and rational planning. Because of their origin, roles are often a convenient way of capturing the doctrine about responsibilities within the C2 system.

Roles can be used to capture “syndromes” or sets of related attributes within a C2 system. For example, in an object oriented program, different functional organisations and their leaders might be defined as having different attributes that reflect their decisionmaking responsibilities and the information they would receive or be able to obtain from the information network. When assessing new C2 systems, analysts will often need to search for potential role gaps or role overlaps. Either of these would be dysfunctional in military operations over time. Changes in information structures also have considerable potential for creating problems of this type.

Role gaps, role overlaps, and even role conflicts may be more of a problem in OOTW because military organisations have to assume new roles that are not embedded in their tradition and experience and that require them to cooperate with a variety of non-military actors and organisations, each pursuing its own objectives and often with ill-defined roles of its own. Thus, rather than being a source of friction, changing or adapting existing information structures may be part of the decision problems that the C2 system must address in order to eventually bring about a secure and stable situation in the theatre of operations through overarching “unity of command.”

Human and Organisational Issues and Technology

The foregoing discussion of human and organisational issues revealed that human performance and behaviour, as well as the organisational design of C2 systems, and therefore the effectiveness of C2, depend on the available communication and information technology. In fact, a C2 organisation resembles a system composed of interacting human, organisational, and technological elements, as depicted schematically in Figure 6.1, adapted from Mandeles et al. (1996).

Figure 6.1 is meant to illustrate that the character and performance of a C2 system may change as any one of the elements in these three categories changes. Moreover, since the human, organisational, and technological elements are closely linked in most cases, optimising each one of them at a time under ceteris paribus assumptions for the other two rarely results in an efficient C2 system (Schot and Rip, 1996).

Figure 6.1. Representation of the C2 System, based on Mandeles et al. (1996)

In particular, the assessment of the human-technology relationship is a critical requirement that implies challenges that can be both social and technical in nature. Without adapting human thought and behaviour patterns and organisational structures, it may be impossible to exploit the potential of new technology. On the other hand, the performance of new communication and information technologies may exceed human capabilities for processing information (information overload) and thus result in a degradation of human performance and of the overall effectiveness of a C2 system, the improvement of technical parameters notwithstanding.

The challenges of adapting technological capabilities to meet human capabilities and the requirements of the social interaction processes of commanders and staff, and of non-military actors and populations in OOTW, require socio-technical assessment approaches of the kind that evolved in the fields of science and technology studies (STS), technology assessment (TA) (Rip, 1995), and constructive technology assessment (CTA) (Van de Poel, 1998).

Integrated Analyses

Because the issues arising from human and organisational factors are so complex and so tightly coupled, C2 assessment teams often use integrating tools to define the key dimensions relevant to their analyses and explore the relationships between and among them. Integrating tools are those that use selected key factors with powerful influence to cut through the clutter and detail implied by trying to study everything and concentrate instead on the most important elements in the problem. These key driving factors are used to conduct a simpler analysis that can then be augmented by sensitivity analyses and analytic excursions to ensure that the problem has been fully and properly understood.

For example, Figure 6.2 has been used to illustrate the relationship between the time available to make a decision, the complexity of the decision, and the uncertainty of the information available about the situation. These three factors also reflect the risk or opportunity inherent in a military situation. The more complex a situation, the less time available, and the greater the uncertainty of the available information, the greater the risks (and opportunities) present.

While each of these three dimensions can be examined independently, considerable insight can be derived from examining them as a related set. This examination normally begins as an exercise in hypothesis generation, but can, as research is accomplished, be converted into a component of a knowledge base. That is, as evidence confirming or calling for revising the key hypotheses is generated, the graphic becomes a way of conveying known relationships and generating new propositions about regions or subspaces that have not yet been examined empirically.

In some sense, one corner of this cube represents the worst of all C2 worlds: almost no time available, an enormously complex problem, and considerable uncertainty about the situation. Past research suggests that when these conditions exist the decisionmaker has no choice except to use “best professional judgement” to match the operational situation to some class of well-understood military situations and act accordingly (Hayes, 1994). However, decisionmaking theory also indicates that the wise commander will take short-term actions designed to create more time and/or more information and thereby relocate the problem to a “better” portion of the space. A “risk averse” commander will clearly attempt this transformation of the situation. However, a more risk oriented leader may attempt to cut through the fog of war with decisive action.
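
As a purely illustrative aid (not a validated model), the three drivers can be scored on a common 0-to-1 scale and used to locate a situation within the cube; the thresholds below are arbitrary assumptions:

```python
def locate_in_cube(time_pressure, complexity, uncertainty):
    """Scores are assumed to lie in [0, 1]; higher means less time available,
    more complexity, and more uncertainty. Thresholds are arbitrary choices."""
    if min(time_pressure, complexity, uncertainty) > 0.7:
        return "worst corner: apply best professional judgement, act to buy time or information"
    if max(time_pressure, complexity, uncertainty) < 0.3:
        return "benign corner: decompose the problem and plan deliberately"
    return "intermediate region: combine deliberate planning with contingency branches"

if __name__ == "__main__":
    print(locate_in_cube(0.9, 0.8, 0.8))
    print(locate_in_cube(0.2, 0.1, 0.2))
```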

Figure 6.2. Decisionmaking Drivers

The opposite corner of this analytic space, defined as ample decision time available, limited complexity, and low uncertainty, provides the ideal situation for decomposition of the problem and development of “optimal” military plans. Many innovations in C2 systems are designed to move the situations facing commanders of friendly forces toward this region. Indeed, Van Creveld’s analysis of C2 defines it as a search for greater certainty (Van Creveld, 1985).

This cube also emphasises an imperfectly understood dimension of C2 systems and the decisionmaking they imply. That dimension is the speed at which the situation is changing (the pace of operations) in relation to the time required to make and implement a military decision (the speed of the C2 system). Where the speed of the C2 system is faster, proactive decisions are possible. When the pace of operations is faster, decisions must be reactive. The commander who is capable of making decisions that transform the operation from reactive to proactive is rare and enjoys vision not only about what is, but also about what is possible.

This key relationship (pace of operations to speed of the C2 system) is the driving force behind the observe, orient, decide, act (OODA) loop and the resulting guidance to seek to “turn inside the enemy’s C2 loop.” However, C2 analysts must constantly discipline their analyses away from assuming that speed alone is a desirable attribute of a C2 system or organisation. Making and implementing bad decisions quickly will result in more rapid failures, not military success. As is discussed in detail in Chapter 5: Measures of Merit, multiple dimensions of performance need to be analysed whenever C2 systems are assessed. However, this requirement to look at multiple dimensions in order to assess C2 does not obviate the value of performing integrated analyses of human factors and organisational issues.

Conclusions

• Issues of human performance and behaviour should be incorporated in models used to analyse issues that require human activity, either in the form of performance parameters or appropriate sub-models of behaviour.

• Decisionmaking that is rule- or algorithm-based can be modelled directly, but error rates should be estimated if humans are involved in the relevant decisionmaking.

• Simple decisions are programmable (with appropriate error rates), but also require estimates of when the decision would be made.

• Complex decisions can be treated with “human in the loop” tools and techniques, but new techniques are being developed and applied (see Chapter 5).

• Style of command and decisionmaking should be considered in C2 analyses that focus on specific decisionmaking.

• Organisational issues can be decomposed into constituent elements for analysis.

• Hypotheses or propositional structures are often the most useful approach to human factors and organisational issues.

• Integrated analyses involving roles or selected aspects of a problem space often provide a cohesive approach to the complexity inherent in human factors and organisational issues.

• Research in organisations and human factors is expanding, and analysts are advised to consult the available literature. Experts in this area should be included on the interdisciplinary C2 assessment teams.

• Operational knowledge of human issues is still weak in many areas. Systematic effort is required for organising a consistent program of experiments on human issues.

Recommendations

• Human and organisational issues are not closed topics and should be considered early in the process of C2 analysis, when the problem is formulated and a strategy is adopted.

• Test the impact of established decisionmaking rules that reflect existing guidance as part of an analysis. This is an indispensable prerequisite for models.

• The assessment team should explore the quality of data, information, or knowledge used to drive the automatable decisionmaking process.

• Early on, the assessment team should establish working relationships with the potential subjects of the study, but be careful not to allow this to introduce a bias.

• Human factors and organisational expertise should be included in all C2 assessment teams, at least until a decision can be made that they are not major elements of the analysis.

• Separate human performance issues (e.g., stress and fatigue) from cognitive issues (e.g., decisionmaking) when possible, but recognise that they interact.

• Use a checklist (Annex D) and hypothesis-testing logic for reviewing human and organisational issues. Remember that human and organisational issues may interact, as do the structural, functional, and capacity arenas of organisations.

• Integrated analytic tools that focus on key variables that drive human factors and organisational issues will often prove useful in simplifying analysis.

• Sensitivity analyses are particularly important when working with human factors and organisational issues.

• Experiments for testing hypotheses on human behaviour underlying the C2 analysis are strongly recommended.

Chapter 6 Acronyms

ATS – Norman’s Activation Trigger Scheme

C2 – Command and Control

CIMIC – Civil-Military Cooperation

CISS – Center for Information Systems Security

CTA – Constructive Technology Assessment

DISA – Defence Information Systems Agency

DSTL – Defence Science and Technology Laboratories

MoM – Measures of Merit

OA – Operations Analysis

OODA loop – Observe, Orient, Decide, Act

OOTW – Operations Other Than War

OR/OA – Operations Research/Operations Analysis

STS – Science and Technology Studies

TA – Technology Assessment

TTP – Tactics, Techniques, and Procedures

VTC – Video Teleconference

Chapter 6 References

Alberts, D. S. (1996). The unintended consequences of information age technologies: Avoiding the pitfalls, seizing the initiative. Washington, DC: National Defense University.

Alberts, D. S., & Hayes, R. E. (1995). Command arrangements for peace operations. Washington, DC: National Defense University.

Baeyer, A. V. (2000). Human Factors in militärischen Führungsprozessen für die Simulation [Human factors in military command processes for simulation] (B-IK 3103/01). München, Germany: Industrieanlagen-Betriebsgesellschaft (IABG). Available from http://www.unibw-muenchen.de/campus/itis/projekte.html

Baeyer, A. V. (2001, June). Task analysis, training and simulation for operations other than war (OOTW). Paper presented at the North Atlantic Treaty Organisation (NATO) Specialists Meeting on Human Factors in the 21st Century, Paris.

Darilek, R. (2001). Measures of effectiveness for the information age army. Santa Monica, CA: Rand Arroyo Center. Available from http://www.rand.org/publications/MR/MR1155/

Dupuy, T. N. (1979). Numbers, predictions and war: Using history to evaluate combat factors and predict the outcome of battles. Indianapolis, IN: Bobbs-Merrill.

Deutsch, R., & Hayes, R. E. (1990). Personality and social interaction of effective expert analysts: The issue of cognitive complexity [technical paper].

Deutsch, R., & Hayes, R. E. (1992). The war story in warfighting situation assessment and decisionmaking [technical paper].

Dodd, L. (1995). Adaptive C2 modeling for RISTA effectiveness studies. Second International Command & Control Research & Technology Symposium Proceedings. Washington, DC: National Defense University.

Hayes, R. E. (1994). Systematic assessment of C2 effectiveness and its determinants. 1994 Symposium on C2 Research and Decision Aids. Monterey, CA: Naval Postgraduate School.

Hofmann, H. W., & Hofman, M. (2000). On the development of command and control modules on battalion down to single item level. In New information processing techniques for military systems (pp. 8.1–8.12). North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Meeting Proceedings 49. Available from ftp://ftp.rta.nato.int/PubFulltext/RTO/MP/RTO-MP-049/MP-049-$$TOC.pdf

Jaques, E. (1976). A general theory of bureaucracy. Portsmouth, NH: Heinemann Educational Books.

Keeney, R. L., & Raiffa, H. (1976). Decisions with multiple objectives: Preferences and value tradeoffs. Chichester, UK: John Wiley and Sons Ltd.

Long term scientific study SAS-017 on human behaviour representation [Final Report, Chapter 2]. (n.d.). Available, with clearance, from http://www.rta.nato.int/RDP.asp?RDP=RTO-TR-047

Mandeles, M. D., et al. (1996). Managing “command and control” in the Persian Gulf war (ISBN 0-275-952614). Westport, CT: Greenwood Publishing Group.

Moffat, J. (2000). Representing the command and control process in simulation models of conflict. Journal of the Operational Research Society, 51(4), 431-439. Available with registration at http://www.palgrave-journals.com/jors/

Moffat, J. (2002). Command and control in the information age: Representing its impact. Norwich, United Kingdom: The Stationery Office. Available from www.tso.co.uk

National Research Council. (1998). Modeling human and organizational behavior: Application to military simulations (ISBN 0-309-06096-6). Washington, DC: National Academy Press. Available from http://www.nap.edu

Noble, D., & Flynn, W. (1993). Recognized primed decisions (RPD) tool concept document. Vienna, VA: Engineering Research Associates.

Noble, D., Flynn, W., & Lanham, S. (1993, June). The TADMUS recognized primed decisions (RPD) tool: Decision support for threat assessment and response selection. Paper presented at the 10th Annual Conference on Command and Control Decision Aids.

Norman, D. A. (1981). Categorization of action slips. Psychological Review, 88(1), 1-15.

Perry, W. L., & Moffat, J. (1997). Developing models of decisionmaking. Journal of the Operational Research Society, 48(5), 457-470. Available with registration from http://www.palgrave-journals.com/jors/

Perry, W. L., & Moffat, J. (1997). Measuring consensus in decisionmaking: An application to maritime command and control. Journal of the Operational Research Society, 48(4), 383-390. Available with registration from http://www.palgrave-journals.com/jors/

Perry, W. L., & Moffat, J. (1997). Measuring the effects of knowledge in military campaigns. Journal of the Operational Research Society, 48(10), 965-972. Available with registration from http://www.palgrave-journals.com/jors/

Van de Poel, I. (1998). Changing technologies: A comparative study of eight processes of transformation of technical regimes. Dissertation. Enschede, the Netherlands: Twente University Press.

Rip, A. (1995). Introduction of a new technology: Making use of recent insights from sociology and economics of technology. Technology Analysis & Strategic Management, 7(4), 417-430.

Schot, J., & Rip, A. (1996). The past and future of constructive technology assessment. Technological Forecasting and Social Change, 54, 251-268. New York: Elsevier Science.

Schultz, J. V. (n.d.). A framework for military decisionmaking under risks [Prospect theory related to military decisionmaking].

Sherrill, E. T., & Barr, D. R. (1996). Exploring relationships between tactical intelligence and battle results. Military Operations Research, 2(3). Available to order from http://www.mors.org/publications/mor/vol02.htm

Van Creveld, M. (1985). Command in war. Cambridge, MA: Harvard University Press.

Van Lente, H. (1993). Promising technology: The dynamics of expectations in technological developments. Unpublished doctoral dissertation, Twente University Press, The Netherlands. Available from [email protected]

1 Land warfare with one commander, air warfare with another, each reporting directly to a joint commander.

CHAPTER 7

Scenarios

Before beginning operations you must, without any indulgence or self-deception, examine objectively every step that the enemy might undertake to thwart your plan, and consider in each conceivable case what means are open to you to fulfill your goal. The more you anticipate the difficulties in advance, the less surprised you will be should you encounter them during the campaign. Besides, you have already thought about these obstacles deliberately, and with composure you have perceived the means of avoiding them, so nothing can surprise you.

—Frederick the Great

Plans that assume the likelihood of one particular world run the risk of being seriously wrong.

—James A. Dewar, Carl H. Builder, William M. Hix, and Morlie H. Levin, from Assumption-Based Planning: A Planning Tool for Very Uncertain Times

Purpose of Scenarios

The report of the NATO Panel 7 Ad Hoc Working Group on the Impact of C3I on the Battlefield gave extensive consideration to the role of scenarios in operations analysis (OA). This chapter builds on that material to discuss the role of scenarios in Command and Control (C2) analysis. Figure 2.2 in Chapter 2 shows the role played by scenarios in the overall C2 assessment process. The analysts craft a set of scenarios to provide the context or environment for the conduct of the operational analysis. The scenarios bound the arena of the analysis and are used by the analyst to focus the analysis on central issues.

Definitions

These definitions are of particular relevance to this Code of Best Practice (COBP):

Scenario

A description of the area, the environment, means, objectives, and events related to a conflict or a crisis during a specified time frame, suited to the study objectives and the problem analysis directives.

As described in Chapter 4, scenarios consist of four elements: a context (e.g., a characterisation of a geopolitical situation), the participants (e.g., the intentions and capabilities of blue, red, and others), the environment (e.g., natural, such as weather, and manmade, such as mines), and the evolution of events in time. In C2 assessments, the purpose of scenarios is to ensure that the analysis is informed by the appropriate range of opportunities to observe the relevant variables and their interrelationships.
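
A minimal sketch of these four elements as a data structure, which makes it straightforward to derive vignettes as excursions from an approved base scenario, is shown below; the field contents are invented for illustration:

```python
from dataclasses import dataclass, replace
from typing import Dict, List

@dataclass
class Scenario:
    context: str                  # geopolitical situation
    participants: Dict[str, str]  # actor -> intentions/capabilities summary
    environment: Dict[str, str]   # natural and manmade conditions
    events: List[str]             # evolution of events over time

if __name__ == "__main__":
    base = Scenario(
        context="regional crisis under a UN mandate",
        participants={"Blue": "coalition stabilisation force", "Red": "irregular forces"},
        environment={"weather": "monsoon season", "manmade": "mined supply routes"},
        events=["deployment", "escalation at a checkpoint", "negotiated ceasefire"],
    )
    # A vignette as a small excursion from the approved base scenario.
    vignette = replace(base, events=base.events + ["media incident at the checkpoint"])
    print(vignette.events)
```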

Approved Scenario

In order to support the analysis, the use of an approved scenario is more appropriate than analysing a current situation or future plans; the impact of all the elements is simpler to analyse.

In some countries these scenarios are mainly developed by a Strategic Committee and are designed to meet the government strategy. The first step is to implement a steering group, composed of representatives of the government and members of the Armed Forces, which has the responsibility to describe all the commitments or the precise regions in which any major event may have an impact on foreign policy or on the economy. This basic work is then forwarded to a larger group or to the next military level, which shares the responsibility for implementing the main recommendations in a more strategic field. These scenarios specify the elements that must be kept in mind in the generic planning process or in the operational studies in relation to the use of military forces. These scenarios also describe the global threat, the geographic areas, the political and military objectives, and the level of force commitment.

With this basic approach done, the analyst or his military counterpart has only to adapt those guidelines, to compare his study to the scenarios, and to propose the more relevant alterations to his sponsor and his assessment team (see the Assessment Team paragraph in Chapter 2).

At this stage the decision to keep this scenario, to diverge, or to choose another one will be discussed among the stakeholders. The products of this process are called approved scenarios.

After that, at any time the analysts may develop vignettes using the approved scenarios as a base in order to focus on a specific issue. These vignettes can be used as small scenarios to explore a particular topic. The use of prepared scenarios must not be considered a limitation for customers or analysts, because all the data can be adapted by the customer or the assessment team; the only limitation is to avoid the re-use of this particular scenario for a study not in close relation with the original one.

Planning Scenario

A planning scenario is one in which these elements are defined:

• Time frame of the analysis;

• Geographic area;

• Meteorological environment;

• Political, historical, economic, and social context;

• Mission objectives and constraints;

• Level of threat;

• Friendly forces (and links between the members of the coalition);

• Adversary forces (e.g., enemy order of battle);

• Neutral or uncommitted forces;

• Non-combatants (non-governmental organisations [NGOs], international organisations, etc.) and other relevant actors; and

• Media (when relevant).

Operational Scenario

An operational scenario contains additional details, especially with respect to threats, orders of battle, tactics, rules of engagement, courses of action, deployment, end state, and reserves.

Vignette

The term “vignette” is sometimes used for a scenario that is not approved. The term is used primarily for smaller scenarios, particularly as excursions from the main scenario.

Role of Scenarios in C2 Analysis

In general, the ideal OA is scenario independent. All relevant factors can be identified and dealt with empirically and algorithmically across a range of military contexts. However, C2 involves human behaviour, organisations, missions, and other complex phenomena. Human behaviour is very difficult to put into equations (see “Human Decisionmaking” in Chapter 6). There is no single linear dimension for organisations or human issues. Moreover, military missions do not form simple dimensions. Therefore, for most C2 analyses, the context must be defined. This is the role for which the analyst defines the scenarios.

The formulation of the original problem dictates the contents of the scenarios. There are no overall scenarios that are independent of a specific problem. Scenarios are never truly generic, but rather are customised, if only by the assumptions built into them. The boundaries of the scenario space should be defined in part by the issues unique to the problem under analysis.

Organisational issues include the involvement of various levels of military and non-military hierarchies, including different command levels. This requires that scenarios accommodate analysis across different echelons of command. Information processing and the characteristics of information must also be accommodated. Human factors include the decisionmaking process and supporting staff activities.

The analyst will need to design or select scenarios to address C2 under a broad range of circumstances. This taxonomy of C2 analysis might include:

• Defence planning;

• Force structure and organisation;

• Mission analysis;

• Doctrine/tactics development;

• Cost-benefit/effectiveness analysis;

• Training and education;

• Balancing C2 systems and weapon/sensor systems;

• C2 system procurement, which will often require more detailed, task specific scenarios to cover the range of relevant system uses; and

• Non-combatant actors’ access to C2 systems and integration into the information flow, typically through liaison officers, but increasingly through a variety of other means.

In essence, the role of a scenario is to define a set of conditions and restrictions to enable “good” analysis as well as to create a structure within which the results of the analysis can be understood and interpreted.

Understanding and Interpreting the Results of Operational Analysis

The analyst uses scenarios to understand and interpret the value of OA study results for the focus of the analysis. The scenarios provide the context in which the C2 system will be assessed. This context should reflect the scenarios envisaged by the originator of the requirement for the system. Often artificial constraints must be introduced into the scenarios, due to cost considerations, to properly focus the analysis, or both. The scenario developer must have an appreciation of the objectives of the simulation, experiment, or exercise analysis plan in order to determine the artificial constraints necessary to facilitate the analysis. The analyst needs to be aware of scenario assumptions and artificial constraints.

There are several essential questions in C2 analysis that should be addressed in the scenario considerations:

• The operational benefits of C2, to be translated into the definitions of Measures of Policy Effectiveness (MoPE) and Measures of Force Effectiveness (MoFE);

• Required or desired performance thresholds and nominal Measures of Merit (MoM) values; and

• The impact of an improved volume, accuracy, and/or quality of information on the final outcome.

Developing and Specifying Scenarios

Prerequisites in Scenario Definition

Several prerequisites are essential before using scenarios for C2 analysis:

• Approval: the analyst should strive for the creation of a family of approved scenarios. In creating a family of approved scenarios, which reflect the mission objectives and force capabilities and cover all significant warfare areas, the analyst facilitates the scenario development process to a great extent, because references to basic assumptions and conditions can be made. This will also increase the validity of the analysis in the eyes of the client and facilitate comparison of the results from different studies and analyses that use the same approved scenarios;

• Breadth: a scenario should reflect those factors that are hypothesised to have a significant impact on C2 issues;

• Capability: a scenario should stress C2 capabilities, including human and organisational factors (military and/or civilian) where appropriate; and

• Credibility: scenarios should include logical assumptions about the environment under analysis.

Scenarios should represent plausible real world situations. The synthetic scenario environment should be consistent across OA studies. The scenarios will gain credibility if a broadly based scenario team is involved in the process from the outset. This team should include a variety of perspectives and expertise, such as:

• OA analysts for defining the required scenario information, to avoid biasing the analysis by selecting an inappropriate scenario space, and for working out the process of framing the scenario space;

• Defence concept planners to propose options and highlight the critical factors;

• Policy makers to ensure that the strategic decision points and the alternative options for each of these decision points are clear and consistent with Defence Policy; and

• Subject matter experts to credibly explore the range of possibilities (scenario space) and foster discussions.

C2 Organisation Infrastructure and Operating Environment

Organisation infrastructure and environment are often pre-set conditions and not the subject of the study. They include C2 concepts of operation, decision hierarchy of the units under consideration, degree of technological competence relative to that of the adversary, requirements or objectives placed upon the system in terms of speed, accuracy, flexibility, etc., and the impacts of terrain, weather, and adversary activities.

C2 Processes

The scenarios need to provide for the realistic execution of the C2 processes. These include the span of control of the various military command levels and civilian authorities, information management schemes, information flows, the elements of the decision cycle, the decision processes (course of action development, planning, directing), and the communications processes and capabilities (data update rates, throughput, reliability, accuracy, etc.). All too often these issues are characterised by simple performance indicators and not examined in detail. They may also be an important subject of the study. These factors need to be explicitly built into the scenarios.

C2 Systems

Characteristics of C2 systems are directly related to system improvements. They include system performance parameters, command and control information systems (CCIS), data availability, intelligence functions (fusion, correlation, aggregation, etc.), surveillance, targeting and acquisition (STA), communications systems, throughput, and so forth.

Human Factors

C2 studies are complex in nature. One of the complicating factors is the involvement of human beings and their interpretation of a situation, order, or rule of engagement. These factors can be covered by the “aggregation/de-aggregation” phenomena in the command chain. Human factors have to be included in the analysis and modelling activities, but guidelines on how to integrate the “human in the loop” are partly defined in the scenarios. (Chapter 6—Human Decisionmaking discusses these in some detail.)

Miscellaneous

C2 studies usually cover various levels of hierarchy. However, the nature of C2 issues does not materially change for the various command levels. The analyst may need to perform a cost/benefit analysis on the inclusion of lower level C2 issues in a closed simulation model. The analyst needs to consider to what extent these issues (e.g., performance of a logistic information system) need to be converted to enabling factors (e.g., sustainability, operational delays) at the higher levels.

These issues are mentioned to illustrate that in the scenario definition a great deal of attention is required to ensure that the scenario enables the proper C2 issue to be addressed in problem definition. The elements are often dependent on each other. For example, some shortcomings in C2 systems can be compensated for by alterations in C2 processes. Similarly, inefficiencies in C2 processes can be offset by an adaptable C2 organisation. The relationships between these issues should be recognised and taken into account in the scenarios.

Approach to Scenario Development

This section describes a framework for the definition of a scenario. Then, based on this framework, some specific aspects of actually using scenarios for C2 analysis are addressed.

Scenario Structure

The general scenario framework developed from the NATO Panel 7 Ad Hoc Working Group on the Impact of C3I on the Battlefield has been adopted (Figure 7.1).

At the first two levels, a description is given of the external factors and the capabilities of the actors, including national security interests; the political, historical, and military situation; and the assumptions, boundary conditions, and limitations related to adversaries, threats, risks, coalition partners, warfare domains, etc. Very often a reference to an approved scenario will suffice.

At the third level, the mission environment is defined. Whether it is a generic, virtual geographic environment or a specific geographical area is not important. What is essential is that the mission environment be addressed.


Figure 7.1. The Scenario Framework

The intermediate level is the most challenging one. The actual military problem has to be projected onto the mission, the military forces and capabilities, the civilian capabilities, and the resources available. A scenario must be developed by coherently aggregating a number of components or dimensions with their attached values, taking into account the problem formulation. It should address at least these components:

• Geopolitical situation, including historical aspects;

• Geographical area;

- Availability/usability of civil infrastructure;

- Terrain and climate;

• Political/economic/military objectives;

- Level of violence, type of warfare areas, and preparation times;

• Mission context and objectives;

- Mission tasks and goals;

- National contributions and roles within the coalition;

- Order of battle;

- Doctrines, procedures, (range of acceptable) rules of engagement, and concepts of operation;

- Temporal factors (e.g., anticipated duration of operations);

- Desired end states;

• Opposition/threat/risks;

- Adversary forces and their organisation;

- Other actors (neutral and uncommitted forces, non-combatants, refugees, IOs, NGOs, Red Cross, etc.);

- Level of threat and risk;

- C2 structures;

- Interaction between friendly, adversary, and other information systems; and

- Assumptions/hypotheses/axioms about level of technology, impact on information defence, etc.

The actual military problem will be placed in the context of the friendly and adversary military forces involved, e.g.:

• Force organisation, C2 structure, force components;

• Doctrines, tactics, rules of engagement;

• Courses of action;

• Information systems; and


• Logistics.

The framing of scenarios begins with the identification of the key dimensions relevant to the problem being addressed. The search for these dimensions is delicate and requires thorough reflection, for instance, on the results of a structural analysis. Once identified, each key dimension or factor must be characterised by a range of possible values or sectors, which enables the set-up of the sector-factor matrix. This sector-factor matrix is then analysed through a morphological analysis (construction and reduction of the morphological space or space of “all possible scenarios”), which provides the user with a set of appropriate scenarios. The scenarios are dependent upon the problem and the objective of the study and can range from generic to very specific. The analyst will select the level of detail required to drive and focus the model.
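To make the sector-factor idea concrete, the sketch below enumerates a morphological space and reduces it with a consistency rule. It is purely illustrative and not part of the COBP: the factors, sectors, and the consistency judgement are hypothetical stand-ins for what a scenario team would supply.

```python
from itertools import product

# Hypothetical key dimensions (factors) and their possible values (sectors).
factors = {
    "level_of_violence": ["low", "medium", "high"],
    "coalition_size": ["single nation", "small coalition", "large coalition"],
    "civil_involvement": ["minimal", "significant", "dominant"],
    "warning_time": ["days", "weeks", "months"],
}

def is_consistent(scenario):
    """Hypothetical consistency rule used to reduce the morphological space:
    discard combinations the scenario team judges implausible."""
    if scenario["level_of_violence"] == "high" and scenario["warning_time"] == "days":
        return False  # example of a combination ruled out for this problem
    return True

# Construct the full morphological space (the space of "all possible scenarios")...
names = list(factors)
space = [dict(zip(names, values)) for values in product(*factors.values())]

# ...and reduce it to the consistent subset from which study scenarios are selected.
reduced = [s for s in space if is_consistent(s)]
print(f"{len(space)} possible scenarios, {len(reduced)} retained after reduction")
```

In practice the consistency judgements come from the scenario team rather than from code, but enumerating and pruning the space in this way makes the coverage of the retained scenarios explicit.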

Using Scenarios in C2 Assessment

It has already been noted that a well-formulated OA problem definition, guidelines, and directives for how to approach the analysis should accompany a scenario. Emphasis in this COBP is given to C2 elements in relation to:

• Mission scope: as stated before, one of the characteristics of C2 is that it cannot be studied in isolation. C2 is an integrating and enabling factor. As a consequence, there is a tendency to consider the entire mission, covering all the arenas connected to it (from logistics to manoeuvre, from artillery support to close combat, from security/police issues to refugee management). Therefore, the typical mission scope for C2 analysis will be broad;

• Levels of hierarchy: the C2 chain is not limited to a specific hierarchical level; information and command flows run from the lowest levels to the higher echelons and vice versa. As a consequence, there is a tendency to cover a wide set of hierarchy levels in considering a C2 problem. Single-layer analyses do not represent the dynamics of military problems adequately to answer most C2 issues; and

• Aggregation/disaggregation: the aggregation of data flows, data fusion in support of intelligence processes, merging C2 items to more abstract levels, etc. is difficult but manageable. The process of deleting, merging, and combining information is reasonably well understood. The integration of soft factors (e.g., human and organisational) is less well understood and makes the problem more difficult to simulate, study, or analyse. In particular, the effects of interactions between functional groups and echelons must be considered when decomposition is undertaken for analytic purposes.

Mission Scope versus Levels of Hierarchy

Experience with prior modelling efforts indicates that the scenario developer needs to understand the relationships between the scope of the mission, the hierarchy assigned to conduct the mission, and the level of detail at which the echelon operates. In general, as shown in Figure 7.2, the broader the mission scope, the higher the echelon required to conduct the mission. Also, higher command levels tend to use information at higher levels of aggregation, or less detail, than lower levels. The scenario developer should attempt to operate in the shaded area of the diagram, matching the echelon levels to the proper aggregation level and mission scope.

Figure 7.2. Intersections in Hierarchy

In many complex problems, the analyst is required to subdivide the problem into smaller parts, perhaps even using different models or model federations to represent different levels. The scenario developer should work in conjunction with the analyst to ensure that the hierarchical intersections and interactions (e.g., organisation and infrastructure, C2 processes, and C2 resources) are properly represented, that the models are consistent, and that their inputs and outputs are properly linked. Figure 7.3 graphically depicts this relationship, with the rectangles X, Y, and Z representing different models.

Figure 7.3. Segmentation Hierarchy Range

Aggregation/De-Aggregation: Non-Causality Between C2 Issues at Different Levels

Analysis of C2 issues generally requires assessing the effects of events and actions across command levels to determine causality between actions at one echelon and events at another. For example, the analyst will want to know if the capability to make faster decisions at battalion level has an impact at brigade level or higher. Some C2 items have an impact throughout the entire hierarchical range, some affect only one other level, and some are purely local, i.e., level-specific. The scenario developer will need to be aware of the analyst’s requirements in this area in order to design the proper linkages between events.


Additional Aspects

Additional areas of consideration include:

• Specification: the purpose of the C2 analysis will influence the kind of scenarios to be used. There are no universal generic scenarios for C2 analysis. Some nations have “validated” scenarios that can be taken off the shelf and modified to support particular analyses;

• Merging mission operations areas: mission areas for the various levels of intensity of conflict and various levels of civilian involvement are not discrete, and it may be necessary to include elements of more than one type in a scenario;

• Decomposition: sometimes it might be useful to decompose a scenario into two or more detailed scenarios that each deal with a certain subset of C2 issues. This is more or less analogous to the decomposition into one or more mission operations areas. It may be necessary to add vignettes that look at greater detail so that all C2 issues of interest are covered and the whole range of relevant OOTW is explored;

• Adversary-friendly interaction: the attention given to adversary and friendly C2 processes should be balanced in those cases where adversary activities are germane to the problem, especially if counter C2 (information defence) is part of the analytic focus;

• Assumptions and guidelines: the scenarios are part of the process in developing an analysis (see Figure 2.3 in Chapter 2). In the scenario phase, all scenario assumptions, guidelines, and boundaries for the study should be revisited. Each study is executed within the framework of the scenario, and therefore the study findings are only valid within the limitations of the various assumptions and artificial constraints of the scenario;

• Traceability: analysts should understand which scenario assumptions and/or boundary conditions are driving factors in the analysis. A detailed description of past use of scenarios should be maintained on a national level in order to avoid duplication. Such a repository should also contain Verification, Validation, and Accreditation (VV&A) information on the scenarios; and

• Awareness: create awareness of the robustness of the overall conclusions and decisions and be aware of the degrees of uncertainty in the scenario.

Conclusions

• Using a scenario for C2 analysis is only one part of a larger analytical methodology. The context provided by the scenario impacts other areas, and the scenario in turn is affected by those same areas;

• Six prerequisites should be in place before using a scenario for C2 analysis:

- It should be approved for the assessment;

- It should reflect the factors that have significant impact on C2 needs;

- It should stress C2 issues;

- It should be militarily credible;

- It should be credible in terms of civil-military objectives;

- It should facilitate the design process;

• At least three C2 elements should be reflected in a scenario in order to make it useful for C2 analysis:

- The C2 organisation and infrastructure, including human issues;

- The C2 processes;

- The C2 systems;

• Scenario guiding directives should indicate how the scenario has been used in a hierarchy of scenarios (interpretation of input and output events, etc.);

• The actual C2 analysis problem usually will be broader in scope than OA will allow. Hence scenario analysis in combination with military and civilian judgement (including lessons learned) must bridge this gap; and

• Analysts need to use multiple scenarios; no single scenario is sufficient.

Recommendations

Practice

• Organise a set of scenarios and vignettes that allow the analysis to cover or sample the interesting problem space for the C2 analysis;


• Create a (national) base of approved scenarios and vignettes reflecting the civil-military objectives within the national hierarchy of operations and thus the required spectrum of military missions, including OOTW capabilities;

• Explicitly identify and describe the scenarios prior to the execution of a study. However, it might be necessary to revisit the scenario definition during the conduct of the study;

• Information and hypotheses on threats, adversary forces, and non-combatants should be addressed in the scenario;

• Explicitly identify the C2 aspects under consideration within the problem definition; and

• During the analysis, the key scenario assumptions should be identified and documented.

Challenges

• Standards for judging the applicability and accreditation of (existing) models should be developed; and

• For coalition C2 assessments, the scenarios should be developed or adapted by teams including representatives from all participating nations.

Chapter 7 Acronyms

C2 – Command and Control

CCIS – Command and Control Information Systems

COBP – Code of Best Practice


IO – International Organisation

MoFE – Measures of Force Effectiveness

MoM – Measures of Merit

MoPE – Measures of Policy Effectiveness

NGO – Non-Government Organisation

OA – Operational Analysis

STA – Surveillance, Targeting and Acquisition

VV&A – Verification, Validation, and Accreditation

Chapter 7 References

Almèn, A., DeBayser, O., Eriksson, P., Jarlsyik, H., Lascar, G., & Ritchey, T. (1999). Models for evaluating peace support operations. A FOA—DGA/DSP/Centre d’Analyse de Défense Joint Workshop.

Coyle, R. G., & Yong, Y. C. (1996, April). A scenario projection for the South China Sea: Further experience with field anomaly relaxation. Futures, 28(3).

Entretiens Sciences & Défense. (1998). Nouvelles avancées scientifiques et techniques, tome 1 [New scientific and technical advances, Volume 1]. DGA/CHEAr/D.R.E.E.

Godet, M. (1998). Manuel de prospective stratégique, tome 1, Une indiscipline intellectuelle, tome 2, L’art et la méthode [Strategic prospective handbook, Volume 1: An intellectual indiscipline, Volume 2: The art and the method]. Paris: Dunod.

Jantsch, E. (1968). La prévision technologique [The technology forecast]. Paris: OCDE.

Schwartz, P. (1993). La planification stratégique par scénarios [Strategic planning by scenarios]. Futuribles, 176. Available from: http://www.futuribles.com

Starr, S. H. (1996). Developing scenarios to support C3I analyses. Proceedings of the Cornwallis Group, Pearson Peacekeeping Center, Clementsport, Nova Scotia, Canada.


CHAPTER 8

Methods and Tools

The reasonable course of action in any use of arms starts with calculation. Before fighting, first assess the relative sagacity of the military leadership, the relative strength of the enemy, the size of the armies, the lie of the land, and the adequacy of provisions. If you send troops out only after making these calculations, you will never fail to win.

—Liu Ji, 1310-1375, Lessons of War

War is essentially a calculation of probabilities.

—Napoleon


If the only tool that you have is a hammer, you tend to see every problem as a nail.

—attributed to a Harvard anthropologist

The purpose of this chapter is to consider the best methods for representing Command and Control (C2) systems, processes, organisations, and their interaction in order to support assessments of C2 over the full spectrum of operations, to include Operations Other Than War (OOTW). The key objective is to establish an ‘audit trail’ from data or information collected, through its processing, presentation, dissemination, and use, to the performance of C2 processes and organisation as well as to high-level measures of their effects on battle or operations outcome.

This chapter of the revised COBP extends the code to cover method and tool considerations regarding OOTW. The material in this chapter is a distillation of the best approaches and ideas being considered in current NATO research for representing C2 across the full spectrum of operations in all models/tools.

Types of Methods and Tools

This chapter covers all tools (simulations or other quantitative or qualitative techniques), whether used for analysis, training, or operational purposes, that can be used to assess C2 processes, performance, and effectiveness. Available methods and tools can be categorised into four distinct groups:


• Data collection/generation: methods and tools used to either collect or generate subjective or objective data for subsequent analysis from live, virtual, or constructive sources, whether past, present, or future;

• Data organisation/relationship: methods and tools used to organise data in some logical way, or used to establish relationships between data. These methods and tools, rather than providing a mathematical solution to problems, tend to be more qualitative, subjective, and exploratory techniques based on expert opinion, judgement, and interaction, whether obtained directly or through role playing. Although, in some cases, these tools/techniques may totally solve the problem at hand, more often they will illuminate other associated or sub-problems, determine areas for further analysis, or provide expert input “data” for more quantitative “solving” tools;

• Solving: methods and tools typically associated with operations research, business, mathematics, computer science, information science, engineering, or management science, which tend to be quantitative in nature and which usually consist of techniques providing mathematically derived solutions, even if the data analysed is subjective in nature; and

• Support: methods and tools used to collect, organise, store, and explore typically large sets of empirical data.


Table 8.1 provides some recommended methods and tools that fall into each category. The intent here is not to provide an exhaustive list or to debate into which category a specific method or tool should be included, but to give the range of methods and tools available to the analyst for C2 assessment. While the range and scope of potential methods and tools is broad, clearly the emphasis of this chapter of the COBP, and of the analysis community at large over recent years, is on constructive modelling and how best to enhance, orchestrate, and apply it to the assessment of C2 impacts on battle/operations outcome.

Table 8.1. Some Recommended Tools by Category Type


Issues

The NATO analytic community has had many challenges in the past analysing the effectiveness of C2-related systems and determining what it is that sets it apart from other types of operational analysis. The added complexity and number of confounded variables in OOTW make analysis of these operations even more challenging. The key to the problem, no matter where on the spectrum of conflict the analysis is located, lies in making a properly quantified linkage between C2 Measures of Performance (MoP), such as communication system delays, Measures of C2 Effectiveness (MoCE), such as planning time, and their resultant impact on higher level Measures of Force Effectiveness (MoFE) or Measures of Policy Effectiveness (MoPE), which capture the effects on battle or operations outcome. These higher level MoFE/MoPE are required in order to be able to trade off investment in C2 systems against investment in combat systems such as tanks or aircraft. At present, there is no routine way of making this linkage, nor any one tool that can be applied to generate the required measures. Hence, all analyses of C2 issues demand a high level of creative problem structuring and approach, and selection and application of a range of available analysis tools, to overcome this challenge.

The range of issues is very broad and challenging. Particularly challenging are the following issues that will be addressed in more detail later in this chapter:

• Representation of human behaviour (e.g., rule-based, algorithmic, or “human-in-the-loop”);

• Homogeneous models versus hierarchies/federations;


• Stochastic versus deterministic models;

• Adversarial representation;

• Verification, Validation, and Accreditation (VV&A); and

• The conduct of sensitivity analysis and other ways of dealing with uncertainty.

Of particular importance for analysis of OOTW:

• Selecting an orchestrated set of tools that generate the required MoM;

• Scoping the analysis considering tool availability;

• Considering the human dimension in tool selection early in the process; and

• Ensuring tools selected have the trust and confidence of the decisionmaker(s).

Representation of Human Behaviour: Rule-Based, Algorithmic, or “Human-in-the-Loop”

In developing methods and tools that represent the process and performance of C2 explicitly, most approaches until very recently have been founded on the artificial intelligence (AI) methods of expert systems. These represent the commander’s decisionmaking process (at any given level of command) by a set of interacting decision rules. The advantage of such an approach is that it is based on sound AI principles. However, in practice it leads to tools which are large, complex, and slow. The decision rules themselves are, in many cases, very scenario-dependent and, as noted in Chapter 6 – Human Decisionmaking, human factors and organisational expertise may be needed on a project team to treat these issues correctly.

These factors were not a problem when the Cold War prevailed. There was sufficient time to complete extended analyses, and one key scenario dominated. However, in the post-Cold War environment, such certainties have evaporated. Indeed, uncertainty is now one of the key drivers of analysis. There is an increasing requirement to consider large numbers of scenarios and to perform a wide range of sensitivity analyses. This has led to a requirement for ‘lightweight,’ fast-running tools that can easily explore a wide range of scenarios, yet still appropriately represent C2. Some have begun to explore advanced algorithmic tools based on Bayesian mathematics, catastrophe theory, and complexity theory. Such approaches to the representation of C2 are at the core of a new generation of closed-form constructive simulation models that are beginning to be used for analysis (Moffat, 2002; Moffat, 2000).

Many analyses employ “human-in-the-loop” techniques in order to ensure realistic human performance or to check assumptions and parameters. However, “human-in-the-loop” techniques are expensive and require the inclusion of soft factors and their attendant MoM. The introduction of “human-in-the-loop” methods also introduces a source of variance and uncertainty. The increased cost, complexity, and uncertainty of “human-in-the-loop” methods often require analysts to limit the use of these techniques rather than employ them as the primary analytical method.

Homogeneous Model-Tools versus Hierarchies/Federations

In order to build an “audit trail” that traces the interrelationships among individual C2 systems, processes, and organisations, as well as their impacts on mission operational outcomes, there is a need to represent the key detailed processes involved, such as the transmission of communications across the battlefield and the impact of logistics on decisionmaking. Taking this as an example, the question then arises as to whether all the transmission media (radio, satellites, etc.), with their capacities, security level, communications protocols, etc., should be represented explicitly, or whether these details should be split out in a supporting model. Similarly, the details of logistics could be undertaken as part of the main model or in a specialised supporting model. Supporting models could be run off-line, providing sets of input data to the main model (giving rise to a model hierarchy), or they could be run in real-time interaction with the main model (giving rise to a model federation). In the off-line mode, the main model would generate demands on the communications and logistics systems. The supporting models would check if these demands could be satisfied. If not, communication delays and logistics constraints in the main model would be increased, and the main model re-run. This would have to be done a number of times to bring the main and supporting models into balance. However, such an approach can generate valuable analytical insights.


Figure 8.1 shows the main model-tool producing (in addition to its MoFE) a detailed set of dynamic demands on the communications processes (such as the capacity required of different communications systems as a function of simulated time) and the logistics processes (demands for transport and key consumables) in order to achieve the assessed levels of MoFE. These are then fed back into detailed model-tools of the communications and logistics infrastructure. Those supporting model-tools then match the dynamic demand placed on the communications and logistics infrastructure against the available capacity. If there is a mismatch, the assumptions in the main model-tool are adjusted iteratively to bring the two model-tools into balance. This approach is more flexible and reactive for a large set of C2 assessments. However, it increases the complexity of the architecture (number of linked sub-processes, failure of the sub-model, etc.).

Figure 8.1. Model Linkage
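The off-line balancing loop just described can be sketched in a few lines of code. The sketch below is illustrative only: the stand-in models, their interfaces, and the adjustment rule are assumptions, not prescriptions of the COBP.

```python
class MainModel:
    """Stand-in for a force-on-force model: given assumed communication and
    logistics delays, it returns a notional MoFE and the demands it generated."""
    def run(self, comms_delay, logistics_delay):
        mofe = 1.0 / (comms_delay * logistics_delay)          # notional effectiveness
        demand = {"comms": 100 / comms_delay, "logistics": 50 / logistics_delay}
        return mofe, demand

class SupportingModel:
    """Stand-in for a detailed communications or logistics model with fixed capacity."""
    def __init__(self, capacity):
        self.capacity = capacity
    def can_satisfy(self, demand):
        return demand <= self.capacity

def balance(main, comms, logistics, max_iterations=20):
    """Off-line balancing: rerun the main model with degraded assumptions until the
    demands it generates fit within the supporting models' capacity."""
    comms_delay, logistics_delay = 1.0, 1.0                   # optimistic starting assumptions
    for _ in range(max_iterations):
        mofe, demand = main.run(comms_delay, logistics_delay)
        comms_ok = comms.can_satisfy(demand["comms"])
        logistics_ok = logistics.can_satisfy(demand["logistics"])
        if comms_ok and logistics_ok:
            return mofe, comms_delay, logistics_delay         # models are in balance
        if not comms_ok:
            comms_delay *= 1.2                                # increase assumed delay and rerun
        if not logistics_ok:
            logistics_delay *= 1.2
    raise RuntimeError("main and supporting models did not reach balance")

print(balance(MainModel(), SupportingModel(capacity=80), SupportingModel(capacity=60)))
```

The real models are of course far richer, but the structure of the iteration (run, compare demand with capacity, adjust assumptions, re-run) is the same.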


A similar approach can be applied to concepts of operation. In some models, it is possible to describe a concept of operations as a sequence of standard missions (e.g., attack, defend, move). These missions can then be analysed to determine the demands they place on the supporting infrastructures. This can be tested off-line to see if the infrastructure can cope. Again, this would have to be iterated a number of times, but provides an ability to relate, in an understandable way, the infrastructure capacity to its ability to support a defined concept of operations (and hence battle outcome). In addition to the use of such hierarchies of supporting models in an off-line mode, it is possible to create real-time federations of such models to represent, inter alia, combined or joint operations.

Stochastic versus Deterministic Models

Stochastic and deterministic models differ in how they treat variables. Stochastic models incorporate the attributes of the probability density (or distribution) functions associated with model variables into run-time calculations. The actual values of model variables will therefore differ each time the model is run. Thus, the output of a stochastic model will yield different results each time even when the set of inputs is “fixed.” Deterministic models utilise point estimates for the values of model variables, and hence for any given set of inputs the model will produce a single output.

Stochastic and deterministic models each have their advantages. The selection of which to use, or how to employ each type of model in an assessment, depends upon the nature of the problem and other elements of the solution approach. Whether the model is time-step driven or event-sequence driven will also impact on the selection of the appropriate model(s).

Deterministic Models

The merits of a deterministic approach are that run-times are reduced, and there is a single ‘thread’ connecting the input data and the results, making the analysis of the tool output potentially easier. The justification for using deterministic (expected value) tools is usually based on the assumption that, due to the large number of stochastic processes involved in combat on higher levels, their results converge rather quickly to the mean values obtained from stochastic models.

Stochastic Models

Stochastic models do provide significantly more information than deterministic tools. They are deemed to be indispensable for an assessment of the robustness of the results as well as for providing the “data” needed for risk analysis. These additional benefits come at the cost of significantly higher run-time requirements and a more complex analysis task.

Chaos theory shows that structural variance (or ‘deterministic chaos’) can occur when sets of decision rules interact in the simulation of a dynamic process. Small changes in initial conditions can lead to very different trajectories of system evolution. Any simulation model of combat, with a representation of C2, has to face this kind of problem. The representation of the C2 process (whether using decision rules or not) gives rise to a number of alternative decision options at any given moment, and can thus potentially give rise to such ‘deterministic chaos’. If such effects are likely to arise, one solution is to use stochastic modelling. The use of stochastic sampling in the model, together with multiple replications of the model, gives rise to a distribution of outcomes, which is much more resistant to such chaotic effects.
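The difference between a single deterministic run and a set of stochastic replications can be illustrated with a minimal sketch; the model fragment, its parameters, and the outcome measure below are invented for illustration only.

```python
import random
import statistics

def engagement_outcome(mean_delay, rng=None):
    """Notional model fragment: a C2 decision delay influences an outcome measure.
    With rng=None the delay takes its expected value (deterministic run);
    with an rng supplied the delay is sampled (stochastic run)."""
    delay = mean_delay if rng is None else rng.gauss(mean_delay, 0.5)
    return max(0.0, 10.0 - delay)   # notional MoFE that decreases with delay

# Deterministic: one run, one number.
print("deterministic outcome:", engagement_outcome(mean_delay=3.0))

# Stochastic: many replications give a distribution of outcomes,
# from which robustness and risk can be judged.
rng = random.Random(1)
outcomes = [engagement_outcome(3.0, rng) for _ in range(1000)]
print("stochastic mean:", round(statistics.mean(outcomes), 2),
      "stdev:", round(statistics.stdev(outcomes), 2))
```

The spread of the replicated outcomes, not just their mean, is what supports the robustness and risk assessments discussed above.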

US Army Training and Doctrine Command (TRADOC) has been experimenting with deterministic combat models. “A potential alternative solution, when the issue under study warrants, is to conduct analysis of multiple runs of a deterministic model where the initial states of information systems are varied.” (Bailey, 2001).

Representing Adversary Forces

Historically, adversary capabilities and behaviours were often fully scripted or heavily constrained. This was more appropriate in Cold War contexts than it is today. However, it was never ideal for C2 analysis because the dynamic interaction among friendly, adversary, and other forces is a critical element of C2 representation. Today, much more robust adversary representations of operational capabilities and choices are employed and indeed are necessary. Analysts must consider not only a range of scenarios, but also the range of possible adversary actions and reactions.


Verification, Validation, and Accreditation (VV&A)

VV&A has historically been a challenge for model development efforts, but is particularly challenging for C2 modelling. This is due to the variability inherent in most C2 processes, especially those that involve the human aspects of information processing and decisionmaking. The approach to VV&A needs to be carefully considered, particularly in light of the need to assess future C2 systems and capabilities in association with new concepts of operation, new organisational forms, new doctrine, and asymmetrical adversaries.

Selecting an Orchestrated Set of Tools

The natural tendency of an analyst is to simplify a problem. Part of that simplification is to select a tool, preferably only one, which will meet the analysis requirements. In the analysis of C2 of combat operations, this may be possible if the analysis is properly scoped. In the analysis of OOTW C2, the issues are typically too numerous, the variables too confounding, and the scope too broad for one tool to satisfy all analysis requirements. An orchestrated set of complementary tools will normally be required.

Scoping the Analysis Considering Tool Availability

An analyst must always scope the analysis during problem formulation to enable it to be accomplished within available resource constraints. During problem formulation, however, the consideration of available tools has typically not been a driving factor. Also, in the analysis of C2 combat operations this may not be a problem because selection of the tool(s) to be used is more obvious based on past experience. However, for OOTW C2 analysis, the availability of tools, and their orchestration, requires more consideration early in the process (i.e., during problem formulation).

Consideration of the Human Dimension

C2, by its very nature, is closely linked to human behaviour, and its analysis requires careful consideration and inclusion of the human dimension. Often, as a way of simplifying the analysis, the C2 assessment team eliminates these considerations by assuming that human commanders and their staffs are not affected by their environment and will always make the best decision, and the same decision, given the required information. This is unsafe and not good practice. This has been especially true for the analysis of OOTW C2, even though this analysis is often more impacted by the human dimension than C2 for combat operations.

Ensuring Trust and Confidence in the Tools

Analysts select and apply the tools of their trade based on what those tools can do for them in accomplishing their analysis objectives. Over time they become comfortable with certain tools, and the customers for their analysis also develop a trust in the tools and confidence that they will produce valid results for them. For OOTW C2 analysis, given that the tools are more numerous, must be orchestrated to work together, and are sometimes unknown or not understood by the customer, the development of trust and confidence in these tools is difficult to achieve.

New Methods and Emerging Practices

Given the inherent problems associated with the issues described previously, a best practice for the application of analysis tools for C2 analysis is still emerging. This best practice is described below, first for tools in general, then for models, both at the model level itself and at the algorithm level.

Selection of Methods and Tools

The selection of tools to apply to C2 analysis should be based both on evaluation of the candidate tools themselves against a set of evaluation criteria and on consideration of the type of study to be undertaken. The evaluation criteria for tool selection include criteria related to the functionality of the candidate tool and to the performance of the candidate tool. The following are established evaluation criteria for tool selection; they are as applicable to tool selection for OOTW C2 as for combat C2 analysis, and a simple scoring sketch follows the lists below.

• Functionality-related tool selection criteria:

- Resolution: the level of detail in representation of entities within the tool;

- Completeness/scope: the extent to which the tool is able to address analysis issues;

- Functionality: the extent to which the tool represents the full range of functions;

- Explicitness: the ability of the tool to explicitly represent required entities;

- MoM Generation: the ability of the tool to generate the MoM required;

- VV&A: the determination of whether the tool has been verified, validated, and/or accredited for its intended use. (Note: Not all NATO member nations recognise the term “Accreditation.” Accreditation here refers to some form of formal approval to use the model for the analysis intended.)

• Performance-related tool selection criteria:

- Responsiveness: the amount of time between request and receipt of information;

- Simplicity: the ease of preparation and use of the tool;

- Preparation/use time: the length of time necessary to prepare and use the tool;

- Data availability and parameters: the ease of acquiring or generating the necessary data or parameters for tool use;

- Interoperability: the ability of the tool to interoperate with other tools;

- Resource requirements: the amount of resources (time, personnel, and funds) required; and

- Credibility: the extent to which the customers and users accept tool results.
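As a purely illustrative aid (not part of the established criteria themselves), the criteria above could be operationalised as a weighted score across candidate tools; the weights, the criteria subset, and the candidate scores below are invented.

```python
# Hypothetical weights for a subset of the selection criteria (they sum to 1.0).
weights = {"resolution": 0.2, "mom_generation": 0.3,
           "data_availability": 0.2, "credibility": 0.3}

# Hypothetical 0-5 scores assigned to each candidate tool by the study team.
candidates = {
    "tool_A": {"resolution": 4, "mom_generation": 2, "data_availability": 5, "credibility": 3},
    "tool_B": {"resolution": 3, "mom_generation": 4, "data_availability": 3, "credibility": 4},
}

def weighted_score(scores):
    """Weighted sum of criterion scores for one candidate tool."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

# Rank the candidates from highest to lowest weighted score.
for name, scores in sorted(candidates.items(), key=lambda item: -weighted_score(item[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```

Such a score can support, but does not replace, the judgement of the study team about which orchestrated set of tools to apply.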


Using Models

The potential for exploiting recent advances in mathematics in order to create fast-running model-tools was noted earlier. Such models have exploited emerging approaches, such as complexity theory, chaos theory, catastrophe theory, and game theory, in order to produce a ‘good enough’ representation. They can be used to complement more complex, detailed models of the problem area. In many cases, a tailoring of models, or other tools, will be required to properly address the analysis issues at hand (Moffat, 2002; Moffat, 2000).

Model Federations

A number of new approaches share a key set of characteristics. First, an object-oriented approach within the model-tool allows different objects to be brought together to represent the complete command process, rather like ‘Lego™ bricks.’ Such a philosophy also encourages the development of model-tools based on holistic and evolutionary principles. In other words, always capture a complete model of the process, including the parts whose representation is still unclear. As understanding develops, improve those parts (or objects) which were rudimentary at the start. At the next level up, the use of run-time interfacing allows different model-tools to be brought together to create a federation to represent the process under study. This federation then may also have to be integrated with the use of a mix of tools, which include techniques other than modelling, to fully address the study issues.


Agent-Oriented Modelling

A second key aspect is the description and representation of the C2 process through agent modelling and programming techniques. Modelling of the C2 process as a group of agents, based on artificial intelligence concepts, favours the capture of the cognitive nature of command tasks. Agents can be implemented, in an object-oriented environment, as either objects (e.g., actor or “applet” type of agents) or aggregates of objects (coarse-grain agents). Such agents interact with each other through a messaging infrastructure. The term “agent-oriented modelling” is suggested as a way of capturing this idea.
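A minimal sketch of the agent-and-messaging idea follows. It is illustrative only: the agent roles, the message content, and the decision logic are invented, and a real agent-oriented model would represent the cognitive tasks far more richly.

```python
from collections import deque

class MessageBus:
    """Very small stand-in for the messaging infrastructure through which agents interact."""
    def __init__(self):
        self.queues = {}
    def register(self, agent):
        self.queues[agent.name] = deque()
    def send(self, recipient, message):
        self.queues[recipient].append(message)
    def receive(self, agent):
        queue = self.queues[agent.name]
        return queue.popleft() if queue else None

class CommanderAgent:
    """Coarse-grain agent standing in for one command node (hypothetical)."""
    def __init__(self, name, bus):
        self.name = name
        self.bus = bus
        bus.register(self)
    def step(self):
        message = self.bus.receive(self)
        if message and message["type"] == "contact_report":
            # Crude stand-in for the cognitive task of deciding on a response.
            order = {"type": "order", "action": "investigate", "where": message["where"]}
            self.bus.send(message["sender"], order)

bus = MessageBus()
brigade = CommanderAgent("brigade", bus)
battalion = CommanderAgent("battalion", bus)
bus.send("brigade", {"type": "contact_report", "sender": "battalion", "where": "grid 1234"})
brigade.step()
print(bus.receive(battalion))   # the battalion receives the resulting order
```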

Linking of Performance Model-Tools to Effectiveness Tools

This third idea, used in a number of NATO countries, uses a structured hierarchy of model-tools to create an audit trail from C2 systems, processes, and organisations through to battle/operations outcome. The idea is to create supporting performance-level model-tools of particular aspects of the process (e.g., communications, logistics) which can be examined at the performance level. These then form inputs to higher-level force-on-force models. This ensures that the combat models themselves do not become overly complex. We have already discussed this in the sense of hierarchies of support models.

For example, in Figure 8.2, a detailed model of the intelligence system can be very complex, if we wish to take account of the flow of intelligence requirements, tasking, collection processes, fusion processes, and intelligence products. In order to analyse the impact of intelligence, it is important to have all of this detail, but it does not necessarily have to be represented explicitly in the main model. A supporting model-tool which captures all of this detail can be created (or used if one already exists) in order to produce outputs at the MoCE level, such as speed and quality of intelligence. These can then form inputs to the main simulation model. The main model-tool then takes these into account in producing its own outputs. These will now be at the MoFE level. Examples are friendly casualty levels, adversary attrition, and time to achieve military objectives.

Figure 8.2. Model-Tool Hierarchy
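The linkage in Figure 8.2 can be sketched as a simple chain of functions. The functional forms and numbers below are invented for illustration; the point is only that the supporting model produces MoCE, which become inputs to the main model producing MoFE.

```python
def intelligence_model(sensor_coverage, fusion_capacity):
    """Supporting performance-level model (hypothetical): produces MoCE such as
    the speed and quality of intelligence from detailed system parameters."""
    speed_minutes = 60.0 / fusion_capacity              # time to produce an assessed picture
    quality = min(1.0, sensor_coverage * 0.9)           # fraction of relevant entities identified
    return {"intel_speed_min": speed_minutes, "intel_quality": quality}

def main_model(moce):
    """Main force-on-force model (hypothetical): takes MoCE as inputs and produces
    MoFE such as friendly casualties and time to achieve military objectives."""
    time_to_objective_h = 24 + moce["intel_speed_min"] / 10
    friendly_casualties = 100 * (1.2 - moce["intel_quality"])
    return {"time_to_objective_h": round(time_to_objective_h, 1),
            "friendly_casualties": round(friendly_casualties, 1)}

moce = intelligence_model(sensor_coverage=0.8, fusion_capacity=4)
print("MoCE:", moce)
print("MoFE:", main_model(moce))
```

Keeping the detailed intelligence representation in the supporting model, and passing only the MoCE across the interface, is what stops the main combat model from becoming overly complex.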

Scanning Scenario Space

The use of very fast model-tools to scan the overall space of possibilities and to identify areas of concern for further analysis appears to give a good balance between the use of simple and complex modelling approaches. Fast models, which are simpler but may have less analytic depth, allow the analyst to scope the problem and determine the degree of complexity a model-tool must represent in order to conduct the desired level of analysis.

Decisionmaking Process

It is important to have a proper representation of the decisionmaking process in order to establish the link from C2 performance through MoCE to overall MoFE and to represent information operations (IO) effects such as counter C2 or digitisation of the battlespace. Representation of the decisionmaking process itself, however, remains difficult because of the difficulty in representing human performance, command styles, and organisational relationships.

Parameter Development Context

Finally, it may become necessary to generate new or additional data to validate new or existing model-tools that incorporate C2 factors. This may be especially true in the case of integrating soft factors into C2 analysis. Possible methods include field trials, “model-test-model,” or advanced warfighting experiments. Field trials are used if uncertainty revolves around measurable factors that are only observable in the field or are not reproducible in the laboratory. Model-Test-Model (M-T-M) or Model-Exercise-Model (M-E-M) is used as part of an iterative process to develop and apply systematically more in-depth and sophisticated model-tools and, in some cases, more simplistic model-tools to increase their validity. The original model-tool is executed, modified based on results of the test or experiment, and executed again until it has developed a sufficient representation of a complex process. Experiments, like advanced warfighting experiments, are useful in modelling new, large-scale, and complex interactions for which little data or few validated tools exist. Each approach requires additional time and resources, and the data sets may not be validated for some time.

C2 Modelling Guidelines

A number of common ideas have emerged which are worth consideration for new modelling and tool developments. Figure 8.3 shows how each of the following common ideas is represented within an ideal command and control model:

• Understanding of adversary intent can be represented by having a number of prescribed intents or options, which are updated, in an advanced data architecture or in a Bayesian way, as more information becomes available (a simple Bayesian sketch follows this list);

• Representing headquarters explicitly in the model-tool allows proper representation of Information Warfare (IW) effects such as counter C2;

• Explicit representation of the “recognised picture” within each headquarters (HQ) allows the model-tool to run based on different perceptions by each individual unit on each side represented. This allows the effects of aspects such as deception, shock, and surprise to be explicitly considered;

• Represent information as a commodity. This consideration is the most critical and difficult to implement, but is the foundation for the other guidelines, as well as for the model itself. Information should be considered as a resource that can be collected, processed, and disseminated. It includes information about adversary, friendly, and other forces, considerations such as political-military factors and rules of engagement (ROE), as well as environmental information such as weather and terrain. Information should possess dynamic values such as accuracy, relevance, timeliness, completeness, and precision. These values should in some way affect other activities within the model, to include, when appropriate, combat functions;

Figure 8.3. Modelling Guidelines

• Represent the realistic flow of information throughout the operational environment. Information has a specific source, and that source is usually not the end user of the information. A requirement exists, therefore, to move information from one place to another in the operational environment. Communications systems of all forms exist to accomplish this movement. These systems can be analogue or digital. Information can be lost and/or degraded as it flows around the operational environment. The model-tool should represent the communications systems and account for these degradation factors as it represents information flow;

• Represent the collection of information from multiple sources and the tasking of information collection assets. This guideline applies equally to adversary, neutral, friendly, and other force information. For the collection of adversary and other force information, the model-tool should represent a full suite of sensors and information collection systems, and the ability of these systems to be tasked to collect specific information. For the collection of friendly information, this consideration is just as critical. Knowledge of one’s own capability in combat, as well as that of the adversary and other forces, is essential for effective decisionmaking;

• Represent the processing of information. Information is rarely valuable in its original form. It usually has to be processed in some way. Typical processing requirements include filtering, correlation, aggregation, disaggregation, and fusion of information. These processes can be accomplished by either manual or automated means. The ability, or inability, to properly process information, and the time it takes, can have a direct bearing on combat operational outcome;


• Represent C2 systems as entities in the operational environment. C2 systems perform information collection, processing, dissemination, and display functions. They should be explicitly represented as entities that can be targeted, degraded, and/or destroyed by either physical or non-physical means. Additionally, the model-tool should account for continuity of operations of critical functions during periods of system failure or degradation;

• Represent unit perceptions built, updated, and validated from the information available to the unit from its information systems. This is a critical requirement. Each unit should have its own perceptions, gaining knowledge from superior, subordinate, or adjacent units only when appropriate;

• Represent commanders’ decisions based on the unit’s perception of the battlefield. Each unit should act based on what it perceives the situation to be, informed by its commander’s intent, mission, goals, constraints, and biases, not based on ground truth available within the model. When a unit takes action based on inaccurate perceptions, it should suffer the appropriate consequences; and

• Represent IO for each/all combatants. With information so critical to combat operations outcome, the model-tool should be able to represent the deliberate attack and protection of information, information systems, and decisions. This applies to all sides represented in the model.
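Returning to the first guideline above (updating a set of prescribed adversary intents as more information becomes available), a minimal Bayesian sketch follows. The intents, likelihoods, and observations are invented for illustration; a real model would draw them from the scenario and from its intelligence representation.

```python
# Prior beliefs over a set of prescribed adversary intents (hypothetical).
belief = {"attack_north": 0.4, "attack_south": 0.4, "withdraw": 0.2}

# Hypothetical likelihoods: P(observation | intent).
likelihood = {
    "bridging_assets_moving_north": {"attack_north": 0.7, "attack_south": 0.1, "withdraw": 0.2},
    "supply_dumps_thinning":        {"attack_north": 0.3, "attack_south": 0.3, "withdraw": 0.8},
}

def update(belief, observation):
    """Single Bayesian update of the belief over adversary intents."""
    posterior = {intent: p * likelihood[observation][intent] for intent, p in belief.items()}
    total = sum(posterior.values())
    return {intent: p / total for intent, p in posterior.items()}

for observation in ["bridging_assets_moving_north", "supply_dumps_thinning"]:
    belief = update(belief, observation)
    print(observation, {intent: round(p, 2) for intent, p in belief.items()})
```

The same updating structure applies whether the prescribed hypotheses concern an adversary course of action, the posture of a neutral actor, or any other discrete option the command process must reason about.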


As shown in Figure 8.3, explicit representation of these information operations elements within a command and control model will facilitate the assessment of human factors, psychological operations, deception, C2 systems effectiveness, and staff structure issues.

Conclusions

The following conclusions are made regarding the strengths and weaknesses of current C2 tools/modelling approaches.

Strengths in Current C2 Tools/Modelling

An assessment of current C2 modelling approaches against the guidelines above shows that, while they may not yet be satisfied, there are some strengths in the C2 tools/modelling approaches currently being implemented:

• There is a common understanding of issues. This has not always been the case. With the inherent complexity of C2, as well as the challenges in modelling such a complex subject, there has been a tendency in the past to ignore the subject, or just to accept it as something that is too complex to address. This does not seem to be the case now. Perhaps through the emergence of new technologies and through the work of groups such as the NATO SAS groups, the analysis and modelling of C2 is now considered possible. Member nations now seem to have both an understanding of C2 and its importance to combat and OOTW operations and a common understanding of the modelling challenges that exist.

• There is wide application of C2 modelling. Member nations now apply C2 modelling and analysis to a wide range of issues. These issues include those associated with investment, requirement identification, force structuring, and operational support. In all of these areas, there is understanding of the sensitivity and criticality of C2 to the proper analysis of combat operations outcome. Selection of the model-tools to apply to a particular problem should be based on evaluation of specific criteria, as discussed previously in this chapter.

• Although each nation develops its models for different purposes and tailors its models and other tools for specific issues, there exists a commonality of approaches in different nations that serves to strengthen their collective merits. These common approaches are an outgrowth of the modelling technologies now available, but also result from shared experiences by member nations.

• Most of the progress and success in C2 modelling has been with regard to high-intensity combat. This is perhaps due to the belief by many that a high-intensity combat scenario is still most appropriate for the analysis of combat, particularly for analysis of primary combat systems. Progress, therefore, has been focused on embellishing high-intensity combat analysis with C2 improvements to models. Unfortunately, low-intensity combat and OOTW modelling and analysis have not received the same level of attention until recently.

• There is wide use of evolutionary development approaches. After many years of neglect, a problem as complex and difficult as C2 modelling requires years of focused research and development. There are no simple fixes to the problem. It is evident that member nations recognise this and are willing to approach the problem in an incremental manner, applying evolutionary approaches.

• There is progress in linking and federating models. Significant progress has been made by several nations in linking performance model-tools with combat effectiveness models, either directly or through off-line approaches. Additionally, creating federations of models through standard interface protocols has significantly improved the use, and reuse, of existing models and has provided a promising approach for future modelling. The inherent difficulties in federating models, given today’s state of the art, must be considered when contemplating this approach.

• There is progress in modelling “soft factors.” Several nations have made real progress in modelling phenomena that have a non-physical, or soft, impact on combat operations outcome. Among these factors are morale, fatigue, and training proficiency. These and other soft factors have increased importance for combat operations outcome as C2 modelling improves combat models.


• Standard interface protocols, data standards, and other standards either now exist or are under development. These standards serve to make this difficult task easier. Continued development of such standards is envisioned for the future and is essential.

• There is widespread use of Commercial Off The Shelf (COTS) products. These products are generally available to all member nations. This use of COTS has served to help further standardise individual modelling approaches and will continue to do so in the future.

• There is application of new information technologies. New technologies, such as those supporting animation, have been applied to the challenge of C2 modelling, simulation, and analysis. Additional technological advancements will no doubt continue and will be similarly applied to this problem.

Weaknesses in Current C2 Tools/Modelling

An assessment of current C2 modelling approaches against the guidelines above shows that weaknesses exist in current approaches. These weaknesses, rather than being enumerated here, are expressed as challenges below.

Recommendations

It is recommended that analysts take advantage of the strengths available in current approaches and in the new methods that are evolving. They should also be aware of the challenges that must still be resolved and should attempt to play their part in helping address those challenges both through study activities and research.

Challenges: OOTW

As discussed previously, recent world events and current projections call for an emphasis on OOTW. C2 in these environments can be quite different and may require fresh C2 tool/modelling approaches to link C2 to outcome in these environments. The following challenges with tools-models for C2 assessment exist:

• Orchestrating a set of applicable tools. Because there is no one universally accepted tool that will satisfy OOTW C2 analysis requirements, a set of tools must be selected, based on evaluation of potential tools against selection criteria, and applied to the analysis. Orchestrating a set of tools whose strengths and weaknesses complement each other so as to satisfy the analysis requirements is difficult. Proper consideration of assumptions and constraints during tool selection, and careful scoping of the analysis issues, will help to simplify the orchestration of tools. Additionally, it must be remembered that not only the tools, but trained, skilled users of the tools are required.

• Breadth of tool application. Because of the complexity of OOTW analysis, the full set of potential tools should be considered for application throughout the study process, from problem formulation through sensitivity analysis. Additionally, the more subjective nature of OOTW dictates that tools heretofore not typically applied to the analysis of C2 be considered for use throughout the study process.

• Relationship of tools to data availability and MoM generation. Tools must be selected for OOTW C2 analysis for which the necessary data are available, or can be generated or obtained from the application of other tools. Reliance on more subjective data, especially for higher C2 echelons, may be necessary. Additionally, tools must be selected that generate the MoM that will help answer the study issues at hand. The OOTW C2 analyst will be challenged to be creative in establishing the strong relationship between data, tools, and MoM required for a successful OOTW C2 analysis. It must be remembered that analysts, not the tools themselves, answer the analysis issues.

• Consideration of Model-Exercise-Model (M-E-M) or Model-Test-Model (M-T-M) approaches. Capitalisation on testing or training events for C2 analysis purposes can be highly beneficial, both from a resource and an analytic point of view, but doing so can be difficult. Such approaches are more subjective, require more human involvement, and are inherently more complex than more classical analysis tools/approaches. The advantages of having live players/subjects in the analysis, however, generally outweigh the disadvantages. These approaches are becoming more commonplace and more accepted as a legitimate analysis approach, especially for more subjective issues such as those associated with OOTW C2.


• Sharing of tools between different communities. Given the nature of analysis of OOTW C2 and the emergence of M-E-M and M-T-M approaches, the boundaries between the testing, training, analytic, and operational communities are blurring, and tools once considered for use in only one of these communities are finding application in another. The sharing of available tools among these communities is considered even more appropriate now with the rise in importance of complex OOTW C2 analysis requirements. This sharing, however, is difficult due to differences in terminology, as well as cultural differences between the different communities.

• Reuse of operational schemas and data models. OOTW typically involves many more organisational entities than combat operations, making interoperability of organisations and systems critical to successful operations. C2 analysts, therefore, are challenged to use, to the extent possible, existing operational schemas, such as orders and reports, and the data models used for C3I systems integration, in order to further standardisation and interoperability goals.

• Analysis of architectures. Technical, operational, and system architectures are sometimes developed in order to facilitate integration and interoperability of C3I systems. For NATO, required architecture frameworks are contained in the NATO C3 Interoperability Management Plan (NIMP), Volume II. Considerable challenges exist in developing and applying analytic tools for evaluating architectures, including analysis of alternative architectures and their implications.

• Management of customer expectations and relations. OOTW C2 analysis is so difficult that it is necessary for the analyst to ensure the customer for the analysis understands the inherent difficulty and to attempt to manage the customer's expectations for the analysis. This includes informing the customer of the tools to be employed and gaining the customer's understanding of and trust in those tools. It also implies a special relationship between analyst and customer for OOTW C2 analysis. This is even more critical when the customer is a subject of the analysis, such as when a commander has his/her own command analysed.

Challenges: Modelling C2

The following challenges with modelling C2 exist:

• Better representation of cognitive processes. C2 can be incorporated at one level of resolution in combat tools through representation of the effects of particular decisions. At another level, representation of the decision process itself is desirable. It would enable alternative decisionmaking styles and the effects of soft factors such as stress, training level, fatigue, and morale to be more easily assessed. These factors become more important as the full range of information operations (IO) representation is attempted;

• Long-standing challenges associated with both stochastic and deterministic models. The advantages and disadvantages associated with stochastic and deterministic modelling approaches will remain as C2 modelling improves. The objective is to select the best modelling approach for the issue at hand. Recognising the inherent advantages and disadvantages, and then capitalising on the advantages while minimising the disadvantages, are the challenges;

• Better standard definition of C2 terms. This challenge has plagued the C2 community for many years, because of both the scope and complexity of the subject. This is particularly true across service boundaries and across the international community. A standard set of definitions would greatly simplify the C2 modelling challenge;

• The definition and application of C2 scope. This challenge, related to the previous challenge, is especially critical to modellers. The C2 of a fighter aircraft or a carrier group is very different from the C2 of an army corps or an army squad. On the other hand, there are C2 aspects of each of these combat elements that are similar. Modelling of C2, however, can be vastly different in each case. The scope of each modelling undertaking must be properly considered and discipline must be applied throughout model development to focus on the proper scope. Once the scope is established, a mix of tools may be required to address the full scope of the analysis;

• Multiple application of C2 model-tools to analysis, training, and operational requirements. C2 phenomena are relatively constant whether they exist within the analytic, training, or operational environment, and they should be consistently modelled in each environment. This fact, as well as the obvious need to conserve expensive model-tool development resources wherever possible, leads to the challenge of developing C2 models, or at least component software modules, that can be used to support analysis, training, and operational requirements;

• Level of resource application to the breadth and depth of C2 modelling. Because of the large scope of C2, there has been a tendency by some to model C2 at great breadth (multiple applications), at the expense of modelling C2 phenomena at a corresponding depth. In a constrained resource world, sufficient resources are not usually available for both. The challenge is to either apply sufficient resources or to recognise the shortfall and to level the available resources across the breadth and depth of the problem. Models and other tools must, therefore, be tailored to the extent possible to fit the study issues being addressed;

• Differences in the level of modelling of friendly and adversary forces. Many combat models do not represent adversary forces to the same level of resolution as friendly forces. In the past, there may have been good reasons for this. Besides the obvious resource savings, the lack of C2 representation often precluded further representation of adversary forces. Valid representation of C2, to include full play of IO such as deception and psychological operations, will require equal representation across both adversary and friendly forces, as well as any other supporting or neutral forces in the simulation. All discussion and recommendations in the COBP, therefore, are as applicable to the modelling of adversary force C2 as they are to the modelling of friendly force C2. This represents a significant challenge to many modelling efforts;

• Continuing lack of "soft factor" representation and data. As discussed previously, a robust C2 representation in combat models will permit soft factors to be better represented. The bigger challenge, perhaps, may not be the modelling methodology itself, but the acquisition of data to support it. The effects of such things as stress, training proficiency, morale, fatigue, and shock, for example, necessitate new data generation approaches, which will take some years to implement. The tasks of VV&A and Verification, Validation, and Certification (VV&C) are most severe in the soft factor arena. The certification of soft factor data, as well as of almost all C2-related data, is particularly difficult to achieve. Innovative and focused C2 data VV&C programs are required;

• VV&A of C2 model-tools and the parameters that drive them. This is always a challenge for model-tool development efforts, but it is particularly challenging for C2 modelling, due to the variability inherent in most C2 processes, especially those that involve the human aspects of information processing and decisionmaking; and


• Sensitivity analysis. The challenges associated with the proper conduct of sensitivity analysis of C2 are as great as, or perhaps greater than, those associated with other analyses. This is because of the uncertainty associated with C2 itself, and the relatively immature modelling of C2 that exists today. Innovative, yet cost-effective, approaches to sensitivity analysis are required.

Chapter 8 Acronyms

ACCESS – Army Command and Control Evaluation System

AI – Artificial Intelligence

C2 – Command and Control

CPX – Command Post Exercise

FTX – Field Training Exercise

HEAT – Headquarters Effectiveness Assessment System

HQ – Headquarters

MAPEX – Map Exercise

M-E-M – Model-Exercise-Model

MoCE – Measures of C2 Effectiveness

MoFE – Measures of Force Effectiveness

MoP – Measures of Performance

MoPE – Measures of Policy Effectiveness

M-T-M – Model-Test-Model

OOTW – Operations Other Than War

ROE – Rules of Engagement

STAFFEX – Staff Exercise

TRADOC – US Army Training And Doctrine Command

VV&A – Verification, Validation, and Accreditation

VV&C – Verification, Validation, & Certification

Chapter 8 References

This chapter is based in part on the results of previous workshops on models used for C3I systems and analyses. Additional useful references include:

Alberts, D. S., & Czerwinski, T. J. (1997). Complexity, global politics and national security. Washington, DC: National Defense University.

Allen, P. M. (1988). Dynamic models of evolving systems. System Dynamics Review, 4, 109-130. Available to purchase from http://www.albany.edu/cpr/sds/publications.htm

Bailey, T. J. (2001). Multiple runs of a deterministic combat model (Technical Document 01-001). Fort Leavenworth, KS: U.S. Army Training & Doctrine Command (TRADOC) Analysis Center.

Bailey, T. J., & Clafin, R. A. (1996). Information operations in force on force simulations (US Army TRAC Technical Memorandum TM-0196). Fort Leavenworth, KS: U.S. Army Training & Doctrine Command (TRADOC) Analysis Center.


Bares, M. (1996). Pour une prospective des systèmes de commandements [For a futurology of the systems of commands]. Polytechnica.

Bellman, R. E., & Zadeh, L. A. (1970). Decision-making in a fuzzy environment. Management Science, 17(4), 141-164.

Code of Best Practice (COBP) on the assessment of command and control (Research & Technology Organisation (RTO) Technical Report 9, AC/323(SAS)TP/4). (1999). Neuilly-Sur-Seine Cedex, France: North Atlantic Treaty Organisation (NATO) RTO. Available from http://www.rta.nato.int/RDP.asp?RDP=RTO-TR-009

Hartley, D. S., III. (1996). Operations other than war: Requirements for analysis tools research report (K/DSRD-2098). Oak Ridge, TN: Lockheed Martin Energy Systems. Available from http://www.msosa.dmso.mil/ootw.html

Hartley, D. S., III. (2001). Battle modeling. In S. I. Gass & C. M. Harris (Eds.), Encyclopedia of operations research and management science (2nd ed.). New York: Kluwer Academic.

Klein, G. A. (1989). Recognition-primed decisions. Advances in Man-Machine Systems Research, 5, 47-92.

Kroening, D., & Moffat, J. (Eds.). (1999). Workshop on C3I modeling (AC/323(SAS)TP/9). North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Meeting Proceedings 29. Available for order from www.rta.nato.int


Lauren, M. K. (2000). Modelling combat using fractals and the statistics of scaling systems, 5(3), 47-58. Alexandria, VA: Military Operations Research Society.

Lauren, M. K. (2001). Beyond Lanchester: A fractal-based approach to equations of attrition. Manuscript submitted for publication.

Libicki, M. C. (1997). Defending cyberspace and other metaphors. Washington, DC: National Defense University.

Modeling and analysis of command and control (AC/323(SAS)TP/12). (1999). North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Meeting Proceedings 38. Available from http://www.rta.nato.int/RDP.asp?RDP=RTO-MP-038

Moffat, J. (1996). The system dynamics of future warfare. European Journal of Operational Research, 90(3), 609-618. Available with registration from http://www.palgrave-journals.com/jors/

Moffat, J. (2000). Representing the command and control process in simulation models of conflict. Journal of the Operational Research Society, 51(4), 431-439. Available with registration at http://www.palgrave-journals.com/jors/

Moffat, J. (2002). Command and control in the information age—Representing its impact. Norwich, United Kingdom: The Stationery Office. Available from www.tso.co.uk


Moffat, J., & Passman, M. (2002). Metamodels and emergent behaviour in models of conflict. Proceedings of the Simulation Study Group Two Day Workshop, March 2002. The Operational Research Society, UK. Available on www.orsoc.org.uk

Moffat, J., & Prins, G. (2000, July). A revolution in military thinking? [Issues from the 1999 Defence Evaluation and Research Agency (DERA) Senior Seminars]. Journal of Defence Science, 5(3). Available by request from [email protected]

Moffat, J., & Witty, S. (2001). Bayesian decisionmaking and military command and control. Manuscript submitted for publication.

Moffat, J., Witty, S., & Dodd, L. (2002). Bayesian decisionmaking and the precise use of force. Journal of Defence Science, 7(1). Available from [email protected]

Nowak, M., & May, R. (1992, October). Evolutionary games and spatial chaos. Nature, 359.

Perry, W. L., & Moffat, J. (1997). Developing models of decisionmaking. Journal of the Operational Research Society, 48(5), 457-470. Available with registration from http://www.palgrave-journals.com/jors/

Perry, W. L., & Moffat, J. (1997). Measuring consensus in decisionmaking: An application to maritime command and control. Journal of the Operational Research Society, 48(4), 383-390. Available with registration from http://www.palgrave-journals.com/jors/


Perry, W. L., & Moffat, J. (1997). Measuring the effects of knowledge in military campaigns. Journal of the Operational Research Society, 48(10), 965-972. Available with registration from http://www.palgrave-journals.com/jors/

Pidd, M. (1996). Tools for thinking—Modeling in management science. New York: John Wiley and Sons.

Richards, R., Neimeier, H. A., & Hamm, W. L. (n.d.). Analytical modelling in support of C4ISR mission assessment (CMA).

US Department of Defense (DOD) model and simulation master plan. (1995). http://www.msiac.dmso.mil/ms_needs_documents/MS_Master_Plan.PDF

Wallace, G. J., & Moffat, J. (1997, January). The use of Petri Nets to model satellite communications systems. Journal of Defence Science, 2(1). Available from [email protected]


CHAPTER 9

Data

Know the enemy and know yourself; in a hundred battles you will never be in peril. When you are ignorant of the enemy but know yourself, your chances of winning or losing are equal. If ignorant both of your enemy and of yourself, you are certain in every battle to be in peril.

—Sun Tzu, The Art of War


Definitions

Data are factual information that are organised for analysis and in a form suitable for machine processing. Data are usually thought of as anchoring one end of an epistemological scale, with understanding or wisdom anchoring the other end. Information is data that have been put into context.

Metadata are "information about information." Metadata can describe data as well as other metadata; therefore, several levels of metadata are possible. An example of metadata at a higher level is the reliability of the source from which the data have been derived. In general, metadata are documentation of the attributes of data, directly attached to the data, and therefore can be archived along with the data.
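As an illustration of how metadata can be attached directly to a data element so that both travel together into an archive, the following sketch uses a simple Python structure. The field names and values are purely illustrative assumptions, not a prescribed NATO or IRDS format.

    from dataclasses import dataclass, field
    from typing import Any

    @dataclass
    class DataElement:
        """A data value together with the metadata that documents it."""
        name: str
        value: Any
        metadata: dict = field(default_factory=dict)

    # Hypothetical example: a sensor performance figure taken from an earlier study.
    detection_range = DataElement(
        name="sensor_detection_range_km",
        value=42.0,
        metadata={
            "source": "legacy study (hypothetical)",
            "collection_conditions": "field trial, clear weather",
            "version": "2002-06",
            "reliability": "subject matter expert estimate",
            "units": "km",
        },
    )
    print(detection_range.metadata["source"])

Because the documentation is carried with the value itself, a later study team can judge whether the element is fit for reuse without returning to the original collectors.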

Role of Data

The role and importance of data in C2 assessment are underestimated by many people, often including the decisionmakers and the assessment team itself. Figure 9.1, Data Taxonomy, lays out a number of the types of data, including broad categories of sources, that will be of interest to the analyst. The ability to determine the needed data and the ability to assemble or collect these data determine the solution strategy. The capability to obtain or collect the appropriate data:

• Will often reflect the difference between the desired or "ideal" measures of merit (MoM) and the set of MoM actually available and used in the assessment;


• Frequently either constrains or determines the scenario or set of scenarios that are used;

• Is a key or major factor in determining the set of tools appropriate for the assessment; and

• Often acts as a schedule and cost driver for the analysis.

Figure 9.1. Data Taxonomy

Reuse of Data

Because of the centrality of data to the assessment, there is increasing interest in data reuse. While the amount of data potentially available for a given study is growing exponentially, the real opportunities for the reuse of data have proven to be limited because:


• The rapid change of technical data detailing the performance of systems or sub-systems means that such data must be identified by version and date;

• Assessment teams do not always know that data exist and have no easy way to find out about such "legacy" data;

• The data that are available are seldom in an easily accessible form;

• The conditions under which the data were collected are not documented;

• The definitions, languages, and measurement instruments used in different analyses vary widely; and

• There are restrictions arising from security considerations.

Despite these barriers, an effort should be made to find and reuse data. Sources that should be considered include "official sources" such as the customer, open sources such as those available on the Internet, and prior studies on similar topics. Because the data needed will seldom be available in a form and format ideal for the assessment, data engineering is often needed to gather, organise, and transform the available data.

Data generated by a given assessment phase may themselves be valuable as an input later in the research or to other project teams. To enable appropriate reusability of data, every modification, constraint, assumption, etc. has to be documented adequately. It is best practice to use metadata for this purpose.


In order to facilitate the reuse of data across community boundaries, alignment of the processes and methods for data and metadata modelling in the mid term—resulting in shareable data and metadata models and a common ontology in the long term—is necessary.

Furthermore, verification, validation, and certification (VV&C) for data becomes an issue of increasing significance. It is good practice to use certified data whenever possible.

Definition of Data Domains

Data can be directly connected to the other sections of the Code of Best Practice (COBP) by respective data domains. These data domains categorise what the information is about—and what data are available—as well as the assessment needs of the study—and what data are required. Some of these categories are:

• Scenario data: the set of data describing the scenarios and vignettes;

• Human organisational issues data: the set of data describing human and organisational issues and behaviours;

• System performance data: the set of data describing system performance in different scenarios; and

• Tool data: the set of additional data used for the tools that is not covered by any other category. Hard-wired assumptions belong here, as well as study-specific configuration parameters used for technical calibration of tools. It is good practice to be aware of the hidden data as well as the input parameters for each tool to be used within the study, especially when it is planned to build federations of tools.

Data Sources

Data can be obtained from various sources. The most common sources include:

• Official sources: sources such as military databases, governmental data, data owned by the United Nations, etc. The customer, or other stakeholders, controls the access to these data or is at least aware of the existence and structure of the data. It can be assumed that the customer will support the analyst in getting access to the data, or to a sanitised version in case the original data are not releasable to the study.

• Open sources: data sources that are neither influenced nor controlled by the customer. The Internet, commercial organisations, as well as open data sources of non-participating organisations are examples.

• Legacy study results: data sources derived from other studies conducted by the operations analysis and operations research (OA/OR) community, including political, psychological, and sociological studies and Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) tests and evaluations. They may have been delivered as the result of a former study or may have been an intermediate result.

If needed data are not available from empirical sources, subject matter experts should be used as sources to estimate needed values.

Data Classes

Data classes describe the technical aspects of datum format, datum types, and stage of processing. The data classes identified in the context of this section are the following:

• Raw data are unprocessed. Raw data may come from observations of reality or artificial reality, the product of an instrumented reality, experimental situations, or selected artificial realities.

• Processed data and information are the result of transforming one or more raw data elements into another variable. For example, one or more radar returns are transformed into a track (friendly or not).

• Aggregate data and information are properties of a collection of elements. For example, the movement of individuals on a battlefield versus the movement of a squad or a platoon.

• Statistical values1 are assessed on a sample of a population and characterise this population. These include means, modes, medians, standard deviations, and kurtosis. Statistics are often used as parameters in assessments.


• Derived data and information are outputs from a formula or simulation model that implicitly incorporates a set of assumptions. For example, loss-exchange ratios given sensor and weapons assumptions.

• Intermediate data and information are the products of one phase or component of the assessment that provide input to another phase or component.

Some C2 assessments use assumptions or preset parameters in place of data or statistics.
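The progression through these data classes can be illustrated with a small, entirely invented numerical example; the values and the final derived quantity are assumptions used only to show how each class builds on the previous one.

    import statistics

    # Raw data: unprocessed observations (e.g., times to detect a target, in seconds).
    raw_times = [4.1, 3.8, 5.0, 4.4, 4.7]

    # Processed data: raw elements transformed into another variable.
    detections_within_5s = [t for t in raw_times if t <= 5.0]

    # Aggregate data: a property of the collection rather than of individuals.
    total_observation_time = sum(raw_times)

    # Statistical values: characterise the sample drawn from the population.
    mean_time = statistics.mean(raw_times)
    stdev_time = statistics.stdev(raw_times)

    # Derived data: output of a (hypothetical) model embedding its own assumptions.
    def toy_effectiveness_model(mean_detection_time):
        return 10.0 / mean_detection_time   # invented relationship

    derived_effectiveness = toy_effectiveness_model(mean_time)
    print(mean_time, stdev_time, derived_effectiveness)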

Use of Metadata

Data that are collected without adequate documentation are frequently viewed as suspect. To avoid having good data discarded due to a lack of documentation, acceptable community standards for documentation should be employed. The data must be clearly described in a manner that is understandable to a subject matter expert not directly involved with conducting the study. The data description must be robust enough to inspire user confidence in the data. It is good practice to record these considerations in the metadata associated with the data to facilitate data use and reuse.

Data may be at any of the relevant levels of measurement—ratio, interval, ordinal, or nominal. In C2 assessments, some significant factors may be nominal. It is good practice to record all of the assumptions, constraints, and limitations in the metadata.

In general, every time data are modified or processed to create new data, appropriate metadata documentation should be provided to ensure traceability of results; verification, validation, and certification of the respective data; and reuse of the data in later phases or within other studies. This also allows dealing with the challenge of multiple instances of data in different studies.

Data and Problem Formulation

The initial data available will often be vague, uncertain, incomplete, and contradictory. Analysts usually prefer data to be sharp, certain, complete, and consistent. It is necessary to be explicit about the assumptions inherent in transforming the former into the latter.

If the assessment team finds it necessary to transform "soft" data to "hard" data in order to use tools that require "hard" data, it is best practice to record the source, perceptions, etc. in the metadata associated with the data.

Obtaining Data

It may be safely assumed that not all of the required data will be available pre-packaged for the study question. Some relevant data will likely be submerged among a pool of irrelevant information.

Not all data will be under military/government control. Data belong to the stakeholders, who are not necessarily connected to (or even friendly toward) the customer of the study. A lot of information is available from open sources. The challenge is to find, organise, verify, process, and convert it into the data needed.


The team should be aggressive and persistent in the pursuit of required data.

The data needed will seldom be available in raw form. Often, data have been transformed and aggregated. When tools have to be applied to derive data, the derived data should be tagged with explanatory metadata to record that information in a form that will facilitate their reuse.

If the data are not available and can neither be aggregated nor derived from the available sources, it is good practice to use the knowledge of subject matter experts to generate the necessary data. Increasingly, when C2 assessment teams are tasked with exploring new concepts of operation, empirical data cannot be expected to exist. It is further best practice to document this in the respective metadata and to replace such assumption-based data with empirical data as early as possible, e.g., as soon as another study delivers the needed inputs. It is recommended to re-examine the respective study results when subjectively generated data are replaced, particularly when the study result has been shown to be sensitive to that data element.

The team needs to know:

• What data are needed and how they are structured;

– Preferred data (independent variables within the MoM);

– Necessary data (the data required to drive the tools);

– Available data (derivation, extraction, collection, etc.);

• Who owns these data;


• Security issues: the possibility of declassifying or otherwise obtaining release may be another issue; and

• Costs to:

– Buy data;

– Collect data (people, time, resources); and

– Generate data.

It is essential to make the value of data clear to all levels of decisionmakers and operational planners to ensure that data collection issues are included in all phases of the study. The data collection and engineering plan, as introduced in section 4-E-4 as part of the solution strategies, has to take this into account.

The process of data acquisition is important, but data acquisition should not be emphasised over data interpretation.

There is an increasingly urgent need for data describing operations over the complete mission spectrum. It is good practice to collect and exploit operational data whenever possible. It is therefore recommended to synchronise the respective data collection and engineering plans, ensuring from the start that the desired data will be collected and archived appropriately.

The Use of Data within the Study

The archiving of data in retrievable form is essential. This is necessary both to support the ongoing study and to be of value for future study efforts. The team should establish and adopt process models that ensure the build-up of archives within a respective infrastructure. Metadata have a critical role to play in data archiving and retrieval.


The discussion of data, and the development of a community database, must be driven by agreed-upon definitions of data that are sufficient to support the community. This should be a high-priority item.

Data can be described as the "glue" that holds together the different phases of a study. As the results from one phase are transported to the next, it is unlikely that both will use identical processes and procedures; harmonised data formats will therefore allow for a smooth transition and continuity of effort.

The analyst will be faced with data in various forms and formats. In order to consider all available data, find it if necessary, and format it in the required manner (e.g., as an input parameter for the MoM or a model), data must be stored appropriately. One standard for storing data is the Information Resource Dictionary System (IRDS) [ISO 1990].

IRDS is a layered database that comprises not only the data, but also the metadata describing the meaning of the data, the format, the constraints, the intended use, the source, the degree of certainty, vagueness, reliability, etc.

A common understanding of the problem between the customer and the study group, as well as among the interdisciplinary study team, is essential for the success of the study. Therefore, a study glossary—based on a general and evolving OR glossary (e.g., the NATO JOINT Pub 1-02 [US DoD 1999])—is needed. The data definitions stored as metadata have to be aligned with the definitions found in the study glossary. This provides a basis for standardised documentation of study results in a highly reusable form that can be manipulated easily and reliably, perhaps by automated systems.

Using the right toolkit for the management of data within the IRDS, such as the Shared Data Environment (SHADE), can create the initial state of every future data-driven application (DISA, 1996; Krusche and Tolk, 2000). The same techniques and tools can and should be used for information systems delivering the needed functionality to the warfighter and decisionmaker (Tolk, 2000).

Conclusions and Recommendations

Data transcend all phases of an assessment and may be seen as the "glue" that holds the phases of an assessment together. Given their importance, resources must be committed to ensure effective data acquisition, management, and availability for reuse.

There is an urgent need to agree on standards for data, metadata, and data management, including the conditions under which the data are collected, data element definitions, metrics, etc. It is good practice to use established standards, where appropriate, such as the Source for Environmental Representation and Interchange (SEDRIS™), or to use de facto standards like Digital Terrain Elevation Data (DTED).

As the data being used today by analysts will be the data needed tomorrow by systems engineers, decisionmakers, and commanders for their operations, it is good practice to align the standardisation processes and the respective toolsets with the command and control systems community as early as possible. A significant first step in such an alignment would be using the same IRDS [NATO 1999]. A common C4I and Modelling and Simulation (M&S) community is needed to make visions like integrated alternative course of action analyses become a reality.

Chapter 9 Acronyms

C2 – Command and Control

COBP – Code of Best Practice

C4ISR – Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance

IRDS – Information Resource Dictionary System

MoM – Measures of Merit

M&S – Modelling and Simulation

OA/OR – Operations Analysis and Operations Research

SHADE – Shared Data Environment

Chapter 9 References

Defense information infrastructure (DII) shared data environment (SHADE), Capstone document. (1996). Arlington, VA: U.S. DoD, Defense Information Systems Agency (DISA). Available from http://www.disa.mil/coe/srs/srs_base/srs_csa_PDF/zsrscsa.pdf

The Federation of American Scientists (FAS) Web page on Digital Terrain Elevation Data: http://www.fas.org/irp/program/core/dted.htm


IS 10027: An information resource dictionary system (IRDS) framework. (1990).

Krusche, S., & Tolk, A. (2000, October). Information processing as a key factor for modern federations of combat information systems. Paper presented at the North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Information Systems Technology Panel (IST) Symposium: New information processing techniques for military systems, Istanbul, Turkey. Available in PDF format from http://home.t-online.de/home/Andreas.Tolk/literat.htm

Lambert, N. J. (1997). Operation Firm Endeavour: Campaign data, its collection and use. Presented at the Directorate of Land Warfare Historical Co-ordination Group Conference, Trenchard Lines, DGD&D, Upavon, UK, 28-29 April 1997. Also available from the British Library Document Supply Centre, Boston Spa, Wetherby, West Yorkshire, LS23 7BQ, United Kingdom, Reference DSC Shelfmark: 99/20363.

Mathieson, G. L., Tolk, A., Sundfoer, H. O., & Sinclair, M. (2001). Data the glue to assessment. Washington, DC: Command and Control Research Program (CCRP).

North Atlantic Treaty Organisation (NATO) Consultation, Command and Control Board (NC3B), Information Systems Sub-Committee (ISSC), NATO Data Administration Group (NDAG). (1999). Feasibility study for the NATO C3 information resource dictionary system (NC3 Information Resource Dictionary System (IRDS), AC/322SC/5-WG/3WP4). Brussels, Belgium. Available from http://www.nato.int/docu/home.htm

The report of the data working group. (1998). In Simulation technology-1997 (SIMTECH 97, Vol. III). Alexandria, VA: Military Operations Research Society.

Sinclair, M., & Tolk, A. (2002, April). Building up a common data infrastructure. NATO Symposium SAS-039 on the "Analysis of the Military Effectiveness of Future C2 Concepts and Systems," NATO C3 Agency, The Hague, The Netherlands.

The Source for Environmental Representation and Interchange (SEDRIS™) Home Page: http://www.sedris.com

Tolk, A. (2000, October). Functional categories of support to operations in military information systems. Proceedings of the North Atlantic Treaty Organisation (NATO) Regional Conference on Military Communication and Information Systems, Zegrze, Poland. Available in PDF format from http://home.t-online.de/home/Andreas.Tolk/literat.htm

Tolk, A. (2001, March). Bridging the data gap - Recommendations for short, medium, and long term solutions (01S-SIW-011). Proceedings of the Simulation Interoperability Workshop Spring 2001, Orlando, Florida. Available in PDF format from http://home.t-online.de/home/Andreas.Tolk/011_0301.pdf

Tolk, A., & Kunde, D. (2000, June). Decision support systems – Technical prerequisites and military requirements. Proceedings of the 2000 Command and Control Research and Technology Symposium (CCRTS), Monterey, CA. Available in PDF format from http://www.dodccrp.org/6thICCRTS/Cd/Tracks/Papers/Track6/038_tr6.pdf

US Department of Defense (DOD), Joint Chiefs of Staff. (1999). Dictionary of military and associated terms. Washington, DC: Greenhill Books. Available from http://www.dtic.mil/doctrine/jel/doddict/index.html

1 Statistics is a branch of mathematics dealing with the collection, analysis, interpretation, and presentation of masses of numerical data; alternatively, statistics is a collection of quantitative data. (Webster Online Dictionary).


CHAPTER 10

Risk and Uncertainty

We may at once admit that any inference from the particular to the general must be attended with some degree of uncertainty, but this is not the same as to admit that such inference cannot be absolutely rigorous, for the nature and degree of the uncertainty may itself be capable of rigorous expression.

—R.A. Fisher, The Design of Experiments


There are risks associated with the decisionmaker's situation that are an inherent part of the analysis. There are also risks related to the conduct of the analysis itself. This chapter deals with both of these sets of risks, focusing upon reducing uncertainty and other contributors to risk as well as the mitigation of their effects. Failure to deal effectively with risks will jeopardise the accomplishment of the goals of the study, namely to provide high-quality decision support. It should be noted that, by the end of the assessment, the team will generally be aware that the initial study plan will have been changed, whereas flaws in the conceptual model or in the assessment design may remain hidden for some time from both the analyst and the decisionmaker.

While the adoption of the guidance contained in this Code of Best Practice (COBP) will help minimise the risks, it will not eliminate them. This chapter discusses a number of issues related to the risks and uncertainties that the assessment team needs to explicitly address.

Risk

Risk is commonly defined as the possibility of suffering harm or loss; in other words, an exposure to harm or loss.1 This includes an opportunity loss. The difference between risk and loss is that risk inherently involves before-the-fact probability, while loss is an after-the-fact certainty. Insurance is about estimating and covering an exposure of value to uncertainty. Risk often has a negative connotation, yet "taking risks" can also be a positive act when a proper balance or trade-off is made between good and bad outcomes and the respective probabilities associated with the outcomes.

Perceptions of risk can substantially differ from more objective assessments of probability and impact. This is particularly important in command and control (C2) problems, which have a high sociological content.

Uncertainty

Uncertainty can be generally defined as an inability to determine a variable value or system state (or nature) or to predict its future evolution. Uncertainty is inherent in risk. Even if a person is certain about the possible outcomes and the probabilities associated with these outcomes, there is uncertainty about which outcome will in fact occur. When this situation applies, we can say that we have a known risk. For example, the possible outcomes and the probabilities associated with the toss of a coin (and many other popular forms of gambling) are known. What is uncertain is the outcome. Hence there is a risk, in this case a known risk, associated with obtaining a particular outcome.

There are other types of uncertainty, including uncertainty about the possible outcomes (or their values) and uncertainty about the probabilities associated with these outcomes. Uncertainties arise from other uncertainties, namely uncertainties about the potential actions of others or what is referred to as "states of nature" and their associated probabilities. In C2-related assessments, in particular, perceptions of uncertainty may substantially differ from a more objective assessment of lack of knowledge. There are also a number of uncertainties that are associated with the assessment itself. These are discussed later in this chapter.

Dealing with Risk

There are three basic ways to deal with risk. The first is to reduce the uncertainty that underlies the risk. The second is to mitigate risk by developing and selecting risk-averse strategies. The third is to effectively communicate the nature of the risk involved.
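The trade-off between expected gain and exposure to a bad outcome, which underlies the selection of risk-averse strategies, can be sketched in a few lines. The options, payoffs, and probabilities below are invented for illustration only.

    # Each course of action is described by (probability, payoff) pairs.
    strategies = {
        "option_A": [(0.7, 10.0), (0.3, -10.0)],   # higher expected value, worse downside
        "option_B": [(0.9, 4.0), (0.1, -2.0)],     # lower expected value, limited downside
    }

    def expected_value(outcomes):
        return sum(p * payoff for p, payoff in outcomes)

    def worst_case(outcomes):
        return min(payoff for _, payoff in outcomes)

    for name, outcomes in strategies.items():
        print(name, "expected:", expected_value(outcomes), "worst case:", worst_case(outcomes))

    # A risk-neutral rule would pick the highest expected value (option_A here);
    # a risk-averse rule would pick the best worst case (option_B).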

Reducing Uncertainties

In effect, all methods for reducing the uncertainties that underlie risk involve the collection or analysis of information. This is, of course, a major focus of an assessment. Clearly the assessment team needs to identify the most important uncertainties—a combination of the degree of uncertainty and the consequences of that uncertainty. However, there will always be some significant residual uncertainties. The aim of the assessment team, at this point, is to reduce the risk associated with the study by learning more about the robustness (or lack thereof) of the study findings and conclusions. Sensitivity analysis is one method to accomplish this purpose.

Sensitivity Analysis

The goals of sensitivity analysis are fourfold:

• To establish the regions for which established results are valid;


• To isolate those factors that contribute most to the uncertainties that exist and identify the risks associated with these uncertainties as they relate to study findings and conclusions;

• To make study results more robust for decision support by allowing the decisionmaker to see the immediate consequences as external factors are changing; and

• To give the decisionmaker a richer understanding of the decision problem, highlighting the consequences of limited changes relative to a solution proposed by the study.

The team should take a three-step approach:

• Identify the variables that are associated with the greatest combination of risk and uncertainty;

• Perform sensitivity analyses that vary across the more likely region of key parameters (often the neighbourhood of the initial estimate) to see how the result is influenced. A sensitivity analysis investigates the main region of the output space; and

• Investigate the extremes of the same output space, focusing most strongly on the negative regions.

The assessment team should also adopt a partially inverted approach, looking for possible failures and then seeking the possible sources of each kind of failure.

Together, these three steps should provide the team with a better understanding of residual uncertainties and associated risks. The need for and results of sensitivity analyses must be stressed in discussions with the decisionmakers. This will help avoid the under-resourcing of this critical activity.
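The parameter-variation steps above can be sketched in code. The stand-in model, the parameter, and the ranges are assumptions chosen only to show the mechanics of sweeping a key parameter around its initial estimate and reporting how the output responds.

    # Stand-in for a real C2 model or federation of tools.
    def mission_outcome(detection_range_km, decision_delay_min):
        return detection_range_km / (1.0 + decision_delay_min)

    baseline = mission_outcome(40.0, 5.0)

    # Step 2: vary the key parameter across the more likely region around the estimate.
    for delta in (-0.2, -0.1, 0.0, 0.1, 0.2):
        value = mission_outcome(40.0 * (1 + delta), 5.0)
        change = (value - baseline) / baseline
        print(f"detection range {delta:+.0%}: outcome change {change:+.1%}")

    # Step 3: probe the extremes of the output space, focusing on the negative region.
    print("extreme low case:", mission_outcome(40.0 * 0.25, 15.0))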

Uncertainty and Risk in C2 Assessments

Uncertainty and the risks associated with it can not be totally eliminated in any real-world C2 assessment problem. Moreover, in most real-life problems, one cannot even identify all of the unknowns. This has given rise to the term the "unknown unknowns." Strategies to minimise risk have evolved to handle these facts of life. Therefore, even the best possible assessment approach will result in residual uncertainty and risks.

In general, assessments should be judged by their ability to reduce uncertainty so that the decisionmaker is in a better position (less risk) after the assessment than before the assessment. Looking at the absolute uncertainty that remains is not as useful a measure since it may say more about the nature of the problem than the success of the assessment.

Dealing with Uncertainty

It is important to treat uncertainty consistently and explicitly. This allows information from two given sources or results to be fused (e.g., by taking the most precise assessment of each factor from the two results). Thus the resulting knowledge will be better than either of the two separate results. On the other hand, if uncertainty is not treated explicitly or consistently, the best one can do is to pick the single result that seems best. This makes it more difficult for a study to add value for a decisionmaker.

C2 issues are complex, and it is an understatement to say they are incompletely understood. C2 requirements and solutions tend to depend heavily on the nature of the operating environment. Thus, a C2-related study will rely on factors that are imprecisely determined and change frequently over time. Almost regardless of the assessment effort, parts of the problem-space will have been investigated less thoroughly than would be ideal. Sensitivity analyses are required to give high-quality study results.

C2 assessment problems (particularly in operations other than war (OOTW) contexts) generally have many interacting factors. It is unwise to rely upon single-factor sensitivity analysis (i.e., testing sensitivity to one factor at a time). Multi-factorial experimental design methods are good practice in such circumstances (Keppel, 1973).2
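A minimal sketch of a multi-factorial design is shown below: every combination of factor levels is run, so that interactions between factors remain visible, rather than varying one factor at a time. The factor names and levels are illustrative assumptions.

    from itertools import product

    factors = {
        "sensor_quality": ["low", "high"],
        "comms_capacity": ["degraded", "full"],
        "decision_style": ["centralised", "decentralised"],
    }

    # Full factorial design: 2 x 2 x 2 = 8 cases.
    for run_id, levels in enumerate(product(*factors.values()), start=1):
        case = dict(zip(factors.keys(), levels))
        # In a real study, each case would be run through the model or exercise
        # and the resulting MoM recorded for analysis of main effects and interactions.
        print(run_id, case)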

OOTW studies typically have less well-formed quantitative factors and more qualitative factors. The "softer" nature of these factors makes assessment more difficult. The types of factor seen in OOTW problems include:

• Social and political activity impacting the tactical level;

• A strong influence of negotiation and persuasion as opposed to coercion;

• Non-optimal performance of military capabilities from a technical perspective due to their poor fit to the problem;

• Severe rules of engagement (ROE) constraints; and


• Unclear or evolving goals and objectives.

The team must be aware of the assumptions and limitations included in models, scenarios, and data structures, which should be captured in the metadata. In particular, it should be noted that humans involved in C2 experiments (as analysts or subjects) always bring assumptions with them. These need to be identified and collected to form an audit trail. This is one place where the project leader's log of assumptions and decisions made during the assessment pays off. Uncertainty over the validity of these assumptions and limitations provides a source of uncertainty in study results to which study conclusions must be made robust.

In C2 assessments, in particular, all aspects of a study and study-problem may be connected to uncertainty. Thus, different sorts of uncertainty should be addressed explicitly at appropriate stages in the study. Examples of these are:

• Parameter value uncertainty—many of the parameters and factors in C2 assessment are difficult to evaluate confidently;

• Model-based uncertainty—i.e., over whether underlying models are valid and representative;

• Uncertainty of focus—i.e., over whether the assessment covers all of the important factors and/or issues (this includes uncertainty of scenarios); and

• Complexity of uncertain factors (i.e., their dimensionality)—when a sufficiently complex factor (e.g., scenarios or future technology) is uncertain, the team cannot expect to survey the set of all possible true states.

Uncertainty in complex factors, such as scenarios, should be addressed thoroughly. Even though it is philosophically impossible to know everything about a problem, adequately complete knowledge can be better assured by the explicit use of checklists that highlight the breadth of factors typically involved in C2 assessments, such as:

• Technology—disruptive uncertainty and disruptive innovation;

• Organisational use of technology;

• Scenarios, tasks, and nature of operations;

• Data;

• Context or environment of the assessment; and

• Co-evolution of factors (e.g., summarised in doctrine, organisation, training, materiel, leadership, personnel, and facilities [DOTMLPF]).

Nonetheless, the team should not rely completely on checklists, but rather complement them with critical thinking in the specific study context.

Within electronic systems, organisations, and battle concepts, there is a lot of opportunity for disruptive technology, producing substantial uncertainty. In human-in-the-loop experiments there is a greater danger of bias in subjective judgements.3


In considering sensitivity analysis, it is important not to associate it only with statistical variance in parameter values. Qualitative considerations, such as variation of model, perspective, or assumptions (i.e., categorical variations), should also be used to test for and assess sensitivity. Variations in ranking (ordinal variation) can also be a powerful tool for sensitivity analysis. Other analytic tools and constructs relevant to sensitivity analysis are:

• Non-parametric statistics (Siegal and Castellan, 1988);

• Belief-functions (as an alternative to probability);

• Judgmental uncertainties (Wilson and Corlett, 1990);

• Fuzzy numbers, theories of semiorders, scoring criteria (Siegal and Castellan, 1998);

• Multi-dimensional scaling; and

• Mathematics applied to non-ordinal scales.

Semiorders constitute the intermediate level between ordinal information and value. They apply to many fields, but in this context especially to scored data or preference data, in that account is taken of the intervals of imprecision around the measuring systems used. Analysts should be aware of the thresholds associated with the collected data, above which differences can be legitimately distinguished to produce values. This idea is very useful when the measurements cannot be repeated (as in statistical theory) (Prilot and Vincke, 1997).

Keeping C2 assessment rigorous and robust in the face of the many uncertainties and complexities of the subject matter, as well as the need to use a rich combination of methods, can be difficult. Again, it is good practice to use checklists and structured appraisal, in this case to maintain an objective view of study rigour. The choice of checklist or appraisal structure can depend upon personal preferences, but Annex F lists a number of structures that have proved useful in the experience of the nations contributing to this COBP. These include:

• Repeatability, independence, grounding in reality, objectivity of process, uncertainty, and robustness (RIGOUR); and

• Strengths, weaknesses, opportunities, and threats (SWOT).

Where an assessment uses experimentation or observation of exercises, it is important to identify independent and dependent variables and, for the former, which are controllable, which are measurable, etc. Figure 10.1, for example, illustrates the variety of variables that need to be considered when studying decisionmaking.

A thorough understanding of the variables of a study is essential for the effective treatment of uncertainty.


Figure 10.1. Illustration of the Variety of Variables Relevant to Decisionmaking

Risk-Based Analysis

In cost-benefit or cost-effectiveness analyses there has historically been a tendency to focus on the single most likely value (its expected value) for each input factor (including scenario and course of action) and to "solve" the problem based upon these point estimates. This can lead to fragile solutions, which do not provide decisionmakers with help in dealing with the uncertainties and associated risks inherent in the real problem. A risk-based approach can overcome some major pitfalls by adding a focus on the multiplicity of possible outcomes and opening up the possibility of richer solutions involving portfolios of action that produce robustness rather than narrow optimality. Portfolio-based solutions can be associated with cost-benefit approaches, but this has not been common in practice.

The subject of C2 assessments is typically both highly uncertain and opaque in nature. The team should, therefore, expect a complex and partly hidden set of risks to C2 studies and C2-related decisions. It is recommended that serious efforts be made to illuminate the risks, and it is good practice to include an explicit risk-based analysis in the study and in study planning.

Different people have different worldviews and different approaches to risk taking. Analysts should seek to find out how the study sponsor trades risks against expected gain so that the sponsor's worldview can be appropriately represented in the assessment.

Risk-based analysis needs metrics for risks and failure as well as for success and benefits, which means that one needs a way of expressing various levels of all of these dimensions.

In C2 assessments, analysts need to be particularly alert to the possibility of chaotic behaviours arising from dynamic interactions. Human and organisational factors are particularly prone to this type of instability. A sound and explicit treatment of boundaries and system definitions during problem formulation is essential to managing this aspect of the assessment. Holistic systems thinking and complexity-based analysis may be needed for this purpose.

When dealing with problems involving human decisionmaking, the analyst must be aware of the diversity of courses of action that are possible as the situation evolves. The analyst must ensure that these courses of action are represented in a way that allows for the possibility of a discontinuous set of possible future states. If wide divergence in course of action selection is possible over the timescales under consideration, then the treatment of scenarios may be particularly problematic and may require explicit consideration under the risk and sensitivity heading.

Managing Study Risk

C2 assessments are inherently complex. They often contain poorly understood study problems. These factors increase the level of risk in the design and conduct of the assessment.

It is good practice to try to make a complete list of risks and then treat them in appropriate detail. A top-down approach may be useful in assuring a certain degree of completeness. Independently of the depth of each concrete study, it is good practice to use a risk perspective to explicitly assess the robustness of a conclusion or recommendation. If possible, one should try to keep track of which mechanisms underlie each risk, the probability that it will occur, and how it can be mitigated, then consider the cost of mitigation and the cost of risk impact, before taking a cost-benefit approach in managing the study risk level.
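
As one way of making this bookkeeping concrete, the sketch below is illustrative only: the risks, probabilities, and costs are invented and the structure is far simpler than the GRR. It compares the expected loss of each study risk with and without its mitigation, which gives a simple cost-benefit test for whether to mitigate or accept it.

```python
# Illustrative sketch only: a minimal study-risk register with a simple
# cost-benefit test for each mitigation. All entries and numbers are invented;
# a real register would be project-specific.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    probability: float            # chance the risk occurs during the study
    impact_cost: float            # cost (e.g., person-days) if it occurs unmitigated
    mitigation_cost: float        # up-front cost of the mitigation
    residual_probability: float   # chance of occurrence after mitigation

    def expected_loss(self, mitigated: bool) -> float:
        p = self.residual_probability if mitigated else self.probability
        upfront = self.mitigation_cost if mitigated else 0.0
        return upfront + p * self.impact_cost

register = [
    Risk("Too few study iterations planned", 0.6, 40.0, 5.0, 0.2),
    Risk("Narrow selection of methods biases conclusions", 0.4, 60.0, 10.0, 0.1),
    Risk("Key data source unavailable", 0.3, 30.0, 8.0, 0.15),
]

for r in register:
    unmitigated = r.expected_loss(mitigated=False)
    mitigated = r.expected_loss(mitigated=True)
    decision = "mitigate" if mitigated < unmitigated else "accept"
    print(f"{r.description[:45]:45s} expected loss "
          f"{unmitigated:5.1f} -> {mitigated:5.1f}  [{decision}]")
```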

C2 problems are often weakly bounded. There is a particular risk associated with problem formulation and the identification of factors. Annex F lists a number of checklists that have proved useful in ensuring an adequately complete treatment of the multiple risk areas.

The Generic Risk Register

The generic risk register for C2 assessment (GRR) is a good starting point for a risk-based analysis of a study project (not the decision problem supported by the study). The lists of risks, consequences of impact, and mitigation recommendations are directly derived from the COBP. They are, therefore, generic in their form and should only be taken as a starting point for a project-specific risk analysis. The GRR has functionality that allows it to be used as a basic tool, or the list of risks can be copied into a more elaborate tool used for risk handling in the project.

An illustrative example is the case study undertaken by the SAS-026 study group, when a brief journey of only an hour through the generic risk register turned out to be very useful. Although the study team was well aware of the advice and possible pitfalls in advance, explicitly addressing them from a risk perspective led to the recognition of two significant flaws:

• The low number of planned iterations created a risk of an inefficient and unfocused study with possibly misleading results; and

• The relatively narrow selection of methodological approaches entailed a risk of misleading conclusions. There could be important consequences of varying the C2 system that were not reflected in the study, and the possibly biased representation would represent a hidden flaw in the conclusions.

The example illustrated both the usefulness of making the risk assessment explicit, since these problems were actually known to all participants prior to the risk management session, and the fact that even scratching the surface (as was the case here) may lead to significant results. It is, therefore, advisable not to skip risk analysis even when time and resources are limited.


Communication of Risk and Uncertainty

The high level of uncertainty (and hence risk) in C2 problems and their assessment means that the communication of risk and uncertainty to study customers, sponsors, and stakeholders is of particular importance. The value of a quality assessment is that it provides decisionmakers with the evidence they need to make better decisions. The nature and quality of evidence required depend upon the decisionmaker’s approach to and tolerance for risk-taking and his/her level of prior knowledge of the problem area being assessed.

Communication is about giving the receiver of a message the right impression, not about formulating a statement that is formally correct on its own. This might seem obvious, but in communicating uncertainties and risk it should be given particular attention, since the complexities of the subject, human limitations in reflecting on uncertainty, and the lack of a common set of concepts (and also hidden agendas) will often make communication far less than perfect.

As discussed earlier, some uncertainty can be reduced by analysis. However, some uncertainty is inherent in the problem and needs to be exposed to the decisionmaker. A failure to do this can lead to false confidence in study conclusions. C2 assessments, in particular, will contain many areas of unresolvable doubt and uncertainty. These should be openly and honestly communicated to decisionmakers to avoid misinterpretation of study conclusions. Support to decisionmaking under uncertainty is a vital complementary activity to C2 assessment.


Different ways of framing results and uncertainties may strongly influence the way results are perceived. This should be considered thoroughly to assure compliance with ethical standards. One should be aware that stakeholders (including customers) might have a tendency to gloss over, or alternatively over-focus on, uncertainties. An analyst should take care to communicate an objective impression of risks and uncertainty.

Limitations and shortcomings in a study are a crucial part of the study result and should be communicated as effectively as possible. This enables alignment of study results and background knowledge on the problem.

Human ability to understand and reason about uncertainty is limited (Kahneman et al., 1982). These limitations should be given particular attention when communicating risk and uncertainty to stakeholders and decisionmakers.

People with different backgrounds will have different concepts of uncertainty (e.g., people without some mathematical background won't necessarily intuitively understand Bayesian concepts). Thorough dialogue may be needed to find a common language. Visualisation techniques will be helpful in this regard since they are usually more powerful than verbal reference to abstract concepts.

One should be careful not to overwhelm an audience with details on uncertainties and possible shortcomings. However, a continuing dialogue about uncertainty will facilitate a common understanding. Also, the analysis team should be aware of the possibility that residual uncertainties may make it impossible to draw robust conclusions.


Conclusions

The explicit treatment of risk and uncertainty is best practice in all studies and is of particular importance in C2 assessment. A variety of dimensions and aspects of C2 assessments carry risk and uncertainty, particularly because they are liable to include complex, interacting factors. Even when study resources are limited, it is best practice to include an assessment of the most likely outcome, to do sensitivity analyses looking for other likely outcomes, and to take a risk-based approach looking for the more extreme possible outcomes (in particular failures).

The use of checklists is recommended to ensure a rigorous treatment of risk and uncertainty. The best choice of checklist depends upon personal preference, but a number of examples are presented that have been found useful by the nations contributing to this COBP. Additionally, the GRR has proved useful as an aid to C2 study risk management.

Chapter 10 Acronyms

C2 – Command and Control

COBP – Code of Best Practice

DOTMLP – Doctrine, Organisation, Training, Material, Logistics, Personnel

DOTMLPF – Doctrine, Organisation, Training, Material, Leadership, Personnel, Facilities

GRR – Generic Risk Register


METT-TC – Mission, Enemy, Terrain, Troops, Time, and Civil considerations

OOTW – Operations Other Than War

PESTLE – Political, Economic, Social, Technological, Legal, and Environmental

RIGOUR – Repeatability, Independence, Grounding in reality, Objectivity of process, Uncertainty, and Robustness

ROE – Rules of Engagement

SWOT – Strengths, Weaknesses, Opportunities, Threats

Chapter 10 References

Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. New York: Cambridge University Press.

Keppel, G. (1973). Design and analysis—A researcher’s handbook (ISBN 0-13-200030-X). Englewood Cliffs, NJ: Prentice-Hall.

Kruskal, J. B., & Wish, M. (n.d.). Multidimensional scaling. In E. M. Uslaner (Ed.), Quantitative Applications in the Social Sciences (Series 07–011, ISBN 0-8039-0940-3). Beverly Hills, CA: SAGE Publications.

Mathieson, G. L., Tolk, A., Sundfoer, H. O., & Sinclair, M. (2001). Data the glue to assessment. Washington, DC: Command and Control Research Program (CCRP).


Pirlot & Vincke. (1997). Semiorders: Properties, representations, applications (ISBN 0-7923-4617-3). New York: Kluwer Academic.

Siegel & Castellan. (1998). Non-parametric statistics for the behavioral sciences (2nd ed., ISBN 0-07-100326-6). New York: McGraw-Hill.

Wilson, J., & Corlett, E. (Eds.). (1990). Evaluation of human work: A practical ergonomics methodology (2nd ed.). London: Taylor & Francis.

Winer, B. J. (1962). Statistical principles in experimental design. New York: McGraw-Hill.

1. This interpretation of risk is used throughout this section. However, other common definitions have been included at Annex E as an aid to common understanding.
2. Tom Lucas’ work in Project Albert.
3. These types of experiment are typically unrepeatable due to resource constraints.


CHAPTER 11

Products

We are what we repeatedly do. Excellence, then, is not an act, but a habit.

—Aristotle

The purpose of study products is threefold: to communicate results to sponsors and stakeholders; to provide a lasting record of what went into the planning; and to establish credibility within the technical community. Verbal communication and progress reports may be necessary, especially for a short study; however, a thorough written record is essential to the credibility and longevity of study results.


Study products that are delivered to the customer generally include a study plan, periodic status/progress reports, and a final report. Several other products may be produced and delivered to the customer. Some study products are created and maintained primarily for internal study support. These products are not unique to C2 OOTW studies or to general C2 studies.

Products that are typically produced from a study include:

Study Plan

The study plan described here is a subset of the Study Management Plan of section 4-E. It generally includes the initial problem formulation and solution strategy, emphasizing a general understanding of the problem, deliverables, budget, time line, and solution approach. The study plan includes at a minimum:

• Statement of the problem (problem formulation)—“the what”;

• Solution strategy “the how and the when”;

– Tasks and their relationships; and

– Milestones.

Periodic Status/Progress Reports

Periodic status reports describe the activities of the most recent period and the expected activities of the next period, and link the activities to the tasking statement. Status reports also contain cost information and track adherence to the planned schedule. One of the most important sections of the status report is the “Problems Encountered” section. This section should include technical problems, budgetary problems, and most importantly problems relating to sponsor/assessment team relations. These reports may be delivered:

• To the sponsor and other stakeholders; and

• To the peer review team.

Final Report

The final report contains sections that address the following:

• Objectives (customer question and the problem formulation);

• Scope and assumptions;

• Approach (solution strategy);

• Findings/conclusions (with caveats);

• Recommendations (optional);

• Future challenges (optional);

• Appendices;

– Data collection instruments or discussion of instrumentation (optional);

– Data dictionary (optional);

– Data (optional);

– Glossary of terms/acronyms; and

– References.


The findings/conclusions and recommendations (if present) should address each of the objectives.

The final report should be produced as an archivable document that can be readily accessed by the community. This means that the document is produced in electronic form using commercial standards. Ideally these documents should be made available through web sites. However, it is recognised that security considerations and language will limit the availability of many documents. Briefings are a useful form of communication. It is desirable that a briefing accompany the more formal final report. However, when only a briefing is produced, it must be annotated.

Command and control (C2) problems tend to be multi-dimensional and highly complex. Therefore, they pose unique challenges to the assessment team in communicating the result effectively to the decisionmaker. There are several steps that can be taken to facilitate this communication. First, if the product is an architecture, it is recommended that the templates developed by the community to depict alternative architectural perspectives (e.g., operational, system, and technical architectures) be employed. Second, it has proven useful to employ “stop light” charts (red, yellow, and green) to characterise measures of performance/effectiveness. However, such techniques are useful only in conveying qualitative insights. A tendency to rely heavily upon this presentation technique may result in assessments that do not drive to quantitative results. Third, there is a need to develop and employ visualisation techniques to capture the full richness of the insights derived in assessments, particularly risk and uncertainty (e.g., depicting the distribution rather than just a summary statistic). Preliminary research is underway in this area. It needs to be extended and translated into application.
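
One minimal way of showing a distribution rather than a single value is sketched below. It is illustrative only: it uses matplotlib, and the per-replication outcomes are simulated stand-ins rather than real study results.

```python
# Illustrative sketch only: presenting the distribution of a measure of
# effectiveness rather than a single point value. The outcome data are
# simulated stand-ins for per-replication study results.
import random
import matplotlib.pyplot as plt

random.seed(2)
outcomes = [random.gauss(0.6, 0.12) for _ in range(1000)]
outcomes.sort()
p05, p50, p95 = (outcomes[int(len(outcomes) * q)] for q in (0.05, 0.50, 0.95))

plt.hist(outcomes, bins=30, color="lightgrey", edgecolor="grey")
for value, label in ((p05, "5th pct"), (p50, "median"), (p95, "95th pct")):
    plt.axvline(value, linestyle="--")
    plt.text(value, plt.ylim()[1] * 0.9, label, rotation=90, va="top")
plt.xlabel("Measure of effectiveness")
plt.ylabel("Number of replications")
plt.title("Distribution of outcomes (not just a single point estimate)")
plt.savefig("moe_distribution.png")
```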

Because C2 data are rare, every effort should be made to retain the data and make it available to other recorders. Sometimes others will require “sanitising” the data to prevent anyone knowing which units or exercises produced it. When archiving data, the metadata labels that identify the conditions under which it was collected should also be preserved.
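
A lightweight way of keeping such metadata with the archived data is sketched below; the field names and directory layout are hypothetical and would normally follow whatever data dictionary and archive conventions the study adopted.

```python
# Illustrative sketch only: storing metadata alongside archived C2 data so
# the collection conditions are preserved. Field names and paths are invented.
import json
from pathlib import Path

metadata = {
    "collection_event": "command post exercise",   # sanitised description
    "collection_dates": "1999-05",
    "echelon": "division headquarters",
    "instrumentation": "observer checklists, message logs",
    "scenario_summary": "peace support, multinational",
    "sanitised": True,        # unit and exercise identifiers removed
    "caveats": "observers present in only two of four cells",
}

archive_dir = Path("archive/c2_study_0042")       # hypothetical archive location
archive_dir.mkdir(parents=True, exist_ok=True)
# The data files themselves (e.g., CSVs of observations) would sit next to this.
(archive_dir / "metadata.json").write_text(json.dumps(metadata, indent=2))
```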

A peer review process is an essential part of producing a final report. It should begin with the preparation of the study plan and continue throughout the life of the assessment. Draft products must be provided to reviewers in ample time for them to review and comment on the product and for the team to reflect their comments. A failure to institute an adequate review process can compromise the quality, credibility, and utility of the assessment.

Other Delivered Products

Several products may be created and delivered, depending on the needs of the project:

• Description of the assessment participants (including assessment team) and their relationships (defined in 2-B);

• Description of the budget and time constraints to provide context information for future studies;

• Human and Organizational Factors Checklist;

• Scenario details;

• Video and audio presentations;

• Created models, spreadsheets, decision support tools, etc.;

• Simple models and tools (e.g., for the sponsor and other parties to interactively explore interrelationships between the variables of the study); and

• Experimentation Campaign Plan (if the C2 Assessment makes use of a series of linked events such as seminars, wargames, command post exercises (CPX), field training exercises (FTX), etc.).

Other Products

In addition to study products that are delivered to the sponsor there are a number of products that best practice demands be produced and maintained during the course of a study. These include:

• A project journal;

• Study Management Plan (defined in 4-E);

• Data collection plan;

• Data analysis plan; and

• Study Glossary.

Conclusion

A study is generally appraised based on the quality of its study products. This Code of Best Practice aims to highlight important areas that will improve both the assessment process and the quality, longevity, and utility of the study products. The goal is to make the state of practice one and the same with the state of the art.

Of all the communities available to us there is not one I would want to devote myself to, except for the society of the true searchers, which has very few living members at any time…

—Albert Einstein


ANNEX A

References

Adelman, L. (1992). Evaluating decision support and expert systems. Chichester, UK: John Wiley and Sons Ltd. [Chapter 5]

Alberts, D. S. (1996). The unintended consequences of information age technologies: Avoiding the pitfalls, seizing the initiative. Washington, DC: National Defense University. [Chapter 1] [Chapter 6]

Alberts, D. S., & Czerwinski, T. J. (1997). Complexity, global politics and national security. Washington, DC: National Defense University. [Chapter 8]

Alberts, D. S., & Hayes, R. E. (1995). Command arrangements for peace operations. Washington, DC: National Defense University. [Chapter 1] [Chapter 6]

Allen, P. M. (1988). Dynamic models of evolving systems. System Dynamics Review, 4, 109-130. [Chapter 8] Available to purchase from http://www.albany.edu/cpr/sds/publications.htm

Almèn, A., DeBayser, O., Eriksson, P., Jarlsyik, H., Lascar, G. & Ritchey, T. (1999). Models for evaluating peace support operations. A FOA–DGA/DSP/Centre d’Analyse de Défense Joint Workshop. [Chapter 7]

Baeyer, A. V. (2000). Human Factors in militärischen Führungsprozessen für die simulation [Human factors in military guidance processes for the simulation] (B-IK 3103/01). München, Germany: Industrieanlagen-Betriebsgesellschaft (IABG). [Chapter 6] Available from http://www.unibw-muenchen.de/campus/itis/projekte.html

Baeyer, A. V. (2001). Task analysis, training and simulation for operations other than war (OOTW). Paper presented at the North Atlantic Treaty Organisation (NATO) Specialists Meeting on Human Factors in the 21st Century, Paris. [Chapter 6]

Bailey, T. J. (2001). Multiple Runs of a Deterministic Combat Model (Technical Document 01-001). Fort Leavenworth, KS: U.S. Army Training & Doctrine Command (TRADOC) Analysis Center. [Chapter 8]

Bailey, T. J., & Clafin, R. A. (1996). Information operations in force on force simulations (US Army TRAC Technical Memorandum TM-0196). Fort Leavenworth, KS: U.S. Army Training & Doctrine Command (TRADOC) Analysis Center. [Chapter 8]

Bares, M. (1996). Pour une prospective des systèmes de commandements [For a futurology of the systems of commands]. Polytechnica. [Chapter 8]


Bellman, R. E., & Zadeh, L. A. (1970). Decision-making in a fuzzy environment. Management Science, 17(4), 141-164. [Chapter 8]

Bijker, W. E. (1990). The Social Construction of Technology. PhD Thesis University of Twente, Eijsden, The Netherlands.

Bornman, L. G., Jr. (1993). Command and control measures of effectiveness handbook (C2 MoCE handbook) (TRAC-TD-0393). Fort Leavenworth, KS: U.S. Army Training & Doctrine Command (TRADOC) Analysis Center. [Chapter 5]

Brown, R. V., Kahr, A. S., & Peterson, C. (1974). Decision analysis: An overview. New York: Holt, Rinehart and Winston. [Chapter 5]

Canadian Department of National Defence Document B-GG-005-004/AF-023. (1999). Civil-Military cooperation in peace, emergencies, crisis and war. [Chapter 5] Available on-line at http://www.dnd.ca/eng/index.html

Candan, U. and Lambert, N. J. (2002). Methods used in the Analysis and Evaluation of the Immediate Reaction Task Force (Land) Command and Control Concept. TN 897. NATO C3 Agency, PO Box 174, 2501 CD, The Hague, The Netherlands.

Checkland, P. (1999). Systems thinking, systems practice. Chichester, UK: John Wiley and Sons Ltd. [Chapter 3]


Cleman, R. T. (1990). Making hard decisions: An introduction to decision analysis. Pacific Grove, CA: Duxbury Press. [Chapter 4]

Code of Best Practice (COBP) on the assessment of command and control (Research & Technology Organisation (RTO) Technical Report 9 AC/323(SAS)TP/4). (1999). Neuilly-Sur-Seine Cedex, France: North Atlantic Treaty Organisation (NATO) RTO. [Chapter 8] Available from http://www.rta.nato.int/RDP.asp?RDP=RTO-TR-009

Coyle, R. G., & Yong, Y. C. (1996). A scenario projection for the South China Sea. Further experience with field anomaly relaxation. Futures, 28(3). [Chapter 7]

Darilek, R. (2001). Measures of effectiveness for the information age army. Santa Monica, CA: Rand Arroyo Center. [Chapter 6] Available from http://www.rand.org/publications/MR/MR1155/

De Maio, J. (Ed.). (2000). The challenges of complementarily [Report on the fourth workshop on protection for human rights and humanitarian organisations]. Geneva, Switzerland: International Committee of the Red Cross. [Chapter 3]

Defence Science and Technology Laboratory (DSTL). (2000). A guide to best practice in C2 assessment: Version 1. Farnborough, United Kingdom. [Chapter 3]

Defence Systems. (1982). Theatre headquarters effectiveness: Its measurement and relationship to size, structure, functions, and linkages; Vol. 1: Measures of effectiveness and the headquarters effectiveness assessment tool. McLean, VA: Author. [Chapter 5]

Defense information infrastructure (DII) shared data environment (SHADE), Capstone document. (1996). Arlington, VA: U.S. DoD, Defense Information Systems Agency (DISA). [Chapter 9] Available from http://www.disa.mil/coe/srs/srs_base/srs_csa_PDF/zsrscsa.pdf

Defining and Measuring Success in the New Strategic Environment [Technical Cooperation Program Panel JSA-TP-3] Proceedings Report (1999) Ottawa, Ontario, Canada.

Depuy, T. (1979). Numbers, prediction and war: Using history to evaluate combat factors and predict the outcome of battles. Indianapolis, IN: Bobs-Merrill. [Chapter 6]

Deutsch, R., & Hayes, R. E. (1990). Personality and social interaction of effective expert analysts: The issue of cognitive complexity [technical paper]. [Chapter 6]

Deutsch, R., & Hayes, R. E. (1992). The war story in warfighting situation assessment and decisionmaking [technical paper]. [Chapter 6]

Dodd, L. (1995). Adaptive C2 modeling for RISTA effectiveness studies. Second International Command & Control Research & Technology Symposium Proceedings. Washington, DC: National Defense University. [Chapter 6]


Duff, J. B., Konwin, K. C., Kroening, D. W., Sovereign, M. G., & Starr, S. H. (1992). In T. J. Pawlowski (Ed.), Command, control, communications, intelligence, electronic warfare measures of effectiveness [Final Report]. Fort Leavenworth, KS: Military Operations Research Society (MORS) workshop. [Chapter 5]

Elzen, B., Ensenrink, B. & Smit, W. A. (1996). Socio-Technical Networks: How a Technology Studies approach may help to solve problems related to technical change. [Social Studies of Science, vol 26, pp. 95-141]

Engel, R., & Townsend, M. (1991). Division Command And Control Performance Measurement. Toronto, Ontario, Canada: Defence and Civil Institute of Environmental Medicine (DCIEM). [Chapter 5]

Entretiens Sciences & Défense. (1998). Nouvelles avancées scientifiques et techniques, tome 1 [New scientific and technical advances, Volume 1]. DGA/CHEAr/D.R.E.E. [Chapter 7]

The Federation of American Scientists (FAS) Web Page on Digital Terrain Elevation Data: http://www.fas.org/irp/program/core/dted.htm

Fisher, R. A. (1951). Design of experiments (6th ed.). Ontario, Canada: Oliver and Boyd, Ltd. [Chapter 3]

Fliss, R., Feld, R., Geisweidt, L., Kalas, W. D., Kirzl, J., Layton, R., et al. (1987). Army command and control evaluation system user’s handbook (MDA903-96-C-0407). Ft. Leavenworth, KS: Army Research Institute for the Behavioral and Social Science. [Chapter 5]

Flood, R. L., & Jackson, M. C. (1991). Creative problem solving: Total systems intervention (ISBN 0471930520). Chichester, UK: John Wiley and Sons Ltd. [Chapter 3]

Gass, S. I., & Harris, C. M. (Eds.), Encyclopedia of operations research and management science (2nd ed.). New York: Kluwer Academic. [Chapter 4]

Godet, M. (1998). Manuel de prospective stratégique, tome 1, Une indiscipline intellectuelle, tome 2, L’art et la méthode [Strategic prospective handbook, Volume 1. An intellectual indiscipline, Volume 2. The method and art]. Paris: Dunod. [Chapter 7]

Halpin, S. M. (1992). Army command and control evaluation system (ACCES): A brief description (Working paper LV-92-01). Ft. Leavenworth, KS: US Army Research Institute for the Behavioral and Social Sciences. [Chapter 5]

Hartley, D. S., III. (1996). Operations other than war: Requirements for analysis tools research report (K/DSRD-2098). Oak Ridge, TN: Lockheed Martin Energy Systems. [Chapter 4] [Chapter 8] Available from http://www.msosa.dmso.mil/ootw.html

Hartley, D. S., III. (2001). Battle modeling. In S. I. Gass & C. M. Harris (Eds.), Encyclopedia of operations research and management science (2nd ed.). New York: Kluwer Academic. [Chapter 8]

Hartley, D. S., III. (2001). Military operations other than war. In S. I. Gass & C. M. Harris (Eds.), Encyclopedia of operations research and management science (2nd ed.). New York: Kluwer Academic. [Chapter 4]

Hawkins, G. (1984). Structural variance and other related topics in the SHAPE Armour/Anti-Armour study. In R. K. Huber (Ed.), Systems analysis and modeling in defense—development, trends, and issues. New York: Plenum Press. [Chapter 5]

Hayes, R. E. (1994). Systematic assessment of C2 effectiveness and its determinants. 1994 Symposium on C2 Research and Decision Aids. Monterey, CA: Naval Post Graduate School. [Chapter 6]

Hillier, F. S., & Lieberman, G. J. (1990). Introduction to operations research (5th ed.). New York: McGraw-Hill. [Chapter 4]

Hoeber, F. P. (Ed.). (1981). Military applications of modelling: Selected case studies. Alexandria, VA: Military Operations Research Society. [Chapter 4]

Hofmann, H. W., & Hofman, M. (2000). On the development of command and control modules on battalion down to single item level. In New information processing techniques for military systems (pp. 8.1–8.12). North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Meeting Proceedings 49. [Chapter 6] Available from ftp://ftp.rta.nato.int/PubFulltext/RTO/MP/RTO-MP-049/MP-049-$$TOC.pdf

Holt, J., & Pickburn, G. (2001). OA techniques for the future [Defence Evaluation and Research Agency (DERA) Unclassified Report]. Farnborough, United Kingdom: Defence Science and Technology Laboratory (DSTL). [Chapter 3]

Hughes, W. P., Jr. (Ed.). (1989). Military modelling (2nd ed.). Alexandria, VA: Military Operations Research Society. [Chapter 4]

Human Systems. (1993). Survey of evaluation methods for command and control systems. Ontario, Canada. [Chapter 5]

The impact of C3I on the battlefield Panel 7 on the defence applications of operational research (AC243 panel-7 TR/4). [North Atlantic Treaty Organisation (NATO) unclassified technical report] (1994). Ad Hoc Working Group on the impact of C3I on the battlefield. Brussels, Belgium: NATO Defence Research Group. [Chapter 5] Available from www.nato.int

IS10027: 1990 an information resource dictionary system (IRDS) framework. (1990). [Chapter 9]

Jaiswal, N. K. (1997). Military operations research: Quantitative decisionmaking. New York: Kluwer Academic. [Chapter 4]


Jantsch, E. (1968). La prévision technologique [The technology forecast]. Paris: OCDE. [Chapter 7]

Jaques, E. (1976). A general theory of bureaucracy. Portsmouth, NH: Heinemann Educational Books. [Chapter 6]

Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. New York: Cambridge University Press. [Chapter 10]

Keeney, R. L., & Raiffa, H. (1976). Decisions with multiple objectives: Preferences and value tradeoffs. Chichester, UK: John Wiley and Sons Ltd. [Chapter 5] [Chapter 6]

Keppel, G. (1973). Design and analysis—A researcher’s handbook (ISBN 0-13-200030-X). Englewood Cliffs, NJ: Prentice-Hall. [Chapter 3] [Chapter 10]

Klein, G. A. (1989). Recognition primed decisions. Advances in Man-Machine Systems Research, 5, 47-92. [Chapter 8]

Kroening, D., & Moffat, J. (Eds.). (1999). Workshop on C3I modeling (AC/323SASTP/9). North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Meeting Proceedings 29. [Chapter 8] Available for order from www.rta.nato.int

Krusche, S., & Tolk, A. (2000). Information processing as a key factor for modern federations of combat information systems. Paper presented at the North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Information Systems Technology Panel (IST) Symposium: New information processing techniques for military systems, Istanbul, Turkey. [Chapter 9] Available in PDF format from http://home.t-online.de/home/Andreas.Tolk/literat.htm

Kruskal, J. B., & Wish, M. (n.d.). Multidimensional scaling. In E. M. Uslaner (Ed.), Quantitative Applications in the Social Sciences (Series 07–011, ISBN 0-8039-0940-3). Beverly Hills, CA: SAGE Publications. [Chapter 10]

Lambert, N. J. (1997). “Operation Firm Endeavour: Campaign Data, Its Collection and Use.” Presented at the Directorate of Land Warfare Historical Co-ordination Group Conference, Trenchard Lines, DGD&D, Upavon, UK, 28-29 April 1997. Also available from the British Library Document Supply Centre, Boston Spa, Wetherby, West Yorkshire, LS23 7BQ, United Kingdom, Reference DSC Shelfmark: 99/20363.

Lambert, N. J. (1998). Operational analysis in the field: The utility of campaign monitoring to the implementation force (IFOR) operation in Bosnia and Herzegovina 1995–1996. Proceedings from Cornwallis III, Analysis for Peace Operations. (ISBN 1-896551-22-X). The Lester B Pearson Canadian International Peacekeeping Training Centre, Cornwallis Park, PO Box 100, Clementsport, NS BOS 1EO, Canada.

Lambert, N. J. (1998). Operation Firm Endeavour: Campaign Monitoring; Factional compliance and the Return to Normality in Bosnia and Herzegovina. Operational Analysis Branch, HQ ARRC, Queens Avenue, JHQ Rheindahlen, 41179 Moenchengladbach, Germany. Also available from the British Library Document Supply Centre, Boston Spa, Wetherby, West Yorkshire, LS23 7BQ, United Kingdom, Reference DSC Shelfmark: 99/20361 and 99/20367 (annexes).

Lambert, N. J. (2002). Measuring the Success of the North Atlantic Treaty Organisation (NATO) Operation in Bosnia and Herzegovina 1995–2000. Issue 140/2, pp. 459-481. European Journal of Operations Research, Special 2000 Edition.

Latour, B. (1997). Bruno Latour’s keynote speech: On recalling ANT. In Actor network and after. Centre for Social Theory and Technology (CSTT), Department of Sociology and Social Anthropology, Keele University, United Kingdom. [Chapter 5] Available from http://www.comp.lancs.ac.uk/sociology/stslatour1.html

Latour, B. (1999). On Recalling ANT. In J. Law & J. Hassard (Eds.), Actor network theory and after (pp. 15-25). Oxford, United Kingdom: Blackwell Publishers. [Chapter 5]

Lauren, M. K. (2000). Modelling combat using fractals and the statistics of scaling systems, 5(3), 47-58. Alexandria, VA: Military Operations Research Society. [Chapter 8]


Lauren, M. K. (2001). Beyond Lanchester: A fractal based approach to equations of attrition. Manuscript submitted for publication. [Chapter 8]

Leroy, D., & Hofmman, D. (1996). Criterion of command, control, communications, intelligence (C3I’s) effectiveness. 1996 Command & Control Research & Technology Symposium Proceedings. Washington, DC: National Defense University. [Chapter 5]

Libicki, M. C. (1997). Defending cyberspace and other metaphors. Washington, DC: National Defense University. [Chapter 8]

Long term scientific study SAS-017 on human behaviour representation [Final Report, Chapter 2]. (n.d.). [Chapter 6] Available, with clearance, from: http://www.rta.nato.int/RDP.asp?RDP=RTO-TR-047

Mandeles, M. D., et al. (1996). Managing “command and control” in the Persian Gulf war (ISBN 0-275-952614). Westport, CT: Greenwood Publishing Group. [Chapter 6]

Mathieson, G. L., Tolk, A., Sundfoer, H. O., & Sinclair, M. (2001). Data the Glue to Assessment. Washington, DC: Command and Control Research Program (CCRP). [Chapter 9] [Chapter 10]

McCafferty, M. C., & Lea, S. C. (1997). Measures of effectiveness for operations other than war (CDA/HLS/R9714/1.1). Farnborough, United Kingdom: Center for Defence Analysis. [Chapter 5]


Midgley, G. (1989). Critical systems and the problem of pluralism. Cybernetics and Systems, 20, 219-231. [Chapter 3]

Modeling and analysis of command and control (AC/323(SAS)TP/12). (1999). North Atlantic Treaty Organisation (NATO) Research & Technology Organisation (RTO) Meeting Proceedings 38. [Chapter 8] Available from http://www.rta.nato.int/RDP.asp?RDP=RTO-MP-038

Moffat, J. (1996). The system dynamics of future warfare. European Journal of Operational Research, 90(3), 609-618. [Chapter 8] Available with registration from http://www.palgrave-journals.com/jors/

Moffat, J. (2000). Representing the command and control process in simulation models of conflict. Journal of the Operational Research Society, 51(4), 431-439. [Chapter 6] [Chapter 8] Available with registration at http://www.palgrave-journals.com/jors/

Moffat, J. (2002). Command and control in the information age—Representing its impact. Norwich, United Kingdom: The Stationery Office. [Chapter 6] [Chapter 8] Available from www.tso.co.uk

Moffat, J., & Dodd, L. (2000). Bayesian decisionmaking and catastrophes. Defence Science and Technology Laboratory (DSTL) [Unpublished Report]. [Chapter 3] Available from [email protected]

Moffat, J., & Passman, M. (2002). Metamodels and emergent behaviour in models of conflict. ‘Proceedings of the Simulation Study Group Two Day Workshop, March 2002. The Operational Research Society, UK.’ [Chapter 8] Available on www.orsoc.org.uk

Moffat, J., & Prins, G. (2000). A revolution in military thinking? [Issues from the 1999 Defence Evaluation and Research Agency (DERA) Senior Seminars]. Journal of Defence Science, 5(3). [Chapter 8] Available by request from [email protected]

Moffat, J., & Witty, S. (2000). Changes of phase in the command of a military unit. Defence Science and Technology Laboratory (DSTL) [Unpublished Report]. [Chapter 3] Available from [email protected]

Moffat, J., & Witty, S. (2001). Bayesian decisionmaking and military command and control. Manuscript submitted for publication. [Chapter 8]

Moffat, J., Witty, S., & Dodd, L. (2002). Bayesian decisionmaking and the precise use of force. ‘Journal of Defence Science 7(1)’ [Chapter 8] Available from [email protected]

National Research Council. (1998). Modeling human and organizational behavior—Application to military simulations (ISBN 0-309-06096-6). Washington, DC: National Academy Press. [Chapter 5] [Chapter 6] Available from http://www.nap.edu

Noble D., & Flynn, W. (1993). Recognized primed decisions (RPD) tool concept document. Vienna, VA: Engineering Research Associates. [Chapter 6]


Noble, D., Flynn, W., & Lanham, S. (1993). The TADMUS recognized primed decisions (RPD) tool: Decision support for threat assessment and response selection. Paper presented at the 10th Annual Conference on Command and Control Decision Aids. [Chapter 6]

Norman, D. A. (n.d.). Organization of action slips. Psychological Review, 88, 1-15. [Chapter 6]

North Atlantic Treaty Organisation (NATO) Bi-MNC C2 plan part 2—Command and control requirements. (1998).

North Atlantic Treaty Organisation (NATO) Annex B to MC Guidance for Defence Planning. MC-299/5. (1996)

North Atlantic Treaty Organisation (NATO) Consultation, Command and Control Board (NC3B), Information Systems Sub-Committee (ISSC), NATO Data Administration Group (NDAG). (1999). Feasibility study for the NATO C3 information resource dictionary system, (NC3 Information Resource Dictionary System (IRDS), AC/322SC/5-WG/3WP4). Brussels, Belgium. [Chapter 9] Available from http://www.nato.int/docu/home.htm

North Atlantic Treaty Organisation (NATO) Open Systems Working Group (2000). NATO C3 Technical Architecture. AC/322(SC/5)WP/31, Brussels

North Atlantic Treaty Organisation (NATO) Consultation, Command and Control Board (2000). NATO C3 System Architecture Framework. AC/322-WP/0125, Brussels


North Atlantic Treaty Organisation (NATO) Consultation, Command and Control Board (2000). NATO Interoperability Management Plan (NIMP). AC/259-D/1247(rev), Brussels

North Atlantic Treaty Organisation (NATO) Study Group on Modelling and Simulation (1998). NATO Modelling and Simulation Master Plan, Version 1.0. AC/323(SGMS)D/2, Brussels

Nowak, M., & May, R. (1992). Evolutionary games and spatial chaos. Nature, 359. [Chapter 8]

Olae, N.–G., Roy, J., & Wetter, M. (2000). Performance drivers—A practical guide to using the balanced scorecard (ISBN 0-471-49542-5). Chichester, UK: John Wiley and Sons Ltd. [Chapter 4] [Chapter 5]

Perry, W. L., & Moffat, J. (1997). Developing models of decisionmaking. Journal of the Operational Research Society, 48(5), 457-470. [Chapter 6] [Chapter 8] Available with registration from http://www.palgrave-journals.com/jors/

Perry W. L., & Moffat, J. (1997). Measuring consensus in decisionmaking: An application to maritime command and control. Journal of the Operational Research Society, 48(4), 383-390. [Chapter 6] [Chapter 8] Available with registration from http://www.palgrave-journals.com/jors/

Perry W. L., & Moffat, J. (1997). Measuring the effects of knowledge in military campaigns. Journal of the Operational Research Society, 48(10), 965-972. [Chapter 6] [Chapter 8] Available with registration from http://www.palgrave-journals.com/jors/

Pidd, M. (1996). Tools for thinking—Modelling in management science. Chichester, UK: John Wiley and Sons Ltd. [Chapter 3] [Chapter 8]

Pirlot & Vincke. (1997). Semiorders: Properties, representations, applications (ISBN 0-7923-4617-3). New York: Kluwer Academic. [Chapter 10]

Poel, I. (1998). Changing technologies. A comparative study of eight processes of transformation of technical regimes. Dissertation. Enschede, the Netherlands: Twente University Press. [Chapter 6]

Prins, G. (1998). Strategy, force planning and diplomatic/military operations. London: The Royal Institute of International Affairs. [Chapter 3]

Quade, E. S. (Ed). (2000). Analysis for military decisions. Alexandria, VA: Military Operations Research Society. [Chapter 4]

The report of the data working group. (1998). In Simulation technology-1997 (SIMTECH 97, Vol. III). Alexandria, VA: Military Operations Research Society. [Chapter 9]

Research and Technology Organization (RTO) SAS-003 report on IFOR and SFOR data collection & analysis (6 AC/323(SAS)TP/11). [Technical Report]. (1999). Neuilly-Sur-Seine Cedex, France: North Atlantic Treaty Organisation (NATO) RTO. [Chapter 2] Available from http://www.rta.nato.int/RDP.asp?RDP=RTO-TR-006


Richards, R., Neimeier, H. A., Hamm, W. L. (1997). Analytical modeling in support of C4ISR mission assessment (CMA). Third International Symposium on Command and Control Research & Technology, National Defense University, Fort McNair, Washington DC [Chapter 8]

Rip, A. (1995). Introduction of a new technology: Making use of recent insights from sociology and economics of technology. In Technology analysis & strategic management. Journals Oxford, 7(4), 417-430. [Chapter 6]

Rosenhead, J. (2001). Problem structuring methods. In S. I. Gass & C. M. Harris (Eds.), Encyclopedia of operations research and management science (2nd ed.). New York: Kluwer Academic. [Chapter 4]

Rosenhead, J. & Mingers, J. (Ed.). (2001). Rational analysis for a problematic world revisited: problem structuring methods for complexity, uncertainty and conflict (2nd ed.). Chichester, UK: John Wiley and Sons Ltd. [Chapter 3]

Sarin, R. K., (2001). Multi-attribute utility theory. In S. I. Gass & C. M. Harris (Eds.), Encyclopedia of operations research and management science (Centennial ed.). Boston: Kluwer Academic. [Chapter 5]

Schot, J., & Rip, A. (1996). The past and future of constructive technology assessment. In Technological forecasting and social change, 54, 251-268. New York: Elsevier Science. [Chapter 6]

Schultz, J. V. (n.d). A framework for military decisionmaking under risks [Prospect theory related to military decisionmaking]. [Chapter 6]

Schwartz, P. (1993). La planification stratégique par scénarios [Strategic planning by scenario]. Futuribles, 176. [Chapter 7] Available from: http://www.futuribles.com

Sherrill, E. T., & Barr, D. R. (1996). Exploring relationships between tactical intelligence and battle results. Military Operations Research, 2(3). [Chapter 6] Available to order from http://www.mors.org/publications/mor/vol02.htm

Siegel & Castellan. (1998). Non-parametric statistics for the behavioral sciences (2nd ed., ISBN 0-07-100326-6). New York: McGraw-Hill. [Chapter 10]

Sinclair, M., Tolk, A. (2002). Building up a Common Data Infrastructure. NATO Symposium SAS039 on the “Analysis of the Military Effectiveness of Future C2 Concepts and Systems”. NATO C3 Agency, The Hague, The Netherlands.

The Source for Environmental Representation and Interchange (SEDRIS™) Home Page: http://www.sedris.com

Sovereign, M., Kemple, W., & Metzger, J. (1994). Command and control evaluation workshop (C3IEW). Measures Workshop, United Kingdom. [Chapter 5]


Starr, S. H. (1996). Developing scenarios to support C3I analyses. Proceedings of the Cornwallis Group, Pearson Peacekeeping Center, Clementsport, Nova Scotia, Canada. [Chapter 7]

Starr, S. H. (2001). Lessons recorded from applying the North Atlantic Treaty Organisation (NATO) code of best practice (COBP) for C2 assessment to operations other than war (OOTW). Proceedings of the Command and Control Research & Technology Symposium, USNA, Annapolis, MD, June 19-21, 2001. [Chapter 2] Available from http://www.mitre.org/support/papers/tech_papers_01/starr_nato/starr_nato.pdf

Starr, S. H., Haut, D., and Hughes, W. (1997). “Developing Intellectual Tools to Support C4ISR Analyses for Operations Other Than War,” Proceedings of the Third International Symposium on Command and Control Research & Technology, National Defense University, Fort McNair, Washington, DC, pp. 600-608, June 17–20, 1997.

Sweet, R., Metersky, M., & Sovereign, M. (1985). Command and control evaluation workshop [Military Operations Research Society (MORS) C2 MoCE Workshop]. Monterey, CA: Naval Postgraduate School. [Chapter 5]

Tolk, A. (1995). Zur Reduktion struktureller Varianzen [On the reduction of structural variances]. Doctoral dissertation, University of the Federal Armed Forces of Germany, Pro Universitate Verlag, Sinzheim. [Chapter 5]


Tolk, A. (2000). Functional Categories of Support to Operations in Military Information Systems. Proceedings of the North Atlantic Treaty Organisation (NATO) Regional Conference on Military Communication and Information Systems, Zegrze, Poland. [Chapter 9] Available in PDF format from http://home.t-online.de/home/Andreas.Tolk/literat.htm

Tolk, A. (2001). Bridging the data gap—Recommendations for short, medium, and long term solutions (01S-SIW-011). Proceedings of the Simulation Interoperability Workshop Spring 2001, Orlando, Florida. [Chapter 9] Available in PDF format from http://home.t-online.de/home/Andreas.Tolk/011_0301.pdf

Tolk, A., & Kunde, D. (2000). Decision support systems—Technical prerequisites and military requirements. Proceedings of the 2000 Command and Control Research and Technology Symposium (CCRTS), Monterey, CA. [Chapter 9] Available in PDF format from http://www.dodccrp.org/6thICCRTS/Cd/Tracks/Papers/Track6/038_tr6.pdf

US Department of Defense (DOD), Joint Chiefs of Staff. (1999). Dictionary of military and associated terms. Washington, DC: Greenhill Books. [Chapter 9] Available from http://www.dtic.mil/doctrine/jel/doddict/index.html

US Department of Defense (DOD) model and simulation master plan. (1995). [Chapter 8] http://www.msiac.dmso.mil/ms_needs_documents/MS_Master_Plan.PDF


Van Creveld, M. (1985). Command in war. Cambridge, MA: Harvard University Press. [Chapter 6]

Van Lente, H. (1993). Promising technology; The dynamics of expectations in technological developments. Unpublished doctoral dissertation, Twente University Press, The Netherlands. [Chapter 6] Available from [email protected]

Wagner, H. M. (1975). Principles of management science (2nd ed.). Upper Saddle River, New Jersey: Prentice-Hall. [Chapter 4]

Wallace, G. J., & Moffat, J. (1997). The use of Petri Nets to model satellite communications systems. Journal of Defence Science, 2(1). [Chapter 8] Available from [email protected]

Wilson, J., & Corlett, E. (Eds.). (1990). Evaluation of human work: A practical ergonomics methodology (2nd ed.). London: Taylor & Francis. [Chapter 10]

Windass, S. (2001). Just war and genocide—The Ottawa Papers. Woodstock, United Kingdom: Foundation for International Security. [Chapter 3]

Winer, B. J. (1962). Statistical principles in experimental design. New York: McGraw-Hill. [Chapter 10]

Witty, S., & Moffat, J. (2000). Catastrophe theory and military decisionmaking. Defence Science and Technology Laboratory (DSTL) [Unpublished Report]. [Chapter 3] Available from [email protected]


The following additional documents are under development:

NATO C3 System Baseline Architecture

NATO C3 System Overarching Architecture


ANNEX B

Acronyms

A

ACCES Army Command and Control Evaluation System

ACE Resources Allied Command Europe Resources (part of SHAPE)

AF(N) Regional Command

AHWG AC/243 Panel 7 Ad Hoc Working Group

AI Artificial Intelligence

AMF(L) ACE Mobile Force (Land)

ANT Actor Network Theory

ARRC ACE Rapid Reaction Corps

ATS Norman’s Activation Trigger Scheme

B

C

C2 Command and Control


C3 Command, Control, and Communications or Consultation

C3I Command, Control, Communications or Consultation, and Intelligence

C4 Command, Control, Communications or Consultation, and Computers

C4I Command, Control, Communications or Consultation, Computers, and Intelligence

C4ISR Command, Control, Communications or Consultation, Computers, Intelligence, and Surveillance and Reconnaissance

CIMIC Civil-Military Co-operation

CIS Command Information Systems

CISS Center for Information Systems Security

COBP Code of Best Practice

CPX Command Post Exercise

CTA Constructive Technology Assessment

D

DCEP Data Collection/Engineering Plan

DISA Defence Information Systems Agency

DOTMLP Doctrine, Organisation, Training, Material, Leadership, Personnel

DOTMLPF Doctrine, Organisation, Training, Material, Leadership, Personnel, Facilities/Forces


DP Dimensional Parameters

DSTL Defence Science and Technology Laboratories (UK)

E

EEA Essential Elements of Analysis

F

FS Feasibility Study

FTX Field Training Exercise

G

GRR Generic Risk Register

H

HB(A) UK Historical Branch (Army)

HEAT Headquarters Effectiveness Assessment Tool

HQ Headquarters

I

IO International Organisation

IRDS Information Resource Dictionary Systems


IRTF(L) Immediate Reaction Task Force (Land)

J

JCSC Joint Sub-Regional Command South Centre

JCSE Joint Sub-Regional Command South East

JFCOM Joint Forces Command

K, L

M

MAPEX Map Exercise

MAUT Multi-Attribute Utility Theory

MCES Modular Command and Control Evaluation Structure

M-E-M Model-Exercise-Model

METT-TC Mission, Enemy, Terrain, Troops, Time, and Civil considerations

MND(C) Multinational Division (Centre)

MoCE Measures of C2 Effectiveness

MoE Measures of Effectiveness

MoFE Measures of Force Effectiveness

MoM Measures of Merit


MoP Measures of Performance

MoPE Measures of Policy Effectiveness

MORS Military Operations Research Society

M&S Modelling and Simulation

M-T-M Model-Test-Model

N

NATO North Atlantic Treaty Organization

NC3A NATO C3 (Consultation, Command & Control) Agency

NGO Non Governmental Organisations

NL MOD Netherlands Ministry of Defense

O

OA Operational Analysis

OOTW Operations Other Than War

OR Operations Research

P

PESTLE Political, Economic, Social, Technological, Legal, and Environmental

PfP Partnership for Peace

PRL Policy Requirements Land


PVO Private Volunteer Organisations

Q

R

RIGOUR Repeatability, Independence, Grounding in reality, Objectivity of process, Uncertainty, and Robustness

ROE Rules of Engagement

S

SACLANT OA Supreme Allied Command Atlantic Operational Analysis Cell

SCOT Social Construction of Technology

SFS Strike Force South

SHADE Shared Data Environment

SHAPE Supreme HQ Allied Powers Europe

SMP Study Management Plan

SOW Statements of Work

STA Surveillance Targeting and Acquisition

STAFFEX Staff Exercise

STS Science and Technology Studies

SWOT Strengths, Weaknesses, Opportunities, Threats


T

TA Technology Assessment

TRAC TRADOC Analysis Center

TRADOC US Army Training & Doctrine Command

TTP Tactics, Techniques, and Procedures

U

V

VTC Video Teleconference

VV&A Verification, Validation, and Accreditation

VV&C Verification, Validation, and Certification

W

WBS Work Breakdown Structure

X, Y, Z


ANNEX C

The MORS Code of Ethics

The MORS Code of Ethics and Responsibilities for Practitioners of Military Operations Research

Military OR Professionals must aspire to be:

• Honest, open and trustworthy in all their relationships;

• Reliable and consistent in the conduct of assignments and responsibilities, always doing what is right rather than expedient;

• Objective, constructive and accurate in what they say and write;

• Accountable for what they do and choose not to do;

• Respectful of the work of others, giving due credit and refraining from criticism of them unless warranted; and

• Free from affiliation with others or with activities that would compromise them, their employers, or the Society.


ANNEX D

Human and Organisational Issues Checklist

This checklist presents a summary of a particular set of human and organisational issues that is specifically relevant to behaviour and performance of humans in command and control situations. Its purpose is to sensitise the analyst and help him to assess whether human or organisational issues are part of the problem domain and should be addressed in the solution space. Also, it should assist him in identifying human sciences disciplines for consultation.

It should be pointed out, however, that this checklist is preliminary and must not be considered to be comprehensive. The field of human sciences is too large and the set of human and organisational issues too ill defined to provide the analyst with an exhaustive checklist at this time. Research in this area is ongoing and we expect that later editions of this document will provide the analyst with revised and improved checklists.

1. Human Issues

• Physiological factors


This refers to bio-medical and environmental factors that influence behaviour and performance.

• Stress

• Fatigue and lack of sleep

• Blood sugar

• Fitness conditions

• Weather conditions

• Geographical terrain conditions

• Ergonomic factors (e.g. performance degradation due to working in protective suits)

Ergonomics is a science discipline that studies human work and work environment relationships.

• Behavioural factors (related to functioning in a C2 group)

• Social Competence & Experience

Capability to interact with others

• Communication skills

• Language skills

• Empathy (Social Awareness/understanding)

• Conflict handling style

• Frustration handling style

• Military Competence & Experience

• Handling Danger


• Defining mission objectives

• Task Competence & Experience

• Cognitive factors

These factors are related to how humans perceive their environment, how they give meaning to what they see.

• Information processing style

• Information processing capacity

• Creating Situational Awareness

• Individual decision making

• Risk tolerance

• Pre-disposition!

• Receptivity to new information (open/closed)

• Expertise

• Prior training and knowledge (operational codes)

• Emotional factors

• Morale

• Attitude

• Separation from Family

• Fear

• Stress


• Resilience (ability to overcome negative feedback)

• Leadership factors

There exists no clear-cut concept of leadership effectiveness, but we do know some of the factors that affect the effectiveness of a leader.

• Expectations and behaviour of superiors

• Expectations and behaviour of subordinates

• Expectations and behaviour of colleagues

• Personality and experience

• Organisational Culture and policy – with regard to leadership and command. Allocation of responsibility and authority.

• Ability to Motivate and to Direct others

• Moral and Judicial responsibilities

• Coaching capabilities (towards subordinates)

2. Organisational Issues

• Structure

• Number of echelons or layers

• Span of control for nodes

• Pattern of linkages between nodes (e.g. hierarchical, multi-connected)

• Permanent versus transitory


• Formal versus Informal

• Function

• Distribution of responsibility

• Distribution of authority

• Distribution of Information

• Functional Specificity

• Ambiguity in command relationships

• Capacity

• Differences in communication systems/architectures

• Differences in information processing systems/architectures

• Differences in field training and operational experience

• Differences in personnel

• Experience

• Training

• Cognitive ability

• Roles

• Allocation of Responsibility / Authority

• Role Conflicts. The interference of multiple roles in one individual

• Sociological factors

• Understanding of environment


• Political

• Social

• Cultural

• Interoperability Issues

• National and cultural differences

• Organizational approaches and values

• Communication standards and technology

• Differences in perceptions

• Organisational command style

• Decentralised or Centralised

• Collaborative versus Authoritative

• Formal versus Informal

• Command products: orders, objectives, missions

• Organisational Culture

Organisational culture shapes patterns of organisational behaviour and reflects the thought and activity patterns of members of the organisation.

• Belief systems

• Organisational norms and how they are expressed

• Organisational values and how they are expressed

• Openness to organisational learning


I.e., does the organisational culture embed characteristics that enable and facilitate organisational learning?

3. Interaction between Human and Organisational Issues

• Group decision making

Groups can exert social pressure on members that affects the decision-making capability of individual members.

• Group Dynamics

The behaviour of individuals in a group will be influenced by the fact that they are interacting in a group. This interactive process contains a number of dynamic dimensions and affects the result of the interaction. A command post is a group of people, and its performance is affected by its group dynamics.

• Social interaction & communication

• Communication

• Social Identity and social conflicts

• Individual dominance/leadership

• Cohesion

• Teambuilding, teamwork

• Trust in group members' competence and loyalty.

• Cultural factors


Culture is, more or less, the whole of the beliefs and assumptions, including norms and values, about things and behaviour in a group. As such it guides the behaviour of people, including in the military and in command posts.

• Note! Culture itself is immaterial and intangible. It can only be distilled from observed actions by members of an organisation or societal group.

• Note! Culture is learned through interaction with others.

• Especially in multi-national OOTW, members of different cultural backgrounds have to interact. Mutually acceptable norms of behaviour will have to be developed in the process.

• Socialisation process

• Culture lag

Changes in the environment or to the organisation often require adaptations in behaviour. Sometimes these changes are required or introduced so fast that the organisation has not had time for the whole of its culture to adapt to the new situation, i.e. to embed the changes in its culture. This may lead to friction.

• E.g. 1: The introduction of new technologies may change the way we work. If this is done too fast, or without thoughtful guidance, it may have a disruptive effect on behaviour and thus on organisational effectiveness.

• E.g. 2: Peacekeeping and war operations each require different types of behaviour, often quite different from the type of behaviour in the peaceful environment at home. Western armies operate on personnel rotation schedules whereby individuals change frequently from one situation to another. This may also result in culture lag, causing friction.

• Sub-culture

Within the larger culture, groups may develop specialised cultures, such as distinct unit cultures.

• Sometimes these sub-cultures may conflict with the dominant organisational or societal culture over some issues. Understanding these differences is often the first step in resolving such situations.

• Social Control.

Individual (group, organisational) members watch and correct each other if the accepted procedures of behaviour are not followed.

• If this corrective action pattern is too strict, the group becomes rigid and is less able to adapt to new situations.


• If it is too loose, and no culture or norms are enforced, chaos and uncertainty may rule the group.

• Commander's Intent.

A leader can affect the current (sub)culture of his unit by setting the example and by establishing and enforcing required procedures and behaviour.

• Both must be synchronised, or friction will occur.

• By action and word a leader sets the level and direction of (social) control.

• Ethics

• Cooperability

• Effectiveness of communication

• Willingness to co-operate and collaborate

4. Environmental Factors

These factors affect individual, organisational, and group issues.

• Noise

• Visibility

• Temperature / humidity

• Terrain type

• Infrastructure in the area of operations (e.g. transport, communication, healthcare, agriculture, food distribution, water, and civil administration infrastructures)

• Military support infrastructure

• Social, economic, and political situation in the area of operations

• Rules of Engagement

• Ease of interaction

• Command Post layout

• Communication mechanisms (e.g. voice, teletype, VTC)

• Geographic distribution of the command

Related Human Science Disciplines

This list provides an overview of the relevant scientific disciplines with the expertise to answer questions arising in the context of human and organisational issues in command and control. The list implies a rough taxonomy of disciplines only. Even though boundaries between disciplines are frequently fuzzy, an attempt was made to minimise the overlap.

1 Psychology

1.1 Individual Psychology

1.1.1 Cognitive Psychology

1.1.2 Learning Psychology

1.1.3 Action Theories


1.1.4 Differential Psychology

1.1.5 Psychometrics

1.2 Social Psychology

1.3 Clinical Psychology

2 Educational Sciences and Pedagogics

3 Sociology

3.1 Social Morphology (e.g. Social Stratification, Demography)

3.2 Sociology of Political Power (See Also Political Sciences)

3.3 Mass Communication

3.4 Sociological Methodology (Empirical Social Research)

4 Organisational Sciences

5 Economic Sciences

6 Political Sciences

6.1 Science of Domestic Politics

6.2 Science of International Relations

6.3 Political Disaster Research


7 Ergonomics, Human Factors Research

7.1 Physiology and Anthropometrics

7.2 Technical Ergonomics (Man-Machine Interface, Workspace)

7.3 Cognitive Ergonomics (See Also Individual Psychology)

7.4 Crew Ergonomics (See Also Social Psychology)

7.5 Safety and Health Hazard Prevention

8 International Law of War, International Humanitarian Law

9 Ethics

9.1 Deontology (Theories of Ethical Norms and Values)

9.2 Theories of Ethical Practice and Morality

10 Social and Legal Philosophy

11 Cultural Anthropology

Literature

Ali, M.I. et al. (2000). Enabling Learning: A Progress Report on the Enterprise Social Learning Architecture Task. Paper presented at the CCRT Symposium, Canberra, Australia, 2000.


Capstick, M.D. (1998). Command and Leadership in Other People's Wars. Land Force Western Headquarters, Edmonton, Canada. Published at the NATO RTO Workshop: The Human in Command, Kingston, Canada, 1998.

Jager, H. de & Mok, A.K. Grondbeginselen der Sociologie: Gezichtspunten en begrippen [Principles of sociology: viewpoints and concepts]. H.E. Stenfert Kroese B.V.: Leiden, the Netherlands.

LD-1 (2000). Leidraad Commandovoering. Doctrine Commission Royal Netherlands Army, RNLA Staff, The Hague.

Pigeau, R. & McCann, C. (1998). Re-Defining Command and Control. Defence and Civil Institute of Environmental Medicine, Ontario, Canada. Published at the NATO RTO Workshop: The Human in Command, Kingston, Canada, 1998.

Shattuck, L.G. (1998). Communication of Intent in Military Command and Control Systems. Department of Behavioral Sciences and Leadership, United States Military Academy, West Point, United States. Published at the NATO RTO Workshop: The Human in Command, Kingston, Canada, 1998.

Persson, P., James, M.N. & Eriksson, H. Command and Control: A Biased Combination? National Defence College, Sweden. Published at the NATO RTO Workshop: The Human in Command, Kingston, Canada, 1998.


Vogelaar, A.L.W., Muuse, K.F. & Rovers, J.H. (eds.) (1998). The Commander's Responsibility in Difficult Circumstances. NL ARMS, Netherlands Annual Review of Military Studies. Gianotten b.v.: Tilburg.

Winslow, D. (1998). Misplaced Loyalties: The Role of Military Culture in the Breakdown of Discipline in Peace Operations. Department of Sociology, University of Ottawa. Published at the NATO RTO Workshop: The Human in Command, Kingston, Canada, 1998.

Zsambok, C.E. & Klein, G. (1997). Naturalistic Decision Making. Mahwah, New Jersey: Lawrence Erlbaum Associates, Publishers.


ANNEX E

Alternate Definitions of Risk and Uncertainty

A number of different definitions of risk and uncertainty are in common use, and this can lead to confusion and misunderstanding. This COBP adopts specific working definitions of risk, uncertainty, and sensitivity; however, this annex lists a range of others for information.

Risk

Risk is defined in this COBP as the possibility of suffering harm or loss; in other words, exposure to harm or loss. This includes opportunity loss.

Alternate definitions of risk in common use include:

• not achieving your objective;

• the likelihood of not achieving your objective;

• a threat to successful outcome;


• an assessment of the probability of failure;

• an uncertain future scenario; and

• a perception of consequential pain.

The alternate definitions may be seen as subtle variations of each other and are all equally valid. However, this very subtlety can be a source of confusion and doubt within a study unless a clear working definition is adopted.

Uncertainty

Uncertainty is defined in this COBP as an inability to determine a variable value or system state (or nature) or to predict its future evolution.

Alternate definitions of uncertainty in common use include:

• a lack of clarity in the definition of a system or variable;

• a lack of confidence in an assumption or result; and

• a lack of knowledge about a subject of interest.

All of these definitions have some validity, but this should not be allowed to cloud the thinking during C2 assessment.
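
One way to make this working definition operational in an assessment is to represent a value that cannot be determined exactly as a range or distribution of plausible values and to examine how that spread carries through to the study outputs. The short sketch below is a minimal illustration only; the toy model, the factor name, and all numbers are assumptions invented for this example and are not drawn from the COBP.

    # Minimal sketch: an uncertain input represented as a distribution and
    # propagated through a toy model. The model, names, and numbers are
    # hypothetical placeholders, not taken from the COBP.
    import random

    def time_to_decision(sensor_delay):
        """Toy model: notional decision-cycle time driven by sensor delay."""
        return 5.0 + 2.0 * sensor_delay

    random.seed(1)

    # The sensor delay cannot be determined exactly, so it is represented
    # as a range of plausible values (uniform between 1 and 4 minutes).
    samples = [time_to_decision(random.uniform(1.0, 4.0)) for _ in range(10000)]

    mean = sum(samples) / len(samples)
    low, high = min(samples), max(samples)
    print(f"mean outcome: {mean:.1f} min; plausible range: {low:.1f} to {high:.1f} min")

The point of such an exercise is not the particular numbers but the practice of carrying the spread of plausible outcomes, rather than a single value, through the assessment.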


Sensitivity

Sensitivity, as used in this COBP, describes a predisposition to respond strongly to a stimulus or variations in an input factor.

Alternate definitions for sensitivity in common use include:

• an indication of the importance or criticality of a feature or variable of an analysis; and

• a measure of the political profile of the subject ofstudy.

Sensitivity, as it relates to the production of robust analysis, must be clearly defined in terms of response to stimulus rather than importance or criticality. In C2 systems, many insensitive factors are critical and many sensitive ones are relatively unimportant.
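
The distinction between sensitivity and criticality can be made concrete with a simple one-at-a-time calculation: vary each input factor by a small relative amount, observe the relative change in the output, and compare. The sketch below is a minimal illustration under invented assumptions; the toy model, factor names, and values are placeholders and do not come from the COBP.

    # Minimal one-at-a-time sensitivity sketch. The model and factors are
    # hypothetical placeholders chosen purely for illustration.

    def model(factors):
        """Toy model: notional decision-cycle time as a function of input factors."""
        return factors["message_rate"] / factors["bandwidth"] + 10.0 / factors["staff_size"]

    baseline = {"bandwidth": 50.0, "staff_size": 5.0, "message_rate": 200.0}
    base_output = model(baseline)

    perturbation = 0.10  # vary each factor by +10 percent, one at a time
    for name, value in baseline.items():
        varied = dict(baseline)
        varied[name] = value * (1.0 + perturbation)
        rel_change = (model(varied) - base_output) / base_output
        # Normalised sensitivity: relative output change per relative input change.
        print(f"{name}: normalised sensitivity = {rel_change / perturbation:+.2f}")

A factor that shows a small normalised sensitivity in such a calculation may still be critical to the outcome of an operation, which is exactly the distinction drawn above.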


ANNEX F

SAS-026 History

This revised and expanded version of the COBP for C2 assessment was developed by SAS-026, building upon the initial (1998) version of the COBP produced by SAS-002. This edition of the COBP is a synthesis of decades of expertise from various countries and hundreds of analyses. It was developed using a set of case studies and incorporates feedback from users of the initial version. Lastly, SAS-039 provided a peer review of the final draft product. Members of SAS-026 are listed below.

Dave Alberts (chair), United States

Tim Bailey, United States

Paul Choinard, Canada

Cornelius d’Huy, The Netherlands

Uwe Dompke, NC3A

Dean Hartley, United States

Richard Hayes, United States

Reiner Huber, Germany

Don Kroening, United States

Stef Kurstjens, The Netherlands


Nick Lambert, NC3A

Georges Lascar, France

Christian Manac’h, France

Graham Mathieson, United Kingdom

Jim Moffat, United Kingdom

Orhun Molyer, Turkey

Valdur Pille, Canada

David Signori, United States

Mark Sinclair, United States

Mink Spaans, The Netherlands

Stuart Starr, United States

Swen Stoop, The Netherlands

Hans Olav Sundfor, Norway

Klaus Titze, Germany

Andreas Tolk, United States

Corinne Wallshein, United States

Gary Wheatley, United States

John Wilder, United States