AD-A116 096    DECISION SCIENCE CONSORTIUM INC    FALLS CHURCH VA    F/G 13/10
A PERSONALIZED AND PRESCRIPTIVE DECISION AID (U)
JUL 82    M S COHEN, R C BROMAGE, J O CHINNIS    N00014-82-C-0138
UNCLASSIFIED    TR 82-4


DECISION SCIENCE CONSORTIUM, INC.

    A PERSONALIZED AND PRESCRIPTIVE DECISION AID

    TECHNICAL REPORT 82-4

By

Marvin S. Cohen, Robert C. Bromage, James O. Chinnis, Jr., John W. Payne, and Jacob W. Ulvila

    Prepared for

    Office of Naval Research

    Contract N00014-82-C-0138

July 1982

Approved for public release; distribution unlimited.

    Decision Science Consortium, Inc.

7700 Leesburg Pike, Suite 421
Falls Church, Virginia 22043

    (703) 790-0510


Unclassified
SECURITY CLASSIFICATION OF THIS PAGE (When Data Entered)

REPORT DOCUMENTATION PAGE

1. REPORT NUMBER: 82-4
2. GOVT ACCESSION NO.: AD-A116 096
3. RECIPIENT'S CATALOG NUMBER:
4. TITLE (and Subtitle): A Personalized and Prescriptive Decision Aid
5. TYPE OF REPORT & PERIOD COVERED: Final, 1 January - 30 June 1982
6. PERFORMING ORG. REPORT NUMBER: Technical Report 82-4
7. AUTHOR(s): Marvin S. Cohen, Robert C. Bromage, James O. Chinnis, Jr., John W. Payne, Jacob W. Ulvila
8. CONTRACT OR GRANT NUMBER(s): N00014-82-C-0138
9. PERFORMING ORGANIZATION NAME AND ADDRESS: Decision Science Consortium, Inc., 7700 Leesburg Pike, Suite 421, Falls Church, Virginia 22043
10. PROGRAM ELEMENT, PROJECT, TASK AREA & WORK UNIT NUMBERS: 62757N; RF57-701; RF57701801; NR 461-001
11. CONTROLLING OFFICE NAME AND ADDRESS: Office of Naval Research, 800 North Quincy Street, Arlington, Virginia 22217
12. REPORT DATE: July 1982
13. NUMBER OF PAGES: 87
14. MONITORING AGENCY NAME & ADDRESS (if different from Controlling Office): Same
15. SECURITY CLASS. (of this report): Unclassified
15a. DECLASSIFICATION/DOWNGRADING SCHEDULE:
16. DISTRIBUTION STATEMENT (of this Report): For public release; distribution unlimited.
17. DISTRIBUTION STATEMENT (of the abstract entered in Block 20, if different from Report):
18. SUPPLEMENTARY NOTES:
19. KEY WORDS: Decision aids; Decision analysis; Personalized systems; Cognitive styles; Man-machine interaction; Command and control; Heuristics and biases; Attack planning; Cognitive psychology; Anti-submarine warfare
20. ABSTRACT:

This report describes the development of a computer-based display and analysis system which caters to the personal decision-making styles of users while hedging them about with safeguards against potential errors or biases. The general conceptual design brings together descriptive research in cognitive psychology on individual strategies in judgment and choice, and prescriptive theories which constrain optimal solutions while accommodating differences in judgment and ways of structuring the problem. A demonstration prototype aid, incorporating an advanced user interface, has been designed and partially implemented in a specific testbed and has successfully undergone a preliminary test with representative potential users.

The prototype aid design consists of a data base, a flexible general-purpose Planning Module, and four relatively specialized routines for customizing the aid. Portions of the data base, Planning Module, and Selection Module have been implemented in a demonstration attack planning system for the command staff of an attack submarine. The current system provides a relatively effortless user interface, consisting of a high resolution color display, a joystick, and a single function key.

The design accommodates individual differences in beliefs, values, and preferred level of problem structure, in search organization and decision rule, and in features of "cognitive style" (number of options focused on, number of dimensions utilized, and time span of planning).

In addition, it offers substantial prescriptive support: Data and inferences are clearly separated, and sources of uncertainty indicated. Outputs of prescriptive models at any level may be directly displayed, at the user's request, based either on default values or on the user's own judgments. Finally, prescriptive prompts advise the user when information he has not observed seems to warrant attention.

A study of individual data-gathering and decision-making styles among submarine officers suggests that pronounced differences do exist to which a personalized aid might cater. A preliminary demonstration of the functioning prototype suggests that it will cater effectively and acceptably to those differences.

DD FORM 1473 (EDITION OF 1 NOV 65 IS OBSOLETE)    S/N 0102-LF-014-6601    Unclassified


  • * 9

    A PERSONALIZED AND PRESCRIPTIVE DECISION AID

    TECHNICAL REPORT 82-4

    By

    Marvin S. Cohen, Robert C. Bromage, James 0. Chinnis, Jr.John W. Payne, and Jacob W. Ulvila

    Prepared for

    Office of Naval Research

    Contract N00014-82-C-0138

    July 1982

    Decision Science Consortium, Inc.7700 Leesburg Pike, Suite 421Falls Church, Virginia 22043

    (703) 790-0510

    _9 _4

TABLE OF CONTENTS

Acknowledgments
Summary

1.0 INTRODUCTION
     1.1 Computers as Aids to Thought
     1.2 Overview
2.0 DECISION MAKERS, DECISION PROCESSES, AND PERSONALIZED AIDS
     2.1 Individual Prescriptive Decision Models
          2.1.1 Implications for personalized aids
     2.2 Individual Strategies in Inference and Choice
          2.2.1 Choice
          2.2.2 Inference
          2.2.3 Cognitive style
          2.2.4 Implications for a personalized aid
               2.2.4.1 Personalization and efficient flexibility
               2.2.4.2 Prescriptive prompting
3.0 A PROTOTYPE AID: GENERAL CONCEPT
     3.1 Data Base
     3.2 Planning Module
     3.3 Customizing Modules
     3.4 Summary
4.0 SUBMARINE APPROACH AND ATTACK
     4.1 The Decision Setting
     4.2 The Decision Makers
     4.3 Testbed Characteristics
5.0 A PERSONALIZED AND PRESCRIPTIVE ATTACK PLANNING AID
     5.1 Implementation
     5.2 Data Base
     5.3 Planning Module
          5.3.1 The planning module in use
          5.3.2 Stored alternatives
          5.3.3 Personalized information search
     5.4 Selection Module
     5.5 Adjust Module (Conceptual Design)
     5.6 Alert Module (Conceptual Design)
     5.7 Advisory Module (Conceptual Design)
     5.8 Summary
6.0 INFORMAL DEMONSTRATION OF THE PROTOTYPE AID
7.0 CONCLUSIONS
8.0 REFERENCES

APPENDIX A: SYSTEM CONFIGURATION
APPENDIX B: PRESCRIPTIVE MODEL FOR ATTACK PLANNING
APPENDIX C: DECISION PROCESSES OF SUBMARINE COMMANDERS
APPENDIX D: ILLUSTRATIVE AID ENHANCEMENTS

Acknowledgments

The work reported here has been sponsored by the Engineering Psychology Group of the Office of Naval Research (ONR), under the Defense Small Business Advanced Technology (DESAT) Program. We are grateful to John O'Hare for his constructive blending of personalized attention and prescriptive advice. Some of the ideas exploited here received initial development under a different project, also sponsored by the Engineering Psychology Group of ONR.

Lynn Merchant-Geuder managed the production of the report. Others who have contributed in various ways to the work include Christopher Herot, David A. Seaver, Rex V. Brown, and William G. Stillwell.

Appreciation is due to LCDR Richard Pariseau (US Navy-Ret.), CDR Fred W. Carter (US Navy-Ret.), CAPT Jack D. Clay (US Navy), and CAPT John N. Asher (US Navy) for agreeing to serve as "representative users" and critics of the aid.

SUMMARY

This report describes the development of a computer-based display and analysis system which caters to the personal decision-making styles of users while hedging them about with safeguards against potential errors or biases. The general conceptual design brings together descriptive research in cognitive psychology on individual strategies in judgment and choice, and prescriptive theories which constrain optimal solutions while accommodating differences in judgment and ways of structuring the problem. A demonstration prototype aid, incorporating an advanced user interface, has been designed and partially implemented in a specific testbed and has successfully undergone a preliminary test with representative potential users.

The prototype aid design consists of a data base, a flexible general-purpose Planning Module, and four relatively specialized routines for customizing the aid. Portions of the data base, Planning Module, and Selection Module have been implemented in a demonstration attack planning system for the command staff of an attack submarine. The current system provides a relatively effortless user interface, consisting of a high resolution color display, a joystick, and a single function key.

The data base consists of basic inputs (in the submarine testbed these concern own ship, contacts, and the environment) together with a set of prescriptive models which aggregate those inputs into higher level inferences and forecasts.

The Planning Module serves, first of all, as an extremely versatile tool for interacting directly with the data base and for evaluating alternative actions. It enables the user to sample information from any preferred level of aggregation in the data base, organize his search for information efficiently in terms of either action alternatives or evaluative criteria, and vary the number of alternatives examined, the amount of information requested concerning each, and the time into the future which planning encompasses. Moreover, this module clearly distinguishes conclusions from evidence, and indicates the sources from which each inference is derived.

The Selection Module (which has also been implemented) allows the user to personally select the portion of a large data base which will be immediately accessible through the Planning Module. The user can then efficiently search the data base in the prechosen order. At the same time, because of its streamlined interface, the Selection Module permits him to alter his prioritization and retrieve any other information quickly and easily.

The Adjust Module will enable the user to insert subjective judgments in place of default values at any level in the data base. The Planning Module will then display the implications of the hypothetical or revised values for any higher level inference. (Default values, however, would continue to be stored and displayed.) The Adjust Module thus accommodates individual differences in beliefs and preferences and - from a prescriptive point of view - adds a potentially valuable source of information (the user) to the data base.

The Alert Module facilitates individual heuristic strategies which evaluate actions by reference to cutoffs (as opposed to tradeoffs). The user can set a cutoff or threshold for any variable in the data base (at any level of aggregation). The Planning Module will forecast whether or not the threshold is expected to be crossed for any given action alternative, and if so, when. It also alerts a user, whose attention may be otherwise engaged, when cutoffs are crossed.

Advisory prompts, in the Planning Module, monitor information not requested by the user, and advise him when that information appears to have prescriptive implications different from the information he has observed. This function will enable the user to concentrate his own attention selectively, in areas he regards as critical, while notifying him when other issues seem worthy of attention. In the Advisory Module, the user sets his own "tolerance" for disconfirming information, thus determining the nature and frequency of advisory prompts.

A study of individual data-gathering and decision-making styles among submarine officers suggests that pronounced differences do exist to which a personalized aid might cater: e.g., in level of aggregation of preferred information, number of evaluative dimensions used, employment of cutoff criteria, attitudes toward risk, and patterns of information search. Finally, a preliminary demonstration of the functioning prototype suggests that it will cater effectively and acceptably to those differences.

1.0 INTRODUCTION

1.1 Computers as Aids to Thought

By interacting with machines, humans increase their capacity not only to sense the environment and act upon it, but also to think. Guidelines for the human factors engineering of the man-machine interface have traditionally focused on sensing and acting: i.e., display features and control configurations that conform to human capabilities and predilections. As computers take on expanded operational roles, however, attention has begun to turn to machine-assisted thought, and to the manner in which computer-implemented storage, retrieval, and manipulation of data can be optimally interfaced with human cognitive structures and processes.

This problem tends to arise wherever information systems are introduced into non-routine "management level" tasks: e.g., in military command and control and in governmental or business planning. Computers have often produced controversy; in some cases there has been severe organizational and institutional resistance (e.g., Miller, 1980; Sinaiko, 1977; Beard, 1977; Swanson, 1974). Among the causes of such resistance, the chief one, perhaps, is not display and keyboard design, but the mismatch between an intended user's problem-solving or decision-making style and that rigidly imposed on him by the computer.

Recent developments in computer science and in cognitive psychology, if taken together, provide a basis for the design of a personalized information processing interface. Within computer science there have been impressive technical advances in hardware and architecture, display technology, programming languages, data base structures, and query techniques (e.g., Nilsson, 1980; Allen, 1978; Bolt, 1979). These make possible, in principle, a high degree of customization of man-computer interactions to the individual user's needs. In psychology there is a long-standing research tradition on individual differences in abilities thought to comprise intelligence. Recently, however, cognitive psychologists have turned to the investigation of how people differ in their actual performance of cognitive tasks (e.g., Newell and Simon, 1972; Kahneman, Slovic, and Tversky, 1982; Payne, 1982). From such performance, hypotheses are formulated about differences in the internal structures people use to encode knowledge and in the strategies they adopt for obtaining and processing it.

Unfortunately, little progress has been made as yet in fusing these two trends. One reason, perhaps, is that in the effort to personalize the interface for machine-assisted thought, two objectives can easily conflict: On the one hand, we want to cater to the preferences of the user, whatever they may be; on the other hand, we want to help him avoid the errors or biases to which his approach might give rise. What are needed are aids which are personalized and (simultaneously) prescriptive; they must, somehow, bring to bear findings and methods not only in human factors engineering, computer science, and cognitive psychology, but also the prescriptive theory of decision making.

A healthy sense of cooperation between person and prescriptive model may be fostered by recognition of their complementary limitations. Decision makers are, in many contexts, confronted by an increasingly unmanageable volume of information - generated, in part, by the success of the computer in taking over more routine data management chores (and also due, in military settings, to the increased range and variety of sensors, weapons, and communication channels). Decision-making strategies tend to be even more idiosyncratic, and less normative, under information overload (e.g., Payne, 1976; Streufert and Streufert, 1981b); but a secondary effect is that decision makers in many fields have become more interested in prescriptive aids which help them integrate large quantities of information.

At the same time, there is a growing sense of the limitations of modeling (e.g., Watson, Brown, and Lindley, 1977; Keeney, 1982). Since "objective" methods can seldom address more than part of a complex problem, prescriptive aids are appropriately regarded not as "know-it-alls" but as fallible "advisors" (Patterson et al., 1981). Eliminating the human from the process (besides incurring resistance) often represents a serious underutilization of available cognitive resources (cf., Cohen, 1982; Ramsey and Atwood, 1979).

1.2 Overview

The present report describes the development of a prototype aid which is both personalized and prescriptive. It attempts to balance two objectives: (1) to allow maximal user flexibility in selecting information for display and in customizing the aid interface according to his own needs; (2) to help the user organize and evaluate the implications of displayed information for the problem at hand. An essential feature of the aid is that the user decides precisely how much, and what kind of, prescriptive advice he will get.

The prototype aid has been designed and partially implemented for a specific operational context: approach and attack planning by the Command staff of a nuclear attack submarine. However, only the data base of the aid is affected by the nature of the specific application. Its functional logic, and the methods used to achieve both personalization and prescriptive impact, are quite general. The implementation of a demonstration prototype system in a specific context, however, permits a realistic test of the feasibility of the concepts, with potential users.

Our strategy in aid design has been neither wholly "top-down" nor wholly "bottom-up." User preferences have played a critical role both at the start and throughout. The perceived needs and performance patterns of submarine Command staff members were gathered initially through informal discussions with active or recently retired personnel, through observation of training exercises and videotapes, and through a formal questionnaire (Appendix C). After the implementation of an initial functioning prototype, interaction with the aid was observed, feedback elicited, and - as a result - specific design changes made.

At the same time, we have adopted the hypothesis that there are important current findings in cognitive psychology (about how people solve problems) and in prescriptive theory (about constraints defining optimal methods) that ought to be taken into account in the design process. If these applications bear fruit, the potential is for an aid that is finely tuned to human patterns of information processing and capable of gently pointing the way around likely dangers and pitfalls.

Section 2.0 sketches the technical background in prescriptive theory and psychology which we have drawn upon in designing the aid. The functional logic of the aid is presented in general terms in Section 3.0. Section 4.0 introduces the operational setting, submarine approach and attack, which has served as the testbed for prototype development; and the prototype itself is described in Section 5.0. Section 6.0 describes a preliminary demonstration of the aid with former submarine commanding officers. Conclusions regarding the feasibility of the approach and possible directions for further development are presented in Section 7.0.

Earlier work on the attack planning problem may be found in Cohen and Brown (1981) and Cohen (1982).

2.0 DECISION MAKERS, DECISION PROCESSES, AND PERSONALIZED AIDS

How are the users of decision aids likely to differ in their approaches to decision making and problem solving? What are the consequences of such differences for success in task performance? And how should aids be personalized so as to enhance both user acceptability and quality of performance?

We consider, briefly, two general ways in which decision makers have been thought to differ from one another:

- in the parameters and structure of a prescriptive model based on their personal beliefs and preferences; and

- in the heuristic strategies, decision processes, and cognitive styles which they adopt in problem-solving.

The interplay of findings from these areas helps define the potentialities and limitations of personalized decision aiding.

2.1 Individual Prescriptive Decision Models

Ironically, a driving force in the evolution of prescriptive theories of decision making has been the need to accommodate individual differences. An objective rule for betting in games of chance, maximization of expected value, applies only where probabilities of outcomes can be mathematically defined (as in rolling dice) and where the desirability of outcomes is physically measurable (e.g., by money). Generalizations of this basic rule to situations where those conditions do not hold have led to the modern technique of decision analysis (cf., Edwards, 1954, 1961; Raiffa, 1968; Brown, Kahr, and Peterson, 1974). Von Neumann and Morgenstern (1947) formalized the notion of a subjective dimension of value, i.e., utility, and extended it to individual preferences among probabilistic states of affairs. De Finetti (1937/1964) and Savage (1954) developed formal systems for the quantification of an individual's "degree of belief," or subjective probability, about uncertain propositions, and developed axiomatic justifications for the merging of utilities and subjective probabilities into a new prescriptive rule, maximization of subjectively expected utility (see Appendix B). More recently, rigorous techniques have been developed for combining subjective preferences with respect to individual components of value into a single multiattribute utility measure (e.g., Keeney and Raiffa, 1976).
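For reference, the two rules just cited can be written compactly. These are the standard textbook forms with generic notation, supplied here for the reader's convenience rather than quoted from this report:

\[
\mathrm{SEU}(a) = \sum_{s} p(s)\, u(a, s), \qquad u(x_1, \ldots, x_n) = \sum_{i=1}^{n} w_i\, u_i(x_i),
\]

where p(s) is the decision maker's subjective probability of state s, u(a, s) is the utility of the outcome of action a under state s, and, in the additive multiattribute form, w_i and u_i are the weight and single-attribute utility function for the i-th component of value (the additive form presupposes the independence conditions discussed by Keeney and Raiffa, 1976). The prescriptive rule is to choose the action with the highest SEU.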

The prescriptive force of decision analysis, in this form, is not to dictate to an individual in any absolute sense what he "ought" to do or believe. Rather, it indicates what choices and beliefs are logically consistent with other preferences and beliefs which he chooses to accept (cf., French, 1979).

2.1.1 Implications for personalized aids. These elements of personalization are by no means shared by all prescriptive approaches. Techniques in operations research (e.g., cost/benefit analysis) commonly purport to be "objective" and "value free" (Watson, 1981). The approach to decision analysis adopted above, however, has two important implications for personalized aids:

(1) Decision-analytic aids do not address only the part of a problem that can be objectively measured. Actual decisions nearly always involve a number of "soft factors" (e.g., uncertainty about the intentions of a business competitor or of a military foe; the relative importance of different objectives, like money and prestige). The decision maker's own experience may be the only source of relevant information in these matters, while an exclusively "factual" approach could be fatally incomplete. Aids which combine subjective and objective inputs must accommodate individual differences among users in assessments of uncertain states of affairs, attitudes toward risk, and tradeoffs among competing objectives.

(2) The second point is equally important, though far less widely recognized. Just as it does not prescribe inputs, decision theory constrains, but does not dictate, problem structure. Typically, there is more than one way to express the probability of a hypothesis in terms of probabilities for other propositions; and there are multiple decompositions of the utility of an option into preferences for separate attributes. A good structure for a particular decision maker breaks the problem down into components about which that decision maker has either objective data or personal experience. Individuals might benefit differently from different analyses of the same problem.

In particular, it has been suggested that experts differ from novices in their capability to individually recognize a very large number of different problem situations (De Groot, 1965; Chase and Simon, 1973). Klein (1980) argues that experts tend to reason holistically, by analogy with previous similar experiences, rather than by explicit analysis and computation. Klein warns that imposition of analytical models may actually impair expert performance. In terms of decision theory, however, this distinction between experts and novices is accommodated by the notion of personalized problem structures. The expert might produce quite creditable holistic judgments of problem components which he has "seen before" but which a less experienced individual would need to analyze into more familiar elements. (Nonetheless, experts too are subject to error - particularly when a problem which appears familiar has novel aspects; cf., Sage, 1981. Experts may benefit from analysis of such novel components.) The implication is that if decision aids are to exploit the capabilities of each potential user, a variety of models, with different functions and at different levels of aggregation, should be made available (cf., Strub and Levit, 1974).

2.2 Individual Strategies in Inference and Choice

Prescriptive decision theory does not provide a description of actual performance, either in probabilistic reasoning or in the evaluation of actions (cf., Einhorn and Hogarth, 1981). Recent research in cognitive psychology has shed light on the internal processes and structures which people employ in such tasks, and how they differ.

2.2.1 Choice. One line of research has explored the strategies people use in choosing among actions. Prescriptive theory requires that a single score for each option (its expected utility) be derived, which integrates all the available information about that option: i.e., its score on each of a set of attributes, or the probabilities and utilities of its possible outcomes. Several descriptive models of choice behavior have been proposed, however, which involve more partial samplings of the available data (e.g., Payne, 1973; Svenson, 1979).

In Tversky's (1972) Elimination-by-Aspects (EBA), for example, the decision maker sequentially considers attributes characterizing decision options; the order of consideration is probabilistically related to their relative importance to the decision maker. Once an attribute is selected, a threshold is established, and all options are eliminated that do not score at or above the threshold on that attribute. A new attribute is then selected from those remaining, and the process is continued until one option is left. In another decision strategy, called "satisficing" (Simon, 1957; Svenson, 1979), the decision maker adopts a conjunctive criterion involving cutoffs on one or more dimensions. He then compares successive options to the criterion until he finds one that is acceptable, whereupon he stops.

In each of these examples, an option stands a chance of being eliminated for falling below a cutoff, even though it scores high on other attributes. EBA and satisficing do not allow consideration of tradeoffs across different dimensions of value.
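To make the two heuristics concrete, a minimal sketch follows. It is our illustration, not the report's: the options, attributes, importance weights, and cutoffs are invented, and real statements of EBA also specify how thresholds are set.

    import random

    # Options scored on three attributes; all names and numbers are invented.
    OPTIONS = {
        "A": {"speed": 7, "cost": 4, "risk": 8},
        "B": {"speed": 9, "cost": 2, "risk": 6},
        "C": {"speed": 5, "cost": 9, "risk": 7},
    }

    def elimination_by_aspects(options, importance, cutoffs):
        # Tversky's (1972) EBA, simplified: select attributes with
        # probability proportional to importance, eliminate options
        # scoring below the cutoff, stop when one option remains.
        remaining = dict(options)
        attrs = list(importance)
        while len(remaining) > 1 and attrs:
            attr = random.choices(attrs, [importance[a] for a in attrs])[0]
            attrs.remove(attr)
            survivors = {name: sc for name, sc in remaining.items()
                         if sc[attr] >= cutoffs[attr]}
            if survivors:          # never eliminate every option
                remaining = survivors
        return list(remaining)

    def satisfice(options, cutoffs):
        # Simon's (1957) satisficing: accept the first option meeting a
        # conjunctive criterion (a cutoff on every dimension).
        for name, scores in options.items():
            if all(scores[a] >= c for a, c in cutoffs.items()):
                return name
        return None

    cutoffs = {"speed": 6, "cost": 3, "risk": 6}
    print(elimination_by_aspects(OPTIONS, {"speed": 5, "cost": 3, "risk": 2}, cutoffs))
    print(satisfice(OPTIONS, cutoffs))   # "A": the first option meeting all cutoffs

Neither routine ever weighs a weakness on one attribute against a strength on another: an option that fails a cutoff is simply discarded, which is precisely the absence of tradeoffs just noted.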

Different decision rules have different implications for the order in which people elect to receive information (Payne, 1973, 1976). Some strategies imply a search organized by options, others a search organized by attributes. For example, the prescriptive model and satisficing suggest that each option will be evaluated before information concerning the next option is requested. EBA, on the other hand, implies that areas of evaluative concern are fully dealt with in turn.
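In code, the two search organizations differ only in loop order. The sketch below (ours; names invented) lists the order in which cells of an option-by-attribute table would be requested under each organization.

    options = ["A", "B", "C"]
    attributes = ["speed", "cost", "risk"]

    # Search organized by options: exhaust one option before the next,
    # as the prescriptive model and satisficing suggest.
    by_option = [(o, a) for o in options for a in attributes]

    # Search organized by attributes: settle one evaluative concern
    # across all options before moving on, as EBA suggests.
    by_attribute = [(o, a) for a in attributes for o in options]

    print(by_option[:4])     # ('A','speed'), ('A','cost'), ('A','risk'), ('B','speed')
    print(by_attribute[:4])  # ('A','speed'), ('B','speed'), ('C','speed'), ('A','cost')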

Individual decision makers vary in the decision strategies which are reflected in their information-seeking behavior and in their verbal protocols (Payne, 1976; Russo and Dosher, 1981). But little work has been done to discover whether these individual differences are consistent across time and tasks (Svenson, 1979); instead, emphasis has been on the role of task variables. For example, when there are a large number of choice options, decision makers tend to select routines like EBA which quickly eliminate some options by more approximate methods. They may then switch over to routines which integrate all the available information about the remaining options (Payne, 1976; Wright and Barbour, 1977).

A small but important body of research concerns the likely impact of non-optimal heuristic strategies on the quality of the judgment or decision. In many situations, the cost of using approximate methods may not be too great, while savings in time and effort may be substantial (Thorngate, 1980; Russo and Dosher, 1981). However, work by von Winterfeldt and Edwards (1973, 1975) suggests that large losses can in principle occur when available information is not used.

2.2.2 Inference. A different body of research (e.g., Kahneman, Slovic, and Tversky, 1982) has explored distortions to which human concepts of uncertainty may be subject. For example, the probability of an event may be judged by the ease with which "scenarios" containing the event can be mentally called to mind, leading to idiosyncratic biases based on variations in individual experience (Tversky and Kahneman, 1974; cf., Gettys, Kelly, and Peterson, 1973). More fundamentally, individuals vary in their tendency to think about uncertainty in a probabilistic way, rather than in a black-and-white fashion (Wright and Phillips, 1976).

A number of studies show that people consistently overestimate their degree of certainty, even in areas where they are (rightfully) regarded as experts. Experts, on the other hand, appear to be less overconfident than novices (Oskamp, 1965; Lichtenstein, Fischhoff, and Phillips, 1982). People often do not sufficiently consider the imperfect credibility of an information source - i.e., they proceed as if information reported to them were known to be true rather than merely reported (Schum, DuCharme, and DePitts, 1973). In addition, people may ignore evidence that contradicts a previous datum, or double count redundant evidence (Schum and Martin, 1981). Patterns of information search often focus on data that cannot fail to confirm a favored hypothesis, rather than on data which might test it (Wason, 1960, 1968; Einhorn, 1980).
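The source-credibility point has a standard Bayesian formulation, given here in our notation rather than the report's. Let H be a hypothesis, D the datum reported, and R the event that the source reports D. The correct posterior conditions on R, not on D:

\[
P(H \mid R) \propto P(R \mid H)\, P(H), \qquad
P(R \mid H) = P(R \mid D)\, P(D \mid H) + P(R \mid \neg D)\, P(\neg D \mid H),
\]

assuming the report depends on H only through D. Treating the report as if it were the datum itself amounts to the substitution P(R | H) ≈ P(D | H), which overweights reports from imperfectly credible sources.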

2.2.3 Cognitive style. Cognitive style has been regarded as a relatively invariant, abstract feature of a decision maker's approach to information across a variety of tasks (cf., Sage, 1981; Libby and Lewis, 1977). Perhaps the most common differentiation made in this literature is represented by a related cluster of distinctions between "analytic" and "heuristic" (Huysman, 1970; Mock, Estrin, and Vasarhelyi, 1972), "abstract" and "concrete" (Schroder, Driver, and Streufert, 1967; Sage, 1981), "systematic" and "intuitive" (Bariff and Lusk, 1977; McKenney and Keen, 1974), and "scientific" and "managerial" decision makers. The common thread is a distinction between preference for formal, explicit analysis, breaking a problem down into elements, and an approach based on global intuition, trial and error, or "common sense."

Unfortunately, there is little evidence establishing a relationship between these categories (based on self-descriptions) and actual information-seeking behavior (Zmud, 1978; Keen, undated). It has been found that systematics generally take more time and do better in decision problems than heuristics (e.g., Mock et al., 1972). Other results, however, have been inconsistent: showing that systematics prefer more information or less information, and prefer either aggregated or raw data, as compared to heuristics (cf., Libby and Lewis, 1977; Zmud, 1978). McKenney (quoted in Mock et al., 1972) states that the propensity to be analytical increases with task familiarity; Klein (1980) and Sage (1981) suggest that experts will be more intuitive.

A second problem in this literature is the failure to validate the claim that cognitive styles are task invariant. Studies which have attempted to do so have produced disappointing results (cf., Libby and Lewis, 1977), and recent reviews (Libby and Lewis, 1977; Sage, 1981) have shifted emphasis toward the influence of task features on decision styles adopted by the same individual at different times.

In a few cases, "cognitive styles" have been defined in relation to actual cognitive behavior. Thus, Driver and Mock (1976) defined four styles by reference to two fairly specific processing dimensions: amount of information used and degree of focus. The latter refers to a tendency to consider only one solution, model, or option versus a tendency to entertain multiple possibilities. Streufert and Streufert (1981a) present criteria for "integrative" decision-making styles in terms of the number of, and length of time between, information requests and decisions based upon them. Streufert and Streufert (1981b) report that integrative decision making decreases with decision urgency, but is an inverted-U-shaped function of the amount of information available.

2.2.4 Implications for a personalized aid. Descriptive work on human inference and decision processes has implications for both the personal and prescriptive aspects of decision aiding.

2.2.4.1 Personalization and efficient flexibility. "Flexibility" in and of itself is not a sufficient objective in system design. It is possible to make each of a vast number of logically possible information acquisition strategies equally easy, by allowing the user to indicate what he wants item-by-item. But such a system does not really facilitate the selection of strategies as such; to deal explicitly with all possible search orders would be beyond the time and capabilities of both user and device. The objective of personalization is to delimit the subset of strategies which an individual is most likely to prefer. Decision aids may then be tuned to facilitate explicit selection from this smaller group of strategies, while still affording the general "flexibility" of an arbitrary item-by-item search sequence. Such aids are efficiently flexible in their responsiveness to likely user needs.

The most natural way to acquire and process information can vary as a function of the individual and the task. Several such forms of variation seem to occur frequently enough in performance to justify an aid design which facilitates their employment:

- search organized by options or by attributes,

- decision rules based on cutoffs or tradeoffs,

- level of aggregation of information.

In addition, it seems desirable that an aid facilitate differences involving:

- focus on one or many options,

- desired amount of information, and

- time into the future over which planning takes place.

There is little evidence that particular individuals are consistent across tasks in these preferences, and some indication that they are not. In the case of gross categories like "intuitive" and "analytic," moreover, there is no reliable mapping of traits onto system design features and certainly no indication of how different traits interact (cf., Huber, 1982).

2.2.4.2 Prescriptive prompting. An important factor in designing a personalized and prescriptive aid is the impact of individual preferences on performance quality. Simplifying for illustrative purposes, alternative strategies (A and B) for performing the same task may fall into one of two classes in this respect:

- Strategy A is generally expected to be more accurate or yield higher payoffs than strategy B, but requires more training, more time, and/or draws away more attention from other tasks.

An example in the area of choice might be evaluating each option by reference to all the relevant dimensions (A) versus eliminating some options by reference to only a few (B). In the area of inference, it is often difficult to assess the overall credibility of a conclusion based on several steps of reasoning (A); one can simplify by ignoring the uncertainty at early stages (B). In these cases, differences among people in preference between A and B might reflect differences in their underlying ability to perform A, in their training or knowledge, in their handling of workload, in degree of motivation, or in their evaluation of the cost of errors.

- For some people, strategy A is expected to be more effective (better in accuracy, payoffs, speed, effort, etc.) than strategy B, while for other people, strategy B is more effective than A.

An example might be search organized by options versus search organized by attributes. Payne (1982) speculates that these types of search reflect individual differences in the way knowledge is internally represented. Similarly, people who differ in their degree of experience or areas of expertise may prefer different ways of structuring a problem.

These distinctions have implications for the appropriateness of prescriptive advice in a personalized decision aid. In the second case discussed above, the user usually does best with the strategy which he prefers; accordingly, an interactive system should simply facilitate selection by the user of the information processing rule or structure to be employed.

In the first case the computer's role may, at the request of the user, be somewhat more active. This case involves a conflict between the user-preferred and the normative strategy - though the use of the former may be well justified by savings in time and effort. In such cases, the computer can assist by applying a prescriptive model to the problem, in parallel with the user's own effort, which it monitors. The aid may then advise the user when discrepancies seem significant. The prescriptive model applied by the aid, of course, has no automatic claim to truth; it takes the role, rather, of a "cooperative adversary" or "devil's advocate." It enables the user to concentrate his own attention selectively, in areas that he regards as critical, while notifying him when other issues seem worthy of attention. The degree of discrepancy which justifies a prompt might be set by the user himself, reflecting his own informal assessment of the value of his time and effort relative to the cost of errors.

Advisory prompts of this sort might signal a user who is employing cutoff criteria that tradeoffs bear looking into. More generally, the user would be advised when information which he has not examined may have implications that clash with information that he has examined.
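The following sketch renders this monitoring loop in code. It is our reading of the mechanism described here, under invented names and an invented discrepancy measure; the report does not specify an algorithm at this level of detail.

    def advisory_prompts(seen_eval, full_eval, tolerance):
        # Compare the prescriptive model's evaluation of each option using
        # only the information the user has examined (seen_eval) with its
        # evaluation using all available information (full_eval); prompt
        # when the discrepancy exceeds the user-set tolerance.
        prompts = []
        for option, full_value in full_eval.items():
            discrepancy = abs(full_value - seen_eval.get(option, full_value))
            if discrepancy > tolerance:
                prompts.append("Unexamined information shifts the evaluation "
                               "of %s by %.2f; it may warrant attention."
                               % (option, discrepancy))
        return prompts

    seen_eval = {"close to firing point": 0.62, "open the range": 0.55}
    full_eval = {"close to firing point": 0.41, "open the range": 0.57}
    print(advisory_prompts(seen_eval, full_eval, tolerance=0.10))

Raising the tolerance makes the aid quieter; setting it to zero turns every unexamined discrepancy into a prompt. This is exactly the user-controlled tradeoff between attention cost and error cost described above.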

A variety of prompts or other aid features might facilitate probabilistic reasoning:

- inferences should be clearly distinguished from data;

- all sources of uncertainty in a chain of reasoning should be clearly specified;

- when a decision depends on the probability of an event, the aid might prompt the user with possible scenarios in which the event could occur.

3.0 A PROTOTYPE AID: GENERAL CONCEPT

We have identified a set of individual differences in cognitive performance which it would be desirable to facilitate in a personalized aid, and a corresponding set of biases or errors which prescriptive features of the aid might help prevent. In the present section, we describe the conceptual requirements, or functional design, of a decision aid which incorporates these features. The description (outlined in Figure 3-1) will be on a quite general level, intended to represent a framework within which actual aids, addressing specific sets of users and problem contexts, might be developed. The only constraint on such problems is that they involve choices among options characterized by multiple uncertainties and/or competing objectives, and that there be sufficient time and opportunity to consult a decision aid. One such specific aid has been designed and partially implemented and is described in Section 5.0.

3.1 Data Base

A major function of the aid is to facilitate the management of a large, and possibly changing, data base. Such a data base will contain information at a number of levels of aggregation. At the most basic level are the inputs to the aid. These may be either variables (i.e., periodically updated results of new observations) or facts (i.e., prestored or relatively unchanging information). Inputs, although they are "raw data" with respect to the aid, may represent a considerable degree of prior analysis: e.g., inputs to an aid at one level in an organization may be the outputs of analysis at a lower level. (In general, however, such inputs will not be taken at face value; rather, the aid assists the user in assessing their credibility.)

Figure 3-1. Structure of Prototype Aid. [Figure: the Planning Module (input action alternatives; display inputs, forecasts, evaluation; specify evidence) interacts with the Data Base (inputs, forecasts, action alternatives). Four customizing modules feed the Planning Module: Selection (prioritize information), Adjust (revise default values), Alert (set thresholds on selected factors), and Advisory (set tolerance for disconfirming information).]

A second element in the data base is a prescriptive model, or set of prescriptive models, which enables it to aggregate inputs into higher level inferences and forecasts: e.g., reports of unit status may be integrated into an assessment of overall organizational capability. If the user wishes to evaluate action alternatives, the data base stores the alternatives he chooses to consider and uses them to derive action-contingent forecasts.

Finally, the data base model integrates all the information about each action alternative into a single evaluative measure, based on probabilistic forecasts and assessments of the values of possible outcomes.
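A schematic data structure may help fix ideas: basic inputs at the leaves, prescriptive models aggregating upward, and a single evaluative measure at the top. The sketch below is a generic rendering under our own assumptions; the report's actual attack-planning models are given in Appendix B.

    from dataclasses import dataclass, field
    from typing import Callable, List, Optional

    @dataclass
    class Node:
        # One data base element: a basic input (no parents) or an
        # inference/forecast computed from lower-level nodes by a
        # prescriptive model.  Structure is illustrative only.
        name: str
        value: float = 0.0
        parents: List["Node"] = field(default_factory=list)
        model: Optional[Callable[[List[float]], float]] = None

        def evaluate(self) -> float:
            if self.model is not None:
                self.value = self.model([p.evaluate() for p in self.parents])
            return self.value

    # Basic inputs: variables updated by observation, or prestored facts.
    target_range = Node("target range (kyd)", 12.0)
    solution_quality = Node("solution quality", 0.6)

    # Higher-level forecast derived by a toy prescriptive model.
    p_kill = Node("P(first shot kill)",
                  parents=[target_range, solution_quality],
                  model=lambda v: max(0.0, min(1.0, v[1] * (20.0 - v[0]) / 20.0)))

    print(p_kill.evaluate())   # 0.24 with the inputs above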

3.2 Planning Module

The heart of the aid is the Planning Module. It serves, first of all, as an extremely flexible and powerful tool for interacting directly with the data base. In particular, at the user's request, it:

- displays basic inputs;

- displays inferences and forecasts derived from input data and prescriptive models;

- specifies the evidence upon which any inference or forecast is based;

- accepts hypothetical action alternatives from the user and displays forecasts contingent on specific options;

- prioritizes actions either with respect to specific subgoals, or with respect to an overall figure of merit.

The design of the Planning Module has been tailored to accommodate individual differences among users both in the information they select for review and in the way they select it:

- The degree of analysis and problem structuring is up to the user: displayed information can be at any level of aggregation (from raw data to forecasts of critical events to action evaluations).

- The user can efficiently organize his information search around actions (e.g., requesting forecasts of a number of events contingent on a given option) or around evaluative dimensions (e.g., exploring how a number of different actions affect the forecast of a given event).

- The user can focus on one (or a few) options or scan a large number superficially.

- The user can explore the evidence for a particular inference or forecast in depth, or he can review a large number of relatively independent inferences and forecasts.

- Action alternatives considered by the user may extend longer or shorter periods of time into the future.

3.3 Customizing Modules

In addition to this direct, flexible use of the Planning Module, the user can, if he wishes, specifically customize his interaction with the data base. He can personally modify the Planning Module in four different ways (Figure 3-1):

- By means of the Selection Module, the user can prioritize data base elements. Then, in the Planning mode, he can search the data base quickly in the prechosen order.

- In the Adjust mode, the user can insert subjective judgments in place of default values at any level in the data base. The Planning Module will then display the implications of the hypothetical or revised values for any higher level inference. (Default values, however, continue to be stored and displayed.)

- In the Alert Module, the user can set a cutoff or threshold for any variable in the data base (at any level of aggregation). The Planning Module will forecast whether or not the threshold is expected to be crossed for any given action alternative, and if so, when (see the sketch following this list). At the time when the threshold is in fact crossed, the user is alerted.

- Advisory prompts, in the Planning mode, monitor information not requested by the user, and advise him when that information appears to have implications different from the information he has observed. In the Advisory Module, the user can set his own "tolerance" for disconfirming information, thus determining the nature and frequency of advisory prompts.
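As promised above, here is a minimal sketch of the Alert Module's threshold logic. It assumes a forecast expressed as (time, value) pairs for one data base variable under a given action alternative; the interface and all names are our assumptions, not the report's specification.

    def threshold_crossing(forecast, threshold, rising=True):
        # Given a forecast as (time, value) pairs for a chosen variable
        # under one action alternative, report whether the user-set
        # threshold is expected to be crossed, and if so, when.
        for t, value in forecast:
            if (value >= threshold) if rising else (value <= threshold):
                return True, t
        return False, None

    # Toy forecast: probability of counterdetection over the next 25 minutes.
    p_counterdetection = [(5, 0.02), (10, 0.05), (15, 0.09), (20, 0.16), (25, 0.27)]
    print(threshold_crossing(p_counterdetection, threshold=0.15))
    # -> (True, 20): the cutoff is expected to be crossed 20 minutes out.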

3.4 Summary

The basic conceptual design outlined above accommodates individual differences in beliefs, values, and preferred level of problem structure (Section 2.1), in search organization and decision rule (Section 2.2.1), and in features of "cognitive style" (number of options focused on, number of dimensions utilized, time span of planning; Section 2.2.3).

In addition, it offers substantial prescriptive support: Data and inferences are clearly separated, and sources of uncertainty indicated (Section 2.2.2). Outputs of prescriptive models at any level may be directly displayed, at the user's request, based either on default values or on the user's own judgments. Finally, prescriptive prompts advise the user when information he has not observed seems to warrant attention.

A more detailed design for each portion of the prototype aid, as described above, will be presented in the particular context of submarine approach and attack. Parts of the Planning Module, Selection Module, and Data Base have been implemented in a demonstration prototype system.

4.0 SUBMARINE APPROACH AND ATTACK

4.1 The Decision Setting

A specific context selected for development of a prototype aid is attack submarine-based antisubmarine warfare (ASW). The dilemmas faced in approaching and attacking a hostile submarine are, however, characteristic of many situations involving stealth in warfare: How long should I attempt to remain undetected and to improve my position, before I tip my hand by launching a weapon?

Some of the tradeoffs involved are outlined in Figure 4-1. On the one hand, if own ship fires later and closer to the target, the attack, if it occurs, is more likely to succeed: first, because the accuracy of target location and motion estimates will be improved, hence also the chance of a hit; second, because the target will have less time to respond (either by evading the weapon or counterattacking) when it hears the weapon in the water. On the other hand, the longer own ship waits, the greater the chance it will be counterdetected before attack, losing the advantage of surprise; or that the opportunity to attack will simply slip away, either by loss of contact or arrival of the target in a less favorable region. In all these considerations, the Commander must balance the objective of preserving own ship against the objective of destroying the target.
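The tradeoff can be summarized as a schematic expected-utility comparison; the expression below is our own illustration, not the prescriptive model of Appendix B:

\[
\mathrm{EU}(\text{fire at } t) = P_{\mathrm{hit}}(t)\, V_{\mathrm{kill}} - P_{\mathrm{cd}}(t)\, C_{\mathrm{cd}} - P_{\mathrm{loss}}(t)\, C_{\mathrm{loss}},
\]

where P_hit(t), the chance of a hit, and P_cd(t), the chance of counterdetection before firing, both grow as firing is delayed; P_loss(t) is the chance that the opportunity slips away; and V_kill, C_cd, and C_loss weight the competing objectives of destroying the target and preserving own ship. Waiting pays only so long as the marginal gain in P_hit outweighs the marginal growth of the risk terms.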

Figure 4-1. Passive Approach and Attack: Time of Fire Decision Tradeoffs Among Range-Related Opportunities and Risks. [Figure: own ship approaching a target. Fire late: better chance of a hit, less time for the target to react to the weapon. Fire early: less risk of counterdetection or loss of opportunity.]

In fact, time of fire is not the only decision in approach and attack which involves uncertainty about outcomes and competing objectives: the CO must decide, for example, whether to attack at all, and if so, against what targets, with what weapons, and after what maneuvers. In all these choices, like time of fire, tradeoffs occur: in target selection, where hostile platforms may vary in importance and in threat to own ship; in weapon selection, where cost or supply may be correlated with effectiveness; and in selecting among maneuvers that may optimize target localization only by exposing own ship to counterdetection. In all these choices, the CO has a limited amount of time and a very large quantity of data: pertaining to the threat or threats (target classification, location, motion, weapon and sensor capabilities, and intentions), own ship (weapon and sensor capabilities, targeting priorities), and the environment (thermal conditions, ocean depth, geography).

Perhaps because the information load is potentially so great, simple "decision rules" are often adopted (at least verbally): e.g., for time-of-fire, avoid counterdetection; or fire as soon as within maximum weapon range and in possession of a range solution. Unfortunately, these rules imply natural cutoff points that may not exist: the probability of counterdetection can increase gradually during an approach, as can the quality of a solution. More importantly, they exclude, at least verbally, a balancing of tradeoffs - such as accepting a small risk of detection in order to accomplish a mission objective.

4.2 The Decision Makers

How do Commanding Officers utilize information in planning an attack? Are there important differences in their data-gathering and decision-making styles?

Evidence which we have gathered suggests that such differences are quite pronounced. Responses of four former attack submarine officers to a hypothetical approach and attack scenario are summarized in Appendix C. Dramatic differences were found in:

- level of aggregation - e.g., whether the major objective was "killing the target" or moving him into weapon range, preserving own ship or avoiding counterdetection;

- number of evaluative dimensions - e.g., whether killing the target and preserving own ship were considered simultaneously in specific decisions;

- use of cutoffs - e.g., whether actions were triggered by thresholds based on weapon range or counterdetection range;

- attitude toward risk - e.g., whether one or both of two targets was engaged;

- amount of information - i.e., the number of items consulted from those currently available on board submarines; and

- pattern of information search - i.e., the amount of evidence (number of different sources) sought for each item, and whether search among items was organized by source or by item.

    4.3 Testbed Characteristics

    Approach and attack planning is only part of the inforw ation

    management problem in the combat center of a submarine. Several

    characteristics, however, made it appealing as a context for pro-

    totype aid development:

o its high stakes and high expected frequency of occur-
  rence in wartime,

o the complexity of the decisions, involving uncertainty
  and competing objectives,

o the fact that the time available for decision making is
  limited, but sufficient for utilization of an aid, and

o the evident appropriateness of an information support
  system which caters to the personal styles of its users.


5.0 A PERSONALIZED AND PRESCRIPTIVE ATTACK PLANNING AID

    The feasibility of implementing the general concept of a person-

    alized and prescriptive aid has been subjected to test in the

    context of submarine approach and attack. The resulting proto-

    type aid is designed to facilitate both personalized, flexible

    manipulation of a data base and timely, valid decisions. Ad-

    vanced features of interface design minimize the labor that must

    be devoted to making the system work and free the user's atten-

    tion for the problem at hand.

    Portions of the prototype aid design are at varying stages of

    implementation. Elements of the data base, Planning Module, and

    Selection Module have been programmed, and will be described

    here in the context of the more complete conceptual design.

    5.1 Implementation

    The purpose of the current physical implementation of the attack

    planning aid is to illustrate, at low cost, the essential fea-

    tures of the user-system interface and to provide a prototype

    suitable for demonstration and testing. It is by no means in-

    tended to represent the final form the aid would take in a mili-

    tary setting.

    The most important feature of the aid interface is its simpli-

    city for the user. From his point of view, the current system

    consists of a high-resolution color display, a joystick, and

a function key. (A more detailed description of the system is in

    Appendix A.) As in the Spatial Data Management System (Herot et

    al., 1978; Bolt, 1979), the user can traverse the space of avail-

    able display options, selecting items for detailed examination

    and generating his own inputs, with surprisingly little effort.

    The joystick-plus-key control - unlike most keyboards - frees

    the user's visual attention for whatever appears on the display.


An equally critical feature of the finished system will be its

    fit within the submarine combat center. The aid is designed

    primarily for the Commander or a high level aide (e.g., the

    Approach Officer), and should be fully consistent with the com-

    mand role. For example, although a large-screen command-

    dedicated display has been tentatively proposed for new genera-

    tion submarine combat centers (e.g., NUSC, 1978), a problem with

    this idea, as usually conceived, is that it would discourage the

    Commander from moving about the combat center (cf., Cohen et

    al., 1982). The present system design, with a light-weight

    hand-held joystick-plus-key, would give the Commander constant

    visual access and control, with no sacrifice of freedom and flex-

    ibility.

    5.2 Data Base

    The data base contains information at all levels of aggregation:

    from ultimate combat objectives (i.e., killing the target, sur-

    viving), to intermediate goals (i.e., critical combat events

    which influence the achievement of ultimate objectives, like

    counterdetection and first shot kill), to basic inputs. Basic

    inputs include (in principle) most of the data currently avail-

    able for display on board submarines: e.g., information about

    target range, ocean thermal conditions, threat weapons and sen-

    sors, etc.

    Inferences and forecasts regarding combat critical events are

    derived by applying a set of prescriptive models to basic in-

    puts. If the Commander proposes a tactical alternative for eval-

uation (i.e., a target, weapon, and set of approach maneuvers),

forecasts of critical events may be computed conditional on that

    tactic.

    Evaluation in terms of ultimate objectives is accomplished by

    further prescriptive aggregation. At this level, probabilities


for the outcomes of ultimate concern (killing the target and

    surviving) are computed and combined with assessed values of

    target and own ship; an overall figure of merit (expected util-

    ity) is then derived for the tactic under consideration.
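As a reading aid only, the aggregation just described might be
sketched as follows. The additive form, names, and numbers are
illustrative assumptions; the report's actual prescriptive models
are specified in Appendix B.

    # Illustrative sketch only; see Appendix B for the actual models.

    def figure_of_merit(p_kill, p_survive, v_target, v_ownship=100.0):
        """Expected utility of a tactic, on a scale that fixes the
        value of own ship survival at 100 (cf. Section 5.7)."""
        return p_kill * v_target + p_survive * v_ownship

    # E.g., a tactic forecast to give P(kill) = .55 and
    # P(survival) = .90 against a target valued at 50:
    fom = figure_of_merit(0.55, 0.90, 50.0)   # 117.5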

    The currently implemented data base is by no means complete.

    Basic inputs have not been programmed for display - because this

    information is already available on submarines. Our focus, for

    demonstration purposes, has been on higher level aggregations

    and the prescriptive modeling required to derive them (as de-

    scribed in Appendix B). Nonetheless, basic inputs have been

    fabricated for a single hypothetical target and weapon. Algor-

    ithms have been programmed for the derivation and display of the

    following probabilistic forecasts of critical events, for the

    hypothetical target and weapon:

o first shot kill

o counterdetection

o successful target counterattack

Prescriptive algorithms have been programmed to derive the fol-

lowing displays of ultimate objectives:

o probability of kill

o probability of survival

o figure of merit

    5.3 Planning Module

The Planning Module divides a display screen into specialized

    areas, as shown in Figure 5-1. The box labeled "Data Base Dis-

    play Options" orients the user in two important ways:


[Figure 5-1. Planning Module display areas.]

  • " It outlines the portion of the data base which is imme-diately available for display; and

    " it tells the user what his current display selection is.

    If the current display involves an inference or a forecast, the

    "Validation Options" box outlines in detail the portion of the

    data base which provides the evidence from which it is derived.

    Information selected for display appears in the "Planning Dis-

    play B" (PD-B) box. Planning Display A (PD-A) provides a comple-

    mentary overview: e.g., in attack planning it shows the loca-

    tions of contacts and own ship.

    Action alternatives can either be selected from a given list or

    generated by the user for evaluation. In the latter case, they

    are listed explicitly in the "Action Alternatives" box; or if

    there is a natural pictorial representation, they may be de-

    picted in either or both of the Planning Displays. Thus, target

    alternatives appear in PD-A and approach paths are drawn by the

    user in PD-B.

    From the Planning Module the user has direct, easy access to a

    number of more specialized routines, as indicated in the "System

    Module Options" box. These routines enable the user to custom-

    ize the operation of the Planning Module for his own purposes:

he can make selections for the Data Base Display Options box,

    adjust default values in the data base, set cutoffs or alerts,

    and establish thresholds for prescriptive prompts.

    5.3.1 The planning module in use. In Figure 5-2 we have the

entire Planning Module display as the Commander sees it before

    making inputs. Planning Display A gives the big picture: in

    this hypothetical scenario, it shows two targets ("1" and "2")


[Figure 5-2. Planning Module Display Prior to User Inputs.]

and the location of own ship as a green cross. (Subsequent ver-

    sions will offer the option of uncertainty ellipses representing

    target location, and fans indicating uncertainty about target

    course and speed. The user will also be able to vary the scale

    of PD-A and PD-B. In the current implementation, all distances

    are arbitrary for security reasons. Thus, scale markings have

    been omitted.)

    The Commander may select information for display from any level

    of aggregation in the data base. For example, he might call up

    a forecast of the probability of own ship's killing the target,

    contingent on some tactic; or he might prefer to examine some of

    the lower level probabilities which influence the overall proba-

    bility of kill, e.g., the probability of counterdetection, or

    the probability of a first shot kill. Finally, the Commander

    might use the aid to analyze the location and motion (LOC/MOT)

    of the target.

    In Figure 5-3 the Commander has used the joystick-plus-key to

    indicate (1) the tactical option (consisting of approach maneu-

    vers, weapon selection, and target priority) that he wishes to

    evaluate, and (2) that his current concern is the probability of

    a first shot kill.

    These inputs need take no longer than a few seconds. The joy-

    stick controls the motion of a cursor on the screen in a natural

    spatial manner. As the cursor moves along the list of data base

    display options or validation options, the indicated item

    blinks. Pressing the button means that the blinking item is

    selected, and the selected item is then highlighted. (Selec-

    tions can be canceled by pressing the button a second time or by

    entering another selection.) Weapon options (e.g., "48" or

    "37") are chosen in precisely the same way. Target selections

    are made from the spatial display in PD-A, by moving the cursor


[Figure 5-3. Planning Module display following user inputs.]

to the location of the desired target and, again, pressing the

button. (For quickness, the cursor skips over intervening

spaces among discrete items.) The Commander in Figure 5-3 is

considering the alternative of firing at target 1 with a Mark 48

torpedo.
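The select/cancel behavior of the joystick-plus-key can be stated
precisely; the sketch below (in Python) is illustrative only,
with invented names, and is not the aid's actual code.

    # Illustrative sketch of blink/select/cancel; names invented.

    def button_press(state, blinking_item):
        """A press selects the blinking item; a second press on the
        same item cancels it; selecting another item replaces it."""
        if state.get("selected") == blinking_item:
            state["selected"] = None            # cancel
        else:
            state["selected"] = blinking_item   # select and highlight
        return state

    state = {"selected": None}
    button_press(state, "48")   # Mark 48 selected
    button_press(state, "37")   # new selection replaces the old
    button_press(state, "37")   # second press cancels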

    The Commander also uses the joystick-plus-key to draw an

    approach path in PD-B. The region around own ship is here shown

    on a larger scale than in PD-A. Legs are created one by one by

    moving the cursor to the desired point and pressing the button;

    that point is then connected by a line to the previously indi-

    cated point (or, on the first leg, to the own ship symbol).

    (Later portions of any path can be erased by moving the cursor

    to an already used point and pressing the button; if the cursor

    is placed on the own ship symbol, the entire path will be

erased.) Individual users can vary the preferred time span

    over which planning takes place, by varying the lengths of the

    approach maneuvers submitted for evaluation.
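The leg-by-leg editing rules can be summarized in a short sketch
(illustrative only; the data structures and names are assump-
tions, not the aid's implementation):

    # Illustrative sketch of path editing in PD-B; not the aid's code.

    OWN_SHIP = (0, 0)            # the own ship symbol anchors the path

    def press(path, point):
        """Draw a leg, erase the later portion, or clear the path."""
        if point == OWN_SHIP:
            path.clear()                       # press on own ship: erase all
        elif point in path:
            del path[path.index(point) + 1:]   # erase later portion
        else:
            path.append(point)                 # connect a new leg
        return path

    path = []
    for p in [(1, 2), (3, 3), (5, 2)]:
        press(path, p)                         # three legs drawn
    press(path, (3, 3))                        # erases the leg to (5, 2)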

    As the user draws the approach path in PD-B, PD-A replicates it

    on a smaller scale and shows, simultaneously, how the locations

    of the two targets are expected to change. (This computation at

present assumes a constant own ship speed.)

    After the Commander indicates a tactical alternative and a data

    base option, he moves the cursor back to "home" (the yellow

    square in the central margin) and presses the button. PD-B then

    displays the requested information: i.e., the predicted proba-

    bilities, at different points on the approach path, of a first

    shot kill if own ship fires at that point. In the current

    purely hypothetical example, the probability of first shot kill

is assessed at the present as .45, but rises to .80 at the last

    point on the indicated path. In the meantime, the validation

    box indicates that these probabilities were derived from informa-


tion about the user-selected own ship weapon, analysis of the

    target's location and motion, target classification, target sen-

    sors, and target maneuvering capabilities.

5.3.2 Stored alternatives. Letters A, B, and C in PD-B refer

    to a set of previously examined tactical alternatives (i.e.,

    different selections of target, weapon, or approach path) which

    the Commander has stored. The numbers above each letter summar-

    ize the currently relevant information for each alternative:

    i.e., the maximum probability, across times of fire, of first

    shot kill for each alternative. Since the currently displayed

    alternative is C, the Commander may conclude that - in terms of

    probability of kill at least - options A and B appear to be sig-

    nificantly inferior.

    When the cursor is moved out of the PD-B box, any unerased (and

    not previously stored) path is stored and a letter denoting that

path appears at the bottom of the box. Paths can be recovered

    by moving the cursor to the appropriate letter, which then

    blinks, and pressing the button; the selected letter is then

    highlighted. (A path can be erased from storage by pressing a

    second time - but a change of mind is possible since the erased

    path will remain drawn on the screen.)
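Storage and recall of alternatives might be sketched as follows
(illustrative only; the letters A, B, C follow the text, and
everything else is assumed):

    # Illustrative sketch of path storage and recall.

    import string

    stored = {}                          # letter -> path

    def store_on_exit(path):
        """When the cursor leaves PD-B, store any unerased, not
        previously stored path under the next letter."""
        if path and list(path) not in stored.values():
            letter = string.ascii_uppercase[len(stored)]
            stored[letter] = list(path)
            return letter
        return None

    def recall(letter):
        """Recover a stored path by selecting its letter."""
        return stored.get(letter)

    store_on_exit([(1, 2), (3, 3)])      # -> 'A'
    store_on_exit([(2, 1)])              # -> 'B'
    recall('A')                          # -> [(1, 2), (3, 3)]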

    5.3.3 Personalized information search. The Commander may now

    feel he has as much information as he needs. Alternatively, he

    can pursue the Planning Module dialogue in any of several direc-

    tions:

o He can evaluate the credibility of the current display
  by calling up options from the validation box;

o He can examine additional tactical alternatives by indi-
  cating a new choice of target, weapon, or approach path;

o He can evaluate the current tactical alternative or pre-
  viously examined ones in the light of other combat con-
  cerns.

An important feature of the aid helps the user to explore the

    data base in the way that most suits his decision-making style.

    He can efficiently organize his search either by alternatives or

    by evaluative "dimensions" (i.e., forecasts of critical events

    or ultimate goals). In the former case, after generating or

    retrieving a tactical alternative, and without selecting any-

    thing from the Data Display box, the user moves the cursor to

    home and presses the function key. The aid automatically dis-

    plays in PD-B the item listed first in the Display Options box;

    a subsequent button depression gives the next item in the

    column, and so on. The user can thus quickly size up a given

    alternative on all pertinent dimensions of evaluation.

Similarly, he can quickly scan a set of options with respect to

one dimension. To do so, he selects the dimension from the

    Display Options box without selecting a tactical alternative.

    Button pushes at home then cause the display to step through the

    sequence of stored actions (A, B, C, etc.).
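The two search organizations amount to stepping through a fore-
cast table by row or by column; a minimal sketch follows (the
forecast values and names are invented for illustration):

    # Illustrative sketch of search by alternative vs. by dimension.

    forecasts = {        # (alternative, dimension) -> forecast value
        ("A", "P(first shot kill)"): .60,
        ("A", "P(counterdetection)"): .25,
        ("C", "P(first shot kill)"): .80,
        ("C", "P(counterdetection)"): .20,
    }

    def scan_alternative(alt, dims):
        """One alternative, stepped through every dimension in
        Display Options order (holistic search)."""
        return [(d, forecasts[alt, d]) for d in dims]

    def scan_dimension(dim, alts):
        """One dimension, stepped through the stored alternatives
        A, B, C, ... (dimensional search)."""
        return [(a, forecasts[a, dim]) for a in alts]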

    The user-interfaces for the Planning Module and the Selection

    Module have been completely programmed; their functioning is

    circumscribed only by the gaps noted above in the data base im-

    plementation.

    5.4 Selection Module

The Selection Module allows the user to personally select the

portion of a large data base which will be immediately access-

    ible through the Planning Module. The data base items likely to

    be needed by a particular decision maker will depend on the oper-

    ational situation he anticipates. The Selection Module expe-

    dites his access to that information; but at the same time, be-

    cause of its streamlined interface, it permits him to alter his

    prioritization and retrieve any other information quickly and

    easily.


The Selection Module is entered directly from the Planning Mod-

    ule. The initial display, shown in Figure 5-4, outlines the

    entire "display-option space" of the data base. That space is

    organized hierarchically, according to the three levels men-

tioned in Section 5.2: tactical evaluation with respect to over-

    all objectives; forecasts and inferences of combat critical

    events (which may pertain to either the approach or attack

    phase); and basic inputs regarding current status (broken down

    into information regarding contacts, own ship, and environment).

    (The currently implemented display shown in the figure is a

    quite incomplete version of the final attack planning data

    base.)

    To make selections from this display, the user again moves the

    cursor in a natural spatial manner to the pertinent item. A

    press of the button causes that item to vanish from its standard

    spot and reappear in the Choice list on the right. In Figure

    5-5, a number of items have been selected in the order shown.

    The user can insert items anywhere in the list: If he moves the

    cursor to an item in the Choice list and presses the button, a

    blank appears above that item; if he now selects a new item from

    the data base listings, the new item appears in the blank. (A

    Choice list item can be deleted - i.e., moved back to the

    original list - by selecting it with the cursor and pressing

    twice.)
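The Choice-list mechanics reduce to a few list operations; the
sketch below is illustrative, with invented item names:

    # Illustrative sketch of Choice-list editing in the Selection
    # Module; item names invented.

    available = ["P(kill)", "P(survival)", "Figure of merit",
                 "P(counterdetection)"]
    choice = []

    def select(item):
        """One press moves an item to the Choice list."""
        available.remove(item)
        choice.append(item)

    def insert_before(existing, new_item):
        """A press on a Choice-list item opens a blank above it;
        the next selection fills the blank."""
        available.remove(new_item)
        choice.insert(choice.index(existing), new_item)

    def delete(item):
        """Two presses move an item back to the original list."""
        choice.remove(item)
        available.append(item)

    select("P(counterdetection)")
    select("P(kill)")
    insert_before("P(kill)", "Figure of merit")
    # choice: P(counterdetection), Figure of merit, P(kill)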

    When the user selects "Exit", the Planning Module returns. The

    ordering of data base items in the Display Option box will now

correspond to the order he has selected.

    5.5 Adjust Module (Conceptual Design)

    The Adjust Module has two important functions: incorporation of

    subjective inputs into the Data Base and what-if planning.


[Figure 5-4. Selection Module display of the data base
display-option space.]

[Figure 5-5. Selection Module display with items selected into
the Choice list.]

Basic inputs in the data base are derived from a number of

    sources: e.g., sensors, third party communications, intelli-

    gence, and own ship equipment specifications. None of this in-

    formation is infallible: much of it already involves a degree

    of judgment; and some of it might well be improved by the judg-

    ment of appropriately experienced people on the spot (see Cohen,

    1982, for the role of subjective judgment in improving estimates

    of target range). The Adjust Module enables the user to insert

    his own judgments at any level of aggregation (e.g., from range

    accuracy to probability of a kill) and then observe the implica-

    tions of the adjustment for any higher level. In the same way,

    the adjustment option allows the user to explore hypothetical

    states of affairs. In both uses of the Adjust Module, the ori-

    ginal default values are retained and may be restored at any

    time.

The user selects a data base item for adjustment from a display

    much like the Selection display of Figure 5-4. A graphical re-

    presentation of the requested information then appears, and the

    user employs the cursor-plus-key to draw in new values. For

    example, Figure 5-6 shows a conceptual display of classification

    probabilities for Target 1. Default probabilities (indicated by

dotted lines) are computed automatically on the basis of sensor

    cues, third party reports, and/or intelligence information. The

    user inserts new values (the solid lines) by moving the cursor

    to the desired position on the vertical scale above the relevant

    classification type, and pressing the button. (The final proba-

    bility is computed automatically so that the total sums to 1.0.

    The user deletes a value by entering a new one; and he may re-

    store default values by selecting "Default.")

    In this example, the user has chosen to explore the assumption

    that target 1 is a Victor. If he now returns to the Planning

    Module, all subsequent displays (e.g., forecasts of counterde-


[Figure 5-6. User Adjustment of Classification Probabilities.
Probability scale 0 to 1.0 with default values marked;
classification types VICTOR, CHARLIE, YANKEE.]

tection probability or first-shot kill probability) will be con-

    ditional on that assumption, and appropriately labeled as such.
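The adjustment rule for classification probabilities (the final
class absorbing whatever keeps the total at 1.0, with defaults
retained for restoration) can be sketched briefly; class names
follow Figure 5-6, and the default values are invented:

    # Illustrative sketch of the Adjust Module rule for
    # classification probabilities; defaults invented.

    defaults = {"VICTOR": 0.5, "CHARLIE": 0.3, "YANKEE": 0.2}
    current = dict(defaults)          # defaults retained for restore

    def adjust(probs, user_values):
        """Apply user-drawn values; the final class is computed so
        the distribution sums to 1.0."""
        classes = list(probs)
        probs.update(user_values)
        final = classes[-1]
        probs[final] = 1.0 - sum(p for c, p in probs.items()
                                 if c != final)
        return probs

    def restore():
        """The "Default" selection restores the original values."""
        return dict(defaults)

    # Exploring the assumption that target 1 is a Victor:
    adjust(current, {"VICTOR": 1.0, "CHARLIE": 0.0})
    # current == {"VICTOR": 1.0, "CHARLIE": 0.0, "YANKEE": 0.0}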

    5.6 Alert Module (Conceptual Design)

In decision making, the task of comparing a sizeable number of

    options with respect to multiple dimensions can be simplified by

    utilizing cutoff criteria. For example, options which fail to

    achieve a critical level on some dimension or combination of

    dimensions may be quickly eliminated. Although there is danger

    that important compensating advantages will be overlooked, this

strategy is often likely to be cost-effective. The Alert Mod-

    ule has two functions: It customizes the Planning Module for

    the strategy of evaluating options by reference to cutoffs; and

    at the same time, it provides an alerting system in real time

    when those cutoffs are crossed.

In the Alert Module, cutoffs can be quickly and easily set on

    any variable in the Data Base, at any level of aggregation.

    After the user selects a set of variables from a display like

    that in Figure 5-4, values of each requested variable are dis-

    played graphically, as shown in Figure 5-7. In this example,

    the current probability estimate for being within counterdetec-

tion range (of either T1 or T2) is .20; for having target 1 with-

in own ship weapon range, .80; and the 95% interval of uncertain-

ty (2σ) concerning T1's range, as a percentage of T1's estimated

range, is ±35%. The user sets cutoff thresholds (shown by the

    dotted lines) by moving the cursor to the appropriate position

    in the diagram (anywhere along the desired radius) and pressing

    the button.

    When the user returns to the Planning Module, he finds a new

    Data Base Display option called "Alerts." If he selects this

    option, his cutoffs are incorporated into a PD-B display which


[Figure 5-7. Alert Module Display of Current Values and User-Set
Thresholds. Current values: O/S within CD range, .20; T1 within
O/S weapon range, .80; 2σ range uncertainty as % of T1's
estimated range, .35.]

forecasts - for the tactical alternative under consideration -

    when and if each threshold will be crossed. Thus, Figure 5-8

    shows where on the approach path the objectives (or counterobjec-

    tives) with respect to weapon range and counterdetection are

    first expected to be crossed, and indicates that the desired

    range accuracy may not be achieved at all under this set of

    maneuvers. Utilizing the organized search procedure made avail-

    able by the Planning Module, the user can scan quickly through a

    series of stored action alternatives, comparing each one to a

    multidimensional conjunctive or disjunctive criterion.
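Screening stored alternatives against the cutoffs amounts to a
conjunctive or disjunctive test over the user-set thresholds; an
illustrative sketch follows, with thresholds taken from the
Figure 5-7 example and forecast values invented:

    # Illustrative sketch of multidimensional cutoff screening.

    cutoffs = {   # dimension -> (threshold, direction to satisfy)
        "P(within CD range)": (0.20, "below"),
        "P(T1 within weapon range)": (0.80, "above"),
    }

    forecasts = {
        "A": {"P(within CD range)": .15, "P(T1 within weapon range)": .85},
        "B": {"P(within CD range)": .30, "P(T1 within weapon range)": .90},
    }

    def meets(alt, dim):
        threshold, direction = cutoffs[dim]
        v = forecasts[alt][dim]
        return v <= threshold if direction == "below" else v >= threshold

    def conjunctive(alt):     # eliminate unless all cutoffs are met
        return all(meets(alt, d) for d in cutoffs)

    def disjunctive(alt):     # retain if any cutoff is met
        return any(meets(alt, d) for d in cutoffs)

    [a for a in forecasts if conjunctive(a)]    # -> ['A']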

    When a previously set cutoff is actually crossed, the Planning

    Module alerts the user by placing the word "ALERT" next to the

    appropriate item in the Data Options Box (the item is added if

    not already present) and/or, at the user's option, by producing

    an auditory signal.

    5.7 Advisory Module (Conceptual Design)

Advisory prompts in the Planning Module provide a sort of "peri-

    pheral vision" for the user, warning him when there is something

    of interest outside his current focal area. Thus, a consider-

    able degree of flexibility in the user's allocation of time and

    attention to information can be combined with reasonably unobtru-

    sive prescriptive checks and safeguards.

    The objective is not simply to alert the user whenever there is

    some difference, however small, between his judgment and the

    output of a prescriptive model. The difference must be large

enough to matter, in the actions to be selected and in their

    expected outcomes (cf., Cohen and Freeling, 1981). Before

    prompting, therefore, the prescriptive model compares the

    expected utility of the "best" action with the expected utility

    of the action suggested by information the user has observed.


[Figure 5-8. Planning Module Forecasts for User-Set Thresholds.
Threshold crossings marked on the approach path: CD (50%),
WPN-O/S (90%); 2σR (10%) not achieved.]

The Advisory Module enables the user himself to determine how

    large that difference must be to warrant a prompt.

    The Advisory Module display in Figure 5-9 enables the user to

    set such a threshold on an easily interpreted utility scale.

    The value of own ship survival is set arbitrarily at 100; the

    values of killing various targets may be provided by default

    (based on mission directives) or adjusted by the Commander (see

    Appendix B for a description of the scaling procedure). The

    user moves the cursor to the desired threshold point on this

    scale and presses the button. The effect, in this example, is

    that unless the prescriptive model estimates a potential prompt

    impact of at least 28% of the value of own ship survival, no

    prompt will be provided. (In practice, the user need not be

    concerned with the interpretation of the scale; after some exper-

    ience he can simply set the threshold to achieve the desired

    frequency of prompts.)

    The Planning Module monitors the user's selection of information

and derives a rough hypothesis as to the action implied by that

information. (It may do this by aggregating the set of rank

    orderings of tactical alternatives which are produced by the

    individual dimensions which the user has observed.) It then

    compares the presumed "user-preferred action" according to the

    aggregated rank ordering with the best action according to the

    prescriptive model. If the difference in expected utility

    exceeds the user-set threshold, the next step is to determine

    which items as yet unobserved by the user would be most helpful.

    The Planning Module prioritizes unobserved dimensions in terms

    of how well they differentiate between the "user-preferred" op-

    tion and the best option according to the prescriptive model.

These are the dimensions most likely to fulfill the role of

    "devil's advocate" - i.e., they are most likely to lead the user

    to constructively question his current assumptions. Asterisks


[Figure 5-9. Advisory Module Display of User-Set Tolerance for
Disconfirming Information. Value scale 0 to 100 with Tgt 2,
Tgt 1, and Own Ship marked; user-set prompt threshold shown.]

are displayed next to these critical dimensions in the Data Base

    Options box (if not already present, the dimension is added to

    the list) - signifying to the user that information is available

    which appears to clash with information he has already observed.
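The prompting logic just described can be outlined in a few
lines; this is a hedged sketch of the mechanism, using the 0-100
utility scale of Figure 5-9, with all other names and numbers
assumed for illustration:

    # Illustrative sketch of the Advisory Module's prompting rule.

    PROMPT_THRESHOLD = 28.0      # user-set tolerance, utility units

    def should_prompt(eu_best, eu_user_preferred):
        """Prompt only when the model's best action beats the action
        implied by the user's observed information by more than the
        user-set tolerance."""
        return (eu_best - eu_user_preferred) > PROMPT_THRESHOLD

    def devils_advocate(unobserved_dims, scores, best, preferred):
        """Prioritize unobserved dimensions by how strongly each
        differentiates the model's best option from the user's."""
        gap = lambda d: abs(scores[d][best] - scores[d][preferred])
        return sorted(unobserved_dims, key=gap, reverse=True)

    should_prompt(117.5, 85.0)   # -> True; asterisks would appear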

    5.8 Summary

    The attack planning aid gives concrete form to the notion of a

    personalized and prescriptive aid. Direct use of the Planning

    Module enables the user to sample information from any preferred

    level of aggregation in the data base, organize his search for

    information efficiently in terms of either alternatives or attri-

    butes, and vary the number of options examined, the amount of

    information requested concerning each, and the time into the

    future which planning encompasses. Moreover, this module

    clearly distinguishes conclusions from evidence, and displays

    the sources from which each inference is derived.

    Personalizing and prescriptive features are enhanced by customiz-

    ing modules. The Selection Module allows the user to prioritize

    information for immediate access and efficient search. The Ad-

    just Module accommodates individual differences in beliefs and

    preferences and - from a prescriptive point of view - adds a

    potentially valuable source of information (the user) to the

    data base. The Alert Module facilitates individual heuristic

    strategies which evaluate actions by reference to cutoffs (as

    opposed to tradeoffs); it also alerts a user, whose attention

    may be otherwise engaged, when cutoffs are crossed. Finally,

    the Advisory Module allows the user to determine the degree of

    prescriptive prompting that occurs when potentially disconfirm-

    ing data exist which he has not yet observed.


6.0 INFORMAL DEMONSTRATION OF THE PROTOTYPE AID

    A design and its users must evolve together. When an aid de-

    fines new tasks and offers capabilities not previously exper-

    ienced, assessments based exclusively on current user practice

    are of limited value; the user must be able to respond to a con-

    crete prototype of the proposed system. Such responses, over

    many iterations, may lead both to user acceptance of new pre-

    scriptive possibilities and to significant alterations in the

    details and the objectives of the initial design (cf., Keen,

    undated).

    As a first step in this process of "middle-out" design, two

    former submarine officers were exposed to an initial version of

    the attack planning aid. After a short briefing, they operated

    the aid within the hypothetical scenario described in Section

    5.0. Since data base information was available regarding only

    one hypothetical target and weapon, the focus of the exercise

    was the selection of approach maneuvers and time of fire.

    Both users responded favorably to the aid. More significantly,

    they managed to use the aid in dramatically different ways in

    the same decision context (although some of the information they

    requested was not yet implemented in the prototype).

    In one case, the aid supported an initial decision of whether to

    Isprint" (i.e., move quickly toward the target) or "drift"

    (i.e., maneuver more slowly to gather information on range and

    classification). This user requested projections of target

    localization accuracy conditional on these two kinds of

    approach. In the second case, the decision maker preferred to

    use the aid within a more restricted planning horizon, yet

sampled information at a higher level of aggregation. The

"approach maneuvers" he evaluated consisted of a single leg, and


his major concern was the probability of counterdetection.

    (This user would have preferred a display of counterdetection

    cutoffs, as in the not-yet-implemented Alert Module.)

    Both users expressed a need to validate the displayed forecasts

    (of localization accuracy and counterdetection probability, re-

    spectively) by examination of evidence. In addition, both users

    requested some sort of optimization procedure whereby the aid,

    rather than the user, generates alternatives.

This demonstration is by no means meant to serve as a rigorous

    test of the aid: the "subjects" were too few, the prototype aid

    and the hypothetical scenario too incomplete. (Moreover, inputs

    from these officers had been utilized earlier in the design pro-

cess, as described in Appendix C; this role might have biased

the outcome of the subsequent demonstration, though we think it

unlikely.)

    Nevertheless, th